RNN- and LSTM-Based Soft Sensors Transferability for an Industrial Process
Figure 1. Basic flow of transfer learning. Source input (X_S), source output (Y_S), target input (X_T), and target output (Y_T).
Figure 2. Block scheme of a Recurrent Neural Network (RNN) with two hidden layers, delays, and recurrent connections.
Figure 3. Working diagram of a long short-term memory (LSTM) unit, showing how the gates forget, update, and output both the cell and hidden states.
Figure 4. The industrial Sulfur Recovery Unit (SRU) under study contains four parallel processing lines.
Figure 5. Simplified working scheme of an SRU line.
Figure 6. Transfer learning (TL) strategies applied to the SRU lines.
Figure 7. Statistical distribution of the CC on test data for different RNN architectures, for output 2 of SRU line 2. Circles indicate the mean value; bars represent the standard deviation. The selected structure is highlighted.
Figure 8. Statistical distribution of the CC on test data for different LSTM architectures for the design of the SS modeling output 2 of SRU line 2. Circles indicate the mean value; bars represent the standard deviation. Hyperparameter values are reported in the three colored bars below, and the chosen combination is highlighted.
Figure 9. Comparison between network outputs and measured ones for: SRU line 2, output 1 with the RNN (a) and the LSTM model (b); SRU line 4, output 2 with the RNN (c) and the LSTM model (d).
Figure 10. Comparison between the transferred RNN models on SRU line 2 before (a) and after (b) re-tuning. The same comparison is shown for the LSTM model (c,d).
Figure 11. Comparison between the models transferred to SRU line 2 through the THM technique: (a) RNN and (b) LSTM models.
Abstract
1. Introduction
- data acquisition, selection, and pre-processing;
- model class selection;
- model order selection;
- model identification; and
- model validation.
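The steps above can be sketched end to end on synthetic data. The following is a minimal illustration, not the paper's procedure: a linear ARX model stands in for the neural networks, the data are generated rather than measured, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Data acquisition (synthetic stand-in for plant measurements)
u = rng.normal(size=500)                  # measured process input
y = np.zeros(500)
for k in range(2, 500):                   # unknown "true" process dynamics
    y[k] = 0.6 * y[k - 1] + 0.3 * u[k - 1] + 0.1 * u[k - 2]

# 2)-3) Model class and order selection: linear ARX, one output delay,
#        two input delays (chosen here by assumption)
X = np.column_stack([y[1:-1], u[1:-1], u[:-2]])
t = y[2:]

# 4) Model identification on the first 70% of the samples
n_tr = int(0.7 * len(t))
theta, *_ = np.linalg.lstsq(X[:n_tr], t[:n_tr], rcond=None)

# 5) Model validation on the held-out samples
pred = X[n_tr:] @ theta
rmse = np.sqrt(np.mean((t[n_tr:] - pred) ** 2))
```

Because the data are noiseless and the model class matches the generating process, the identified parameters recover the true ones and the validation RMSE is essentially zero; with real plant data, steps 4 and 5 would be iterated over candidate model orders.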
2. Related Works
3. Theory Fundamentals
3.1. Recurrent Neural Networks
3.2. Long Short-Term Memory Network
- Forget gate (f), which controls the level of cell-state reset;
- Cell candidate (g), which adds information to the cell state;
- Input gate (i), which controls the level of cell-state update;
- Output gate (o), which controls the level of cell state added to the hidden state.
- Input weights (W);
- Recurrent weights (R);
- Biases (b).
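The gates and parameter sets listed above combine in the standard LSTM update. The following NumPy sketch of a single time step follows the common textbook formulation (stacked gate matrices with input weights W, recurrent weights R, and biases b; dimensions are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, R, b):
    """One LSTM time step (standard formulation).

    W: input weights (4H x D), R: recurrent weights (4H x H), b: biases (4H,).
    Gate order in the stacked matrices: input i, forget f, candidate g, output o.
    """
    H = h_prev.shape[0]
    z = W @ x + R @ h_prev + b
    i = sigmoid(z[0:H])           # input gate: level of cell-state update
    f = sigmoid(z[H:2 * H])       # forget gate: level of cell-state reset
    g = np.tanh(z[2 * H:3 * H])   # cell candidate: new information
    o = sigmoid(z[3 * H:4 * H])   # output gate: cell state added to hidden state
    c = f * c_prev + i * g        # new cell state
    h = o * np.tanh(c)            # new hidden state
    return h, c
```

With zero inputs, states, and parameters, both gates saturate at 0.5 and the candidate at 0, so the step returns zero hidden and cell states, which is a quick sanity check of the update equations.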
3.3. Model Description
- Number of input delays;
- Number of internal delays;
- Number of output delays;
- Number of hidden layers;
- Number of neurons for each hidden layer;
- Number of hidden units in the LSTM layer;
- Number of hidden neurons in the fully connected layer;
- Dropout probability value.
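Selecting these hyperparameters amounts to a search over candidate combinations scored on validation data. A minimal sketch of such a grid search for the LSTM hyperparameters follows; the candidate values and the scoring function are illustrative placeholders, not the grids or CC values used in the paper.

```python
import itertools

# Candidate values (illustrative grids, not the paper's exact ones)
grid = {
    "lstm_units": [100, 150, 175, 200],
    "fcl_units": [60, 80, 100],
    "dropout": [0.3, 0.5, 0.7],
}

def evaluate(cfg):
    """Placeholder for training a network with `cfg` and returning the
    validation correlation coefficient (CC). Hypothetical scoring rule
    peaking at 175 units and dropout 0.5, for demonstration only."""
    return 1.0 - abs(cfg["lstm_units"] - 175) / 400 - abs(cfg["dropout"] - 0.5)

# Exhaustive search: score every combination, keep the best
best = max(
    (dict(zip(grid, vals)) for vals in itertools.product(*grid.values())),
    key=evaluate,
)
```

In practice each `evaluate` call would train and validate a full network, so random or Bayesian search is often preferred to the exhaustive product when grids grow.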
4. Methodology
4.1. Fine-Tuned Transferred Model
Algorithm 1: FTTM Algorithm.
Load the source model;
Load the target data X_T, Y_T;
Fine-tune the source model with X_T, Y_T;
Test the fine-tuned model;
Compute MAE, RMSE, CC.
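A schematic NumPy rendering of the fine-tuned transferred model (FTTM) idea follows: a model trained on the source domain is used as the starting point and its parameters are adjusted with a few gradient steps on target-domain data. A linear model stands in for the recurrent networks, and the data and weights are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

# Source-domain model, assumed already trained (here: known weights)
w_source = np.array([0.5, -0.2])

# Target-domain data from a similar but slightly shifted process
X_T = rng.normal(size=(200, 2))
Y_T = X_T @ np.array([0.6, -0.1]) + 0.01 * rng.normal(size=200)

# FTTM: initialize with the source weights, then fine-tune on target data
w = w_source.copy()
lr = 0.05
for _ in range(300):                                   # gradient descent steps
    grad = 2 * X_T.T @ (X_T @ w - Y_T) / len(Y_T)      # MSE gradient
    w -= lr * grad

# Test and compute the figures of merit
pred = X_T @ w
mae = np.mean(np.abs(Y_T - pred))
rmse = np.sqrt(np.mean((Y_T - pred) ** 2))
cc = np.corrcoef(Y_T, pred)[0, 1]
```

Because the source weights start close to the target ones, few iterations suffice; this closeness between source and target processes is exactly what makes the transfer worthwhile compared with training from scratch.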
4.2. Transferred Hyperparameters
Algorithm 2: THM Algorithm.
Load the source model hyperparameters;
Load the target data X_T, Y_T;
Initialize a new model with the transferred hyperparameters;
Train the model with X_T, Y_T;
Test the trained model;
Compute MAE, RMSE, CC.
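Both algorithms close by computing the same three figures of merit on the test data. A small helper (hypothetical function name) makes the standard definitions of MAE, RMSE, and CC explicit:

```python
import numpy as np

def metrics(y_true, y_pred):
    """Mean absolute error, root mean square error, and correlation
    coefficient between measured and predicted output sequences."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    e = y_true - y_pred
    mae = np.mean(np.abs(e))
    rmse = np.sqrt(np.mean(e ** 2))
    cc = np.corrcoef(y_true, y_pred)[0, 1]
    return mae, rmse, cc
```

Note that CC measures only linear agreement between the two sequences, so a prediction with a constant bias can still score CC = 1 while showing nonzero MAE and RMSE; this is why the three metrics are reported together.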
5. Simulation Results
5.1. Case Study: The Sulfur Recovery Unit
5.2. Design of the Optimal SSs
5.3. Transferred Models
5.3.1. Fine-Tuned Transferred Models
5.3.2. Transferred Hyperparameters Models
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
| Variable | Description |
|---|---|
| Input 1 | Gas flow in the MEA chamber |
| Input 2 | Air flow in the MEA chamber |
| Input 3 | Secondary final air flow |
| Input 4 | Total gas flow in the SWS chamber |
| Input 5 | Total air flow in the SWS chamber |
| Output 1 | H2S concentration |
| Output 2 | SO2 concentration |
| | Delays | Hidden Layers |
|---|---|---|
| Line 2 | Input: 1, Internal: 2, Output: 2 | [4, 4] |
| Line 4 | Input: 4, Internal: 5, Output: 1 | [3, 3] |
| | # LSTM Units | # FCL Units | Dropout Prob. |
|---|---|---|---|
| Line 2 | 200 | 80 | 0.7 |
| Line 4 | 175 | 100 | 0.5 |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| CC | 0.84 | 0.84 | 0.72 | 0.72 |
| MAE | 0.13 | 0.027 | 0.12 | 0.16 |
| RMSE | 0.62 | 0.67 | 0.77 | 0.89 |
| Line 4 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| CC | 0.90 | 0.91 | 0.77 | 0.85 |
| MAE | 0.02 | 0.03 | 0.10 | 0.01 |
| RMSE | 0.56 | 0.44 | 0.70 | 0.53 |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| CC | 0.65 (−22.62%) | 0.70 (−16.6%) | 0.45 (−37.50%) | 0.64 (−11.11%) |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| MAE | 1.11 | 0.63 | 1.36 | 0.62 |
| RMSE | 1.50 | 1.17 | 1.81 | 1.17 |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| CC | 0.75 (−10.71%) | 0.78 (−7.14%) | 0.72 (0%) | 0.72 (0%) |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| MAE | 0.36 | 0.07 | 0.23 | 0.36 |
| RMSE | 0.81 | 0.76 | 0.78 | 0.92 |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| CC | 0.63 (−25%) | 0.71 (−15.48%) | 0.67 (−6.94%) | 0.70 (−4.16%) |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| MAE | 0.40 | 0.61 | 0.18 | 0.17 |
| RMSE | 0.98 | 1.07 | 0.85 | 0.90 |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| CC | 0.74 (−11.90%) | 0.73 (−13.09%) | 0.71 (−1.39%) | 0.64 (−11.11%) |
| Line 2 | RNN Output 1 | RNN Output 2 | LSTM Output 1 | LSTM Output 2 |
|---|---|---|---|---|
| MAE | 0.09 | 0.50 | 0.03 | 0.78 |
| RMSE | 0.76 | 0.96 | 0.77 | 1.21 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Curreri, F.; Patanè, L.; Xibilia, M.G. RNN- and LSTM-Based Soft Sensors Transferability for an Industrial Process. Sensors 2021, 21, 823. https://doi.org/10.3390/s21030823