Abstract
Our senses perceive the world, but what happens if one of them is degraded through illness or injury? In such situations, the brain compensates by enhancing the remaining senses, which suggests that the networks processing sensory data are coupled. Similar situations arise in scientific and engineering problems when independent measurement methods, based on different principles, are used to study the same characteristics of a system. In such cases, one can develop reliable artificial neural network (ANN) based models, each trained on data obtained by a different measurement method. This raises the question of whether these separate models can be coupled to obtain an improved, more accurate model. In this paper, we explore this possibility by training two ANN models to recognize alphabet letters in a noisy environment. The performance of each ANN is optimized by varying its number of hidden neurons (HN). The first ANN is trained on pictorial representations of the letters, the second on the corresponding audio signals, with different levels of white noise added to both representations. Different schemes for coupling the two systems are examined. For some coupling schemes, the combined system recognizes letters considerably better than either of the two original separate ANNs. Examination of the entropy as a function of the number of HNs showed that higher entropy is associated with a higher letter-recognition error.
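The abstract does not detail the coupling schemes themselves. As an illustration only, the Python sketch below shows one plausible scheme, averaging the class-probability outputs of an image-based and an audio-based letter classifier, together with the Shannon entropy of the combined output as a simple uncertainty measure. The function and variable names (couple_by_averaging, p_image, p_audio) are hypothetical and are not taken from the paper.

import numpy as np

def output_entropy(p, eps=1e-12):
    # Shannon entropy (in bits) of a class-probability vector.
    p = np.clip(p, eps, 1.0)
    return float(-np.sum(p * np.log2(p)))

def couple_by_averaging(p_image, p_audio):
    # Couple two classifiers by averaging their 26-way output probability
    # vectors and picking the most likely letter. This is one illustrative
    # coupling scheme, not necessarily one of those examined in the paper.
    p_combined = 0.5 * (np.asarray(p_image) + np.asarray(p_audio))
    predicted_letter = chr(ord('A') + int(np.argmax(p_combined)))
    return predicted_letter, p_combined

# Toy usage with made-up probability vectors over the 26 letters.
rng = np.random.default_rng(0)
p_image = rng.dirichlet(np.ones(26))   # stand-in for the image ANN's output
p_audio = rng.dirichlet(np.ones(26))   # stand-in for the audio ANN's output

letter, p_comb = couple_by_averaging(p_image, p_audio)
print(letter, output_entropy(p_comb))

In this averaging scheme the coupled prediction can be correct even when one of the individual classifiers is misled by noise, which is the intuition behind combining models trained on independent measurement modalities.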
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Cite this article
Boger, Z., Kogan, D., Joseph, N. et al. Improved Data Modeling Using Coupled Artificial Neural Networks. Neural Process Lett 51, 577–590 (2020). https://doi.org/10.1007/s11063-019-10089-7