
Improved Data Modeling Using Coupled Artificial Neural Networks

Published in Neural Processing Letters.

Abstract

Our senses perceive the world, but what happens when one of them is degraded by illness or injury? In such cases the brain compensates by enhancing the remaining senses, which suggests that the networks processing sensory data are coupled. A similar situation arises in scientific and engineering problems when independent measurement methods, based on different principles, are used to study the same characteristics of a system. One can then develop reliable artificial neural network (ANN) models, each trained on data obtained by a different measurement method. This raises the question of whether these separate models can be coupled to obtain an improved, more accurate model. In this paper we explore this possibility by training two ANN models to recognize alphabet letters in a noisy environment. The first ANN is trained on pictorial representations of the letters, the second on the corresponding audio signals; different levels of white noise are added to both presentations. The performance of each ANN is optimized by varying the number of hidden neurons (HN). Several schemes for coupling the two systems are examined. For some coupling schemes, the combined system achieves markedly better letter recognition than either of the two original ANNs. Examination of the entropy as a function of the number of HNs shows that higher entropy is associated with a higher letter-recognition error.
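The paper does not reproduce its coupling code in this preview, so the following is only a minimal sketch of one simple coupling scheme consistent with the abstract: weighted averaging of the two networks' class-probability outputs. The function name, the weight parameter, and the 26-letter example are hypothetical, not taken from the article.

```python
import numpy as np

def couple_predictions(p_visual, p_audio, w=0.5):
    """Combine class-probability vectors from two independently trained
    networks (here: a visual-input net and an audio-input net) by
    weighted averaging, then renormalize to a probability distribution."""
    p_visual = np.asarray(p_visual, dtype=float)
    p_audio = np.asarray(p_audio, dtype=float)
    combined = w * p_visual + (1.0 - w) * p_audio
    return combined / combined.sum(axis=-1, keepdims=True)

# Example: two 26-way letter classifiers, both weakly favoring 'A'
# on a noisy input; the coupled prediction is more decisive.
p_vis = np.full(26, 0.02)
p_vis[0] = 0.50                     # visual net leans toward 'A'
p_vis /= p_vis.sum()
p_aud = np.full(26, 0.02)
p_aud[0] = 0.40                     # audio net also leans toward 'A'
p_aud /= p_aud.sum()
p = couple_predictions(p_vis, p_aud)
print(chr(ord('A') + int(np.argmax(p))))  # → A
```

Other coupling schemes examined in ensemble work (e.g., majority voting or picking the more confident network) would slot into the same interface; averaging is shown here only because it is the simplest to state.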


(Figures 1–7 appear in the full article.)



Author information


Corresponding author

Correspondence to Yehuda Zeiri.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Boger, Z., Kogan, D., Joseph, N. et al. Improved Data Modeling Using Coupled Artificial Neural Networks. Neural Process Lett 51, 577–590 (2020). https://doi.org/10.1007/s11063-019-10089-7

