

Editorial
Special Issue on Ensemble Learning and Applications
Panagiotis Pintelas * and Ioannis E. Livieris
Department of Mathematics, University of Patras, GR 265-00 Patras, Greece; livieris@upatras.gr
* Correspondence: ppintelas@gmail.com

Received: 5 June 2020; Accepted: 9 June 2020; Published: 11 June 2020 

Abstract: Over the last decades, the development of ensemble methods has gained significant attention from the machine learning and data mining communities. Ensemble methods combine multiple learning algorithms to obtain better predictive performance than any of the constituent learning algorithms could achieve alone, a benefit that has been demonstrated both theoretically and experimentally. In the literature, ensemble learning algorithms constitute a dominant, state-of-the-art approach for obtaining maximum performance, and they have therefore been applied to a variety of real-world problems, ranging from face and emotion recognition through text classification and medical diagnosis to financial forecasting.

Keywords: ensemble learning; homogeneous and heterogeneous ensembles; fusion strategies; voting
schemes; model combination; black, white and gray box models; incremental and evolving learning

1. Introduction
This article is the editorial of the “Ensemble Learning and Their Applications” Special Issue of the Algorithms journal (https://www.mdpi.com/journal/algorithms/special_issues/Ensemble_Algorithms). The main aim of this Special Issue is to present recent advances in ensemble learning algorithms, frameworks and methodologies, and to investigate the impact of their application on a diversity of real-world problems. The response of the scientific community has been significant, with many original research papers submitted for consideration. In total, eight (8) papers were accepted after a careful peer-review process based on quality and novelty criteria. All accepted papers possess significant elements of novelty, cover a diversity of application domains and introduce interesting ensemble-based approaches, providing readers with a glimpse of state-of-the-art research in the domain.
Over the last decades, the development of ensemble learning methodologies and techniques has gained significant attention from the scientific and industrial communities [1–3]. The basic idea behind these methods is to combine a set of diverse prediction models into a composite global model that produces reliable and accurate estimates or predictions. Theoretical and experimental evidence has shown that ensemble models provide considerably better prediction performance than single models [4]. Along this line, a variety of ensemble learning algorithms and techniques have been proposed and have found application in various real-world classification and regression problems. A purely illustrative sketch of the underlying idea is given below.
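The following snippet is not drawn from any paper in this issue; the synthetic dataset, the choice of base learners and their hyperparameters are assumptions made only to illustrate how a heterogeneous majority-voting ensemble is assembled.

```python
# Illustrative sketch of a heterogeneous voting ensemble; dataset and learners
# are arbitrary choices, not taken from the Special Issue papers.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Three diverse base learners whose errors are, ideally, weakly correlated.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="hard",  # majority vote over the predicted class labels
)

print("ensemble CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```

Hard voting simply counts the predicted labels; the papers in this issue explore richer combination strategies, such as class-weighted and evolutionarily tuned voting.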

2. Ensemble Learning and Applications


The first paper, entitled “A Weighted Voting Ensemble Self-Labeled Algorithm for the Detection of Lung Abnormalities from X-Rays”, is authored by Livieris et al. [5]. The authors presented a new ensemble-based semi-supervised learning algorithm for the classification of lung abnormalities from chest X-rays. The proposed algorithm exploits a new weighted voting scheme which assigns a vector of weights to each component learner of the ensemble based on its accuracy on each class. The proposed
algorithm was extensively evaluated on three well-known real-world benchmarks, namely the Pneumonia chest X-ray dataset from Guangzhou Women and Children’s Medical Center, the Tuberculosis dataset from Shenzhen Hospital and the cancer CT medical images dataset. The presented numerical experiments demonstrated the efficiency of the proposed ensemble methodology against a simple voting strategy and other traditional semi-supervised methods. A hypothetical sketch of class-dependent voting weights is given below.
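As a minimal, hypothetical sketch of the general idea of class-dependent voting weights (the exact weighting scheme of [5] is defined in the paper and is not reproduced here), each learner's vote for a class could be scaled by its validation accuracy on that class; the toy numbers below are invented for illustration.

```python
import numpy as np

def class_weighted_vote(class_probs, class_acc):
    """Combine predictions with per-class weights.

    class_probs: list of (n_samples, n_classes) probability arrays, one per learner.
    class_acc:   (n_learners, n_classes) per-class validation accuracy of each learner,
                 used here as that learner's weight vector.
    """
    class_probs = np.asarray(class_probs)           # (n_learners, n_samples, n_classes)
    weights = np.asarray(class_acc)[:, None, :]     # broadcast weights over samples
    weighted = (class_probs * weights).sum(axis=0)  # weighted sum of the learners' votes
    return weighted.argmax(axis=1)

# Toy usage: two learners, three samples, two classes.
probs = [np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]]),
         np.array([[0.6, 0.4], [0.7, 0.3], [0.1, 0.9]])]
acc = [[0.95, 0.70],   # learner 1 is strong on class 0
       [0.60, 0.92]]   # learner 2 is strong on class 1
print(class_weighted_vote(probs, acc))
```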
The second paper is authored by Papageorgiou et al. [6] entitled “Exploring an Ensemble of Methods
that Combines Fuzzy Cognitive Maps and Neural Networks in Solving the Time Series Prediction Problem of
Gas Consumption in Greece”. This paper presents an innovative ensemble time-series forecasting model
for the prediction of gas consumption demand in Greece. The model is based on an ensemble learning
technique which exploits evolutionary Fuzzy Cognitive Maps (FCMs), Artificial Neural Networks
(ANNs) and their hybrid structure, named FCM-ANN, for time-series prediction. The prediction
performance of the proposed model was compared against that of the Long Short-Term Memory
(LSTM) model on three time-series datasets concerning data from distribution points which compose
the natural gas grid of a Greek region. The presented results provided empirical evidence that the proposed approach can be effectively utilized to forecast gas consumption demand.
The third paper, “A Grey-Box Ensemble Model Exploiting Black-Box Accuracy and White-Box Intrinsic Interpretability”, was written by Pintelas et al. [7]. In this interesting study, the authors proposed a new framework for developing a Grey-Box machine learning model based on the semi-supervised philosophy. The advantage of the proposed model is that it is nearly as accurate as a Black-Box while remaining interpretable like a White-Box model. More specifically, in the proposed framework, a Black-Box model is utilized to enlarge a small initial labeled dataset by adding the model’s most confident predictions on a large unlabeled dataset. Subsequently, the augmented dataset is used to train a White-Box model, which greatly enhances the interpretability and explainability of the final (ensemble) model. To evaluate the flexibility as well as the efficiency of the proposed Grey-Box model, the authors used six benchmarks from three real-world application domains, i.e., finance, education and medicine. Based on their detailed experimental analysis, the authors reported that the proposed model achieves comparable, and sometimes better, prediction accuracy than a Black-Box, while being at the same time as interpretable as a White-Box model. A minimal sketch of this pseudo-labeling workflow is shown below.
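The sketch below illustrates the pseudo-labeling workflow under assumed choices, namely a random forest as the Black-Box, a shallow decision tree as the White-Box, a standard benchmark dataset and a 0.95 confidence threshold; none of these choices are taken from [7].

```python
# Illustrative sketch of the grey-box idea: a black-box model pseudo-labels
# unlabeled data, and an interpretable white-box model is trained on the result.
# Models, dataset and confidence threshold are assumptions, not those of [7].
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_lab, X_unlab, y_lab, _ = train_test_split(X, y, train_size=0.1, random_state=0)

# 1. Fit the black box on the small labeled set.
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_lab, y_lab)

# 2. Keep only its most confident predictions on the unlabeled pool.
probs = black_box.predict_proba(X_unlab)
confident = probs.max(axis=1) >= 0.95
X_aug = np.vstack([X_lab, X_unlab[confident]])
y_aug = np.concatenate([y_lab, probs[confident].argmax(axis=1)])

# 3. Train an interpretable white box on the augmented dataset.
white_box = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_aug, y_aug)
print("pseudo-labeled samples added:", int(confident.sum()))
```

The interpretable white-box model can then be inspected directly (e.g., through its decision rules) while benefiting from the predictive power of the black box.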
The fourth paper, entitled “A Soft-Voting Ensemble Based Co-Training Scheme Using Static Selection for Binary Classification Problems”, was authored by Karlos et al. [8]. The authors presented an ensemble-based co-training scheme for binary classification problems. The proposed methodology is based on the imposition of an ensemble classifier as the base learner in the co-training framework, whose structure is determined by a static ensemble selection approach from a pool of candidate learners. Their experimental results on a variety of classical benchmarks, as well as the reported statistical analysis, showed the efficacy and efficiency of their approach; a sketch of the static-selection and soft-voting ingredients follows.
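The sketch below illustrates only the static-selection and soft-voting ingredients, under assumed choices (a four-model pool, selection of the top three by cross-validated accuracy and a standard benchmark dataset); the co-training component of [8] is omitted.

```python
# Illustrative sketch of static ensemble selection followed by soft voting.
# Pool, selection rule and dataset are assumptions, not the scheme of [8].
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

pool = {
    "lr": LogisticRegression(max_iter=5000),
    "rf": RandomForestClassifier(n_estimators=100, random_state=0),
    "nb": GaussianNB(),
    "knn": KNeighborsClassifier(),
}

# Static selection: rank the pool once by cross-validated accuracy and keep the top 3.
scores = {name: cross_val_score(est, X, y, cv=5).mean() for name, est in pool.items()}
selected = sorted(scores, key=scores.get, reverse=True)[:3]

# The selected learners form a soft-voting ensemble (averaged class probabilities).
ensemble = VotingClassifier([(n, pool[n]) for n in selected], voting="soft")
print("selected:", selected,
      "ensemble CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean())
```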
An interesting piece of research, entitled “GeoAI: A Model-Agnostic Meta-Ensemble Zero-Shot Learning Method for Hyperspectral Image Analysis and Classification”, was authored by Demertzis and Iliadis [9]. In this work, a new classification model was proposed, named MAME-ZsL (Model-Agnostic Meta-Ensemble Zero-shot Learning), which is based on the zero-shot philosophy for geographic object-based scene classification. The attractive advantages of the proposed model are its training stability, its low computational cost and, above all, its remarkable generalization performance through the reduction of potential overfitting. This is achieved by selecting features which do not cause the gradients to explode or vanish. Additionally, it is worth noting that the superiority of the MAME-ZsL model lies in the fact that the testing set contained instances whose classes were not contained in the training set. The effectiveness of the proposed architecture was demonstrated against state-of-the-art fully supervised deep learning models on two datasets containing images from a reflective optics system imaging spectrometer.
Zvarevashe and Olugbara [10] presented a research paper entitled “Ensemble Learning of Hybrid
Acoustic Features for Speech Emotion Recognition”. Signal processing and machine learning methods
are widely utilized for recognizing human emotions based on extracted features from video files,
facial images or speech signals. The authors studied the problem that many classification models are not able to recognize the fear emotion with the same level of accuracy as other emotions. To address this problem, they proposed an elegant methodology, based on an interesting feature-extraction technique, for improving the recognition precision of fear and other emotions from speech signals. In more detail, their framework extracts highly discriminating speech emotion feature representations from multiple sources, which are subsequently concatenated to form a new set of hybrid acoustic features. The authors conducted a series of experiments on two public databases using a variety of state-of-the-art ensemble classifiers. The presented analysis, which reported the efficiency of their approach, provided evidence that the utilization of the new features increases the generalization ability of all ensemble classifiers. A toy sketch of the feature-fusion step is shown below.
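The following toy sketch illustrates only the mechanics of fusing feature blocks from multiple sources into one hybrid representation; the random stand-in features, their dimensions and the classifier are assumptions and say nothing about the actual acoustic features or results of [10].

```python
# Illustrative feature-fusion sketch: concatenate feature blocks from multiple
# (here synthetic) sources and feed the hybrid representation to an ensemble.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples = 300
spectral_like = rng.normal(size=(n_samples, 13))  # stand-in for spectral features
prosodic_like = rng.normal(size=(n_samples, 6))   # stand-in for prosodic features
y = rng.integers(0, 4, n_samples)                 # toy labels for four emotion classes

X_hybrid = np.hstack([spectral_like, prosodic_like])  # hybrid acoustic feature set
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy on hybrid features:", cross_val_score(clf, X_hybrid, y, cv=5).mean())
```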
The seventh paper, entitled “Ensemble Deep Learning for Multilabel Binary Classification of User-Generated Content”, is authored by Haralabopoulos et al. [11]. The authors presented a multilabel ensemble model for emotion classification which exploits a new weighted voting strategy based on differential evolution. Additionally, the proposed model uses deep learners composed of convolutional, pooling and LSTM layers, which are well suited to such classification problems. To demonstrate the efficiency of their model, they conducted a performance evaluation on two large and widely used datasets against state-of-the-art single models and against ensemble models composed of the same base learners. The reported numerical experiments showed that the proposed model achieves improved classification performance, outperforming the compared state-of-the-art models. The sketch below illustrates, in a simplified setting, how differential evolution can tune voting weights.
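As a simplified, single-label stand-in for the idea (the actual strategy of [11] targets multilabel outputs and deep learners), the following sketch uses SciPy's differential evolution to tune soft-voting weights on synthetic validation probabilities; all data, the accuracy objective and the weight bounds are assumptions.

```python
# Illustrative sketch: tuning soft-voting weights with differential evolution.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
n_models, n_samples, n_classes = 3, 200, 4
y_true = rng.integers(0, n_classes, n_samples)

# Stand-in for validation-set class probabilities of each base learner.
probs = rng.dirichlet(np.ones(n_classes), size=(n_models, n_samples))

def neg_accuracy(weights):
    # Weighted average of the learners' probabilities, then arg-max decision.
    fused = np.tensordot(weights, probs, axes=1)   # (n_samples, n_classes)
    return -(fused.argmax(axis=1) == y_true).mean()

result = differential_evolution(neg_accuracy, bounds=[(0.0, 1.0)] * n_models, seed=0)
print("learned voting weights:", result.x, "validation accuracy:", -result.fun)
```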
Finally, the eighth paper, “Ensemble Deep Learning Models for Forecasting Cryptocurrency Time-Series”, was authored by Livieris et al. [12]. The main contribution of this research is the combination of three of the most widely employed ensemble strategies, i.e., ensemble-averaging, bagging and stacking, with advanced deep learning methodologies for forecasting the hourly prices of Bitcoin, Ethereum and Ripple. More analytically, the ensemble models utilized state-of-the-art deep learning models as component learners, composed of combinations of LSTM, bidirectional LSTM and convolutional layers. The authors conducted an exhaustive experimental study in which the performance of all ensemble deep learning models was compared on both regression and classification problems. The models were evaluated on forecasting the cryptocurrency price one hour ahead (regression) and on predicting the direction of the next price movement with respect to the current price (classification). Furthermore, the reliability of all ensemble models, as well as the efficiency of their predictions, was studied by examining the autocorrelation of the errors. The detailed numerical analysis indicated that ensemble learning strategies and deep learning techniques can efficiently complement each other in developing accurate and reliable cryptocurrency forecasting models. A simplified illustration of the ensemble-averaging strategy on a synthetic series is shown below.
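The toy sketch below illustrates the ensemble-averaging strategy alone, using classical regressors on a synthetic random-walk series instead of the deep learning component models of [12]; the lag-based features, the base regressors and the evaluation split are assumptions.

```python
# Illustrative sketch of ensemble-averaging for one-step-ahead forecasting
# on a synthetic series; not the deep learning ensembles of [12].
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=500))          # random-walk stand-in for prices

# Turn the series into a supervised problem: predict x[t] from the previous 10 values.
lags = 10
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
X_train, X_test, y_train, y_test = X[:-50], X[-50:], y[:-50], y[-50:]

models = [Ridge(),
          RandomForestRegressor(n_estimators=200, random_state=0),
          GradientBoostingRegressor(random_state=0)]
preds = np.mean([m.fit(X_train, y_train).predict(X_test) for m in models], axis=0)

print("ensemble-averaging MAE:", np.abs(preds - y_test).mean())
```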
We would like to thank the Editor-in-Chief and the editorial office of the Algorithms journal for their support and for entrusting us with the privilege of editing a Special Issue in this high-quality journal.

3. Conclusions and Future Approaches


The motivation behind this Special Issue was to make a modest and timely contribution to the existing literature. It is hoped that the novel approaches presented in this Special Issue will be found interesting and constructive, and will be appreciated by the international scientific community. It is also expected that they will inspire further research on innovative ensemble strategies and their applications in various multidisciplinary domains. Future approaches may involve exploiting ensemble learning for improving prediction accuracy, enhancing machine learning explainability and increasing model reliability.

Funding: No funding was provided for this work.


Acknowledgments: The guest editors wish to express their appreciation and deep gratitude to all authors and reviewers who contributed to this Special Issue.
Conflicts of Interest: The guest editors declare no conflict of interest.

References
1. Brown, G. Ensemble Learning. In Encyclopedia of Machine Learning; Springer: Boston, MA, USA, 2010;
Volume 312.
2. Polikar, R. Ensemble learning. In Ensemble Machine Learning; Springer: Boston, MA, USA, 2012; pp. 1–34.
3. Zhang, C.; Ma, Y. Ensemble Machine Learning: Methods and Applications; Springer: Boston, MA, USA, 2012.
4. Dietterich, T.G. Ensemble learning. In The Handbook of Brain Theory and Neural Networks; MIT Press:
Cambridge, MA, USA, 2002; Volume 2, pp. 110–125.
5. Livieris, I.E.; Kanavos, A.; Tampakas, V.; Pintelas, P. A weighted voting ensemble self-labeled algorithm for
the detection of lung abnormalities from X-rays. Algorithms 2019, 12, 64. [CrossRef]
6. Papageorgiou, K.I.; Poczeta, K.; Papageorgiou, E.; Gerogiannis, V.C.; Stamoulis, G. Exploring an Ensemble of
Methods that Combines Fuzzy Cognitive Maps and Neural Networks in Solving the Time Series Prediction
Problem of Gas Consumption in Greece. Algorithms 2019, 12, 235. [CrossRef]
7. Pintelas, E.; Livieris, I.E.; Pintelas, P. A Grey-Box Ensemble Model Exploiting Black-Box Accuracy and
White-Box Intrinsic Interpretability. Algorithms 2020, 13, 17. [CrossRef]
8. Karlos, S.; Kostopoulos, G.; Kotsiantis, S. A Soft-Voting Ensemble Based Co-Training Scheme Using Static
Selection for Binary Classification Problems. Algorithms 2020, 13, 26. [CrossRef]
9. Demertzis, K.; Iliadis, L. GeoAI: A Model-Agnostic Meta-Ensemble Zero-Shot Learning Method for
Hyperspectral Image Analysis and Classification. Algorithms 2020, 13, 61. [CrossRef]
10. Zvarevashe, K.; Olugbara, O. Ensemble Learning of Hybrid Acoustic Features for Speech Emotion
Recognition. Algorithms 2020, 13, 70. [CrossRef]
11. Haralabopoulos, G.; Anagnostopoulos, I.; McAuley, D. Ensemble Deep Learning for Multilabel Binary
Classification of User-Generated Content. Algorithms 2020, 13, 83. [CrossRef]
12. Livieris, I.E.; Pintelas, E.; Stavroyiannis, S.; Pintelas, P. Ensemble Deep Learning Models for Forecasting
Cryptocurrency Time-Series. Algorithms 2020, 13, 121. [CrossRef]

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
