2021 International Workshop on Impedance Spectroscopy (IWIS), 2021
The Extreme Learning Machine (ELM) is a learning algorithm for training a single hidden layer feed-forward neural network (SLFN). It offers good generalization performance, fast learning speed, and several other advantageous properties. The first step of an ELM is to assign random values to the input weights and biases; the output weights are then determined in a single step using the Moore-Penrose generalized inverse. The random initialization of weights and biases and a large number of hidden nodes can affect the performance of the ELM, and several optimizers have been proposed to increase generalization performance and produce more compact networks. This paper presents a comparative study of state-of-the-art ELM optimization using existing algorithms, namely the Optimally Pruned ELM (OP-ELM); the swarm intelligence-based ELMs built on the Grey Wolf Optimizer (GWO), Salp Swarm Algorithm (SSA), Bat Algorithm (BA), and Particle Swarm Optimization (PSO); and the Genetic Pruning Algorithm (GPA) and Enhanced Genetic Algorithm (EGA), which are pruning methods based on evolutionary algorithms. Results show that SSA-ELM improves accuracy on most benchmark datasets and outperforms the other compared methods, while OP-ELM gives the most compact model with a high level of accuracy.
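The single-step ELM training described above can be sketched as follows. This is a minimal illustration assuming a tanh activation and NumPy; the function names and shapes are my own, not the paper's implementation:

```python
import numpy as np

def elm_train(X, T, n_hidden, rng=np.random.default_rng(0)):
    """Train an ELM: random input weights, analytic output weights."""
    n_features = X.shape[1]
    W = rng.uniform(-1, 1, (n_features, n_hidden))  # random input weights
    b = rng.uniform(-1, 1, n_hidden)                # random biases
    H = np.tanh(X @ W + b)                          # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                    # Moore-Penrose pseudoinverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Unlike backpropagation, no iterative weight updates occur: the only learned parameters are the output weights `beta`, solved in closed form, which is what gives the ELM its fast training speed.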
2021 International Workshop on Impedance Spectroscopy (IWIS), 2021
Recently, many optimization algorithms have been applied to Feature Selection (FS) problems and continue to show very promising results. The Salp Swarm Algorithm (SSA) is one of the most effective meta-heuristic swarm-based optimization algorithms, and as it has proven its efficiency, several improvements have been proposed to further enhance its performance. In this context, this paper presents a comparative study of Salp Swarm Algorithm variants for the feature selection application. These improvement approaches aim to reduce the number of features and eliminate non-useful ones. The study focuses on three binary versions of SSA, namely the Binary Salp Swarm Algorithm (BSSA), Chaotic Salp Swarm Algorithm (CSSA), and Dynamic Salp Swarm Algorithm (DSSA). For this purpose, 13 UCI benchmark datasets were used. Based on the comparative results, we can conclude that the DSSA approach enhances the performance of the SSA algorithm and outperforms the other similar approaches in the literature.
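The core of any binary SSA variant is mapping a continuous salp position to a 0/1 feature mask and scoring that mask. A common sketch uses a sigmoid transfer function and a wrapper fitness that trades classification error against the number of selected features; the weight `alpha` and the function names here are illustrative assumptions, not the exact formulation of BSSA/CSSA/DSSA:

```python
import numpy as np

def binarize(position, rng=np.random.default_rng(0)):
    """Map a continuous position vector to a binary feature mask."""
    prob = 1.0 / (1.0 + np.exp(-position))          # sigmoid transfer function
    return (rng.random(position.shape) < prob).astype(int)

def fitness(mask, error_rate, alpha=0.99):
    """Wrapper fitness: weighted sum of classifier error and feature ratio."""
    # alpha close to 1 prioritizes accuracy; the remainder penalizes
    # the fraction of selected features (alpha=0.99 is a common choice).
    return alpha * error_rate + (1 - alpha) * mask.sum() / mask.size
```

In a full wrapper, `error_rate` would come from evaluating a classifier (e.g. k-NN) on the features where `mask == 1`; the SSA update rules then drive the positions toward masks with lower fitness.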
2021 18th International Multi-Conference on Systems, Signals & Devices (SSD), 2021
This paper presents a comparative study of binary swarm-optimization-based wrappers for ElectroMyography (EMG) feature selection. Time-domain and frequency-domain features are extracted from two EMG channels to evaluate the effect of each feature set on accuracy and computational cost. Six binary algorithms are used in this study, namely the Grey Wolf Optimizer (GWO), Whale Optimization Algorithm (WOA), Moth-Flame Optimization (MFO), Salp Swarm Algorithm (SSA), Bat Algorithm (BA), and Particle Swarm Optimization (PSO), in a machine learning pipeline for feature selection and classification. Results show that time-domain features alone are enough to give satisfactory classification accuracy. WOA gives the best average classification accuracy of 80.15% but needs more execution time; compared with the others, SSA is the best algorithm in terms of number of selected features, execution time, and fitness function, with 78.25% accuracy.
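Time-domain EMG features of the kind evaluated above are typically simple per-window statistics. A sketch of four standard ones (mean absolute value, waveform length, root mean square, zero crossings) follows; these are textbook definitions and not necessarily the exact feature set or threshold used in the paper:

```python
import numpy as np

def time_domain_features(x, zc_threshold=0.01):
    """Compute four standard time-domain EMG features for one window."""
    mav = np.mean(np.abs(x))                 # mean absolute value
    wl = np.sum(np.abs(np.diff(x)))          # waveform length
    rms = np.sqrt(np.mean(x ** 2))           # root mean square
    # zero crossings: sign changes whose amplitude step exceeds a
    # small threshold, to reject low-level noise
    zc = np.sum((x[:-1] * x[1:] < 0) &
                (np.abs(x[:-1] - x[1:]) > zc_threshold))
    return np.array([mav, wl, rms, zc])
```

Because such features are cheap to compute relative to an FFT, they support the paper's observation that time-domain features alone can reach satisfactory accuracy at lower computational cost.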
Papers by Hiba HELLARA