International Journal of Recent Technology and Engineering (IJRTE), 2019
Reliability is one of the most critical and fundamental aspects in evaluating any software. With the rapid growth in the use of software, concerns about software trustworthiness are also increasing. This motivates the authors to evaluate software systems rigorously through entropy-based combination weights (CW) methods. These weights are the result of mathematical computation and are based on experts' opinions. The entropy-based approach enables the authors to determine the importance of each criterion according to expert judgment and to remove bias from the weights by providing objective weights. In contrast, the Analytic Hierarchy Process (AHP), regarded as one of the principal precise methods for decision making with multiple criteria, has been studied extensively in the operations research literature and applied to countless real-world problems. The result of this research contributes to providing better j...
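The entropy weighting mentioned in this abstract can be sketched as follows. This is a minimal illustration of the standard entropy weight method, not the authors' full CW procedure; the decision matrix of alternative-by-criterion scores is a hypothetical example.

```python
# Minimal sketch of the entropy weight method for deriving objective
# criteria weights. Rows of the matrix are alternatives (software
# systems), columns are evaluation criteria; values are hypothetical.
import math

def entropy_weights(matrix):
    """Return objective criteria weights via the entropy method."""
    m = len(matrix)          # number of alternatives
    n = len(matrix[0])       # number of criteria
    k = 1.0 / math.log(m)    # normalization constant
    diversification = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        # p_ij: share of alternative i in criterion j
        p = [x / total for x in col]
        # Shannon entropy of the criterion (0 * log 0 treated as 0)
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        diversification.append(1.0 - e)
    s = sum(diversification)
    return [d / s for d in diversification]

# Hypothetical scores of 3 software systems on 3 reliability criteria.
scores = [[0.7, 0.9, 0.6],
          [0.8, 0.9, 0.4],
          [0.6, 0.9, 0.9]]
w = entropy_weights(scores)
# Criterion 2 is identical across alternatives, so it carries
# (numerically) zero weight: it cannot discriminate between systems.
```

A criterion on which all alternatives score the same has maximum entropy and therefore receives no weight, which is the sense in which these weights are "objective".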
In this paper we investigate the trade-offs in selecting an accurate, robust, and cost-effective model for binary classification problems. Through empirical observation, we present an evaluation of one-class and two-class classification models. The experiments are conducted with four two-class and one-class classifier models on five UCI datasets, and the classification models are evaluated with the Receiver Operating Characteristic (ROC) curve, cross-validation error, and the pair-wise Q-statistic measure. Our finding is that, in the presence of a large amount of relevant training data, two-class classifiers perform better than one-class classifiers for binary classification, owing to the two-class classifier's ability to use negative data samples in its decisions. When sufficient training data is not available, the one-class classification model performs better.
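The ROC-based evaluation used in this study can be illustrated with the rank (Mann-Whitney) formulation of the area under the ROC curve: the probability that a randomly chosen positive sample scores above a randomly chosen negative one. The labels and scores below are hypothetical, not the paper's data.

```python
# Minimal sketch of ROC AUC computed via the rank (Mann-Whitney)
# statistic, as used when comparing classifiers with ROC curves.

def roc_auc(labels, scores):
    """AUC = P(random positive scores above random negative)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5  # ties count half
    return wins / (len(pos) * len(neg))

# Hypothetical classifier scores for six test samples (1 = positive).
y_true = [1, 1, 1, 0, 0, 0]
y_score = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
auc = roc_auc(y_true, y_score)  # 8 of 9 pairs ranked correctly: 8/9
```

An AUC of 0.5 corresponds to random scoring and 1.0 to perfect ranking, which is why it is a natural yardstick for comparing one-class and two-class models on the same test set.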
In this paper, new computational intelligence sequential hybrid architectures involving Genetic Programming (GP) and Group Method of Data Handling (GMDH), viz. GP-GMDH, are proposed. Three linear ensembles based on (i) the arithmetic mean, (ii) the geometric mean, and (iii) the harmonic mean are also developed. We also performed GP-based feature selection. The efficacy of Multiple Linear Regression (MLR), ...
Pattern classification is the science of making inferences from perceptual data, using tools from statistics, probability, computational geometry, machine learning, signal processing, and algorithm design. It is a supervised technique in which patterns are organized into groups sharing the same set of properties, solving the classification problem at the attribute level and mapping to an output space of two or more classes. Probabilistic Neural Networks (PNN) and K-Nearest Neighbors (KNN) are effectively used for this purpose. Both use training and testing data samples to build a model, but PNN has great difficulty handling huge datasets, so KNN algorithms are used to improve accuracy and the convergence rate; however, the computational cost becomes expensive, so genetic algorithms are used to design a classifier in which samples are divided into different class boundaries, and for each generation the accuracy of the algorithm improves until we get our ...
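The KNN classifier mentioned above can be sketched in a few lines: Euclidean distance to every training point, then a majority vote among the k closest. The two toy clusters are hypothetical, not from the paper.

```python
# Minimal sketch of K-Nearest Neighbors classification: Euclidean
# distance plus majority vote. The toy points below are hypothetical.
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = sorted(
        (math.dist(p, x), label) for p, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Two hypothetical classes clustered around (0, 0) and (5, 5).
X = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
y = ["a", "a", "a", "b", "b", "b"]
knn_predict(X, y, (0.5, 0.5))  # -> "a"
knn_predict(X, y, (5.5, 5.5))  # -> "b"
```

The cost concern raised in the abstract is visible here: every prediction scans the whole training set, which is exactly why index structures or evolutionary classifier design become attractive at scale.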
In this paper, a review of the existing literature on software reliability models based on machine learning techniques is presented. Software reliability is a very useful tool in determining software quality. Machine learning techniques can uncover hidden parameters affecting software fault prediction and explore the factors leading to software obsolescence. We categorize papers on software reliability, software fault prediction, software trustworthiness, and software reusability that apply machine learning techniques based on statistical inference, which can predict useful patterns from faulty-software databases of empirical datasets related to software testing. After studying the relevant papers on faults introduced during fault removal and faults already present, we propose a novel approach for identifying the most relevant parameters affecting software reliability using machine learning techniques.
In this paper, new computational intelligence sequential hybrid architectures involving Genetic Programming (GP) and Group Method of Data Handling (GMDH), viz. GP-GMDH, are proposed. Three linear ensembles based on (i) the arithmetic mean, (ii) the geometric mean, and (iii) the harmonic mean are also developed. We also performed GP-based feature selection. The efficacy of Multiple Linear Regression (MLR), Polynomial Regression, Support Vector Regression (SVR), Classification and Regression Tree (CART), Multivariate Adaptive Regression Splines (MARS), Multilayer Feed-Forward Neural Network (MLFF), Radial Basis Function Neural Network (RBF), Counter Propagation Neural Network (CPNN), Dynamic Evolving Neuro-Fuzzy Inference System (DENFIS), TreeNet, Group Method of Data Handling, and Genetic Programming is tested on the NASA dataset. Ten-fold cross-validation and t-tests are performed to check whether the performance gains of the developed hybrids are statistically significant.
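The three linear ensembles named in this abstract combine member predictions by arithmetic, geometric, and harmonic means. A minimal sketch, with hypothetical base-model outputs standing in for the regressors listed above:

```python
# Minimal sketch of the three linear ensembles: arithmetic, geometric,
# and harmonic means of member predictions. The member outputs below
# are hypothetical, not results from the paper.
import math

def arithmetic_mean(preds):
    return sum(preds) / len(preds)

def geometric_mean(preds):
    # assumes strictly positive predictions
    return math.exp(sum(math.log(p) for p in preds) / len(preds))

def harmonic_mean(preds):
    # assumes strictly positive predictions
    return len(preds) / sum(1.0 / p for p in preds)

# Hypothetical predictions from three base models for one test case.
member_preds = [2.0, 4.0, 8.0]
am = arithmetic_mean(member_preds)  # 14/3
gm = geometric_mean(member_preds)   # cube root of 64 = 4
hm = harmonic_mean(member_preds)    # 24/7; HM <= GM <= AM always holds
```

The geometric and harmonic variants damp the influence of large outlying predictions, which is the usual motivation for trying all three and testing the differences with a t-test, as the paper does.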
International Journal of System Assurance Engineering and Management
The major challenge is to validate software failure datasets by finding the unknown model parameters used. For software assurance, many previous attempts were based on classical classifiers such as Decision Tree, Naïve Bayes, and k-NN for software fault prediction. However, fault-prediction accuracy is very low because defect-prone modules are very few compared with defect-free modules. So, to solve the module fault classification problem and enhance reliability accuracy, a hybrid algorithm is proposed that uses particle swarm optimization and a modified genetic algorithm for feature selection, and bagging for effective classification of defective and non-defective modules in a dataset. This paper presents an empirical study on NASA Metrics Data Program datasets using the proposed hybrid algorithm; the results show that the proposed hybrid approach enhances classification accuracy compared with existing methods.
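The bagging stage of the hybrid above can be sketched as bootstrap resampling plus majority voting. The base learner here is a hypothetical one-feature threshold stump, standing in for whatever classifier the full PSO/GA pipeline would train; the module metrics are invented.

```python
# Minimal sketch of bagging: train simple classifiers on bootstrap
# resamples, then classify by majority vote. The threshold-stump base
# learner and the data are hypothetical illustrations.
import random

def train_stump(X, y):
    """Pick the threshold on feature 0 with the fewest training errors."""
    best = None
    for t in sorted(set(x[0] for x in X)):
        errs = sum((x[0] > t) != (label == 1) for x, label in zip(X, y))
        if best is None or errs < best[0]:
            best = (errs, t)
    t = best[1]
    return lambda x: 1 if x[0] > t else 0

def bagging_fit(X, y, n_estimators=15, seed=0):
    rng = random.Random(seed)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]  # bootstrap
        models.append(train_stump([X[i] for i in idx], [y[i] for i in idx]))
    # majority vote over the ensemble
    return lambda x: int(sum(m(x) for m in models) > n_estimators / 2)

# Hypothetical module metrics: feature 0 separates faulty (1) modules.
X = [(0.1,), (0.2,), (0.3,), (0.8,), (0.9,), (1.0,)]
y = [0, 0, 0, 1, 1, 1]
clf = bagging_fit(X, y)
clf((0.15,)), clf((0.95,))  # -> (0, 1)
```

Voting over many resampled learners stabilizes the decision, which matters on imbalanced fault datasets where any single classifier trained on few defective modules is noisy.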