Diabetic Retinopathy (DR), also known as diabetic eye disease, is damage to the retina caused by diabetes that can eventually lead to blindness. Early detection of the disease is therefore needed; manual detection is time consuming and prone to observation error. Hence, several computer-aided systems have been introduced to provide fast and consistent diagnostic aid for the biomedical and health informatics field. This work considers diabetic retinopathy detection methods that use machine learning techniques. One system uses classifiers such as the Gaussian Mixture Model (GMM), k-nearest neighbor (kNN), and support vector machine (SVM), while another uses GMM, kNN, SVM, and combinations of classifiers for classifying retinal fundus images.
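As a minimal sketch of the classifier comparison described above, the snippet below trains kNN and SVM classifiers and uses one Gaussian Mixture Model per class as a generative classifier. The feature extraction from fundus images is assumed; synthetic placeholder features stand in for the real retinal data.

```python
# Hypothetical sketch: comparing GMM, kNN and SVM classifiers on placeholder
# "fundus image" feature vectors (real feature extraction is assumed).
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in for extracted features: 200 images x 10 features, 2 classes (normal / DR).
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)), rng.normal(1.5, 1.0, (100, 10))])
y = np.array([0] * 100 + [1] * 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# kNN and SVM: discriminative classifiers.
for name, clf in [("kNN", KNeighborsClassifier(5)), ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))

# GMM as a generative classifier: fit one mixture per class and pick the class
# whose mixture gives the higher log-likelihood for each test sample.
gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(X_tr[y_tr == c]) for c in (0, 1)}
scores = np.column_stack([gmms[c].score_samples(X_te) for c in (0, 1)])
print("GMM accuracy:", np.mean(scores.argmax(axis=1) == y_te))
```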
There are various methods of handling Optimal Binary Search Trees (OBSTs) in order to improve performance. One of them is dynamic programming, which incurs O(n^3) time complexity and stores the involved computations in a table. The data mining technique called data preprocessing is used to remove noise from the dataset early and to enhance the consistency of the given data. Post dynamic computing then applies the dynamic programming principle starting with only the required data and computing only the attributes necessary to construct the Optimal Binary Search Tree, with time complexity O(n) when there are n identifiers, integers, or other complex objects. This approach avoids computing all table attributes. Hence, the cost of post dynamic computing using dynamic programming is shown experimentally to be less than O(n^3), and in some cases even lower.
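For reference, the sketch below implements the classic O(n^3) dynamic-programming baseline that the abstract compares against; the proposed post dynamic computing variant is not specified in the abstract, so it is not reproduced. The probabilities used at the end are arbitrary example values.

```python
# Classic O(n^3) dynamic-programming cost computation for an optimal BST, the
# baseline the abstract refers to. p[i] is the search probability of key i+1
# (keys are treated as 1-indexed in the tables).
def optimal_bst_cost(p):
    n = len(p)
    cost = [[0.0] * (n + 2) for _ in range(n + 2)]    # cost[i][j]: min expected cost over keys i..j
    weight = [[0.0] * (n + 2) for _ in range(n + 2)]  # weight[i][j]: total probability of keys i..j
    for i in range(1, n + 1):
        cost[i][i] = weight[i][i] = p[i - 1]
    for length in range(2, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            weight[i][j] = weight[i][j - 1] + p[j - 1]
            # Try every key r in i..j as the root: O(n) work per cell -> O(n^3) overall.
            cost[i][j] = weight[i][j] + min(
                (cost[i][r - 1] if r > i else 0.0) + (cost[r + 1][j] if r < j else 0.0)
                for r in range(i, j + 1)
            )
    return cost[1][n]

print(optimal_bst_cost([0.15, 0.10, 0.05, 0.10, 0.20]))  # example probabilities
```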
Cloud computing is an internet-based technology that provides virtualized computer resources over the internet. Distributing the dynamic workload evenly among the computing resources in the cloud, so that no single node is overloaded or underloaded, is called load balancing. An efficient load balancer increases the performance of the cloud, maximizes cloud services, and improves resource utilization. The performance of a cloud today depends on many factors, and load balancing is one of the main ones. In this paper we propose a load balancing algorithm that is a variant of the Weighted Least Connection (WLC) algorithm. The proposed algorithm shows better results in several aspects, such as accurately calculating the workload on a resource, distributing the workload efficiently across the service nodes, improving response time, and minimizing the overall task execution time.
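As a minimal sketch of the baseline the proposal builds on, the snippet below shows Weighted Least Connection style dispatching: each request goes to the node with the smallest active-connections-to-weight ratio. The paper's specific refinements to workload calculation are not reproduced, and the node names are illustrative.

```python
# Hypothetical sketch of Weighted Least Connection (WLC) style node selection:
# a request is dispatched to the node with the smallest connections/weight ratio.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    weight: float          # static capacity weight of the service node
    connections: int = 0   # currently active connections (proxy for load)

def pick_node(nodes):
    # Lower connections-per-unit-weight means more spare capacity.
    return min(nodes, key=lambda n: n.connections / n.weight)

nodes = [Node("vm-1", 4.0), Node("vm-2", 2.0), Node("vm-3", 1.0)]
for task in range(6):
    target = pick_node(nodes)
    target.connections += 1
    print(f"task {task} -> {target.name}")
```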
This paper proposes a novel universal steganalysis framework that can be applied to spatial-domain and JPEG-domain steganography algorithms. The objective is to develop a steganalysis algorithm that can identify any distribution (uniform or non-uniform) of stego payloads. The proposed framework uses a 3-way tensor model to extract image features, which is important for estimating the embedding changes irrespective of domain. To obtain accurate results and to analyze the error, a 360-degree bit-change estimation is performed. The experimental results, evaluated on 3000 images, show a good detection rate in both domains and reasonable false acceptance and false rejection rates with respect to payload when tested against most steganography algorithms.
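The abstract does not define the tensor model, so the following is only one plausible illustration of building a 3-way image tensor: the image is stacked with simple horizontal and vertical residual planes and unfolded into a feature vector. The paper's actual feature construction and 360-degree bit-change estimation may differ.

```python
# Illustrative (assumed) 3-way tensor feature construction: stack an image with its
# horizontal and vertical difference residuals into an H x W x 3 tensor and unfold
# it into a feature vector. This only sketches the idea of a tensor-based feature.
import numpy as np

def tensor_features(img):
    img = img.astype(np.float64)
    dh = np.zeros_like(img); dh[:, 1:] = np.diff(img, axis=1)   # horizontal residual
    dv = np.zeros_like(img); dv[1:, :] = np.diff(img, axis=0)   # vertical residual
    tensor = np.stack([img, dh, dv], axis=2)                    # 3-way tensor, H x W x 3
    # Unfold: one column per slice, concatenated into a single feature vector.
    return tensor.reshape(-1, 3).T.ravel()

cover = np.random.default_rng(1).integers(0, 256, (8, 8))
print(tensor_features(cover).shape)   # (8 * 8 * 3,) feature vector
```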
Data leak detection plays a major role in organizations and industry. Data leaks pose serious threats to online social media, sensitive data, and so on. We consider two papers for this survey, both belonging to the area of information forensics and security. The first paper develops a model (PPDLD) based on the fuzzy fingerprint method; its goal is to generate a special type of digest called a fuzzy fingerprint. The Rabin fingerprint algorithm is used there for sampling, and a filtering method is applied during the digest process. The second paper deals with fast detection of transformed data leaks. It suggests a privacy-preserving method based on an alignment algorithm, aiming to detect long and inexact leak patterns in sensitive data and network traffic, with detection based on a comparable sampling algorithm.
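To make the fingerprinting-plus-sampling step concrete, the sketch below computes rolling polynomial fingerprints over k-grams of sensitive data and keeps a sampled subset as a digest, then checks outbound traffic for shared fingerprints. This is a simplified stand-in: the fuzzification and filtering of the surveyed PPDLD model are not shown, and the parameters and strings are illustrative.

```python
# Simplified sketch: rolling fingerprints over k-grams of sensitive data, with sampling
# (keep only fingerprints that fall in a sampled residue class), then intersect with the
# fingerprints of monitored traffic to flag possible leaks.
BASE, MOD, K = 257, (1 << 31) - 1, 8

def fingerprints(data: bytes):
    """Yield (offset, fingerprint) for every k-gram using a rolling hash."""
    if len(data) < K:
        return
    h, high = 0, pow(BASE, K - 1, MOD)
    for i, b in enumerate(data):
        h = (h * BASE + b) % MOD
        if i >= K - 1:
            yield i - K + 1, h
            h = (h - data[i - K + 1] * high) % MOD   # drop the leading byte of the window

def sampled_digest(data: bytes, sample_mod=16):
    # Keep roughly 1/sample_mod of the fingerprints as the digest.
    return {fp for _, fp in fingerprints(data) if fp % sample_mod == 0}

sensitive = b"account=12345; ssn=999-99-9999; internal-only"
traffic = b"GET /x HTTP/1.1 ... ssn=999-99-9999 leaked in a request body"
overlap = sampled_digest(sensitive) & sampled_digest(traffic)
print("shared sampled fingerprints:", len(overlap))
```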
Language is a hallmark of intelligence, and endowing computers with the ability to analyze and generate language, a field of research known as Natural Language Processing (NLP), has long been a dream of Artificial Intelligence. Software requirements are typically captured in natural languages (NL) such as English and then analyzed by software engineers to generate a formal software design or model. However, English is syntactically ambiguous and semantically inconsistent. Hence, English specifications of software requirements can not only result in erroneous and absurd software designs and implementations, but the informal nature of English is also a main obstacle to machine processing of complex English specifications of software requirements. To tackle this key problem, a controlled NL representation for software requirements is needed so that precise and consistent software models can be generated. The proposed framework aims to model complex software requirements expressed in natural language with a new methodology that captures the natural language understanding (NLU) of events and models them using Stochastic Petri Nets (SPNs), instead of only an intermediate graph-based structure, using NLP techniques; this helps remove ambiguity and supports correct interpretation of the requirements. To eliminate ambiguity, the work combines all the different meanings (SPN graphs) of each ambiguous sentence into a colored SPN graph. SPNs are state machines that make the combined SPN graph easier to visualize. They can also represent knowledge about the requirement, which can be used to derive test cases in the early development phases. Hence the aim of the proposed work is twofold: to overcome the problems of ambiguity and of knowledge representation. The stakeholder's document is input to the framework and pre-processed by a pre-filter with certain functionality to improve parsing. The parsed output is converted into a simple graph, which in turn is converted into an SPN graph with color representation to resolve ambiguity. The pre-filter may be designed with self-learning capabilities to improve the output without human involvement.
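The merging step can be illustrated with a deliberately simplified data structure: each candidate interpretation of an ambiguous sentence is an event-transition edge list, and the interpretations are merged into one graph whose edges record the interpretations ("colors") they belong to. The NLP pipeline and the Stochastic Petri Net semantics of the actual framework are not reproduced; the sentences and event names below are hypothetical.

```python
# Minimal illustration (assumed representation): merge two candidate interpretations of
# an ambiguous requirement into one "colored" event graph, where each edge records the
# interpretations it belongs to. The real framework builds Stochastic Petri Nets from
# parsed text; that machinery is not shown here.
from collections import defaultdict

# Two readings of "the system notifies the admin and the user logs out".
interpretation_a = [("request", "notify_admin"), ("notify_admin", "user_logout")]
interpretation_b = [("request", "user_logout"), ("user_logout", "notify_admin")]

def merge_colored(*interpretations):
    colored = defaultdict(set)              # edge -> set of interpretation colors
    for color, edges in enumerate(interpretations):
        for edge in edges:
            colored[edge].add(color)
    return dict(colored)

for edge, colors in merge_colored(interpretation_a, interpretation_b).items():
    print(edge, "colors:", sorted(colors))
```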
Thyroid disease is one of the common diseases found in human beings. Diseases of the thyroid gland range from low production to high production of the thyroid hormone. It is always recommended to diagnose the disease at an early stage in order to prevent further harmful effects and to provide treatment that keeps the thyroid hormone at a normal level. Data mining plays a vital role in health care applications and is used to analyze large volumes of data. One of the important tasks in data mining is predicting disease at an early stage, which assists physicians in giving better treatment to patients. Classification is one of the most significant data mining techniques: it is a supervised learning approach used to classify predefined data sets. Data mining techniques are mainly used in healthcare organizations for decision making, diagnosing diseases, and giving better treatment to patients. The hypothyroid data set used for this study is taken from the University of California Irvine (UCI) data repository. The research work is carried out with the Waikato Environment for Knowledge Analysis (WEKA) open-source software under a Windows 7 environment. An experimental study is carried out using data mining techniques such as the J48 and Decision Stump tree classifiers. The data records are classified as negative, compensated, primary, and secondary hypothyroid. The performance of both classification techniques is evaluated and their accuracy is compared through confusion matrices. It is concluded that J48 gives better accuracy than the Decision Stump technique.
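A rough analogue of the J48 versus Decision Stump comparison can be sketched with scikit-learn stand-ins: a full decision tree approximates WEKA's J48 (a C4.5-style tree) and a depth-1 tree approximates the Decision Stump, each evaluated with accuracy and a confusion matrix. Placeholder data is used here instead of the UCI hypothyroid set, which the study itself loads in WEKA.

```python
# Sketch of the J48 vs. Decision Stump comparison using scikit-learn stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))                      # stand-in for thyroid attributes
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # stand-in for negative / hypothyroid labels
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {"J48-like tree": DecisionTreeClassifier(random_state=0),
          "Decision stump": DecisionTreeClassifier(max_depth=1, random_state=0)}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name, "accuracy:", accuracy_score(y_te, pred))
    print(confusion_matrix(y_te, pred))
```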
There are various pattern matching algorithms that take many comparisons to find a given pattern in a text and that are static and restrictive. In order to search for a pattern, or a substring of a pattern, in the text with fewer comparisons, a general data mining technique called data preprocessing is used, here named D-PM using DP, with the help of a one-time-look indexing method. D-PM using DP finds a given pattern, or a substring of a given pattern, in the text in less time, and its time complexity is lower than that of existing pattern matching algorithms. The new pattern matching algorithm with data preprocessing (D-PM using DP) provides pattern matching with dynamic search behavior and gives users flexibility in searching.
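The "preprocess once, search many times" idea can be illustrated as below: the text is indexed one time into character-position lists, and a pattern (or substring of a pattern) is located by anchoring on its rarest character and verifying candidates. The abstract does not specify the actual D-PM using DP algorithm, so this is only an illustrative indexing scheme.

```python
# Hypothetical sketch of one-time-look indexing for pattern search: build a character
# -> positions index once, then locate patterns by checking only candidate offsets.
from collections import defaultdict

def build_index(text):
    index = defaultdict(list)
    for i, ch in enumerate(text):
        index[ch].append(i)
    return index

def search(text, index, pattern):
    if not pattern:
        return []
    # Anchor on the pattern character with the fewest occurrences in the text.
    k, anchor = min(enumerate(pattern), key=lambda kc: len(index.get(kc[1], [])))
    hits = []
    for pos in index.get(anchor, []):
        start = pos - k
        if start >= 0 and text.startswith(pattern, start):
            hits.append(start)
    return sorted(hits)

text = "abracadabra"
idx = build_index(text)            # one-time preprocessing of the text
print(search(text, idx, "abra"))   # [0, 7]
print(search(text, idx, "cad"))    # [4]
```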
Constraint-based pattern mining and association rules are used in many applications, such as genetic sequence analysis, bankruptcy prediction in finance, fraud detection in securities, and plant classification in agriculture, to extract knowledge of interest to the user. Constraints are useful for eliminating unwanted rules and also help solve the rule explosion problem. Many algorithms have been proposed for constraint-based pattern mining and association rule generation. These constraints take the form of attribute, item length, time or duration, regular expression, and similar restrictions. Pushing constraints into the mining process yields discoveries that are of interest to the user. The literature shows that the performance of an algorithm improves when constraints are applied during the mining process. This paper presents a literature survey on the use of constraints in the generation of association rules, covering different categories of constraints and their properties.
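As a small illustration of constraint pushing, the sketch below runs Apriori-style frequent-itemset mining in which an item-length constraint bounds candidate generation itself, so itemsets that violate the constraint are never produced. Rule generation and the other constraint categories surveyed are omitted; the transactions are toy data.

```python
# Minimal sketch of pushing an anti-monotone constraint (itemset length <= max_len)
# into Apriori-style frequent-itemset mining.
def apriori_with_length_constraint(transactions, min_support, max_len):
    transactions = [set(t) for t in transactions]
    items = sorted({i for t in transactions for i in t})

    def support(itemset):
        return sum(itemset <= t for t in transactions) / len(transactions)

    frequent, level, k = [], [frozenset([i]) for i in items], 1
    while level and k <= max_len:              # length constraint pushed into the mining loop
        level = [c for c in level if support(c) >= min_support]
        frequent.extend(level)
        k += 1
        # Generate k-item candidates by joining frequent (k-1)-item sets.
        level = list({a | b for a in level for b in level if len(a | b) == k})
    return frequent

data = [["milk", "bread"], ["milk", "bread", "eggs"], ["bread", "eggs"], ["milk", "eggs"]]
for itemset in apriori_with_length_constraint(data, min_support=0.5, max_len=2):
    print(sorted(itemset))
```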
User authentication is a crucial service in wireless sensor networks (WSNs) because wireless sensor nodes are typically deployed in unattended environments, leaving them open to possible hostile network attacks. The main goal of this research is to authenticate remote users in a convenient and secure manner. In this paper, we propose an Artificial Bee Colony (ABC) optimization algorithm for matching, applied to user authentication in hierarchical wireless sensor networks using biometric (fingerprint) data. In the proposed scheme, the ABC algorithm calculates the standard deviation (threshold value) from the biometric fingerprint data, which is used for user authentication with maximum fitness in an optimized and secure manner.
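The thresholding idea can be sketched in a heavily simplified form: a standard-deviation-derived threshold is computed from enrolled fingerprint feature vectors and a probe is accepted if it lies within that threshold of the enrollment template. The ABC optimization loop that the paper uses to tune the threshold for maximum fitness is not reproduced, and all data below is synthetic.

```python
# Simplified sketch (assumed representation): a standard-deviation-based threshold for
# matching fingerprint feature vectors. The ABC optimization that tunes this threshold
# in the paper is not shown.
import numpy as np

rng = np.random.default_rng(0)
enrolled = rng.normal(0.5, 0.05, size=(10, 16))          # 10 enrollment samples, 16 features
template = enrolled.mean(axis=0)
# Threshold derived from the per-feature standard deviation; the scale factor is chosen
# ad hoc here, whereas the paper tunes it via ABC optimization.
threshold = 2.0 * np.linalg.norm(enrolled.std(axis=0))

def authenticate(probe):
    return np.linalg.norm(probe - template) <= threshold

genuine = rng.normal(0.5, 0.05, size=16)
impostor = rng.normal(0.8, 0.05, size=16)
print("genuine accepted:", authenticate(genuine))
print("impostor accepted:", authenticate(impostor))
```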
Generally, the huge data sets of any organization contain redundancy, noise, and inconsistency. To eliminate these, data preprocessing should be performed on the raw data before a sorting technique is applied. Data preprocessing includes methods such as data cleaning, data integration, data transformation, and data reduction. Depending on the complexity of the given data, these methods are selected and applied to the raw data in order to produce quality data. Then, external sorting is applied. The proposed external sorting takes fewer passes than the log_B(N/M) + 1 passes of traditional B-way external merge sorting. The number of inputs/outputs of the proposed method is also less than the 2*N*(log_B(N/M) + 1) inputs/outputs of the traditional method, and the proposed method uses fewer runs than the basic external sort.
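The traditional cost figures quoted above can be worked through numerically: with N pages of data, M pages of memory, and merge fan-in B, the classic external merge sort needs ceil(log_B(N/M)) + 1 passes and about 2*N*passes page I/Os. The sketch below computes these baseline figures for two illustrative parameter sets; the proposed method's reduced pass count is not modeled.

```python
# Worked example of the traditional B-way external merge sort costs quoted in the abstract.
import math

def merge_sort_costs(N, M, B):
    runs = math.ceil(N / M)        # sorted runs produced by the initial pass
    passes = 1                     # the run-creation pass itself
    while runs > 1:                # each merge pass reduces the run count by a factor of B
        runs = math.ceil(runs / B)
        passes += 1
    ios = 2 * N * passes           # every pass reads and writes all N pages
    return passes, ios             # passes equals ceil(log_B(N / M)) + 1

for N, M, B in [(10_000, 100, 10), (1_000_000, 1_000, 50)]:
    passes, ios = merge_sort_costs(N, M, B)
    print(f"N={N}, M={M}, B={B}: passes={passes}, I/Os={ios}")
```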
Image segmentation is a very important application in biomedical diagnosis based on image data analysis. In medical analysis, the accuracy of image segmentation is a critical clinical requirement for the localization of body organs or pathologies in order to raise the quality of prediction of diseases or infections. This review covers several articles in which recent AI biomedical image segmentation techniques are applied to different imaging color space models. It describes how various computer-assisted diagnosis systems work toward the goal of finding abnormal segments of body organs in biomedical images such as MRI and ultrasound. It has been observed that the segmentation approaches that broadly give accurate results are those in which segmentation is performed by defining an active shape model and then localizing the potential area of interest using thresholding.
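The thresholding step highlighted by the review can be illustrated with a minimal numpy sketch that localizes a bright region of interest in a synthetic image by intensity thresholding; the active shape model stage that precedes it in the reviewed systems is not shown.

```python
# Minimal sketch of threshold-based localization of a region of interest.
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(50, 10, size=(64, 64))      # synthetic background intensities
image[20:35, 25:45] += 80                      # synthetic bright lesion-like region

threshold = image.mean() + 2 * image.std()     # simple global intensity threshold
mask = image > threshold                       # binary segmentation mask

ys, xs = np.nonzero(mask)
print("segmented pixels:", mask.sum())
print("bounding box:", (ys.min(), xs.min()), "to", (ys.max(), xs.max()))
```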
Security-typed programming languages aim to track insecure information flows in application programs. This is achieved by extending data types with security labels in order to identify the confidentiality and integrity policies for each data element. Such policies specify which principals or entities are allowed to read from or write to a data value, respectively. In this paper, we evaluate the run-time overhead of dynamic information flow (DIF) analysis in security-typed programming languages. Such analysis is performed by including the security labeling in the dynamic operational semantics. Our evaluation mechanism relies on developing two different language implementations for a simple While programming language that is considered as a case study. The first is a traditional interpreter that implements the ordinary operational semantics of the language without security labeling of data types and hence performs no information flow analysis. The second is an interpreter that performs dynamic information flow analysis by implementing the security labeling semantics, where language data types are augmented with security labels. Next, the execution times of a program run under both interpreters are measured (one execution time for each interpreter). The resulting difference in execution time represents the absolute run-time overhead of dynamic information flow analysis. We have calculated this difference in execution time for several benchmark programs executed using both implementations.
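The measurement idea can be sketched with a toy stand-in for the two interpreters: the same workload is run once on plain values and once on label-carrying (value, label) pairs whose operations also join security labels, and the wall-clock difference approximates the DIF overhead. The actual While-language interpreters and benchmarks of the paper are not reproduced.

```python
# Toy stand-in for the overhead measurement: plain execution vs. label-tracking execution.
import time

def plain_workload(n):
    acc = 0
    for i in range(n):
        acc = acc + i
    return acc

def labeled_workload(n):
    # A labeled value is (value, label); the label join for {LOW, HIGH} is max.
    acc = (0, 0)                       # 0 = LOW, 1 = HIGH
    secret = (1, 1)                    # a HIGH-labeled constant mixed into the flow
    for i in range(n):
        acc = (acc[0] + i + secret[0], max(acc[1], secret[1]))
    return acc

def timed(fn, n):
    start = time.perf_counter()
    fn(n)
    return time.perf_counter() - start

n = 1_000_000
t_plain, t_labeled = timed(plain_workload, n), timed(labeled_workload, n)
print(f"plain: {t_plain:.3f}s  labeled: {t_labeled:.3f}s  overhead: {t_labeled - t_plain:.3f}s")
```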
In this work, an approach to clone analysis and vulnerability detection for web applications is proposed, together with a prototype implementation for web pages. Our approach analyzes the page structure, as implemented by specific sequences of HTML tags, and the content displayed, for both dynamic and static pages. Moreover, for a pair of web pages we also consider the similarity degree of their Java source. The similarity degree can be adapted and tuned in a simple way for different web applications. We report the results of applying our approach and tool in a case study. The results confirm that a lack of analysis and design in a web application contributes to the duplication of pages. In particular, these results allowed us to identify some common features of the web pages that could be integrated by deleting the duplications and code clones. Moreover, the clone analysis and vulnerability detection of the pages provided information for improving the general quality and the conceptual/database design of the web application. Indeed, we plan to exploit the results of the code clone analysis method to support web application reengineering activities.
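A minimal sketch of the structural-similarity part of this idea is shown below: each page's HTML tag sequence is extracted with Python's html.parser and two sequences are compared with difflib. The displayed-content comparison, the Java source similarity, and the tuning described above are omitted; the two pages are hypothetical near-clones.

```python
# Minimal sketch of structural page similarity via HTML tag sequences.
from difflib import SequenceMatcher
from html.parser import HTMLParser

class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def tag_sequence(html):
    collector = TagCollector()
    collector.feed(html)
    return collector.tags

page_a = "<html><body><div><h1>Report</h1><table><tr><td>1</td></tr></table></div></body></html>"
page_b = "<html><body><div><h1>Invoice</h1><table><tr><td>2</td></tr></table></div></body></html>"

similarity = SequenceMatcher(None, tag_sequence(page_a), tag_sequence(page_b)).ratio()
print(f"structural similarity: {similarity:.2f}")   # near-clone pages give a ratio close to 1.0
```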
Communication security is an essential and increasingly difficult issue in wireless networks. A physical-layer approach to secret key generation that is both fast and independent of channel variations is considered. This approach lets a receiver jam the signal in a manner that still permits it to decrypt the data, yet prevents other nodes from decoding it. Another well-known approach for achieving information-theoretic secrecy relies on deploying artificial noise to blind intruders' interception at the physical layer. A multiple inter-symbol obfuscation (MIO) scheme is proposed, which utilizes a set of artificial noisy symbols to obfuscate the original data symbols at the physical layer. MIO can effectively enhance the security of wireless communications.
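The obfuscation idea can be illustrated in a heavily simplified form: the sender mixes each data symbol with an artificial noisy symbol shared with the legitimate receiver (here via XOR over bytes), so an eavesdropper without the artificial-symbol set cannot recover the data. The real MIO scheme operates on physical-layer symbols and is not reproduced here.

```python
# Heavily simplified illustration of inter-symbol obfuscation with shared artificial symbols.
import secrets

def obfuscate(data: bytes, artificial: bytes) -> bytes:
    return bytes(d ^ a for d, a in zip(data, artificial))

message = b"sensor reading: 23.7C"
artificial_symbols = secrets.token_bytes(len(message))   # shared set of artificial noisy symbols

transmitted = obfuscate(message, artificial_symbols)      # what an eavesdropper observes
recovered = obfuscate(transmitted, artificial_symbols)    # legitimate receiver de-obfuscates

print("eavesdropper view:", transmitted.hex())
print("receiver recovers:", recovered)
```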
This paper presents a cost-effective product to automatically monitor and detect outages in villages that lack a reliable electricity supply and face intermittent outages. The product could prevent the malpractice and corruption of linemen who ignore outage complaints and delay the restoration process, and thereby make the electricity supply trustworthy.
Clones are pieces of software created by copying original software. More specifically, the idea behind software cloning is to create new software that replicates the appearance and functionality of the original software as closely as possible. It is important to understand that cloning does not have to involve any source code from the original software; software cloning typically occurs when the source code of the original software is not available. As a result, software cloning does not imply source code copying, and it goes well beyond simply presenting a similar user interface: the goal of cloning is to create a new program that mimics everything the original software does and the way in which it does it.
Minimal path techniques can efficiently extract curve-like structures by optimally finding the globally minimal-cost path between two seed points. The first method considered is a novel minimal-path-based algorithm that works on more general curve structures with fewer demands on the user for initial input compared to prior minimal-path algorithms. The main novelties and benefits of this approach are that it can find both closed and open curves, including complex topologies containing multiple branch points and multiple closed cycles, without prior knowledge of which of these types is to be extracted, and that it requires only one input point, which, in contrast to older methods, is no longer constrained to be an endpoint of the desired curve but may truly be any point along it. The second method, MPP-BT (Minimal Path Propagation with Backtracking), first applies a minimal path propagation from a single starting point and then, at each reached point, backtracks a few steps toward the starting point. Researchers in different areas such as geometric optics, computer vision, robotics, and wire routing have previously solved related minimum-cost path problems using graph search and dynamic programming principles.
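Both methods build on the same foundation: minimum-cost path propagation over a cost map. The sketch below finds a two-seed minimal-cost path on a small 2D cost grid with Dijkstra-style propagation; the single-seed generalization of the first method and the backtracking of MPP-BT are not reproduced, and the cost grid is a toy example.

```python
# Minimal sketch of the minimum-cost path machinery underlying both methods:
# Dijkstra propagation over a 2D cost map between two seed points.
import heapq

def minimal_path(cost, start, goal):
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev, heap = {}, [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue                              # stale heap entry
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)], prev[(nr, nc)] = nd, (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1]

# Low-cost "vessel" running along the middle row of a high-cost background.
cost = [[9] * 6 for _ in range(5)]
for c in range(6):
    cost[2][c] = 1
print(minimal_path(cost, (2, 0), (2, 5)))   # the path follows the low-cost row
```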
Wireless sensor networks are networks in which sensor nodes sense environmental conditions and pass the sensed information to a base station. Such networks are deployed in remote places such as forests and deserts. Sensor nodes are very small, which makes it very difficult to recharge or replace their batteries. Various techniques have been proposed to reduce the energy consumption of the network; among them, clustering is an efficient technique for reducing the energy consumption of sensor networks. Clustering is of two types, dynamic and static, and in this article techniques of both types are reviewed and compared in terms of various parameters.
The National Conference on Recent Innovative Trends in Engineering and Technology (NCRITET-2016) aims to provide an opportunity for academicians, researchers, scientists, and industry experts engaged in teaching, research, and development, and to give them a platform to present and discuss ideas and share their views on solving real-world complex challenges in Engineering and Technology.