2018 International Conference on Innovative Trends in Computer Engineering (ITCE), 2018
In the last few years, cloud computing has become essential in research, industry, and business. Cloud computing is based on virtualization of IT resources, through which virtual storage and computing services are provided. Trust in both cloud providers and consumers is therefore a critical factor in enhancing the reliability and security of the cloud environment. Although many research studies are concerned with establishing trust in cloud service providers, trust in cloud consumers has received less attention. In this paper, a trust model is proposed to determine the trust of cloud consumers. The main component of the proposed model is the trust metric stage, whose function is to define the trust percentage for each consumer. Four techniques have been implemented in this stage: Particle Swarm Optimization (PSO), Multiple Regression (MR), the Analytic Hierarchy Process (AHP), and PSO-Multiple Regression (MR-PSO). According to the implementation results of these techniques, PSO is the most suitable technique for calculating the trust percentage. The performance of the proposed trust model has been evaluated against the Armor dataset; the experimental results show that the trust percentage produced by the model deviates by only 0.017 relative to that dataset.
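The abstract does not spell out how PSO yields a trust percentage, so the following is only a minimal sketch under assumptions: PSO tunes non-negative weights over a few consumer-behaviour metrics so that a weighted sum approximates known trust scores. The metric names, fitness function, and PSO constants are hypothetical, not the paper's formulation.

```python
import numpy as np

# Hypothetical sketch: use PSO to tune weights of consumer-behaviour
# metrics so that a weighted sum approximates known trust percentages.
# The attributes, fitness function, and constants below are illustrative
# assumptions, not the formulation used in the paper.

rng = np.random.default_rng(0)

# Toy data: rows = consumers, columns = behaviour metrics in [0, 1]
# (e.g. payment history, SLA compliance, anomaly score).
X = rng.random((50, 3))
target = X @ np.array([0.5, 0.3, 0.2])   # synthetic "true" trust scores

def fitness(w):
    """Mean squared error between weighted trust and target scores."""
    w = np.clip(w, 0, None)
    w = w / (w.sum() + 1e-12)            # weights form a convex combination
    return np.mean((X @ w - target) ** 2)

# Standard global-best PSO with inertia and cognitive/social terms.
n_particles, dim, iters = 20, 3, 100
w_inertia, c1, c2 = 0.7, 1.5, 1.5

pos = rng.random((n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w_inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

weights = np.clip(gbest, 0, None)
weights /= weights.sum()
print("learned weights:", weights)   # trust% of consumer i = X[i] @ weights * 100
```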
Journal of Applied Mathematics & Informatics, 2009
In this paper we are interested in characterizing and computing matrices X ∈ ℂ^(n×n) that satisfy e^X = A, that is, logarithms of A. The study proceeds along two lines. The first is concerned with a theoretical study of the solution set, S(A), of e^X = A. Along the second line, computational approaches are considered to compute the principal logarithm of A, Log A.
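As a quick sanity check of the defining relation e^X = A (not the computational approaches developed in the paper), one can use SciPy's built-in principal matrix logarithm:

```python
import numpy as np
from scipy.linalg import logm, expm

# Illustrative only: the paper develops its own computational approaches;
# here we simply verify the defining relation e^X = A with SciPy's logm.
# A should have no eigenvalues on the closed negative real axis for the
# principal logarithm Log A to be defined.
A = np.array([[4.0, 1.0],
              [0.0, 3.0]])

X = logm(A)                       # principal logarithm of A
print(np.allclose(expm(X), A))    # True: X is a logarithm of A
```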
Basic graph structures such as maximal independent sets (MIS's) have spurred much theoretical research in randomized and distributed algorithms, and have several applications in networking and distributed computing as well. However, the extant (distributed) algorithms for these problems do not necessarily guarantee fault-tolerance or load-balance properties. We propose and study "low-average degree" or "balanced" versions of such structures. Interestingly, in sharp contrast to, say, MIS's, it can be shown that checking whether a structure is balanced takes substantial time. Nevertheless, we are able to develop good sequential/distributed (randomized) algorithms for such balanced versions. We also complement our algorithms with several lower bounds. Randomization plays a key role in our upper and lower bound results.
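For readers unfamiliar with the base structure, here is a minimal sequential greedy MIS sketch; it is illustrative only and is not the paper's algorithm for balanced versions or its distributed variants.

```python
import random

# Illustrative sketch only: a simple sequential greedy MIS, included to
# fix ideas about the underlying structure. The paper's algorithms for
# *balanced* versions are more involved and are not reproduced here.

def greedy_mis(adj):
    """Return a maximal independent set of a graph given as an adjacency dict."""
    nodes = list(adj)
    random.shuffle(nodes)           # random order; different runs give different MISs
    in_set, blocked = set(), set()
    for v in nodes:
        if v not in blocked:
            in_set.add(v)
            blocked.add(v)
            blocked.update(adj[v])  # neighbours of a chosen node can never join
    return in_set

# A 4-cycle: both {0, 2} and {1, 3} are maximal independent sets.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(greedy_mis(adj))
```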
In this paper we discuss the problem of testing randomness, motivated by the need to evaluate the quality of different random number generators, which may not generate truly random numbers. Such generators are used by many practical applications, including computer simulations, cryptography, and the communications industry, where the quality of the randomness of the generated numbers affects the quality of these applications. In this paper we concentrate on one of the most popular approaches for testing randomness, the Poker test. In particular, two versions of the Poker test are known: the classical Poker test and the approximated Poker test, where the latter was motivated by the difficulties involved in implementing the classical approach at the time it was designed. Given a sequence of random numbers to be tested, the basic Poker approach divides this sequence into groups of five numbers, observes which of the possible patterns is matched by each quintuple, computes the occurrences...
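As an illustration of the mechanics just described, here is a small sketch of a Poker-style test over decimal digits, using the common simplification that classifies each quintuple by its number of distinct values; the paper's exact classical and approximated formulations may differ.

```python
import random
from collections import Counter

# Illustrative sketch of a Poker-style randomness test over decimal digits,
# classifying each quintuple by its number of distinct values. This is the
# common simplified variant, not necessarily the paper's formulation.

# P(exactly r distinct values in 5 draws from 10 symbols)
#   = 10!/(10-r)! * S(5, r) / 10^5,
# with Stirling numbers of the second kind S(5, 1..5) = 1, 15, 25, 10, 1.
STIRLING = {1: 1, 2: 15, 3: 25, 4: 10, 5: 1}

def falling(n, r):
    """Falling factorial n * (n-1) * ... * (n-r+1)."""
    out = 1
    for i in range(r):
        out *= n - i
    return out

def poker_test(digits):
    """Chi-square statistic of the poker test on a sequence of digits 0-9."""
    groups = [digits[i:i + 5] for i in range(0, len(digits) - 4, 5)]
    observed = Counter(len(set(g)) for g in groups)
    n = len(groups)
    chi2 = 0.0
    for r in range(1, 6):
        expected = n * falling(10, r) * STIRLING[r] / 10 ** 5
        chi2 += (observed.get(r, 0) - expected) ** 2 / expected
    return chi2   # compare against a chi-square with 4 degrees of freedom

digits = [random.randrange(10) for _ in range(5000)]
print(poker_test(digits))   # should be small for a good generator
```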
Recent Advances in Computer Science and Communications
Introduction: Signal filters were originally seen as circuits or systems with frequency-selecting behavior. The development of filtering techniques went on and more sophisticated filters were introduced, such as Chebyshev and Butterworth filters, which gave means of shaping the frequency characteristics of the filter in a more systematic design procedure. During this stage, filtering was mainly considered from this frequency-domain perspective. Method: In the Single Instruction, Multiple Data (SIMD) model, a parallel computer consists of N identical processors; each of the N processors possesses its own local memory, where it can store both programs and data, and all processors operate under the control of a single instruction stream issued by a central control unit. Equivalently, the N processors may be assumed to hold identical copies of a single program, each processor's copy being stored in its local memory. There are N data streams, one per processor. Result: It can be seen that the computation time decreases when we increase...
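The SIMD description above can be mimicked in a few lines: every worker executes the same program on its own local data stream. The moving-average filter, the number of workers, and the stream sizes below are assumptions for the demo, not the paper's setup.

```python
import numpy as np
from multiprocessing import Pool

# Illustrative sketch of the SIMD/SPMD idea described above: every worker
# runs the same program (here, a simple moving-average filter) on its own
# data stream. The filter and parameters are assumptions for the demo.

def moving_average(stream, k=5):
    """Same instruction stream applied by every processor to its local data."""
    kernel = np.ones(k) / k
    return np.convolve(stream, kernel, mode="valid")

if __name__ == "__main__":
    N = 4                                            # number of (simulated) processors
    rng = np.random.default_rng(1)
    streams = [rng.random(1000) for _ in range(N)]   # one data stream per processor

    with Pool(N) as pool:                            # N workers, identical program copies
        results = pool.map(moving_average, streams)

    print([r.shape for r in results])
```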