Unit - V 1. Cloud Computing Security Challenges
In traditional data centers, IT managers put procedures and controls in place to build a
hardened perimeter around the infrastructure and data they want to secure. This configuration
is relatively easy to manage, since organizations have control of their servers’ location and
utilize the physical hardware entirely for themselves. In the private and public cloud,
however, perimeter boundaries blur and control over security diminishes as applications
move dynamically and organizations share the same remotely located physical hardware with
strangers.
MULTI-TENANCY
Cloud computing users share physical resources with others through common software
virtualization layers. These shared environments introduce unique risks into a user’s resource
stack. For example, the cloud consumer is completely unaware of a neighbor’s identity,
security profile or intentions. The virtual machine running next to the consumer’s
environment could be malicious, looking to attack the other hypervisor tenants or sniff
communications moving throughout the system. Because the cloud consumer’s data sits on
common storage hardware, it could become compromised through lax access management or
malicious attack. In a joint paper published in November 2009 by MIT and UCSD entitled
“Hey, You, Get Off of My Cloud: Exploring Information Leakage in Third-Party Compute
Clouds,” the authors demonstrated the possibility of a side-channel attack in a cloud
environment in which an attacker could implant arbitrary code into a neighbor’s VM
environment with little to no chance of detection. In another scenario, a security bulletin from
Amazon Web Services reported that the Zeus Botnet was able to install and successfully run
a command and control infrastructure in the cloud environment.
Moving data from static physical servers onto virtual volumes makes it remarkably mobile,
and data stored in the cloud can live anywhere in the virtual world. Storage administrators
can easily reassign or replicate users’ information across data centers to facilitate server
maintenance, HA/DR or capacity planning, with little or no service interruption or notice to
data owners. This creates a number of legal complications for cloud users. Legislation such
as the EU Data Protection Directive restricts the processing or storage of residents’ personal
data in foreign data centers. Careful controls must be applied to data in cloud computing
environments to ensure
cloud providers do not inadvertently break these rules by migrating geographically sensitive
information across political boundaries. Further, legislation such as the US Patriot Act allows
federal agencies to present vendors with subpoenas and seize data (which can include trade
secrets and sensitive electronic conversations) without informing or gaining data owners’
consent.
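To make the residency requirement concrete, here is a minimal sketch (the region and data-class names are invented for illustration) of the kind of policy check a provider could run before replicating data across data centers:

```python
# Sketch of a residency check run before replicating a volume; the policy
# table and region names are illustrative assumptions, not a real provider API.

ALLOWED_REGIONS = {
    "eu-customer-data": {"eu-west", "eu-central"},            # must stay in the EU
    "public-assets":    {"eu-west", "us-east", "ap-south"},   # no restriction
}

def can_replicate(data_class: str, target_region: str) -> bool:
    """Return True only if the target region is permitted for this data class."""
    return target_region in ALLOWED_REGIONS.get(data_class, set())

# A replication job would consult the policy before moving data:
assert can_replicate("eu-customer-data", "eu-central")
assert not can_replicate("eu-customer-data", "us-east")
```

A real deployment would attach such classifications as storage metadata and evaluate them inside the replication pipeline, so geographically sensitive data is never queued for a disallowed region in the first place.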
DATA REMANENCE
Although the recycling of storage resources is common practice in the cloud, no clear
standard exists on how cloud service providers should recycle memory or disk space. In
many cases, vacated hardware is simply re-purposed with little regard to secure data
destruction. The risk of a cloud tenant being able to gather pieces of the previous tenants’
data is high when resources are not securely recycled. Resolving the issue of data remanence
can frequently consume considerable negotiating time while establishing service agreements
between an enterprise and a cloud service provider.
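As an illustration of secure recycling, the sketch below overwrites a file with random bytes before deleting it. This is a toy example: on SSDs and copy-on-write filesystems an in-place overwrite does not guarantee erasure, and real remanence controls rely on device-level secure erase or on encrypting data and destroying the key.

```python
import os
import secrets

def wipe_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before deleting it, so a later
    tenant reallocating the same blocks cannot recover the old contents.
    Illustrative only: overwriting in place is not a guarantee on SSDs or
    copy-on-write filesystems."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents with noise
            f.flush()
            os.fsync(f.fileno())                # force the write to disk
    os.remove(path)
```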
DATA PRIVACY
The public nature of cloud computing poses significant implications to data privacy and
confidentiality. Cloud data is often stored in plain text, and few companies have an absolute
understanding of the sensitivity levels their data stores hold. Data breaches are embarrassing
and costly. In fact, a recent report by the Cloud Security Alliance lists data loss and leakage
as one of the top security concerns in the cloud. Recent laws, regulations and compliance
frameworks compound the risks; offending companies can be held responsible for the loss of
sensitive data and may face heavy fines over data breaches. Business impacts aside, loose
data security practices also harm on a personal level. Lost or stolen medical records, credit
card numbers or bank information may cause emotional and financial ruin, the repercussions
of which could take years to repair. Sensitive data stored within cloud environments must be
safeguarded to protect its owners and subjects alike.
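One common safeguard is to encrypt data before it ever reaches cloud storage. The sketch below illustrates the encrypt-then-MAC pattern using only the Python standard library; the SHA-256 counter-mode keystream here is a toy stand-in for a real cipher, and a production system would use AES-GCM or ChaCha20-Poly1305 from a vetted cryptography library instead.

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. NOT for production use.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity tag
    return nonce + ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("ciphertext was tampered with, or wrong key")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
```

With this pattern the cloud stores only ciphertext plus a tag, so lax access management on the provider side exposes nothing readable, and any modification of the stored blob is detected at decryption time.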
There are numerous security issues in cloud computing because it encompasses many
technologies, including networks, databases, operating systems, virtualization, resource
scheduling, transaction management, load balancing, concurrency control and memory
management. Security issues in many of these systems and technologies therefore apply to
cloud computing. For example, the network that interconnects the systems in a cloud has to
be secure. Furthermore, the virtualization paradigm in cloud computing introduces several
security concerns: for example, the mapping of virtual machines to physical machines has to
be
carried out securely. Data security involves encrypting the data as well as ensuring that
appropriate policies are enforced for data sharing. In addition, resource allocation and
memory management algorithms have to be secure. Finally, data mining techniques may be
applicable to malware detection in clouds.
Any discussion of cloud security reveals much that remains to be done. The cloud service
provider must ensure that customers do not face problems such as loss of data or data theft.
There is also the possibility that a malicious user could penetrate the cloud by impersonating
a legitimate user, thereby infecting the entire cloud and affecting the many customers who
share it [7]. Four types of issues arise when discussing the security of a cloud.
1. Data Issues
2. Privacy issues
3. Infected Application
4. Security issues
Data Issues: Sensitive data in a cloud computing environment is a major security concern in
a cloud-based system. First, once data is on a cloud, anyone from anywhere can access it at
any time, and the cloud may hold common, private and sensitive data side by side. Many
cloud computing service consumers and providers access and modify data simultaneously, so
cloud computing needs a data integrity mechanism. Second, data theft is a serious issue in a
cloud computing environment. Many cloud service providers do not run their own servers
but instead acquire them from other providers, because doing so is cost-effective and flexible
for the provider, so there is a significant probability that data can be stolen from the external
server. Third, data loss is a common problem in cloud computing. If the cloud computing
service provider shuts down its services due to financial or legal problems, the user's data is
lost; data can also be lost, damaged or corrupted through mishap, natural disaster or fire,
leaving it inaccessible to users. Fourth, data location is an issue that requires focus in a
cloud computing environment. The physical location of data storage is important and should
be transparent to users and customers, yet vendors typically do not reveal where the data is
stored.
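The data integrity need mentioned above can be met in its simplest form by recording a cryptographic digest before upload and re-checking it after download. A minimal sketch, with an invented record for illustration:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest the data owner records before uploading."""
    return hashlib.sha256(data).hexdigest()

# Owner computes a digest before upload...
record = b"invoice #1042: total 315.00"
stored_digest = fingerprint(record)

# ...and re-checks it after download to detect tampering in transit or at rest.
retrieved = record                       # what the cloud hands back
assert fingerprint(retrieved) == stored_digest

tampered = b"invoice #1042: total 915.00"
assert fingerprint(tampered) != stored_digest   # any change is detected
```

Real systems extend this idea with keyed MACs or digital signatures so that an attacker who can modify the stored data cannot also recompute a matching digest.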
Privacy Issues:
The cloud computing service provider must ensure that customers' personal information is
well secured from other providers, customers and users. Because most servers are external,
the cloud service provider should track who is accessing the data and who is maintaining the
servers, enabling the provider to protect customers' personal information.
Infected Application:
The cloud computing service provider should have complete access to the server, with full
rights, for the purpose of monitoring and maintenance. This prevents a malicious user from
uploading an infected application onto the cloud that would severely affect customers and the
cloud computing service.
Security Issues:
Cloud computing security must be addressed on two levels: the provider level and the user
level. The cloud computing service provider should ensure that the server is well secured
against all external threats it may come across. Even when the provider supplies a good
security layer, each user should ensure that its own actions do not lead to loss, theft or
tampering of data belonging to other users of the same cloud. A cloud is only as good as the
security the service provider delivers to its users.
In the Software as a Service (SaaS) model, the client depends on the service provider for
proper security measures. The provider must ensure that its multiple users cannot see each
other's private data. It is therefore important for the user to verify that the right security
measures are in place, and it can be difficult to obtain assurance that the application will be
available when needed [15]. Cloud computing providers need to solve the common security
challenges that traditional communication systems face, while also dealing with other issues
inherently introduced by the cloud computing paradigm itself.
C. Availability
Availability ensures reliable and timely access to cloud data or cloud computing resources by
the appropriate personnel. Availability is one of the big concerns of cloud service providers,
since a disrupted or compromised cloud service affects a far larger number of customers than
in the traditional model.
D. Information Security
In the SaaS model, enterprise data is stored outside the enterprise boundary, at the SaaS
vendor's premises. Consequently, the SaaS vendor needs to adopt additional security features
to ensure data security and prevent breaches caused by security vulnerabilities in the
application or by malicious employees. This requires very strong encryption techniques for
data security and highly competent authorization to control access to private data.
E. Data Access
Data access issue is mainly related to security policies provided to the users while accessing
the data. Organizations have their own security policies based on which each employee can
have access to a particular set of data. These security policies must be adhered to by the
cloud to avoid intrusion by unauthorized users. The SaaS model must be flexible enough to
incorporate the specific policies put forward by the organization.
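Such organization-specific policies reduce, at their simplest, to a role-to-dataset mapping that the cloud must enforce on every access. The following sketch uses invented role and dataset names to illustrate the check:

```python
# Minimal sketch of per-role data-access policy of the kind an organization
# would expect a SaaS provider to enforce; all names here are hypothetical.

POLICY = {
    "hr-staff": {"employee-records"},
    "finance":  {"payroll", "invoices"},
    "engineer": {"source-code"},
}

def may_access(role: str, dataset: str) -> bool:
    """Allow access only if the dataset appears in the role's policy."""
    return dataset in POLICY.get(role, set())

assert may_access("finance", "payroll")
assert not may_access("engineer", "payroll")   # outside the engineer's policy
```

A flexible SaaS platform would let each customer organization upload its own policy table rather than imposing a fixed one, which is exactly the flexibility the paragraph above calls for.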
F. Network Security
In a SaaS deployment model, highly sensitive information is obtained from the various
enterprises, then processed by the SaaS application and stored at the SaaS vendor's premises.
All data flow over the network has to be secured in order to prevent leakage of sensitive
information.
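A baseline control for securing that data flow is TLS with certificate verification. The sketch below uses Python's standard ssl module; the server name in the comment is a placeholder, not a real endpoint.

```python
import ssl

# Build a client-side TLS context with certificate verification enabled,
# the baseline for protecting SaaS traffic in transit.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse legacy protocols

assert context.verify_mode == ssl.CERT_REQUIRED    # server cert is checked
assert context.check_hostname                      # hostname must match cert

# A real client would then wrap its socket (hypothetical host shown):
#   with socket.create_connection(("saas.example.com", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="saas.example.com") as tls:
#           tls.sendall(b"...")
```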
G. Data breaches
Since data from various users and business organizations lie together in a cloud environment,
breaching into this environment will potentially make the data of all the users vulnerable.
Thus, the cloud becomes a high potential target.
The Open Cloud Consortium (OCC) is a member driven organization that supports the
development of standards for cloud computing and frameworks for interoperating between
clouds; develops benchmarks for cloud computing; supports reference implementations for
cloud computing, preferably open source reference implementations; manages a testbed for
Cloud Computing called the Open Cloud Testbed (OCT); and sponsors workshops and other
events related to Cloud Computing. The OCC has a particular focus in large data clouds. It
has developed the MalStone Benchmark for large data clouds and is working on a reference
model for large data clouds.
Cloud-standards.org is a Wiki site for Cloud Standards Coordination. The goal of the wiki is
to document the activities of the various SDOs working on Cloud Standards. Cloud-
standards.org is an initiative for editing and sharing a general cloud computing
standardization positioning, in which more relevant cloud standardization initiatives can be
seen and related. The first informal proposal of the positioning can be seen at cloud standards
positioning.
This working group manages and operates the Open Cloud Testbed. The Open Cloud Testbed
uses the Cisco C-Wave and UIC Teraflow Network for its network connections. Both use
wavelengths provided by the National Lambda Rail. Currently membership in this working
group is limited to OCC members who can contribute computing, networking, or other
resources to the Open Cloud Testbed.
Project Matsu
Project Matsu is a collaboration with NASA. The goal is to create a cloud containing all of
Earth Observing 1 (EO-1) satellite imagery from the Advanced Land Imager (ALI) and the
Hyperion instruments and to make this data available to interested users. This working group
is also developing cloud-based services that can assist at times of disasters. The Project
Matsu cloud can, for example, be used to assist with image processing so that up to date
images can be made available to those providing disaster assistance.
The Open Science Data Cloud (OSDC) is cloud-based infrastructure that allows scientists to
manage, analyze, integrate and share medium to large size scientific datasets. The Institute
for Genomic and Systems Biology at the University of Chicago uses the OSDC as the basis
for Bionimbus, a cloud for genomics and related data. Johns Hopkins University uses the
OSDC to provide bulk downloads of the Sloan Digital Sky Survey to astronomers around the
world. NASA uses the OSDC to make data from the EO-1 satellite available to interested
parties. Partial funding for the OSDC is provided by the Gordon and Betty Moore Foundation
and the National Science Foundation. OSDC partners include Yahoo, which contributed
equipment to the OSDC, and Cisco, which provides access to the Cisco C-Wave.
DMTF spans the globe with member companies and organizations representing varied
industry sectors. The DMTF board of directors is led by 14 industry-leading technology
companies including: Broadcom Limited, CA Technologies, Dell, Emerson Network
Power, Hewlett Packard Enterprise, Hitachi, Ltd., HP Inc., Intel
Corporation, Lenovo, Microsoft Corporation, NetApp, Software AG, TIM and VMware.
Supporting implementations that enable the management of diverse traditional and emerging
technologies – including cloud, virtualization, network and infrastructure – the DMTF works
to simplify the manageability of network-accessible technologies through open and
collaborative efforts by leading technology companies.
The reasons to adopt standards in cloud computing closely match the same logic that made
the universal usability of the Internet a reality: The more accessible data is, the more
interoperable software and platforms are, the more standardized the operating protocols are,
the easier it will be to use and the more people will use it -- and the cheaper it will be to
implement, operate, and maintain. Systems and software designers see this logic in action
when they create a cloud platform and don't have to worry about figuring out how to make it
work with a dozen or so network protocols. Cloud application developers feel the power of
standards when they build an application using a framework that guarantees almost 100
percent success in such areas as data access, resource allocation, debugging, failover
mechanisms, user interface reconfiguration, and error, data, and exception handling... not to
mention the shouts of joy when a developer realizes that a favored toolkit can integrate into a
favored development platform, sometimes with only the push of a button.
The following are cloud standards and standards bodies that designers and developers could
use in 2015 to help make software design simpler, cheaper, and faster.
Cloud Standards Customer Council (CSCC)
The CSCC is an end-user advocacy group that seeks to "accelerate cloud's successful adoption" as
a means to strengthen 21st century enterprises. It is not really a standards organization but a
facilitator; it works with existing standards groups to ensure that client requirements are
addressed as standards evolve. This group understands that the transition from a traditional IT
environment to a cloud-based environment can require significant changes, so it attempts to
guarantee that this transition won't cost end-users the choice and flexibility they enjoy with
their current IT environments. Another role of the CSCC is to advocate for the establishment
of open, transparent standards for cloud computing; the council believes that the agility and
economic efficiencies cloud offers are only possible if the performance, security, and
interoperability issues that arise during the transition to the cloud are answered in an open,
transparent way.
Distributed Management Task Force (DMTF)
The OVF standard, adopted as ISO 17203 by the International Organization for
Standardization (ISO), creates uniform formatting for virtual systems-based software. OVF is
platform independent, flexible, and open, and can be used by anyone who needs a
standardized package for creating a virtual software solution that requires interoperability and
portability. OVF simplifies management standards using the Common Information Model
(CIM) to standardize management information formats; this reduces design and development
overhead by allowing for quicker and more cost-effective implementation of new software
solutions.
The Open Cloud Standards Incubator working group's goal is to facilitate management
interoperability between in-enterprise private clouds and public and hybrid clouds. The
components — cloud resource management protocols, packaging formats, and security
mechanisms—address the increasing need for open, consistent cloud management
architecture standards.
The Cloud Management Working Group (CMWG) uses the Cloud Infrastructure Management
Interface (CIMI) to visually represent the
total lifecycle of a cloud service so that you can enhance the implementation and
management of that service and make sure it is meeting service requirements. This group can
explain how to model the characteristics of an operation, allowing variation of your
implementation to be tested prior to final development; it does this with CIM, which creates
data classes with well-defined associations and characteristics, as well as a conceptual
framework for organizing these components. CIM uses discrete layers: core model, common
model, and extension representations.
Cloud Auditing Data Federation Working Group (CADF)
CADF works to standardize "audit events across all cloud and service providers" with the
goal of resolving significant issues in cloud computing due to inconsistencies or
incompatibilities. It seeks to ensure consumers of cloud computing systems that the security
policies required on their applications are properly managed and enforced.
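To illustrate the idea of standardized audit events, here is a simplified, CADF-inspired record built in Python. The field set is an assumption chosen for illustration and is not the full DMTF CADF schema:

```python
import json
import uuid
from datetime import datetime, timezone

# A minimal audit record in the spirit of the CADF event model; the exact
# fields here are a simplified assumption, not the normative DMTF schema.
def audit_event(action: str, initiator: str, target: str, outcome: str) -> str:
    event = {
        "id": str(uuid.uuid4()),                            # unique event id
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "action": action,        # e.g. "read", "update", "delete"
        "initiator": initiator,  # who performed the action
        "target": target,        # which resource was touched
        "outcome": outcome,      # "success" or "failure"
    }
    return json.dumps(event)

record = audit_event("read", "user:alice", "storage/bucket-7", "success")
```

The point of standardizing such records is that events emitted by different cloud providers can be federated and audited with one toolchain instead of one parser per vendor.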
ETSI Technical Committee Cloud examines issues arising from the convergence of IT and
telecommunications. With cloud computing requiring connectivity that extends beyond the
local network, cloud network scalability has become dependent on the telecom industry's
ability to handle rapid increases in data transfer; the committee also works on issues related
to interoperability and security.
The CSC initiative is responsible for developing a detailed set of standards required to
support European Commission policy objectives that address security, interoperability, data
portability, and reversibility.
Post Office Protocol (POP)
• With SMTP alone, a client must keep a constant connection to the host to receive
messages; POP instead allows a server to store messages until a client connects and
requests them.
• Once the client connects, POP servers begin to download the messages and
subsequently delete them from the server.
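The download-and-delete cycle described above can be sketched with Python's standard poplib client. The host and credentials are placeholders, and the function is illustrative rather than a hardened mail client:

```python
import poplib

# Sketch of the POP retrieve-then-delete cycle; host and credentials are
# placeholders supplied by the caller.
def fetch_and_delete(host: str, user: str, password: str) -> list[bytes]:
    conn = poplib.POP3_SSL(host)
    try:
        conn.user(user)
        conn.pass_(password)
        count, _size = conn.stat()                 # how many messages wait
        messages = []
        for i in range(1, count + 1):
            _resp, lines, _octets = conn.retr(i)   # download message i
            messages.append(b"\n".join(lines))
            conn.dele(i)                           # mark it for deletion
        return messages
    finally:
        conn.quit()                                # server purges deleted mail
```

Note that deletions are only committed when `quit()` ends the session cleanly, which is why the call sits in the `finally` block.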
Internet Messaging Access Protocol (IMAP)
• IMAP allows messages to be kept on the server.
Extensible Messaging and Presence Protocol (XMPP)
• XMPP remains the core protocol of the Jabber Instant Messaging and Presence
technology.
SIMPLE
• Session Initiation Protocol for Instant Messaging and Presence Leveraging
Extensions
• For registering for presence information and receiving notifications.
• It is also used for sending short messages and managing a session of real-time
messages between two or more participants.