Preface
The Cloud is an interesting, rapidly moving topic that caught my attention in the fall of 2010. It was at
that moment that I started to see the many possibilities of the cloud. Together with the stream of
media coverage of these exciting new start-ups, I firmly believed that the cloud could be the next big
thing. But, as with many next big things, the cloud also had a lot of implications that many didn't
have an answer for. Nor needed one; it was, after all, the next big thing, and when you are racing to
win you don't care about the implications, you just want to win. That
Christmas I spoke with the VP of security at VMware. He also mentioned the many issues that
needed to be solved in cloud computing, including certain security issues. His words sparked my
imagination and, jointly inspired by my dad, who has profiled himself as one of the security
experts in The Netherlands, I started to investigate the cloud and its security issues.
A year has passed and much research has been done in the field of cloud computing, and in cloud
security especially. Nonetheless, much more research needs to be done in this field. For many
companies, cloud security is still high on the list of topics that need further investigation and keeps
them from entering the cloud (Bartvagh, 2010). I was tempted to do something with
these requests; the questions were many, and the answers, as I found out later during my research, were
mostly incomplete or plainly false. In order to provide those answer-seeking companies with the right
information, I wanted to create a useful tool. This tool has been developed as a part of this thesis. It
explores the facets of the cloud and provides a useful model that enables decision makers to get a
clear overview of the threats and risks in the cloud and how to protect themselves from them.
I hope that anyone seeking more information on security in the cloud will find it useful and that it
will provide a reliable information source for those who are answering questions on security in the
cloud to potential customers and partners.
Yours Sincerely,
Thijs Baars
January 27, 2011
3 Summary
According to the definitions of ENISA and NIST, the cloud represents a business model that enables
on-demand network access to a shared pool of rapidly provisioned, elastic, configurable computing
resources. This is achieved by virtualizing the underlying hardware and giving users control over the
resources they would like to obtain. This model creates a service that can be used for many applications,
but as it is always attached to the internet, security becomes paramount. This thesis has shown that
many issues are not cloud specific and apply both to regular outsourcing applications and to any server
connected to the internet. However, some issues are particularly harmful to the cloud, such as
compromised virtual machines: if one is compromised, other virtual machines might be accessed, with
severe results. Other cloud-specific issues lie at the client side, which is out of the scope of
this thesis.
Because of the virtualization techniques used, cloud environments can be spread throughout the
world. Although this can be a huge benefit (in terms of scalability and availability), it can also be a
threat to the system from the perspective of auditing, compliance and legal jurisdictions. The main
vendors of cloud systems provide tools that make users aware of where their data is
stored and processed. Auditability is highly implementation specific, but it is shown that
complying with current standards such as ISO 27001, SAS 70 and PCI is possible. Another risk within
the cloud environment is trust chains: it can be very opaque which (indirect) partners are involved
in your cloud environments, and compared to traditional outsourcing they might be more
complicated. Because of the added risks in the cloud, the CIA triad has been expanded to keep these
issues in line with the model. This expansion, called CI3A, adds Accountability and Auditability to the
well-known Confidentiality, Integrity and Availability. Accountability and Auditability
should provide a better grasp of the trust chains and an overview of governance & compliance.
Looking at the cloud, this thesis defines eight factors that influence the cloud environment and are
all related to the CI3A, and thus to its security. These are regional, geo-spatial, governance &
compliance, delivery model, deployment model, encryption, network and premises. Each of these
factors can influence the others, but by defining them a security matrix can be set up that provides
users with an overview of the cloud environment and how it is protected. These factors are
determined by inputting a data classification that has been pre-defined by the organization. According to
the rules set out in the classification, the eight factors are defined. For some of them there may be a
choice of several options; for others the values can be open to interpretation. For a detailed
description of these factors, see the section on the Secure Cloud Architecture model.
By integrating the results from the model, one can define a cloud environment that is sufficiently
secured for the data that needs to be stored and/or computed in the cloud. It might therefore be said
that in certain circumstances a cloud environment can be more secure than an insourced or outsourced
traditional environment, and that in many instances the cloud can be a secure solution.
4 Table of Contents
Preface .......................................................................... 1
Summary .......................................................................... 3
Introduction ..................................................................... 7
5.3 Research Model .............................................................. 10
7.1.1 Virtualization ............................................................ 20
7.1.2 Characteristics ........................................................... 22
8.2.1 Locationlessness .......................................................... 29
8.5 CI3A ........................................................................ 34
8.7 Encryption .................................................................. 37
9.2.2 SAS 70 .................................................................... 41
9.2.4 COBIT ..................................................................... 42
10.3 Inputs ..................................................................... 50
10.3.1 Attributes ............................................................... 50
10.3.2 Outputs .................................................................. 55
11 Conclusions .................................................................. 58
13 Bibliography ................................................................. 62
14 Appendix ..................................................................... 67
5 Introduction
In the past decades there has been a tremendous expansion and differentiation in IT: from
mainframes to personal computers, from workstations to netbooks, and from off-the-shelf software to
Software as a Service (SaaS). Using IT services such as SaaS is a different way of thinking about IT,
where IT is perceived as a utility. These IT utilities, or subscription services if you will, range from
complete infrastructures to small applications. According to some, this is ideal: paying for just what
you use, while optimizing the usage of resources. Elastic scalability allows high availability at lower
costs than ever before. The power of virtualization is the keystone for all of this and promises a world
of resources that wasn't available before, while maintaining flexibility, ease of use and incredible power.
Add these technologies to an enhanced service model, and you get what is called The Cloud.
All this sounds too good to be true, and it actually might be. Where virtualization offers optimized
performance, high availability, and multi-tenancy, it also comes with new issues related to security.
Multiple tenants have access to the same resources as you, the same machines and the same disks.
Moreover, with virtualization your data can be anywhere in the world. Geographic location is no
longer an issue when providers use protocols that enable disk arrays to be placed on all
continents of the world, forming one single Storage Area Network (SAN). This, however, might have
a serious security impact on your environment. The same issue arises with processing power.
Virtualization offers you the power to utilize processing power all over the globe as one single
machine. This allows for incredible amounts of power to be utilized, but it also means that your data
might be processed in territories where it shouldn't be processed.
According to some, these issues can be solved with strategies applied in conventional solutions;
others (Levelt, 2010) show that these new issues need new solutions.
To get a clear picture of the actual issues in detail, and how to find solutions for them, this thesis
will explain the cloud and the technology involved (see section 7 below). This is followed by an analysis of
cloud-specific issues, such as geospatial issues and governance issues (see section 8), and by the
current state of the cloud with specific solutions for the aforementioned issues (section 9), where we
also compare the cloud to traditional outsourced (section 9.3) and insourced solutions (section 9.4).
Section 10 introduces and explains the Secure Cloud Architecture model. To conclude, section 11
wraps everything up, and section 12 discusses points for further research.
In (Gnagey, 2010) an overview is given of the cloud, discussing, among economic benefits, the
security concerns on a managerial level: geographic location and compliance. No solutions for these
issues are addressed, however. Vigfusson & Chickler (2010) discuss in Clouds at Crossroads: Research
Perspectives research topics in the cloud, including privacy-related issues. Discussing the trust
problems that arise with the complex models of some cloud environments, they provide a few suggested
solutions which might be very feasible.
This thesis overlaps with current research in that it provides a global overview of the cloud,
introducing and explaining it, but it also expands the current research with proposed solutions to the
security issues specific to the cloud and to general security issues that have found new terrain
in the cloud environment. To conclude, this thesis provides users with a model that navigates them
through the security issues and solutions for their specific cloud environment, so that users can
secure themselves.
How can data in the Cloud be protected from Cloud-specific security threats?
How does location-aware storing and executing of data in the Cloud affect its security?
How can Clouds conform to the international security standards for confidential data?
Because we can only answer whether or not the cloud is a safe alternative if we compare it to other
solutions for storing and executing data, this thesis will also answer the following questions:
How does the security in the Cloud of confidential data compare to that of outsourced
dedicated servers?
How does the security in the Cloud of confidential data compare to that of local insourced servers?
not the same first expert in question two per se. The appendix shows direct screen prints; all experts are
labelled expert by the system. In this way, a consensus was reached on the acceptance of the model.
The Delphi method was executed with three rounds of surveys with qualitative questions.
Three rounds were chosen instead of two (which is more common (Skulmoski, Hartman, & Krahn,
2007)), so that a first round could be used to obtain general information on the topic, not specifically
regarding the model to be developed, while still having enough rounds to reach a consensus. The
first round consisted of open questions in which the experts were asked about their experience with
security and the cloud, and about issues and concerns regarding security in the cloud. These questions
gave a wide result set that strengthened the results of the literature research performed earlier. Seventeen
respondents answered the questions in the survey in all three rounds, a response rate of 65%.
The result set from the first round delivered the starting point for the second round, in
which some questions were asked again in order to give the experts the option to rephrase their
answers after having seen the answers of round one. Some questions were designed after noticing a
consensus or discrepancy in the answers from round one, while others were completely new and had
no specific relation to the questions asked in round one. The questions for round one themselves were
fed by the literature review and by informal meetings during conferences and congresses.
In the second round, an initial version of the SeCA model was presented. The goal of this model is
to provide implementers, decision makers and experts in the field with a framework that they can use
to assess cloud environments against their security needs. The feedback on this initial model was
then used to improve it. The author of this thesis originally expected that a framework where raw data
is inputted would be best, since the data will be hosted in the cloud (or at least has that intention).
However, it was found that the SeCA model looked at the data from an unusual perspective for its
target audience, and that a more architectural point of view was needed for it to be usable in the
field. In round three an improved SeCA model was introduced. The SeCA model was remodelled to
accept data classifications as input instead of raw data, since raw data is classified within an
organization, and for each classification different system architectures are needed to host and
execute said data safely, as prescribed by the classification. Therefore, encryption is inserted as an
attribute, as it changes with the other attributes per classification and thus per architecture. The SeCA
model allows any user to assess the cloud environment from two perspectives. Either the user looks
at their current data and its inherent classification and decides how the cloud environment should be
configured to meet its requirements, or the user reverses that action and sees what data can be used, by taking
a cloud environment and determining on that basis what can go in. This thesis will describe only the
forward movement, thus taking data classifications as an input and determining on that basis how the
cloud environment should be configured. The appendix shows the evolution of the model, both
internal revisions and revisions discussed in the surveys.
The following concepts have been tested during the Delphi study:
To summarize the results from the Delphi rounds, a burn chart was produced, shown below. Green
depicts consensus, orange represents some polarisation, and red represents extreme polarisation in the
answers of the experts.
As one can see, not all topics reached consensus. This was due to the fact that, in the expert
selection, business or technical knowledge of some topics was not taken into account. For
example, the field of encryption is a very technical field that can be hard to fully understand and apply.
Although some answers were very useful, other answers were dismissed in the same round as
unfeasible, simpleminded or simply not true. This meant that the experience or knowledge among
the experts varied too greatly to reach consensus. Subsequent research was done through literature
review on the applicable topics.
Topic: findings across the three rounds
- New/specific security issues: a variety of reasons for the difference between traditional & cloud; issues in private/insourced environments are the same.
- Ownership: ownership is dependent on law & regulations.
- DDoS: answers varied; points from both attackers & victims were discussed.
- CI3A: CI3A is accepted; auditability and accountability could be implicit in the CIA triad.
- Location/locationlessness: locationless clouds are considered unfeasible and unwise.
- Trust issues: experts vary greatly on the difference in trust between traditional outsourcing and cloud solutions.
- Encryption: encryption is very important; some note it is not a safe haven, while some believe that everything should be encrypted. No consensus: experts differed in knowledge and experience; further research through literature.
- Feasibility: no consensus; no further research done, as feasibility did not prove to be a security issue.
- Auditing
- Model: the model is accepted as presented; experts mention the possibility of practical issues in utilizing the model.
Table 1: Burn chart of the Delphi study. Colours denote the amount of consensus reached: green for consensus, orange for consensus on some issues, red for no consensus.
Figure 2: Security divided (figure labels: Experience and business opportunities; Security; Technical; after Hintzbergen & Hintzbergen)
In addition, other properties, such as authenticity, accountability, non-repudiation, and reliability can
also be involved (International Organization of Standards, 2005, p. 10). The difference is striking, yet very applicable to this
thesis. Information security can be perceived from many points of view. The NIST and ISO definitions
clearly define information security in technological terms, where Hintzbergen et al. (2010) define it from
a business perspective.
In this thesis, security will be discussed in a technological sense, with the goal of minimizing the
impact on the organization when a breach of security occurs. This is also called information risk
management. Information risk management exists because [c]ompromise of a valuable information
asset will cause dollar losses to the information's owner whether acknowledged or not; the loss could
be either direct (through reduction in the value of the information asset itself) or indirect (through
service interruption, damage to the reputation of the information's owner, loss of competitive
advantage, legal liability, or other mechanisms). (Blakley, McDermott, & Geer, 2008, p. 1) By
managing the risks, the losses incurred by a breach of security can be minimized. These losses can be
tremendous; see for example (de Bruijn, Spruit, & van den Heuvel, 2009) for how weak encryption led
to millions in losses. Security governance has therefore become a hot topic on the agenda of IT
managers. (For an overview of information risk management, see (Blakley, McDermott, & Geer,
2008).)
One way to look at information security is through a framework called the CIA triad. The CIA
triad, meaning Confidentiality, Integrity and Availability, offers an overview of the most important
security objectives:
Objective: Prevents
- Confidentiality: unauthorized users from accessing resources.
- Integrity: unauthorized users from modifying resources.
- Availability: authorized users from losing access to resources.
- Accountability: actions from being performed without attribution to a user.
- Auditability: the system from operating without a verifiable record of its use.
Accountability is also often discussed in connection with the term triple A (AAA). AAA stands for authentication,
authorization and accounting, and describes the three features of access control. As one can see, AAA
and the CIA triad overlap in meaning and function. Authentication handles the integrity of users.
Authorization makes sure that only authorized users get access and the functionalities that are assigned to
them, which in the CIA triad is defined by confidentiality and availability. Accountability logs all user
actions. This thesis will therefore refer to the extended CIA triad (the CI triple A, or CI3A in short)
for all user control discussions.
The SeCA model presented in section 10 tests whether a cloud environment can be technically safe
enough for certain security requirements. The perceptions users have of the security measures, in
other words the security experience, are not tested, as this is a different field of study that touches
on User Experience, Human-Computer Interaction and behaviour, as outlined in (Gonzales &
Sawicka, 2002). Nonetheless, human factors in security have to be kept in mind in this thesis, as
they play an important role.
Classification: Measures
1: Top Secret — Data that should only be handled by specific people with the right
authorization. Very high business impact if a data leak occurs.
Resulting measures: personnel screening, data on organizational premises,
data hosted only in the same country as the organization. Dedicated
hardware, i.e. no multi-tenancy. Replication in at least n+1 locations. Data
must stay within the network.
2: Secret — Data that should be handled by a limited number of people with the right
authorization. High business impact if a data leak occurs.
Resulting measures: screened personnel, data within the region of the organization. Data
may be off-premise; High Availability architecture.
3: Private
4: Public
Because of the human perceptions of security, measuring security can become a subjective task. In
order to prevent a subjective measuring of security, and to ensure that the measures will be objective
and based on technical features instead of perceptions of security, this thesis abstracts the security
measures in terms of classification. Classification is an internal process within an
organization that puts security requirements on information. This information can be a set of data,
documents or any holder of information. Classification generally has four classes, or levels, in
which data can be classified. Depending on the class, the security measures that are taken to
secure the information differ. This has to do with the nature of said information: trade secrets are
classified at a different level than press releases. Trade secrets are of more value, have a larger
threat profile, incur greater risks at loss and thus shouldn't be available to everybody. Press releases,
on the other hand, are at the other end of the spectrum: they should be available to as many people as possible,
at any time. In the examples in this thesis we use four different levels of classification, ranging from a
low threat profile (public information) to a high threat profile (top secret information). The number of
classification levels might differ between organizations, and with that the security measures in each
classification. The four used in this thesis serve as an example to clarify concepts and for exemplary
usage of the proposed model.
better productivity and scalability. So, the cloud isn't new, and yet some call it the new paradigm of
computing. That is because the cloud is a new delivery model, or as Mulholland, Pyke & Fingar state:
The big deal is that cloud computing is a disruptive delivery model. It's an economic, not
technological shift! (2010, p. 24).
The National Institute for Standards and Technology (NIST) defines the cloud as: Cloud
computing is a model for enabling convenient, on-demand network access to a shared pool of
configurable computing resources (e.g., networks, servers, storage, applications, and services) that can
be rapidly provisioned and released with minimal management effort or service provider interaction.
This cloud model promotes availability and is composed of five essential characteristics, three service
models, and four deployment models. (NIST, 2010) We will discuss these characteristics, and models
below. ENISA (European Network and Information Security Agency) defines the cloud as [..] an on-demand service model for IT provision, often based on virtualization and distributed computing
technologies. Cloud computing architectures have:
Both definitions are more or less the same, and we will use them throughout this thesis as the
working definitions. These definitions show that ASPs are more or less a part of the Cloud and that
SaaS (Software as a Service) is actually a model within a cloud environment. Table 4 explains all
characteristics.
Cloud characteristic — Description
- Shared Resources
- Service on Demand
- Programmatic Management
7.1.1 Virtualization
Virtualization is the technology that enables any cloud environment. As an article in The Register
explained, [virtualization] creates a layer of abstraction between a virtual machine and the physical
hardware. [..] this allows multiple virtual machines to run on a single physical machine, and also can
enable a virtual machine to be moved quite straightforwardly from one physical machine to another.
(Collins, 2009) Companies like VMware, Xen and Citrix offer solutions that can virtualize physical
machines at different levels. These levels are known as Full Virtualization, Para-virtualization and
Hardware-assisted virtualization.
Full Virtualization uses a combination of binary translation and direct execution techniques. This
approach translates kernel code to replace non-virtualizable instructions with new sequences of
instructions that have the intended effect on the virtual hardware. Meanwhile, user level code is
directly executed on the processor for high performance virtualization. Each [..] Virtual Machine [has]
all the services of the physical system, including a virtual BIOS, virtual devices and virtualized memory
management. This combination of binary translation and direct execution provides Full Virtualization
as the guest OS is fully abstracted (completely decoupled) from the underlying hardware by the
virtualization layer. The guest OS is not aware it is being virtualized and requires no modification.
(VMWare, 2007, p. 4) Full virtualization requires no hardware or operating system assistance to
virtualize sensitive and privileged instructions. Because everything is run from a bare-bones
operating system, called a hypervisor, it enables the best security options, since no shared service
between other virtual machines can reach the hypervisor's processes and instructions. Microsoft Virtual
Server and some VMware products, among others, offer full virtualization.
In order to avoid the overhead caused by the binary translation taking place in Full Virtualization,
Paravirtualization communicat[es] between the guest OS and the hypervisor to improve performance
and efficiency. Paravirtualization involves modifying the OS kernel to replace non-virtualizable
instructions with hypercalls that communicate directly with the virtualization layer hypervisor. [..] The
value proposition of paravirtualization is in lower virtualization overhead, but the performance
advantage of paravirtualization over full virtualization can vary greatly depending on the workload. As
paravirtualization cannot support unmodified operating systems (e.g. Windows 2000/XP), its
compatibility and portability is poor. (VMWare, 2007, p. 5)
Around the time virtualization was being introduced to the market, hardware vendors started to
implement virtualization at the hardware level, with technologies such as Intel VT-x and AMD-V.
Both target privileged instructions with a new CPU execution mode feature that allows the VMM to
run in a new root mode below. [..] Privileged and sensitive calls are set to automatically trap to the
hypervisor, removing the need for either binary translation or paravirtualization. (VMWare, 2007, p.
6) For an overview of virtualization, and its providers see (Blokdijk & Menken, 2009).
Putting this in the perspective of the cloud: the virtualization of the physical machine, thus creating
one or more virtual machines, creates the scalability and flexibility of computing power. We can
virtualize multiple physical machines to create one single virtual machine (also called an instance),
and are therefore able to create massive machines from multiple physical machines, or vice versa. This
also enables High Availability (HA), a system that has minimal or no downtime.
When combining this flexibility with virtual networking, we can create virtual networks, and thus
virtual datacentres. Virtual networking is a term that describes the virtualization of network devices. Just
as with regular virtualization, where physical machines are split into multiple virtual instances, physical
network devices can now be virtualized, thus split into multiple instances. The offerings of virtual
networking are diverse. Virtual LANs, or VLANs, create groups of computers within a switch. VPNs
use public lines of communication, such as the internet, to connect multiple networks together,
creating one virtual network. Openvswitch and vNetworkStack are software-based multilayer switches
that can run on physical machines (thus allowing for scalability and flexibility through virtualization).
In order to supply the massive amounts of storage required for some applications (such as
Facebook, which in 2009 had a 2 petabyte dataset (Thusoo, 2009)), virtualizing storage through
Storage Area Networks (SANs) has been developed. A SAN is a network of storage devices, such as
tape libraries and disk arrays, which communicate through a SCSI protocol such as Fibre Channel and
iSCSI. (Preston, 2002, p. 5) Protocols like iSCSI, which stacks SCSI procedure calls on top of TCP,
allow storage devices to be mounted over the network. This means that one can reach an iSCSI device
located in the datacentre from one's home environment. For the cloud this implies that storage
systems from all over the world can be provisioned into one cloud environment. This allows
multiple-petabyte data sets to be backed up in a high availability environment, with disaster recovery
on the other side of the world, since whenever a storage system fails, the iSCSI calls can be rerouted to
the backup facility. This allows for no or minimal loss of data, but the technique can also be used
for extending storage availability in the cloud and for load balancing. For an overview of storage networks
and their implications, see (Preston, 2002).
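The rerouting idea described above can be illustrated with a small sketch. This is not an iSCSI implementation, only an abstract model of failover between replicated storage targets; the target names and the failure simulation are hypothetical.

```python
# Hypothetical sketch: rerouting storage calls from a failed target to a
# backup facility elsewhere in the world, as in SAN disaster recovery.

class StorageTarget:
    def __init__(self, name: str, healthy: bool = True):
        self.name = name
        self.healthy = healthy

    def read(self, block: int) -> bytes:
        if not self.healthy:
            raise IOError(f"{self.name} is unreachable")
        return b"data-from-" + self.name.encode()

def read_with_failover(block: int, targets: list) -> bytes:
    """Try each replicated target in order; reroute the call on failure."""
    for target in targets:
        try:
            return target.read(block)
        except IOError:
            continue  # reroute to the next replica
    raise IOError("all storage targets unreachable")

primary = StorageTarget("datacentre-eu", healthy=False)  # simulated failure
backup = StorageTarget("datacentre-us")
data = read_with_failover(0, [primary, backup])  # served by the backup
```

In a real SAN, this rerouting happens at the protocol level (e.g. iSCSI multipathing) rather than in application code, but the failover logic is the same in spirit.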
By virtualizing the memory in the physical machines, just as we do with storage and computing
power, a VM can be given more memory than the total amount in any one physical
machine. In other words, we can create VMs with a total amount of memory, storage and
computing power equal to that of all the physical machines in three datacentres, each located on a separate
continent.
By clustering physical machines and network devices we can create an elastic system which can
rapidly be provisioned and released to serve content to its users, which is what we call the cloud.
Virtualization is truly the backbone of the cloud, allowing for near-instant scalability and flexibility,
provisioning and shared resources.
7.1.2 Characteristics
The NIST definition speaks of five inherent characteristics, which have to be present to speak of a
cloud environment. Below we outline those characteristics.
1. On-demand Self-service. Users can add or remove resources on the spot, from some kind of
control panel, without the intervention of a third party. There is thus no more need to call your
service provider to ask for 2 gigabytes of extra storage: one click on a button and you're done.
2. Broad Network Access. As the name says, there should be broad network access for heterogeneous
thin or thick client platforms (e.g. cell phones, notebooks, desktops).
3. The third characteristic is Resource Pooling, meaning that the provider's computing resources
are pooled to serve multiple tenants, with different physical and virtual resources dynamically
assigned and reassigned according to consumer demand. These resources can be storage,
processing power, memory, network bandwidth, and virtual machines.
4. Rapid Elasticity, which means that resources can be rapidly and elastically provisioned, to
quickly scale out and in. This comes with flexibility in time (from some minutes to years) and in
size (from just 1 GB of additional bandwidth to exabytes of disk space).
5. Measured Service. Cloud systems automatically control and optimize resources by leveraging a
metering capability at some level of abstraction. Resource usage can be monitored, controlled,
and reported, providing transparency for both the provider and the consumer.
Adapted from (NIST, 2010). Except for broad network access and measured service, all these
characteristics are made possible by the virtualization technologies discussed above. Broad network
access evolves continuously through developments in dark fibre, most of which was laid during
the dot-com era.
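The metering behaviour of characteristic 5 can be sketched as a per-tenant usage ledger. The resource names and prices below are made-up assumptions, not any provider's actual tariff:

```python
# Minimal sketch of "measured service": usage is metered per tenant at
# some level of abstraction and can be reported transparently to both
# provider and consumer. Rates are hypothetical.

from collections import defaultdict

RATES = {"storage_gb_hours": 0.0001, "cpu_hours": 0.05}   # made-up prices

class Meter:
    def __init__(self):
        self.usage = defaultdict(float)

    def record(self, tenant: str, resource: str, amount: float):
        self.usage[(tenant, resource)] += amount

    def bill(self, tenant: str) -> float:
        """Sum the metered usage of one tenant against the rate card."""
        return round(sum(amount * RATES[res]
                         for (t, res), amount in self.usage.items()
                         if t == tenant), 2)

meter = Meter()
meter.record("acme", "cpu_hours", 10)            # 10 h of compute
meter.record("acme", "storage_gb_hours", 5000)   # 5 TB-hours of storage
print(meter.bill("acme"))  # → 1.0  (10*0.05 + 5000*0.0001)
```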
7.1.3 Delivery models
[Figure: the delivery-model pyramid — SaaS at the top, PaaS in the middle, IaaS at the base.]
7.1.3.1 IaaS
The lowest layer in the delivery-method pyramid displayed above, IaaS, or Infrastructure as a
Service, provides the infrastructure of a server park: "[v]irtual machines and other abstracted hardware
and operating systems which may be controlled through a service API" (ENISA, 2009). NIST
describes it as: "The capability provided to the consumer is to provision processing, storage, networks,
and other fundamental computing resources where the consumer is able to deploy and run arbitrary
software, which can include operating systems and applications. The consumer does not manage or
control the underlying cloud infrastructure but has control over operating systems, storage, deployed
applications, and possibly limited control of select networking components (e.g., host firewalls)."
Virtual Private Servers hosted as a cloud environment are often IaaS services. Rapid elasticity comes
into play as more resources are required: the IaaS provider can then add more virtual machines to
the subscription, and they are wound down when no longer needed (Mulholland, Pyke, & Fingar, 2010).
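The add-and-wind-down behaviour can be sketched as a simple scaling rule. The thresholds and the load metric are illustrative assumptions, not any provider's actual policy:

```python
# Sketch of the rapid-elasticity behaviour described above: VMs are
# added when load rises and released when no longer needed.

def desired_vms(current_vms: int, avg_load: float,
                scale_up_at: float = 0.8, scale_down_at: float = 0.3) -> int:
    """Return how many VMs the subscription should have for this load."""
    if avg_load > scale_up_at:
        return current_vms + 1              # provision an extra VM
    if avg_load < scale_down_at and current_vms > 1:
        return current_vms - 1              # wind down an idle VM
    return current_vms

vms = 2
for load in [0.9, 0.95, 0.5, 0.1, 0.1]:     # utilisation over time
    vms = desired_vms(vms, load)
print(vms)  # → 2 (scaled up to 4 during the peak, then back down)
```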
7.1.3.2 PaaS
The middle layer, Platform as a Service, allows customers to develop new applications using "APIs
deployed and configurable remotely. The platforms offered include development tools, configuration
management, and deployment platforms. Examples are Microsoft Azure, Force.com and Google App
Engine" (ENISA, 2009). "The consumer does not manage or control the underlying cloud
infrastructure including network, servers, operating systems, or storage, but has control over the
deployed applications and possibly application hosting environment configurations," according to
NIST (2010). To expand on this, Mulholland et al. describe PaaS as platforms "[that] can be
preconfigured to support specific use by an industry or an enterprise, complete with management and
governance capabilities. However, the most common type of PaaS is the type that provides a core set
of services to which a wide range of additional services can be added to leverage the core services."
(Mulholland, et al., 2010) An example would be a Java platform, to which developers can add
applications that leverage the platform and the programming language for rapid development, without
the need to maintain the underlying technology (servers, tools, development environment, etc.).
7.1.3.3 SaaS
SaaS is software available on subscription, or as ENISA puts it: "software offered by a third party
provider, available on demand, usually via the Internet, configurable remotely." NIST explains it as:
"The capability provided to the consumer is to use the provider's applications running on a cloud
infrastructure. The applications are accessible from various client devices through a thin client interface
such as a web browser (e.g., web-based email). The consumer does not manage or control the
underlying cloud infrastructure including network, servers, operating systems, storage, or even
individual application capabilities, with the possible exception of limited user-specific application
configuration settings." One of the best examples is probably Google Docs, which provides its customers
with on-demand office tools such as word processing. SaaS "provides actual end-user functionality,
either as services grouped together and orchestrated or as a conventional monolithic application."
(Mulholland, Pyke, & Fingar, 2010)
7.1.3.4 OaaS
Apart from the three delivery models mentioned above, multiple parties tend to acknowledge more
levels in the pyramid. These "Others as a Service" include BPMaaS (Business Process Management as a
Service), which is defined in (Mulholland, Pyke, & Fingar, 2010) as the ability "to create unique business
processes designed for unique purposes to link together multi-company value delivery systems that in
the past weren't feasible or economical to join together. Thus canned SaaS applications can become
participants in end-to-end processes" (internal quotes omitted).
Security as a Service is a term coined by security vendors, defined by McAfee as: "Rather than
acquiring your own security software tools and the technical expertise to administer them internally,
you contract with security vendors to have a turnkey service of virus defence, firewall management and
e-mail filtering. Outsourcing cyber-security eliminates all the labour and infrastructure, while still
giving you the state of the art in anti-virus, firewall and spam-fighting technologies." (McAfee, 2010)
Another popular term is Storage as a Service, in which a storage provider serves online storage for
data to clients. This data is often hosted in the cloud, and clients pay per gigabyte. These services can
be used as a backup solution.
The services mentioned above are just a small selection of popular services in the cloud. Some of
them float between the three platforms discussed earlier. Security as a Service, for example, might exist
on all levels, as its nature encompasses everything from the infrastructure to end-user authentication.
This thesis limits itself to IaaS, PaaS and SaaS, but the results may well be usable for other services
than those discussed here.
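The division of control between tenant and provider across the three delivery models, as described in the NIST quotes above, can be summarized in a small lookup table. It is a deliberate simplification, not an official mapping:

```python
# Who manages which layer in each delivery model, following the NIST
# descriptions quoted in sections 7.1.3.1-7.1.3.3 (simplified).

CONTROL = {  # layer -> who manages it, per delivery model
    "IaaS": {"application": "tenant", "operating system": "tenant",
             "virtualization": "provider", "hardware": "provider"},
    "PaaS": {"application": "tenant", "operating system": "provider",
             "virtualization": "provider", "hardware": "provider"},
    "SaaS": {"application": "provider", "operating system": "provider",
             "virtualization": "provider", "hardware": "provider"},
}

def tenant_managed(model: str) -> list:
    """Layers the tenant is responsible for under a given delivery model."""
    return [layer for layer, who in CONTROL[model].items() if who == "tenant"]

print(tenant_managed("PaaS"))  # → ['application']
```

The further down the pyramid the service sits, the more layers the tenant controls, and, as chapter 8 argues, the more of the security burden the tenant carries.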
[Table: cloud-specific risks — location awareness (legal/regional, geographic/geo-spatial, organizational premises and network/virtual boundaries), trust chains, data loss and encryption.]
Dispersing data and its backups over multiple datacentres protects them against nearly any local
disaster (except the meteoroid that is going to hit the earth, of course). This, however, also requires
new solutions for data deduplication, and controls for AAA and geographic locations in order to
adhere to the security objectives of the organization.
Results from the Delphi session have identified four potential boundary issues: regional,
organizational premises, network and legal (depicted in figure 4, the red bar being the firewall). These
will be explained below in greater detail, but in short the following definitions can be applied to these
four:
1. Regional or legal boundaries are boundaries that delimit geographic regions in the legal sense:
countries, states, territories and so forth. These regions differ in laws that influence, for
instance, the privacy measures that have to be taken within the cloud. In this perspective
regional boundaries also influence governance & compliance and directly affect Availability
and Auditability within the CI3A (an extension of the CIA triad, elaborated in section 8.5).
2. Organizational premises define the boundaries of an organization. Secret information can be
kept on premise, for instance, to create more assurance over the data. Likewise, it can be kept
off premise because the organization doesn't have the required funds or knowledge to keep it
on organizational grounds.
3. Network boundaries define whether the cloud should be an integral part of the company network
or not. For some data types it can be very important that the data stays within the organization's
network (which can be established using VPNs and other methods); for other data
classifications it is not an issue.
4. Geo-spatial or geographic boundaries are boundaries that are marked by the physical distance
between one point and another. These boundaries are used for disaster recovery, High
Availability (HA), latency and other attributes that are of importance within a cloud
environment.
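As an illustration, the four boundary types can be encoded as constraints when deciding where data may be placed. The providers, field names and rules below are hypothetical, not part of the model itself:

```python
# Sketch: filtering candidate cloud locations against the four boundary
# types described above (regional/legal, organizational premises,
# network, geo-spatial). All values are made up for illustration.

from dataclasses import dataclass

@dataclass
class Location:
    provider: str
    legal_region: str          # regional/legal boundary (e.g. "EU", "US")
    on_premise: bool           # organizational premises boundary
    in_company_network: bool   # network boundary (e.g. reachable via VPN)
    distance_km: int           # geo-spatial boundary

def allowed(loc, *, regions, require_on_premise,
            require_company_network, max_distance_km) -> bool:
    return (loc.legal_region in regions
            and (loc.on_premise or not require_on_premise)
            and (loc.in_company_network or not require_company_network)
            and loc.distance_km <= max_distance_km)

candidates = [
    Location("cloud-a", "US", False, False, 6000),
    Location("cloud-b", "EU", False, True, 400),
]
# Privacy-sensitive data: must stay in the EU and inside the company network.
ok = [c.provider for c in candidates
      if allowed(c, regions={"EU"}, require_on_premise=False,
                 require_company_network=True, max_distance_km=1000)]
print(ok)  # → ['cloud-b']
```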
8.2.1 Locationlessness
Because of the virtualized environment in which the cloud runs, geographic location tends not to
be an issue in the eyes of the beholder. The end user might not know where his data physically
resides, and in some opinions doesn't need to. Although it is indeed possible to create a locationless
cloud, this locationless behaviour of the cloud can be a serious risk that, according to the consensus
of the participating experts, outweighs the benefits.
The real benefit is that any actors willing to harm the data, or the technology underneath it,
have no clue where to start. Taking out a node is easier than taking out a whole cloud
environment, so locationlessness could add an extra layer of protection. It was even mentioned in the
survey that if one carefully selected the locations over which the data is dispersed, agencies
such as the NSA would need to cooperate with their equivalent in, for example, China in order to
get hold of the data they wish to extract. Although this scenario is very unlikely, and many
arguments in the analysis of this scenario are out of the scope of this thesis, it makes a compelling
argument for locationless clouds for those who require minimal interference from any third party.
For the sake of staying within the scope of this research, we will define a locationless cloud as: a
cloud environment in which the end user has no awareness of where his data physically resides.
The issues of a truly locationless environment are plenty. First of all, data has to reside somewhere
physically in order for any system to retrieve it, even though it seems to the end user that his data is
located all over the world. This lack of control makes compliance with any standard incredibly hard.
Physical security, such as access to the facilities, becomes hard to control. Next to that, legal issues
arise, as different countries have different legal systems, each requiring its own solutions for
compliance. Then there is the issue of latency. With no knowledge of where data resides, and no
obligation for providers to give the user that information, your data might move to the other end of
the world from one moment to the next, resulting in high latency and even time-outs. In a poorly
designed environment, features like HA might become compromised if mirroring/replication nodes
change often. Poor design might also make it impossible to set up a VPN connection with the
specific servers hosting your data, which would result in a significantly higher risk of man-in-the-middle
attacks. That also applies to the connection to your data.
subpoenas on data extraction from datacentres. As one expert commented: "bringing privacy
information out of the European Union can be [a] violation of local or European law." This means
that staying in compliance with laws, be they local, national or international, becomes more difficult
without knowledge of the physical location of the data store and processing unit. Depending on the
classification of the data, set objectives, such as screening of all personnel handling the data, cannot,
or can hardly, be complied with.
8.2.3 Geographic or geo-spatial risks
With geographic or geo-spatial risks, the distance of objects "relating to the relative position [..] on the
earth's surface" (Collins English Dictionary, 2009) is meant; in this case the distance between servers,
but also the location of each server. This can be of importance in the case of disaster recovery, but also
with regard to physical security as presented in security norms such as the ISO 2700x series. An
example of a geo-spatial risk is a fire in a datacentre. When availability is of very high importance, a fire
can be disastrous to availability ratings. It is therefore of primary concern that replication takes place to
a server at a distant location as a measure of disaster recovery. Failback servers like these prevent
downtime by stepping in while the datacentre at the other location is burning down. Depending
on the risks and the threats, one might need to have a disaster recovery location offshore. In the light
of location, one could also consider other features such as the building type, the accessibility of the
server, etc.
Geographic location should also be taken into account in the light of latency and propagation
speeds. Servers stationed at the other side of the world will have a harder time delivering data quickly
to the user than the server under the user's desk.
partly the same, as the program is deployed by the end user. Any compliance and governance within
the program, and how it handles data, is the responsibility of the developer. The governance of the
infrastructure and platform on which the application relies is in the hands of the provider. As with
SaaS, negotiations need to take place with the provider in order to secure compliance. For IaaS, most
of the governance and compliance lies in the hands of the tenant. The IaaS provider has to take care of
compliance with standards such as SAS 70, but many issues like privacy, data encryption and
authentication are the responsibility of the tenant.
[Figure 6: trust chains in hierarchical order — an IaaS provider serving PaaS providers, which in turn serve SaaS providers. Doing business with one provider might result in doing business with the whole chain beneath it.]
Trust chains like these influence negotiations. That doesn't mean that complying with standards
within a public cloud environment is hard. In the end, you are still bound to your virtual machine
and you will not be influenced by other tenants.
Concerning boundaries, the major aspect is the geographic location of the servers. The easiest option
is of course the same region in which the organization resides: most knowledge of the laws, and of
executing governance and assuring compliance, will be available there. Auditing will not be an issue,
as you can find an auditing partner with whom you can easily communicate. The hardest option, by
contrast, is a cloud environment dispersed over the globe. Although in terms of disaster recovery
there will be no issue complying with the toughest guidelines, getting audited and governed
worldwide is tougher. A small company utilizing a cloud environment with datacentres on every
continent, while having a branch in only one of those regions, will have more trouble getting its
systems compliant, and thus audited, than if it only used datacentres in the location where it
resides.
Although the experts in the survey were wary of whether it could be done, in a personal interview
with a Chief Information Security Officer of a large utilities company it was made clear that it is
theoretically possible. The issue with getting a successful audit done on an environment like this is that
all auditors need to cooperate. Even though many of the large auditing firms, like KPMG and Ernst
& Young, have branches all over the world, it was mentioned that this doesn't mean that all of them
are willing, or able, to communicate with branches in other parts of the world. Furthermore,
relations have to be built with all cloud providers in order for them to permit auditing of their systems
when physical access for an audit is required. Audits of the virtual systems can be done remotely, and
are presumably much easier as they do not require physical access and can thus be done by one
auditing firm. It is thus an extensive and difficult job to get the whole environment in compliance and
audited to certain security norms.
8.5 CI3A
Assurance in the cloud can be defined in the terms of CI3A (confidentiality, integrity, availability,
accountability and auditability), an extension of the well-known, de facto standard CIA triad. The
CIA triad has been used as a standard framework for testing the confidentiality, integrity and
availability of systems, data flows and so forth. For the cloud, however, it is too constrained, because
of the cloud's extended reach in the virtual and physical sense. Once CI3A is clearly defined and
controlled within the cloud environment, one can speak of assurance within the cloud. The proposed
model utilizes CI3A to assure that the right level of security is maintained within the environment.
Confidentiality assures that data can only be read by authorized actors, with methods such as secured
connections and encryption. The chosen delivery and distribution model influences the methods
needed to assure confidentiality.
Integrity assures that only authorized actors have access to certain data and that said data gets
distributed only to authorized persons. Within that distribution, any editing of or changes to the data
should be made only by the right persons. Man-in-the-middle attacks and other methods of data
interception violate integrity. Governance and compliance influence the integrity of the data: a fully
compliant environment is more likely to assure integrity. As with confidentiality, the chosen delivery
and distribution model influences the level of integrity and the measures needed to enforce it.
Availability comprises the measures taken to prevent unauthorized actors from deleting, moving or
accessing data, to minimize downtime of the environment, and to perceive threats to the
environment. These measures could be an HA infrastructure, strong authentication servers and disaster
recovery. Availability plays a big role within the cloud environment, as servers can be hosted anywhere
in the world, at multiple locations. This can be an advantage for HA and disaster recovery;
latency, desynchronization and vulnerabilities in the extensive set of transceiver links can pose threats.
Availability is linked to the aforementioned boundaries and to the delivery and distribution models.
Accountability defines the measures taken to assure that no actor can perform an action without a
record. This is needed for forensics and governance. The measures needed to assure accountability
greatly depend on the delivery model, but also on the distribution model and compliance in general.
Auditability, the ability of the environment to be audited, is directly related to governance and
compliance. Without a decent grade of auditability, compliance cannot be achieved. Auditability is
influenced by the delivery and distribution model, as well as by the geo-spatial and geographic
boundaries.
CI3A covers all aspects of the proposed model and, when executed correctly, creates assurance
within the cloud environment. These aspects, and how they correspond to the SeCA model, are
outlined below.
The attributes described in the SeCA model are selected on the basis of the results of the Delphi
study and the literature research conducted. Note that the deployment and delivery model are part
of the definition of the cloud, and that all boundary-related risks are included as described in
section 8.2. The encryption attribute was mentioned in the Delphi round, the network and premises
attributes in the literature research (especially the Jericho model).
[Table 6: the attributes in the SeCA model (regional, geo-spatial, compliance, delivery model, deployment model, encryption, network, premises) and how they correspond to the CI3A]
and computing network. Losing data between storage networks, due to time-outs for example, might
mean that data never arrives. In other cases packets might get dropped when synchronization is
misconfigured. This opaqueness within the WAN architecture is an added risk. Note that it depends
on the configuration and regulations of the specific cloud environment whether the WAN architecture
is unknown or fully transparent.
Furthermore, one client's data can be stored on the same disk as his competitor's. A compromised
system can serve your valuable data directly into their hands.
compromised, tenants use the same physical resources. This might increase the risk of data leaking
into the wrong hands.
Ownership can be compromised on certain systems as well. Especially in SaaS applications this
seems to be a hazardous issue. Facebook has ownership covered in its terms as follows: "you grant us a
non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that
you post on or in connection with Facebook ("IP License"). This IP License ends when you delete
your IP content or your account unless your content has been shared with others, and they have not
deleted it." (Facebook, 2010) A particularly hazardous statement for any company, no matter what
information gets posted. Although this statement might not be legal in some countries (including
many countries in the European Union), one can imagine that not posting any IP content on Facebook
is a better solution than litigating against Facebook for misuse of IP content. It is advisable to discuss
data ownership with the cloud provider.
Data loss is an issue on every system, but with the added number of connections and actors, the
risk might be higher than in other system environments. Data ownership is a critical point that should
be documented clearly in the terms of service.
8.7 Encryption
Encryption plays a vital role within the cloud environment. It is affected by all but the geo-spatial
attributes in the SeCA model, and affects the regional attribute and the delivery and deployment
model. Although encryption is a broad topic that has been covered in many papers, theses and books,
some aspects are specifically related to the cloud. VPN tunnels, together with SSH, can provide secure
access to the cloud environment. Two-factor authentication, a method in which the user has to use
two independent methods of authentication to reach a designated part of the environment, can be
very helpful for the cloud. Many institutions are using hardware key-tokens or SMS gateways to
provide the second form of authentication apart from keying in a password. Authentication servers
using protocols such as RADIUS in combination with LDAP, Kerberos or Active Directory can handle
all access requests in a proven manner, as they are no different from any LAN/WAN setup in a
traditional environment. The author therefore believes that in terms of access control, authentication
and authorization, no issues are at hand other than those in a non-cloud environment.
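As an illustration of the key-token style of two-factor authentication mentioned above, the following sketch computes an RFC 6238 time-based one-time password, the kind of code generated by many hardware tokens and authenticator apps. The secret shown is the standard RFC test secret, not a real credential:

```python
# Time-based one-time password (TOTP, RFC 6238): the server and the
# token share a secret; both derive the same short-lived code, which
# serves as the second authentication factor next to the password.

import base64, hashlib, hmac, struct, time

def totp(secret_b32, interval=30, digits=6, now=None):
    key = base64.b32decode(secret_b32)
    counter = int((now if now is not None else time.time()) // interval)
    msg = struct.pack(">Q", counter)             # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                   # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC test secret "12345678901234567890" in base32; at t=59 s the
# counter is 1 and the expected 6-digit code is 287082 (RFC 4226).
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", now=59))  # → 287082
```

Because the code changes every 30 seconds, an intercepted password alone is not enough to reach the designated part of the environment.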
[Table 7: encryption and how it affects, or is affected by, the other attributes in the SeCA model (regional, geo-spatial, compliance, delivery model, deployment model, network, premises)]
Apart from the aforementioned VPN and strong authentication possibilities for boundary support and
authentication & authorization, an encryption method specifically pertaining to the cloud is secure
computing. Secure computing offers a solution to the issues that arise when multiple systems have to
perform secure information transactions. Secure computing in essence revolves around Yao's
millionaires' problem, which is based on the following question: "Two millionaires wish to know who is
richer. However, they do not want to find out inadvertently any additional information about each
other's wealth. How can they carry out such a conversation?" (Yao, 1982) This question also pertains to
the cloud: how does one compute data in the cloud without decrypting it, and thus without the owner
of the computational unit getting access to the data? As one can imagine, asking Millionaire B whether
he is richer without giving essential data (the amount of wealth accumulated by Millionaire A) or
receiving it (the amount of wealth accumulated by Millionaire B) is difficult and cumbersome.
In Yao's paper, three methods are discussed, each consisting of at least seven steps to find the
answer to this relatively simple question. One of them can be summarized, in a very simplistic way, as
follows: using the public key of Millionaire A, Millionaire B encrypts his wealth with a random number
added, which he selects from a set of N-bit integers. Millionaire A then decrypts that value and
calculates the modulus of P, where P is a prime calculated from the set of N-bit integers, stopping
whenever the modulus reaches 2. This results in a list of numbers, which Millionaire A sends to
Millionaire B. The N-bit integer that Millionaire B chose has a corresponding value in the list he
received; he compares that number with the modulus. The outcome (equal, greater or lesser than) is
the answer to the millionaires' problem.
Compared to the three steps needed if both are willing to state the amount of dollars in their bank
accounts (1: A states his amount, 2: B states his amount, 3: compare), it is clear that secure
computing comes with a large overhead.
This research has been extended by Goldreich (2000), who studied the problem with multiple
actors (called SMC, Secure Multi-party Computation). Recent research involves SMC geometry,
studying transactions of polygons on convex hulls; see Wang, Luo & Huang (2008) for an overview.
It is known that any multi-party computational problem can be solved using Yao's generic technique
(Yao, 1982). To overcome the overhead of Yao's millionaires' problem, and thus of SMC, it seems that
algorithms designed to compute one special task need to be written (Goldreich, 2000; Feigenbaum,
Pinkas, Ryger, & Saint-Jean, 2004). Using encryption methods such as homomorphic encryption and
public key encryption, several algorithms have been shown to be applicable to the cloud
(Troncoso-Pastoriza & Pérez-González, 2010; Hu & Xu, 2009; Das & Srinathan, 2007) and have
proven to provide the security needed for the cloud in test situations approaching real-life cloud
environments.
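To illustrate the homomorphic-encryption idea these schemes build on, the following toy sketch uses the Paillier cryptosystem, which is additively homomorphic, with deliberately tiny, insecure primes. It demonstrates the property itself, not any of the cited algorithms:

```python
# Toy Paillier cryptosystem: multiplying two ciphertexts yields an
# encryption of the SUM of the plaintexts, so a cloud node can combine
# encrypted values without ever decrypting them. Key sizes here are
# far too small for real use; illustration only.

import math, random

p, q = 293, 433                     # toy primes (insecure!)
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):                           # the L function from Paillier's scheme
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 1234, 5678
combined = (encrypt(a) * encrypt(b)) % n2   # computed on ciphertexts only
print(decrypt(combined))  # → 6912
```

The holder of the private key learns only the sum; the party that performed the multiplication learns nothing, which is exactly the separation of duties the cloud scenario above requires.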
These methods of secure computing would allow the creation of a chain of trust that is secure, even
though not all parties within the chain know or trust each other. This could overcome trust issues
that exist in the field of cloud environments. Together with the enhanced and proven techniques of
authentication and authorization already available, encryption can make the cloud a very secure
architecture.
that its systems are secure and well audited. This proof can be a valuable asset in negotiations, trust
issues and SLAs.
The described standards vary widely in their applicability. The PCI DSS standard (section 9.2.1) is
only applicable to organisations using payment card solutions. SAS 70 (9.2.2) is an American
accounting standard which does not specify any norms or rules on security or accountability, but
provides a framework for auditing organisations.
internationally renowned security standards (sections 9.2.3 and 9.2.4 respectively). They provide
guidelines on security measures to be taken, and on auditing and governance according to the
respective standards.
9.2.2 SAS 70
The Statement on Auditing Standards number 70: Service Organizations, also known as SAS 70,
provides guidance to external auditors under the Generally Accepted Auditing Standards (GAAS).
GAAS was developed in the United States in 1972. Its international counterpart is the IFRS
(International Financial Reporting Standards).
SAS 70 provides guidance for the auditing of the internal controls of an organization, and is a
requirement of the Sarbanes-Oxley Act of 2002. It is used to show external auditors and customers
that an organization's environment has sufficient internal control activities. The company specifies
certain control objectives and control activities, which are then audited by an external auditor
(SAS70.com, 2011).
Because the company itself sets the control objectives and activities, and the external auditor only
audits whether these objectives and activities are being followed, no real indication of security is
given. An extensive review of these objectives and activities needs to be done in order to understand
what is truly being audited.
9.2.4 COBIT
Control Objectives for Information and related Technology (COBIT) was developed in 1996 as a set
of best practices for IT management. It consists of 34 high-level processes, covering 318 control
objectives, and has been adopted worldwide.
COBIT comes with an IT Assurance Guide to help service providers maintain assurance.
Although there is not much activity with COBIT in the cloud yet, it can potentially be very useful for
compliance and governance.
[Table 8: comparison of a dedicated outsourced solution and a public IaaS cloud solution on the characteristics virtualisation, trust chain, multi-tenancy, ownership of architecture, compliance, flexibility, WAN infrastructure, rapid deployment, geographic location and disaster recovery]
The major difference between conventional outsourcing and a public cloud hosted off premise
within the same geographic region as the tenant is that the public cloud will be used by more tenants
on the same hardware. This means that in the case of a leak in the virtualization platform, every
tenant's virtual machine might be compromised. The agility of the system, including disaster recovery
options and backup solutions, is a profound advantage of the cloud environment. As virtualization
provides an added layer of security, a public cloud solution might be more secure than a traditionally
outsourced server. However, it all depends on the regulations within the SLA, and as cloud providers
deal with more customers on the same set of hardware, it might be harder to get the same SLA as you
were used to with your outsourcing provider. Also, depending on the data that will be stored, and the
risks associated with leakage of that data, it might be more secure to store data with a traditional
outsourcing provider, due to concerns about multiple tenants on the same hardware. This could also
be overcome by using a private cloud: it provides the security of only your organization utilizing the
cloud architecture, yet gives the benefits of the agility of the cloud.
[Table 9: comparison of a conventional insourced solution and a private cloud, hosted on premise, on the characteristics virtualisation, trust chain, multi-tenancy, ownership, compliance, disaster recovery, flexibility, WAN infrastructure, rapid deployment and geographic location]
A good comparison to a cloud solution would be a private cloud, hosted on premise. The major
difference between this cloud architecture and an insourced datacentre is that it leverages the
combination of techniques used in the cloud, such as virtualization, agility and multi-tenancy (the
tenants being workgroups, projects or any other sort of segregation within the company). One must
keep in mind that a private cloud hosted on premise is not as agile as a public cloud: for every
expansion of the environment (not single VMs per se) a new physical server needs to be installed,
whereas in a public cloud the limits of the environment are not as easily reached and there are more
options for extending it than only adding a new physical machine.
Internal/external describes the physical boundary of the cloud environment and how it
corresponds to your organisational premises. Internal thus means that the cloud
environment is physically present within organisational boundaries.
Open/proprietary refers to the systems used in the cloud environment: open means open
standards, open source software and the like.
Insourced/outsourced defines whether the cloud is operated by the
organisation's own personnel or by a third party.
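To make the criteria concrete, they can be expressed as a small data model. The sketch below is purely illustrative and not part of the Cloud Cube Model itself; all type and attribute names are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class Boundary(Enum):      # internal/external
    INTERNAL = "on organisational premises"
    EXTERNAL = "outside organisational premises"

class Technology(Enum):    # open/proprietary
    OPEN = "open standards and open source"
    PROPRIETARY = "proprietary systems"

class Sourcing(Enum):      # insourced/outsourced
    INSOURCED = "operated by the organisation's own personnel"
    OUTSOURCED = "operated by a third party"

@dataclass
class CloudEnvironment:
    """Describes one cloud environment along the three criteria above."""
    boundary: Boundary
    technology: Technology
    sourcing: Sourcing

# An on-premise private cloud run by the organisation itself:
env = CloudEnvironment(Boundary.INTERNAL, Technology.OPEN, Sourcing.INSOURCED)
assert env.boundary is Boundary.INTERNAL
```

Typed enumerations of this kind make it explicit that each criterion is a choice between a small, fixed set of options rather than free-form text.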
During the research conducted, the author has found that the four criteria listed are valid,
but do not completely encompass the varieties of cloud environments. The Cloud Cube Model is
targeted at clouds in general, not specifically at the security of the cloud environment. It lists the Open
and Proprietary attributes as a significant part of the cloud. In the model developed, shown below,
this attribute has been left out: the development of a majority of cloud environments is done with
open source software, and further research into this was not possible within the time allotted. The
author does not consider the difference between proprietary and open source software a significant
security concern; the Jericho Forum has likewise listed it as a vendor lock-in issue rather than a security
concern. A new model has been developed in order to fill the gap left by the Cloud Cube Model,
discussed below.
Figure 8: the SeCA model. The input is a classification, which is passed through all the attributes (depicted horizontally); the output is a specification for a secure cloud architecture.
[Process flow: classify data → analyse cloud architecture using SeCA → list & select cloud providers → place data in cloud]
Date: ____________        Expert's Name: _________________________
Geospatial:
Delivery Model:    o IaaS    o PaaS    o SaaS
Deployment Model:  o Private    o Partner/Community    o Public    o Hybrid
Encryption:
Network:   o Within    o Outside    o Any
Premises:  o On premise    o Off premise    o Any
Table 10: a SeCA template form to be used in assessing the architecture needed for a secure cloud solution using the SeCA
model
It can occur that each classification has a different output from the assessment (in fact, this is the
most likely outcome). In that case several options are open. For each classification, a separate list of cloud
providers can be made in order to find and select the right cloud provider for the required cloud
architecture; these can then be combined in a hybrid cloud. Note that a hybrid cloud solution gives
the assessor, as future client of the cloud providers, a better negotiating position. One can also decide that
for certain classifications it is simply not feasible to transfer the data into the cloud, and thus stick with
the solutions already in place.
The model does not indicate which classifications could be hosted on the same cloud
architecture; it is up to the assessor to decide which of the cloud architectures resulting from the
assessments can be merged.
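The assessment process (classify the data, pass each classification through the SeCA attributes, then select providers per resulting architecture) can be sketched in code. This is an illustrative sketch only, not an implementation of the SeCA model; the attribute names follow this chapter, but the function and the example measures are hypothetical.

```python
# Hypothetical sketch of the SeCA assessment flow: each data
# classification is passed through the model's attributes, yielding
# a per-classification specification of a secure cloud architecture.

SECA_ATTRIBUTES = [
    "regional", "geo_spatial", "governance_compliance",
    "delivery_model", "deployment_model", "encryption",
    "network", "premises",
]

def assess(classification: str, measures: dict) -> dict:
    """Return the architecture specification for one classification.

    `measures` maps (classification, attribute) pairs to the required
    measure; attributes with no defined measure impose no restriction.
    """
    return {attr: measures.get((classification, attr), "any")
            for attr in SECA_ATTRIBUTES}

# Example measures, loosely based on the tables in this chapter.
measures = {
    ("top_secret", "delivery_model"): "IaaS",
    ("top_secret", "premises"): "on premise",
    ("top_secret", "network"): "within the network",
    ("public", "premises"): "off premise",
}

spec = assess("top_secret", measures)
assert spec["delivery_model"] == "IaaS"
assert spec["encryption"] == "any"   # no measure defined -> no restriction
```

Because each classification produces its own specification dictionary, comparing two specifications attribute by attribute is one simple way for an assessor to judge whether the underlying architectures can be merged.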
10.3 Inputs
The input to the SeCA model is a classification. This classification needs to be made within the
organization by its security experts in order to define how data should be managed. These classifications
tend to differ between organisations, as they depend on what kind of data is being used, read, published
and processed by the organisation. For the examples used here, we use the following classifications
(also shown in Section 6, Table 3):
Classification    Measures
1: Top Secret     Data that should only be handled by specific people with the right
                  authorization. Very high business impact if a data leak occurs.
                  Resulting measures: personnel screening, data on organizational premises,
                  data hosted only in the same country as the organization, dedicated
                  hardware (i.e. no multitenancy), replication on at least n+1 locations, data
                  must be within the network.
2: Secret         Data that should be handled by a limited number of people with the right
                  authorization. High business impact if a data leak occurs.
                  Resulting measures: screened personnel, within the region of the organization.
                  Data may be off-premise; HA architecture.
3: Private
4: Public
Table 11: exemplary classifications that could be used as input in the SeCA model
10.3.1 Attributes
The input is then tested against several attributes, which are all relevant to the CI3A (see also Table
6: the attributes in the SeCA model and how they correspond to the CI3A). The CI3A definition works
both ways: on the one hand its content is defined by the data classification; on the other hand
the attributes define how the definitions set in the classification result in a corresponding cloud
architecture. Below is an outline of the attributes in the model. Each attribute features a table that shows
the measures that need to be taken for each data classification.
10.3.1.1 Regional
Regional corresponds to the regional boundaries discussed in 8.2.2. They are the paramount of legal
importance when it comes to boundaries and tie in with Governance & Compliance, discussed below.
Regional can define multiple things. It might mean any legal boundary where the data are
hosted/executed, such as countries, states and counties. If we look at the classifications above, the
regional attribute will have different impacts. In the Top Secret classification, the data should be
hosted and executed within the region of the organization. It depends on the organization where that
would be. A multinational will have multiple offices in multiple regions. A small business will not.The
Public classification gives more room in this case, only requiring that one hosting location is within
the region of the organization. This might be set for example to assure legal directives like those are
common within the EU where all data of European organizations should be hosted in the EU.
Depending on the size of the region, creating a HA architecture with disaster recovery might be
difficult.
Classification    Measure
1: Top Secret     Data hosted and executed within the region of the organization
2: Secret         Cloud environment physically within the same region as the organisation
3: Private        Cloud environment physically within the same region as the organisation or its partners
4: Public         Cloud environment physically has one system within the same region as the organisation
10.3.1.2 Geo-spatial
The Geo-spatial attribute defines the locality of hosting locations. There might be requirements
concerning disaster recovery where hosting location should at least be separated by a certain amount of
distance, or be separated by certain geographic features such as rivers, lakes or mountains. But also on
a micro level, such as specific industrial terrains where datacentres should be located. It has been
thoroughly discussed in 8.2.3. In case of classification Secret, which requires a HA architecture, one
can imagine setting a large distance between the datacentres in order to provide replication and
disaster recovery. Looking at Top Secret, this might be a struggle due to the fact that the servers
have to be on-premise.
10.3.1.4 Delivery Model
Classification    Measure
1: Top Secret     IaaS architecture, due to the need for flexibility and custom security measures
2: Secret         Any, as long as trust chains are fully disclosed and personnel can be screened.
3: Private        Any.
4: Public         Any.
10.3.1.5 Deployment Model
Classification    Measure
1: Top Secret
2: Secret
3: Private        Any.
4: Public         Any.
10.3.1.6 Encryption
Depending on the classification, certain encryption features need to be present in the cloud
environment. These might be mandatory in order to get certified, or to comply to rules and regulations
in certain legal districts. But encryption also provides the option for in-network access to the data
stored in the cloud environment. Secure computing can also be used to satisfy demands and objectives.
Encryption methods that apply to the cloud are described at large in 8.7. Data classified as Top
Secret will require an unbreakable encryption. Secure computing, two-factor authentication and
53
algorithms like AES and Elliptic Curve will be good options. The classification Public will be good
with encryption used for authentication and authorization in terms of adding data, editing and
deletion of data,, but the data itself does not need to be encrypted.
Classification    Measure
1: Top Secret     Strong two-way authentication; all data is encrypted and then stored. For any
                  computations, SMC, or computation within the same system, might be a solution.
2: Secret
3: Private
4: Public
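As an illustration of the Public case, where only additions, edits and deletions need to be authenticated while the data itself stays unencrypted, a keyed message authentication code suffices. The sketch below uses Python's standard library and is illustrative only; the key and data are placeholders, and key distribution is assumed to be handled elsewhere.

```python
import hashlib
import hmac

SECRET_KEY = b"hypothetical-shared-key"  # assumed to be provisioned securely

def tag(data: bytes) -> str:
    """Compute an HMAC over public data so writes can be authenticated."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, received_tag: str) -> bool:
    """Constant-time check that the data was written by a key holder."""
    return hmac.compare_digest(tag(data), received_tag)

document = b"publicly readable record"
t = tag(document)
assert verify(document, t)                # authentic update accepted
assert not verify(b"tampered record", t)  # unauthenticated edit rejected
```

The record remains readable by anyone, but only holders of the key can produce a tag the cloud environment will accept, matching the requirement that authentication, not confidentiality, protects Public data.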
10.3.1.7 Network
The network attribute defines the network boundary the cloud environment should be in and how it
should be setup., as described in 8.2.5. If the cloud environment should be in the network, it can be
setup with measures like VPN, WebDAV and, in the case of on premise cloud environments VLANs
can be used.
Classification    Measure
1: Top Secret     Within the network
2: Secret         Can be both
3: Private        Can be both
4: Public
10.3.1.8 Premises
Premises defines whether the cloud environment should be on organizational premises or not. This
might enhance the security of the environment and allows for more control concerning physical access
to the environment and personnel screening. However, off-premise means that the data will be hosted
at a company that has specific expertise with hosting and the cloud. This might add to the security as
well.
Classification    Measure
1: Top Secret     On premise
2: Secret         No preference
3: Private        No preference
4: Public         Off premise
10.3.2 Outputs
The output is a list of requirements to which the cloud architecture should adhere: in other words, a
framework for a secure cloud environment for the tested classification. If these requirements are applied
to the cloud environment, the data can be stored and processed securely. Of course, these requirements
need to be audited to create assurance according to the CI3A.
For the classifications used in this example (see Table 11, above), the following would be the output:
Classification: Top Secret
Attribute               Value
Regional                Hosted and executed within the region of the organization
Geo-spatial
Governance/compliance
Delivery model          IaaS
Deployment model
Encryption              Strong two-way authentication; all data encrypted before storage
Network                 Within the network
Premises                On premise
Table 20: Specification of the Secure Cloud Architecture for the Top Secret classification, output from the SeCA model
Classification: Secret
Attribute               Value
Regional                Cloud environment physically within the same region as the organisation
Geo-spatial
Governance/compliance
Delivery model          Any, as long as trust chains are fully disclosed and personnel can be screened.
Deployment model
Encryption
Network                 Can be both
Premises                No preference
Table 21: Specification of the Secure Cloud Architecture for the Secret classification, output from the SeCA model
Classification: Private
Attribute               Value
Regional                Cloud environment physically within the same region as the organisation or its partners
Geo-spatial
Governance/compliance
Delivery model          Any
Deployment model        Any
Encryption
Network                 Can be both
Premises                No preference
Table 22: Specification of the Secure Cloud Architecture for the Private classification, output from the SeCA model
Classification: Public
Attribute               Value
Regional                Cloud environment physically has one system within the same region as the organisation
Geo-spatial
Governance/compliance
Delivery model          Any
Deployment model        Any
Encryption
Network
Premises                Off premise
Table 23: Specification of the Secure Cloud Architecture for the Public classification, output from the SeCA model
11 Conclusions
After researching and defining the issues the cloud faces and the solutions to those issues, several conclusions
can be drawn. While reading these conclusions it is paramount to keep in mind that the cloud is a
new business model for service providers, not a new technology on its own. The central question is:
is the cloud secure?
Defining something as secure depends on many factors. Depending on the sort of data, the
classification of that data, and the perspective of the whole cloud environment, it can be
said that the cloud is secure in certain situations. Depending on the outcome of the assessment, there
should always be a cloud architecture that fits one's security needs. Better yet, the cloud can provide
additional layers of security by utilizing virtualization, elasticity and HA architectures. Even though
the additional virtualization layer on the system might introduce additional hazards, these hazards can
only be exploited when the virtualization platform is compromised. With a minimum of known
bugs, the last one dating from 2007, one can rationally say that the virtualization layer adds more
protection than threats.
By using the SeCA model described above, each and every classification can be checked to see how
a cloud architecture should be designed in order to meet the security standards needed. It will,
however, depend on the cloud provider whether it can deliver the architecture that is needed.
For the most secure classifications, a private cloud, hosted on premise, within the network, with
mirroring at a different physical location (such as a branch office) and utilizing the needed encryption methods,
will provide a very secure architecture whilst maintaining the flexibility the cloud has to offer.
For every architecture, data location awareness is essential. Without full knowledge
of where the data resides and is processed, issues will arise in all aspects of the CI3A. Data location
awareness also provides the means for compliance, both legally and with security standards. These
standards are being adopted by all major vendors, including Amazon, Google and Microsoft, with
smaller ones following. This facilitates full compliance with the de facto security and auditing standards
such as SAS 70, the ISO 27000 series, PCI and COBIT. It depends, once again, on the configuration of
the cloud architecture and, where applicable, the willingness of the cloud provider to allow audits. If the
selected cloud architecture features datacentres in widely spread parts of the world, auditing
might be more complicated. This also applies to compliance with legal systems (privacy,
intellectual property and auditing regulations), which can vary between jurisdictions. It is because of
these implications that so-called locationless clouds are not preferable: they introduce an opaque layer that
hides from the user vital knowledge needed to be assured of a secure cloud.
12 Further Research
Further research can be conducted in the legal field. This was outside the scope of this thesis, but the legal
issues surrounding auditing, SLAs and NDAs are of paramount importance for security in the
cloud. SLAs especially are of profound importance, as they describe what measures a cloud provider
should undertake for the security of the cloud. This thesis unfortunately has not had the opportunity to
explore the provider side of the cloud environment much.
Related to this is auditing in international and worldwide clouds. Auditing certifications, governance
and compliance with legal systems in these environments means that auditing firms, datacentre owners,
providers and application owners all need to work together. In
international and worldwide clouds these relations might become very complex, not to mention that
multiple audit firms and offices have to work together. The issues raised by datacentres situated in
different legal regions, such as China and the United States, are worth more research; auditing
also plays a major role here.
True locationless clouds, which provide security by obscurity, are another interesting
topic. Although such a system seems hard to develop, as the end user still needs to reach his data without
disclosing the location where it is harboured, a proof of concept would be very interesting: it would
not only show what state-of-the-art techniques the cloud can use, but also highlight current issues in
countries with different legal systems. In the light of, for example, freedom of speech, these solutions
might be very useful. One might consider a kind of interface that has to attempt to open a series of
ports on a series of hosts (like port knocking, but spread over various hosts) to determine the location of
the data.
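Such an interface could, for instance, derive its knock sequence deterministically from a shared secret, so that only a key holder can reconstruct which hosts and ports to knock. The sketch below is purely illustrative: the host names are placeholders, no network traffic is sent, and the derivation scheme is a hypothetical example rather than an established protocol.

```python
import hashlib

def knock_sequence(secret: bytes, hosts: list, knocks: int = 4) -> list:
    """Derive a deterministic (host, port) knock sequence from a secret.

    Each round hashes the secret with a round counter; the digest picks
    a host and a high port, so only key holders can recompute the path.
    """
    seq = []
    for i in range(knocks):
        digest = hashlib.sha256(secret + i.to_bytes(4, "big")).digest()
        host = hosts[digest[0] % len(hosts)]                      # pick a host
        port = 1024 + int.from_bytes(digest[1:3], "big") % 64000  # high port
        seq.append((host, port))
    return seq

# Placeholder gateway hosts; in a real system these would front the cloud.
hosts = ["gw1.example.net", "gw2.example.net", "gw3.example.net"]
seq = knock_sequence(b"shared-secret", hosts)
assert seq == knock_sequence(b"shared-secret", hosts)  # reproducible for key holders
assert all(1024 <= port < 65024 for _, port in seq)    # ports stay in range
```

Because the sequence is derived rather than stored, no single gateway needs to know the data's final location, which is exactly the obscurity property such a proof of concept would want to demonstrate.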
A pressing issue not discussed here, but worth researching, concerns the third-party appliances that are currently
installed in traditional datacentres. These appliances cannot be directly converted to the cloud, as the
cloud does not offer any place for them. At the moment of writing, many of
these appliances are being converted to the cloud by their developers. It is nonetheless interesting to see what
impact these appliances have on the adoption of the cloud.
Although some cloud providers are certified, the impact of that certification on the real security of
the services the provider offers is not always known. SAS 70, for example, does not offer any concrete
security; it only offers a framework for auditing internal controls. The cloud provider will need to list
its internal controls for any user to see what has been audited. It might be interesting to see how cloud
providers use that information, what they do with it, and whether the certifications really add the
extra level of security they are said to add.
13 Bibliography
Asadoorian, P. (2007, 7 31). Escaping from the Virtualization Cave. Retrieved 12 22, 2010, from
PaulDotCom:
http://www.pauldotcom.com/2007/07/31/escaping_from_the_virtualizati.html
Backes, M., Ning, P., Wang, Q., Wang, C., Li, J., Ren, K., et al. (2009). Enabling Public Verifiability and
Data Dynamics for Storage Security in Cloud Computing. Computer Security ESORICS 2009 (pp.
355-370). Berlin / Heidelberg: Springer.
Bartvagh. (2010, May 7). Gartner: Private Cloud Computing Plans From Conference Polls. Retrieved
September 25, 2010, from MSDN Blogs:
http://blogs.msdn.com/b/architectsrule/archive/2010/05/07/gartner-private-cloudcomputing-plans-from-conference-polls.aspx
Blakley, B., McDermott, E., & Geer, D. (2008). Information Security is Information Risk Management.
ACM New Security Paradigms Workshop '08, 97-102.
Blokdijk, G., & Menken, I. (2009). Cloud Computing Virtualization Specialist Complete Certification Kit.
London: Emereo Pty Ltd.
Chen, Y., Paxson, V., & Katz, R. H. (2010). What's New About Cloud Computing Security? Berkeley,
CA, USA.
Christodorescu, M., Sailer, R., Schales, D. L., Sgandurra, D., & Zamboni, D. (2009, Nov 13). Cloud
Security Is Not (Just) Virtualization Security. Chicago/Zurich, USA/CH.
Cisco. (2011). Cisco Data Center Network Manager Release 5.1. Retrieved January 5, 2011, from Cisco Data
Center Network Manager:
http://www.cisco.com/en/US/prod/collateral/netmgtsw/ps6505/ps9369/data_sheet_c78631924.html
Cisco. (2011). Cisco Fabric Manager 5.0: Visibility and Control for the Unified Data. Retrieved January 5,
2011, from Cisco MDS 9000 SAN Management:
http://www.cisco.com/en/US/prod/collateral/ps4159/ps6409/ps4358/product_data_sheet0
9186a00800c4656_ps6030_Products_Data_Sheet.html
Cisco. (2011). Overlay Transport Virtualization for Geographically Dispersed Virtual Data Centers: Improve
Application Availability and Portability. Retrieved January 5, 2011, from Cisco Nexus 7000 series
switches:
http://www.cisco.com/en/US/prod/collateral/switches/ps9441/ps9402/solution_overview
_c22-574939.html
Collins English Dictionary. (2009). geospatial. In Collins English Dictionary - Complete & Unabridged 10th
Edition. HarperCollins Publishers.
Collins, J. (2009, August 24). Virtualization and the Cloud: Just a stepping stone? Retrieved September 25,
2010, from The Register:
http://www.theregister.co.uk/2009/08/24/virtualization_and_cloud/
Das, A. S., & Srinathan, K. (2007). Privacy Preserving Cooperative Clustering Service. 15th International
Conference on Advanced Computing and Communications (pp. 435-441). IEEE.
de Bruijn, W., Spruit, M. R., & van den Heuvel, M. (2009). Identifying the Cost of Security. Journal of
Information Assurance and Security, 5(1), 79-83.
Dwivedi, H. (2005). Securing Storage. Westford: Pearson Education, Inc.
ENISA. (2009). Cloud Computing: Benefits, risks and recommendations for information security. ENISA,
Emerging and Future Risk programme. Crete: ENISA.
Eucalyptus. (2011). Eucalyptus Community Cloud. Retrieved April 5, 2011, from Eucalyptus:
http://open.eucalyptus.com/CommunityCloud
Facebook. (2010, 10 4). Statement of Rights and Responsibilities. Retrieved 1 27, 2011, from Facebook:
http://www.facebook.com/terms.php?ref=pf
Feigenbaum, J., Pinkas, B., Ryger, R., & Saint-Jean, F. (2004). Secure computation of surveys.
Proceedings on the EU Workshop on Secure Multiparty Protocols. Citeseer.
Gilder, G. (2006, October). The Information Factories. Retrieved November 30, 2010, from Wired.com:
http://www.wired.com/wired/archive/14.10/cloudware_pr.html
Gnagey, K. (2010, February). Cloud Storage - Where are we at? SNS Europe, 10(1), 31-32.
Goldreich, O. (2000). Secure Multi-party Computation. Working Draft.
Gonzales, J. J., & Sawicka, A. (2002). A Framework for Human Factors in Information Security. 2002
WSEAS Int. Conf. on Information Security (pp. 1-6). Rio de Janeiro: WSEAS.
Grossman, R. L. (2009, March/April). The Case for Cloud Computing. IT Professional, pp. 23-27.
Dalkey, N., & Helmer, O. (1963, April). An Experimental Application of the Delphi Method to the Use of Experts.
Management Science, 9(3), 458-467.
Hewlett-Packard Development Company, L.P. (2011, March 14). HP Sets Strategy to Lead in Connected
World with Services, Solutions and Technologie. Retrieved April 5, 2011, from HP Newsroom:
http://www.hp.com/hpinfo/newsroom/press/2011/110314xa.html?mtxs=rss-corp-news
Hintzbergen, J., Hintzbergen, K., Smulders, A., & Baars, H. (2010). Foundations of Information Security.
Zaltbommel: Van Haren Publishing.
Hu, H., & Xu, J. (2009). Non-exposure location anonymity. ICDE'09. IEEE 25th International Conference on
Data Engineering, 2009 (pp. 1120-1131). IEEE.
Hwang, K., Kulkareni, S., & Hu, Y. (2009). Cloud Security with Virtualized Defense and Reputation-Based Trust Management. IEEE International Symposium on Autonomic and Secure Computing. 8,
pp. 717-722. IEEE.
ICT-Kring Delft. (2009). ICT Security in de Praktijk. Apeldoorn: Thieme Print4U.
International Organization for Standardization. (2005). ISO/IEC 27002:2005: Information Technology -
Security Techniques - Code of Practice for Information Security Management. Geneva, Switzerland.
Jensen, M., Schwenk, J., Gruschka, N., & Lo, L. (2009). On Technical Security Issues in Cloud
Computing. Proceedings of the 2009 IEEE International Conference on Cloud Computing (CLOUD '09)
(pp. 109-116). Washington: IEEE Computer Society.
Jericho Forum. (2009, April). Cloud Cube Model: Selecting Cloud Formations for Secure Collaboration.
Retrieved January 9, 2011, from Jericho Forum:
http://www.opengroup.org/jericho/cloud_cube_model_v1.0.pdf
Kandukuri, B. R., Ramakrishna, P. V., & Atanu , R. (2009). Cloud Security Issues. IEEE International
Conference on Service Computing (pp. 517-520). IEEE.
Levelt, W. (2010, August 3). Cloud Security - The fear for the unknown. Capgemini Cloud Comuting
Conference. 3. Utrecht: Capgemini.
McAfee. (2010, September 25). Security as a Service. Retrieved September 25, 2010, from McAfee:
http://mcafee.com/us/small/security_insights/security_as_a_service.html
Mehta, N., & Smith, R. (2007, 9 17). VMWare DHCP Server Remote Code Execution Vulnerabilities.
Retrieved 12 22, 2010, from IBM Internet Security Systems:
http://www.iss.net/threats/275.html
Mulholland, A., Pyke, J., & Fingar, P. (2010). Enterprise Cloud Computing (1st Edition ed.). Tampa, Fl.,
USA: Meghan-Kiffer Press.
NIST. (2006, April 25). Glossary of Key Information Security terms. (R. Kissel, Ed.) Retrieved November 29,
2010, from Computer Security Resource Center NIST:
http://csrc.nist.gov/publications/nistir/NISTIR7298_Glossary_Key_Infor_Security_Terms.pdf
NIST. (2010). NIST Definition of Cloud Computing v15. Department of Commerce. Washington: NIST.
Ormandy, T. (2007). An Empirical Study into the Security Exposure to Hosts of Hostile Virtualized
Environments. Proceedings of CanSecWest Applied Security Conference. Vancouver.
PCI Security Standards Council. (2010). PCI DSS Requirements and Security Assessment Procedures, Version
2.0. PCI Security Standards Council, LLC.
PCI Security Standards Council, LLC. (2011). PCI SSC Data Security Standards Overview. Retrieved
January 7, 2011, from PCI Security Standards Council:
https://www.pcisecuritystandards.org/security_standards/index.php
Preston, W. C. (2002). Using SANs and NAS. Sebastopol: O'Reilly Media.
Reuters. (2008, September 25). What on earth is cloud computing? New York, NY, USA: Reuters.
Ristenpart, T., Tromer, E., Shacham, H., & Savage, S. (2009). Hey, You, Get Off of My Cloud: Exploring
Information Leakage in Third-Party Compute Clouds. CCS'09 (pp. 1-14). Chicago: ACM.
SAS70.com. (2011). SAS 70 Overview. Retrieved January 7, 2011, from SAS 70:
http://sas70.com/sas70_overview.html
Skulmoski, G. J., Hartman, F. T., & Krahn, J. (2007). The Delphi Method for Graduate Research. Journal
of Information Technology Education(6).
Stoneburner, G. (2001). Underlying Technical Models for Information Technology Security:
Recommendations of the National Institute of Standards and Technology. Department of
Commerce, National Institute of Standards and Technology. Gaitherburg, MD: NIST.
Thusoo, A. (2009, June 10). Hive - A Petabyte Scale Data Warehouse using Hadoop. Retrieved October 5,
2010, from Facebook: http://www.facebook.com/note.php?note_id=89508453919
Troncoso-Pastoriza, J. R., & Pérez-González, F. (2010). CryptoDSPs for Cloud Privacy. CISE 2010. Hong
Kong.
Vaquero, L. M., Rodero-Merino, L., & Cacer, J. (2009, January). A Break in the Clouds: Towards a Cloud
Definition. ACM SIGCOMM Computer Communication Review, 2009(39), 50-55.
Vigfusson, Y., & Chickler, G. (2010, Spring). Clouds at Crossroads: Research Perspectives. Crossroads,
16(3), 10-13.
VMWare. (2007). Understanding Full Virtualization, Paravirtualization, and Hardware Assist. Retrieved 10
18, 2010, from VMWare:
http://www.vmware.com/files/pdf/VMware_paravirtualization.pdf
VMware. (2010, September 1). Creating a Provider Virtual Data Center in VMware vCloud Director.
Retrieved January 5, 2011, from VMware Knowledge Base:
http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC
&externalId=1026296
Voas, J., & Zhang, J. (2009, April/May). Cloud Computing: New Wine or Just a New Bottle? IT
Professional, pp. 15-17.
Wang, Q., Luo, Y., & Huang, L. (2008). Privacy-preserving Protocols for Finding the Convex Hulls. The
Third International Conference on Availability, Reliability and Security (pp. 727-733). IEEE.
Wang, Q., Wang, C., Li, J., Ren, K., & Lou, W. (2010). Enabling Public Verifiability and Data Dynamics
for Storage Security in Cloud Computing. Computer Security - ESORICS 2009, 355-370.
Wharton Business School. (2005, February 17). Delphi Decision Aid. (Y. V. Simon Galperin, Producer, &
Skytech Systems) Retrieved October 5, 2010, from Delphi2:
http://armstrong.wharton.upenn.edu/delphi2/
Yao, A. C. (1982). Protocols for Secure Computations. Proceedings of the 23rd Annual IEEE Symposium on
Foundations of Computer Science, (pp. 160-164). Chicago.
14 Appendix
14.1 Results from the Delphi study Round 1
Data results: Question#1
Question: First of all, thank you for participating in this research. In this research we
will refer to the cloud often. Just to be sure, with the cloud we mean The Cloud as
defined by ENISA: Cloud computing is an on-demand service model for IT
provision, often based on virtualization and distributed computing technologies,
which in essence is the same as the NIST definition. To be clear: we are not going to
talk about Cumulus or other forms of atmospheric vapor buildup :) I hope you enjoy
answering the questions. Let us commence: Do you have experience with the cloud,
and if so, on which level? (architect, developer, end user etc.)
Responses for this question
Expert's answer:
I have no experience with cloud computing so far.
Expert's answer:
yes, as a user and as a decision maker in a management role
Expert's answer:
end user
Expert's answer:
Yes, as architect. Providing advice on pro's and con's of cloud solutions.
Expert's answer:
I have written my thesis about cloud computing (hereafter: CC). Furthermore
I review designs and other implementations of CC in my function as a
technical IT auditor with a focus on security.
Expert's answer:
main experience is as end-user
Expert's answer:
yes, in the area of providing requirements from compliance point of view
Expert's answer:
I don't have experience
Expert's answer:
I have only experience in personal use. My company does not use any cloud
solution
Expert's answer:
user and security architect
Expert's answer:
Architect level
Expert's answer:
Sales and advise about security related issues in the cloud
Expert's answer:
Manager of architecture, development and deployment. Also, services on the
cloud environment are under my responsibility
Expert's answer:
As an IT auditor we audit companies which use cloud computing in their line
of business
Expert's answer:
Little, as an architect
Expert's answer:
I do not have much experience with cloud applications other than as an end
user of certain distributed computing projects such as Folding@Home.
However, there are a lot of tools in the wild that make use of a cloud
environment that I am aware of (have tested or researched) such as Web Of
Trust or other community-based site ranking systems.
Expert's answer:
I'm from a security point of view interested in the cloud and
applications/usage
Yes, but it depends on what systems are part of the cloud used by a company
to store and/or process data. If any of the systems are untrusted, the safety of
the data can be compromised. The data on any one system may not be
complete (since this is beyond the system's control) but it remains a risk.
Expert's comment:
Some new security issues, especially due to virtualisation. But mainly loss of
control on current security measures
Expert's comment:
Yes; because CC is the combination of a lot of computing techniques which
each have their own security issues. If you combine the security issues you
get a new attack vector!
Expert's comment:
As the environment will differ from the current known environment, it will
create new issues
Expert's comment:
- Uncertainty about the location of the data, which is important for complying
with privacy legislation - uncertainty about the possibilities for
governments to access the data - uncertainty about the quality of statements by
auditors in countries with a lower standard than in North America and
Europe
Expert's comment:
I think the cloud makes using information everywhere, with every device
and at any moment, more possible than using information in a strictly closed
company environment
Expert's comment:
The cloud has no boundaries; your data can be stored any place on
earth. Your data is no longer under your own sight.
Expert's comment:
The location of the data is not known, so this poses additional risks in my
opinion
Expert's comment:
enterprise wide authorisation
Expert's comment:
I answered yes, mostly because I believe that security issues are part of every
application there is. The cloud actually increases potential issues by
extending the reach of your 'trusted network'. However, I'm not sure any of
those potential issues are new by any means.
Expert's comment:
As answered in the previous two questions, yes.
Expert's comment:
Maybe not the complexity, but just that with cloud solutions more
responsibility is moved to the provider (e.g. performing upgrades, testing new
functionality, backups, ...).
Expert's comment:
Yes; there will be resellers who provide other services. Any malicious
company could therefore become a reseller or partner.
Expert's comment:
the more partners involved the more trust issues are a risk.
Expert's comment:
Not so much trust; one will be more dependent on partnerships and therefore
ask for more evidence to prove trust. The main issue will be how things are
guaranteed when partners use third parties in between.
Expert's comment:
Even if contracts can provide assurances within the usual quality frameworks
of North America and Western Europe (plus Australia), that does not mean
the same level can be achieved when work is subcontracted (management, for
example) to other parts of the world.
Expert's comment:
As in my statement in question 3, it is possible that your information is
disclosed by unauthorised people or even governments. You don't know where
your information resides or who is taking care of it.
Expert's comment:
There could be a "third" or "fourth" vendor situated behind your trusted
partner, without you knowing it.
Expert's comment:
More partners are working together in the cloud and this chain is growing, so
I believe this is true.
Expert's comment:
Storage is a business, and thus there is trade in storage across many chains.
You only know the last link in the chain.
Expert's comment:
The cloud will provide easier administration of customers and less
configuration. Infrastructure issues will grow, as connectivity is the key
problem.
Expert's comment:
No, the trust issues are the same as with outsourcing.
Expert's comment:
In current businesses and organisations it is already hard to commit to
agreements and SLAs. In more complex business relations this matter will
also become more complex, and sanctions for not meeting SLA requirements
will be dealt with at a legal level.
Expert's comment:
I believe trust will be difficult to establish. At some point, there will need to
be some hierarchy attached to the system, with trust levels, otherwise that just
leaves openings for security problems.
Expert's comment:
I think that separation of data is an issue, but also common sense of the users.
Question: Encryption has been called the savior of the Cloud. What are your
perspectives on that statement?
Responses for this question
Expert's answer:
If it has convinced users that the cloud is safe to use, then yes. But whether
that means that the feeling is justified, remains to be seen.
Expert's answer:
The system will always be capable of reading your data. Without crypto the
cloud would be useless, but that doesn't mean that with crypto it will all of a
sudden become a safe haven.
Expert's answer:
If you store the encrypted data as well as the key on the system in the cloud, it
can be hacked - as is with encrypted DVDs, HD-DVD, Blu-ray. Nonetheless,
if the key is not present on the system in the cloud (which means the system
is only storing the data since it cannot process or read it), encryption can keep
it safe - as long as the encryption is sufficiently secure.
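The key separation this expert describes can be sketched as follows. This is a toy illustration of the principle (a one-time pad over random bytes), not production cryptography; in practice an authenticated cipher such as AES-GCM would be used. The point stands either way: a provider that holds only ciphertext can store the data but not read it.

```python
import secrets

def xor_bytes(data: bytes, keystream: bytes) -> bytes:
    """Toy one-time pad: XOR each data byte with a keystream byte."""
    return bytes(a ^ b for a, b in zip(data, keystream))

record = b"customer record"
key = secrets.token_bytes(len(record))   # generated and kept on the client
ciphertext = xor_bytes(record, key)      # only this is uploaded to the cloud

# The provider holds the ciphertext but no key, so it can only store
# the data; the client decrypts locally because it kept the key.
assert xor_bytes(ciphertext, key) == record
```

As soon as the cloud system must process the data, the key has to travel with it, which is exactly the vulnerability the expert points out.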
Expert's answer:
Encryption is clearly essential, but it is certainly not enough. And ...
governments would like to have a back door.
Expert's answer:
Encryption is not the answer to security. The weakest link remains human
(intervention).
Expert's answer:
I believe that in the public cloud encryption is a must and offers a kind of
needed risk management. I am not convinced this is needed in the case of a
private cloud.
Expert's answer:
It depends on where it will be used and, especially, who is the CA for these
keys, because they have to be trusted by all involved parties, and prove that as
well.
Expert's answer:
Encryption is only one of many kinds of measures that can provide a form of
security. So it is meaningful, but not a cure-all.
Expert's answer:
Encryption can really be a savior of the cloud in my opinion, if I am the owner
of the key material which encrypts the data and the hosting provider does not
have a key.... Is my total chain, from display to SAN, encrypted? If yes, it can
be a solution. Be aware of performance issues with encrypting and decrypting
on the fly.
Expert's answer:
Only if the encryption algorithms are public, so everyone can inspect them.
Expert's answer:
Encryption can be one of the solutions to provide confidentiality of information
stored in the cloud. I do not believe that it is a saviour for the cloud.
Nowadays companies using encryption still have problems regarding
cryptographic key management and compromise of data...
Expert's answer:
There is no unbreakable encryption, and it is not the savior; that is only a
selling perspective. You have to back up, even in the cloud; you have to be
able to reproduce your data because of state regulations. So with encryption
there is always a weak spot.
Expert's answer:
Encryption does not always work, because where do you start the encryption?
If the encryption is too early in the process you can't handle the data in the
Expert's comment:
As stated in the previous answer, if the data needs to be processed or altered
in the system in the cloud, it needs to be decrypted - and for this, the
decryption key needs to be present on the cloud system, rendering it
vulnerable to attack. Either the data is safe but cannot be processed, or the
data can be processed but is unsafe.
Expert's comment:
It depends on the required security level, but most data will need to be encrypted.
Expert's comment:
Not all data; only security-sensitive information should be encrypted. The
overhead of using encryption on all data is just too much!
Expert's comment:
I think this depends strongly on the importance of the business data.
Expert's comment:
That is up to the company; depending on the value of the information and/or
the risk involved, you define the appropriate measures to mitigate or protect
it.
Expert's answer:
No opinion
Expert's comment:
With encryption the question is whether the keys might end up in the wrong or
unwanted hands. Moreover, during processing the encryption has to be lifted,
which again creates all kinds of opportunities to tap the data unencrypted.
That does not mean that encryption cannot make a valuable contribution to
security.
Expert's comment:
No, there is a lot of data which is public or internal data for which
encryption is not really necessary. Only confidential or secret data should
be encrypted. This means a data classification is necessary and should be
implemented.
Expert's comment:
confidentiality and integrity
Expert's comment:
I do not believe that encryption is the solution. I think it is more
important to classify data, so you can decide what information can be in the
cloud or should never be in the cloud.
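The classify-first approach that several respondents advocate can be sketched as a simple policy lookup. The labels and placements below are illustrative assumptions, not a prescribed scheme:

```python
# Illustrative mapping from data classification to placement; a real
# policy would follow from the organisation's own risk analysis.
PLACEMENT = {
    "public":       "any cloud",
    "internal":     "private cloud",
    "confidential": "private cloud, encrypted",
    "secret":       "on-premise only",
}

def decide_placement(classification: str) -> str:
    # Unclassified data defaults to the most restrictive placement.
    return PLACEMENT.get(classification, "on-premise only")

print(decide_placement("confidential"))  # private cloud, encrypted
```

Defaulting unknown labels to the most restrictive placement keeps unclassified data out of the cloud, which matches the comments above.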
Expert's answer:
It depends on the (business) need. If a company doesn't need much security, or
plans to implement its security needs itself, it is a perfect solution!
Expert's answer:
It depends strongly on the importance of the business data. I believe that the
cloud will be the way to go.
Expert's answer:
I don't agree yet; it has to prove itself in the future, if providers can meet all
requirements (risk-based). To me, currently the security risks are bigger.
Expert's answer:
The risks related to unauthorised disclosure of data can weigh so heavily that
no advantage outweighs them.
Expert's answer:
I don't think that is true. However, you should think very carefully about what
to store in the cloud and what to store locally. Think about laws, regulations
and classification, and be sure your organisation is aware of information
security. That, in combination with a trusted partner and the necessary
non-disclosure agreements and control moments (audits), should make it possible
to use the cloud for at least a part of your business.
Expert's answer:
If the basis for cloud computing is driven by a proper risk analysis, this
could be true. If it is user experience, then there could be more security
risks involved.
Expert's answer:
The use of clouds will give big advantages: scalability, fast availability of
new services, paying for what you use, no high investments. If these topics are
important for an organisation and, using real risk management approaches, they
are more important than security, then the advantages will outweigh the risks.
Expert's answer:
That is only a sales argument. You do not need the cloud; there are other
ways to have your data always with you if needed. If you violate a local or
remote law, or you want to act in conformance with Basel II or other
regulations, then you have a problem.
Expert's answer:
I don't know.
Expert's answer:
I disagree; a lot of companies will not take the step towards cloud solutions if
we cannot guarantee the security of their data. They would rather have a more
expensive solution on premise.
Expert's answer:
I would agree that it could have more advantages than security risks; however,
this needs to be considered carefully per solution. It is not so by definition,
and that is what risk analysis is for. It might well be that the risks are too
high.
Expert's answer:
I'm not sure. It's still a bit early in the 'cloud-based' era to tell just how far the
security risks go I think. For now, I think it's just like any other online
application.
Expert's answer:
This is a sales pitch; I don't believe it.
Data results: Question #8
Question: The Cloud comes in 3 delivery models (Software, Platform and
Infrastructure as a service) and 4 deployment models (Public, Private, Hybrid and
Community Cloud). As you might be aware, depending on the combination of two of
the above more or less flexibility can be given to the end user (Optima Forma being
IaaS in a Public Cloud). What are your viewpoints from a security perspective?
Depending on your specialization you could discuss the following: should they all be
considered the same, or be handled as a unique system per combination? How should
Authentication, Authorization and Accountability be handled, and what about
encryption? Are there combinations that you believe to be insecure, or unfavorable?
How about geographic properties of the cloud (both within the network and physical)?
This is basically your moment to say what you believe are serious issues with security
in the cloud. It can be in depth or a broad overview. The responses from this question
will be vital for the research and for the survey in the next round.
Responses for this question
Expert's answer:
I believe the delivery model should not matter too much when it comes to
security. The nature of the data should prescribe the level of needed security.
From the sort of use you would expect the deployment model to have a
stronger correlation with security. With a more open group of users the data
should not be interesting to secure, because you have almost no control over
who you are sharing with. The tighter the control over the users, the more
demands can be made about applying the three A's.
Expert's answer:
It all depends on what you have to lose; this will vary from case to case. If
you happen to have a cloud in-house, nothing has really changed but your network
design. In all other situations you are outsourcing part of your infrastructure,
so before doing so a risk assessment has to be made. From a risk point of view
there is no reason to do this per combination instead of per case.
Expert's answer:
In my (limited) opinion, data in the cloud should be stored either on trusted
systems if it needs to be processed or altered, OR it should be stored encrypted
on untrusted systems. Depending on the deployment model, trusted and
untrusted systems can be grouped together and treated accordingly. I am not
familiar with the delivery models.
Expert's answer:
Personally, I like the Jericho Cloud Cube model better for such an analysis. In
general: -where physical boundaries are removed, physical-based measures
need to be replaced by "virtual"-based measures. -Due to loss of control (shift
of responsibility to cloud provider), new risks emerge and new measures need
to be taken.
Expert's answer:
Ultimately, security should be embedded in all solutions at the same level.
However, the implementation of security is costly (although a system-wide
security level is less costly than security on a per-user basis), and therefore
customers could opt for a cheaper but less secure system. The EU data privacy
act is overrated, but there are problems in some regions of the world where
hosting should not be considered at all!
Expert's answer:
I would like to arrive at one type of cloud, which can be controlled based on
confidentiality, integrity, availability, auditability and compliance
requirements. Authentication, authorization and accountability are types of
measures that could be implemented to differentiate between the deployment
models. The delivery models, too, should be one, based on the requirements.
Expert's answer:
The variants mentioned can be refined even further by choosing a different
approach per type of application or type of data within an organisation. This
yields an even more complex model to consider. The core point is that these
kinds of differentiations in cloud usage can offer an excellent growth path:
trying out the different risk profiles that belong to those combinations,
gaining experience gradually, and letting the security solutions, and the
interplay of supply and demand, mature. A real treatment of all these aspects
goes too far here.
Expert's answer:
Software in the cloud, like MS Office Live or Google's office environment,
means that everything you do is 'virtual'. Even if you store your article
locally, the temporary files and maybe the original file are stored somewhere
abroad... I do not like the idea. While you are still writing an article, it
can change from public to secret. So no: leave my office version at least in my
company environment; I want to have it on my local machine. 'Internal'
information or databases can be stored in the cloud. For a higher level of
classification I prefer local (hosted in the Netherlands) storage. Encryption,
as mentioned before, in combination with PKI should be allowed. Strong
authentication (a hardware token) is necessary for classified data. A SAS 70
statement from the hosting party is necessary.
Expert's answer:
Again, if cloud computing is driven by a proper risk analysis, there will
only be legal issues with international law enforcement. If confidentiality,
integrity and availability of the information are kept within the boundaries of
user/business expectations, then there is no real issue.
Expert's answer:
The security issues will differ depending on the deployment model. For
privacy a private cloud is the best solution. All the other security criteria
(authentication, authorization, encryption, etc.) have to be implemented
according to the security model. If authorization and encryption for
confidentiality reasons are the most important, then you should build your
own application, so use only PaaS or IaaS.
Expert's answer:
It doesn't matter; these are all sales arguments. In the end it is about the
chain: your data is stored somewhere. The private cloud is very old; the first
one dates from the beginning of desktop computing. A mainframe is private
cloud computing. It's all about boundaries: you have to think in compartments
and about where you have influence. The less control and/or influence, the
greater the risk. The risks are being spied upon, but also data loss, theft,
breach of regulations, not being able to access the data, etc. The cloud is
NOT safe; it's only another way to make money.
Expert's answer:
I think all combinations will have their own security issues. Some
combinations are "easier" to secure than others, and therefore in different
situations you should take a different approach, also depending on the
sensitivity of the information that will enter the cloud.
Expert's answer:
A long question - not sure what the intent is! I do not see a whole lot of
security issues as long as we keep the encryption and network access control at
a level that is acceptable to our customers. Besides that, we can always have a
hybrid situation with, for instance, a SharePoint portal as a gateway to the
functionality in the cloud.
Expert's answer:
IaaS is a great thing, although in the end it comes down to: whoever owns
the system can own the data. With SaaS there is no way of knowing what is
behind the SaaS solution (not open source). You need trusted third parties
(TTPs) for good authentication and authorisation. It comes down to trust, and
these are the weak links.
Expert's answer:
This is a difficult question to answer. As I've said previously, analyzing the
data that is to be contained in the cloud for sensitivity will be vital in
determining the level of authentication/authorization to said data. For instance,
in the example of cloud-based community ranking of web sites such as WOT
or Site Advisor, the level of trust a user has over another is non-existent.
Anyone can make a rating and affect the general outcome. I believe this is a
problem and a potential security risk should this avenue be explored for more
sensitive information. Encryption is vital for any PII or IP data.
Expert's answer:
I think you should use architecture / business drivers to choose either option.
The options also depend on security/ownership issues. I do not favor one
option; it depends on the given situation.
should get ironed out pretty quickly. Having no physical location should lessen
the chances of the original and the backup being destroyed at the same time.
Expert's answer:
With traditional outsourcing the outsourced party will give evidence
regarding the quality of security and the way privacy issues have been covered
during outsourcing. With outsourcing to the cloud, depending on the cloud
model, these issues need to be addressed as well. The biggest issue will be the
privacy of the information and the extent to which the organization can rely on
jurisdiction regarding the enforcement of privacy laws.
Expert's answer:
The difference is the fact that in traditional outsourcing you believe you know
where your systems are and where your data is stored. We believe our
data and systems are in Apeldoorn and Amsterdam. When I go there and
someone points to a server and says "This is your SAP server", then I'm happy,
but is it really my SAP server? I don't know.... :-) In cloud computing you
really don't know where your applications and data are. This means there is not
that much difference if you don't trust your outsourcing partner.
Expert's answer:
all your data are belong to not you
Expert's answer:
This depends on the type of outsourcing (out-tasking/BPO, etc.) and the way you
want to be in control. The main difference I see is the difficulty of getting
into control, because you don't know where it is.
Expert's answer:
The main difference is that you don't know exactly where your information is
located.
Expert's answer:
There is no real difference as long as the cloud service provider is located in
the same country; if it is not, then there will be legal issues if something
goes wrong. Example: Google states that backed-up customer data is Google's
property.
Expert's answer:
Both are forms of outsourcing; however, the cloud is in the hands of people
that you do not have a personal relationship with. Therefore the mindset will
be that it is less secure.
Expert's answer:
Less transparency on how security is covered. No governance.
Expert's answer:
Outsourcing IT work is mostly done for financial reasons. IT is a very
competitive market, and what gets you the most bang for the buck in the field
can land you on any number of projects. I suppose outsourcing is also most
feasible in this field due to its nature of being wide open, with few physical
'boundaries'. Outsourcing to the cloud is already being done. I think similar
security issues exist whether in the cloud or not.
Expert's answer:
With cloud outsourcing the end user is much more in control (more accessible),
but the information is somewhere on the Internet. Two-factor authentication and
certificates are a must.
Expert's answer:
Security standards are dictated by the cloud provider (This is not a qualification
as they may have implemented a high security level). With traditional
outsourcing there are more possibilities to implement company standards
regarding security.
Data results: Question #2
Question: How does outsourcing of IT in the traditional sense compare to outsourcing to
a Cloud service provider in the perspective of trust? Are the relations more or less
complex?
Expert's answer:
This is another open ended question. The traditional outsourcer can have
multiple subcontractors similar/comparable to the cloud service provider.
However if the respective client company has multiple cloud service providers
from which services are being delivered it can have a greater impact on the level
of trust. This would be the case if all services need to be combined to get the
desired outcome. The only correct answer is that with more (inter)relations the
perspective of trust is likely to be lower than with a single relation.
Expert's answer:
The relations are more complex. Some say that SAS 70 reports can therefore give
more trust, but in my opinion the cloud is a black box and trust is relative.
Expert's answer:
The relations will be more complex, since clients will want to be reassured
that their data is safe and accessible at all times. Especially outside of the
IT field, companies will have to get used to the idea.
Expert's answer:
Traditional outsourcing is based merely on trusting the outsourced party. The
trust element with outsourcing to the cloud is smaller; the main driver for
cloud is cost efficiency, and trust in the original sense matters less.
Outsourcing to the cloud does not have to be more complex compared with
traditional outsourcing. The complexity in both cases depends upon the SLA that
has been agreed and the way both parties play their role in that agreement.
Expert's answer:
Normally, in a good trust relationship, you know where your data is, you have
(contracted, with an NDA) trust in the contractor's employees, and you know the
WAN infrastructure over which your data travels. So you believe it is secure.
In a cloud solution you don't know where the data is, you don't know the people
who take care of your systems and data, and you don't know the network
infrastructure. Yes, it is the World Wide Very Trustful Web.... This looks less
secure than outsourcing to a very friendly partner.
Expert's answer:
you have to trust his blue eyes
Expert's answer:
No, they are the same.
Expert's answer:
More or less the same, but because it's newer and unknown
Expert's answer:
If you can build up this trust it could be very good, but in most cases there
will never be a real-life trust model, because the provider is "somewhere out
there". So you will try to get more trust from an agreement perspective or from
a familiar provider, someone you know.
Expert's answer:
Relations will be more complex, and trust has to be built up, as it is less personal.
Expert's answer:
More complex; all based on trust and general agreements.
Expert's answer:
I would say that cloud applications make trust more complex. Depending on
how the 'cloud' is setup of course.
Expert's answer:
It's the same. The info is not on your own premises. You need to find out what
it is like to switch from an ASP.
Expert's answer:
Trust is built between persons. So if people are involved the relations will be
worldwide and strong international standards (such as ISO 2700x, SAS 70 and PCI),
how much of an issue is auditing for the cloud?
Expert's answer:
The international standards currently do not give enough guidance for
cloud computing, and some of the standards you refer to can be interpreted in
multiple ways. This does not make the use of the well-known international
standards useless, but it is not enough to provide the needed level of
assurance. Until there is some agreed-upon standard, or additions to the
international standards, auditing remains an issue. Lastly, and with all due
respect, as I am an auditor myself: no auditor is likely to give the same audit
results as his colleague.
Expert's answer:
It all depends on the scope. You cannot transfer accountability by storing data
and information outside your location.
Expert's answer:
It should not be an issue. The multinationals have been doing it for years, maybe
it will just be more common.
Expert's answer:
Auditing will still be an issue; not all cloud providers use the same audit
standards, and it depends on the country where the data is processed whether
the auditor has that experience locally.
Expert's answer:
If you look at information security, or compliance with standards or local and
governmental law, an auditor is interested in design, existence and operation
('opzet, bestaan en werking'; I am not sure that is translated well into
English). The auditor really wants to see whether your security measures are
designed ("Let me see where you wrote it down!"). He wants to see whether the
designed countermeasures are implemented in the system, and he wants to see
whether the implemented countermeasures perform the way they are designed.
Design: a password shall be 8 characters long, with at least 1 capital and 1
number or punctuation mark. Existence: he is shown in the Active Directory that
this is implemented. Operation: he tries to change a password and makes it 7
characters, lower case only. The system comes back with an error message: this
password is a breach of the password policy. How does an IT auditor audit IT
when he does not know where the system engineers reside, where the data is, and
where functional management is doing its job?
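The design/existence/operation example can be made concrete. The function below is a hypothetical implementation of the stated design rule (8 characters, at least one capital and one number or punctuation mark), and the operation check mirrors the auditor's 7-character attempt:

```python
import re

def password_meets_policy(pw: str) -> bool:
    """Design: at least 8 characters, with at least one capital letter
    and at least one digit or punctuation mark."""
    return (len(pw) >= 8
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"[0-9]|[^\w\s]", pw) is not None)

# Operation: the auditor tries a 7-character, lower-case password and
# expects it to be rejected.
print(password_meets_policy("abcdefg"))   # False
print(password_meets_policy("Secret42"))  # True
```

The auditor's problem in the cloud is not writing such a check, but verifying that the provider actually enforces it on systems the auditor cannot locate or visit.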
Expert's answer:
Lots! How (and where) should you test any physical controls?
Expert's answer:
How can I show evidence that the data is where we think it is? How can we show
that we are in control? Some types of security measures (username, password)
will not be enough anymore; maybe heavier measures are necessary (with higher
costs).
Expert's answer:
There is no real audit issue if you can get a TPM report from your cloud
provider. The real issue is dealing with risks that exist in the solution but
are not in your "hands" to deal with.
Expert's answer:
This could be an issue for companies that are dependent on these rules.
Especially for SOX compliance this could be an issue.
Expert's answer:
Auditing comes down to checking whether something complies with requirements.
The checking can now only be done against what a (cloud) provider says has been
implemented. It is much harder to physically check compliance.
Expert's answer:
I really have no idea how auditing would be affected. I suppose that's the kind of
question that will be difficult to answer unless you've got plenty of experience
with that kind of mechanism, which I don't.
Expert's answer:
The same. Part of audit is security level investigation.
Expert's answer:
Auditability is certainly a requirement; unfortunately, few cloud providers
offer this facility.
Everything depends on the skills of the one responsible for security. Maybe it
will turn out that those knowledgeable enough to use the cloud are better
equipped to secure the (virtual) server.
Expert's answer:
A private cloud is almost an environment where the information is processed,
owned and controlled by the organization itself. They are the same...
Expert's answer:
It does not. An insourced server is placed in a trusted environment,
maintained by trusted employees, in a trusted network infrastructure
which can even be separated from the Internet. A private cloud can be private,
but the fact remains that the system engineers are unknown, the location can be
unknown (Microsoft and Google are offering a cloud service in the
Netherlands now), and normally your data is always traveling over the
Internet. It may not be totally insecure, but it is almost always less secure
than in-house servers.
Expert's answer:
if the cloud is in-house, there is no difference
Expert's answer:
An insourced server is maintained and managed by the company itself, including
access control; with a private cloud the management is in other hands, and how
do you know that you are the only people in the cloud?
Expert's answer:
I do not see that there is a difference between them.
Expert's answer:
A private cloud needs to provide security, as the data is outside the company.
Expert's answer:
One can physically control the access to the insourced server.
Expert's answer:
I would say they both would have similar security levels and impacts. As long
as they are both private and require some form of authentication/authorization
to access, security ought to be rather similar.
Expert's answer:
Again the same level. In the cloud the access is wider.
Expert's answer:
Conceptually no difference. The feeling will be there that with an insourced
data, or process the data to create new info/data. Also, governments
everywhere, like the US, China, etc., want to have access to this information,
to be in control.
Expert's answer:
As mentioned before, if a cloud provider like Google is backing up your data,
they state that it is their property. But you will get into trouble if someone
puts data in the cloud that wasn't supposed to be there. What guarantee do you
have that all data is removed after you log a ticket to remove it? It could
already have been moved to some remote storage. So data ownership is crucial to
deal with.
Expert's answer:
Ownership of data is always an issue. We need to be able to secure data and
protect IP for our customers.
Expert's answer:
Data can be stored at different locations under different laws and regulations.
These could easily have different understandings of ownership.
Expert's answer:
If I understand this correctly, IaaS offers the infrastructure (meaning virtual
computing environment at a distance) but individuals supply the data. As long as
data is stored locally, ownership should not be an issue. If data is stored on (or is
it 'in'?) the cloud, policies would need to be implemented just like with any other
file/data server.
Expert's answer:
No. Data belongs to the end user, not the cloud provider; otherwise no one will
go to the cloud. CRM, financial data and so on belong to the end user.
Expert's answer:
No issue in my opinion
Data results: Question #7
Question:
Above, a picture of the preliminary model is shown. (Full-size here:
http://www.students.science.uu.nl/~3219534/Cloud_model_beta.jpg) It shows how
inputted data gets rubricated and, depending on that, a deployment model,
delivery model, geographic (on-premise, off-premise), geo-spatial (which
countries/continents),
Expert's answer:
What is the "right cloud formation"? Per aspect you would need a set of
criteria to determine the right cloud formation.
Expert's answer:
I believe the model is sound. Seems to cover the bases well.
Expert's answer:
How does it get in, or get added or mixed? That's where the power of data lies.
Expert's answer:
An interesting idea, but why do you go to the cloud with this set of
requirements? Bottom line: the cloud should be there to free you of these types
of concerns from a functional point of view. Construction is not your primary
interest. Does it concern input data, output data, or both?
Expert's answer:
IMHO the cloud only improves the possibilities for stronger DDoS attacks.
Ways will be found to use the full potential with which an attack out of the
cloud can be deployed, whereas the protection against such attacks can only be
reactive and therefore will automatically come after the fact and be weaker.
Expert's answer:
Yes, it can be used to some extent. But a more rigorous approach would be to
use the right network detection and prevention techniques, which would allow
the network service provider to drop the traffic before it can reach the
specific infrastructure. If these techniques are incorporated into the cloud
then this certainly is true, but I haven't seen any virtualized solutions yet.
Expert's answer:
DDoS is always hard to protect against. The cloud, or rather the bigger front
door of the cloud, with bigger firewalls and multiple access points, makes it
harder for the hacker, but not impossible. Yes, it's still true.
Expert's answer:
That will remain true; scaling up cloud will never beat scaling up DDoS
Expert's answer:
The cloud will be no answer to DDoS attacks. Every infrastructure has its
breaking point in terms of scalability of virtual machines and/or bandwidth.
Expert's answer:
Well, DDoS attacks are difficult to predict. Depending on how the cloud is set up, and
how it is accessed (which parts are publicly available and how the location
changes are transmitted to the 'cloud members'), it can be more secure in that
sense. If the cloud app is public, though, and centralized information is stored
anywhere (say a mirror list that the members connect to to access the app), then that
will be as vulnerable as any other site or system. I do think, though, that responding
to such a threat will be quicker and easier.
Expert's answer:
No, the cloud cannot be a means. We will have to have other measures to
overcome this matter. I am very interested in how and what this will be.
Data results: Question #2
Question: In terms of the CIA triad, it seems that it doesn't cover all aspects concerning
cloud security. An extension seems to be within reach in terms of CI3A. CI3A (or CI triple
A/CIAAA) defines confidentiality, integrity, availability, accountability and auditability.
This extension has been made in order to fulfill the need for governance and compliance.
What do you think of this extension? Does it need more concepts in order to create
assurance within the cloud?
Expert's answer:
I'm sorry. I don't understand this question.
Expert's answer:
I think this is a useful extension to security and should provide enough
background for assurance.
Expert's answer:
This addition is not on the same level as the CIA triad, i.e. the CIA triad already
covers these aspects. It can be useful, though, and in my own practice, when
formulating security standards, I also clarify these (here supplementary) concepts
as part of the explanation of the CIA triad.
Expert's answer:
I think that accountability and auditability are part of the CIA triangle.
Expert's answer:
Compliance is more than accountability and auditability. I think at least
duration or time should be added, because more and more, timing is an issue with data,
e.g. salary slips or annual reports.
Expert's answer:
CI3A should be considered every time you outsource, not just to the cloud.
Expert's answer:
For now it will be enough. People need to think about these issues in order to
move to cloud solutions.
Expert's answer:
Yes, this extension does make sense in itself. However, there can be other
extensions of CIA as well, so we may end up with 2C4I3A...
Expert's answer:
I think the extensions are correct and needed, but ISC2 and ISACA seem to count
accountability and auditability as part of their CIA triad.
Expert's answer:
Since I'm involved in information security I always tell my audience that there is a
CIA triangle, which has to be completed with auditability. I know most of
the audit community thinks this is the only right way to fulfill information
security. I agree with your CI3A and suggest you mention it to the
ISC(2), ISACA and PVIB people!
Expert's answer:
We already used the CI3A in non-cloud situations, so I don't think this is a cloud-specific
extension. The CI3A is in my opinion sufficient to cover all aspects.
Expert's answer:
Assurance is never possible in the cloud. Structures are too complex, too
difficult to understand even for an auditor. That other aspects are added is a good
approach, but the question with every audit is always: what is the subject under
evaluation? Whatever the aspect, an audit is made by people and the auditees are
also people. So you can say something about the audit process, but the outcome
is less hard.
Expert's answer:
Information security is a concern of the owner. If the owner has a demand for
accountability and auditability, it should be provided.
Expert's answer:
I do not agree; accountability and auditability are aspects of integrity.
Expert's answer:
I think it's a good model. Only time will tell.
Expert's answer:
Sorry, I do not know this extension. However, it must provide safe harbor security
and prove that it does.
Data results: Question #3
Question: As mentioned before, physical location, or the lack thereof, can be an issue in
the cloud. It seems that there are four factors in defining the boundaries of data location
awareness. Regional: measured in physical distance from one server to another;
can be used, for instance, to create HA/disaster recovery locations/strategies. Premises: are
servers/data located on organizational premises or not. Network: is data available within
the network or not. Legal: geographic locations of servers pertaining to legal systems (e.g.
discrepancies between server locations in state, country and/or continent). Do these factors
cover all issues/risks pertaining to boundaries and borders?
Expert's answer:
Yes, this should cover all issues/risks
Expert's answer:
I think it does cover all the issues. Or better, all the issues I see can be
categorized in this way.
Expert's answer:
This seems to me a fine approach, and I cannot add anything to it.
Expert's answer:
I think these are enough. The main problem is BCM, which you can cover with
the regional aspect. The privacy aspect of different regulations can be covered
by geographic locations. The question that arises is: can you check this...
Expert's answer:
All this depends on definitions; one can think of the internet as an extension of
your network! The same goes for your own home, so organizational premises
can also be outside Philips sites. Next to that, a location is also the PC, laptop
or data carrier on its own, as it can connect to the internet everywhere and then be
compromised.
Expert's answer:
Can you blame/sue when stuff goes wrong? This is partially covered in Legal, but
not completely.
Expert's answer:
There should be new rules concerning Legal, so it doesn't matter where the data is
located. The company that offers the solution can/must fulfill these rules.
Expert's answer:
Seems sufficient.
Expert's answer:
Somehow I get the feeling that data on a handheld device is "moving" around
on each of these four factors. So in my opinion you need device as a factor as well.
Expert's answer:
Yes I think you have made the right conclusions
Expert's answer:
No - I think you also need to take into account the location of maintenance
personnel.
Expert's answer:
Probably
Expert's answer:
I think so, but other risks also apply.
Expert's answer:
I think that access is an issue, in terms of "who has access to the system" and
where the system is hosted. I would use: access control and logging, preferably
outside the cloud system...
Expert's answer:
I think so. Seems like a fair assessment to me. Depending on what is meant by
boundaries and border issues.
Expert's answer:
I think software, connecting applications and portal technology is a very
important factor that should be added. The question is whether you can connect your
applications, whether they are located in the cloud or on-premise.
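The four boundary factors of data location awareness named in this round's question (regional, premises, network, legal) could be captured as a simple record. The sketch below is illustrative only: the class, field and function names are hypothetical and do not appear in the thesis.

```python
from dataclasses import dataclass

@dataclass
class LocationAwareness:
    """Hypothetical record of the four data-location boundary factors."""
    regional_distance_km: float  # Regional: physical distance between servers (HA/DR planning)
    on_premises: bool            # Premises: servers/data on organizational premises or not
    in_network: bool             # Network: is the data available within the network
    jurisdictions: list          # Legal: legal systems the server locations fall under

def crosses_legal_border(loc):
    """Data spread over more than one legal system raises compliance questions."""
    return len(set(loc.jurisdictions)) > 1

# Example: data replicated off-premises across a Dutch and a US data center.
loc = LocationAwareness(350.0, False, True, ["NL", "US"])
print(crosses_legal_border(loc))  # True
```

A check like `crosses_legal_border` makes the legal factor concrete: the same data set becomes subject to two legal systems as soon as its replicas land in different jurisdictions.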
Data results: Question #4
Question: After the input of the last round, and much pondering, the model has been
improved. Once again, I'd like to hear all the input you have. The model has been
redesigned with a new perspective in mind. The goal is to create a secure cloud
architecture. So classified data (or a classification/rubrication, if you will) goes in, instead
of raw data. For each classification/rubrication level there will be one cloud environment
that fits. This model helps to decide what the architecture of that environment should look
like. Mind you, the critique you might have on the CI3A will be taken into account.
Please just mention whether or not the vertical bar is correct, with or without respect to
your critique. (It was originally a horizontal one; it was made vertical to show how the CI3A affects
all horizontal lines.) Other improvements include the inclusion of all boundaries/borders in
the horizontal lines.
A larger version can be found at http://www.students.science.uu.nl/~3219534/Cloud_model_rc.jpg
Expert's answer:
I think the vertical bar is correct.
Expert's answer:
From a design perspective the model looks promising. The only difficulty
could be the practical approach.
Expert's answer:
You can put the vertical bar in that position. I think you need to address
trust as a part of the whole model.
Expert's answer:
I do not believe in a separate cloud architecture at all, because none of its
elements differ from the elements that apply to one's own organization.
From a business perspective, you place in principle no other demands on an
outsourcing partner than those you would place on your own service
organization. So no separate architecture either, just a limited number of
specific requirements. Those extra requirements do not lead to a different
architecture. The cloud is not that special.
Expert's answer:
This is the chicken-and-egg story: what comes first, the classification of data, and
based on that we choose the right region, etc., or is the location etc. first, and based
on that we classify the data? I would like the term "residual risks
(and/or) additional measures" to be added in the arrow on the right side.
Expert's answer:
It looks doable.
Expert's answer:
No comments.
Expert's answer:
Seems to make sense indeed; the CIA... properties of the different
environments are all to be taken into account for the mapping of a problem to
a (cloud) solution.
Expert's answer:
I do have some difficulties with the CIA(AA) bar positioned after the
classification; I believe that classification is a method to achieve CIA(AA),
so it should come after the CIA(AA) bar. But that is just my view. The only
horizontal bar that I miss is "no secure cloud architecture": if all signs go
red, you shouldn't put that data or system in a cloud.
Expert's answer:
The model is correct. I agree with the CI3A model.
Expert's answer:
The vertical bar does not match question 3; what about the maintenance
component?
Expert's answer:
Data classification is the proper means to identify specific cloud environments,
but then a more DMZ-like definition of cloud environments is expected.
Expert's answer:
Many people rely on methods, models and architectures. Some things are
handy for communicating, others give structure in the wilderness. None of them
can assure that you will not take on or overcome any risk. It's always good to
use common sense. Reading this last question, I think back to my first answers; it
could be that some of my answers do not give the outcome you expected. The
model however is pretty good, and it could be good practice to adopt this
model because it covers enough. The vertical bar is not good either; I am missing
some control in the process.
Expert's answer:
Access control inside and logging outside the cloud.
Expert's answer:
It looks like a viable model. Again, as I said in an earlier question, only time
will tell just how secure it will be, since this is a rather new endeavour. The
main concerns seem to have been addressed with the current model, as far as I
can see.
Expert's answer:
Like in the previous question, my opinion is that software for connecting
applications, or on a bus system, is of strategic importance for the success of a cloud
architecture.
The first model takes data in (a classification) and then has several attributes which output a cloud. A majority of experts had remarks on this model. First of all, "data" was
too ambiguous; what kind of data, and whose data, were general remarks. They also felt that the CIA triad
was wrongly placed: it is, after all, not a cloud architecture attribute but a perspective on security in
general. It was also mentioned that encryption was needed as an attribute, and the output as a cloud in
general didn't have much meaning. In order to satisfy these remarks, data would pass through the
following to generate the secure cloud: the CI3A, attributes and an encryption bar. The
encryption bar (for architectures that use it, and white space for those that don't) didn't make much
sense, nor did it look intuitive (figures 9 and 11).
It was thus decided that encryption would just be an attribute. Network and premise were added, as
they showed up in the Delphi round as dedicated cloud attributes. The end result is the model
displayed in figure 12. This model was accepted by the experts, with one main concern: its
applicability in the field.
Figure 13: The second model presented to the experts in the Delphi session. Big improvements are the CI3A in
a vertical bar and additional attributes.
This thesis (especially chapter 10) tries to explain how it could be used in the field.
The final model as presented in this thesis has undergone little change from the one shown in the
Delphi session. Its greatest improvement is that this one is aesthetically more pleasing. The CIA(AA)
was changed into CI3A, as the last round confirmed that the CI3A is a proper extension of the CIA
triad. The distribution model was renamed to deployment model for consistency purposes, and the cloud
figure was changed into an arrow to show that the output architecture doesn't go directly into the cloud but is a
specification.
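The flow of the final model (a classification level goes in, the CI3A requirements and the cloud attributes fitting that level are applied, and an architecture specification comes out) could be sketched roughly as follows. The classification levels, attribute values and function name below are made-up illustrations, not taken from the thesis.

```python
# Illustrative sketch of the final model's flow: a classification level goes in,
# the CI3A bar and the cloud attributes chosen for that level are looked up,
# and an architecture *specification* (not a running cloud) comes out.

CI3A = ("confidentiality", "integrity", "availability",
        "accountability", "auditability")

# Hypothetical mapping: one fitting cloud environment per classification level.
ENVIRONMENTS = {
    "public":       {"deployment": "public",  "encryption": False, "premises": "off"},
    "internal":     {"deployment": "hybrid",  "encryption": True,  "premises": "off"},
    "confidential": {"deployment": "private", "encryption": True,  "premises": "on"},
}

def architecture_for(classification):
    """Return the secure-cloud-architecture specification for a classification."""
    spec = dict(ENVIRONMENTS[classification])
    spec["ci3a"] = CI3A  # the vertical CI3A bar applies to every horizontal line
    return spec

print(architecture_for("confidential")["deployment"])  # private
```

The arrow at the end of the model corresponds to the returned dictionary here: a specification that decision makers still have to implement, not a cloud deployment itself.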