
A Thesis by Thijs Baars

Supervised by Dr. M.R. Spruit


Department of Information and Computing Sciences
Utrecht University

Preface

The Cloud is an interesting, rapidly moving topic that caught my attention in the fall of 2010. It was at that moment that I started to see the many possibilities of the cloud. Together with the stream of media coverage of exciting new start-ups, I firmly believed that the cloud could be the next big thing. But, as with many next big things, the cloud also had a lot of implications that many didn't have an answer for. Nor did they need one; it was, after all, the next big thing, and when you are in the race to become its winner you don't care about the implications, you just want to win. That Christmas I spoke with the VP of security at VMware. He also mentioned the many issues that needed to be solved with cloud computing, including certain security issues. His words sparked my imagination and, jointly inspired by my dad, who profiled himself as one of the security experts in The Netherlands, I started to investigate the cloud and its security issues.
A year has passed and much research has been done in the field of cloud computing, and in cloud security especially. Nonetheless, much more research needs to be done in this field. For many companies cloud security is still high on the list of topics that need further investigation and holds them back from entering the cloud (Bartvagh, 2010). I was tempted to do something with these requests; the questions were many and the answers, as I found out later during my research, were mostly impartial or plainly false. In order to provide those answer-seeking companies with the right information, I wanted to create a useful tool. This tool has been developed as part of this thesis. It explores the facets of the cloud and provides a useful model that enables decision makers to get a clear overview of the threats and risks in the cloud and how to protect themselves from them.
I hope that anyone seeking more information on security in the cloud will find it useful and that it will provide a reliable information source for those who are answering questions on security in the cloud to potential customers and partners.
Yours Sincerely,

Thijs Baars
January 27, 2011

2 A Big Thank You!


The research conducted in this thesis was made possible by the effort of experts in the field of cloud computing, security, or a combination of the two. I would hereby like to thank all of them for their time and effort, and their advice and thoughts during the interviews and in the expert panel, especially:
T. Hendrixen, Capgemini
E. Vredeveldt, Directeur Operations at Switch Automatisering B.V.
T. Heinsbroek, B.ICT, CISSP, CISA at SeKuRiGo
C. M. Fritts, DeVry University
Dr. W.A. Mellink, Enterprise Architect at the CIO Office, KLM
Drs. L.H.M. Verstegen RE, Senior Manager at Advisory Services, Ernst & Young
P.H.M. Arntz, Veiligheidsadviseur at Kleijn Transport BV
drs. M. Perdeck CISSP CISA, Principal IT Architect at Logica
M. Straver, Channel Manager Nederland at VASCO Data Security
J. Weyers, IT Security & Continuity Officer at TenneT TSO B.V.
ing. R. Kuiper CISA, CISSP, CSF, Management Consultant at Verdonck, Klooster & Associates
B. Bokhorst RE RA, Strategisch beveiligingsadviseur at the Dutch Ministry of Finance
drs. T. Schiltmans, Sector IT Security Manager at Philips Healthcare
K. Hintzbergen, accountmanager at 3-Angle software & services b.v.
J. Hintzbergen, Managing Consultant Public Security at Capgemini Nederland B.V.
J. H. Baars CISSP, CISM, Chief Information Security Officer at Enexis B.V.
S. Nieuwenhuizen, IT-auditor at Ernst & Young
And last but not least, my supervisor dr. M.R. Spruit for the great opportunity and his relentless work
that made this thesis as great as it is.

3 Summary
According to the definitions of ENISA and NIST, the cloud represents a business model that enables on-demand network access to a shared pool of rapidly provisioned, elastic, configurable computing resources. This is achieved by virtualizing the underlying hardware and giving users control over the resources they would like to obtain. This model creates a service that can be used for many applications, but as it is always attached to the internet, security becomes paramount. This thesis has shown that many issues are not cloud specific and apply both to regular outsourcing and to any server connected to the internet. However, some issues are particularly harmful to the cloud, such as compromised virtual machines: if one virtual machine is compromised, other virtual machines might be accessed, with severe consequences. Other cloud-specific issues lie at the client side, which is out of the scope of this thesis.
Because of the virtualization techniques used, cloud environments can be spread throughout the world. Although this can be a huge benefit (in terms of scalability and availability), it can also be a threat to the system from the perspective of auditing, compliance and legal jurisdictions. The main vendors of cloud systems provide tools that give users awareness of where their data is stored and processed. Auditability is very implementation specific, but it is shown that complying with current standards such as ISO 27001, SAS 70 and PCI is possible. Another risk within the cloud environment is trust chains: it can be very opaque which (indirect) partners are involved in your cloud environment and, compared with traditional outsourcing, these chains can be more complicated. Because of the added risks in the cloud, the CIA triad has been expanded to keep these issues in line with the model. This expansion, called CI3A, adds Accountability and Auditability to the well-known Confidentiality, Integrity and Availability. Accountability and Auditability should provide a better grasp of the trust chains and an overview of governance & compliance.
Looking at the cloud, this thesis defines eight factors that influence the cloud environment and are all related to the CI3A, and thus to its security. These are regional, geo-spatial, governance & compliance, delivery model, deployment model, encryption, network and premises. These factors can influence each other, but by defining them a security matrix can be set up that provides users with an overview of the cloud environment and how it is protected. The factors are determined from a data classification that has been pre-defined by the organization: according to the rules set out in the classification, each of the eight factors is given a value. For some factors this is a choice between several options; for others the value is open to interpretation. For a detailed description of these factors, see the section on The Secure Cloud Architecture model.
By integrating the results from the model, one can define a cloud environment that is sufficiently secure for the data that needs to be stored and/or computed in the cloud. It can therefore be said that in certain circumstances a cloud environment can be more secure than an insourced or outsourced traditional environment, and that in many instances the cloud can be a secure solution.

4 Table of Contents

1 Preface ................................................................................................ 1
2 A Big Thank You! ................................................................................. 2
3 Summary .............................................................................................. 3
4 Table of Contents ................................................................................. 5
5 Introduction ......................................................................................... 7
  5.1 Current Research ............................................................................ 8
  5.2 Research questions ......................................................................... 9
  5.3 Research Model .............................................................................. 10
  5.4 Data collection using the Delphi Method ....................................... 10
6 Security and its perceptions ................................................................ 14
7 The Cloud and its Technologies ........................................................... 18
  7.1 The cloud ....................................................................................... 18
    7.1.1 Virtualization ............................................................................ 20
    7.1.2 Characteristics .......................................................................... 22
    7.1.3 Delivery Models ........................................................................ 23
    7.1.4 Deployment models .................................................................. 25
8 Security issues in the cloud ................................................................. 27
  8.1 Risks in The Cloud ......................................................................... 27
  8.2 Boundary related risks ................................................................... 28
    8.2.1 Locationlessness ....................................................................... 29
    8.2.2 Legal or regional risks .............................................................. 30
    8.2.3 Geographic or geo-spatial risks ................................................ 31
    8.2.4 Organizational Premises ........................................................... 31
    8.2.5 Network or virtual boundary risks ............................................ 32
  8.3 Governance & compliance .............................................................. 32
  8.4 Trust Chains ................................................................................... 34
  8.5 CI3A ............................................................................................... 34
  8.6 Data loss ........................................................................................ 36
  8.7 Encryption ..................................................................................... 37
9 Current State of Cloud Security .......................................................... 40
  9.1 Data Location awareness ............................................................... 40
  9.2 Security governance & Compliance ................................................ 40
    9.2.1 PCI SSC Data Security Standards ............................................. 41
    9.2.2 SAS 70 ...................................................................................... 41
    9.2.3 ISO 27002 ................................................................................. 42
    9.2.4 COBIT ....................................................................................... 42
  9.3 Compared to Outsourced servers ................................................... 43
  9.4 Compared to insourced servers ...................................................... 45
  9.5 Issues and solutions ....................................................................... 46
10 The Secure Cloud Architecture model ................................................ 47
  10.1 Modelling the Cloud ..................................................................... 48
  10.2 Using the model ........................................................................... 48
  10.3 Inputs ........................................................................................... 50
    10.3.1 Attributes ................................................................................ 50
    10.3.2 Outputs ................................................................................... 55
  10.4 Results from the SeCA Model ....................................................... 57
11 Conclusions ........................................................................................ 58
12 Further Research ................................................................................ 60
13 Bibliography ....................................................................................... 62
14 Appendix ............................................................................................ 67
  14.1 Results from the Delphi study Round 1 ........................................ 67
  14.2 Results from Delphi study Round 2 .............................................. 82
  14.3 Results from Delphi study Round 3 .............................................. 94
  14.4 Evolution of the Model ................................................................. 103

5 Introduction
In the past decades there has been a tremendous expansion and differentiation in IT: from mainframes to personal computers, from workstations to netbooks, and from off-the-shelf software to Software as a Service (SaaS). Using IT services such as SaaS is a different way of thinking about IT, where IT is perceived as a utility. These IT utilities, or subscription services if you will, range from complete infrastructures to small applications. According to some, this is ideal: paying for just what you use, while optimizing the usage of resources. Elastic scalability allows high availability at lower costs than ever before. The power of virtualization is the keystone for all of this and promises a world of resources that wasn't available before, while maintaining flexibility, ease of use and incredible power. Add these technologies to an enhanced service model, and you get what is called The Cloud.
All this sounds too good to be true, and it actually might be. Where virtualization offers optimized performance, high availability, and multi-tenancy, it also comes with new issues related to security. Multiple tenants have access to the same resources as you, the same machines and the same disks. Next to that, with virtualization your data can be anywhere in the world. Geographic location is no longer a constraint when providers utilize protocols that enable disk arrays placed on all continents of the world to form one single Storage Area Network (SAN). This, however, might have a serious security impact on your environment. The same issue arises with processing power: virtualization offers you the ability to utilize processing power all over the globe as one single machine. This allows incredible amounts of power to be utilized, but it also means that your data might be processed in territories it shouldn't be processed in.
According to some, these issues can be solved with strategies applied in conventional solutions; others (Levelt, 2010) show that these new issues need new solutions.
To get a clear image of the actual issues in detail, and how to find solutions for them, this thesis first explains the cloud and the technology involved (see section 7 below). This is followed by an analysis of cloud-specific issues, such as geo-spatial issues and governance issues (section 8), and by the current state of the cloud with specific solutions for the aforementioned issues (section 9), where we also compare the cloud to traditional outsourced (section 9.3) and insourced solutions (section 9.4). Section 10 introduces and explains the Secure Cloud Architecture model. To conclude, section 11 wraps everything up, and section 12 discusses points for further research.

5.1 Current Research


Although the Cloud is still in development (Mulholland, Pyke, & Fingar, 2010, p. 20), it has already caught the attention of the research community. Its definition (over 20 are known) has been researched in (Vaquero, Rodero-Merino, & Cacer, 2009) and is also discussed in (Chen, Paxson, & Katz, 2010). Vaquero et al. manage to give an overview of the features a cloud should have and discuss the differences with a computing grid. The NIST definition (NIST, 2010), which was defined in May 2009, five months after the publication of Vaquero et al. (2009), is a more open definition, while still preserving the key characteristics of the cloud. Together with the definition of ENISA (2009), these definitions are used in this thesis.
Mulholland et al. (2010) give an overview of cloud computing and its facets for enterprises in their book, but fail to mention any security related topics. The Jericho Forum (2009) has modelled the cloud in order to help users understand the different facets of the cloud and to support a secure use of cloud technologies; this model will be extensively discussed in section 10: The Secure Cloud Architecture model. Grossman (2009) also provides the reader with an overview of the cloud, but on a more technical level. He discusses techniques in a broad overview and shows that some technologies do not have authentication systems built in.
The security of the cloud is an issue that is well at the centre of cloud research. Chen, Paxson & Katz (2010) give an introduction to security issues in the cloud and discuss which issues are specifically new in the cloud and which are related to traditional forms of computing, stating that "Arguably many of the incidents described as cloud security in fact just reflect traditional web application and data-hosting problems [..] such as phishing, downtime, data loss, password weaknesses, and compromised hosts running botnets" (p. 4, internal references removed). They hold that most cloud security issues aren't new, but do need new implementations to provide the security wanted. Wang, Wang, Li, Ren, & Lou (2010) discuss the necessity of a Third Party Actor (TPA) to assure security standards and to provide transparency in the security controls. Christodorescu, Sailer, Schales, Sgandurra, & Zamboni (2009) discuss methods of securing clouds at the Virtual Machine (VM) level. They provide a short overview of known VM issues and solutions, and then propose a system which protects VMs in a cloud against malware and rootkits using a white/blacklist method. Jensen, Schwenk, Gruschka, & Lo (2009) describe technical security issues related to the cloud, using the Amazon EC2 cloud as a case. Although all issues discussed are related to the cloud, all of them were already apparent before the coming of the cloud; some have just found new grounds to be relevant again. Gnagey (2010) gives an overview of the cloud, discussing, among economic benefits, the security concerns at a managerial level: geographic location and compliance. No solutions for these issues are addressed, however. Vigfusson & Chickler (2010), in Clouds at Crossroads: Research Perspectives, discuss research topics in the cloud, including privacy related issues. Discussing the trust problems that arise with the complex models of some cloud environments, they provide a few suggested solutions which might be very feasible.
This thesis overlaps with current research in that it provides a global overview of the cloud, introducing and explaining it, but it also expands the current research with proposed solutions to security issues specific to the cloud and to general security issues that have found new terrain in the cloud environment. To conclude, this thesis provides users with a model that navigates them through the security issues and solutions for their specific cloud environment so that users can secure themselves.

5.2 Research questions


Current research either explains very technical details of protecting the cloud, often describing arbitrary issues that are not specific to the cloud, or gives overviews of the cloud in which security issues are only touched upon lightly. This compelled the author of this thesis to research, in depth and in a more practical sense, the security issues that come with the cloud. According to an investigation by Gartner, security issues are the main concern when adopting the cloud (Bartvagh, 2010), while no clear model exists to determine security issues and solutions. Therefore this thesis provides an overview of the security issues and creates a Secure Cloud Architecture (SeCA) model to determine the security issues one might expect in a certain cloud environment and what solutions might be used to address them. This framework will be developed by answering the following question: Can the Cloud be a safe alternative for the storage and execution of organizational confidential data? To be able to answer that question, we need to answer the following questions:

- How can data in the Cloud be protected from Cloud-specific security threats?
- How does location-aware storing and executing of data in the Cloud affect its security?
- How can Clouds conform to the international security standards for confidential data?

Because we can only answer whether or not the cloud is a safe alternative if we compare it to other solutions for storing and executing data, this thesis will also answer the following questions:

- How does the security of confidential data in the Cloud compare to that of outsourced dedicated servers?
- How does the security of confidential data in the Cloud compare to that of local insourced servers?

5.3 Research Model


To conduct the research this thesis describes, a literature review was conducted first. By reading the overall themes in security, followed by cloud-specific topics, an overview was created that is used as the starting point in the development of the SeCA model. This model has been verified by an expert panel (shown as Cloud Security Practise in the model below). The experts were selected because of their function, publications and knowledge of security, the cloud, or a combination thereof. This group of experts, 26 in total, came from organizations in the business-to-consumer, business-to-business and business-to-government industries, and from governmental organizations. They were interviewed using the Delphi tool developed at the Wharton Business School (Wharton Business School, 2005). From this information, the security threats in the cloud were identified. These were then compared in a risk analysis to create a Secure Cloud Architecture. Note that the SeCA model is the risk analysis, outputting a Secure Cloud Architecture.
The conceptual model of the research was developed as follows:

Figure 1: Conceptual model of the research presented in this thesis

5.4 Data collection using the Delphi Method


The Delphi method was considered to be the best method for the research in this thesis, as it provides the researchers with a qualitative data set that allows them to create and verify the model, and it allows the experts to see anonymized answers and respond to these answers in upcoming rounds (Helmer, 1963). This anonymity proved useful, but it made it difficult to determine which expert gave which answer: the expert listed first for question one was not necessarily the same expert listed first for question two. The appendix shows direct screen prints; all experts are named "expert" by the system. In this way, a consensus was reached on the acceptance of the model.
The Delphi method was executed in three rounds of surveys with qualitative questions. Three rounds were chosen instead of two (which is more common (Skulmoski, Hartman, & Krahn, 2007)), so that the first round could be used to obtain general information on the topic, not specifically regarding the model to be developed, while still having enough rounds to reach a consensus. The first round consisted of open questions in which the experts were asked about their experience with security and the cloud, and their issues and concerns regarding security in the cloud. These questions gave a wide result set that strengthened the results of the literature research performed earlier. Seventeen respondents answered the questions in the survey in all three rounds, a response rate of 65%.
The result set from the first round delivered the starting point for the second round, in which some questions were asked again in order to give the experts the option to rephrase their answers after having seen the answers of round one. Some questions were designed after noticing a consensus or discrepancy in the answers from round one, whereas others were completely new and had no specific relation to the questions asked in round one. The questions for round one themselves were fed by the literature review and by informal meetings during conferences and congresses.
In the second round, an initial version of the SeCA model was presented. The goal of this model is to provide implementers, decision makers and experts in the field with a framework that they can use to assess cloud environments against their security needs. The feedback on this initial model was then used to improve it. The author of this thesis originally expected that a framework in which raw data is inputted would be best, since it is the data itself that will be hosted in the cloud (or at least is intended to be). However, it was found that the SeCA model then looked at the data from a perspective that is unusual for its target audience, and that a more architectural point of view was needed for the model to be usable in the field. In round three an improved SeCA model was introduced. The SeCA model was remodelled to accept data classifications as input instead of raw data, because raw data is classified within an organization and for each classification a different system architecture is needed to host and execute said data safely, as prescribed by the classification. Therefore, encryption was inserted as an attribute, as it changes, along with the other attributes, per classification and thus per architecture. The SeCA model allows any user to assess the cloud environment from two perspectives: either the user looks at their current data and its classification and decides how the cloud environment should be configured to meet its requirements, or the user reverses that action and, taking a given cloud environment, determines on that basis what data can go in. This thesis describes only the forward movement, thus taking data classifications as input and determining on that basis how the cloud environment should be configured. The appendix (section 14.4) shows the evolution of the model, both internal revisions and revisions discussed in the surveys.
The following concepts have been tested during the Delphi study:

- New/specific security issues (Round 1: Question 2. Round 2: Questions 1, 2, 4, 5, 6. Round 3: Questions 1 & 2)
- Location awareness/locationlessness (Round 1: Question 3. Round 2: Question 3)
- Trust issues & trust chains (Round 1: Question 4. Round 2: Question 2)
- Encryption (Round 1: Questions 5, 6)
- Feasibility (Round 1: Question 7)
- The model (Round 1: Question 8. Round 2: Question 7. Round 3: Questions 2, 3)
- Auditing (Round 2: Question 4)

To summarize the results from the Delphi rounds, a burn chart was produced, shown below. Green depicts consensus, orange represents some polarisation, and red represents extreme polarisation in the answers of the experts.
As one can see, not all topics reached consensus. This was due to the fact that business or technical knowledge on some topics was not taken into account in the expert selection. For example, the field of encryption is a very technical field that can be hard to fully understand and apply. Although some answers were very useful, other answers were dismissed in the same round as unfeasible, simpleminded or simply not true. This meant that the experience and knowledge among the experts varied too greatly to reach consensus. Subsequent research was done through a literature review of the applicable topics.

Topic: New/specific issues
- Round 1: Yes, it will present new issues.
- Round 2: Variety of reasons for the difference between traditional & cloud. Issues in private/insourced clouds are the same. Ownership is dependent on laws & regulations.
- Round 3: DDoS answers varied; points from both attackers & victims were discussed. CI3A is accepted; auditability and accountability could be implicit in the CIA triad.
- Consensus: Yes, the issues presented have reached consensus and are taken into account in the SeCA model.

Topic: Location/locationless
- Round 1: Experts state that location is an issue.
- Round 2: Locationless clouds are considered unfeasible and unwise.
- Round 3: Boundary issues are discussed. The four listed are accepted as sufficient & complete.
- Consensus: Yes, there are new issues regarding the location of data storage and processing.

Topic: Trust issues
- Round 1: Yes, (new) trust issues will arise in the cloud due to its architecture.
- Round 2: Experts vary greatly on the difference in trust between traditional outsourcing and cloud solutions.
- Consensus: Partly; cloud-only issues have reached consensus, but not the difference between outsourcing, insourcing & cloud.

Topic: Encryption
- Round 1: Encryption is very important; some note it is not a safe haven. Some believe that everything should be encrypted.
- Consensus: No consensus; experts differed in knowledge and experience. Further research through literature.

Topic: Feasibility
- Round 1: Some believe that security goes above feasibility; others state that it depends on business needs.
- Consensus: No consensus. No further research done; feasibility did not prove to be a security issue.

Topic: Model
- Round 1: The attributes mentioned are agreed upon and discussed, but opinions vary on their usage and on which security issues should be included.
- Round 2: The model has good points, but misses attributes. Experts have trouble understanding input and output.
- Round 3: The model is accepted as presented. Experts mention the possibility of practical issues in utilizing the model.
- Consensus: Yes, the model is accepted after revisions. Attributes are refined; open issues are discussed in section 12.

Topic: Auditing
- Round 2: Auditing in some environments is hard. Experts mention branch offices, laws & regulations and communication issues.
- Consensus: Consensus is reached. There are many issues that lie ahead in all kinds of fields, from communication between branches to intra-legal regulations.

Table 1: Burn chart of the Delphi study. Colours denote the amount of consensus reached: green for consensus, orange for consensus on some issues, red for no consensus.

6 Security and its perceptions


The term security, as used in this thesis, should be read from the perspective of information security. In this perspective, security can be defined as the "protection of information from a wide range of threats in order to ensure business continuity, minimize business risk and maximize return on investment and business opportunities" (Hintzbergen, Hintzbergen, Smulders, & Baars, 2010), or as "The protection of information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide confidentiality, integrity, and availability" (NIST, 2006), while the ISO 27002 standard describes it as "preservation of confidentiality, integrity and availability of information; in addition, other properties, such as authenticity, accountability, non-repudiation, and reliability can also be involved" (International Organization of Standards, 2005, p. 10). The difference is striking, yet very applicable to this thesis. Information security can be perceived from many points of view: the NIST and ISO definitions clearly define information security in technological terms, whereas Hintzbergen et al. (2010) define it from a business perspective.

Figure 2: Security divided into two parts (Experience and Technical)
In this thesis, security will be discussed in a technological sense, with the goal of minimizing the impact on the organization when a breach of security occurs. This is also called information risk management. Information risk management exists because "[c]ompromise of a valuable information asset will cause dollar losses to the information's owner whether acknowledged or not; the loss could be either direct (through reduction in the value of the information asset itself) or indirect (through service interruption, damage to the reputation of the information's owner, loss of competitive advantage, legal liability, or other mechanisms)" (Blakley, McDermott, & Geer, 2008, p. 1). By managing the risks, the losses incurred at a breach of security can be minimized. These losses can be tremendous; see for example de Bruijn, Spruit, & van den Heuvel (2009) for how weak encryption led to millions in losses. Security governance has therefore become a hot topic on the agenda of IT managers. (For an overview of information risk management, see Blakley, McDermott, & Geer (2008).)
One way to look at information security is through a framework called the CIA triad. The CIA triad, meaning Confidentiality, Integrity and Availability, offers an overview of the most important features of information security. Confidentiality can be upheld by properly disclosing information, integrity can be upheld by allowing only valid and legal modifications to the information, and availability can be upheld if data remains available and is not lost, inaccessible or destroyed.
The results of the Delphi study conducted in this research show that auditability should be added as well, since it is a major feature within security governance and compliance, and the lack of auditability is a threat to cloud environments and their compliance. These main features of information security have also been expanded with accountability and assurance by NIST (Stoneburner, 2001). Assurance is interdependent with Confidentiality, Integrity, Availability, Accountability and Auditability (CI3A). "When designing a system, an architect or engineer establishes an assurance level as a target. [...] Assurance highlights the fact that for a system to be secure, it must not only provide the intended functionality, but also ensure that undesired actions do not occur." (Stoneburner, 2001, p. 4) Assurance is thus the overall term for the integration of the CI3A in a system and assures a certain level of security. The following table is adapted from de Bruijn, Spruit, & van den Heuvel (2009).

Objective: Confidentiality
Prevents:
- Unauthorized users from accessing resources.
- Unauthorized users and unsafe hosts from accessing the network.
- Confidential data from leaving the organization.

Objective: Integrity
Prevents:
- Disturbance of the integrity of data.
- Unauthorized users from editing resources.

Objective: Availability
Prevents:
- Unauthorized users from moving/deleting resources.
- Missing clues on threats.
- Downtime if availability is threatened.
- Unauthorized users and unsafe hosts from accessing the network.

Objective: Accountability
Prevents:
- Users from performing actions without any record.

Objective: Auditability
Prevents:
- Systems from being unable to be audited.

Table 2: CI3A with Assurance. Adapted from de Bruijn et al. (2009)

Accountability is also often discussed under the term triple A (AAA). AAA stands for authentication, authorization and accounting, and describes the three features of access control. As one can see, AAA and the CIA triad overlap in meaning and function. Authentication handles the integrity of users. Authorization makes sure that only authorized users get access and the functionalities that are assigned to them, which in the CIA triad is covered by confidentiality and availability. Accountability logs all user actions. This thesis will therefore refer to the extended CIA triad (the CI triple A, or CI3A in short) for all user control discussions.
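
To make the overlap between AAA and the CI3A concrete, the minimal Python sketch below (all names and data are hypothetical and not taken from any existing library or from the SeCA model) shows how a single access-control layer covers authentication, authorization and accounting, with the accounting log providing the record that Accountability and Auditability require.

```python
import hashlib
import time

# Hypothetical in-memory stores; a real system would use a directory
# service and a tamper-evident log.
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}
PERMISSIONS = {"alice": {"read:report", "write:report"}}
AUDIT_LOG = []


def authenticate(user, password):
    """Authentication: verify the claimed identity (integrity of users)."""
    digest = hashlib.sha256(password.encode()).hexdigest()
    return USERS.get(user) == digest


def authorize(user, action):
    """Authorization: grant only the functionality assigned to the user
    (confidentiality and availability in CIA terms)."""
    return action in PERMISSIONS.get(user, set())


def account(user, action, allowed):
    """Accounting: record every action so that it can later be audited."""
    AUDIT_LOG.append({"ts": time.time(), "user": user,
                      "action": action, "allowed": allowed})


def access(user, password, action):
    ok = authenticate(user, password) and authorize(user, action)
    account(user, action, ok)
    return ok


if __name__ == "__main__":
    print(access("alice", "correct horse", "read:report"))    # True
    print(access("alice", "wrong password", "write:report"))  # False
    print(AUDIT_LOG)  # the record that makes user actions auditable
```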
The SeCA model presented in section 10 tests whether a cloud environment can be technically safe enough for certain security requirements. The perceptions users have of the security measures, in other words the security experience, are not tested, as this is a different field of study that touches upon User Experience, Human-Computer Interaction and behaviour, as outlined in Gonzales & Sawicka (2002). Nonetheless, human factors in security have to be kept in mind throughout this thesis, as they play an important role.

Classification: 1: Top Secret
Data that should only be handled by specific people with the right authorization. Very high business impact if a data leak occurs.
Resulting measures: Personnel screening, data on organizational premises, data hosted only in the same country as the organization. Dedicated hardware, i.e. no multi-tenancy. Replication on at least n+1 locations. Data must be within the network.

Classification: 2: Secret
Data that should be handled by a limited number of people with the right authorization. High business impact if a data leak occurs.
Resulting measures: Screened personnel, within the region of the organization. Data may be off-premise. High Availability architecture.

Classification: 3: Private
Data is open to employees and partners. Some business risk involved in case of data leakage.
Resulting measures: Data can be hosted externally within the region of partners. Any employee can access the data. Five nines availability. Thorough backup solution.

Classification: 4: Public
Data that is open to anyone, for instance press releases. No to minimal business impact on data leakage.
Resulting measures: Can be hosted off-premise, anywhere in the world, with at least one location within the region of the organization. High availability is necessary for promotional purposes. Can be outside the company network.

Table 3: Exemplary data classifications

Because of the human perceptions of security, measuring security can become a subjective task. In order to prevent subjective measurement and to ensure that the measures are objective and based on technical features instead of perceptions of security, this thesis abstracts the security measures in terms of classification. Classification is an internal process within an organization that puts security requirements on information. This information can be a set of data, documents or any other holder of information. Classification generally has four classes, or levels, in which data can be classified. Depending on the class, the security measures that are taken to secure the information differ. This has to do with the nature of said information: trade secrets are classified at a different level than press releases. Trade secrets are of more value, have a larger threat profile, incur greater risks when lost and thus shouldn't be available to everybody. Press releases, on the other hand, are at the other end of the spectrum: they should be available to as many people as possible, at any time. In the examples in this thesis we use four different levels of classification, ranging from a low threat profile (public information) to a high threat profile (top secret information). The number of classification levels might differ between organizations, and with that the security measures in each classification. The four used in this thesis serve as an example to clarify concepts and for exemplary usage of the proposed model.
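
As an illustration only, the exemplary classification scheme of Table 3 could be encoded as follows, so that the measures for a given piece of data follow from its class rather than from individual perception. The class names mirror Table 3; the attribute names and values are simplified assumptions made for this sketch and are not part of the SeCA model itself.

```python
from dataclasses import dataclass


@dataclass
class Measures:
    """A simplified subset of the measures from Table 3."""
    on_premise_only: bool      # data must stay on organizational premises
    allowed_region: str        # where data may be stored and processed
    dedicated_hardware: bool   # no multi-tenancy allowed
    inside_network: bool       # data must stay within the company network


# Hypothetical encoding of the four example classes from Table 3.
CLASSIFICATION = {
    "top secret": Measures(True,  "own country",        True,  True),
    "secret":     Measures(False, "own region",         False, True),
    "private":    Measures(False, "region of partners", False, False),
    "public":     Measures(False, "anywhere",           False, False),
}


def measures_for(document_class: str) -> Measures:
    """Look up the measures; unknown classes default to the strictest level."""
    return CLASSIFICATION.get(document_class.lower(), CLASSIFICATION["top secret"])


print(measures_for("Secret"))
```

In the SeCA model described in section 10, a classification of this kind is the input from which the eight cloud attributes are derived.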


7 The Cloud and its Technologies


Before one can model the cloud, let alone create a Secure Cloud Architecture model, the cloud has to be defined. These definitions, including those of NIST and ENISA, are introduced in section 7.1, followed by a description of the technologies that comprise the cloud. First, virtualization is introduced (section 7.1.1); second, section 7.1.2 describes the characteristics of the cloud to which the NIST definition refers. Section 7.1.3 describes the delivery models, part of the definitions of both NIST and ENISA, including Infrastructure, Platform and Software as a Service. The following section, 7.1.4, continues elaborating on the definitions by describing private, partner, hybrid and public clouds. Lastly, geographic locations within cloud environments are described.

7.1 The cloud


The Cloud is called by some a paradigm-shift in computing (Voas & Zhang, 2009), by others it
doesnt even exist (Reuters, 2008). So how come this thesis is trying to research the security issues of
something that by some doesnt
even exist? Because the cloud does
exist. However, it is not a brand
new technology. The cloud has
always been here, under the name
of the internet, and the idea of
utilizing the internet as a storage
and computing power provider isnt
new either. In 1993, Eric Schmidt,
then CTO of Sun Microsystems,
said in an email When the
network becomes as fast as the
processor, the computer hollows
out

and

spreads

across

the

network. (Gilder, 2006) This the


network is the computer mantra is
basically what the cloud is all about.
Utilizing all the power that make
up the all-encompassing internet for Figure 3: The Cloud has a specific deployment model, delivery model and a set
of characteristics (shown in rounded boxes)

18

better productivity and scalability. So, the cloud isnt new, and yet some call it the new paradigm of
computing. That is because the cloud is a new delivery model, or as Mulholland, Pyke & Fingar state:
The big deal is that cloud computing is a disruptive delivery model. Its an economic, not
technological shift! (2010, p. 24).
The National Institute for Standards and Technology (NIST) defines the cloud as: "Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics, three service models, and four deployment models." (NIST, 2010) We will discuss these characteristics and models below. ENISA (the European Network and Information Security Agency) defines the cloud as "[..] an on-demand service model for IT provision, often based on virtualization and distributed computing technologies. Cloud computing architectures have:

- highly abstracted resources
- near instant scalability and flexibility
- near instantaneous provisioning
- shared resources (hardware, database, memory, etc)
- service on demand, usually with a pay as you go billing system
- programmatic management (e.g., through WS API)." (ENISA, 2009)

Both definitions are more or less the same, and we will use them throughout this thesis as the working definitions. These definitions show that ASPs (Application Service Providers) are more or less a part of the Cloud and that SaaS (Software as a Service) is actually a model within a cloud environment. Table 4 explains all characteristics.


Cloud characteristic: Highly Abstracted Resources
Description: Using virtualisation, resources can be created and scaled on the spot over one or more physical resources.

Cloud characteristic: Near Instant Scalability and Flexibility
Description: The ability to add or remove virtual resources with the click of a button.

Cloud characteristic: Near Instantaneous Provisioning
Description: The ability to supply resources, services and such nearly instantaneously.

Cloud characteristic: Shared Resources
Description: Resources can be shared by multiple tenants.

Cloud characteristic: Service on Demand
Description: Get the services needed on demand, and pay only for what you use (pay per hour, pay per use, etc.).

Cloud characteristic: Programmatic Management
Description: APIs provide interfaces to manage the cloud environment, e.g. via web interfaces, on a step-by-step basis.

Table 4: Cloud characteristics explained

7.1.1 Virtualization
Virtualization is the technology that enables any cloud environment. As an article in The Register explained, "[virtualization] creates a layer of abstraction between a virtual machine and the physical hardware. [...] this allows multiple virtual machines to run on a single physical machine, and also can enable a virtual machine to be moved quite straightforwardly from one physical machine to another." (Collins, 2009) Vendors such as VMware and Citrix, and projects such as Xen, offer solutions that can virtualize physical machines at different levels. These levels are known as Full Virtualization, Paravirtualization and Hardware-assisted Virtualization.
Full Virtualization uses a combination of binary translation and direct execution techniques. This approach translates kernel code to replace non-virtualizable instructions with new sequences of instructions that have the intended effect on the virtual hardware. Meanwhile, user level code is
Full Virtualization uses a combination of binary translation and direct execution techniques. This
approach translates kernel code to replace non-virtualizable instructions with new sequences of
instructions that have the intended effect on the virtual hardware. Meanwhile, user level code is
directly executed on the processor for high performance virtualization. Each [..] Virtual Machine [has]
all the services of the physical system, including a virtual BIOS, virtual devices and virtualized memory
management. This combination of binary translation and direct execution provides Full Virtualization
as the guest OS is fully abstracted (completely decoupled) from the underlying hardware by the
virtualization layer. The guest OS is not aware it is being virtualized and requires no modification.
(VMWare, 2007, p. 4) Full virtualization requires no hardware or operating system assistance to virtualize sensitive and privileged instructions. Because everything is run on top of a bare-bones operating system, called a hypervisor, it enables the best security options, since no shared service between other virtual machines can reach the hypervisor's processes and instructions. Microsoft Virtual Server and some VMware products, among others, offer full virtualization.
In order to avoid the overhead caused by the binary translation taking place in Full Virtualization, Paravirtualization "communicat[es] between the guest OS and the hypervisor to improve performance
and efficiency. Paravirtualization involves modifying the OS kernel to replace non-virtualizable
instructions with hypercalls that communicate directly with the virtualization layer hypervisor. [..] The
value proposition of paravirtualization is in lower virtualization overhead, but the performance
advantage of paravirtualization over full virtualization can vary greatly depending on the workload. As
paravirtualization cannot support unmodified operating systems (e.g. Windows 2000/XP), its
compatibility and portability is poor. (VMWare, 2007, p. 5)
Around the time virtualization was being introduced to the market, hardware vendors started to implement virtualization at the hardware level with technologies such as Intel VT-x and AMD-V.
Both target privileged instructions with a new CPU execution mode feature that allows the VMM to
run in a new root mode below. [..] Privileged and sensitive calls are set to automatically trap to the
hypervisor, removing the need for either binary translation or paravirtualization. (VMWare, 2007, p.
6) For an overview of virtualization, and its providers see (Blokdijk & Menken, 2009).
If we put this in the perspective of the cloud, the virtualization of the physical machine, thus creating one or more virtual machines, creates the scalability and flexibility of computing power. We can virtualize multiple physical machines to create one single virtual machine (also called an instance), and are therefore able to create massive machines from multiple physical machines, or vice versa. This also enables High Availability (HA), a system that has minimal or no downtime.
When combining this flexibility with virtual networking, we can create virtual networks, and thus virtual datacentres. Virtual networking is a term to describe the virtualization of network devices. Just as with regular virtualization, where physical machines are split into multiple virtual instances, physical network devices can now be virtualized, and thus split into multiple instances. The offerings of virtual networking are diverse. Virtual LANs, or VLANs, create groups of computers within a switch. VPNs use public lines of communication, such as the internet, to connect multiple networks together, creating one virtual network. Open vSwitch and vNetworkStack are software-based multilayer switches that can run on physical machines (thus allowing for scalability and flexibility through virtualization).
In order to supply the massive amounts of storage required for some applications (such as Facebook, which in 2009 had a 2 petabyte dataset (Thusoo, 2009)), virtualizing storage through Storage Area Networks (SANs) has been developed. "A SAN is a network of storage devices, such as tape libraries and disk arrays, which communicate through a SCSI protocol such as Fibre Channel and iSCSI." (Preston, 2002, p. 5) Protocols like iSCSI, which stacks SCSI procedure calls on top of TCP, allow storage devices to be mounted over the network. This means that one can reach an iSCSI device located in a datacentre from one's home environment. For the cloud this implies that storage systems from all over the world can be provisioned into one cloud environment. This allows multi-petabyte data sets to be backed up in a high availability environment, with disaster recovery at the other side of the world, since whenever a storage system fails, the iSCSI calls can be rerouted to the backup facility. This allows for minimal or no loss of data, and the technique can also be used for extending storage availability in the cloud and for load balancing. For an overview of storage networks and their implications, see Preston (2002).
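
The rerouting described above can be sketched conceptually as follows; this is not real iSCSI code, it merely illustrates how reads could fall back to a replica at another location when the primary storage system fails.

```python
class StorageTarget:
    """Stand-in for an iSCSI target at some geographic location."""

    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy
        self.blocks = {}

    def read(self, block_id):
        if not self.healthy:
            raise IOError(f"{self.name} is unreachable")
        return self.blocks.get(block_id)


def read_with_failover(block_id, targets):
    """Try each replica in order; reroute the call when a target fails."""
    for target in targets:
        try:
            return target.read(block_id)
        except IOError:
            continue  # disaster recovery: fall through to the next replica
    raise IOError("all replicas unavailable")


primary = StorageTarget("datacentre-eu")
backup = StorageTarget("datacentre-us")
primary.blocks[42] = backup.blocks[42] = b"customer record"
primary.healthy = False                            # simulate a failing storage system
print(read_with_failover(42, [primary, backup]))   # served by the backup replica
```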
By virtualizing the memory in the physical machines, just as we do with storage and computing power, a VM can be given more memory than the total amount available in one physical machine. In other words, we can create VMs with a total amount of memory, storage and computing power equal to that of all the physical machines in three datacentres, each located on a separate continent.
By clustering physical machines and network devices we can create an elastic system that can rapidly be provisioned and released to serve content to its users, which is what we call the cloud. Virtualization is truly the backbone of the cloud, allowing for near instant scalability and flexibility, provisioning and shared resources.
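
As a simplified illustration of this pooling (the hosts and numbers below are made up), the virtualization layer can be thought of as exposing the aggregated capacity of several physical hosts as one instance:

```python
# Hypothetical capacities of physical hosts in three datacentres.
hosts = [
    {"dc": "europe",   "cpu_cores": 64, "ram_gb": 512, "disk_tb": 100},
    {"dc": "americas", "cpu_cores": 64, "ram_gb": 512, "disk_tb": 100},
    {"dc": "asia",     "cpu_cores": 32, "ram_gb": 256, "disk_tb": 50},
]

# The virtualization layer can expose the pooled totals as a single virtual instance.
pooled = {key: sum(h[key] for h in hosts) for key in ("cpu_cores", "ram_gb", "disk_tb")}
print(pooled)  # {'cpu_cores': 160, 'ram_gb': 1280, 'disk_tb': 250}
```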

7.1.2 Characteristics
The NIST definition speaks of five essential characteristics, which have to be present to speak of a cloud environment. Below we outline those characteristics.
1. On-demand Self-service. Users can add or remove resources on the spot, from some kind of control, without the intervention of a third party. There is thus no more need to call your service provider to ask for 2 gigabytes of extra storage: one click on a button and you're done.
2. Broad Network Access. As it says, there should be broad network access for heterogeneous thin or thick client platforms (e.g. cell phones, notebooks, desktops).
3. The third characteristic is Resource Pooling, meaning that the provider's computing resources are pooled to serve multiple tenants, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. These resources can be storage, processing power, memory, network bandwidth, and virtual machines.
4. Rapid Elasticity, which means that resources can be rapidly and elastically provisioned, to quickly scale in and out. This comes with flexibility in time (from some minutes to years) and in size (from just 1 GB of additional bandwidth to exabytes of hard disk space).
5. Measured Service. Cloud systems automatically control and optimize resources by leveraging a metering capability at some level of abstraction. Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and the consumer.

Adapted after (NIST, 2010). Except for broad network access and measured service, all these characteristics are made possible by the virtualization technologies discussed above. Broad network access is evolving continuously through developments in dark fibre, which was mostly laid during the dot-com era.
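
A minimal sketch of how on-demand self-service and the measured service characteristic could look from the consumer's side is given below; the MeteredCloud class and its price are hypothetical stand-ins for a provider's control panel and billing API, not an existing interface.

```python
class MeteredCloud:
    """Hypothetical provider interface: self-service provisioning plus metering."""
    RATE_PER_GB_HOUR = 0.01  # made-up price

    def __init__(self):
        self.storage_gb = 0
        self.usage_log = []

    def add_storage(self, gb):
        # On-demand self-service: no call to the provider needed.
        self.storage_gb += gb
        self.usage_log.append(("add", gb))

    def release_storage(self, gb):
        # Rapid elasticity: scale back in when no longer needed.
        self.storage_gb = max(0, self.storage_gb - gb)
        self.usage_log.append(("release", gb))

    def bill(self, hours):
        # Measured service: pay only for what was actually provisioned.
        return self.storage_gb * hours * self.RATE_PER_GB_HOUR


cloud = MeteredCloud()
cloud.add_storage(2)          # "2 gigabytes of more storage", one call
print(cloud.bill(hours=24))   # roughly 0.48 in the made-up currency
```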

7.1.3 Delivery Models


The cloud has three distinct platforms on which a cloud environment can be offered. They are stackable, meaning that if you have a Software as a Service (SaaS) solution, chances are that your provider manages a Platform as a Service (PaaS), but takes services from an Infrastructure as a Service (IaaS) provider. These relations can thus be very complex (these complex relations and their issues will be discussed further on in this thesis). This, however, does not mean that every SaaS solution runs at the top of a stack of cloud platforms. A SaaS solution can run on a traditional hardware stack with no further cloud environment attached. This also works vice versa: you might be subscribing to services from an IaaS provider, but not running any SaaS application whatsoever. Figure 4 shows the hierarchy within the cloud: IaaS can be used to deploy PaaS solutions, and PaaS can be used to deploy SaaS solutions.

Figure 4: Delivery model pyramid (SaaS on top of PaaS on top of IaaS)


7.1.3.1 IaaS
The lowest platform in the delivery model pyramid displayed above, IaaS, or Infrastructure as a Service, provides the infrastructure of a server park: "[v]irtual machines and other abstracted hardware and operating systems which may be controlled through a service API" (ENISA, 2009). NIST describes it as: "The capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls)." Virtual Private Servers hosted as a cloud environment are often IaaS services. Rapid elasticity comes into play as more resources are required: the IaaS provider can then add more virtual machines to the subscription, and they are wound down when no longer needed (Mulholland, Pyke, & Fingar, 2010).
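
As a hedged illustration of such programmatic provisioning, the sketch below uses the Apache Libcloud abstraction layer; the credentials are placeholders, an Amazon EC2 account is assumed, and the exact driver arguments differ per provider and library version.

```python
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

# Assumed: an Amazon EC2 account; any Libcloud-supported IaaS provider would do.
Driver = get_driver(Provider.EC2)
conn = Driver("ACCESS_KEY_ID", "SECRET_KEY", region="eu-west-1")

size = conn.list_sizes()[0]     # smallest available instance type
image = conn.list_images()[0]   # an available machine image

# Rapid elasticity: provision an extra virtual machine on demand...
node = conn.create_node(name="extra-capacity", size=size, image=image)

# ...and wind it down again when it is no longer needed.
conn.destroy_node(node)
```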
7.1.3.2 PaaS
The middle layer, Platform as a Service, "allows customers to develop new applications using APIs deployed and configurable remotely. The platforms offered include development tools, configuration management, and deployment platforms. Examples are Microsoft Azure, Force.com and Google App Engine." (ENISA, 2009) "The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly application hosting environment configurations", according to NIST (2010). To expand on this, Mulholland et al. describe PaaS as platforms "[that] can be pre-configured to support specific use by an industry or an enterprise, complete with management and governance capabilities. However, the most common type of PaaS is the type that provides a core set of services to which a wide range of additional services can be added to leverage the core services." (Mulholland, et al., 2010) An example would be a Java platform, to which developers can add applications to leverage the platform and the programming language for rapid development, without the need to maintain the underlying technology (servers, tools, development environment, etc.).
7.1.3.3 SaaS
SaaS is software available on subscription, or as ENISA puts it: software "offered by a third party
provider, available on demand, usually via the Internet [and] configurable remotely". NIST explains it as:
"The capability provided to the consumer is to use the provider's applications running on a cloud
infrastructure. The applications are accessible from various client devices through a thin client interface
such as a web browser (e.g., web-based email). The consumer does not manage or control the
underlying cloud infrastructure including network, servers, operating systems, storage, or even
individual application capabilities, with the possible exception of limited user-specific application
configuration settings." One of the best examples is probably Google Docs, which provides its customers
with on-demand office tools such as word processing. "SaaS provides actual end-user functionality,
either as services grouped together and orchestrated or as a conventional monolithic application."
(Mulholland, Pyke, & Fingar, 2010)
7.1.3.4 OaaS
Apart from the three delivery models mentioned above, there are multiple parties that tend to acknowledge more levels in
the pyramid. These "Others as a Service" include BPMaaS (Business Process Management as a
Service), which is defined in (Mulholland, Pyke, & Fingar, 2010) as the ability "to create unique business
processes designed for unique purposes to link together multi-company value delivery systems that in
the past weren't feasible or economical to join together. Thus canned SaaS applications can become
participants in end-to-end processes" (internal quotes omitted).
Security as a Service is a term coined by security vendors, defined by McAfee as: "Rather than
acquiring your own security software tools and the technical expertise to administer them internally,
you contract with security vendors to have a turnkey service of virus defence, firewall management and
e-mail filtering. Outsourcing cyber-security eliminates all the labour and infrastructure, while still
giving you the state of the art in anti-virus, firewall and spam-fighting technologies." (McAfee, 2010)
Another popular term is Storage as a Service, in which a storage provider serves online storage for
data to clients. This data is often hosted in the cloud, and clients pay per gigabyte. These services can
be used as a backup solution.
The above-mentioned services are just a small selection of popular services in the cloud. Some of
them float between the three platforms mentioned above. Security as a Service, for example, might exist on
all levels, as its nature encompasses everything from the infrastructure to end-user authentication. This thesis limits
itself to IaaS, PaaS and SaaS, but the results might be very usable for other services than those discussed in this
thesis.

7.1.4 Deployment models


The cloud comes in four different deployment models: private, public, hybrid and
community/partner clouds. The difference between these four models is the openness of the cloud to
its tenants. Below is a description of each of these models.

7.1.4.1 Private Cloud


This cloud infrastructure is operated for just one organization. This does not mean that it has to be
managed by that organization. The management of the private cloud can be done by a third party, and
the cloud itself can be physically located on the premises of that organization, or can be hosted
somewhere else (in which case it is also called a virtual private cloud). The cloud can exist behind a
firewall of the organization, and thus be accessible only within its private network, but it can also be hosted
off-premise on dedicated hardware (thus with no multitenancy with other organizations). The main
difference between a mainframe or traditional internal datacentre and a private cloud is that there is a
virtualization layer that enables hosting SaaS applications, rapid deployment and other benefits of
cloud computing.
7.1.4.2 Community or Partner Cloud
In a community cloud, a community or group of organizations shares the same cloud infrastructure.
These communities have shared concerns, such as a mission, goal and/or policy. The cloud can be
managed by one of the organizations within the community, or by a third party, and may exist on or off
premise. (Mulholland, Pyke, & Fingar, 2010) An example of a community cloud is the Eucalyptus
Community Cloud. It is a platform for software engineers to test-drive Eucalyptus (cloud architecture
software). It features all possibilities that a self-hosted Eucalyptus cloud would have, thus making it
possible to use Eucalyptus before investing in a full implementation of it. (Eucalyptus, 2011)
7.1.4.3 Public Cloud
The cloud is open for use to a large group of tenants, who do not need to know each other. The
cloud is run by a cloud service provider. Examples range from the majority of offerings from Force.com
and Google's Gmail to VPS.net, Rackspace cloud hosting and other public services, free or on a
subscription basis.
7.1.4.4 Hybrid Cloud
This cloud is a composition, or hybrid if you will, of two or more clouds of the types mentioned above.
They are unique entities, but tied together with APIs to enable the exchange of data and applications.
Due to the nature of the different clouds the hybrid consists of, it can be deployed both on and off
premise, and be partly behind the firewall of an organization. An example is the announced app store
HP is building in order to communicate and process sales with public and business relations. It
features a public cloud in which consumers can buy products and services, and a private cloud for
product development and administration. (Hewlett-Packard Development Company, L.P., 2011)

8 Security issues in the cloud


Several issues and risks were identified during the Delphi session as well as during interviews with
specially selected experts. These issues and risks are both specific and non-specific to cloud computing. They
are described hereunder, starting with a general section on risks in the cloud (8.1), followed by
boundary risks (section 8.2), which were mentioned often during the surveys in the Delphi session.
These consist of legal or regional risks, geographic or geo-spatial risks, organizational boundary risks
and network or virtual boundary risks. All are discussed sequentially. A special section is devoted to
locationless clouds, a phenomenon that has a great impact on various factors in a cloud architecture. In
8.3, governance and compliance are discussed, followed by 8.4, where trust chains in the cloud are
described. Following is a section (section 8.5) on CI3A, an extension of the CIA triad. In section 8.6
data loss is discussed and the final section, 8.7, deals with encryption in the cloud.

8.1 Risks in The Cloud


As concisely described in Foundations of Information Security (Hintzbergen, Hintzbergen,
Smulders, & Baars, 2010, p. 9), information security is achieved by implementing a suitable set of
controls, including policies, processes, procedures, organizational structures and software and hardware
functions. Risks in the cloud can be assessed by analysing these controls. Hintzbergen, et al. (2010)
follow by stating that these controls need to be established, implemented, monitored, reviewed and
improved, where necessary, to ensure that the specific security and business objectives of the
organization are met. The focus on business objectives is especially relevant to this research. The cloud,
as discussed above, offers a variety of solutions to known business issues such as business continuity
and scalability. Deciding which system will be used in order to maintain the set security and business
objectives will be an organization-specific decision, but the objective of this thesis is to make that
decision easier by analysing the risks in the cloud.
As previous research has shown, many risks in the cloud are not specifically cloud related, but
browser, user or framework related. (Jensen, Schwenk, Gruschka & Lo, 2009) This thesis will not
discuss those issues at large, but will instead focus on the design and implementation of the cloud
environment from the security perspective at the server side.
The following table summarises all risks and security issues identified by the experts. The column
"cloud specific" denotes whether a risk/issue is cloud specific or not (yes, no) or whether there are new
perspectives on these risks in the cloud (partly).


Risk | Description | Cloud specific
Location awareness | Location awareness, or locationlessness when there is no awareness of the physical location of the information systems in use | Yes
Legal/regional | Depending on the physical location of the server, laws and regulations can differ | Partly
Geographic/geo-spatial | Distance between the physical systems in place | No
Organizational premises | Are the systems in the cloud environment hosted on organisational premises or not? | No
Network/virtual | Is the cloud environment within the perimeter of the network already in place? | Partly
Governance and compliance | Governance and compliance to standards and norms | Partly
Trust chains | The amount of actors involved to serve the subscribed service | Partly
Data loss | Losing data due to the added amount of links and new technologies | Partly
Encryption | Encryption techniques used or new applications for them | Partly

Table 5: risks and security issues in the cloud

8.2 Boundary related risks


With the capabilities of virtualization, discussed in 7.1.1, physical locations tend not to exist in the eyes
of the end user. Of course, the cloud has to be hosted in some datacentre, but from the perspective of
virtualization it does not really matter whether that datacentre resides in London, New York or Beijing.
The end user will just see its resources, and the first apparent notice it will get that a part of its
computing power comes from the other side of the globe is that there might be a propagation lag of
100 ms instead of 20 ms. This can be a significant competitive edge. It gets incredibly easy to scale, and
to provide users with off-shore backups, making your backups meteoroid proof (depending on the size
of the meteoroid that is going to hit the earth, of course). This, however, also requires new solutions for
data deduplication, and controls for AAA and geographic locations in order to adhere to the security
objectives of the organization.

Figure 5: boundaries depicted that can occur within cloud environments
Results from the Delphi session have identified four potential boundary issues: legal or regional,
geographic or geo-spatial, organizational premises, and network boundaries (depicted in figure 5, the red
bar being the firewall). These will be explained below in greater detail, but in short the following
definitions can be applied to these four:
1. Regional or legal boundaries are boundaries that mark geographic regions in the sense of legal
regions: countries, states, territories and so forth. They have differences in laws that
influence, for instance, privacy measures that have to be taken within the cloud. In this
perspective regional boundaries also influence governance & compliance and directly affect
Availability and Auditability within the CI3A (an extension of the CIA triad, elaborated in
section 8.5).
2. Organizational premises define the boundaries of an organization. Sensitive information can,
for instance, be kept on premise to create more assurance over the data. Likewise, it can
be kept off premise because the organization does not have the required funds or knowledge to
keep it on organizational grounds.
3. Network boundaries define whether the cloud should be an integral part of the company network
or not. For some data types it can be very important that the data stays within the organization's
network (which can be established using VPNs and other methods); for other data
classifications it is not an issue.
4. Geo-spatial or geographic boundaries are boundaries that are marked by the physical distance
between one point and another. These boundaries are used for disaster recovery, High
Availability (HA), latency and other attributes that are of importance within a cloud
environment.

8.2.1 Locationlessness
Because of the virtualized environment in which the cloud runs, geographic location does not tend to
be an issue in the eyes of the beholder. The end user might not, or in some opinions does not need to,
know where his data physically resides. Although it is indeed possible to create a locationless cloud,
this locationless behaviour of the cloud can be a serious risk which, according to a consensus of all the
participating experts, outweighs the benefits.

The real benefit is that any actors that are willing to harm the data, or the technology residing
under it, have no clue where to start. Taking out a node is easier than taking out a whole cloud
environment. This could thus provide an added layer of protection. It was even mentioned in the survey
that if one would carefully select the locations over which his or her data is dispersed, government
organizations such as the NSA would need to cooperate with the equivalent organization of, for
example, China in order to get hold of the data they wish to extract. Although this scenario is very
unlikely, and many arguments on the analysis of this scenario are out of the scope of this thesis, it
makes a compelling argument for locationless clouds for those who require minimal interference from
any third party. For the sake of staying within the scope of this research, we will define a locationless cloud
as: a cloud environment in which the end user has no awareness of where his data physically resides.
The issues of a truly locationless environment are plentiful. First of all, your data has to reside somewhere
physically in order for any system to reach it, even though it seems to the end user that his data is located all
over the world. This lack of control makes compliance with any standard incredibly hard. Physical
security becomes hard to control, such as access to the facilities. Next to that, legal issues arise, as
different countries have different legal systems and solutions are required to comply with each of them.
Then there is the issue of latency. With no knowledge of where data resides, and no obligation for
providers to provide the user with that information, it might happen that your data moves to the
other end of the world from one moment to the next, resulting in high latency and even time-outs. In
a poorly designed environment, features like HA might become compromised if mirroring/replication
nodes change often. Poor design might also make it impossible to set up a VPN connection with the
specific servers hosting your data. This would result in a significantly higher risk of Man in the Middle
attacks. That also applies to the connection to your data.

8.2.2 Legal or regional risks


Legal, or regional, boundaries are boundaries that signify separate legal systems. These boundaries
include cities, states, countries and territories. Some differences between legal systems can be significant,
such as the difference with respect to privacy between the European Union and China; some are
incremental, such as the differences between county and state laws in the United States. These
differences do, however, impose a risk if your data gets placed on a physical server across such a
boundary. For example, if your data gets stored on a server in a country with a high level of
corruption, (government) officials can unplug the hard disk drive with your data for ungrounded
reasons. Next to that, different legal systems have different perspectives on privacy and on the use of
subpoenas for data extraction from datacentres. As one expert commented: "bringing privacy
information out of the European Union can be [a] violation of local or European law". This means
that staying in compliance with laws, be they local, national or international, becomes more
difficult without knowledge of the physical location of the data store and processing unit. Depending on
the classification of the data, set objectives such as the screening of all personnel handling data cannot,
or can hardly, be complied with.
8.2.3 Geographic or geo-spatial risks
With geographic or geo-spatial risks, the distance of objects relating to "the relative position [..] on the
earth's surface" (Collins English Dictionary, 2009) is meant, in this case the distance between servers,
but also the location of each server. This can be of importance in the case of disaster recovery, but also
with regard to physical security as presented in security norms such as the ISO 2700x series. An
example of geo-spatial risk is a fire in a datacentre. When availability is of very high importance, a fire
can be disastrous to availability ratings. It is therefore of primary concern that replication takes place to
a server at a distant location as a measure of disaster recovery. Failback servers like these will prevent
downtime as they step in while the datacentre at the other location is burning down. Depending
on the risks and the threats, one might need to have a disaster recovery location off-shore. In the light
of location, one could also consider other features such as the building type, the accessibility of the
server, etc.
Geographic location should also be taken into account in the light of latency and propagation
speeds. Servers stationed at the other side of the world will have a harder time delivering data quickly to
the user than the server under the user's desk.

8.2.4 Organizational Premises


Organizational premises play a role in the physical location of the cloud environment. One can choose
to have the hardware reside either on or off organizational premises. For high security purposes,
keeping the hardware on premise, and thus fully under one's own control, might provide a benefit. Personnel
can be checked more easily, as you have full control of the HR policies, and the same holds for access control
to the datacentre. The risks incurred from intruders physically compromising the servers, and thus their data,
are lower, and any forensics are under your control, which might be a great advantage in some cases. This
extends the discussion on the geographic location of the server. Depending on the size of the
organization (is it a local organization, a multinational or a global organization) there can be a trade-off
in security between on-premise servers and geo-spatial choices.

8.2.5 Network or virtual boundary risks


Network or virtual boundaries indicate the boundary of an organizational computer network. This is
an important factor, as some information, such as trade secrets, is not wanted outside the corporate
network. Keeping a cloud environment within the boundaries of the network can be achieved by keeping
it on premise and thus physically in the network, or by creating a VPN connection or
a VLAN (in case of internal networks, dividing different departments for instance) in order to keep
the information within the network. Because some configurations stretch the enterprise network,
additional risks are incurred due to this stretch, as some of our experts mentioned
in the survey. A VPN over the internet to a cloud environment in an intranet means that
computers otherwise not connected to the internet suddenly are. This might create extra
vulnerabilities within the network. This stretch in the network is also noticeable in the added amount
of actors which have to be trusted. The cloud provider will probably have access to your network, or
the possibility to gain it illegally.
An added risk is the uncertainty of the WAN infrastructure at the provider's side. Connecting with
the cloud provider might create vulnerabilities that could threaten the corporate network. Next to that,
multitenancy might also be considered within the range of network boundaries. Although
multitenancy should never be a threat to the virtual machine, in that other tenants should not have the
possibility to enter your VM, it has been proven that a vulnerability at the OS level could provide
access to other VMs. Ristenpart, Tromer, Shacham & Savage (2009) describe ways to discover where
nodes are hosted, followed by a discussion of how to place a co-resident VM on that physical server in
order to be able to reach the hardware a selected node is on. By then compromising the system, the
selected node might be entered. The system tested was Amazon's EC2 compute cloud. This is a risk
that has to be considered, however small it seems to be (see Ormandy, 2007; Asadoorian, 2007; Mehta &
Smith, 2007 for an overview).

8.3 Governance & compliance


Executing governance and compliance is, according to our experts, a much debated issue. Because
governance and compliance greatly depend on the infrastructure of the system and the boundary issues
mentioned above, this topic largely depends on the chosen cloud environment.
Depending on the chosen delivery model, compliance can be completely out of one's hands. A SaaS
application depends on its vendor for governance and compliance. If one chooses a SaaS
application but has strict requirements, these have to be discussed with the SaaS provider. For PaaS it is

partly the same, as the program is deployed by the end user. Any compliance and governance within
the program and how it handles data is on the part of the developer. The governance of the
infrastructure and platform on which the application relies is in the hands of the provider. As with
SaaS, negotiations need to take place with the provider in order to secure compliance. For IaaS, most
of the governance and compliance lies in the hands of the tenant. The IaaS provider has to take care of
compliance with standards such as SAS 70, but many issues like privacy, data encryption and
authentication are the responsibility of the tenant.

Concerning deployment models, it all depends on who has access to what. In a partner cloud, one can
imagine that governance and compliance are a shared goal. In a private cloud this does not have to be the
case. This will influence negotiations. That does not mean that complying with standards within a public cloud
environment is hard. In the end, you are still bound to your virtual machine and you will not be
influenced by other tenants.

Figure 6: trust chains in hierarchical order. Doing business with one provider might result in indirectly doing business with another
Concerning boundaries, the major aspect is the geographic location of the servers. The easiest option is
of course the same region as the one in which the organization resides. Most knowledge of the laws and of
executing governance and assuring compliance will be available there. Auditing will not be an issue, as you can
find an auditing partner with whom you can easily communicate. That being said, the hardest option is
obviously a cloud environment dispersed over the globe. Although disaster-recovery-wise there will be
no issue complying with the toughest guidelines, getting audited and executing governance worldwide will be
tougher. A small company utilizing a cloud environment with datacentres on every
continent, while having branches in only one of those regions, will have more trouble getting its
system compliant, and thus audited, than if it only used datacentres in the same location where it
resides.
Although experts in the survey were wary about whether it could be done, in a personal interview
with a Chief Information Security Officer of a large utilities company it was made clear that it is

theoretically possible. The issue with getting a successful audit done on an environment like this is that
all auditors need to cooperate. Even though many of the large auditing firms, like KPMG and Ernst &
Young, have branches all over the world, it was mentioned that this does not mean that all of them are
willing, or able, to communicate with branches in other parts of the world. Furthermore,
relations have to be built with all cloud providers in order for them to permit auditing of their systems
when physical access for an audit is required. Audits of the virtual systems can be done remotely, and
are presumably much easier as they do not require physical access and can thus be done by one
auditing firm. It is thus an extensive and difficult job to get the whole environment in compliance and
audited to certain security norms.

8.4 Trust Chains


Trust is a major issue in any relation, be it personal or professional. Although this is trivial, cloud
computing can create trust chains in which the end user is not always aware which other links are
present in his chain of trust. This pertains especially to the delivery models. With IaaS, the tenant is
in direct contact with the owner of the infrastructure (in some cases there might be a reseller in
between), who can have outsourced duties associated with the maintenance of the physical systems. In
a SaaS model, one is not aware whether the SaaS provider also owns the platform, or the infrastructure. This
means that there might be a variety of different actors working on the cloud, all of whom might be able
to access the data that is being used in the SaaS in some way or another: actors whom the tenant
initially did not trust and now has to. This might result in actions that are a threat to the data.
In other words, trust chains can be opaque within the cloud environment. Although business is
about making relations and building trust, and thus this issue is not insurmountable, it is a risk factor.

8.5 CI3A
Assurance in the cloud can be defined in the terms of CI3A (confidentiality, integrity, availability,
accountability and auditability). CI3A is an extension of the well-known, de facto CIA triad, which
has been used as a standard framework for testing the confidentiality, integrity and availability of
systems, data flows and so forth. However, for the cloud it is too constrained because of the cloud's extended
reach in the virtual and physical sense. Once CI3A is clearly defined and controlled within the cloud
environment, one can speak of assurance within the cloud. The proposed model utilizes CI3A to assure
that the right level of security is maintained within the environment.


Confidentiality is reached by proper authentication/authorization controls and encryption methods
such as secure computing and two-factor authentication. Preventing data leakage is a central part of
the confidentiality strategy. The choice of distribution and delivery model influences the level of
confidentiality and the methods needed to assure confidentiality.

Figure 7: CI3A visualized
Integrity assures that only authorized actors have access to certain data and that said data gets distributed
only to authorized persons. Within that distribution, any editing or changes to the data should only
be made by the right persons. Man in the Middle attacks and other methods of data interception
violate integrity. Governance and compliance influence the integrity of the data: a fully compliant
environment is more likely to assure integrity. As with confidentiality, the chosen delivery and
distribution model influences the level of integrity and the measures needed to enforce it.
Availability comprises measures to prevent unauthorized actors from deleting or moving data,
or accessing those files, minimizing downtime of the environment and the perception of threats to the
environment. These measures could be an HA infrastructure, strong authentication servers and disaster
recovery. Availability plays a big role within the cloud environment, as servers can be hosted anywhere
in the world, at multiple locations. This can be an advantage in the eyes of HA and disaster recovery;
latency, desynchronization and vulnerabilities in the extensive set of transceiver links can pose threats.
Availability is linked to the aforementioned boundaries and to the delivery and distribution models.
Accountability defines the measures taken to assure that no actor can perform an action without a
record. This is needed for forensics and governance. The measures needed to assure accountability
greatly depend on the delivery model, but also on the distribution model and compliance in general.
Auditability, the ability of the environment to be audited, is directly related to governance and
compliance. Without a decent degree of auditability, compliance cannot be achieved. Auditability is
influenced by the delivery and distribution model, as well as by the geo-spatial and geographic boundaries.


CI3A covers all aspects of the proposed model, and when executed correctly it creates assurance
within the cloud environment. These aspects, and how they correspond to the SeCA model, are
outlined below.
The attributes described in the SeCA model were selected on the basis of the results of the Delphi
study and the literature research conducted. Notice how the deployment and delivery models are part
of the definition of the cloud, and that all boundary related risks are included as described in 8.2. The
encryption attribute was mentioned in the Delphi round, the network and premises attributes in the
literature research (especially the Jericho model).

Table 6: the attributes in the SeCA model (regional, geo-spatial, compliance, delivery model, deployment model, encryption, network, premises) and how they correspond to the CI3A aspects (confidentiality, integrity, availability, accountability, auditability)

8.6 Data loss


As discussed earlier, the cloud allows data to be hosted anywhere in the world. While a real benefit from the
perspective of HA and disaster recovery, it also brings risks in the areas of latency, desynchronization
of data and attacks on the transceiver links.
Because all data is hosted on a network, be it local or the internet, a connection has to be made
every time data is requested. In an environment where the storage network is connected to the
computing cloud via the internet, additional connections have to be set up to transfer the data to the
computing cloud. These connections, like any other connections, can be compromised by hackers.
Man in the Middle attacks could prove fruitful for intercepting data, with or without altering it to
compromise the system further. The changing of the physical location of data might pose latency
issues. If data is being moved from Amsterdam to Melbourne, European users might see an increase
in latency, which could mean that services slow down or, in extreme cases, become unusable.
This risk also applies to connections between parts of the cloud, such as the aforementioned storage

and computing network. Losing data between storage networks, due to time-outs for example, might
mean that data never arrives. In other cases packets might get dropped when synchronization is
misconfigured. This opaqueness within the WAN architecture is an added risk. Note that it depends
on the configuration and regulations of the specific cloud environment whether the WAN architecture
is unknown or fully transparent.
Furthermore, one client's data can be stored on the same disk as its competitor's. A compromised
system can serve your valuable data directly into their hands. Although every system can be
compromised, here tenants use the same physical resources. This might increase the risk of data leaking
into the wrong hands.
Ownership can be compromised on certain systems as well. Especially in SaaS applications this
seems to be a hazardous issue. Facebook has ownership covered in its terms as follows: "you grant us a
non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that
you post on or in connection with Facebook ("IP License"). This IP License ends when you delete
your IP content or your account unless your content has been shared with others, and they have not
deleted it." A particularly hazardous statement for any company, no matter what information gets
posted. (Facebook, 2010) Although this statement might not be legal in some countries (including
many countries in the European Union), one can imagine that not posting any IP content on Facebook
is a better solution than litigating against Facebook for misuse of IP content. It is advisable to discuss data
ownership with the cloud provider.
Data loss is an issue on every system, but with the added amount of connections and actors it
might pose a higher risk than in other system environments. Data ownership is a critical point
that should be documented clearly in the terms of service.

8.7 Encryption
Encryption plays a vital role within the cloud environment. It is affected by all but the geo-spatial
attributes in the SeCA model and affects the regional, delivery and deployment model attributes. Although
encryption is a broad topic that has been covered in many papers, theses and books, there are some
aspects that are specifically related to the cloud. VPN tunnels, together with SSH, can provide secure
access to the cloud environment. Two-factor authentication is a method of authentication in which
the user has to use two independent methods of authentication to reach a designated part of the
environment, which can be very helpful for the cloud environment. Many institutions are using
hardware key-tokens or SMS gateways in order to provide the second form of authentication apart
from keying in a password. Authentication servers using protocols such as RADIUS in combination with
LDAP, Kerberos or Active Directory can handle all access requests in a proven manner, as they are no
different from any LAN/WAN setup in a traditional environment. The author therefore believes that
in terms of access control, authentication and authorization, no issues are at hand other than those in a
non-cloud environment.
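As an illustration of the hardware key-tokens mentioned above, the following sketch generates a time-based one-time password (TOTP, RFC 6238), a mechanism commonly used as the second factor. TOTP is chosen here as one common example and is an assumption of this sketch, not something prescribed by this thesis; the shared secret is a placeholder.

# Minimal time-based one-time password (TOTP, RFC 6238) generator, the kind
# of logic running inside many hardware key-tokens and authenticator apps.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # 30-second time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Token and authentication server share the secret; both derive the same code
# for the current time window, so a valid code proves possession of the token.
print(totp("JBSWY3DPEHPK3PXP"))                       # placeholder secret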

Attribute | Affects encryption | Affected by encryption | Description
Regional | X | X | Legal restrictions might forbid or allow the use of some technologies. Likewise, if some encryption techniques are needed, they may affect the choice of physical residence.
Geo-spatial | | |
Compliance | X | | Compliance may force the usage of some practices that will affect encryption.
Delivery model | X | X | Delivery models may or may not allow for some techniques; this influences it both ways.
Deployment model | X | X | Same as delivery models (see above).
Network | X | | Depending on where the systems are in the network, encryption might be needed.
Premises | X | | Depending on the location of the systems, some encryption methods might be needed.

Table 7: encryption and how it affects or is affected by the other attributes in the SeCA model
Apart from the aforementioned VPN and strong authentication possibilities for boundary support and
authentication & authorization, an encryption method specifically pertaining to the cloud is secure
computing. Secure computing offers a solution to issues that arise when multiple systems have to perform
secure information transactions. Secure computing in essence involves Yao's Millionaires' problem.
The millionaires' problem is based on the following question: "Two millionaires wish to know who is
richer. However, they do not want to find out inadvertently any additional information about each
other's wealth. How can they carry out such a conversation?" (Yao, 1982) This question also pertains to
the cloud: how does one compute data in the cloud without decrypting it, and thus without the owner of the
computational unit getting access to the data? As one can imagine, asking Millionaire B whether he is richer
without giving away essential data (the amount of wealth accumulated by Millionaire A) nor receiving it
(the amount of wealth accumulated by Millionaire B) is difficult and cumbersome.
In Yao's paper, three methods are discussed, each consisting of at least seven steps to perform in
order to find the answer to this relatively easy question. One of them can be summarized in a very
simplistic way as follows: using the public key of Millionaire A, Millionaire B encrypts his wealth
with a random number added, which he selects from a set of N-bit integers. Millionaire A then
decrypts that value and calculates the modulus(P), where P is a prime calculated from the set of
N-bit integers. Whenever modulus(P) reaches 2, she stops. This results in a list of numbers.
Millionaire A sends this list to Millionaire B. The N-bit integer that Millionaire B chose has a
corresponding value in the list that he received from Millionaire A. He compares that number with
modulus(P). The outcome (equal, more or less than) is the answer to the millionaires' problem.
Compared to the three steps needed if both are willing to disclose the amount of dollars in their
bank accounts (1: A states his amount, 2: B states his amount, 3: compare the two), it is clear
that secure computing comes with a large overhead.
This research has been extended by Goldreich (2000), who researched the problem with multiple
actors (called SMC, Secure Multi-party Computation). Recent research involves SMC geometry,
studying transactions of polygons on convex hulls; see Wang, Luo & Huang (2008) for an
overview. It is known that any multi-party computational problem can be solved using the generic
technique of Yao. (Yao, 1982) To overcome the overhead of Yao's Millionaires' problem, and thus of
SMC, it seems that algorithms designed to compute a special task need to be written (Goldreich,
2000; Feigenbaum, Pinkas, Ryger, & Saint-Jean, 2004). Using encryption methods such as
homomorphic encryption and public key encryption, several algorithms have been shown to be applicable to
the cloud (Troncoso-Pastoriza & Pérez-González, 2010; Hu & Xu, 2009; Das & Srinathan, 2007) and
have proven to provide the security needed for the cloud in test situations approaching real-life
cloud environments.
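To give a feel for the homomorphic property these schemes build on, below is a toy sketch using textbook RSA with tiny parameters. It is only multiplicatively homomorphic and is not secure; it is not one of the cited algorithms, but it does show that a party can compute on ciphertexts without ever seeing the plaintexts.

# Toy multiplicative homomorphism with textbook RSA and tiny parameters.
# Not secure and not one of the cited schemes; illustration only.
p, q = 61, 53
n = p * q                          # public modulus
phi = (p - 1) * (q - 1)
e = 17                             # public exponent
d = pow(e, -1, phi)                # private exponent (Python 3.8+)

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

a, b = 7, 6
c = (encrypt(a) * encrypt(b)) % n  # the computing party multiplies ciphertexts only
assert decrypt(c) == (a * b) % n   # yet the key owner recovers 7 * 6 = 42
print(decrypt(c))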
These methods of secure computing would allow the creation of a chain of trust that is secure, even
though not all parties within the chain know or trust each other. This could overcome trust issues
that might exist in the field of cloud environments. Together with the enhanced and proven
techniques of authentication and authorization already available, encryption can make the cloud a very
secure architecture.

9 Current State of Cloud Security


In the above chapters, the cloud, its technologies, issues and risks have been described. In order to solve
these issues and reduce the risks in the cloud, the current state of the cloud has to be examined. This
chapter will introduce the possibilities of data location awareness at the time of writing, and what impact
they have on decision makers and experts in the field. It will also investigate the current security
governance standards that are being used in the cloud working field (section 9.2). Furthermore, it will
detail the comparison of cloud environments to insourced servers and traditional outsourced
servers, that is, servers that are hosted at an external location by a third party without using cloud
technology.

9.1 Data Location awareness


As discussed above, data location awareness is a serious issue that can have a tremendous impact on
the security of the cloud environment. It is therefore of importance that users and administrators have
full control of the location where their data are stored and computed. VMware, a major virtualization
vendor, has created an application called vCloud Director that allows cloud architects to build their
own virtual datacentres (called PvDCs by VMware (2010)). These PvDCs can be created on any
criteria the administrator selects, such as geographic location, availability and SLA. Cisco provides the
same functionality with Cisco Fabric Manager 5.0 (Cisco, 2011), which allows the creation of vSANs (virtual
SANs) between Cisco fabric switches; Data Center Network Manager can create virtual datacentres
like VMware's PvDC (Cisco calls them VDCs, Virtual Device Contexts (Cisco, 2011)); and Cisco has
developed OTV (Overlay Transport Virtualization) to converge geographically disparate servers. (Cisco,
2011) Other vendors have similar technologies, like HP FlexFabric and IBM's Open Fabric and Virtual
Fabric.
Data location awareness is thus technically possible and should be addressed within the SLA of the
cloud provider.

9.2 Security governance & Compliance


Several standards exist for security governance. Below is a collection of the most common standards
found within the cloud environment. Note that this is by far not an exhaustive collection of standards.
By complying with standards such as the ones described below, trust can be reinforced with all partners
in the chain. Furthermore, by complying with internationally accepted standards, an organization has proof
that its systems are secure and well audited. This proof can be a valuable asset in negotiations, trust
issues and SLAs.
The described standards vary widely in their applicability. The PCI DSS standard (section 9.2.1) is
only applicable to organisations using payment card solutions. SAS 70 (9.2.2) is an American
auditing standard which does not specify norms or rules on security or accountability,
but provides a framework for the auditing of organisations. The ISO 2700x series and COBIT are
internationally renowned security standards (sections 9.2.3 and 9.2.4 respectively). They provide
guidelines on security measures to be taken, auditing and governance according to the respective standards.

9.2.1 PCI SSC Data Security Standards


The Payment Card Industry Security Standards Council has developed a series of security standards
for payment card data security. The keystone is the PCI Data Security Standard (PCI DSS), which
"provides an actionable framework for developing a robust payment card data security process, including prevention, detection and appropriate reaction to security incidents". (PCI Security
Standards Council, LLC., 2011)
As the PCI states: "A service provider or merchant may use a third-party service provider to store,
process, or transmit cardholder data on their behalf, or to manage components such as routers,
firewalls, databases, physical security, and/or servers. If so, there may be an impact on the security of
the cardholder data environment.
For those entities that outsource storage, processing, or transmission of cardholder data to third-party service providers, the Report on Compliance (ROC) must document the role of each service
provider, clearly identifying which requirements apply to the assessed entity and which apply to the
service provider." (PCI Security Standards Council, 2010, p. 11) In other words, if one uses a
cloud provider and seeks PCI compliance, the cloud provider also needs to be PCI compliant.
This is defined in Requirement 12.8.2: "Maintain a written agreement that includes an
acknowledgement that the service providers are responsible for the security of cardholder data the
service providers possess." (p. 68)
Amazon's AWS and Rackspace, for instance, are PCI compliant.

9.2.2 SAS 70
The Statement on Auditing Standards number 70: Service Organizations, also known as SAS 70, is a
guidance to external auditors on the Generally Accepted Auditing Standards (GAAS). GAAS was
developed in the United States in 1972. Its international counterpart is the IFRS (International
Financial Reporting Standards).
SAS 70 provides guidance on the auditing of the internal controls of an organization. It is a
requirement of the Sarbanes-Oxley Act of 2002. It is used to show external auditors and customers
that an organization's environment has sufficient internal control activities. This is done by the company
specifying certain control objectives and control activities, which are then audited by an external auditor.
(SAS70.com, 2011)
Because the company itself sets the control objectives and activities, and the external auditor only audits
whether these objectives and activities are being followed, no real indication is given. An
extensive review of these objectives and activities needs to be done in order to understand what is truly
being audited.

9.2.3 ISO 27002


ISO/IEC 27002:2005, Information Technology - Security Techniques - Code of Practice for
Information Security Management, in short ISO 27002, is an information security standard
developed by the International Organization for Standardization (ISO) and the International
Electrotechnical Commission (IEC).
It provides best practices on information security management in the context of the CIA triad. It is
adopted worldwide, although some countries hold it under a different name. It consists of 12 sections,
ranging from risk assessment to compliance.
Certification for these best practices, and the ISO 27000 series in general, is specified by the ISO
27001 standard. It specifies a set of requirements for establishing and implementing the best practices
mentioned in ISO 27002.
Cloud providers are getting ISO 27001 certified; examples are Microsoft, Salesforce and Amazon.
This reinforces trust in the chain by providing proof that a system is secure and complies with an
international standard.

9.2.4 COBIT
Control Objectives for Information and related Technology (COBIT) was developed in 1996 as a set
of best practices for IT management. It consists of 34 high-level processes, covering 318 control
objectives, and is adopted worldwide.


COBIT comes with an IT Assurance guide in order to help service providers maintain assurance.

Although there is not much activity with COBIT in the cloud, it can potentially be very useful for
compliance and governance.

9.3 Compared to Outsourced servers


Depending on which cloud environment is used, outsourcing to a cloud provider can be as safe as
outsourcing to a conventional outsourcing partner. A conventional outsource partner is defined in
this thesis as a third party who hosts the data, owns the information system that the data is hosted
on and is located at an external location. We compare the cloud and this outsource provider on
attributes that have earlier been discussed as potential issues and success factors in the cloud: the
common characteristics of the cloud as defined by ENISA and NIST.
The major advantage of conventional outsource partners, according to the experts, is the shorter
chain of trust (although this also depends on the outsource partner: it may have sub-contractors or
(partly) outsource its capacity as well). The WAN infrastructure is also easier to manage. With cloud
providers, the WAN infrastructure can change at the click of a button. The virtualized nature of the
cloud can introduce unforeseen scenarios in the WAN infrastructure, which might not occur so easily
with a conventional outsourcing partner. The conventional partner might also provide for easier
auditability and, with that, easier compliance. But it depends greatly on the outsourcing partner and
auditability and with that reaching compliance. But it depends greatly on the outsourcing partner and
the cloud environment it is compared to. This is also the case for the geographic controls. As discussed
earlier, some cloud providers offer control over the geographic locations where the data in the cloud
will reside. This, according to the experts in the test panel, is standard with conventional outsourcing.
The client is always aware where its data will reside with conventional outsourcing. This gives the end
user a larger control over governance and ownership of data. Due to the dependence of the cloud
architecture its compared to, the following comparison (see also table 8) features a conventional
dedicated outsourced solution and a public IaaS cloud environment. Assumed is that because were
dealing with a dedicated solution, no virtualisation takes place as theres no need for it on the provider
side. If there is a need for it, it would be on request of the client. As were dealing with a single
location, auditing and compliance will be easier (see 8.2 and 8.3) but flexibility will be low as youre
limited to the services within that one place. Therefor the added value of cloud computing, such as
rapid deployment, scalability and HA/disaster recovery structures will vary depending on the size of
the organization and the datacentres it owns.


Characteristic | Conventional outsourcing party | Public IaaS cloud solution
Virtualisation | No | Yes, multiple systems and/or DCs linked to each other
Trust chain | Short and reliable | Can be long and opaque (see 8.4)
Multi-tenant | No | Yes
Ownership of architecture | Outsourcing party only | Cloud provider, see trust chain
Compliance | Easier to reach and audit | Can be hard to achieve and hard to audit (see 8.2 and 8.3)
High Availability (HA) | Might be possible (multiple contracts needed) | Most likely easy to set up (due to virtualization)
Disaster recovery | Might be possible (not at the same location) | Most likely easy to achieve (due to virtualization)
Flexibility | Low | Very high
WAN infrastructure | Transparent and unlikely to change | Can be opaque and change often according to the needs and load of the cloud (see 8.2.5)
Rapid deployment | Unlikely (special software needed) | Available
Geographic location | Known and likely nearby | Might be unknown and far away

Table 8: comparison of a dedicated outsourced solution and a public IaaS cloud solution

The major difference between conventional outsourcing and a public cloud hosted off premise,
within the same geographic region as the tenant, is that the public cloud will be used by more tenants
on the same hardware. This means that in the case of a leak in the virtualization platform, every
tenant's virtual machine might be compromised. The agility of the system, including disaster recovery
options and backup solutions, will be a profound advantage of the cloud environment. As virtualization
provides an added layer of security, a public cloud solution might be more secure than a traditionally
outsourced server. However, it all depends on the regulations within the SLA, and as cloud providers
deal with more customers on the same set of hardware, it might be harder to get the same SLA as you
were used to with your outsourcing provider. Also, depending on the data that will be stored, and the risks
associated with leakage of that data, it might be more secure to store data with a traditional
outsourcing provider due to concerns about multiple tenants on the same hardware. This could also be
overcome by using a private cloud. It provides the security of only your organization utilizing the cloud
architecture, yet gives the benefits of the agility of the cloud.

9.4 Compared to insourced servers


As discussed above in 9.3, comparing cloud providers to insourced servers is hard as it depends on the
cloud environment. Many of the issues and risks that come up when comparing outsourcing with
insourcing apply when comparing insourcing to cloud providers. However, in some cases setting up a
cloud environment might actually provide a higher level of security than using traditional insourced
servers. This higher level of security is reached due to the virtualized nature of the cloud. These
virtualized servers have HA possibilities, so the availability of data, through, for example, the means of
replication and backups, can be maximized with ease. Table 9 summarizes the comparison. We compare
the private cloud and the insourced solution on attributes that have earlier been discussed as potential
issues and success factors in the cloud: the common characteristics of the cloud as defined by ENISA
and NIST.

Characteristic | Conventional insourced solution | Private cloud solution
Virtualisation | Possible | Yes, multiple systems and/or DCs linked to each other
Trust chain | Short and reliable | Short and reliable
Multi-tenant | No | Possible (but with absolute control over the tenants)
Ownership of architecture | Yes | Yes
Compliance | Easy to reach and audit | Can be hard to achieve and hard to audit
High Availability (HA) | Might be possible | Might be possible
Disaster recovery | Hard to achieve (multiple locations needed) | Hard to achieve (multiple locations needed)
Flexibility | Low | Medium (only within the system)
WAN infrastructure | Transparent and unlikely to change | Transparent and controlled change
Rapid deployment | Unlikely | Available (but within the limits of the systems available)
Geographic location | Known and on premise | Known and likely nearby or on premise

Table 9: comparison of an insourced solution and a private cloud solution

A good comparison to a cloud solution would be a private cloud, hosted on premise. The major
difference between this cloud architecture and an insourced datacentre is that it leverages the combination
of techniques used in the cloud, such as virtualization, agility and multitenancy (the tenants being
workgroups, projects or any other sort of segregation within the company). One must keep in mind
that a private cloud hosted on premise is not as agile as a public cloud. For
every expansion of the environment, not of single VMs per se, a new physical server needs to be installed,
contrary to a public cloud, where the limits of the cloud environment are not as easily reached and there are
more options to extend the environment than only adding a new physical machine.

9.5 Issues and solutions


To summarize the current state of the cloud with respect to the issues projected in this thesis, it can be said that
for every issue there is a workable solution. Concerning location awareness, the technology is there to
provide full disclosure to tenants of where their data is being stored and/or processed. Although the
author of this thesis has found no evidence of any cloud provider giving its tenants control over the
warehousing of their data at the level the reviewed software offers, VPS.net for example does give its
tenants the possibility to choose either its London cloud or its US cloud.
Security governance is possible within certain limits. The current security standards can be applied
to cloud providers and, as shown, some providers have already been certified. Of course, whenever a
cloud provider is certified, the tenant should have its local architecture certified as well.
Comparing to traditional outsourced servers or insourced servers is difficult. It depends on the
many characteristics that envelop the cloud. However, making a more or less direct comparison with
a cloud that has been set up like a traditional outsourced server, we can conclude that the cloud, due to
its virtualization capabilities, can in fact be more secure than an outsourced non-cloud service. Looking
at insourced servers, the only advantage of a private cloud seems to be its agility. Of course, one must
remember that these are very crude and only technical comparisons and do not feature any cost
comparison, as costs will be company specific (for instance, the costs of education and implementation
will vary greatly per organization).
In general, all solutions can be found within the SLAs of the providers.

10 The Secure Cloud Architecture model


Resulting from the research conducted, we can summarize that the cloud can be secure, as long as its
policies and SLAs are correctly in place and enforced. The different factors and risks involved in cloud
computing make it difficult to pinpoint one secure cloud. In fact, that is impossible, due to the
diversity of cloud architectures and the data that is being stored on them. To circumvent this problem, a
model has been designed. This model has been tested in the Delphi study and will be further discussed
in 10.1.
The Jericho Forum has developed a model to "enabl[e] secure collaboration in the appropriate cloud
formations best suited to the business needs" (Jericho Forum, 2009), called the Cloud Cube Model. It
features four criteria to differentiate cloud environments from each other, namely internal/external,
proprietary/open, perimeterised/de-perimeterised and insourced/outsourced.

- Internal/external describes the physical boundary of the cloud environment and how it corresponds to your organisational premises. Internal would thus mean that the cloud environment is physically present within the organisational boundaries.
- Open/proprietary refers to the systems used in the cloud environment. Open means open standards, open source software and the like.
- Perimeterised/de-perimeterised refers to whether or not the cloud environment is within your IT boundaries. Thus a perimeterised cloud environment would be within the firewall of the organisation or included in it via a VPN connection.
- Insourced/outsourced defines whether the cloud is serviced by, respectively, the organisation's own personnel or by a third party.

During the research conducted, the author has found that the four criteria listed are valid, but do not completely encompass the varieties of cloud environments. The Cloud Cube Model is targeted at clouds in general, not specifically at the security of the cloud environment. It lists the Open and Proprietary attributes as a significant part of the cloud. In the model developed here, shown below, this attribute has been left out, as the majority of cloud environments are developed with open source software and further research into this was not possible within the time available. The author does not consider the difference between proprietary and open source software a significant security concern; the Jericho Forum itself lists it as a vendor lock-in issue rather than a security concern. A new model has been developed in order to fill the gap left by the Cloud Cube Model, discussed below.

10.1 Modelling the Cloud


The model described below gives an abstract overview of all the characteristics of the cloud. It defines a secure cloud architecture for a specific type of data that is defined by a classification. An example of such a classification is displayed below, in section 10.1.1.

Figure 8: the SeCA model. The input is a classification, which is passed through all the attributes (depicted horizontally) and results in a specification for a secure cloud architecture

10.2 Using the model


As shown in 10.1.3, the model outputs guidelines for the cloud environment and the specification to which a cloud solution should adhere. Below is a template which can be used for assessing the cloud architecture following the SeCA model. The flow chart below shows where this assessment ordinarily takes place. It is assumed that organisations have already classified their data at an earlier stage. For each classification a cloud architecture is assessed using the SeCA model. Once that is done, a list of cloud providers that can adhere to the results of the assessment is created. Ultimately a cloud provider is selected, arrangements are made and the data can be placed in the cloud.
The figure below shows the flow of the complete process. Section 10.3 discusses step 3: analyse the cloud architecture using SeCA.
[Flow: create classification → classify data → analyse cloud architecture using SeCA → list & select cloud providers → place data in cloud]
Figure 9: simplified process flow of placing data in the cloud.
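To make this flow concrete, the following minimal Python sketch (illustrative only; the function, provider names and fields are hypothetical) shows steps 3 and 4: a placeholder SeCA assessment turns a classification into a specification, after which only providers that satisfy the specification are listed.

    # Minimal sketch of steps 3 and 4 of Figure 9, with stand-in data;
    # assess_with_seca() is a placeholder, a real assessment walks every
    # attribute described in 10.3.1.

    def assess_with_seca(classification):
        # Step 3 (placeholder): derive the required architecture from the classification.
        if classification == "Top Secret":
            return {"deployment": "private", "premises": "on premise"}
        return {"deployment": "any", "premises": "any"}

    # Step 4: hypothetical provider catalogue and selection of matching providers.
    PROVIDERS = [
        {"name": "ProviderA", "deployment": "public", "premises": "off premise"},
        {"name": "ProviderB", "deployment": "private", "premises": "on premise"},
    ]

    def matching_providers(spec):
        return [p for p in PROVIDERS
                if all(spec[key] in ("any", p[key]) for key in spec)]

    spec = assess_with_seca("Top Secret")
    print([p["name"] for p in matching_providers(spec)])  # -> ['ProviderB']

In practice the provider list and the selection criteria will be organisation specific; the sketch only illustrates that the specification, not the classification itself, is what is matched against providers.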


Date: ____________

SeCA Data Classification Template Form

Classification Name/identification: ______________

Regional: ______________
Geo-spatial: ______________
Governance & Compliance: ______________
Delivery Model: o IaaS   o PaaS   o SaaS
Deployment Model: o Private   o Partner/Community   o Public   o Hybrid
Encryption: ______________
Network: o Within   o Outside   o Any
Premises: o On premise   o Off premise   o Any

Expert's Name: _________________________

Table 10: a SeCA template form to be used in assessing the architecture needed for a secure cloud solution using the SeCA
model

It can occur that each classification has a different output from the assessment (it is actually most likely to do so). In that case several options are open. For each classification a different list of cloud providers can be made in order to find and select the right cloud provider who can provide the cloud architecture needed. These can be combined in Hybrid Clouds. Note that a hybrid cloud solution gives the assessor/future client of the cloud provider a better negotiating position. One can also decide that for certain classifications it is simply not feasible to transfer the data into the cloud and thus stick with the solutions already in place.
The model does not provide the intelligence to determine which classifications could be hosted on the same cloud architecture. It is up to the assessor to decide which of the cloud architectures resulting from the assessments can be merged.


10.3 Inputs
The input to the SeCA model is a classification. This classification needs to be made within the organization by its security experts in order to define how data should be managed. These classifications tend to differ between organisations, as they depend on what kind of data is used, read, published and processed by the organisation. For the examples used here, we use the following classifications (which are also shown in Section 6, Table 3):

Classification / Measures

1: Top Secret
Data that should only be handled by specific people with the right authorization. Very high business impact if a data leak occurs.
Resulting measures: personnel screening, data on organizational premises, data hosted only in the same country as the organization. Dedicated hardware, i.e. no multitenancy. Replication on at least n+1 locations. Data must be within the network.

2: Secret
Data that should be handled by a limited number of people with the right authorization. High business impact if a data leak occurs.
Resulting measures: screened personnel, within the region of the organization. Data may be off-premise, HA architecture.

3: Private
Data is open to employees and partners. Some business risk involved in case of data leakage.
Resulting measures: data can be hosted externally within the region of partners. Any employee can access the data. Five-nines availability. Thorough backup solution.

4: Public
Data that is open to anyone, for instance press releases. No to minimal business impact on data leakage.
Resulting measures: can be hosted off-premise, anywhere in the world, with at least one location within the region of the organization. High availability is necessary for promotional purposes. Can be outside the company network.

Table 11: exemplary classifications that could be used as input in the SeCA model

These classifications are arbitrary and may differ per organization.
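Purely as an illustration, such classifications could be captured in a small structured form before being fed into the model; the fields below are a hypothetical subset of the measures in Table 11.

    from dataclasses import dataclass

    @dataclass
    class DataClassification:
        # Hypothetical subset of the measures from Table 11.
        name: str
        business_impact: str          # "very high", "high", "some", "minimal"
        on_premise_required: bool
        multitenancy_allowed: bool
        within_network_required: bool

    top_secret = DataClassification("Top Secret", "very high",
                                    on_premise_required=True,
                                    multitenancy_allowed=False,
                                    within_network_required=True)
    public = DataClassification("Public", "minimal",
                                on_premise_required=False,
                                multitenancy_allowed=True,
                                within_network_required=False)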

10.3.1 Attributes

The input is then tested against several attributes, which are all relevant to the CI3A (see also Table 6: the attributes in the SeCA model and how they correspond to the CI3A). The CI3A definition works both ways: on the one hand its content is defined by the data classification, on the other hand the attributes define how the definitions set in the classification result in a corresponding cloud architecture. Below is an outline of the attributes in the model. Each attribute features a table that shows the measure that needs to be taken for each data classification.
10.3.1.1 Regional
Regional corresponds to the regional boundaries discussed in 8.2.2. They are of paramount legal importance when it comes to boundaries and tie in with Governance & Compliance, discussed below. Regional can define multiple things. It might mean any legal boundary where the data are hosted/executed, such as countries, states and counties. If we look at the classifications above, the regional attribute will have different impacts. In the Top Secret classification, the data should be hosted and executed within the region of the organization. It depends on the organization where that would be: a multinational will have multiple offices in multiple regions, a small business will not. The Public classification gives more room in this case, only requiring that one hosting location is within the region of the organization. This might be set, for example, to satisfy legal directives such as those common within the EU, where all data of European organizations should be hosted in the EU. Depending on the size of the region, creating an HA architecture with disaster recovery might be difficult.

Classification / Measure

1: Top Secret
Cloud environment physically within the same region as the organisation.

2: Secret
Cloud environment physically within the same region as the organisation.

3: Private
Cloud environment physically within the same region as the organisation or its partners.

4: Public
Cloud environment physically has one system within the same region as the organisation.

Table 12: Regional measures to be taken for each classification

10.3.1.2 Geo-spatial
The Geo-spatial attribute defines the locality of hosting locations. There might be requirements concerning disaster recovery where hosting locations should be separated by at least a certain distance, or by certain geographic features such as rivers, lakes or mountains, but also requirements on a micro level, such as specific industrial terrains where datacentres should be located. It has been thoroughly discussed in 8.2.3. In the case of the classification Secret, which requires an HA architecture, one can imagine setting a large distance between the datacentres in order to provide replication and disaster recovery. Looking at Top Secret, this might be a struggle due to the fact that the servers have to be on-premise.

Classification / Measure

1: Top Secret
At least two physical locations, on different power grids, in different buildings.

2: Secret
An HA architecture is preferred, with at least one backup location in a different building.

3: Private
An HA architecture is preferred, with at least one backup location in a different building on a separate power grid.

4: Public
An HA architecture is preferred, with at least one backup location in a different building. Disaster recovery at a sufficiently distant location.

Table 13: Geo-spatial measures to be taken for each classification

10.3.1.3 Governance & Compliance


Governance & Compliance refers to the ability of the cloud provider to be audited and to comply with security standards. This is an overlapping attribute that relates to the physical location of the servers, their architecture, the delivery model, the deployment model and encryption, depending on the security standard that the cloud environment has to be certified to. Its implications have been discussed earlier in this thesis.
This attribute also determines which governance and compliance standards the environment has to uphold. This depends largely on the data being stored and used in the cloud environment, and on the objectives and expectations of the application in that environment. A Top Secret environment will, for example, not need to be certified, as it will only be used for internal purposes; as long as it is extremely secure, that suffices. For Public, however, it might be useful to have the data on a certified system, as it may be in contact with potential clients, end users and other external stakeholders. Private requires a more thorough certification, as the data will be handled by partners who in turn expect a decent security audit.

Classification / Measure

1: Top Secret
No need to adhere to specific standards, but frequent, good auditing of the security measures taken is needed. Screening of all personnel that have access.

2: Secret
No need to adhere to specific standards, but frequent, good auditing of the security measures taken is needed. Screening of all personnel that have access.

3: Private
No need to adhere to specific standards.

4: Public
No need to adhere to specific standards.

Table 14: Governance measures to be taken for each classification


10.3.1.4 Delivery Model


From an information security perspective, the delivery model will be chosen depending mostly on the Governance & Compliance attribute. The differences between IaaS, PaaS and SaaS have been discussed in 7.1.3. For Top Secret an IaaS service will be more likely, as more control is in the hands of the organization (in this case the IT department of the organization is the service provider to the whole organization). IaaS has simpler trust chains and is more flexible in configuration, although configuration at the software level is not included. At the other end of the spectrum, Public might be well served with a SaaS solution.

Classification / Measure

1: Top Secret
IaaS architecture, due to the need for flexibility and custom security measures.

2: Secret
Any, as long as trust chains are fully disclosed and personnel can be screened.

3: Private
Any.

4: Public
Any.

Table 15: Delivery model to be chosen for each classification

10.3.1.5 Deployment Model


The deployment model, just like the delivery model, depends on the certification and the nature of the data. It has been extensively discussed in 7.1.4.
For example, Private might be very well served by a Partner Cloud environment. Top Secret will need a private cloud to avoid multitenancy, and Public might use any of the offerings, which might result in a hybrid cloud architecture with one or more of the other classifications.

Classification / Measure

1: Top Secret
Private cloud, as no other tenants are allowed on the system.

2: Secret
Any, depending on the security measures taken to prevent others from accessing sensitive information.

3: Private
Any.

4: Public
Any.

Table 16: Deployment model to be chosen for each classification

10.3.1.6 Encryption
Depending on the classification, certain encryption features need to be present in the cloud environment. These might be mandatory in order to get certified, or to comply with rules and regulations in certain jurisdictions. Encryption also provides the option of in-network access to the data stored in the cloud environment, and secure computing can be used to satisfy demands and objectives. Encryption methods that apply to the cloud are described at large in 8.7. Data classified as Top Secret will require very strong encryption; secure computing, two-factor authentication and algorithms like AES and Elliptic Curve will be good options. For the classification Public, encryption used for authentication and authorization when adding, editing and deleting data will suffice; the data itself does not need to be encrypted.

Classification / Measure

1: Top Secret
Strong two-way authentication; all data is encrypted and then stored. For any computations, SMC might be a solution, or computation within the same system.

2: Secret
Strong two-way authentication; all data is encrypted and then stored.

3: Private
A proper authentication and authorisation system should be in place.

4: Public
A proper authentication and authorisation system should be in place for adding, deleting and editing content.

Table 17: Encryption measures to be taken for each classification
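As a hedged illustration of the "encrypt, then store" measure required for the Top Secret and Secret classifications, the sketch below assumes the third-party Python cryptography package is available; the tenant keeps the key, so the cloud environment only ever holds ciphertext.

    # Illustrative only: client-side encryption before storing data in the cloud.
    # Assumes the third-party `cryptography` package; Fernet wraps AES with an
    # integrity check. The key stays with the tenant and never enters the cloud.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # managed by the tenant (e.g. in an HSM)
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"classified customer record")   # before upload
    # ... only `ciphertext` is stored in the cloud environment ...
    plaintext = cipher.decrypt(ciphertext)                        # after download

Note that, as discussed in 8.7, this pattern only covers storage: as soon as the data must be processed in the cloud, either the key has to travel with it or techniques such as SMC are needed.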

10.3.1.7 Network
The network attribute defines the network boundary the cloud environment should be in and how it should be set up, as described in 8.2.5. If the cloud environment should be within the network, this can be set up with measures like VPN or WebDAV; in the case of on-premise cloud environments, VLANs can be used.

Classification / Measure

1: Top Secret
Within the network.

2: Secret
Can be both.

3: Private
Can be both.

4: Public
Outside the organisational network.

Table 18: Network options to be chosen for each classification

10.3.1.8 Premises
Premises defines whether or not the cloud environment should be on organizational premises. Hosting on premise might enhance the security of the environment and allows for more control concerning physical access to the environment and personnel screening. However, off-premise means that the data will be hosted by a company that has specific expertise with hosting and the cloud, which might add to the security as well.

Classification / Measure

1: Top Secret
On premise.

2: Secret
No preference.

3: Private
No preference.

4: Public
Off premise.

Table 19: Organizational premise options to be chosen for each classification


10.3.2 Outputs

The output is a list of requirements to which the cloud architecture should adhere; in other words, a framework for a secure cloud environment for the tested classification. If these requirements are applied to the cloud environment, the data can be stored and processed securely. Of course, these requirements need to be audited to create assurance according to the CI3A.
For the classifications used in this example (see Table 11 above), the following would be output:

Classification: Top Secret
Secure Cloud Architecture specification (attribute: value)

Regional: Cloud environment physically within the same region as the organisation
Geo-spatial: At least two physical locations, on different power grids, in different buildings
Governance/compliance: No need to adhere to specific standards, but frequent, good auditing of the security measures taken is needed. Screening of all personnel that have access
Delivery model: IaaS
Deployment model: Private cloud, as no other tenants are allowed on the system
Encryption: Strong two-way authentication; all data is encrypted and then stored. For any computations, SMC might be a solution, or computation within the same system
Network: Within the network
Premises: On premise

Table 20: Specification of the Secure Cloud Architecture for the Top Secret classification outputted from the SeCA model

Classification: Secret
Secure Cloud Architecture specification (attribute: value)

Regional: Cloud environment physically within the same region as the organisation
Geo-spatial: An HA architecture is preferred, with at least one backup location in a different building
Governance/compliance: No need to adhere to specific standards, but frequent, good auditing of the security measures taken is needed. Screening of all personnel that have access
Delivery model: Any, as long as trust chains are fully disclosed and personnel can be screened
Deployment model: Any, depending on the security measures taken to prevent others from accessing sensitive information
Encryption: Strong two-way authentication; all data is stored encrypted
Network: Can be both
Premises: No preference

Table 21: Specification of the Secure Cloud Architecture for the Secret classification outputted from the SeCA model

Classification: Private
Secure Cloud Architecture specification (attribute: value)

Regional: Cloud environment physically within the same region as the organisation or its partners
Geo-spatial: An HA architecture is preferred, with at least one backup location in a different building on a separate power grid
Governance/compliance: No need to adhere to specific standards
Delivery model: Any
Deployment model: Any
Encryption: A proper authentication and authorisation system should be in place for any actions
Network: Can be both
Premises: No preference

Table 22: Specification of the Secure Cloud Architecture for the Private classification outputted from the SeCA model

Classification: Public
Secure Cloud Architecture specification (attribute: value)

Regional: Cloud environment physically has one system within the same region as the organisation
Geo-spatial: At least two physical locations, on different power grids, in different buildings
Governance/compliance: No need to adhere to specific standards
Delivery model: Any
Deployment model: Any
Encryption: A proper authentication and authorisation system should be in place for adding, deleting and editing content
Network: Outside the organisational network
Premises: Off premise

Table 23: Specification of the Secure Cloud Architecture for the Public classification outputted from the SeCA model
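To show how these specifications follow mechanically from the attribute tables, the sketch below encodes a small, illustrative subset of Tables 12 to 19 as a lookup; a complete encoding would cover all eight attributes and all four classifications.

    # Illustrative encoding of a subset of Tables 12-19; wording and coverage
    # are abbreviated, the SeCA tables above remain the authoritative source.
    SECA_MEASURES = {
        "Regional": {
            "Top Secret": "Cloud environment physically within the same region as the organisation",
            "Public": "At least one system within the same region as the organisation",
        },
        "Deployment model": {
            "Top Secret": "Private cloud, as no other tenants are allowed on the system",
            "Public": "Any",
        },
        "Premises": {
            "Top Secret": "On premise",
            "Public": "Off premise",
        },
    }

    def seca_specification(classification):
        # Produces the kind of specification shown in Tables 20-23.
        return {attribute: measures[classification]
                for attribute, measures in SECA_MEASURES.items()}

    for attribute, value in seca_specification("Top Secret").items():
        print(f"{attribute}: {value}")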


10.4 Results from the SeCA Model


The model helps to get a clear overview of all the features of the cloud. If these are taken into account, the cloud can be a very secure location for anyone's data. The research shows that multiple attributes can overlap and influence other attributes: Governance & Compliance, for example, depends on the regional choices and the delivery model chosen, while on the other hand it might also influence these if the certification has already been defined in the classification. The output is a clear list of requirements against which any cloud environment choices and actions can be evaluated.


11 Conclusions
After researching and defining the issues the cloud faces and the solutions for those issues, several conclusions can be drawn. While reading these conclusions it is paramount to take into account that the cloud is a new business model for service providers, not a new technology in its own right. The central question is: is the cloud secure?
Defining something as secure depends on many factors. Depending on the sort of data, the classification of that data, and taking that wholly into the perspective of the cloud environment, it can be said that the cloud is secure in certain situations. Depending on the outcomes of such investigations, there should always be a cloud architecture that fits one's security needs. Better yet, the cloud can provide additional layers of security by utilizing virtualization, elasticity and HA architectures. Even though the additional layer of virtualization on the system might introduce additional hazards, these hazards can only be exploited when the virtualization platforms themselves can be compromised. With a minimum of known bugs, the last one dating from 2007, one can rationally say that the virtualization layer adds more protection than threats.
By using the SeCA model described above, each and every classification can be checked to see how a cloud architecture should be designed in order to meet the security standards needed. It will, however, depend on the cloud provider whether it can deliver the architecture that is needed.
For the most secure classifications, a private cloud, hosted on premise, within the network, with mirroring on a different physical location (branch office) and utilizing the needed encryption methods, will provide a very secure architecture whilst maintaining the flexibility the cloud has to offer.
For every architecture, data location awareness is essential. Without full knowledge of where the data resides and is processed, issues will arise in all aspects of the CI3A. Data location awareness will also provide the means for compliance, both legally and with security standards. These standards are being adopted by all major vendors, including Amazon, Google and Microsoft, with smaller ones following. This facilitates full compliance with the de facto security and auditing standards such as SAS 70, the ISO 27000 series, PCI and COBIT. It depends, once again, on the configuration of the cloud architecture and, where applicable, the willingness of the cloud provider to allow audits. If the selected cloud architecture features datacentres in widely spread parts of the world, auditing might be more complicated. This of course also applies to compliance with legal systems (privacy, intellectual property and auditing regulations), which can vary between jurisdictions. It is because of these implications that so-called locationless clouds are not preferable: they have an opaque layer that hides from the user vital knowledge needed to be assured of a secure cloud.


12 Further Research
Further research can be conducted in the legal field. This was out of the scope of this thesis, but the legal issues surrounding auditing, SLAs and NDAs are of paramount importance for security in the cloud. SLAs especially are of profound importance, as they describe what measures a cloud provider should undertake for the security of the cloud. This thesis unfortunately has not had the possibility to explore the provider side of the cloud environment in much depth.
Related to this is auditing in international or worldwide clouds. Auditing certifications, governance and compliance with legal systems in these environments means that auditing firms, datacentre owners, providers and application owners all need to work together in order to get a successful audit. In international and worldwide clouds these relations might become very complex, not to mention that multiple audit firms or offices have to work together. The issues raised by datacentres situated in different legal regions, such as China and the United States, are worth more research; auditing also plays a major role here.
Truly locationless clouds, which in essence provide security by obscurity, are another interesting topic. Although such a system seems hard to develop, as the end user still needs to reach his data without the location where it is harboured being disclosed, a proof of concept would be very interesting: it would not only show what state-of-the-art techniques the cloud can use, but also expose current issues in countries with different legal systems. In the light of, for example, freedom of speech, such solutions might be very useful. One might consider an interface that has to attempt to open a series of ports on a series of hosts (like port knocking, but spread over various hosts) in order to determine the location of the data.
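A hedged sketch of that idea, with made-up hostnames and ports: the client attempts connections in a fixed order across several hosts, and only a correct sequence would lead the location-hiding front end to disclose where the data can actually be reached.

    # Illustrative sketch of port knocking spread over multiple hosts; the hosts,
    # ports and the disclosure mechanism are hypothetical.
    import socket

    KNOCK_SEQUENCE = [("gw1.example.org", 7001),
                      ("gw2.example.org", 8002),
                      ("gw3.example.org", 9003)]

    def knock(sequence, timeout=0.5):
        for host, port in sequence:
            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            sock.settimeout(timeout)
            try:
                sock.connect((host, port))   # the attempt itself is the signal
            except OSError:
                pass                          # closed or filtered ports are expected
            finally:
                sock.close()

    knock(KNOCK_SEQUENCE)
    # After a correct knock, the front end could return the current location of
    # the data (e.g. over an authenticated channel) without it being stored anywhere fixed.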
A pressing issue not discussed, but worth researching, is that of third-party appliances that are currently installed in traditional datacentres. These appliances cannot be directly converted to the cloud, as the cloud does not offer a place for such appliances. At the moment of writing many of these appliances are being converted to the cloud by their developers; it is nonetheless interesting to see what impact these appliances have on the adoption of the cloud.
Although some cloud providers are certified, the impact of that certification on the real security of the services the provider offers is not always known. SAS 70, for example, does not offer any concrete security; it only offers a framework for auditing internal controls. The cloud provider will need to list its internal controls for any user to see what has been audited. It might be interesting to see how cloud providers use that information, what they do with it, and whether the certifications really add the extra level of security they are said to add.


13 Bibliography
Asadoorian, P. (2007, July 31). Escaping from the Virtualization Cave. Retrieved December 22, 2010, from PaulDotCom: http://www.pauldotcom.com/2007/07/31/escaping_from_the_virtualizati.html
Backes, M., Ning, P., Wang, Q., Wang, C., Li, J., Ren, K., et al. (2009). Enabling Public Verifiability and
Data Dynamics for Storage Security in Cloud Computing. Computer Security ESORICS 2009 (pp.
355-370). Berlin / Heidelberg: Springer.
Bartvagh. (2010, May 7). Gartner: Private Cloud Computing Plans From Conference Polls. Retrieved September 25, 2010, from MSDN Blogs: http://blogs.msdn.com/b/architectsrule/archive/2010/05/07/gartner-private-cloud-computing-plans-from-conference-polls.aspx
Blakley, B., McDermott, E., & Geer, D. (2008). Information Security is Information Risk Management.
ACM New Security Paradigms Workshop '08, 97-102.
Blokdijk, G., & Menken, I. (2009). Cloud Computing Virtualization Specialist Complete Certification Kit.
London: Emereo Pty Ltd.
Chen, Y., Paxson, V., & Katz, R. H. (2010). What's New About Cloud Computing Security? Berkeley, CA, USA.
Christodorescu, M., Sailer, R., Schales, D. L., Sgandurra, D., & Zamboni, D. (2009, Nov 13). Cloud
Security Is Not (Just) Virtualization Security. Chicago/Zurich, USA/CH.
Cisco. (2011). Cisco Data Center Network Manager Release 5.1. Retrieved January 5, 2011, from Cisco Data Center Network Manager: http://www.cisco.com/en/US/prod/collateral/netmgtsw/ps6505/ps9369/data_sheet_c78-631924.html
Cisco. (2011). Cisco Fabric Manager 5.0: Visibility and Control for the Unified Data. Retrieved January 5, 2011, from Cisco MDS 9000 SAN Management: http://www.cisco.com/en/US/prod/collateral/ps4159/ps6409/ps4358/product_data_sheet09186a00800c4656_ps6030_Products_Data_Sheet.html
Cisco. (2011). Overlay Transport Virtualization for Geographically Dispersed Virtual Data Centers: Improve Application Availability and Portability. Retrieved January 5, 2011, from Cisco Nexus 7000 series switches: http://www.cisco.com/en/US/prod/collateral/switches/ps9441/ps9402/solution_overview_c22-574939.html

Collins English Dictionary. (2009). geospatial. In Collins English Dictionary - Complete & Unabridged 10th
Edition. HarperCollins Publishers.
Collins, J. (2009, August 24). Virtualization and the Cloud: Just a stepping stone? Retrieved September 25,
2010, from The Register:
http://www.theregister.co.uk/2009/08/24/virtualization_and_cloud/
Das, A. S., & Srinathan, K. (2007). Privacy Preserving Cooperative Clustering Service. 15th International
Conference on Advanced Computing and Communications (pp. 435-441). IEEE.
de Bruijn, W., Spruit, M. R., & van den Heuvel, M. (2009). Identifying the Cost of Security. Journal of
Information Assurance and Security, 5(1), 79-83.
Dwivedi, H. (2005). Securing Storage. Westford: Pearson Education, Inc.
ENISA. (2009). Cloud Computing: Benefits, risks and recommendations for information security. ENISA,
Emerging and Future Risk programme. Crete: ENISA.
Eucalyptus. (2011). Eucalyptus Community Cloud. Retrieved April 5, 2011, from Eucalyptus: http://open.eucalyptus.com/CommunityCloud
Facebook. (2010, October 4). Statement of Rights and Responsibilities. Retrieved January 27, 2011, from Facebook: http://www.facebook.com/terms.php?ref=pf
Feigenbaum, J., Pinkas, B., Ryger, R., & Saint-Jean, F. (2004). Secure computation of surveys.
Proceedings on the EU Workshop on Secure Multiparty Protocols. Citeseer.
Gilder, G. (2006, October). The Information Factories. Retrieved November 30, 2010, from Wired.com:
http://www.wired.com/wired/archive/14.10/cloudware_pr.html
Gnagey, K. (2010, February). Cloud Storage - Where are we at? SNS Europe, 10(1), 31-32.
Goldreich, O. (2000). Secure Multi-party Computation. Working Draft.
Gonzales, J. J., & Sawicka, A. (2002). A Framework for Human Factors in Information Security. 2002
WSEAS Int. Conf. on Information Security (pp. 1-6). Rio de Janeiro: WSEAS.
Grossman, R. L. (2009, March/April). The Case for Cloud Computing. IT Professional, pp. 23-27.


Helmer, N. D. (1963, April). An Experimental Application of the Delphi Method to the Use of Experts.
Management Science, 9(3), 458-467.
Hewlett-Packard Development Company, L.P. (2011, March 14). HP Sets Strategy to Lead in Connected World with Services, Solutions and Technologies. Retrieved April 5, 2011, from HP Newsroom: http://www.hp.com/hpinfo/newsroom/press/2011/110314xa.html?mtxs=rss-corp-news
Hintzbergen, J., Hintzbergen, K., Smulders, A., & Baars, H. (2010). Foundations of Information Security.
Zaltbommel: Van Haren Publishing.
Hu, H., & Xu, J. (2009). Non-exposure location anonymity. ICDE'09. IEEE 25th International Conference on
Data Engineering, 2009 (pp. 1120-1131). IEEE.
Hwang, K., Kulkareni, S., & Hu, Y. (2009). Cloud Security with Virtualized Defense and Reputation-Based Trust Management. IEEE International Symposium on Autonomic and Secure Computing, 8, pp. 717-722. IEEE.
ICT-Kring Delft. (2009). ICT Security in de Praktijk. Apeldoorn: Thieme Print4U.
International Organization of Standards. (2005). ISO/IEC 27002:2005: Information technology - Security techniques - Code of practice for information security management. Geneva, Switzerland.
Jensen, M., Schwenk, J., Gruschka, N., & Lo, L. (2009). On Technical Security Issues in Cloud
Computing. Proceedings of the 2009 IEEE International Conference on Cloud Computing (CLOUD '09)
(pp. 109-116). Washington: IEEE Computer Society.
Jericho Forum. (2009, April). Cloud Cube Model: Selecting Cloud Formations for Secure Collaboration. Retrieved January 9, 2011, from Jericho Forum: http://www.opengroup.org/jericho/cloud_cube_model_v1.0.pdf
Kandukuri, B. R., Ramakrishna, P. V., & Atanu, R. (2009). Cloud Security Issues. IEEE International Conference on Service Computing (pp. 517-520). IEEE.
Levelt, W. (2010, August 3). Cloud Security - The fear for the unknown. Capgemini Cloud Computing Conference. 3. Utrecht: Capgemini.
McAfee. (2010, September 25). Security as a Service. Retrieved September 25, 2010, from McAfee: http://mcafee.com/us/small/security_insights/security_as_a_service.html


Mehta, N., & Smith, R. (2007, September 17). VMWare DHCP Server Remote Code Execution Vulnerabilities. Retrieved December 22, 2010, from IBM Internet Security Systems: http://www.iss.net/threats/275.html
Mulholland, A., Pyke, J., & Fingar, P. (2010). Enterprise Cloud Computing (1st Edition ed.). Tampa, Fl.,
USA: Meghan-Kiffer Press.
NIST. (2006, April 25). Glossary of Key Information Security terms. (R. Kissel, Ed.) Retrieved November 29,
2010, from Computer Security Resource Center NIST:
http://csrc.nist.gov/publications/nistir/NISTIR7298_Glossary_Key_Infor_Security_Terms.pdf
NIST. (2010). NIST Definition of Cloud Computing v15. Department of Commerce. Washington: NIST.
Ormandy, T. (2007). An Empirical Study into the Security Exposure to Hosts of Hostile Virtualized
Environments. Proceedings of CanSecWest Applied Security Conference. Vancouver.
PCI Security Standards Council. (2010). PCI DSS Requirements and Security Assessment Procedures, Version
2.0. PCI Security Standards Council, LLC.
PCI Security Standards Council, LLC. (2011). PCI SSC Data Security Standards Overview. Retrieved January 7, 2011, from PCI Security Standards Council: https://www.pcisecuritystandards.org/security_standards/index.php
Preston, W. C. (2002). Using SANs and NAS. Sebastopol: O'Reilly Media.
Reuters. (2008, September 25). What on earth is cloud computing? New York, NY, USA: Reuters.
Ristenpart, T., Tromer, E., Shacham, H., & Savage, S. (2009). Hey, You, Get Off of My Cloud: Exploring
Information Leakage in Third-Party Compute Clouds. CCS'09 (pp. 1-14). Chicago: ACM.
SAS70.com. (2011). SAS 70 Overview. Retrieved January 7, 2011, from SAS 70: http://sas70.com/sas70_overview.html
Skulmoski, G. J., Hartman, F. T., & Krahn, J. (2007). The Delphi Method for Graduate Research. Journal
of Information Technology Education(6).
Stoneburner, G. (2001). Underlying Technical Models for Information Technology Security: Recommendations of the National Institute of Standards and Technology. Department of Commerce, National Institute of Standards and Technology. Gaithersburg, MD: NIST.


Thusoo, A. (2009, June 10). Hive - A Petabyte Scale Data Warehouse using Hadoop. Retrieved October 5,
2010, from Facebook: http://www.facebook.com/note.php?note_id=89508453919
Troncoso-Pastoriza, J. R., & Pérez-González, F. (2010). CryptoDSPs for Cloud Privacy. CISE 2010. Hong Kong.
Vaquero, L. M., Rodero-Merino, L., & Cacer, J. (2009, January). A Break in the Clouds: Towards a Cloud
Definition. ACM SIGCOMM Computer Communication Review, 2009(39), 50-55.
Vigfusson, Y., & Chickler, G. (2010, Spring). Clouds at Crossroads: Research Perspectives. Crossroads,
16(3), 10-13.
VMWare. (2007). Understanding Full Virtualization, Paravirtualization, and Hardware Assist. Retrieved October 18, 2010, from VMWare: http://www.vmware.com/files/pdf/VMware_paravirtualization.pdf
VMware. (2010, September 1). Creating a Provider Virtual Data Center in VMware vCloud Director. Retrieved January 5, 2011, from VMware Knowledge Base: http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=1026296
Voas, J., & Zhang, J. (2009, April/May). Cloud Computing: New Wine or Just a New Bottle? IT
Professional, pp. 15-17.
Wang, Q., Luo, Y., & Huang, L. (2008). Privacy-preserving Protocols for Finding the Convex Hulls. The
Third International Conference on Availability, Reliability and Security (pp. 727-733). IEEE.
Wang, Q., Wang, C., Li, J., Ren, K., & Lou, W. (2010). Enabling Public Verifiability and Data Dynamics for Storage Security in Cloud Computing. Computer Security - ESORICS 2009, 355-370.
Wharton Business School. (2005, February 17). Delphi Decision Aid. (Y. V. Simon Galperin, Producer, & Skytech Systems) Retrieved October 5, 2010, from Delphi2: http://armstrong.wharton.upenn.edu/delphi2/
Yao, A. C. (1982). Protocols for Secure Computations. Proceedings of the 23rd Annual IEEE Symposium on
Foundations of Computer Science, (pp. 160-164). Chicago.


14 Appendix
14.1 Results from the Delphi study Round 1
Data results: Question#1
Question: First of all, thank you for participating in this research. In this research we
will refer to the cloud often. Just to be sure, with the cloud we mean The Cloud as
defined by ENISA: Cloud computing is an on-demand service model for IT
provision, often based on virtualization and distributed computing technologies,
which in essence is the same as the NIST definition. To be clear: we are not going to
talk about Cumulus or other forms of atmospheric vapor buildup :) I hope you enjoy
answering the questions. Let us commence: Do you have experience with the cloud,
and if so, on which level? (architect, developer, end user etc.)
Responses for this question
Expert's answer:
I have no experience with cloud computing so far.
Expert's answer:
yes, as a user and as a decision maker in a management role
Expert's answer:
end user
Expert's answer:
Yes, as architect. Providing advice on pro's and con's of cloud solutions.
Expert's answer:
I have written my thesis about Cloud computing (hereafter: CC). Furthermore
I review designs or other implementations of CC in my function as an
technical IT-auditor with a focus for/on security.
Expert's answer:
main experience is as end-user
Expert's answer:
yes, in the area of providing requirements from compliance point of view
Expert's answer:
I don't have experience
Expert's answer:
I have only experience in personal use. My company does not use any cloud
solution
Expert's answer:
user and security architect
Expert's answer:
Architect level
Expert's answer:
Sales and advise about security related issues in the cloud
Expert's answer:
Manager of architecture, development and deplopyment. Also services on the
cloud environment is under my responsibility
Expert's answer:
As an IT auditor we audit companies which use cloud computing in their line
of business
Expert's answer:
Little, as an architect
Expert's answer:
I do not have much experience with cloud applications other than as an end
user of certain distributed computing projects such as Folding@Home.
However, there are a lot of tools in the wild that make use of a cloud
environment that I am aware of (have tested or researched) such as Web Of
Trust or other community-based site ranking systems.
Expert's answer:
I'm from a security point of view interested in the cloud and
applications/usage

Data results: Question#2


Question: Do you believe that the Cloud poses new (unforeseen) security issues?
Experts' comments for this question
Expert's comment:
any new system has his own issues
Expert's comment:

Yes, but it depends on what systems are part of the cloud used by a company
to store and/or process data. If any of the systems are untrusted, the safety of
the data can be compromised. The data on any one system may not be
complete (since this is beyond the system's control) but it remains a risk.
Expert's comment:
Some new security issues, especially due to virtualisation. But mainly loss of
control on current security measures
Expert's comment:
Yes; Because CC is the combination of a lot of computing techniques which
have their security issues on their own. If you combine the security issues you
will get a new attack vector!
Expert's comment:
As the environment will differ from the current known environment, it will
create new issues
Expert's comment:
- Uncertainty about the location of the data, which is important for complying with privacy legislation - uncertainties about the possibilities for governments to access the data - uncertainties about the quality of statements by auditors in countries with a lower standard than in North America and Europe
Expert's comment:
I think the cloud makes the use of information everywhere, with every device
and on any moment more possible than using information in a strictly closed
company environment
Expert's comment:
The cloud has mu boundaries, where youre data is stored can be any place on
earth. Your data is no longer under your own sight.
Expert's comment:
The location of data is not known, so this poses additional risks to mine
opinion
Expert's comment:
enterprise wide authorisation
Expert's comment:
I answered yes, mostly because I believe that security issues are part of every
application there is. The cloud actually increases potential issues by
extending the reach of your 'trusted network'. However, I'm not sure any of
those potential issues are new by any means.

Data results: Question#3


Question: Do you believe that the possible lack of geographic physicality (also
known as: Locationless) of the Cloud is a security risk, and why?
Experts' comments for this question
Expert's comment:
you can't check the guards
Expert's comment:
Yes, depending on the laws in the region where a system in the cloud is
located. Countries have different laws regarding privacy, the use of
subpoenas to extract data from providers or ISPs, etc.
Expert's comment:
Yes, physical location based counter measures will (partly) not work anymore
Expert's comment:
The answer is partly true, partly false. It is most depending on the location
were the data will be stored. I would'nt trust my data in America as well as in
China. But the true locationless solution provides help in this case as the NSA
or other Government organization would not know where to search.
Expert's comment:
bottom line, the on-demand provider should garantee availability, meaning
instability of an region could be a risk. provider should cope with this in a
transparent manner.
Expert's comment:
One has to keep in mind that in every country or region, different laws are
applicable, unless you know to which laws you need compliance, you have to
take all into consideration
Expert's comment:
See the previous comment
Expert's comment:
First: Yes, bringing privacy information out of the European Union can be
violation of local or European law. Second: if you don't know were your
information is being kept, how do you know if the system administrators are
reliable? third: if you have competitive information, trade secrets or
governmental secrets, you don't want to have the risk that this information
gets in wrong hands and/or gets unauthorized disclosed.
Expert's comment:

International law enforcement


Expert's comment:
Duw to the fact that you do not know where youre data is. Within the
Netherlands we have a privacy regualtion (WBP). If private information is
disclosed in countries other that The Netherland we do not have a
law/regulation to go to court.
Expert's comment:
Your data is hosted there where the storage is cheapest, that can be China.
Everywhere are backdoors build in: usa does it, israel does it, china does it.
Expert's comment:
Will your data be available? And to who? How is access on OS level
garanteed?
Expert's comment:
dat stored subject to different laws
Expert's comment:
I answered no, but it's a double-edged sword. It will increase security because
the architecture of the system itself will be difficult to attack directly seeing
as it won't be running on a server in a data center somewhere but spread out
over a host of individual machines. It can also be a security risk though, by
making it easier to infiltrate for potential attackers. Since cloud-based
applications are mostly community-driven, anyone can become a part of that
community, regardless of their motives, so a double edged-sword.
Expert's comment:
You don't know where your data is, is in in your country, in Europe or
somewhere else, What about NL and EU privacy regulations What happens
with your data? Who and under what law are you protected?
Data results: Question#4
Question: Do you believe that the possible lack of geographic physicality (also
known as: Locationless) of the Cloud is a security risk, and why?
Experts' comments for this question
Expert's comment:
The more you share, the more you need to be aware of possible conflicts and
possible abuse of resources. Considering that the cloud is all about sharing, it
is my opinion that this presents users with trust issues.
Expert's comment:
regardless of using clouds or not, partnerships could get more complex. this
increase will in any case improve trust issues

Expert's comment:
As answered in the previous two questions, yes.
Expert's comment:
Maybe not the complexity, but just that with cloud solutions more
responsibility is moved to the provider (e.g. performing upgrades, testing new
functionality, backups, ...).
Expert's comment:
Yes; There will be resellers who provide other services. Any mailcious
company could therefore become a reseller or partner.
Expert's comment:
the more partners involved the more trust issues are a risk.
Expert's comment:
Not so much trust, one will be more depended in partnerships and therefor
ask for more evidence to proof trust. Main issue will be how things will be
guaranteed when partners are using third parties in between.
Expert's comment:
Even if assurances can be obtained on the basis of contracts within the usual quality frameworks in North America and Western Europe (plus Australia), that does not mean that the same level can be achieved when subcontracting (of management, for example) to other parts of the world.
Expert's comment:
as my statement in question 3, it is possible that your information is disclosed
by unautorised peolple or even geovernments. You don't know were your
information exists and who is being care about your information
Expert's comment:
There could be a "third" or "fourth" vendor situated behind your Trusted
partner, witch you knowing it.
Expert's comment:
More partners are working together in the cloud, this chain is growing. So i
believ this is true.
Expert's comment:
storage is business and thus there is trade in storage over many chaines. You
only know the last chain
Expert's comment:

Cloud will provide a more easy administration of the customers and less
configurations. The infrastructure issues will become more as connecctivity is
the key problem.
Expert's comment:
No the trust issues are the same with outsourcing
Expert's comment:
In current business and organisations it already is hard to commit to
agreements and SLA's. In more complex business relations this matter wil
also become more complex and sanctions on not reaching SLA requirmenst
wil be dealt with on a legal level.
Expert's comment:
I believe trust will be difficult to establish. At some point, there will need to
be some hierarchy attached to the system, with trust levels, otherwise that just
leaves openings for security problems.
Expert's comment:
I think that separation of data is an issue, but also common sense of the users.

Data results: Question #5

Question: Encryption has been called the savior of the Cloud. What are your
perspectives on that statement?
Responses for this question
Expert's answer:
If it has convinced users that the cloud is safe to use, then yes. But whether
that means that the feeling is justified, remains to be seen.
Expert's answer:
the system will always be capable of reading your data. without crypto the
cloud would be useless, but that doesn't mean that with crypto is will all of a
sudden become a safe haven
Expert's answer:
If you store the encrypted data as well as the key on the system in the cloud, it
can be hacked - as is with encrypted DVDs, HD-DVD, Blu-ray. Nonetheless,
if the key is not present on the system in the cloud (which means the system
is only storing the data since it cannot process or read it), encryption can keep
it safe - as long as the encryption is sufficiently secure.

Expert's answer:
Excryption is clearly essential. But is certainly not enough. And ...
governments would like to have a back door.
Expert's answer:
Encryption is not the answer to security. The weakest link remains human
(intervention).
Expert's answer:
I believe in the public cloud encryption is a must and offers a kind of a
needed risk management. I am not convinced if this is needed in case of a
private cloud.
Expert's answer:
Depends where it will be used and especially, who is the CA for these keys.
Because they have to be trusted by all involved parties, and proof that as well.
Expert's answer:
Encryption is only one of many kinds of measures that can provide a form of security. So it is of significance, but not a cure-all.
Expert's answer:
Encryption can realy be a savior of the cloud in my opinion. If I am the owner
of the key material wich encrypts the data and the hosting provider does not
hav a key.... Is my total chain, from display until SAN encrypted? If yes it can
be a solution. Be aware performance issues with encrypting and decrypting
on the fly.
Expert's answer:
Only if the encryption algorithms are public, so everyone can inspect them.
Expert's answer:
Encryption can be one of the solution to provide confidentiality of informatie
stored in the cloud. I do not believe that it ia a saviour for the cloud.
Nowadays companies using encryption still have problemen regarding
cryptographic key management and compromitation of data...
Expert's answer:
There is no unbreakable encryption and it is not the savior, that is only from a
selling perspective. You have to backup, even in the cloud, you have to be
able to reproduce your data because of state regulations. So encryption, there
is allways a weak spot.
Expert's answer:
Encryption does not always work, because where do you start the encryption.
If the encryption is to early in the proces you can't handle the data in the
application. To late and to many people will have access


Expert's answer:
I think that NAC (network access control) solutions in combination with
encryption will be the main topic for cloud solutions. How do I get into my
cloud environment combined with other clouds and with single sign on -thats
the question.
Expert's answer:
No, not only encryption. With encryption privacy and exclusiveness may be
covered, but certainly not trust!!! And what about availability?
Expert's answer:
I do believe encryption is a necessary part of any cloud based application or
system. Mainly because the user does not know where that information is
going and could potentially be hijacked by unauthorized eyes. Having the
transmitted information encrypted with a strong encryption scheme can
greatly reduce the risks. Risks won't be eliminated, as encryption can be
defeated, but it will make the information much harder to get to for
unauthorized persons.
Expert's answer:
Encrypion helps to make it safer. But where are encryption keys in a virtual
environment? what happens if close your session? Is memory cleaned and
given back to other users, what about system management and key
management?

Data results: Question #6


Question: Do you believe that all data should be encrypted in the cloud, and why
(not)?
Experts' comments for this question
Expert's comment:
Whereas the level of encryption might be variable, and will probably depend
on the importance of the data to the ownner, I feel using unencrypted data is
like leaving your door open, because nobody should steal.
Expert's comment:
when building a public wiki in a cloud, crypto would only create overhead.
anything you would rather not share should be encrypted. secrets should
come near anything cloudy


Expert's comment:
As stated in the previous answer, if the data needs to be processed or altered
in the system in the cloud, it needs to be decrypted - and for this, the
decryption key needs to be present on the cloud system, rendering it
vulnerable to attack. Either the data is safe but cannot be processed, or the
data can be processed but is unsafe.
Expert's comment:
depends on required security level. But most data will need to be encrypted.
Expert's comment:
Not all data, security sensitive information should be encrypted. The
overhead for using encryption on all data is just too much!
Expert's comment:
i think this denpends strong on the importance of the business data.
Expert's comment:
That is up to the company, depending the value of the information and/ or
risk involved, you define the appropriate measures to mitigate or protect
them.
Expert's answer:
No opinion
Expert's comment:
With encryption, the question is whether the keys might end up in the wrong or unwanted hands. Moreover, during processing the encryption has to be lifted, which again creates all kinds of opportunities to tap data in unencrypted form. That is not to say that encryption cannot make a valuable contribution to security
Expert's comment:
No, there is a lot of data wich is public data or internally data for wich the
need to encrypt is not very necessary. Only confidential or secret data should
be encrypted. This means, a data classification is necessary and should be
implemented
Expert's comment:
confidentiality and integrity
Expert's comment:
I do not believe that encryption is the solution. I think that is is more
important to classify data so you can deside what information can be in the
cloud of should never been in the cloud.
Expert's comment:

yes I believe that, but is not possible to do that


Expert's comment:
Only encrypt sensitive data. I think you should use riskanalyses to determine
where to encrypt
Expert's comment:
Yes, but you need good and transparant key-management. This wil become
the weak link.
Expert's comment:
It depends on what the level of sensitivity of the data is. Just like with any
other application or system, the data itself needs to be categorized and
assessed to know whether or not it is sensitive in nature. Any PII (Personally
Identifiable Information) or IP (Intellectual Property) should be encrypted as
a measure of protection for the persons involved.
Expert's comment:
Why, encryption is not always nessecary. security measures should only be
used if it is a requirement according to your risk analysis

Data results: Question #7


Question: An often said comment is that the advantages of the cloud outreach the
security risks. How do you feel about this?
Responses for this question
Expert's answer:
I am afraid it will be like that. But security should come first and cloud
computing should only be used when security allows it.
Expert's answer:
it all depends on what you have to loose. this will vary from case to case.
Expert's answer:
No matter the advantages, if you store sensitive data in the cloud you do not
want it to be stolen.
Expert's answer:
I agree in general. For specific cases where risks or impacts are high this will
be different. Also at the moment cloud security is still in its infancy, so risks
are still difficult to assess and therefore more care should be taken.

Expert's answer:
It depends per (business) need. If a company doesn't need much security or
plan to implement their security needs themselves it is a perfect solution!
Expert's answer:
it depends strongly on the importance of the business data. I believe that the
cloud will be the way to go.
Expert's answer:
I don't agree yet, it has to proof itself in the future, if they can meet all
requirements (risk-based). To me currently the security risks are bigger.
Expert's answer:
Risks related to the unauthorized disclosure of data can weigh so heavily that no advantage outweighs them.
Expert's answer:
I don't think that is true. However, you should think very carefully about what
to store in the cloud and what to store local. Think about laws, regulations,
classification end be sure your organisation is aware about information
security. That in combination with a trusted partner and the neccesary nondisclosure agreements and control moments (audit) should make it possible to
use the cloud for at least a part of your business
Expert's answer:
is the basis for cloud-computing is driving by a proper risk-analysis this could
be true. if it is user experience then there could be more security risks
involved
Expert's answer:
The use of clouds will give big advantages like: scalability fast aivalability of
new services. Pay what you use. No high investments. If these topics are
important for an organsation and they are more important than security, using
real risk management approaoches, than it will outreach.
Expert's answer:
That is only a sales argument. You do not need the cloud, there are other
ways to have your data allways with you if needed. I you violate against a
local or remote law or you want to act conform basel II or other regulations,
then you have a problem.
Expert's answer:
I don't know.
Expert's answer:
I disagree, a lot of companies will not take the step towards cloud solutions if
we can not garantee security of their data. They will rather have a more
expensive solution on premise.
Expert's answer:
I would agree that it could have more advantage than security risks, however
this should carefully need to be considered per solution. It is not by
definition, and that is what risk analysis is for. It might wel be that teh risks
are to high.
Expert's answer:
I'm not sure. It's still a bit early in the 'cloud-based' era to tell just how far the
security risks go I think. For now, I think it's just like any other online
application.
Expert's answer:
This is a selling sentence, I don't believe this.
Data results: Question#8
Question: The Cloud comes in 3 delivery models (Software, Platform and
Infrastructure as a Service) and 4 deployment models (Public, Private, Hybrid and
Community Cloud). As you might be aware, depending on the combination of the two,
more or less flexibility can be given to the end user (the optima forma being IaaS in a
Public Cloud). What are your viewpoints from a security perspective? Depending on
your specialization you could discuss the following: should they all be considered the
same, or be handled as a unique system per combination? How should Authentication,
Authorization and Accountability be handled, and what about encryption? Are there
combinations that you believe to be insecure, or unfavorable? How about geographic
properties of the cloud (both within the network and physical)? This is basically your
moment to say what you believe are serious issues with security in the cloud. It can be
in depth or a broad overview. The responses to this question will be vital for the
research and for the survey in the next round.
Responses for this question
Expert's answer:
I believe the delivery model should not matter too much when it comes to
security. The nature of the data should prescribe the level of needed security.
From the sort of use, you would expect the deployment model to have a
stronger correlation with security. With a more open group of users the data
should not be interesting to secure, because you have almost no control over
who you are sharing with. The tighter the control over the users, the more
demands can be made about applying the three A's.
Expert's answer:
It all depends on what you have to lose. This will vary from case to case. If
you happen to have a cloud in-house, nothing really changes but your network
design. In all other situations you're outsourcing part of your infrastructure, so
before doing so a risk assessment has to be made. From a risk point of view
there is no reason to do this per combination instead of per case.
Expert's answer:
In my (limited) opinion, data in the cloud should be stored either on trusted
systems if it needs to be processed or altered, OR it should be stored encrypted
on untrusted systems. Depending on the deployment model, trusted and
untrusted systems can be grouped together and treated accordingly. I am not
familiar with the delivery models.
Expert's answer:
Personally, I like the Jericho Cloud Cube model better for such an analysis. In
general: where physical boundaries are removed, physical measures need to be
replaced by "virtual" measures; due to loss of control (a shift of responsibility
to the cloud provider), new risks emerge and new measures need to be taken.
Expert's answer:
Ultimately, security should be embedded in all solutions at the same level.
However, the implementation of security is costly (although a system-wide
security level is less costly than a per-user one), and therefore customers could
opt for a cheaper but less secure system. The EU data privacy act is overrated,
but there are problems in some regions of the world where hosting should not
be considered at all!
Expert's answer:
I would like to come to one type of cloud, which can be controlled based on
Confidentiality, Integrity, Availability, Auditability and Compliance
requirements. Authentication, Authorization and Accountability are types of
measures that could be implemented to make the difference between the
deployment models. The delivery models should also be one, based on the
requirements.
Expert's answer:
The variants mentioned can be refined even further by choosing, within an
organisation, a different approach per type of application or type of data. This
produces an even more complex model to consider. The core point is that this
kind of differentiation in using the Cloud can provide an excellent growth path
for trying out the different risk profiles that belong to those combinations, thus
gradually gaining experience and letting the security solutions, and the interplay
of supply and demand, mature. A real consideration of all these aspects goes too
far for now.
Expert's answer:
Software in the cloud, like MS Office Live or Google's office environment,
means that everything you do is 'virtual'. Even if you store your article locally,
the temporary files and maybe the original file are stored somewhere abroad...
I do not like the idea. While still being written, an article can change from public
to secret. So no: leave my office version at least in my company environment,
but I want to have it on my local machine. "Internal" information or databases
can be stored in the cloud. For higher levels of classification I prefer local (hosted
in the Netherlands) storage. Encryption as mentioned before, in combination
with PKI, should be allowed. Strong authentication (a hardware token) is
necessary for classified data. A SAS 70 statement from the hosting party is
necessary.
Expert's answer:
Again, if cloud computing is driven by a proper risk analysis, there will
only be legal issues in case of international law enforcement. If the confidentiality,
integrity and availability of the information are kept within the boundaries of
user/business expectations, then there is no real issue.
Expert's answer:
The security issues will be different depending on the deployment models.
For privacy a private cloud is the best solution. All the other security
criteria, e.g. authentication, authorization and encryption, have to be
implemented according to the security model. If authorization and encryption for
confidentiality reasons are the most important ones, then you should build your
own application, so use only PaaS or IaaS.
Expert's answer:
It doesn't matter; these are all sales arguments. In the end it is about the
chain: your data is stored somewhere. A private cloud is very old; the first dates
from the beginning of desktop computing. A mainframe is private cloud
computing. It's all about boundaries; you have to think in compartments
and where you have influence. The less control and/or influence, the greater
the risk. The risks are being spied on, but also data loss, theft, breaking of
regulations, not being able to access the data, etc. The cloud is NOT safe; it's only
another way to make money.
Expert's answer:
I think all combinations will have their own security issues. Some
combinations are "easier" to secure than others and therefore in different
situations you should take a different approach, also depending on the
sensitivity of the information that will enter the cloud.
Expert's answer:
Long question - not sure what the intent is! I do not see a whole lot of security
issues as long as we keep the encryption and network access control at a level
that is acceptable to our customers. Besides that, we can always have a hybrid
situation where there is, for instance, a SharePoint portal as a gateway to the
functionality in the cloud.
Expert's answer:
IaaS is a great thing, although in the end it comes down to: who owns the
system can own the data. With SaaS there is no way of knowing what is
behind the SaaS solution (no open source). You need trusted third parties
(TTPs) for good authentication and authorisation. It comes down to trust, and
these are the weak links.
Expert's answer:
This is a difficult question to answer. As I've said previously, analyzing the
data that is to be contained in the cloud for sensitivity will be vital in
determining the level of authentication/authorization to said data. For instance,
in the example of cloud-based community ranking of web sites such as WOT
or Site Advisor, the level of trust a user has over another is non-existent.
Anyone can make a rating and affect the general outcome. I believe this is a
problem and a potential security risk should this avenue be explored for more
sensitive information. Encryption is vital for any PII or IP data.
Expert's answer:
I think you should use architecture / business drivers to choose either option.
The options also depend on security / ownership issues. I do not have a
preference for one option; it depends on the given situation.

14.2 Results from Delphi study Round 2


Data results: Question #1
Question: Welcome to the second round! We've had a small delay due to viruses
attacking some of our members, but we're ready to head on! The first question: how
does outsourcing of IT in the traditional sense compare to outsourcing to a Cloud
service provider in the perspective of security?
Expert's answer:
The security of outsourcing is always fully dependent on the choices and wishes
of a client. However, on an overall level one can say that the security of a large
service provider is usually better compared to a smaller company.
Furthermore, the budget for security can be lowered due to economies of scale,
which is largely what cloud computing is about.
Expert's answer:
With traditional outsourcing your data stays where it was before the outsourcing.
With outsourcing to the cloud your data can be anywhere. The risks in the cloud
are greater and there are more risks applicable.
Expert's answer:
Ideally it is not noticeable for the customer except for the rates. In the real world
I expect some extra problems to surface in the initial usage periods, which
should get ironed out pretty quickly. Having no physical location should lessen
the chances of original and backup being destroyed at the same time.
Expert's answer:
With traditional outsourcing the outsourced party will give evidence
regarding the quality of security and the way that privacy issues have been covered
during outsourcing. With outsourcing to the cloud, depending on the cloud
model, these issues need to be addressed as well. The biggest issue will be the privacy
of the information and the extent to which the organization can rely on jurisdiction
regarding the enforcement of privacy laws.
Expert's answer:
The difference is the fact that in traditional outsourcing you believe you know
where your systems are and where your data is stored. We believe our
data and systems are in Apeldoorn and Amsterdam. When I go there and
someone points to a server and says: this is your SAP server, then I'm happy, but
is it really my SAP server? I don't know.... :-) In cloud computing you really
don't know where your applications and data are. This means there is not that
much difference if you don't trust your outsourcing partner.
Expert's answer:
all your data are belong to not you
Expert's answer:
This depends on the type of outsourcing (outtasking/BPO etc.) and the way you want
to be in control. The only main difference I see is in the difficulty of getting into
control, because you don't know where it is.
Expert's answer:
The main difference is that you don't know exactly where your information is
located.
Expert's answer:
There is no real difference as long as the cloud service provider is located in the
same country; if it is not, then there will be legal issues if something goes
wrong. Example: Google states that backed-up data of a customer is Google's
property.
Expert's answer:
Both are forms of outsourcing; however, the cloud is in the hands of people that you
do not have a personal relationship with. Therefore the mindset will be that it
will be less secure.
Expert's answer:
Less transparency on how security is covered. No governance.
Expert's answer:
Outsourcing IT work is mostly done for financial reasons. IT is a very
competitive market and as such what gets you the most bang for the buck in the
field can land you on any number of projects. I suppose outsourcing is also most
feasible in this field due to its nature of being wide open with few physical
'boundaries'. Outsourcing to the cloud is already being done. I think similar
issues exist with security whether in the cloud or not.
Expert's answer:
With cloud outsourcing the end user is much more in control (more accessible), but
information is somewhere on the Internet. Two-factor authentication and
certificates are a must.
Expert's answer:
Security standards are dictated by the cloud provider (This is not a qualification
as they may have implemented a high security level). With traditional
outsourcing there are more possibilities to implement company standards
regarding security.
Data results: Question #2
Question: How does outsourcing of IT in the traditional sense compare to outsourcing to
a Cloud service provider in the perspective of trust? Are the relations more or less
complex?
Expert's answer:
This is another open-ended question. The traditional outsourcer can have
multiple subcontractors, similar/comparable to the cloud service provider.
However, if the respective client company has multiple cloud service providers
from which services are being delivered, it can have a greater impact on the level
of trust. This would be the case if all services need to be combined to get the
desired outcome. The only correct answer is that with more (inter)relations the
perspective of trust is likely to be lower than with a single relation.
Expert's answer:
The relations are more complex. Some say that therefore SAS 70 reports can give
more trust, but in my opinion the cloud is a black box and trust is relative.
Expert's answer:
The relations will be more complex since the clients will want to be reassured
that their data is safe and accessible at all times. Especially outside of the IT
field, companies will have to get used to the idea.
Expert's answer:
Traditional outsourcing is based merely on trusting the outsourced party. The trust
element with outsourcing in the cloud is less; the main driver for cloud is cost
efficiency and trust in the original way is less. Outsourcing to the cloud does
not have to be more complex compared with traditional outsourcing. The
complexity in both cases depends upon the SLA that has been agreed and the
way that both parties play their role in that agreement.
Expert's answer:
Normally, in a good trust relationship you know where your data is, you have
(contracted with an NDA) trust in the contractor's employees, and you know the WAN
infrastructure where your data travels. So you believe it is secure. In a cloud
solution you don't know where the data is. You don't know the people who take
care of your systems and data and you don't know the network infrastructure.
Yes, it is the World Wide Very Trustful Web.... This looks less secure than
outsourcing to a very friendly partner.
Expert's answer:
You have to trust his blue eyes.
Expert's answer:
No, they are the same.
Expert's answer:
More or less the same, but it's newer and unknown.
Expert's answer:
If you can build up this trust it could be very good, but in most cases there will
never be a real-life trust model because the provider is "somewhere out
there". So you will try to get more trust from an agreement perspective or from a
familiar provider, someone you know.
Expert's answer:
Relations will be more complex and trust has to be built up, as it is less personal.
Expert's answer:
More complex; all based on trust and general agreements.
Expert's answer:
I would say that cloud applications make trust more complex. Depending on
how the 'cloud' is setup of course.
Expert's answer:
It's the same. The info is not on your own premises. You need to find out how it is to
switch from ASP.
Expert's answer:
Trust is built between persons. So if people are involved the relations will be
comparable; otherwise it will be less complex, because the decision to outsource
includes earlier decisions regarding the trust level expected.

Data results: Question #3


Question: One answer to the "locationless cloud" question in the previous round was that
a true locationless cloud is actually a security solution, as the NSA or another government
organization would not know where to search. In this light, one could add that by
strategically selecting places to store data, agencies would be confronted with
cooperation that wouldn't work due to political issues (for example: China and America).
What are your perspectives on this?
Expert's answer:
This could be a positive point with respect to a business which wants no
interference from the NSA. However, the risk that secret agencies from, in this
particular case, China could be able to retrieve the data of your company is a big
threat as well. I would therefore be very skeptical about the allowed locations.
Furthermore, there is (at least within the Netherlands) no legislation about the
ownership of data. Therefore a company cannot reclaim its data if this is not
contractually agreed upon. And lastly, if there has been a security breach, the
data is "gone" already.
Expert's answer:
Storing data in America or China is never an option. You can add some more
countries to the list :) Nevertheless, I think that certain locations anywhere on
this planet will never help you against being spied on.
Expert's answer:
As soon as abuse of this type is noticed laws will be fabricated in an attempt to
make this impossible. Law is always one step behind new technology. This type
of usage is to be expected. Could be the start of a new arms-race type of war.
Expert's answer:
If an organization, for whatever reason, uses a locationless cloud because it
wants to hide its information, that is its responsibility. If a government wants to
access that information it has to find other ways to enforce law and
regulation in countries like China.... Every company has the right to do business
and use information stored anywhere, as long as it works on a legitimate basis.
Expert's answer:
I don't believe this is true. Take the NSA, the Mossad, some other ugly creatures...
They are able to search on keywords in all the data streams on the internet. They
know our IP ranges. We always communicate with our locationless cloud
solution through our registered IP addresses (even if we could spoof the Enexis
IP address to a Microsoft IP address, then they would know). Follow the data
stream from Enexis.local.nl and you have everything you want. Encryption? Why
is AES (Rijndael) the American standard.... because the NSA has a backdoor...


Expert's answer:
For an individual this might work. For a company this will NOT fly.
Expert's answer:
That is one of the issues that has to be solved. A company has to know where its
data is, in order to control it and to be sure that it is compliant with applicable
laws. If government bodies find out earlier that they aren't, that could have a big
commercial/financial impact.
Expert's answer:
There is no location-less cloud; data will always be stored on a server in a
geographical location. So this could be geographically okay when you start, but
the provider is constantly seeking opportunities to reduce costs, so you could
end up with your data in a country that is unacceptable from your point of view.
Expert's answer:
Data security is of high importance. The cloud has to be able to safeguard the data of
our customers when providing cloud technology to them.
Expert's answer:
This sounds more like obscuring the data: hiding it everywhere and nowhere.
And this is exactly the reason why it sounds less secure, because one now cannot
know whether the data is breached or not.
Expert's answer:
Politics and geography can and likely will play a role in this kind of scenario. If
China were to outsource intelligence reports or employee reports for sensitive
documents and there was any kind of likelihood it could be stored in the US,
there might be issues there. I would certainly not want to be the one in charge (or
god forbid, the owner) of any of these 'servers' should tensions flare up between
the involved parties.
Expert's answer:
Information needs to be presented if the NSA or others ask for it. It doesn't matter
where it is.
Expert's answer:
Currently governments have rules regarding the location of business application
data, and those have to be taken into account when making a cloud decision.
Data results: Question #4
Question: In the previous round, I asked about new security issues. One answer was that
auditability would be an issue. However, with large auditing agencies being spread
worldwide and strong international standards (such as ISO 2700x, SAS 70 and PCI),
how much of an issue is auditing for the cloud?
Expert's answer:
The international standards are currently not enough to give sufficient guidance for
cloud computing, and some of the standards you refer to are open to multiple
interpretations. This does not make the use of the well-known
international standards useless, but it is not enough to provide the needed level
of assurance. Until there is some agreed-upon standard or additions to the
international standards, auditing remains an issue. Lastly, and with all due respect,
as I am an auditor myself, no auditor is likely to give the same audit results as
his colleague.
Expert's answer:
It all depends on the scope. You cannot transfer accountability by storing
data and information outside your location.
Expert's answer:
It should not be an issue. The multinationals have been doing it for years, maybe
it will just be more common.
Expert's answer:
Auditing will still be an issue; not all cloud providers use the same audit
standards, and depending upon the country where the data is being processed,
the auditor will need that local experience.
Expert's answer:
If you look at information security or compliance with standards or local and
governmental law, an auditor is interested in design, existence and operation
(opzet, bestaan en werking; I am not sure whether that translates well into English).
The auditor really wants to see that your security measures are designed ("Let me
see where you wrote it down!"). He wants to see that the designed countermeasures
are implemented in the system, and he wants to see that the implemented
countermeasures perform the way they are designed. Design: a password
shall be 8 characters long, with at least 1 capital and 1 number or punctuation
mark. Existence: he is shown in Active Directory that this is implemented.
Operation: he tries to change a password and makes it 7 characters, only lower
case; the system comes back with an error message that this password breaches
the password policy. How does an IT auditor audit IT when he does not know where
the system engineers reside, where the data is, and where functional management is
doing their jobs?
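To make the design-existence-operation example above concrete, the sketch below checks the designed password rule from the expert's answer (8 characters, at least 1 capital and 1 number or punctuation mark). It is illustrative only; the function name and policy values are assumptions drawn from that answer, not part of the thesis itself.

    import string

    def complies_with_policy(password: str) -> bool:
        """Check the designed rule: >= 8 characters, at least one capital,
        and at least one digit or punctuation mark."""
        long_enough = len(password) >= 8
        has_capital = any(c.isupper() for c in password)
        has_digit_or_punct = any(c.isdigit() or c in string.punctuation for c in password)
        return long_enough and has_capital and has_digit_or_punct

    # Operation test from the example: a 7-character, lower-case password must be rejected.
    assert not complies_with_policy("abcdefg")
    assert complies_with_policy("Abcdefg1")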
Expert's answer:
lots! how (and where) should you test any physical controls?
Expert's answer:
How can I show evidence that the data is where we think it is? How can we show
that we are in control? Some types of security measures will not be enough
anymore (username, password); maybe heavier measures are necessary
(including higher costs).
Expert's answer:
There is no real audit issue if you can get a TPM report from your cloud
provider. The real issue is dealing with risks that are present in the solution but are
not in your "hands" to deal with.
Expert's answer:
This could be an issue for companies that are dependent on these rules.
Especially for SOX compliance this could be an issue.
Expert's answer:
Auditing comes down to checking whether something complies with requirements.
The checking can now only be done against what a (cloud) provider says has been
implemented. It is much harder to physically check compliance.
Expert's answer:
I really have no idea how auditing would be affected. I suppose that's the kind of
question that will be difficult to answer unless you've got plenty of experience
with that kind of mechanism, which I don't.
Expert's answer:
The same. Part of an audit is investigating the security level.
Expert's answer:
Auditability is certainly a requirement; unfortunately, few cloud providers offer
this facility.

Data results: Question #5


Question: How does a private cloud compare to an insourced server with the perspective
on security?
Expert's answer:
That is very much dependent on the requested level of security for both
service offerings. However, if you would compare them side by side, the level of
security should not differ that much.
Expert's answer:
It's the same; both are on your private network.
Expert's answer:
Everything depends on the skills of the one responsible for security. Maybe it
will show that those knowledgeable enough to use the cloud are better equipped
to secure the (virtual) server.
Expert's answer:
A private cloud is almost an environment where the information is processed,
owned and controlled by the organization itself. They are the same...
Expert's answer:
It does not compare. An in-sourced server is placed in a trusted environment,
maintained by trusted employees, placed in a trusted network infrastructure
which can even be separated from the Internet. A private cloud can be private,
but it remains the case that the system engineers are unknown, the location can be
unknown (Microsoft and Google are offering a cloud service in the
Netherlands now) and normally your data is always traveling over the
Internet. It may not be totally insecure, but it is almost always less secure than
in-house servers.
Expert's answer:
If the cloud is in-house, there is no difference.
Expert's answer:
An insourced server is maintained and managed by the company itself,
including access control; with a private cloud the management is in other hands,
and how do you know that you are the only people in the cloud?
Expert's answer:
I do not see that there is a difference between them.
Expert's answer:
A private cloud needs to provide security, as the data is outside the company.
Expert's answer:
One can physically control the access to the insourced server.
Expert's answer:
I would say they both would have similar security levels and impacts. As long
as they are both private and require some form of authentication/authorization
to access, security ought to be rather similar.
Expert's answer:
Again the same level. In the cloud the access is wider.
Expert's answer:
Conceptually no difference. The feeling will be there that with an insourced
server more control is obtained regarding access management, back-up/restore,
incident management etc., but then it becomes a matter of trust / agreements made.

Data results: Question #6


Question: It has been mentioned that ownership of data can be an issue in the cloud.
Assuming a single relation with an IaaS cloud provider, how could ownership of data be
an issue in your opinion?
Expert's answer:
As mentioned before in question 3 or 4, there is no ownership of data within the
Netherlands, so it is a risk for any data which is brought outside the
company.
Expert's answer:
Your data/information is always yours. IaaS is a strange product. Normally I
prefer to say that data becomes information when you add a certain context to it.
So your information is just data to your provider. IaaS is simply a marketing
trick.
Expert's answer:
Most of the time, it is hard to own data, unless you consider access to equal
ownership. In this light, having data on your physical drive does not equal
ownership if the data is encrypted. And not knowing where it is but being able to
read, alter and save data does mean ownership.
Expert's answer:
An IaaS cloud provider is only responsible for the infrastructure that is
delivered. The data itself and the integrity of the data are the responsibility of the
organization. The IaaS provider does not have any access to the
organizational data. Within the IaaS contract the ownership of data should be
addressed, also for the case of an exit strategy.
Expert's answer:
I don't see the difference between renting cloud storage and the normal meaning
of cloud computing. In both cases the data owner is the party who is hiring a
cloud solution. The data owner will never lose his responsibility.
Expert's answer:
It isn't.
Expert's answer:
How can an IaaS cloud provider sufficiently guarantee that it doesn't share the
data or process the data to create new info/data? Also, governments all over,
like the US, China etc., want to have access to this information, to be in control.
Expert's answer:
As mentioned before, if a cloud provider like Google is backing up your data,
they state that it's their property. But you will get into trouble if someone puts
data in that cloud that wasn't supposed to be in the cloud. What guarantee do you
have that all data is removed after you logged a ticket to remove that data? It
could already be moved to some remote storage. So data ownership is crucial to
deal with.
Expert's answer:
Ownership of data is always an issue. We need to be able to secure data and
protect IP for our customers.
Expert's answer:
Data can be stored at different locations under different laws and regulations.
These could easily have a different understanding of ownership.
Expert's answer:
If I understand this correctly, IaaS offers the infrastructure (meaning virtual
computing environment at a distance) but individuals supply the data. As long as
data is stored locally, ownership should not be an issue. If data is stored on (or is
it 'in'?) the cloud, policies would need to be implemented just like with any other
file/data server.
Expert's answer:
It is not an issue. Data belongs to the end user, not the cloud provider. Otherwise
no one will go to the cloud. CRM, financial data and so on belong to the end user.
Expert's answer:
No issue in my opinion
Data results: Question #7

Question: Above is a picture of the preliminary model (full-size here:
http://www.students.science.uu.nl/~3219534/Cloud_model_beta.jpg). It shows how
inputted data gets rubricated and, depending on that, a deployment model, delivery model,
geographic (on-premise, off-premise), geo-spatial (which countries/continents),
Confidentiality Integrity Authenticity (meaning all secure computing needs) and
compliance & auditability get selected. The outcome should be that the right data gets into
the right cloud formation. What is your opinion of this model? Does it cover all the aspects
of data storage/computing in the cloud?
Expert's answer:
From a quick look, yes.
Expert's answer:
Yes, but I would prefer to rotate the bottom lane (CIA) 90 degrees up and
place it after the rubrication step.
Expert's answer:
I think there should also be a category for "desired encryption level". Not every
level may be possible in every cloud.
Expert's answer:
If the model uses rubrication as a selection criterion for the cloud model, it
could work. The rubrication should address the CIA aspects of security but also
the privacy aspect of information and the way that law enforcement can be
applied.
Expert's answer:
The only part I am missing is my 'employee of the cloud vendor' problem. I
think that if you put that in the geographic and compliance "layer" you have all
parts of the security issue addressed.
Expert's answer:
This would mean all my data will stay in-house, or at least on this continent.
Expert's answer:
This is too generic. I miss availability, access control, information classification
(on both sides!), and the stakeholders like government, companies, competitors,
and risks. What are the rubrications?
Expert's answer:
Only if it cannot be overruled by human decision. You must clearly state the
rules for the proper handling of data and they must be "active" at all cost. So if
one data packet drops out, at whatever level, it must be withdrawn from this
model. If, and only if, there are 100% guarantees that the model is working
properly, is always active and cannot be overruled by a human, then this will
work. I love this concept very much.
Expert's answer:
Missing: third-party software or hardware, and on-premise functionality (hybrid
model).
Expert's answer:
What is the "right cloud formation"? Per aspect you would need a set of criteria
to determine the right cloud formation.
Expert's answer:
I believe the model is sound. Seems to cover the bases well.
Expert's answer:
How does data get in, get added, or mixed? That is where the power of data lies.
Expert's answer:
An interesting idea, but why do you go to the cloud with this set of
requirements? Bottom line: the cloud should be there to free you of these types
of concerns from a functional point of view. Construction is not your primary
interest. Does it concern input or output data, or both?

14.3 Results from Delphi study Round 3


Data results: Question #1
Question: Welcome to round 3, the final round, of the Security in the Cloud survey.
This round will consist of an improved model, definitions and finally the possibility to
get mentioned in my thesis! First question: while WikiLeaks fans are clashing with
corporations and governments, it has been mentioned that DDoS attacks are hard to guard
against. However, with the endless capabilities of the cloud, is that still true? Can the
cloud be a means to end DDoS attacks? (Think about rapid automated elasticity,
synchronization, near-instant scalability and flexibility, near-instantaneous
provisioning and HA.)

Expert's answer:
IMHO the cloud only improves the possibilities for stronger DDoS attacks.
Ways will be found to use the full potential with which an attack out of the cloud can be
deployed, whereas the protection against such attacks can only be reactive and
will therefore automatically be after the fact and weaker.
Expert's answer:
Yes, it can be used to some extent. But a more rigid approach would be to use
the right network detection and prevention techniques, which would allow the
network service provider to drop the traffic before it can reach the specific
infrastructure. If these techniques are incorporated into the cloud then this
certainly is true, but I haven't seen any virtualized solutions yet.
Expert's answer:
My technical knowledge falls short to give a well-founded answer here.
Expert's answer:
I think that DDoS attacks will always be a problem. To avoid this problem a
cloud service provider should be able to dynamically change the POP for internet
access. After he has switched that one, the DDoS will find it and start again. I
think that DDoS attacks in a cloud environment are harder to do, but they will still
succeed.
Expert's answer:
That's only possible if you have good thresholds in place and indeed are fast
and flexible enough to react. Because normally what you see is that a lot of companies
have monitoring devices, and if they spot something (threshold exceeded) they
cannot react as necessary. So there is still lots to do in that area.
Expert's answer:
Yes it can. Even if companies don't want their systems in the cloud, they can
offload the front door (like MessageLabs does for mail).
Expert's answer:
Yes, it can be: moving to another IP address, bigger lines to the website, more
possibilities to check the attack.
Expert's answer:
Cloud-hosted websites may indeed not be killed by a DDoS attack, but the attack
and the extra resources used will increase costs for the owner.
Expert's answer:
In my opinion there is no real defence against DDoS, if someone is really
determined to "get you" he/she will eventually get all the computing power to
stop you from doing your business. Even if you have physically separated access
points. But only if someone is "really determined" to take your (web) services
down. Not someone who is playing around.
Expert's answer:
I think DDoS is still possible. The attacker does not attack a specific server, but
an IP address. The IP address leads to a certain, unknown server
location, which he attacks. It is not interesting for the attacker what the location
of the attack is. He just wants to shut down a website, whether it is in the USA,
Europe or Asia... or some leaky, cheap third-world country.
Expert's answer:
?
Expert's answer:
DDoS is always hard to protect against. The cloud, or rather the bigger front
door of the cloud, bigger firewalls and multiple access points make it harder
for the attacker, but not impossible. Yes, it's still true.
Expert's answer:
That will remain true; scaling up the cloud will never beat scaling up DDoS.
Expert's answer:
The cloud will be no answer to DDoS attacks. Every infrastructure has its
breaking point in terms of scalability in virtual machines and/or bandwidth.
Expert's answer:
Well, DDoS attacks are difficult to predict. Depending on how the cloud is set up and
how it is accessed (which parts are publicly available and how the location
changes are transmitted to the 'cloud members'), it can be more secure in that
sense. If the cloud app is public, though, and centralized information is stored
anywhere (say a mirror list that the members connect to in order to access the app), then that
will be as vulnerable as any other site or system. I do think that responding to such
a threat will be easier and quicker, though.
Expert's answer:
No, the cloud cannot be a means. We will have to have other measures to
overcome this matter. I am very interested in how and what this will be.
Data results: Question #2
Question: In terms of the CIA triad, it seems that it doesn't cover all aspects concerning
cloud security. An extension seems to be within reach in terms of CI3A. CI3A (or CI triple
A/CIAAA) defines confidentiality, integrity, availability, accountability and auditability.
This extension has been made in order to fulfill the wish for governance and compliance.
What do you think of this extension? Does it need more concepts in order to create
assurance within the cloud?
Expert's answer:
I'm sorry. I don't understand this question.
Expert's answer:
I think this is a useful extension to security and should provide enough
background for assurance.
Expert's answer:
This addition is not at the same level as CIA, i.e. CIA already encompasses these
aspects. It can be useful, though, and in my own practice, when formulating security
standards, I also explain these - here intended as supplementary - concepts as part of
the explanation of CIA.
Expert's answer:
I think that accountability and auditability already fall under the CIA triangle.
Expert's answer:
Compliance is more than accountability and auditability. I think at least
duration or time should be added, because more and more, timing is an issue with data,
e.g. salary slips or annual reports.
Expert's answer:
CI3A should be considered every time you outsource, not just to the cloud.
Expert's answer:
For now it will be enough. People need to think about these issues in order to
move to cloud solutions.
Expert's answer:
Yes, this extension does make sense in itself. However, there can be other
extensions of CIA as well, so we may end up with 2C4I3A.......
Expert's answer:
I think the extensions are correct and needed, but ISC2 and ISACA seem to count
accountability and auditability as part of their CIA triad.
Expert's answer:
Since I'm involved in information security I always tell my audience that there is a
CIA triangle, which has to be completed with auditability. I know most of
the audit community thinks this is the only right way to fulfill information
security. I agree with your CI3A and suggest you should mention it to the
ISC(2), ISACA and PVIB people!!
Expert's answer:
We already used the CI3A in non-cloud situations, so I don't think this is a cloud-specific
extension. The CI3A is in my opinion sufficient to cover all aspects.
Expert's answer:
There is never assurance possible in the cloud. Structures are too complex, too
difficult to understand even for an auditor. That other aspects are added is a good
approach, but the question with every audit is always: what is the subject under
evaluation? Whatever the aspect, an audit is made by people and the auditees are
also people. So you can say something about the audit process, but the outcome
is less hard.
Expert's answer:
Information security is a concern of the owner. If the owner has a demand for
accountability and auditability, it should be provided.
Expert's answer:
I do not agree; accountability and auditability are aspects of integrity.
Expert's answer:
I think it's a good model. Only time will tell.
Expert's answer:
Sorry, I do not know this extension. However, it must provide safe-harbor security
and prove that it does.
Data results: Question #3
Question: As mentioned before, physical location, or the lack thereof, can be an issue in
the cloud. It seems that there are four factors in defining the boundaries of data location
awareness. Regional: measured in physical distance from one server to another; can
be used, for instance, to create HA/disaster recovery locations/strategies. Premises: are
servers/data located on organizational premises or not. Network: is data available within
the network or not. Legal: geographic locations of servers pertaining to legal systems (e.g.
discrepancy between server locations in state, country and/or continent). Do these factors
cover all issues/risks pertaining to boundaries and borders?
Expert's answer:
Yes, this should cover all issues/risks
Expert's answer:
I think it does cover all the issues. Or better, all the issues I see can be
categorized in this way.
Expert's answer:
This seems like an excellent approach to me, to which I cannot add anything.
Expert's answer:
I think these are enough. The main problem is BCM, which you can cover with
the regional aspect. The privacy aspect of different regulations can be covered
by geographic locations. The question that arises is: can you check this...
Expert's answer:
All this depends on definitions; one can think of the internet as an extension of
your network! The same with your own home, so organisational premises
can also be outside Philips sites. Next to that, a location is also the PC or laptop
or data carrier on its own, as it can connect to the internet everywhere and then be
compromised.
Expert's answer:
Can you blame/sue someone when stuff goes wrong? This is partially covered in Legal, but
not completely.
Expert's answer:
There should be new rules concerning Legal, so it doesn't matter where the data is
located. The company that offers the solution can/must fulfill these rules.
Expert's answer:
Seems sufficient.
Expert's answer:
Somehow I get the feeling that data on a handheld device is "moving" around
on each of these 4 factors. So in my opinion you need device as a factor as well.
Expert's answer:
Yes, I think you have drawn the right conclusions.
Expert's answer:
No - I think you also need to take into account the location of maintenance
personnel.
Expert's answer:
Probably
Expert's answer:
I think so, but other risks also apply.
Expert's answer:
I think that access is an issue, in terms of "who has access to the system" and
where the system is hosted. I would use: access control and logging, preferably
outside the cloud system....
Expert's answer:
I think so. Seems like a fair assessment to me, depending on what is meant by
boundary and border issues.
Expert's answer:
I think software, connecting applications and portal technology is a very
important factor that should be added. The question is whether you can connect your
applications, whether they are located in the cloud or on premise.
Data results: Question #4
Question: After the input of the last round, and much pondering, the model has been
improved. Once again, I'd like to hear all the input you have. The model has been
redesigned with a new perspective in mind. The goal is to create a secure cloud
architecture. So classified data (or a classification/rubrication if you will) goes in, instead
of raw data. For each classification/rubrication level there will be one cloud environment
that fits. This model helps to decide what the architecture of that environment should look
like. Mind you that the critique you might have on the CI3A will be taken into account.
Please just mention whether or not the vertical bar is correct, with or without respect to
your critique (it was originally a horizontal one, made vertical to show how CI3A affects
all horizontal lines). Other improvements include the inclusion of all boundaries/borders in
the horizontal lines.

A larger version can be found at http://www.students.science.uu.nl/~3219534/Cloud_model_rc.jpg
Expert's answer:
I think the vertical bar is correct.
Expert's answer:
From a design perspective the model looks promising. The only difficulty
could be the practical approach.
Expert's answer:
You can put the vertical bar in that position. I think you need to address
trust as a part of the whole model.
Expert's answer:
I do not believe in a separate cloud architecture at all, because none of its elements
differ from the elements that apply to one's own organisation. In principle, from the
business you do not impose other requirements on an outsourcing partner than you
would impose on your own service organisation. So no separate architecture either,
just a limited number of specific requirements. Those extra requirements do not lead
to a different architecture. The cloud is not that special.
Expert's answer:
This is the chicken-and-egg story: what comes first, the classification of the data,
and based on that we choose the right region etc., or is the location etc. first, and based
on that we classify the data? I would like the term "Residual risks
(and/or) additional measures" to be added in the arrow on the right side.
Expert's answer:
It looks doable.
Expert's answer:
No comments.
Expert's answer:
Seems to make sense indeed, the CIA.. properties of the different
environments are all to be taken into account for the mapping of a problem to
a (cloud) solution.
Expert's answer:
I do have some difficulties with the CIA(AA) bar positioned after the
classification; I believe that classification is a method to achieve CIA(AA),
so it should come after the CIA(AA) bar. But that is just my view. The only
horizontal bar that I miss is 'NO secure cloud architecture': if all signs are
red, you shouldn't put that data or system in a cloud.
Expert's answer:
The model is correct. I agree with the CI3A model.
Expert's answer:
The vertical bar does not match question 3; what about the maintenance
component?
Expert's answer:
Data classification is the proper means to identify specific cloud environments,
but then a more DMZ-like definition of cloud environments is expected.
Expert's answer:
Many people rely on methods, models and architectures. Some things are
handy to communicate, others give structure in the wilderness. None of them
can assure that you will not take on or overcome any risk. It's always good to
use common sense. Reading this last question I think of my first answers; it
could be that some of my answers do not give the outcome you expected. The
model, however, is pretty good and it could be good practice to adopt this
model because it covers enough. The vertical bar is not good either; I am missing
some control in the process.
Expert's answer:
Access control inside and logging outside the cloud.
Expert's answer:
It looks like a viable model. Again, as I said in an earlier question, only time
will tell since this is a rather new endeavour, just how secure it will be. The
main concerns seems to have been addressed with the current model as far as I
can see.
Expert's answer:
like in previous question, my opinion is that software for connection of
applications or on a bus sysstem is of strategic importance of succes for cloud
architecture

14.4 Evolution of the Model


The SeCA model was developed in a series of stages. The following appendix will show what stages
there were and how they led to the eventual SeCA model.
The first model (figure 11) that was developed has a clear input (data) that flows through
rubrication (a synonym for classification) and then has several attributes which output a cloud.

Figure 10: The first model to be presented to the experts

A majority of experts had remarks on this model. First of all, data was too ambiguous: what
kind of data, and whose data, were general remarks. They also felt that the CIA triad was
wrongly placed; it is, after all, not a cloud architecture attribute but a perspective on security
in general. It was also mentioned that encryption was needed as an attribute. The output as a
cloud in general didn't have much meaning to the experts.


In order to satisfy these demands, the model was changed. The input was changed to a data
classification. It would then pass through the CI3A and the following attributes, and generate
a secure cloud architecture.

Figure 11: the model with encryption as a vertical bar.
Figure 12: the model with a partial encryption bar. Architectures that didn't need encryption could 'flow' below it.

It was proposed that a vertical bar, next to the CI3A, would be needed for encryption.
However, encryption is an attribute of a cloud architecture, as an architecture may use it to
enforce the CI3A. And what if an architecture wouldn't need encryption? Showing a partial
vertical encryption bar (for architectures that use it and white space for those that don't)
didn't make much sense, nor did it look intuitive (figures 9 and 11).
It was thus decided that encryption would just be an attribute. Network and premise were
added as they showed up in the delphi round as dedicated cloud attributes. The end result is
the model displayed in figure 12. This model was accepted by the experts, with one main
concern: its applicability in the field.

Figure 13: the second model presented to the experts in the delphi session. Big improvements are the CI3A in a vertical bar and additional attributes.

This thesis (especially chapter 10) tries to explain how it could be used in the field.
The final model as presented in this thesis has undergone little change from the one shown in
the delphi session. Its greatest improvement is that this one is aesthetically more pleasing.
The CIA-(AA) was changed into CI3A, as the last round confirmed that the CI3A is a proper
extension of the CIA triad. Distribution model was renamed to deployment model for
consistency purposes, and the cloud figure was changed into an arrow to show that the
outputted architecture doesn't go directly into the cloud but is a specification.

Figure 14: The final version of the SeCA model
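To illustrate how the flow of the final model could be read in practice, the sketch below maps a data classification to a set of cloud-architecture attributes. It is a minimal, hypothetical example: the classification levels, attribute names and chosen values are assumptions for illustration only and are not prescribed by the SeCA model itself.

    # Hypothetical illustration of the SeCA flow: classified data in,
    # a secure cloud architecture specification out.
    SECA_PROFILES = {
        # classification level -> architecture attributes (example values only)
        "public": {
            "deployment_model": "public",
            "delivery_model": "SaaS",
            "premise": "off-premise",
            "network": "internet",
            "geo_spatial": "any",
            "encryption": "in transit",
            "compliance_audit": "provider self-assessment",
        },
        "confidential": {
            "deployment_model": "private",
            "delivery_model": "IaaS",
            "premise": "on-premise",
            "network": "internal only",
            "geo_spatial": "the Netherlands",
            "encryption": "at rest and in transit",
            "compliance_audit": "third-party audit (e.g. SAS 70)",
        },
    }

    def secure_cloud_architecture(classification: str) -> dict:
        """Return the architecture specification that fits a classification level."""
        return SECA_PROFILES[classification]

    # Example: confidential data ends up in a private, on-premise IaaS specification.
    print(secure_cloud_architecture("confidential"))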

