MAGAZINE FOR THE IT PROFESSIONAL
AUTUMN 2013
bcs.org/itnow
BIG DATA
Aim high
BIG DATA
06 WHERE ARE WE WITH BIG DATA?
08 BIG DATA, IP AND PRIVACY
10 SECURING BIG DATA
12 WHAT IS BIG DATA?
14 BIG DATA VISION
16 BIG DATA AND PATIENT CARE
20 IDENTITY AND BIG DATA
22 YOUR RESOURCES: BIG DATA
HEALTH
40 TAKING CARE
LEARNING AND DEVELOPMENT
44 EDUCATION VS SKILLS
SECURITY
26 CLOUD SURFING
28 AVOIDING CYBERWASH
30 ROLE WITH IT
32 DATA, DATA EVERYWHERE
35 TAKE CONTROL
36 SECURITY UPDATE
38 DATA PROTECTION
...THE REST
42 3D TECHNOLOGY
52 BCS AND BROADCASTING
54 SCRUM & KANBAN
56 CIO VS CDO
58 COMPUTER ARTS
64 YOUR RESOURCES
Image: iStockphoto/173390168
MEMBER NEWS
Membership and recruiters
MAINTAIN DIALOGUE
Image: iStockphoto/162947253
(Diagram: BCS Council constituencies - Professional Membership, Regional, International and Specialist Group.)
Image: Stockbyte/stk128227rke
May saw BCS Council elect a new chair and vice-chair: one a past President and the other a new council member who was first elected to the body by the professional membership last year. Dr Roger Johnson, FBCS CITP (BCS President 1992/3) returned to the Chair's seat at May's Council meeting, and Kevin Streater, FBCS CITP took up his seat as Vice-Chair.
Wanted
Employers value individuals who are committed to professional standards and development, according to BCS, The Chartered Institute for IT.
David Evans, Director of Membership at the Institute, explained further: 'Our experience of working with employers, small and large, is that they want to hire and retain individuals who are committed to exceeding professional standards, and that they are willing to pay a premium for such individuals.'
This is borne out by results from
a recent survey conducted by the
Institute that revealed that a Chartered IT Professional with a Certificate of Current Competency earns on average approximately £92,000 pa.
'They can command such salaries because of their competence, their commitment to keeping their skills current and up-to-date, and to professional ethics and values, all of which are valued by employers,' David added.
Competence
CITP provides a recognisable sign of
professional integrity and dedication. IT
professionals achieve CITP status through a
combination of peer assessment and formal
testing and are awarded a certificate of
www.bcs.org
September 2013 ITNOW
Further reading
Microsoft's special report on using clusters for analytics:
http://research.microsoft.com/apps/pubs/default.aspx?id=179615
Viktor Mayer-Schönberger and Kenneth Cukier, Big Data review:
http://www.bostonglobe.com/arts/books/2013/03/05/book-review-big-data-viktor-mayer-schonberger-and-kenneth-cukier/T6YC7rNqXHgWowaE1oD8vO/story.html
IBM on big data:
www-01.ibm.com/software/data/bigdata
Wired on Cloudera: www.wired.com/wiredenterprise/2013/06/cloudera-search
Image: iStockphoto/148232706
BIG DATA?
There have been many descriptions of big data of late - mostly metaphors or similes for 'big': deluge, flood, explosion - but not only is there a lot of talk about big data, there is also a lot of data. What can we do with structured and unstructured data? Can we extract insights from it? Is big data just a marketing puff term? Brian Runciman MBCS introduces the ITNOW big data focus.
BIG DATA
BIG DATA, IP AND PRIVACY
Along with cloud, social and mobility, big
data (aka information) is one of four key
technology forces which, according to
Gartner's Nexus of Forces, have combined
to create a paradigm shift in the way we do
business.
In a previous article (see refs) on this topic,
I discussed how the nexus of forces impacts
the rather more fundamental concept of
intellectual property. In this article, we shall
dive a little deeper into the key issues that
impact and influence big data.
A little web research will bring up
vast amounts of information and links to
articles on the topic of big data. On closer
inspection, however, only two or three
main issues appear capable of making or
breaking the promise of big data, and these
are related to: solution approach, personal
privacy and intellectual property (IP).
The first issue deals with technology,
deployment and the organisational context,
whereas the latter two big-ticket items
raise concerns about the nature and
applicable use of information or big data.
For the purpose of this article we'll pay
more attention to the latter issues, mainly
because sparks tend to fly whenever the
commercial exploitation of information and
content enters into the realm of personal
privacy and IP rights.
Big data
According to a recent Forrester Research
paper, typical firms tend to have an average of around 125TB of data.
Personal privacy
Given such powerful tools, and the large
amount of replicated information spread
across various sources, it is much easier to
obtain a clear picture of any individual's
situation, strengths and limitations.
Furthermore, the explosion in speed, types
and channels of interaction, enabled by
components of Gartner's nexus of forces, may have brought about a certain degree (perhaps even an expectation and acceptance) of reduction in personal privacy.
However, people do still care about how, and for what, their personal information is used, especially if it could become disadvantageous or harmful to them.
There is a certain class of data which can easily become toxic should a company suffer any loss of control, and it includes: personal information, strategic IP information and corporate sensitive data (e.g. KPIs and results).
The situation is further complicated by
differing world views on personal privacy
as a constitutional or fundamental human
right. The UK's Data Protection Act is not
applicable to personal information stored
outside of the UK, yet we deal daily with
organisations, processes and technologies
that are global in scale and reach. On the
other hand, some users are happy to share
personal data in exchange for financial
gain.
According to a recent SSRN paper, data
protection and privacy entrepreneurship
Intellectual property
In addition to the above points,
organisations also have to deal with the
drama of IP rights and masses of
unstructured data. Simply put, every last piece of the aforementioned 125TB of big data held in your average organisation will have some associated IP rights that must be taken into consideration when collecting, storing, processing or sharing
all that information. According to legal
experts, companies need to think through
fundamental legal aspects of IP rights, e.g.
who owns the input data companies are
using in their analysis, and who owns the
output?
An extreme scenario: Imagine how
that corporate promotional video, shot on
location with paid models and real people
(sans model release), plus uncleared
samples in the background music, which
just went viral on a number of social
networks, could end up costing a lot more
than was ever intended. Oh, by the way, the
ad was made with unlicensed video editing
software, and is freely available to stream
or download on the corporate website and
on YouTube. Well, such an organisation
will most likely get sued, and perhaps
should just hang a sign showing where
the lawyers can queue up. Every challenge
brings an opportunity, but not always to
the same person.
Now imagine all that content, and
tons more like it (including employees' personal content), just sloshing around in
every organisation, and you might begin
to perceive the scale of the problem. In
REFERENCES
Gartner - Information and the Nexus
of Forces: Delivering and Analyzing
Data - Analyst: Yvonne Genovese
BCS TWENTY:13 ENHANCE YOUR IT
STRATEGY - Intellectual property in
the era of big and open data.
Forrester - Deliver On Big Data
Potential With A Hub-And-Spoke
Architecture Analyst: Brian Hopkins
SSRN - Buying and Selling Privacy: Big Data's Different Burdens and Benefits, by Joseph Jerome (Future of Privacy Forum): http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2294996
Out-law.com - Big data: privacy concerns stealing the headlines but IP issues of equal importance to businesses: http://www.out-law.com/en/articles/2013/march/big-data-privacy-concerns-stealing-the-headlines-but-ip-issues-of-equal-importance-to-businesses-says-expert/
BCS Edspace Blog - Big data: manage the chaos, reap the benefits, by Marc Vael: www.bcs.org/blogs/edspace/bigdata
Capping IT Off - Forget Data Science, Data Art is Next!, by Simon Gratton: www.capgemini.com/blog/capping-it-off/2013/07/forget-data-science-data-art-is-next
SECURING BIG DATA
Image: Thinkstock/iStockphoto
Big data can create business value by solving emerging business challenges. However, big data also creates security challenges that need to be considered by organisations adopting or using big data techniques and technologies, says Mike Small FBCS CITP.
There is now an enormous quantity of data
in a wide variety of forms that is being generated very quickly. However, the term big
data is as much a reflection of the
limitations of the current technology as it
is a statement on the quantity, speed or
variety of data.
The term big data needs to be
understood as data that has greater
volume, variety or velocity than can
be comfortably processed using the
technology that you already have.
Big data comes from a number of
sources both internal and external.
Many organisations have accumulated
large amounts of data that they are not
exploiting. There is an even larger amount
of data that is held in publicly available
10
and security.
The basic objectives of information security for big data are the same as for normal data: to ensure its confidentiality, availability and integrity. To achieve these objectives certain processes and security elements must be in place. There is a large overlap with the normal information security management processes; however, specific attention is needed in the following areas:
Everyone is responsible
The unstructured nature of big data means
that it is difficult to assign the responsibility
to a single person. Everyone in an
organisation needs to understand their
responsibility for the security of all of the
data they create or handle.
Verification of data source
Technical mechanisms are needed to verify
the source of external data used; for
example, digital signatures.
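The idea can be sketched in code. This is illustrative only: an HMAC over a pre-shared secret stands in for a full public-key digital signature scheme, and the secret, feed and function names are all invented:

```python
import hashlib
import hmac

# Illustrative stand-in for a digital signature: in practice a public-key
# scheme (e.g. RSA or ECDSA) would let anyone verify without a shared secret.
SHARED_SECRET = b"agreed-out-of-band"  # invented value

def sign(data: bytes) -> str:
    # Tag the data provider would attach to each published record.
    return hmac.new(SHARED_SECRET, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    # compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(sign(data), tag)

feed = b'{"sensor": 17, "reading": 42.1}'
tag = sign(feed)                      # attached at source
assert verify(feed, tag)              # genuine data accepted
assert not verify(b"tampered", tag)   # altered data rejected
```

Verification fails if either the data or the tag is altered in transit, giving the analytics pipeline some assurance about where its external inputs came from.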
Systems integrity
There needs to be good control over the
integrity of the systems used for
analysis including privilege management
REFERENCES
www.kuppingercole.com/report/advisorynote_bigdatasmartdata70750140513
www.enisa.europa.eu/activities/risk-management/evolving-threat-environment/ENISA_Threat_Landscape/at_download/fullReport
https://downloads.cloudsecurityalliance.org/initiatives/bdwg/Big_Data_Top_Ten_v1.pdf
www.energynetworks.org/modx/assets/files/electricity/futures/smart_meters/ENACR009_002_1.1Control%20Points.pdf
Image: Thinkstock/Stockbyte
Keith Gordon MBCS CITP, former Secretary of the BCS Data Management Specialist Group, looks at
definitions of big data and the database models that have grown up around it.
Whether you live in an IT bubble or not, it
is very difficult to miss hearing of
something called big data nowadays. Many
of the emails hitting my inbox go further
and talk about big data technologies.
These fall into two camps: the
technologies to store the data and the
technologies required to analyse and make
sense of the data.
So, what is big data? In an attempt to
find out I attended a seminar put on by The
Institution of Engineering and Technology
(IET) late last year. After listening to five
speakers I was even more confused than
I had been at the beginning of the day.
Amongst the interpretations of the term
big data I heard on that day were:
Making the vast quantities of data held by the government publicly available - the Open Data initiative. I am really not sure what 'big' means in this scenario!
For a future project, storing, in
a hostile environment with no
readily-available power supply, and
then analysing in slow time large
quantities of very structured data of
limited complexity. Here 'big' means 'a lot of'.
For a telecoms company, analysing data available about a person's
So, although there is no commonly
accepted definition of big data, we can say
that it is data that can be defined by some
combination of the following five
characteristics:
Volume - where the amount of data to be stored and analysed is sufficiently large as to require special considerations.
Variety - where the data consists of multiple types of data, potentially from multiple sources; here we need to consider structured data held in tables or objects, for which
Questions, questions
To start with, there is the question of the overall organisational vision for big data, and who has the responsibility of setting this. What projects will be carried out, and with what priority? Also, one has to consider practicalities: how will the management of organisational data be optimised?
Next we come to the critical question
of quality. 'Garbage in, garbage out' is an old adage, and IT departments have
been running data cleansing initiatives
since time immemorial. But in the
world of big data, is this enough?
What about the role of the wider
organisation, the people who really get
the benefit from having good quality
data? There is also the issue that a lot of
the anticipated value of big data comes
not just from using the data you own, but
from combining your data with external
data sets. But how do you guarantee the
quality of these externally derived data
sets and who takes responsibility for the
consequences of decisions made based on
poor quality, externally derived data?
Although garbage in more or less
guarantees garbage out, the opposite
is not necessarily true. There are two
elements involved in turning a data asset
into something useful to the organisation;
good quality data and good quality models
to analyse that data. As was clearly
demonstrated in the banking crisis,
however, predictive models rarely give
perfect results.
How, therefore, can organisations ensure that the results of modelling
are properly tested against historic data
and then re-tested and analysed against
real results so the models and the data
sets required to feed the models can be
refined and improved? Above all, how can
organisations ensure that the results of
analysis are treated with an appropriate
degree of scepticism when used as a basis
for decision-making?
Confirmation bias
Also, when considering how such models
are used, the psychological phenomenon
of confirmation bias needs to be considered: the human tendency to look for or favour the results that are expected
or desired. Inevitably analysis of data will
sometimes give results that are
counterintuitive or just not what was
looked for, leading to the age old
Level of information dependency vs. diversity of organisational activities:

                              Diversity: Low   Diversity: High
Information dependency: Low   CIO or User      CIO
Information dependency: High  User             CDO
BIG DATA VISION
BIG DATA AND PATIENT CARE
Image: iStockPhoto
Big data needs to meet small data to deliver on healthcare's challenges, says Dr Justin Whatling FBCS, Chair of BCS Health and Senior Director of Population Health at Cerner.
Harnessing the potential of big data is one
of the single biggest opportunities facing
the NHS. It will be at the heart of how we
make care better, safer and more
affordable in the future. The question,
therefore, is not whether big data will
transform care, but how to maximise the
benefits. The danger for the NHS is that it
focuses on data at a macro level, building
pictures of health trends across
populations. Useful though that is, it merely
classifies problems.
The real breakthrough comes from using
data to identify solutions, helping doctors
and nurses to make real-time decisions.
The NHS does not need to choose between
using data to plot problems or identify
solutions; it needs to be ambitious in
embracing both, and using big data to
transform the quality of care.
Before the NHS conquers big data,
however, it first needs to overcome its
cultural aversion to sharing it. Some
industries, such as retail and banking, have
been quick to see monetary value in high-quality customer data and have developed
advanced systems to capture, trade and
use it for commercial advantage. The NHS
is different.
The lack of commercial drivers,
combined with greater sensitivity and
legal constraints around sensitive medical
data has helped foster an ingrained
nervousness about sharing data. The
consequences for care are stark; clinicians
are left facing decisions without vital pieces
of the jigsaw that did not follow the patient
through the system.
Information sharing
Fortunately for big data advocates, the
tide is turning on the NHS's reluctance to
share. In May last year, the Department of
Health's information strategy, The Power of Information, explicitly stated that not
sharing information has the potential to
do more harm than sharing it. This was
followed in March 2013 by the Caldicott2
report, a government-commissioned
review tasked with identifying how NHS
information sharing could be improved
without compromising patient
confidentiality.
The Health Secretary, Jeremy Hunt,
has spoken passionately about the need
to improve the quality of data as part of
his plan to make Britain a global hub for
health technology. It would be naïve to
think political will can reverse decades
of cultural conservatism overnight, but
the direction of travel is the right one and
there is now a determination to use data to
improve the quality of care.
In seeking to harness the power of
big data, the easiest approach for the
NHS would be to focus on the secondary
benefits of big data. This involves joining
up data currently locked in silos to identify
health opportunities across the population.
Some work has already started on
this, with national disease registries and
more recently with NHS England's care.data project already extracting data from
General Practice and seeking to extract
and link data from hospitals, and down the
line from community and mental health
providers across the NHS. The US has,
DATA MOUNTAIN
Going green
The company's rationale for building the data centre came not only from wanting to build the facility, but also from trying to do it in as environmentally friendly a way as possible.
'It was a combination of both (wanting to build a data centre and wanting it to be green). We were discussing the possibility of building a data centre in other places around us, because the owners of Green Mountain are already owners of another data centre in Stavanger, and during the process of evaluating the possibility of building another data centre using water for cooling, this site came up for sale. So it was a combination of: we were looking to build a green data centre, and an opportunity that came along.'
By creating what Green Mountain likes to claim is the greenest data centre, they are hoping that other companies take their lead
Cooling station
In our ever-connected world we are relying more and more on data centres, but they use a lot of power. With this in mind, Henry Tucker MBCS went to see the self-proclaimed greenest data centre in the world.
The dark blue water laps gently on the
hard granite shoreline. Take just one
step into the cold water and the drop is
70m straight down. Go a little further out
and it can get as deep as 150m. These
Norwegian fjords have been used for many
things ever since man first laid eyes on
them. Now they are being used to cool a
data centre.
Green Mountain is no ordinary data
centre though, even before it started using
8°C fjord water to cool its servers. That's
because not only is it quite green, literally
and figuratively, but it is also a mountain.
Well inside one.
Smedvig, the company that owns the data centre, isn't the first to operate inside
the mountain though. The tunnels that run
up to 260m into the granite were drilled by
(Diagram: data room with in-row coolers; fjord water at 8°C, depths of 30m and 100m.)
(Infographic - sources of personal data: school, training, university; news sites, TV sites, conferences; books, publications, articles; public profile, car/driving licence, electoral roll, education, address sites, directories, birth & citizenship; career, CV/resume, professional qualifications; medical; preferences; banks, credit references, credit cards; online albums, other people's photos, picture sharing; memberships, professional bodies, groups, organisations; online shops, review sites, ratings sites, purchases; photographs; job sites, career sites, personal website; government records; financial; search engines; social interactions, social media; phone records, SMS.)
My Knowledge - Members: log into the secure area to follow the direct links in these listings.
Helen Wilcox outlines some of the resources in the member library on big data.
Image: iStockPhoto/135161171
YOUR RESOURCES
Dallas and London offices respectively.
Source: McKinsey Quarterly, 2013
Big data: The next big thing in innovation
The rise of big data is connected to the
advent of web 3.0 and the proliferation of
sensors increasing the amount of
automated data collection, according to this
writer. However, she says that putting big
data to work, whether to drive innovation
or to reshape innovation processes, will not
be so easy.
MaryAnne M Gobble, Research Technology
Management
Source: Research Technology Management,
USA January/February 2013
The rise of big data
The writers look at the effect of increasing
quantities of digital information, or big data,
on the way humans interact, communicate,
and learn. Topics include the
determination of correlative rather than
causative relationships in statistical
research using large quantities of data,
the lack of accuracy and precision of data
created through resources such as the
internet, and the ability of technology to
produce larger statistical samples.
Kenneth Cukier, The Economist, and Viktor
Mayer-Schoenberger, Oxford Internet
Institute, UK
Source: Foreign Affairs, May/June 2013
Big data in digital media research
This paper discusses the methodological
aspects of big data analyses with regard to
their applicability and usefulness in digital
media research. The authors examine the
consequences of using big data at different
stages of the research process, based on
a review of a diverse selection of literature
about online methodology. They argue that
researchers need to consider whether the
analysis of huge quantities of data is
justified, given that it may be limited in
validity and scope, and that small-scale
analyses of communication content or user
behaviour can provide equally meaningful
inferences when using proper sampling,
measurement, and analytical procedures.
Merja Mahrt, Heinrich Heine University,
Germany, and Michael Scharkow, University
of Hohenheim, Germany
Source: Journal of Broadcasting &
Electronic Media
Internal auditors input to big data
projects
On big data projects, internal auditors need
to have a seat at the table and ask hard
questions about risks and rewards, says
this writer. Big data has become a
significant development for internal
auditors as corporations adopt its use,
following its rise to prominence during the
2012 US elections. Internal audit must be
in the forefront in classifying data sets,
according to the article.
Russell A Jackson, freelance writer, USA
Source: Internal Auditor, February 2013
Assisting developers of big data analytics
applications when deploying on Hadoop
clouds
Big data analytics applications are a new type of software application, analysing big data using massively parallel processing frameworks (e.g. Hadoop). Developers of such applications
typically develop them using a small
sample of data in a pseudo-cloud
environment and then deploy the
applications in a large-scale cloud
environment with considerably more
processing power and larger input data.
The authors noticed that the runtime
analysis and debugging of such
applications in the deployment phase
cannot be easily addressed by traditional
monitoring and debugging approaches.
In this paper, they propose a lightweight
approach for uncovering differences
between pseudo and large-scale cloud
deployments, using execution logs from
these platforms.
Weiyi Shang, Zhen Ming Jiang, Hadi
Hemmati, Ahmed E Hassan, and Patrick
Martin, Queen's University, Canada, and Bram Adams, Polytechnique Montréal.
Source: ICSE: International Conference on
Software Engineering, February 2013
Finding the needle in the big data
systems haystack
With the increasing importance of big data,
INFORMATION SECURITY
Agile
HOLISTIC SECURITY
Professionals across the business can now demonstrate their ability to deliver
greater value from their projects with the global benchmark in agile capability.
BCS Agile Certification pushes the boundaries in agile thinking and delivers the
why, not just the how, of agile by bringing people together in an agile learning
environment to tackle real-world business issues. It's method-neutral, leaving you to decide on the agile approach that works best in your organisation.
Enjoy successful agile projects and transform the way you do business.
bcs.org/agilecertified
FURTHER INFORMATION
Information Security Specialist
Group (ISSG):
www.bcs-issg.org.uk
Information Risk Management and
Assurance Specialist Group:
www.bcs.org/groups/irma
BCS Security Community of
Expertise (SCoE):
www.bcs.org/securitycommunity
BCS, The Chartered Institute for IT, is the business name of the British Computer Society (Registered charity no. 292786) © 2013
BC291/LD/AD/0713
Image: iStockPhoto/168767483
Certified
CLOUD SURFING
Image: iStockPhoto/132001048
It has been several years since cloud services became a viable and cost-effective means of managing our
information and IT infrastructure. There have been numerous articles and books written about the technicalities
of how organisations can make best use of the cloud and of the security issues that arise. David Sutton FBCS CITP,
co-author of Information Security Management Principles, says that we should turn our attention back to focus on
the what and the where, rather than on the how.
Certainly, using the cloud does (or at least
should) solve two key business problems:
A quantifiable reduction in costs.
Organisations using the cloud require
less IT infrastructure on their own
premises; they don't have to spend
money on staff to look after these
increasingly complex systems and
they don't feel the need to upgrade
them whenever new hardware,
operating systems or application
software appear.
A reduction in security worries. The
cloud provider takes care of securing
the organisation's outsourced infrastructure and the information - well, in theory at least.
Whilst the first benefit is undoubtedly true,
can we be sure about the second? Recently,
there has been much discussion in the media
about interception of personal information
including emails, fixed and mobile phone call
records, text messages, instant messages,
Facebook and Twitter accounts . . . the list
seems endless.
Media reporting about the PRISM
programme has highlighted the active
participation of Apple, Facebook, Google and
AVOIDING CYBERWASH
References
1. http://www.gchq.gov.uk/Press/Pages/10-Steps-to-Cyber-Security.aspx
2. https://dm.pwc.com/HMG2013breachessurvey/
3. Understanding the influences on information security behaviour - Professor Steven Furnell and Anish Rajendran, Plymouth University
4. So Long, And No Thanks for the Externalities: The Rational Rejection of Security Advice by Users - Cormac Herley, Microsoft Research
Image: iStockPhoto/136579577
Image: iStockPhoto/166199437
ROLE WITH IT
Mike Barwise MBCS CITP looks at the role of information assurance and how it fits in with other roles
within business.
Information assurance is one of those popular terms, like risk, that is widely used without
a clear understanding of its real meaning.
To some extent it has been the victim of grade inflation. We started out doing IT security, which gradually became referred to as information security and ultimately as information assurance - ever grander-sounding titles, despite the actual nature of what most of us were doing hardly changing.
As a result, the security remit of most Chief
Information Officer (CIO) and Chief Information
Security Officer (CISO) roles is today still
restricted largely to technologies, often
essentially replicating the security remit of the
Chief Technology Officer (CTO). Much corporate
information assurance therefore exists in
name alone.
So what could be done differently? Let's
look first at information security (IS). Although
this is commonly considered a technological
discipline, technological security (ITS) is really
only one of its components.
Technologies, business processes and
people management each contribute roughly
DATA, DATA EVERYWHERE
Image: iStockPhoto/160138145
As individuals living in a rich technology and communication ecosystem we capture, encode and publish
more data than ever before. This trend toward greater amounts of data is set to increase as technology
is woven ever more into the fabric of our everyday lives, says Ben Banks MBCS, European Information
Security Manager, RR Donnelley.
As information security and privacy
professionals we are in the vanguard of
navigating this new landscape. Our
challenge is enabling commerce whilst
ensuring our stewardship for these new
assets remains strong.
This article explores one aspect of this
challenging new world - when does data
become information and what does that
change mean for our assurance work?
From data to information
Data and information are not synonymous.
Although the terms data and information
are often used interchangeably adopting a
more rigorous understanding of them has
important implications.
It is fairly intuitive that an instance of
data, a data point, when considered in
isolation is not information. For example
Truth
If we consider another order of our EMV
Brevity
'If I had more time, I would have written a shorter letter' is commonly accepted as true. When bandwidth was costly, short meaningful messages were more valuable. If we had a list of default PINs and some of the associated PANs (primary account numbers) it would be a big data set.
That information could be re-expressed
in a condensed format as the algorithm
for generating a default PIN from a PAN (in
truth this isn't actually how it works, but
it does help to illustrate the point). Brevity
condenses the content of data without loss
of meaning and in so doing it becomes
more valuable.
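The point can be put in code: a large table of PAN-to-PIN pairs carries no more meaning than the short rule that generates it. Everything here is invented for illustration - as noted above, real card PINs are not derived like this:

```python
import hashlib

def default_pin(pan: str) -> str:
    # Invented derivation rule: first four decimal digits of a hash of the PAN.
    digest = hashlib.sha256(pan.encode()).hexdigest()
    return "".join(c for c in digest if c.isdigit())[:4]

# A big data set: one (PAN, PIN) row per card...
pans = ["4929123456789012", "4539987654321098", "4716000011112222"]
table = {pan: default_pin(pan) for pan in pans}

# ...condensed, without loss of meaning, to the four-line function above.
assert all(table[pan] == default_pin(pan) for pan in pans)
```

The brief form is more valuable than the long one precisely because it reproduces every row of the table on demand.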
Causal efficacy
What can the data you hold let you do? In order
to dig into this question we need to change
our perspective a little. Consider the question
of what data you would need to supply to
make an online payment when the card is
not present.
Typically one needs a credit card
number, a name, a card verification value
and an expiry date associated together to
make a valid transaction (assuming the
sites you used didn't require an additional
verification step).
Whilst payments require a number
of data points to have a level of causal
efficacy, consider how many data points
you need to identify yourself to get access
to online services.
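A small sketch makes the point about association: no single field below is of any use on its own; only the complete, associated set has causal efficacy for a payment. The class and field names are invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CardNotPresentPayment:
    card_number: Optional[str] = None
    name: Optional[str] = None
    cvv: Optional[str] = None
    expiry: Optional[str] = None

    def can_transact(self) -> bool:
        # A single data point in isolation is useless to merchant or thief
        # alike; only the full set of associated fields enables a transaction.
        return all([self.card_number, self.name, self.cvv, self.expiry])

print(CardNotPresentPayment(card_number="4929...").can_transact())  # False
print(CardNotPresentPayment(
    "4929...", "A N Other", "123", "12/25").can_transact())         # True
```

The same pattern applies to identity: the question is how many associated data points an attacker needs before they can act as you.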
TAKE CONTROL
According to John Mitchell, IT governance can be defined as a structure of relationships and processes to direct and control the IT function in order to achieve the enterprise's goals by sustaining and extending the enterprise's strategies and objectives.
IT governance comprises the organisation
and provision of IT services and the
performance measurement and
enhancement of these services. It is in
relation to this latter element that the concept
of assurance arises.
Assurance is an important component of
IT governance. How can the CIO show that the
IT service is meeting its value for money and
service objectives? Usually this is through
the provision of performance metrics, but
how can they prove that these metrics and
associated analysis are reliable?
I once attended a meeting with a Chief
Executive and his six direct reports, two
of whom were the CIO and the Chief
Internal Auditor (CIA). I asked each head of department, in turn, who was responsible for internal control in their company.
Without hesitation each one pointed to
the CIA. When I then asked them how
frequently the CIA audited their controls
they responded every three years.
When I then asked them who was
responsible in-between the three year
period, they shuffled their feet, avoided by
eyes and remained silent. I then pressed
them on risk management. They accepted
that this was their responsibility, but when I
then pointed out that risk was managed by
controls they started to realise that control
was their responsibility too. Assurance
is primarily achieved by measuring the
effectiveness of controls in managing risks.
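One crude way to picture that last point (the scoring model is invented for illustration and is not drawn from any governance standard): treat assurance as measuring how much of each inherent risk a control actually removes.

```python
def residual_risk(inherent_risk: float, control_effectiveness: float) -> float:
    """Illustrative model: the risk remaining after a control operates.
    `control_effectiveness` is the measured fraction of the risk the
    control mitigates, which is what assurance should be testing."""
    if not 0.0 <= control_effectiveness <= 1.0:
        raise ValueError("effectiveness must be between 0 and 1")
    return inherent_risk * (1.0 - control_effectiveness)
```

An unaudited control whose real effectiveness has quietly drifted towards zero leaves the inherent risk untouched, which is precisely what a three-year audit gap conceals.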
Unfortunately, many auditors, and most
managers, cannot define what a control
is and how it operates, so it is not too
surprising that our IT assurance processes
are somewhat suspect. I would even
go further and assert that our current
control paradigm is not fit for purpose.
Our technology has changed beyond
INFORMATION SECURITY
SECURITY UPDATE
BCS, The Chartered Institute for IT, remains actively involved in information security. Andy Smith FBCS
CITP details what the various groups have been working on, primarily in two key areas.
The first is general information security,
the other is professionalisation of the
information security industry. The full
version of this update can be found at:
www.bcs.org/securitycommunity.
The Security Community of Expertise
(SCoE) and its various sub-groups deal
with information security and IT security
on behalf of the Institute. Over the last few
months the SCoE has been busy talking
at various conferences and representing
the Institute on various national and
international bodies. We have also been
busy writing position papers and providing
feedback on various legislative proposals.
Our latest report on aspects of identity
that covers the work of the Identity
Assurance Working Group (IAWG) for
2012/13 has been published and is
available for download on the website at:
www.bcs.org/identity.
The Institute held a workshop at the
InfoSec Europe 2013 conference in
April. This was on the subject of identity
assurance, which the Institute feels is a
very important area. We concentrated
on preventing identity theft and what
organisations can do to protect themselves
and ensure their staff are who they claim
to be.
At the EEMA (www.eema.org) annual
conference in the Netherlands the team
INFORMATION SECURITY
DATA PROTECTION
Charlotte Walker-Osborn, Head of the TMT Sector and Partner, Liz Fitzsimons, Legal Director, and James Ruane,
Associate, from international law firm Eversheds LLP, take us through a whistle-stop tour of a couple of recent
developments in the field of data protection.
So what?
This decision highlights the ICO's increased
willingness to issue monetary penalties
to reinforce its message that portable
devices containing personal data should
be encrypted. This decision is notable as a
significant penalty was issued despite no
sensitive personal data being contained on
the laptops. However, given that the council
had been the subject of an enforcement
notice two years earlier for similarly
HEALTH INFORMATICS
TAKING CARE
3D TECHNOLOGY
Fig. 2: Stereo frames from Robinson Crusoe (1947).
Barry G Blundell FBCS looks at 3D, referring back to glasses-free 3D Cinema 70 years ago in Russia.
The BBC's recent decision to put its 3D TV
venture on hold is yet another indication
that all is not well with television's foray
into the third dimension. A number of
factors have contributed to its current
demise and these include a failure to
properly accommodate the ways people
behave when watching TV - from the
child who regularly switches attention
between toys and screen, to the adult who
multitasks. In every case those glasses
get in the way and all too often gravitate to
that dark space beneath the sofa.
Cinema audiences are more single-minded and are generally intent on a truly
immersive experience. They are therefore
more willing to tolerate viewing glasses as
an interim solution but look forward to the
development of alternative technologies
that will support the convenience of
glasses-free (autostereoscopic) 3D.
In fact, glasses-free 3D cinema is not a
futuristic vision: in Moscow, back in 1941,
it was reality playing on a 5x3m screen:
'The auditorium is plunged in darkness,
except for a little lamp suspended from the
ceiling by a long cord. But wait: an actor
suddenly reaches out from the screen and
draws the lamp towards him. How did he do
it? As a matter of fact, there was no lamp
left burning in the auditorium. It was simply
an effect produced by the stereocinema... A
juggler flings a ball straight at the audience,
and those who happen to come within his
line of vision blink and duck involuntarily.'
Ivanov [1941]
On show was the 40 minute 3D film,
Konsert (Fig. 1) and during a four month
period some 500,000 people took the
opportunity to enjoy autostereoscopic 3D.
Unfortunately the venture could not have
been more ill-timed, and it came to an
abrupt halt in June 1941 when Germany and
Russia became embroiled in total warfare.
On 20 February 1947 glasses-free
3D re-opened in Moscow. Significant
developments in display technology were
complemented by advances in the art and
science of stereo photography. The 3D images.
Within a few years, glasses-free 3D cinemas opened
in Kiev, Leningrad and Astrakhan and continued
successfully through to the 1960s.
Success or failure
Glasses-based 3D cinema continues to represent the
simpler solution and is able to accommodate more
closely packed audiences. Furthermore, the challenges
associated with glasses-free approaches become more
taxing as screen width is increased. These would have
been key factors that eventually caused the Russians
to migrate to the glasses-based approach. Our continued
use of glasses-based technology is primarily driven by
commercial considerations: it represents the most cost
effective approach.
In the case of TV it is important to recognise that 3D
works well with only some forms of content. This
suggests the need for display technologies that can
seamlessly transition between 2D and 3D modes of
operation, thereby supporting the infusion of 3D into 2D
delivery. In turn this implicitly necessitates a glasses-free
approach able to support appropriate freedom in viewing
positions. Furthermore, since TV audiences are usually in
quite close proximity to the screen, accommodation and
convergence issues (which can cause visual strain)
cannot be cast to one side.
Such requirements - coupled with the creation of
appropriate content - are essential to the success of 3D
TV and may perhaps be most readily achieved through
the use of techniques that better capitalise on the
remarkable capabilities of our sense of sight. In this
respect, technologists have sometimes incorrectly
assumed that binocular vision is the sole basis for 3D
perception.
Certainly the early pioneers of 3D cinema have
demonstrated the practicality of delivering
autostereoscopic 3D to quite large audiences. Armed
with today's materials, technologies, simulation tools
and know-how, we are far better placed to implement
viable and highly innovative glasses-free solutions.
Fig. 4: The radial raster barrier invented by Edmond Noaillon in the 1920s, which was successfully used by Russian pioneers.
Fig. 5: Constructing the radial raster barrier circa 1940. The barrier weighed around six tons. The opaque regions were groups of copper wire. Given the required accuracy and the lack of lightweight plastics, the development was a remarkable achievement.
Further reading
Blundell, B.G., On Aspects of Glasses-Free 3D Cinema ~70 Years Ago, www.barrygblundell.com
Funk, W., History of Autostereoscopic Cinema, Proc. SPIE, Vol. 8288, (2012).
Ivanov, S.P., Russia's Third Dimensional Movies, American Cinematographer, pp. 212-213, (May 1941).
Macleod, J., Stereoscopic Film. An Eyewitness Account, Monthly Film Bulletin, pp. 118-119, (1st October 1947).
Valyus, N.A., Stereoscopy, The Focal Press, (1966).
Fig. 3: The parallax barrier with a light-diffusing screen. Three exemplar viewing locations shown.
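The geometry behind a two-view parallax barrier such as that in Fig. 3 follows from similar triangles (the symbols and numbers below are illustrative assumptions, not taken from the article): for pixel pitch p, eye separation e and viewing distance D, the barrier-to-screen gap is g = pD/e, and the slit pitch is slightly under twice the pixel pitch, b = 2pD/(D+g).

```python
def parallax_barrier(pixel_pitch: float, eye_sep: float, view_dist: float):
    """Two-view parallax barrier geometry (thin-slit approximation).
    Returns (barrier-to-screen gap, slit pitch); same units in and out."""
    gap = pixel_pitch * view_dist / eye_sep        # g = pD/e
    slit_pitch = 2 * pixel_pitch * view_dist / (view_dist + gap)  # b = 2pD/(D+g)
    return gap, slit_pitch

# e.g. 0.1 mm pixel columns viewed from 600 mm with 65 mm eye separation
gap, pitch = parallax_barrier(0.1, 65.0, 600.0)
```

The slit pitch being fractionally less than 2p is what makes the viewing zones converge, and it hints at why precision mattered so much in the six-ton barrier of Fig. 5.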
doi:10.1093/itnow/bwt053 © 2013 The British Computer Society
Conclusions
Whilst the debate motion certainly
highlighted the alignment gap between
academia and the IT profession (real or
perceived), the motion was essentially
fallacious in that an academic education
does not exist exclusively, or perhaps even
primarily, to produce graduates who have
all the skills and experience necessary
to be competent and productive during
their first week of employment, as many
employers might desire. Who would want
to undergo a heart operation conducted by
a surgeon whose only experience has been
in the classroom?
The idea that universities will be
able to produce production-ready IT
graduates capable of being sent out on
billable assignments in their first week of
employment is unrealistic and misleading,
no matter how desirable that might be to
employers.
Recommendations
The post-debate discussion led to some
simple yet compelling recommendations
that will help to close the gap.
Resources
The debate was video recorded and
may be viewed online along with a
selection of photographs and a copy of
the Book of the Night at:
www.bcs.org/content/
ConWebDoc/50013
The post-debate discussion will
continue online via the BCS Learning
and Development Specialist Groups
forum on LinkedIn. Readers of this
article are welcome to contribute to
that discussion online:
http://www.linkedin.com/groupItem?view=&gid=2430056&type=member&item=258516388&qid=624610c2-9d93-4dac-a74b-7bc6441fdab0&trk=group_most_recent_rich-0-bttl&goback=%2Egmr_2430056
CHANGE MASTER
With an increased emphasis on employability and the skills needed to excel in the online, connected
workplace, the Open University is changing its Masters computing programme. Kevin Waugh,
Postgraduate Programme Director for Technologies and Computing at the Open University, explains the
drivers for change and goes on to outline what these changes will look like.
The Institute publishes new reports from analyst Forrester every month. Members have
access to the full text. This is an overview of recent reports.
June 2013
IT analytics that help with big data
Your business is complex. Big data
promises to manage this complexity to
make better decisions. But the technology
services that run your business are also
complex. Many are too complex to manage
easily, fuelling more complexity, delays,
and downtime.
In Turn Big Data Inward With IT
Analytics: The Future Of Service Monitoring
And Management, Forrester predicts
this will inevitably get worse. To combat
this onslaught, you can no longer just
accelerate current practices or rely on
human intelligence. You need machines
to analyse conditions to invoke the
appropriate actions.
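As a toy illustration of a machine analysing conditions (nothing here comes from the Forrester report; the metric names and thresholds are invented), a rolling z-score can flag a service metric that drifts outside its normal band and trigger an automated action:

```python
from statistics import mean, stdev

def anomalous(history, latest, threshold=3.0):
    """Flag `latest` when it lies more than `threshold` standard
    deviations from recent history: a machine, not a human, decides
    whether a condition warrants invoking an action."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

response_ms = [102, 98, 101, 99, 100, 103, 97]   # invented service metric
slow = anomalous(response_ms, 250)  # True: an automated action could fire
```

Chaining such detectors to remediation scripts is one simple reading of 'adaptive, full-service automation'.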
These actions themselves can be
automated. To perform adaptive, full-service automation, you need IT analytics, a
disruption to your existing monitoring and
management strategy. This report helps
IT infrastructure and operations leaders
prepare for IT analytics that turns big data
efforts inward to manage the technology
services that run your business. Key points:
If you can't manage today's
complexity, you stand no chance of
managing tomorrow's: Your company's
data is growing at exponential rates,
and without systems in place to
manage and organise growth, you will
drown in it. Virtualisation,
consumerisation and cloud are
guaranteed to follow. The answer
cannot simply be to accelerate the
same processes and methods.
One size does not fit all: IT analytics
solutions using a single algorithm are
impossible, and it's unlikely that one
single vendor will be able to offer all
of the solutions needed. Forrester
expects the emerging and existing
management software vendors to
consolidate many of these capabilities.
Work on improving processes before
deploying IT analytics tools: IT
analytics is an exciting field because it
July 2013
Deliver on big data potential
Deliver On Big Data Potential With A Hub-And-Spoke Architecture says that data
management has become as crucial as
financial management to leading firms,
but businesses grapple with data
management platforms that cant respond
fast enough to fickle customers and fluid
markets. Big data has emerged as the new
industry buzzword promising to help do
more with data and break down silos.
Forrester examines this phenomenon
and discovers that firms are taking a
pragmatic approach that focuses first
on wringing value from internal data at
a lower cost and implementing a new
architecture they call hub-and-spoke.
This report explores this architecture
and provides recommendations for
exploiting the big data phenomenon
using it. Enterprise architects and
other technology strategists need this
information to be equipped with options
for flexibility and speed as they earn seats
at the business strategy table. Key points:
Firms have a lot of data, want more,
and struggle to afford it: Firms are
collecting a lot of data, but their
data platforms struggle to remain
affordable as scale and demands
increase. While the pack struggles,
leaders have figured out an approach
that breaks down silos, enables agile
analytics, and creates cost-effective
data management.
Hub-and-spoke data architecture
meets the need: Yesterday's
correct data architecture involved
centralised warehouses, marts,
operational data stores, and a
lot of ETL. Hub-and-spoke takes
a different approach; it features
rapid analytics and extreme-scale
operations on raw data in an
affordable distributed data hub.
Firms that get this concept realise
all data does not need first-class
seating.
Understand technology patterns to
deliver flexible options: Forrester
has interviewed firms with
practical experience to identify best
practices. It found seven common
technology patterns that reuse the
building blocks of hub-and-spoke.
Make data management part of
your business strategy: Enterprise
architects must earn their way
to the strategy table, then come
prepared with a host of options
for speed and flexibility using the
hub-and-spoke concept. Most
importantly, stop talking about
big data and start talking about
business outcomes that big data
can help deliver.
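A minimal sketch of the hub-and-spoke idea (event and view names are invented, and a real hub would be a distributed platform such as Hadoop rather than a Python list): raw data lands once, cheaply and unmodelled, in the hub, and each spoke derives only the view it needs, when it needs it.

```python
hub = []  # stand-in for an affordable distributed store of raw data

def ingest(event: dict):
    """Land raw events once, unmodelled: no first-class seating."""
    hub.append(event)

def spoke(predicate, projection):
    """A spoke is a purpose-built view derived from the hub on demand."""
    return [projection(e) for e in hub if predicate(e)]

ingest({"type": "sale", "amount": 40})
ingest({"type": "refund", "amount": 5})
sales_mart = spoke(lambda e: e["type"] == "sale", lambda e: e["amount"])
```

Contrast this with the centralised warehouse-and-ETL pattern, where every event must be modelled before it can land anywhere.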
OTHER JOURNALS
VIDEO GAME VIOLENCE EFFECTS
The July issue of Interacting with Computers is a special
commentary on scale derivation, looking at the thorny question of
measuring games engagement and any potential effects on players.
Brian Runciman MBCS reports.
MEMBER GROUPS
COMMUNICATION AND ENGAGEMENT
In the 21st century, what role can internet-based communication and social media technologies play in
our activities as a professional institute? Can they support virtual meetings, or make the content
generated by specialist groups and branches accessible to a greater range of members? Dr. Geoff
Sharman FBCS CITP chaired the Recording and Broadcasting Working Group, which was tasked with
improving audiovisual approaches for member groups.
Recommendations
Each member group should be
encouraged to develop a recording and
broadcasting strategy, describing what
events it plans to record and what level
of investment is appropriate to enable
this.
Each member group should assess
which technique, or combination of
techniques, best suits its style of
operation and is the most effective for
each event.
Each member group should be
encouraged to appoint a committee
member as Recording Officer, to act as
a focal point for recording activities and
the development of skills.
Best Practice Committee should
establish an initial inventory of
equipment at Southampton Street.
Best Practice Committee should
document techniques and scenarios
for using recording and broadcasting
equipment.
Equipment, software and techniques
should be reviewed after two years.
Member Board should work with
BCS HQ to develop cooperative working
practices so that the activities of
volunteer members are supported by
appropriate staff functions.
Best Practice Committee should
set up the appropriate social media
accounts and establish procedures for
promoting/migrating material from
member group repositories to the
central repository and tagging it with
appropriate terms.
Member Board should work with
the BCS Publications Department to
identify the best recordings produced by
member groups and showcase them in
BCS publications.
Best Practice Committee has initiated
a study of social media and should aim
to identify the best ways of exploiting
their synergy with recording and
broadcasting techniques.
damaging to BCS.
Assuming this is the case, the organiser
should also ensure that the speaker owns
the copyright of any material he or she
presents and is willing to release it, in
order to protect BCS against subsequent
charges of copyright infringement.
Privacy
Many member group events are open to
non-members and are therefore public
meetings. In the course of such a meeting,
personal information about the speaker or
other attendees (such as their appearance,
experience, attitudes, ethnic background,
etc.) may become known. When a meeting
is recorded or broadcast this information
may become widely known, potentially
infringing the data protection rights of
these attendees. In the course of recording
an event, members of the audience may
also be recorded, for example when
asking questions. Therefore, to protect
their privacy, the organiser should inform
the attendees that they may be recorded
and what action they should take if they
wish to avoid this, for example to leave the
meeting or to sit in a reserved area of the
room.
Hygiene factors
Some people have expressed concerns over
privacy, security and intellectual property
rights in relation to recording and
broadcasting, and especially in relation to
electronic meetings and public broadcasts.
It's therefore worth outlining the main
factors that member groups should
consider when setting up an event.
Security
The technologies and methods described
here are unsuitable for communicating
sensitive information, which will include
proprietary business or personal
information, or information relating to
national defence and security. Any member
group that needs to communicate this kind
of information should consider carefully
how it is managed and use appropriate
technologies.
Further information
MEMBER OPINION
ARE SCRUM AND KANBAN ENOUGH?
While Scrum and Kanban (and Waterfall
before it) may be good management
practices, they are not sufficient by
themselves to result in agile development.
Software development has never been
easy, not least because the computer industry is only
about 50 years old. Most applications,
until recently, were small and could be
readily broken into smaller segments for
incremental delivery.
The software industry, even now, has
few pre-built and tested components
(apart from the dreaded DLL files) that can
Figure: typical bespoke development costs - a library programme or an accounts programme written in Visual Basic, SQL Server, Oracle, C#, Java or PHP each costs 200k+ to develop; an ERP/MRP bill-of-materials system costs 500k+.
Figure: a component-based alternative - applications (library system, accounts with their bugs and poor documentation, care home system, manufacturing) keep their bespoke code on one side, while a 4GL/agile engine (Progen) supplies the RDBMS/SQL layer, report writer, search engine and forms generator, with read/write access to Access, Excel and CSV data and small text files under 10K, yielding a self-documenting system.
INTERVIEW
CIO vs. CDO vs. CIO vs. CDO vs. CIO vs. CDO
Brian Runciman MBCS spoke to Richard Harris, former CIO at ARM UK and Rolls Royce.
COMPUTER ARTS
LAYERED PRACTICE
Credit: Paul Coldwell, Still Life with Keys, Inkjet + laser cut relief, 2012. Image size 47x64 cms, paper size 59x84 cms.
Copyright the artist. Reproduced with permission.
The extracts below are taken from the BCS's eminent academic monthly, The Computer
Journal, published with Oxford University Press.
3D laser range measurements. So at the
data scan stage, a far smaller signal is
captured, using sparse representations of
laser range measurement sequences.
This work is of relevance for sensor
systems, robot sensing systems and
intelligent sensors, to be deployed in 3D
mapping and 3D modelling. 3D laser range
finder data from Schloss Dagstuhl and from
Bremen city
centre are used in this work.
This article appeared in Volume 56 Issue
7 July 2013. The authors are from Bilkent
University, Ankara, Turkey.
RE-UML: A Component-Based System
Requirements Analysis Language
An extension to unified modelling
language (UML) named RE-UML is
presented, with formal semantics
utilising the Prolog programming language
to support component-based system (CBS)
requirements analysis. RE-UML extends
the UML sequence diagrams with a
satisfaction interaction frame and
mapping operators to model matching
criteria between stakeholder demands and
candidate component features.
Furthermore, associations between
requirements and candidate components
are introduced to model risk assessment
and conflict resolution during CBS
requirements analysis. To demonstrate the
use of RE-UML, its application is presented
to the software system of Seven-Eleven
Japan, relating to stores and franchises.
This article appeared in Volume 56 Issue
7 July 2013. The authors are from King
Fahd University of Petroleum and Minerals,
Dhahran, Saudi Arabia, and La Trobe
University, Victoria, Australia.
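The matching criteria between stakeholder demands and candidate component features can be pictured with a toy score (the feature and component names are invented, and RE-UML itself expresses this in extended sequence diagrams with Prolog semantics, not in Python):

```python
def match_score(demanded: set, offered: set) -> float:
    """Fraction of demanded features a candidate component satisfies."""
    return len(demanded & offered) / len(demanded) if demanded else 1.0

demands = {"barcode_scan", "stock_lookup", "sales_report"}
candidates = {
    "CompA": {"barcode_scan", "stock_lookup"},
    "CompB": {"sales_report", "loyalty_cards"},
}
# Rank candidates; the unmet demands of the winner would then feed
# risk assessment and conflict resolution in the full analysis.
best = max(candidates, key=lambda c: match_score(demands, candidates[c]))
```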
YOUR RESOURCES
Helen Wilcox outlines some of the resources in the member library on software testing.
Component-based software is now widely used, with regression
testing used to assure system quality
because changes made to one component
could affect it or the entire system. The
paper's authors identify diverse changes
made to components and the system
based on models, then perform change
impact analysis, and finally refresh the
regression test suite using a state-based
testing practice. A case study shows the
approach is feasible and effective.
By Chuanqi Tao, Southeast University,
Nanjing, China, and San Jose State
University, USA; Bixin Li, Southeast
University, Nanjing, China; and Jerry Gao,
San Jose State University, USA
Source: Journal of Software, March 2013
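The change-impact step described above can be sketched as a toy test selector (the component and test names are invented; the paper's own approach is model-based and uses state-based testing): given which components changed and which components each test exercises, re-run only the affected tests.

```python
def select_tests(changed: set, coverage: dict) -> list:
    """Keep only tests exercising at least one changed component."""
    return [t for t, comps in coverage.items() if comps & changed]

coverage = {
    "test_login":    {"auth", "session"},
    "test_report":   {"reporting"},
    "test_checkout": {"cart", "auth"},
}
affected = select_tests({"auth"}, coverage)  # login and checkout tests
```

Refreshing the regression suite then amounts to re-running `affected` and adding new tests for behaviour the change introduced.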
Web software systems testing, supported
by model-based direct guidance of the
tester
The common approach to software testing
based on manual test design and manual
execution of test cases can be made more
efficient by suitable automation. This paper
proposes a new approach to the
testing process using automated test cases
Books 24/7
Practical Software Project Estimation:
A Toolkit for Estimating Software
Development Effort & Duration
This guide explains the tools and
methods necessary to extract conclusive
business intelligence from disparate
corporate data. It aims to help in the
deployment of high-performance data
transformation solutions in enterprise.
Peter R. Hill (ed), McGraw-Hill/Osborne