Most of the information governance tools on the market today are tools to
enable us to cope with a lack of information governance. They don't
actually establish good governance themselves. Organizations are
deploying content analytic tools to make ad-hoc disposition decisions on
groupings of content that they have not been able to apply retention rules
to. Most organizations have huge gaps in their documentary records kept
in whichever generic document repository they use (SharePoint, an
EDRM/ECM system, or simple shared drives). Many decisions are made
exclusively in e-mail without leaving a trace in those repositories.
Individuals cope with gaps in the documentary record by relying on their
own e-mail accounts. General Counsel copes with gaps in the
documentary record by searching e-mail archives.
The foundations for e-discovery were established a decade ago. The FRCP
were amended in 2006 to consider electronic documents. Eight years
later, those amendments are again being debated. The general consensus
is that many attorneys are still not familiar with e-discovery rules. In my
discussions with vendors, I note their frustration with not being able to
monetize information governance services. Several vendors have
retrenched their information governance efforts and refocused on e-discovery; apparently, e-discovery services are easier to sell. As I see the
issue, organizations view information governance as a support function
and cost center, and not a revenue-generating effort.
By 2020, IG will drive success in business and risk management by
leveraging powerful, innovative data analytics and ephemeral messaging
technologies. However, a dramatic reduction in the volume of outdated
and useless information stored by organizations is not likely, in my
opinion. Breakthroughs in solid-state storage technology such as spintronics
(an order of magnitude faster and more efficient than current storage
systems) save time and reduce the pressure to cut down stored data. In
parallel, the rapidly emerging demands of data privacy will fuel ephemeral
messaging development. This will serve to reduce risk and protect
privacy, as well as shrink the clutter of data that have extremely short
shelf lives.
Information Overload: Courts Prepare for e-Discovery Changes, Ed Silverstein, Legaltech News, 17 April 2015, http://www.legaltechnews.com/id=1202723823025/Information-Overload-CourtsPrepare-for-eDiscovery-Changes?slreturn=20150322161001
At the same time, data itself continues to become more complex, explains
David Houlihan, an analyst at Blue Hill Research. At first, that data was
largely in the form of email, but grew to include text messages, videos
and social media. There have also been changes in storage, from on-premises
systems hosted by law firms to greater acceptance of cloud storage and the
increased use of software as a service.
Over the next few years, the volume of data will continue to skyrocket,
especially with increased use of the Internet of Things and more machine-generated data.
Legal frameworks generally lag behind technological capability, and the complexities of the data
landscape generated by connected devices and systems will pose interesting legal and
regulatory challenges.
For example, a connected device in a domestic fridge could be designed to monitor energy use
or shopping needs, but could simultaneously be generating personal information about such
things as an individual's health, lifestyle and changing family structure. This kind of information
would need to be regulated and protected.
The third challenge will be the storage and retention of the information. It will be impossible (and
undesirable) to store and keep absolutely everything.
Information governance frameworks are already struggling under the weight of emerging digital
channels, and could buckle under IoT unless organisations get better at classifying their data and
knowing what to retain and store and what to delete.
This is not always going to be easy. The challenge of determining what information constitutes a
record or has potential business value, and applying an appropriate retention rule, is no mean feat
and may well seem overwhelming for the many businesses already overloaded with growing
volumes of information in multiple formats. Yet failure to take on the challenge will expose
many to unacceptable levels of risk.
Information professionals often err on the side of caution when it comes to the data they retain.
Businesses are reluctant to destroy data that could at some future point deliver value, and they
don't want to have deleted data that may suddenly be required for e-discovery purposes. This
results in a hesitant, keep-it-all-just-in-case culture.
Judgement calls about record disposition will have to be made, but these difficult decisions
will be helped considerably by having strong information governance in place; pre-defining and
automating categorisation to limit storage and vulnerability, and defining and enforcing clear
responsibilities amongst your team.
Information Governance Problems
Many organisations don't know who owns, or who should own, the content created through these
communications channels. A recent survey of information professionals, by Iron Mountain and
AIIM, revealed that roughly a third of businesses have yet to allocate content responsibility for
instant messaging (39%), mobile (32%), social media (28%) and cloud-sharing (33%). But close to one in
ten respondents said their organisations fail to regulate even well-established information types
such as email, customer data and public online content.
The following deals with some of the technical issues surrounding data
collection and storage.
As IoT grows, the need for real-time scalability to handle dynamic traffic
bursts also increases. There may also be a need to handle very low-
bandwidth, small data streams, such as a sensor identifier or a status bit
on a door sensor, or large high-bandwidth streams, such as high-definition
video from a security camera. Consider the following examples and the
applicability of network-connected devices to IoT:
Homes and offices
Utility meters send complex data packets to service providers where
centralised systems provide real-time monitoring to proactively detect and
remediate problems such as blackouts, water leaks and circuit overloads.
Data is analysed to improve efficiency by determining needs, spotting
trends, and predicting demand. By virtue of its smart IoT fixtures, the city
of Oslo reduced energy costs by 62%.
Wearables
From heartbeat-sensing fitness bands to step-counting smartphone apps,
wearables are the public face of IoT. A portable device is connected to a
service that aggregates data and, increasingly, shares it across social
media, with a doctor or even a gym. The cloud-based services also push
back analytics, motivational graphics and music, and location-based
maps.
Hospitals
Hospitals utilise several smart devices, both standalone and those wired
to nurses' station monitors. Soon, these will be interconnected through a
highly available and secure network with server-based applications that
can track patient conditions by correlating all data, not just nurses'
readings, allowing better monitoring, data logging and big data analytics.
An IoT-connected network helped St. Luke's Medical Center reduce
patient-bed turnaround time by 51 minutes.
Factories and warehouses
The flow of materials must be monitored and optimized for efficiency.
Location sensors are embedded in components moving through assembly
lines and inventory systems. The locations of forklifts, pallets and workers
are tracked as well, while centralised software directs the activity in real
time to respond effectively to customer requests.
By implementing IoT-based predictive maintenance and quality control, BMW
reduced auto-warranty costs by 5% and reduced the scrap rate of
defective vehicles by 80%.
Dynamic application delivery
Along with the various applications mentioned above, there are plenty
more. When an IoT node performs a service request, such as
a proper case? Answering those questions is part of their duty to the court
and to their client.
The State of Information Governance, Forbes, Barry Murphy, 19 April
2012, http://www.forbes.com/sites/barrymurphy/2012/04/19/the-state-ofinformation-governance/?utm_source=twitterfeed
IG is defined as a comprehensive program of controls, processes, and
technologies designed to help organizations maximize the value of
information assets while minimizing associated risks and costs.
Everyone recognizes the need for IG, but no one wants to be ultimately
responsible for it. Businesspeople care only about the ability to easily
create and access information to do their jobs. IT's job is to support the
business. Defensible disposition projects and Legal Hold programs for
cost avoidance don't exactly have the sex appeal of implementing social
media marketing projects that can drive revenue. That is hardly
surprising, given the very real challenges of truly managing corporate
information assets. Still, though, IG is important and is gaining some traction in many
organizations. For companies that ignore IG, managing the risk that
information poses is harder and harder because the volume of information
stored keeps going up. For every effort a company takes to safeguard
information, employees create a workaround if that effort impinges on the
velocity of information. In turn, those workarounds can lead to a vicious
circle of eDiscovery nightmares.
Good IG programs build a corporate culture where responsibility for
information is a core tenet. Employees understand policies and are
incented to abide by them. That culture can only develop under a high-level executive who truly believes in IG. Which C-level executive owns IG
is less important than the leadership and consensus-building qualities she
or he possesses.
There is no real standardization as to which C-level executive owns IG,
nor does there need to be. For some companies, it will be best for a CIO
to own IG; for others, it is best for a Legal Officer or General Counsel to
own it; and for still others, IG is best owned by a committee of senior
executives. One thing is for sure: IG cannot, and will not, succeed unless
there is a C-level executive that clearly owns real responsibility and
accountability for IG. Organizations seeking to exert greater control over
their information assets must close this gap. In addition, IG executive
leaders must be savvy in the ways of securing proper budgets for
projects. Anecdotal evidence from companies with good IG programs
shows buy-in from senior IT and Legal executives. These executives
actually work together early and often to define what is reasonable for
the organization, any process requirements (e.g. legal hold, early case
assessment), and then allow IT to purchase the right infrastructure or
Legal to procure the right services. While it sounds trite, the key to IG
success is cross-functional communication and cooperation.
the remediation costs will be. The assessment of resulting benefit is a yet
further stage which cannot sensibly be addressed without knowing the
costs.
Consider the questions:
What is the chance of one of our aeroplanes crashing?
How likely is it that our new medicine will poison people?
Compare these with:
How likely is it that I will be sanctioned for destroying or failing
to find this document?
or
How much value lies in being able to find this document quickly
when the subject-matter recurs?
and then ask:
How much does it matter if this event happens?
… then you are beginning to see the way to the question:
What is it worth spending to be relieved of that risk?
The point is not so much whether you and I, in the abstract, can give a
weighting to any of the factors involved in these questions but whether
companies are addressing them at all. It is a pretty good bet that a
company at risk of having its aircraft crash or its medicines poison people
is very focussed on risk and willing to spend almost anything to mitigate
the risk.
The questions about deleting documents are more nuanced. What sort of
documents are these? Legal questions arise (Is there a regulatory,
statutory or other implication? and Are they already subject to a legal
hold?). There are technical questions (Can we still access and read these
documents?), and practical ones (Does anyone ever bother?), and
questions of cost (What does it actually cost to keep them?).
What has all this to do with external lawyers? What role is there for them
in the kind of proactive input which is required when legal considerations
are involved in such decision-making, as they clearly are when regulatory
and eDiscovery implications may arise?
It is not much talked about, really. You can find plenty of lawyers and
others talking about risk and its mitigation (and perhaps rather fewer
talking about benefits) but you do not often see or hear about case
studies: actual examples of projects being undertaken by firms to
help their clients.
I suspect that this is because very few firms are offering such a service,
with the rest either uninterested or unaware of the problems and solutions
or (less creditably) seeing the steady accumulation of yet more data in
their clients' hands as an insurance policy which will keep them in work for
long enough to see out their careers.
Justice requests, abiding by subpoenas and legal holds, and following the
Federal Rules of Civil Procedure (FRCP).
* Being good stewards of data by properly preserving what needed to be
preserved while properly disposing of data that had lost its value.
* Making it easier for business users to find the high-value information
they need by disposing of information they don't need.
The ILG strategy used for achieving these goals is called defensible
disposal, and it requires bringing expertise from the legal, compliance
and IT departments together with key information stakeholders from the
business side to
lay out more comprehensive records retention and destruction
policies,
develop the procedures to implement and enforce those policies,
and,
where necessary, deploy the technology to support and automate
implementation and enforcement.
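The automation step described above can be sketched as a small retention-rule engine. The categories, retention periods and function names below are illustrative assumptions, not any particular organisation's schedule:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: record category -> retention period in days.
RETENTION_RULES = {
    "invoice": 7 * 365,        # e.g. seven years for financial records
    "email": 3 * 365,
    "call_recording": 365,
}

def disposition(category, created, on_legal_hold=False, today=None):
    """Return 'retain' or 'dispose' for a record under the schedule.

    Records on legal hold are always retained, regardless of age, and
    unknown categories default to retention (the cautious choice).
    """
    today = today or date.today()
    if on_legal_hold:
        return "retain"
    days = RETENTION_RULES.get(category)
    if days is None:
        return "retain"
    return "dispose" if today >= created + timedelta(days=days) else "retain"
```

In a real deployment the schedule would come from the records retention policy itself, and each disposal decision would be logged to keep the process defensible.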
In addition to business data, the targets of the defensible disposal
program included offsite storage, email stores, backup tapes, and call
recordings.
Proactive Approaches to E-Discovery, Epiq Systems, Martin Bonney
& Martin Nikel, 3 October 2014, http://www.epiqsystems.com/askQ.aspx?
id=2147484901
Where to begin?
The journey, as the new EDRM suggests, ought to start with information
governance: understanding the data universe, the records retention
policies and the legal and regulatory needs of the business. This has
obvious benefits in terms of reducing the cost of storage (many estimates
suggest by more than 40 per cent) and the concomitant cost of processing
and reviewing documents. Less quantifiable, but possibly more significant,
is that by not doing this, your organisation could retain data that might
come back to bite you in the future, but could justifiably have been
deleted if retention policies had been practically applied.
The reality, however, is often that litigation or regulatory investigation hits
before this governance work has begun. The smart professional will think
positively, and be proactive no matter where they are in the process. With
the right, proactive approach, much can be achieved. An eDiscovery
requirement from a regulator or the courts can actually be a valuable spur
to get your information governance house in order. The essence of such
an approach is planning and communication. Get the key players
(typically at least IT, legal/compliance and your eDiscovery provider)
talking to each other, and invest the time to build a data map: essentially
a description of the organisation's data types, technical infrastructure and
storage solutions. This is an essential first step to preserving and
collecting data, and can bring an early understanding as to the scale and
nature of the challenge.
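At its simplest, such a data map is a structured inventory of sources. A minimal sketch follows; the systems, owners and retention periods are hypothetical placeholders, not recommendations:

```python
# Illustrative data map: each entry describes one data source.
data_map = [
    {
        "source": "Exchange mailboxes",
        "data_types": ["email", "calendar"],
        "location": "on-premises",
        "owner": "IT Operations",
        "retention": "3 years",
        "legal_hold_capable": True,
    },
    {
        "source": "SharePoint",
        "data_types": ["documents", "wiki pages"],
        "location": "cloud",
        "owner": "Records Management",
        "retention": "7 years",
        "legal_hold_capable": True,
    },
    {
        "source": "Backup tapes",
        "data_types": ["email (duplicated from journaling)"],
        "location": "offsite storage",
        "owner": "IT Operations",
        "retention": "rotation only",
        "legal_hold_capable": False,
    },
]

# A simple query the legal team might ask: which sources hold email?
email_sources = [entry["source"] for entry in data_map
                 if any("email" in t for t in entry["data_types"])]
```

Even this much makes the later steps easier: preservation scope, duplicated sources (the backup tapes) and hold capability are all visible at a glance.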
Identify what you don't need
Another benefit of building a data map is that it enables organisations to
quickly highlight data sets that can be removed from a disclosure
requirement, for example back-up tapes that duplicate emails already held on the
journaling system, or data sets that are easily available and can be swiftly identified
and collected. This low-hanging fruit can be useful whether your
eDiscovery exercise relates to a regulatory investigation, an internal
investigation or to litigation. Showing practical responsiveness to
disclosure requests is a way to gain essential goodwill from regulators or
the courts.
Languages and priorities
Electronic disclosure professionals can often provide valuable input in the
early stages of a disclosure requirement, not least as translators between
lawyers and IT. It may seem a frivolous point, but these groups can often
use the same terms to mean different things, and can easily come out of
meetings with a completely different understanding of what needs to be
done.
Collection strategies
Similarly, forensics consultants can often help expedite a collection
process that may otherwise take second place to normal operational IT
requirements.
While for lawyers the priority is often to obtain and review documents
quickly, it's advisable to exercise caution at this key stage. It is often
important to be able to prove the provenance of a document to the courts,
opponents or regulators. As such, it is important to maintain the chain of
custody during collection, and to work with eDiscovery providers to
document an appropriate convention for data transfer. Emailing
'interesting' documents through Outlook to colleagues in a piecemeal
manner can create a huge metadata challenge and is likely to add extra
cost and duplication of effort.
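One common way to support chain of custody at collection time is to record a cryptographic hash of every file as it is acquired, so the same hashes can be recomputed after transfer to show nothing changed. A minimal sketch; the CSV manifest format here is an assumption, not an industry standard:

```python
import csv
import hashlib
import os

def hash_manifest(root, out_csv):
    """Write a SHA-256 manifest for every file under `root`.

    The manifest file itself should live outside `root`, or it would be
    hashed mid-write along with the collected files.
    """
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "sha256", "size_bytes"])
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in sorted(filenames):
                path = os.path.join(dirpath, name)
                digest = hashlib.sha256()
                with open(path, "rb") as f:
                    # Hash in 1 MiB chunks so large evidence files
                    # do not need to fit in memory.
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        digest.update(chunk)
                writer.writerow([path, digest.hexdigest(),
                                 os.path.getsize(path)])
```

Re-running the same hashing after transfer and comparing against the manifest gives a simple, documentable integrity check.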
Accelerators to prioritize your data:
Once you have data, or a subset of it, a raft of techniques is
available to minimize cost, accelerate the review and prioritize the most
relevant information. An obvious approach is to identify key custodians
and process these first to validate keywords and confirm that there are no
gaps in the collection (e.g. via histograms, which quickly emphasize gaps
in time). Analytics tools might also highlight unexpected subject matter
not covered by key words, and social network analysis might provide
insight into key custodians not yet considered.
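The histogram check mentioned above, spotting date ranges where a custodian has no documents, can be sketched very simply. A rough illustration at monthly granularity (the function name and interface are assumptions):

```python
def monthly_gaps(doc_dates):
    """Return (year, month) tuples with zero documents between the first
    and last dated item for a custodian -- candidate collection gaps
    worth investigating before review begins.
    """
    months = {(d.year, d.month) for d in doc_dates}
    if not months:
        return []
    start, end = min(months), max(months)
    gaps = []
    year, month = start
    while (year, month) <= end:
        if (year, month) not in months:
            gaps.append((year, month))
        month += 1
        if month > 12:
            year, month = year + 1, 1
    return gaps
```

An empty result does not prove the collection is complete, but a month-long hole in an active custodian's email is usually worth a question.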
Finally, if the data set is large, consider the use of technology-assisted
review (TAR), also known as predictive coding. TAR tools allow a legal
expert to train the software, using a small subset of data to provide a