Wednesday, November 21, 2012

"The combination of small form factor and unprecedented computing power makes the GreenTec platform the ideal information governance and e-Discovery platform for us." - R. E. Davis

Cloud in a Credenza?

When I wanted to find a caption for this blog entry, it really was not that hard. All I had to do was think back on my observations during a recent trip to Washington, D.C., when I visited a low-key, secure GreenTec computer hardware lab run by the dynamic duo of Richard Detore and Steve Petruzzo. Richard and Steve are the founders and principals of GreenTec, a company that manufactures highly sophisticated, super-dense computing platforms. While well known to many “agencies”, the GreenTec team is not exactly a household name in the traditional commercial computing sector. A mutual friend, who was aware of my keen interest in sophisticated hardware platforms and software applications germane to the e-Discovery space, thought that the technological wizardry practiced in the GreenTec labs would give me pause, and did it ever.

My colleagues and I were so impressed by what we saw that within short order we settled on GreenTec as the hardware solution for the DiscoveryLogix MIDCAP (Multi-initiative Data Capture and Analysis Platform) solution. MIDCAP is a non-governmental tactical data management solution that:

Enables the secure, forensically sound capture and aggregation of hundreds of terabytes of digital data or digitized hard-copy data.

Allows us to use sophisticated “pure math” based machine learning and predictive analytics on vast amounts of data in a portable, closed, compartmentalized, and impenetrable environment.

Richard Detore, Steve Petruzzo and Richard Davis

"GreenTec platforms are empirical verification that Moore’s 1st and 2nd laws of computing are in full force and effect."

The MIDCAP approach
overcomes the logistical barriers associated with the physical aggregation,
processing, analysis, porting and accessing of vast amounts of data by
leveraging GreenTec’s “hot” portable data centers framework. Without
getting into detail, a small MIDCAP cluster (one Green-MAPP / one Ninja) works like this:

The Green-MAPP application server provides massive computing power in a ridiculously small form factor - roughly 5,000 gigaflops of processing power connected to 150 TB of internal hybrid 15K-RPM and SSD local disk storage over 10Gb Ethernet and InfiniBand.

The Green-MAPP talks to the one-petabyte GreenTec “Ninja” data storage NAS/SAN - a credenza-sized storage device - through GigE, 10GigE and/or 40Gb InfiniBand pipes.
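To give a feel for those interconnect numbers, here is a back-of-envelope sketch (my own illustrative arithmetic, not a GreenTec specification) of how long an ideal, sustained line-rate transfer would take to fill the one-petabyte Ninja store over each class of pipe; real-world throughput would of course be lower:

```python
# Illustrative only: time to move 1 PB at ideal line rate over each
# interconnect class mentioned above, assuming zero protocol overhead.

PETABYTE_BITS = 1_000_000_000_000_000 * 8  # 1 PB in bits (decimal units)

links_gbps = {"GigE": 1, "10GigE": 10, "40Gb InfiniBand": 40}

for name, gbps in links_gbps.items():
    seconds = PETABYTE_BITS / (gbps * 1_000_000_000)
    print(f"{name:>16}: {seconds / 3600:8.1f} hours")
```

Even on the fat pipe, a petabyte is a multi-day proposition, which is exactly why rolling the storage onto an aircraft, as described below, can beat the network.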

The propeller head in all of us says, “sweet”.

GreenTec Tactical "Cloud in a Credenza" Small Form Factor Server

However, what does all this tantalizing tech talk mean to information governance stakeholders? Please indulge me for a moment as I expand on the following use cases:

Litigation Use Case

In the following hypothetical, a U.S.-based entity must, pursuant to subpoena, collect data from foreign-based affiliate entities subject to a host of complex data privacy regulations.

D-Logix rolls (literally) the MIDCAP cluster with the appropriate customs paperwork and peripherals onto an aircraft.

At the destination, we transport the portable data center / cloud environment to a pre-selected secure location meeting all environmental requirements.

In the same MIDCAP portable cloud infrastructure, we enable any review tool (ViewPoint, Relativity, Introspect, Driven ONE, etc.) for 100 attorneys to conduct document review from anywhere in the world.

RIM Use Cases

Simply take the above use case, substitute the responsive document exemplars with ones reflecting your organization's records categories and series, and create metadata fields as triggers for your retention schedules.

Is your regulated entity’s IT Department migrating data from one compliance-mandated archive to another, and do you want to enhance the classification metadata (search criteria) you can use to identify email and documents with greater precision and recall? Our solutions can help you more effectively and defensibly classify data of business value while giving you the ability to identify things that you don’t need to keep. Give us a call to chat about Archive Migration +.
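As a toy illustration of the retention-trigger idea above (all field names, series codes, and retention periods here are hypothetical, not drawn from any DiscoveryLogix or client schedule), classification metadata can drive disposition dates like this:

```python
from datetime import date, timedelta

# Hypothetical retention schedule: records series code -> retention in days.
RETENTION_SCHEDULE = {
    "AP-INVOICES": 7 * 365,     # accounts payable: ~7 years
    "HR-PERSONNEL": 6 * 365,    # personnel files: ~6 years
    "CORRESPONDENCE": 3 * 365,  # general correspondence: ~3 years
}

def disposition_date(doc_metadata):
    """Return the earliest defensible disposition date for a document,
    or None if its series is not on the schedule (default: retain)."""
    days = RETENTION_SCHEDULE.get(doc_metadata.get("records_series"))
    if days is None:
        return None
    return doc_metadata["record_date"] + timedelta(days=days)

doc = {"records_series": "AP-INVOICES", "record_date": date(2012, 11, 21)}
print(disposition_date(doc))  # roughly seven years after the record date
```

The point is not the code but the design: once classification populates a `records_series` field, the retention schedule becomes a lookup rather than a judgment call per document.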

Information Security
Classification Framework Implementation Use Case

Simply take the above use case, make the responsive document exemplars correspond to your security level designations by custodian, department, and content signatures, and develop rules (implementable by a host of third-party solutions) to sequester documents based on those protocols.

Other Information Governance Use Cases

We make seemingly difficult challenges feasible and reasonable. Whatever your use case, our information governance consulting team can address your requirements with a higher degree of holistic institutional benefit than you ever thought possible.

For comments or questions
regarding the content of this blog, please contact Richard E. Davis, JD: redavis@discoverylogix.com or
visit me on LinkedIn at:

Thursday, November 15, 2012

PART I – The Problem Space: The Vulnerability of the United States' Corporate Electronic Information Management Networks and Infrastructure.

Many companies don’t know that they have been attacked and are infected by malware.

Those who do know that they have been attacked and infected are placing unwarranted faith in malware detection tools that cannot effectively identify the nature and source of the infection they have.

The majority of attacks on U.S. non-governmental (private sector) interests originate from China and are “military grade” in nature, far surpassing the detection ability of U.S. “commercial” malware and anti-virus software applications.

The weak points targeted by infiltrators are not necessarily faults in infrastructure per se; rather, they are an organization’s cultural and behavioral dimensions, which implicate policy and governance around security and data management.

The majority of attacks on U.S. interests have shifted over time from attacks on “governmental” type targets, to defense industry targets, to high value IP private sector targets.

Law firms, especially those with significant IP and international trade practices, are prime targets for attack. Law firms are not in the information security business, and many lack the distinctive competencies and IT budgets to holistically “bullet proof” their environments.

Chertoff Group - Consulting, business development, policy and governance.

Yesterday morning I was privileged to participate in a breakfast briefing given by former DHS Secretary Michael Chertoff. The event was sponsored by the leading network security auditing and defense consultancy, Mandiant. In attendance were numerous corporate information and network security experts representing major financial institutions, law firms and other organizations with a vested interest in learning more about the evolving nature of information security risk.

While many of us in the IT and legal world have a visceral sense of the present danger posed by information espionage, appropriation and data leakage, without a national clearinghouse of metrics and data aggregating threat information, the ability to empirically quantify threats on a national level, much less strategically address them, is greatly limited. Today most organizations are relegated to handling issues in an institutional silo - without the benefit of the collective learning process that would take place if such a national cyber threat warehouse existed.

The shocking truth about the vast majority of organizations that are targets for hackers and information appropriation is that:

Without holistic coordination between the stakeholder roles responsible for protecting an organization’s IP, CEIMI (corporate electronic information management infrastructure) and physical records, there will always be weak links.

As I listened to Secretary Chertoff and the other speakers discuss the state of affairs around U.S. corporate and energy sector network and information security generally, it confirmed my longstanding belief that the IT market’s next greenfield opportunities will require an amalgam of e-Discovery and network / data security skills. As an e-Discovery and information governance professional, much of my work has focused on the mapping, identification, extraction and classification of data behind corporate firewalls for litigation, compliance, RIM and M&A activity. My network security counterparts have within their purview the creation, implementation and maintenance of physical and logical barriers designed to keep data secure and prevent intrusion. In the new hybrid white hat / e-Discovery role paradigm, the skill sets are intersecting with palpable and immediate effectiveness.

Based on recent analysis of attacks on corporate networks, it is clear that we can no longer keep our heads in the sand. Awareness of a foreseeable situation creates a duty to act, and with lowered thresholds of pecuniary and fiduciary liability, it’s just a matter of time before a senior corporate executive or board member gets pilloried as an example. Based on the outcome of the recent election and heightened rhetoric about regulation, this statement should come as no surprise. The fact of the matter is that while IT and governance professionals have to continue to be ever vigilant and on top of prophylactic intrusion-prevention measures, based on both known and unknown infection and attack mutation rates, we must bolster traditional measures with equally strong skills and policies that focus on intrusion management.

It’s time to face up to the fact that if an organization is in possession of information of value, or has the corporate profile of a company that does (e.g., an IP litigation or patent prosecution firm), there is a very strong possibility that its network has already been hacked and an even stronger possibility that it will be hacked again. Moore’s law applies to the bad guys as well as the good guys. Roughly paraphrased, Moore’s law holds that computing power increases exponentially, doubling roughly every two years. The implications of this are clear for all involved.
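That exponential framing is simple arithmetic: a capability that doubles every two years grows by a factor of 2^(n/2) over n years, and the same curve applies to attack tooling as to defense.

```python
# Growth factor under Moore's-law-style doubling every two years.
def moores_law_factor(years, doubling_period=2.0):
    return 2 ** (years / doubling_period)

print(moores_law_factor(2))   # 2.0  -- one doubling
print(moores_law_factor(10))  # 32.0 -- five doublings over a decade
```

A 32x swing in attacker capability over a decade is the quantitative reason that static, set-and-forget defenses age so badly.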

The private sector problem in aggregate has risen to the level of being a significant national security concern, and it is now a hot-button topic that has the full attention of bipartisan committees in Washington.

For additional information on services relevant to information security and data classification, please visit the following sites:

Please check back for PART II - How the Human Metadata Model Helps Shore Up Data Leakage Points

Monday, November 12, 2012

The 2nd
Circuit Rejects Judge Scheindlin’s Position That A Failure To Issue A Written
Legal Hold Is Gross Negligence Per Se

In Chin v. Port Auth. of NY, 685
F.3d 135, 162 (2nd Cir. 2012), the court held, contrary to the well
publicized and oft-lauded holding in Pension Comm. of Univ. of Montreal
Pension Plan v. Banc of Am. Secs., LLC, 685 F.Supp.2d 456, 464–65
(S.D.N.Y.2010), that a failure to issue a written legal hold is not
gross negligence per se and does not result in a requirement that a court issue
an adverse inference instruction against the offending party. What does this
mean for staunch, die-hard advocates of the position that Judge Scheindlin took
in Pension Committee? Arguably, while rejecting the gross negligence per
se standard and some of its implications, the Chin holding does not
throw the baby out with the bath water. Rather than make the failure to issue a
written litigation hold dispositive of a gross negligence per se finding, the Chin
holding stands for the proposition that a failure to issue a written legal hold
is but one of a number of factors that may be considered by a court in
determining whether an adverse inference instruction is warranted. This
softening of the S.D.N.Y.’s position on legal hold issuance does not as a
standard preclude a court from arriving at the conclusion that an adverse
inference instruction is warranted where written legal hold instructions were not
issued. Most significantly, the Chin decision affords courts greater latitude and discretion to consider the effect of other variables and factors, which either individually or collectively may or may not ultimately be dispositive of gross negligence per se.

The implications of a failure to act in a fashion consistent with the duty to preserve potentially relevant information remain the same. As opposed to adopting a bright-line approach, it seems that some courts might be willing to afford counsel more opportunities to argue the evidentiary impact that the actions or inaction of a party has on a case before imposing significant sanctions. The operative question that seems to be distilled with increasing frequency is, “was the information that was not properly preserved demonstrably relevant, or likely to be relevant?” The 2nd Circuit ruling should in no way be construed as a holding that weakens or undermines this duty.

One message that this holding has for litigants is that courts have evolved in their level of sophistication and understanding of the substantive evidentiary issues that result from the failure to use best practices to preserve relevant data. It also suggests that they have the bandwidth to look at the issues on a case-by-case basis before levying a high-caliber sanction on a party who has, for one reason or another, failed to properly issue a written legal hold.