July 22, 2008

2008 Socha-Gelbmann EDD Survey Sneak Preview

The anxiously awaited 2008 Socha-Gelbmann EDD Survey executive summary will be published in the August issue of Law Technology News, but here is a sneak preview to whet your appetite.

This year we gathered information from or about 107 EDD services and software providers, and from 29 law firms and 19 corporations. Using more than 350 qualitative and quantitative factors, we ranked those providers - top overall EDD services providers, top overall EDD software providers, and top EDD providers in various categories. Although one can, and perhaps should, take issue with the specific rankings, we feel that the providers listed below represent the best of the best, an elite cadre singled out from a field of well over 600 organizations that offer some form of EDD services or software.

At the same time, we cannot emphasize too much that these still are generalized rankings. Anyone who bases buying decisions primarily on these rankings, or any other generalized rankings, is a fool. Although as part of your selection process you might choose to consider these rankings, or rankings from the increasing number of other organizations that attempt to rate EDD providers, you really should conduct an analysis that focuses on your specific needs. No generalized set of rankings will do that for you.

Our rankings are based on a detailed ranking model containing more than 350 qualitative and quantitative factors. We used this model to evaluate information from or about 107 EDD services and software providers.

The top 5 electronic data discovery service providers, in alphabetical order, are:

Autonomy Zantaz

Fios Inc.

FTI

Kroll Ontrack

LexisNexis

Six through 10 for services are:

Encore Discovery Solutions

Epiq Systems Inc.

Huron Consulting Group Inc.

KPMG

Renew Data Corp.

Eleven through 15:

Electronic Evidence Discovery Inc.

Ernst & Young

Merrill Corp.

Onsite3

Stratify

And 16 through 25:

CaseCentral

Computer Forensic Services

Daticon

Discovery Mining

eMag Solutions

Guidance Software Inc.

IE Discovery Inc.

Pitney Bowes Legal Solutions

SPi

TechLaw Solutions

The top 5 electronic data discovery software providers, again in alphabetical order, based on a similar set of criteria, are:

Autonomy Zantaz

Clearwell Systems Inc.

FTI

Guidance Software Inc.

LexisNexis

Six through 10 for software are:

Attenex Corp.

CT Summation

Epiq Systems Inc.

iConect Development

Symantec Corp.

And 11 through 15:

AccessData Corp.

Equivio

Kazeon Systems Inc.

Kroll Ontrack

MetaLincs

In the August LTN issue, in print and on its website (on August 1), and here on EDD Update (on August 4), we also will have:

Size of the market in 2007

Market size projections for 2008-2010

Discussion of key issues and trends relating to EDD

The top 1-5, 6-10 and 11-20 EDD service providers based on experience

The top 1-5, 6-10 and 11-20 EDD service providers based on capacity

The top 1-5, 6-10 and 11-20 EDD service providers based on corporation ranking

The top 1-5, 6-10 and 11-20 EDD service providers based on law firm ranking

The top 1-5, 6-10 and 11-15 EDD software providers based on corporation ranking

The top 1-5, 6-10 and 11-15 EDD software providers based on law firm ranking

The top 1-5, 6-10 and 11-15 EDD software providers based on reported use of software by others

We will also publish the Socha-Gelbmann rankings of service providers based on the stages of the electronic data discovery process set forth in the Electronic Discovery Reference Model (http://edrm.net/):

The top 1-5 and 6-10 EDD service and software providers - Identification

The top 1-5 and 6-10 EDD service and software providers - Preservation


Comments

Not surprisingly, we have been receiving some questions about our rankings, what the bases are for them, and what they mean. I won't try to lay out a comprehensive explanation here, but I would like to highlight a couple of items.

First, not everyone may realize the degree to which we must depend on self-reported information. We ask services and software providers, law firms and corporations to provide us with an overwhelming amount of detail. Many of the organizations that respond do just that, supplying extensive information and putting in a great deal of time and effort to do so. We greatly appreciate that effort, for without it we would have precious little information to evaluate. Much of the data provided to us, however, is not information that we can verify or refute. We have to depend on the integrity of the people and the organizations providing us with the information. At times, we feel it is necessary to give providers haircuts; we never, however, give them toupees.

Second, we sometimes are asked whether our rankings are "statistically significant." We do not attempt to address confidence levels, confidence intervals, significance levels and the like. We report the number of organizations from which we obtained information that we deemed useful for our analysis. This year that was 107 services and software providers; 29 law firms, which not surprisingly tended to be law firms involved in a lot of electronic discovery work; and 19 corporations, again not surprisingly many of which have had to deal with quite a bit of electronic discovery. We also describe the process through which we go to arrive at our rankings. We did not include that lengthy narrative in this posting - after all, a sneak preview should be short, not a tome. We will, however, post additional information about the process we used both here and at www.sochaconsulting.com.

Idea: A report based upon real experiences and not market opinion would be a great concept. A "tripadvisor.com"-style report card in which actual users rate service providers would help - even if only as comic relief.

"We do not attempt to address confidence levels, confidence intervals, significance levels and the like."

But doesn't a survey need to do that? Rather than compiling data about a large group, a survey usually studies a chosen subset of the population, the sample. The data are then subjected to statistical analysis and IF the sample is representative of the population, then inferences and conclusions made from the sample can be extended to the population as a whole.
A major problem lies in determining the extent to which the chosen sample is representative. With a small sample size, only large differences can be detected as significant, so you want a large sample to be confident that even small differences are real. Because isn't that what's important - whether the differences you report are actually significant?

And I'm not just picking on you here George. Fios also published a report recently of a "survey" taken of 28 legal professionals from Fortune 500 corporations. How much illumination do we get from a .05% sample of a certain population or even 106 out of 600 plus without knowing more about the demographics of the respondents?

Seems to me this is a valid question for any survey. Just my (annual) 2 cents. Tom O
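As an aside, the back-of-the-envelope arithmetic behind that concern is easy to sketch. The snippet below is purely illustrative and not part of the survey's methodology: it assumes a simple random sample, a worst-case proportion of 0.5, and a 95% confidence level, none of which the survey claims.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion,
    assuming a simple random sample from a large population.
    p=0.5 is the worst case; z=1.96 corresponds to 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# The survey's three respondent pools
for n in (19, 29, 107):
    print(f"n={n:4d}: +/- {margin_of_error(n):.1%}")
```

Run as-is, this prints margins of roughly +/- 22% for 19 respondents, +/- 18% for 29, and +/- 9.5% for 107 - which is the comment's point in numbers.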

While full adherence to the Code of Standards and Ethics for Survey Research may or may not be in question, I do think that if one looks beyond the rankings, one can find the real gold in the survey: the overview of EDD usage/need/requirement trends. While people may have concerns with "vendor reporting" being used as the primary component of vendor rankings, I do think most people would be very comfortable with vendor reporting of the trends they see concerning EDD usage/needs/requirements, as the vendors have very strong contact with both law firm and corporation users.

To Tom's point on "surveys", the info below might be worthy of consideration.

Based on the Council of American Survey Research Organizations' (CASRO) Code of Standards and Ethics for Survey Research, a research organization's report to a Client or the Public should contain, or the Research Organization should be ready to supply to a Client or the Public on short notice, key information about a survey, including:

1. A description of the sample design, including the method of selecting sample elements, method of interview, cluster size, number of callbacks, Respondent eligibility or screening criteria, and other pertinent information.

2. A description of the results of sample implementation including (a) the total number of sample elements contacted, (b) the number not reached, (c) the number of refusals, (d) the number of terminations, (e) the number of non-eligibles, (f) the number of completed interviews.

3. A description of any weighting or estimating procedures used.

4. A description of any special scoring, data adjustment or indexing procedures used. (Where the Research Organization uses proprietary techniques, these should be described in general and the Research Organization should be prepared to provide technical information on demand from qualified and technically competent persons who have agreed to honor the confidentiality of such information).

5. Estimates of the sampling error of the data should be shown when appropriate, but when shown they should include reference to other possible sources of error so that a misleading impression of accuracy or precision is not conveyed.

I look forward to being able to review the complete 2008 survey - and being able to see how the trends have evolved over the last year - and how we might learn from them to better our ability to respond to the needs of the consumers of EDD services.

Please do not forget that what we have posted to date is called, for good reason, a "sneak preview."

It was never meant to be a comprehensive description of the process through which we structure our survey, gather information, evaluate that information and report on that evaluation. Nor was it meant to be a full report of the results of our survey.

We are in the final steps of preparing the full report, currently 349 pages long. In addition to the report, we include hundreds of pages of appendices. The appendices contain, among other things, the three spreadsheets we use as survey instruments; each of those would be over 200 pages long if printed.

The full report contains most, if not all, of the explanations that the authors of the comments have suggested we provide. We will be posting substantial portions of those explanations once they are ready for publication. We will not, however, be posting the full report itself.

We sell the full report. The cost is $5,000. Some have criticized us for asking people to pay for the full report. Once they understand the effort involved, however, they usually appreciate why we charge for the report. The overall process now takes nearly a year and requires hundreds of hours of our time.

