
Not Just Another Article on eDiscovery Review

I, and my fellow electronic discovery veterans, have authored thousands of prescriptive articles offering sage advice on how best to improve and conduct an e-discovery review. Our industry often talks about "best practices" as though they are non-intuitive, or otherwise so unique that only the truly gifted and inspired can attain such vaunted status. In truth, I believe the best e-discovery review practices are better characterized as the application of real-life lessons. They are not complicated, and focusing on a few of these learned lessons with a thoughtful, deliberate approach will achieve a truly effective electronic discovery review.


The absolute number one lesson is to plan. I would advocate that in nearly every project, e-discovery or otherwise, doubling (yes, doubling) the planning effort results in substantially greater returns than the additional investment. Take, for example, a large e-discovery effort involving 100 custodians, which not uncommonly incurs more than $5 million in review and e-discovery costs.

From my experience, teams for e-discovery projects of this size spend in the neighborhood of 100 hours in the planning phase. At a billing rate of $500 per hour, that's $50,000 in planning costs, which, astoundingly, is less than 1% of the project budget. When compared with other industries, where 10% is often the norm, this is a miserably small investment for such a critical activity. It is even more astonishing when you consider the level of coordination needed in large-scale discovery involving law firms, clients, service providers and others, all the while synthesizing legal and technical requirements with a kitchen full of cooks.
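The arithmetic above is easy to verify. A minimal sketch, using the illustrative figures from this article (a $5 million project budget, 100 planning hours, a $500 hourly rate):

```python
# Figures taken from the article; all are illustrative, not benchmarks.
budget = 5_000_000      # total review and e-discovery budget, in dollars
planning_hours = 100    # typical planning effort observed by the author
rate = 500              # blended billing rate per hour

planning_cost = planning_hours * rate        # dollars spent on planning
share = planning_cost / budget * 100         # planning as % of budget

# Doubling the planning effort, as the article advocates, is still modest:
doubled_share = 2 * planning_cost / budget * 100

print(f"Planning: ${planning_cost:,} = {share:.1f}% of budget")
print(f"Doubled:  ${2 * planning_cost:,} = {doubled_share:.1f}% of budget")
```

Even doubled, planning remains a small fraction of the total spend, which is the article's point.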

So how do we plan to plan?

Lesson Number One

First, utilize staff and service providers who are trained, certified project managers and who understand all of the nuances of a complex e-discovery review. Second, start early; project planning often gets shortchanged because there is a rush to start. Lastly, formally establish a reasonable project management budget and communicate the plan to all parties involved, so that realistic expectations can be set up front.

Lesson Number Two

Determine objectives and think SMART: Specific, Measurable, Actionable, Relevant, and Timely. Every project starts with an objective, and the more clearly defined that objective, the greater the likelihood the project will be successful. How many e-discovery reviews have started with no known completion date, and yet still wound up late? Defining a project's objectives and goals at the outset, even if somewhat arbitrary, is critical for setting appropriate expectations and ultimately delivering a successful review. SMART is a wonderfully useful acronym that every objective should satisfy.

Lesson Number Three

Determine objectives and think SMART. Wait, isn't this a repeat? Why yes, it is. I can't stress enough how simple yet important this is. Take, for example, an objective defining review accuracy (which, by the way, most e-discovery reviews don't formally consider but should).

The theoretical goal of any review is to be 100% accurate in determining responsiveness and privilege calls. Most of us know that attorneys are human beings and prone to the occasional error. So what is a reasonable yet acceptable accuracy rate? Is it 99%, or one error out of every 100; 99.99%, or one error out of every 10,000; or will 90%, one error out of every 10, be acceptable? And what are we measuring? Must every redaction, responsiveness call, and issue classification be correct, or are privilege claims the primary focus? I can tell you that a review striving for 99.99% accuracy looks (and costs) much different than a review targeting 90% accuracy. As you think about this, be SMART.
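The gulf between those accuracy targets is easier to feel in absolute numbers. A quick illustration, assuming a hypothetical review of 100,000 documents (the corpus size is my own example, not from the article):

```python
# Expected error counts at the accuracy rates discussed above,
# for a hypothetical 100,000-document review. Illustrative only.
documents = 100_000

for accuracy in (0.90, 0.99, 0.9999):
    expected_errors = documents * (1 - accuracy)
    print(f"{accuracy:.2%} accuracy: roughly {expected_errors:,.0f} errors")
```

Ten thousand miscoded documents versus ten is not a difference of degree; it implies entirely different staffing, sampling, and quality-control designs.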

Lesson Number Four

Detailed, documented instructions for the review team are critical. By the time you get to checking for quality and consistency at the end of the review, it is too late; the project has to be approached from the beginning with quality objectives in mind. For example, if the objective is to reach 99.99% accuracy on responsiveness calls, is it reasonable to assume that two different people will review 1,000 documents and agree on every document call except one? At a minimum, providing appropriate training and documentation will ensure that the review team has the basic foundation to make consistent document calls.
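The two-reviewer question above can be put to a back-of-envelope test. This is a sketch under simplifying assumptions of my own (independent reviewers, binary responsive/non-responsive calls, identical per-call accuracy), not a statistical model from the article:

```python
# Assumption: each reviewer makes an independent binary call,
# correct with probability `accuracy`. Two reviewers agree on a
# document when both are right, or both are wrong (a shared error
# flips the call the same way for each of them).
def agreement_probability(accuracy: float, documents: int) -> float:
    per_doc_agreement = accuracy ** 2 + (1 - accuracy) ** 2
    return per_doc_agreement ** documents

# Even at 99.99% individual accuracy, perfect agreement across
# 1,000 documents is far from guaranteed:
p = agreement_probability(0.9999, 1_000)
print(f"Chance of zero disagreements on 1,000 documents: {p:.1%}")
```

Under these assumptions the chance of complete agreement comes out around four in five, which suggests that even a highly accurate team will produce disagreements that training and documented instructions must anticipate.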

Lesson Number Five

Review environments vary. An under-appreciated factor in effective reviews is the computing environment. For example, the Internet traffic generated by a large review team can equal, if not exceed, the traffic generated by the typical browsing activities of an entire firm or legal department. Will the IT infrastructure support it? What are the contingencies if an Internet disruption occurs? If an in-house application is being used, will it support a large number of concurrent users?

Review environment requirements should not only address technology, but also people. Happy, enthusiastic people do better work. In all of the last-minute madness, don't forget to take care of the people who can fundamentally make or break your e-discovery review project.


Planning, SMART objectives, documented review instructions, and a healthy, positive review environment are the basic building blocks of effective e-discovery review. These practical, hard-earned lessons are not complicated and are easy to incorporate in your next review.

Courtesy of Allen L. Gurney of Fios, Inc.
