Whether labeled technology-assisted review (TAR), predictive coding, or any of the other catchphrases floating around in the e-discovery ether, the concept of using computer power to enhance document review efficiency while lowering costs has firmly entrenched itself in the litigation mainstream. You cannot escape reading or hearing about it, even if you wanted to. Webinars, email blasts, and a myriad of newspaper and magazine articles are all broadcasting a similar message: if you are not using some magical “black box” solution, you will be left hopelessly by the roadside as the “lawyers who get it” leave you in the dust.

But for those of you standing on the side of the road, don’t feel left out by all the recent excitement. While there is little doubt that TAR has a more prominent role in document review today than it did even one or two years ago, the use of technology to enhance document review has been developing for the past 15 years. The reality is that TAR is not all that new, and the fundamentals of a successful document review remain the same. TAR will never replace the human intelligence, judgment, and experience that are required to design sound and defensible workflow practices.

There is no perfect document review methodology. Every case requires individual analysis and a tailored approach that takes into account multiple factors, including the type of case, a cost-benefit (proportionality) analysis, production deadlines, prior agreements and cooperation between counsel, and the complexity and richness of the data set. In deciding which approach is best for a particular matter, however, the following tenets almost always apply:

• The proliferation of data continues to skyrocket, and much of it is potentially discoverable in litigation or investigations.

• The greatest document review cost is the attorney review process.

• Reducing the quantity of documents requiring attorney review will reduce the overall cost of the review.

• Litigants are attempting to gain greater control of their information.

• TAR, when used in conjunction with a well-developed and defensible workflow, can often help reduce attorney review for a significant percentage of a document collection.

• The chosen methodology must be tested and sampled to ensure that it is reliable and returns acceptable levels of precision and recall (a brief illustrative calculation follows this list).

• A primary concern of document review is the inadvertent or unknowing production of privileged or highly sensitive documents and the potential ramifications of doing so.

• There is no “doc review in a box” solution. No predictive coding technology exists today that can be implemented without substantial attorney review, sampling, and quality control to ensure defensibility.
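To make the precision-and-recall tenet concrete, the following is a minimal, hypothetical sketch in Python of how an attorney-coded validation sample might be scored. The counts, function names, and 2,000-document sample size are invented for illustration and do not describe any particular review tool or protocol.

```python
import math

def precision_recall(true_pos, false_pos, false_neg):
    """Compute precision and recall from an attorney-coded validation sample."""
    precision = true_pos / (true_pos + false_pos)  # of the documents the tool flagged, the share truly responsive
    recall = true_pos / (true_pos + false_neg)     # of the truly responsive documents, the share the tool caught
    return precision, recall

def margin_of_error(proportion, sample_size, z=1.96):
    """Rough 95 percent margin of error for a proportion estimated from a simple random sample."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# Hypothetical counts from a randomly drawn sample of 2,000 documents coded by attorneys.
tp, fp, fn = 310, 90, 60
precision, recall = precision_recall(tp, fp, fn)
print(f"precision ≈ {precision:.2f}, recall ≈ {recall:.2f}")
print(f"recall margin of error ≈ ±{margin_of_error(recall, tp + fn):.2f}")
```

Whether numbers like these are “acceptable” is a case-specific judgment that turns on proportionality, the nature of the claims, and any protocol negotiated with opposing counsel or the court.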

Several courts have issued decisions this year addressing the use of predictive coding technologies, and countless bloggers and other e-discovery pundits have seized the opportunity to characterize those decisions as approving or even ordering the use of predictive coding. However, concluding from those orders that litigants should feel free to jump blindly onto the predictive coding bandwagon would be a serious mistake. Predictive coding, like the technologies that came before it, can be a dangerous trap for the unwary, and those intending to use it for their document reviews must tread carefully.

Decisions Provide Some Guidance

Three recent opinions have finally recognized that there is a logical argument for technology enhancement in the workflow of legal document review. In two federal cases, Da Silva Moore v. Publicis Groupe and Kleen Products v. Packaging Corporation of America,1 and one state court case, Global Aerospace v. Landow Aviation,2 the courts took steps toward allowing variants of TAR that limit the percentage of documents to be manually reviewed through statistical sampling methodologies.

In these decisions, the courts did not approve of or direct the use of specific technologies. Rather, the courts said they would “allow” the use of these technologies within acceptable protocols, workflows, and information-sharing arrangements. While the decisions do not endorse any company’s software or workflow, they are nonetheless a sign of progress in the quest to bring technology enhancements into the review process.

Analyzing the Judicial Shift

So what has really changed? The realities of economics, technological advances, and the pursuit of justice have intersected at a tipping point in our society. The lingering recessionary economy has forced litigants to focus more on the cost-benefit analysis of their potential cases. When the exploding volume of electronically stored information (ESI) is tossed into the mix, it becomes clear that the status quo of processing and reviewing a substantial percentage of the data collection is no longer sustainable.

Over the past few years, the judiciary has recognized that cases should be resolved on the merits of the legal issues, not controlled by the expense of e-discovery. The belated but practical acceptance of TAR by the judiciary was inevitable as the reality set in that document review consumes tremendous resources and that less expensive alternatives are needed if the legal system in the United States is to remain open to all.

What is lost in the TAR clamor, however, is something akin to Glinda the Good Witch’s revelation to Dorothy that she had always had the power to get home. All Dorothy had to do was implement the correct protocols. In e-discovery, the correct protocols are efficient workflows and reliable technologies combined with human expertise. The whole process, which has been in place for many years, may sound simple, but multiple phases are needed to reach a defensible result in discovery.

The vulnerabilities of the e-discovery process, including processes that use TAR methodologies, are most easily understood by examining the end result and working backwards. Why were responsive documents omitted? How did a privileged document make it through? A litigation or e-discovery specialist can search and sample a production set all they like, but if the data were improperly identified, inadequately preserved, indefensibly collected, or incompletely processed, the results will be invalid. Garbage in equals garbage out, and any flaws in the process will be reflected, and possibly magnified, in the end result. This is especially true of today’s more complicated data, such as embedded objects, audio and video files, metadata, and social media, to name just a few. If data cannot be identified, preserved, collected, processed, or searched, it cannot be seeded, correlated, predicted, sampled, or reviewed. Any failure along the way that is not caught through robust validation will undermine the reliability of the entire process.
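As one illustration of the “robust validation” described above, here is a hedged Python sketch of a common quality-control step: drawing a random sample from the documents the workflow marked non-responsive so attorneys can check whether responsive or privileged material slipped through. The document identifiers, sample size, and coding results are hypothetical, and the approach is a generic sketch rather than any vendor’s method.

```python
import random

def draw_qc_sample(non_responsive_ids, sample_size, seed=20120601):
    """Draw a reproducible simple random sample from the documents marked non-responsive."""
    rng = random.Random(seed)
    return rng.sample(non_responsive_ids, min(sample_size, len(non_responsive_ids)))

def elusion_rate(attorney_calls):
    """Share of the QC sample that attorneys coded responsive despite the tool's non-responsive call."""
    return sum(attorney_calls) / len(attorney_calls)

# Hypothetical culled set and QC pass: 1,500 randomly selected documents are re-reviewed by attorneys.
culled_set = [f"DOC-{i:07d}" for i in range(850_000)]
qc_sample = draw_qc_sample(culled_set, sample_size=1_500)
attorney_calls = [0] * 1_488 + [1] * 12   # attorneys found 12 responsive documents in the sample
print(f"documents in QC sample: {len(qc_sample):,}")
print(f"elusion rate ≈ {elusion_rate(attorney_calls):.2%}")  # a high rate signals the workflow needs rework
```

Fixing the random seed and logging the sample may seem like small choices, but they make the validation step repeatable and auditable, which goes directly to the defensibility concerns that run through this article.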

As stated earlier, there is no “discovery in a box,” no automated discovery, and definitely no magic bullet. The entire process is too complex, and the creation, transmission, and storage of data are continuously evolving. The key to success is a strong combination of highly experienced people, leading-edge technology, and a highly organized, repeatable, and well-documented workflow that is under the litigant’s own control. And while clients understandably are looking to save on costs, it is the litigator’s role to educate them on why the inexpensive route can become so expensive when a production goes awry.

There is no doubt that new methodologies for identifying, culling, searching, categorizing, and reviewing documents will continue to gain acceptance in the legal community. They are inevitable in a litigation arena where expanding data volumes and formats are in a constant tug-of-war with the desire to decrease document review costs. However, whatever latest-and-greatest predictive coding solution or “Content Based Advanced Analytics” tool we see next, we cannot lose sight of the human judgment and experience that are required to design effective workflows in combination with these tools.

So if you want to sleep well while 15 servers are coding your next 18 million-page document review, make sure you have intelligent workflow processes; efficient, auditable, and defensible software; and a high level of human expertise. You will find that combination to be a key to e-discovery success.

Howard J. Reissner is chief executive officer at Planet Data, and Ian K. Hochman is special counsel at Willkie Farr & Gallagher.

Endnotes:

1. Da Silva Moore v. Publicis Groupe SA, No. 11 Civ. 1279 (ALC) (AJP), 2012 U.S. Dist. LEXIS 23350 (S.D.N.Y. Feb. 24, 2012); Kleen Products v. Packaging Corp. of America, No. 10 Civ. 5711 (N.D. Ill. 2011).

2. Global Aerospace v. Landow Aviation, No. CL 61040 (Va. Cir. Ct. April 23, 2012).