Phase I Results
EDI/Oracle has released four different performance metrics that compare TAR performance to the original review: cost and three F1 rankings for responsiveness, privilege, and hotness, Oot said. "F1 is a measurement of how an information retrieval system performs. It should be noted that the results are merely ranking the study participants. Actual F1 values will be released in Phase II out of concern that human re-review teams in Phase II could modify their behavior to achieve better results."
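For context, F1 is the harmonic mean of precision (the share of documents a review flags that are actually relevant) and recall (the share of relevant documents the review finds). A minimal worked example, using illustrative numbers not drawn from the study:

F1 = 2 × (precision × recall) / (precision + recall)

A review achieving 0.80 precision and 0.60 recall would score F1 = 2 × (0.80 × 0.60) / (0.80 + 0.60) ≈ 0.69; a higher F1 reflects a better balance of completeness and accuracy.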
Oot identified the key takeaways from Phase I, comparing the teams' results to the actual document review:
• Technology providers using similar underlying technology but different human resources performed in both the top and bottom tiers of all categories. Conclusion: software is only as good as its operators; human contribution is the most significant element.
• The top quartile for responsiveness includes four different technologies.
• Spending more money does not correlate with greater quality; inexpensive service providers performed very well.
• Per-document prices ranged from $0.03 to $0.89, all well below the market standard of $1 per document for human-based contract attorney review.
• Tech Team 19's human input was a single senior-level attorney who spent 64.5 hours on review and analysis. Tech Team 19 performed best at finding both responsive documents and privileged documents. If the gold standard is to replicate the mind of the senior attorney who certifies under Federal Rule of Civil Procedure 26(g) that he or she has conducted a reasonable inquiry, then it appears that both the original review and Tech Team 19 used a favorable methodology. Notably, Team 15 used technology similar to Tech Team 19's, with multiple contract reviewers, and did not achieve similar results.
• Tech Team 8, while in the middle tier on price, married a U.S.-based team with an overseas team to achieve the second-best performance on responsiveness and performed fairly well on privilege. This may dispel the myth that overseas teams perform less effectively than U.S.-based document review teams, especially given that other technology providers performed poorly with U.S.-based teams.
• Tech 14 performed well at locating both responsive and hot documents and was a top-tier performer on price. The invoices did not make clear how much effort went into privilege review, but Tech 14 finished in the bottom half of participants on that metric.
• Tech 5 also engaged experienced attorneys and was the second most effective at identifying privileged information.