
For some lawyers, artificial intelligence boils down to fears that robots will take over their jobs. But AI offers opportunities for firms that are proactive about incorporating the technology into the services they offer clients. And those who ignore it risk getting outpaced by the competition.

That was the message of the College of Law Practice Management’s conference this year on the future of law. About 125 lawyers and legal operations professionals converged on Atlanta for the annual confab the last week in October, hosted by Georgia State University College of Law. This year’s theme: “Running With the Machines: Artificial Intelligence in Law Practice.”

Here are six common questions that participants addressed:

Artificial intelligence has hit the legal industry—but what the heck is it?

Broadly speaking, AI in law refers to algorithms that use computational power to review (often massive) data sets of legal documents and identify patterns to augment traditionally human tasks like problem-solving and learning. The main applications that participants identified ranged from uses like legal research and e-discovery to newer applications such as contract review, due diligence and predictive analytics—with automated contract generation also gaining traction.

Why can’t I just ignore it?

In a flat market for legal services—where demand, as measured in lawyer hours, has not increased since 2009—AI-enabled legal services are a way to deliver more value to clients, said Chris Boyd, the head of knowledge management, attorney recruitment and professional development at Wilson Sonsini Goodrich & Rosati in Palo Alto, California.

“We need to do something different in how we provide value to clients,” Boyd said. AI offers lower costs and greater accuracy for legal tasks such as document review, he said, and potential increases in predictability and better litigation outcomes using predictive analytics. AI might, for example, help decide whether a client should settle a case or fight it, based on an analysis of a judge’s behavior faced with similar fact patterns.

Is my job going to be taken by a computer?

The short answer is no. What AI is very good at right now is classifying documents—for instance, deciding which documents are responsive in discovery—but it still can’t do strategy, said Michael Mills, co-founder and chief strategy officer of Neota Logic, which offers customizable AI software to firms and law departments so non-programmers can build user-specific applications.

The consensus from Mills and other participants is that AI applications free up lawyers’ time for better uses. “No one went to law school to do hours of legal discovery or due diligence,” Boyd of Wilson Sonsini said. “Tech-deal lawyers hate due diligence. They want to spend their time meeting with and advising cool startups.”

Boyd noted that first-year associates cost his firm $180,000 a year plus bonuses. Often their realization rate is low, since they can’t be used for many tasks. AI is a way to increase realization, he said. He added that junior associates are “very concerned” about AI, and a first-year recently asked him if robots would take their jobs. His response: “We don’t have you here to do due diligence. We want you to use your brains and talents to help clients reach their goals in interesting ways.”

What does AI look like?

Here are some real-world examples:

Due diligence. Canadian firm Gowling uses AI for contract review and due diligence, said Rick Kathuria, the firm’s director of legal project management and legal logistics. He addressed the common misconception that the software—they use Kira Systems—looks for key words. Rather, it’s looking for patterns, he said: common provisions and clauses in contracts, such as a “change of control” clause.

“You have to train it,” Kathuria added, which can take some time. “It’s a baby—you have to teach it the right things.” For example, he said, a client needed to analyze thousands of contracts because its service people had drawn up all sorts of different service agreements. The trainers told the software to look for provisions such as pricing. On the first review, it was 80 percent accurate in finding responsive documents, but the team wanted 95 percent accuracy, which meant more careful training.
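The training Kathuria describes maps onto standard supervised text classification. As a rough illustration only—the snippet below is a toy naive Bayes classifier in plain Python, not Kira Systems’ actual approach, and the labels and clause texts are invented—the idea is to learn word patterns from labeled example clauses and then score new clauses against each label:

```python
from collections import Counter
import math

def train(examples):
    """examples: list of (clause_text, label). Count words per label."""
    counts = {"pricing": Counter(), "other": Counter()}
    totals = Counter()
    for text, label in examples:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Pick the label with the higher smoothed log-likelihood."""
    vocab = set(counts["pricing"]) | set(counts["other"])
    best, best_score = None, -math.inf
    for label in counts:
        score = 0.0
        for word in text.lower().split():
            # Laplace smoothing so unseen words don't zero out a label
            p = (counts[label][word] + 1) / (totals[label] + len(vocab) + 1)
            score += math.log(p)
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical labeled clauses -- the "teaching" step Kathuria describes.
training = [
    ("fees shall be invoiced monthly at the rates set forth in schedule a", "pricing"),
    ("the unit price per license is subject to annual adjustment", "pricing"),
    ("either party may terminate this agreement upon thirty days notice", "other"),
    ("this agreement is governed by the laws of the province of ontario", "other"),
]
counts, totals = train(training)
print(classify("charges are billed at the hourly rates in schedule a", counts, totals))
# prints: pricing
```

Adding more labeled examples is what pushes accuracy from, say, 80 percent toward 95: each new clause sharpens the word-pattern statistics the classifier relies on.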

The software does not replace humans, he emphasized. “It pulls up a set of documents, but you have to review it and decide if the terms are a problem. It can’t make risk assessments—yet.”

E-discovery. John Tredennick, the CEO of Denver-based Catalyst, said his company’s e-discovery software similarly requires training so it can find the relevant documents in a database numbering in the thousands or millions. He compared it to Pandora or Spotify for music, where the algorithm learns a user’s likes and dislikes from the songs they select or turn down and then continually refines itself to suggest music that’s responsive—aka “music you like.”

For example, he said, one bank client had two million documents to review at a cost of $2 per document. Rather than spend $4 million, it opted for “predictive discovery.” Tredennick said his software produced an initial batch of 6,000 documents that were 98 percent responsive. The rate dropped sharply on the next pass, indicating it had successfully mined the data to a reasonable degree.
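Tredennick’s batch-by-batch account can be sketched as a simple review loop: rank documents by a model’s responsiveness score, have reviewers work through the top batch, and stop once the responsive rate in a batch falls off. Everything below is invented for illustration—the scores are synthetic stand-ins for a trained classifier’s output, and the batch size and cutoff are arbitrary—but it shows the shape of the workflow:

```python
# Build 2,000 synthetic (score, is_responsive) pairs: 80 of the top 100
# scores are responsive, then 15 of the next 100, then 5, then none --
# mimicking the sharp drop-off after a highly responsive first batch.
responsive_per_batch = [80, 15, 5] + [0] * 17
docs = []
score = 2000
for hits_in_batch in responsive_per_batch:
    for j in range(100):
        docs.append((score, j < hits_in_batch))
        score -= 1

docs.sort(reverse=True)  # highest model score first
BATCH, CUTOFF = 100, 0.20
reviewed = found = 0
for start in range(0, len(docs), BATCH):
    batch = docs[start:start + BATCH]
    hits = sum(1 for _, responsive in batch if responsive)
    reviewed += len(batch)
    found += hits
    if hits / len(batch) < CUTOFF:
        break  # responsive rate has fallen off; stop reviewing

print(f"reviewed {reviewed} of {len(docs)}, found {found} responsive")
# prints: reviewed 200 of 2000, found 95 responsive
```

The savings come from the stopping rule: at $2 per document, reviewing only the high-scoring batches instead of the whole collection is what turns a $4 million linear review into a much smaller bill.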

Tredennick noted that the courts have now approved discovery production that returns 70 to 75 percent of relevant documents; the cost increases geometrically for the last fraction, which would require ever-increasing refinement of the search terms.

Predictive analytics. Kyle Doviken of Menlo Park, California-based Lex Machina covered predictive analytics, saying his company started out by comparing judges’ speed and decisions in patent cases and then expanded to other judge comparisons, such as how often a family law judge awards a father custody. Now it is assessing how successful experts are in various types of litigation. Predictive analytics could soon be used to vet lateral hires, he said.

How much does it cost?

Pricing varies for AI applications, and there is room for negotiation. Catalyst charges for its e-discovery software by the document, not the user, while Lex Machina’s predictive analytics service is subscription-based. Contract review software may bill by the document, by the user or through a combination of both.

Does AI mean the end of the billable hour?

It might—in some cases. Broadly speaking, AI applications let firms perform initial reviews of large pools of legal documents faster and more cheaply than paying lawyers to do the work by the hour. “Efficiency is the enemy of the billable hour,” said Andy Daws, chief customer officer for Kim Technologies, which offers legal project management and other applications.

But Gowling’s Kathuria said this can be profitable for firms that are able to quote a client a fixed rate and then figure out how to perform a task for less.

There are nuances as well. Doviken gave the example of a lawyer using Lex Machina software to assess a client’s chances in filing a motion to dismiss. An analysis of similar cases predicted a 98 percent chance of losing, he said, so the lawyer advised the client not to file. It cost the lawyer’s firm $10,000 to $15,000 in billable hours—but gained good will from the client.

AI is here, but it can still be “horribly messy,” said Liam Brown, the founder of Elevate Services, which he described as an Accenture for law firms and law departments, whose clients include many name-brand Fortune 500 companies.

In many cases, “the tools are rudimentary and hard to use,” Brown said. “Now, in 2017, things are finally getting interesting.” AI is starting to erode law firms’ power to set prices, Brown said, but it can also be a profit center. “If your firm is relying on junior lawyers to generate significant amounts of profit, be careful about the future,” he said.