Hello What’s Next readers, I’m your correspondent Ben Hancock. It’s that time of year again: the buds are opening, people are mucking out their homes, and Stanford Law School is hosting its annual CodeX FutureLaw conference! It’s a day chock full of the stuff I cover here: blockchain, AI ethics, privacy — you name it.
Watch This Space: Yelp on the Defensive
Yelp has so far been able to beat back lawsuits seeking to pull down reviews by unhappy business owners. But this morning in Los Angeles, in arguments before the California Supreme Court, the platform seemed to be on its back foot.
The case, Hassell v. Bird (Yelp), stems from a negative review of a law firm called The Hassell Law Group in San Francisco. The firm secured a default judgment against a former client, Ava Bird, holding that Bird’s Yelp review was defamatory. Then the trial court issued an injunction against Yelp to pull the review down—only Yelp was never a party to the case.
Yelp has made the twin arguments that its due process rights were violated in this instance, and that Section 230 of the Communications Decency Act protects internet service providers from having to pull down content posted by their users. Section 230, as we know, is usually a sturdy defense for tech companies when it comes to liability for content.
But Yelp’s lawyer, Davis Wright Tremaine attorney Thomas Burke, faced a tough crowd this morning. The justices seemed uneasy with the idea that an injunction to remove speech adjudicated as defamatory could essentially be ignored by Yelp because of its immunities as a “publisher” under 230.
“So Section 230 is basically a license to continue to publish unlawful or defamatory content in perpetuity?” Justice Leondra Kruger asked Burke.
Justice Carol Corrigan seemed to have similar concerns. If the speech is ruled to be defamatory, “on what basis does Yelp then have grounds to say, ‘Well, OK, Ms. Bird can’t continue to say this, but we’re going to be able to say it’?” she prodded.
Burke struggled to respond to that, and instead emphasized that Yelp wanted a chance to argue over whether the speech was defamatory at all. His opposing counsel, Monique Olivier of Duckworth Peters Lebowitz Olivier, got some pointed questions from Chief Justice Tani Cantil-Sakauye about the fact that Yelp never got a chance to really argue its case.
The chief justice said that left the court to “guess” how things might have played out in assessing how to deal with Section 230’s scope. “That’s hardly a way to decide a case in front of the Supreme Court on an issue this important,” she said.
>> Takeaway: It’s hard to say whether the justices are ready to clip Section 230. But it did seem like they want to ensure that online platforms affected by a lawsuit have a chance to argue the merits prior to an order being issued against them.
IRL: Stanford’s CodeX FutureLaw Conference
This Thursday is the annual CodeX FutureLaw conference at Stanford, where attorneys, legaltech vendors, academics, startups and students all get together to talk about what’s coming next in the legal profession. Ahead of the event, I caught up with Roland Vogl, the head of Stanford’s CodeX center, to ask him about what he anticipates.
What are some of the newer themes at this year’s FutureLaw conference?
Roland Vogl (RV): Fairness, accountability and transparency (FAT) of algorithmic decision-making is a big topic this year. The first panel after the ABA president’s keynote will be on that topic. This topic, and broader questions around AI policy, will only increase in importance in the coming years.
The idea of using blockchain for legal applications seemed to just be taking off this time last year. Now you have a session called “Blockchain as legaltech.” Has the industry really figured out how blockchain is relevant yet?
RV: I think there is a recognition that the blockchain is here to stay, and many different industries—including legal (both in-house departments and firms)—are experimenting with it. I am not sure whether law firms have figured out the concrete use cases for their operations.
There are certainly different consortia in legal that are trying to guide the way towards uses of the blockchain in legal settings, like the EEA Legal Industry Group, or Integra Ledger. There are some people in government working with blockchain startups to register property transactions.
Generally speaking, I think there is still a lot of hype, and there will probably be a bit of a bubble burst in the near future. But there will be transformative companies emerging that will solve real problems using the blockchain.
What’s the mood like going into the conference this year around AI? Is there still a lot of fear about the impact it will have on legal jobs—or about how it will meet legal and ethical challenges?
RV: I think it boils down to the question of whether those AI technologies are lawyer-enhancing or lawyer-replacing technologies. Most of what I have been seeing around CodeX are lawyer-enhancing technologies.
What’s one thing you’re personally hoping to learn more about at FutureLaw?
RV: I think we have a fantastic lineup of experts as panelists and lightning round presenters. There will be so much new stuff to be learned. It’s hard to point out one thing. I think Margaret Hagan’s project on using AI to avoid legal issues will be interesting. We also have a speaker who will speak about [General Data Protection Regulation] related things. The keynotes will be great too. Of course the FAT panel, and the blockchain panel. I am excited about the entire program.
The Conversation: AI and the Law
If you’re looking for a deeper dive into what AI means for the law ahead of CodeX, make sure to listen to my latest podcast. I pull together interviews with executives at legal tech companies Luminance, LegalMation, IBM’s Cognitive Legal, and Judicata to get a real-world sense of how the technology will change how lawyers work. I also speak with Roy Strom, who writes “The Law Firm Disrupted” briefing, for additional perspective.
The takeaway? Like Roland says above, most of the technology is supposed to be lawyer enhancing; it aims to make contract review less time-consuming, to address “capability gaps” that law firms have, or to make legal research better and less tedious. The pitch is that lawyers will generally get to focus less on grunt work and do exciting problem-solving for clients. Still, most of the people I interviewed agreed that AI has serious limitations when it comes to legal work.
But Roy raises the hard question: If you start taking out all the grunt work, is there really enough of that exciting stuff to go around? In other words, is there some “vast amount of higher value work that isn’t getting done for today’s clients at today’s prices?” If not, Roy asks, will law firms lower rates for that work in order to compete—something that’s “pretty much the opposite of the way law firms price their work today?”
Listen to the full podcast here. (Quick announcement for you regular listeners: From here on out I’ll be broadcasting on Law.com’s Legal Speak podcast. I’ll be dropping in every month or so with a dispatch on the future of law. Make sure to update your subscriptions!)
On the Radar: 1 Big Thing To Know
The Microsoft Ireland Case: The Showdown that Wasn’t?
The Big Tech privacy and anti-surveillance cause celebre looks like it might end up being a big nothing burger. The U.S. Justice Department now says its fight with Microsoft at the Supreme Court is moot, my colleague Tony Mauro reports. And today, Microsoft said that it agrees.
DOJ staked out the position in a brief notifying the court that it has obtained a new search warrant against Microsoft to access emails stored at a company server in Ireland. The argument comes after Congress passed (and the president signed) a statute called the CLOUD Act that would allow law enforcement to obtain emails within a U.S. provider’s control, even if they’re stored outside the United States.
Microsoft today said it would evaluate the new warrant it has received under the CLOUD Act. In the meantime, the company says it “agrees with the Government that there is no longer a live case or controversy between the parties with respect to the question presented, which involves interpreting the prior version of the Stored Communications Act.”
Dose of Dystopia
While we’re on the subject of privacy, here’s this for your consideration. The travel ban may still be tied up in court, but the U.S. will still be doing some of that “extreme vetting” when it comes to social media accounts. The New York Times reports on a new State Department proposal:
“Last September, the Trump administration announced that applicants for immigrant visas would be asked for social media data, a plan that would affect 710,000 people or so a year. The new proposal would vastly expand that order to cover some 14 million people each year who apply for nonimmigrant visas.”
Another legal battle may well be brewing. Hina Shamsi, director of the American Civil Liberties Union’s National Security Project, told the Times the proposal “will infringe on the rights of immigrants and U.S. citizens by chilling freedom of speech and association, particularly because people will now have to wonder if what they say online will be misconstrued or misunderstood by a government official.”
That’s it for this week. Keep plugged in with What’s Next!