Hey there, What’s Next readers. This is Ian Lopez back again, and as it turns out, this time for good! Your usual esteemed host, Ben Hancock, has moved on to what’s next for him: an awesome new role as data editor. Going forward, he’ll be working on collecting and sifting through data to do big-picture stories in ALM’s newsroom (think “Big Data Ben”).

In the coming weeks I’ll be making this newsletter my own — keeping much of what Ben brought to the table and trying a few new things, too. So bear with me and feel free to chime in! Send suggestions, tips and more to ilopez@alm.com, or via Twitter at @IanMichaelLopez.


 

Watch This Space: Mueller’s Crypto Crackdown

 

By now, you’ve all surely digested the news about Special Counsel Robert Mueller’s indictment last Friday of 12 Russian intelligence officers for hacking the DNC and the Clinton campaign. But what might have gotten lost in the media frenzy over the charges, and what they mean for the Trump administration, are the indictment’s details on money laundering — and what they say about how law enforcement is tracking crypto transactions.

According to the document, the defendants conspired to launder more than $95,000 via a “web of transactions structured to capitalize on the perceived anonymity of cryptocurrencies such as bitcoin.” Allegedly, this money was used to foot the bill for the hacking infrastructure deployed against the U.S., and was laundered through means like peer-to-peer exchanges, currency exchanges, and multilayered transactions.

But for all you may have heard about bitcoin and crypto transactions being anonymous, the indictment suggests they’re anything but. It alleges that the defendants left what essentially amounts to a modern-day paper trail — albeit one linked to various fictional names and addresses, bunk email accounts, and the like. Their mistake of reusing the same fake accounts for different purposes, the indictment says, allowed investigators to draw connections.

And missteps, notes Holland & Knight’s Joe Dewey, aren’t exactly unusual. Dewey is a receiver in an SEC case in which about $20 million in cryptocurrency was stolen in a hack, and he is part of an international hunt to recover the funds. He tells me: “While virtual currency does allow you to move and store value around fairly anonymously, in some ways it makes it more difficult or raises more red flags.”

Part of the reason is the exchanges themselves. Those trying to cover their tracks often need to convert the currency at some point, and many exchanges are regulated and cooperate with law enforcement. And while some international exchanges have run afoul of the law — think Russia’s BTC-e, which the Treasury Department shut down last year — agencies abroad often undertake regulatory measures of their own.

Dewey notes there are some outlets that ease conversion efforts. ShapeShift, for example, was used by the WannaCry hackers to launder ransom payments, Forbes reported last year. However, “common missteps,” Dewey says, like those highlighted in Mueller’s indictment, often catch up with perpetrators.

“It’s fairly hard to remain anonymous. Many people make the mistake at some point, and it leads to law enforcement or others to find relationships between the accounts,” he adds.

>>Takeaway: While regulators are still trying to rein in scams, the crypto world already gives law enforcement more avenues to follow criminals than many believe.


On the Radar: 3 Things to Know

 

I, Robot? California has been leading the way on tightening data privacy in recent months. But if State Senator Robert Hertzberg has his way, online bots will also have to reveal themselves for what they really are. As The New York Times reports, Hertzberg introduced a “first of its kind in the United States” bill this year “that would compel automated social media accounts to identify themselves as bots[.]” Yet, as the Times notes, uncertainty remains over how the bill would apply to companies operating globally. It also wouldn’t require technology companies to enforce the regulation.

Data Overload. Legal research technology made quite the splash last week as Thomson Reuters and LexisNexis announced offerings that double down on analytics. Thomson Reuters is billing its “Westlaw Edge” — a new take on the Westlaw classic — as its biggest update in years, deploying machine learning to analyze state and federal dockets by motions, attorneys, judges and other attributes (my colleague Zach Warren does a deep dive here). LexisNexis’ Lexis Analytics, meanwhile, brings together analytics tech from acquired companies like Ravel Law and Lex Machina for litigation, regulatory compliance, and contracts.

“The dam is breaking, as it should.” That’s what the ACLU had to say about Colorado Rep. Mike Coffman, the first Republican to break ranks and support a move to reverse the FCC’s effort to axe net neutrality. As Motherboard reports, Coffman’s vote to undo the dismantling of net neutrality brings the number of House members supporting the initiative to 176. While 215 votes are needed for such a reversal, activists are hoping Coffman will inspire other Republicans to vote with him.


Face Off: Microsoft Wants *More* Tech Regulation

 

You read that right. ICYMI, Microsoft CLO Brad Smith late last week published a blog post asking for more, not less, government oversight of facial recognition technology. The post appears to set Microsoft apart from its tech brethren down south in Silicon Valley, citing “a bipartisan and expert commission” as “the only way” to regulate the controversial technology.

Smith fundamentally rejects the notion of technology companies self-regulating, describing the idea as “an inadequate substitute for decision making by the public and its representatives.” In his estimation, a handful of tech giants tweaking their practices wouldn’t amount to true industry self-regulation. Instead, “competitive dynamics” between international tech companies “will likely enable governments to keep purchasing and using new technology in ways the public may find unacceptable in the absence of a common regulatory framework.”

“[A] world with vigorous regulation of products that are useful but potentially troubling is better than a world devoid of legal standards,” Smith writes.

This position has more than its fair share of advocates. As The New York Times reports, privacy groups have challenged tech titans over their uses of facial recognition software. Facebook took heat in a complaint to the FTC over failing to obtain proper user permissions for its use of the technology. Meanwhile, among other controversies, the ACLU asked Amazon to stop offering its face-matching service, as did researchers in an open letter to CEO Jeff Bezos.

But is Microsoft’s proposed approach likely to have the intended effect? Theresa Payton, the first woman to serve as White House chief information officer, under George W. Bush, tells me that an open dialogue, much like the one between NATO and advocacy groups on international privacy concerns, might do more to get at the core issues in facial recognition technology, such as racial bias and privacy.

“If the regulation forces the conversation, bravo. But if the regulation actually just creates a more expensive barrier to entry and doesn’t really do a good job protecting anything, not just the security of the data but my privacy, I’m not interested,” she said.

>>Looking Ahead: This may be the start of something, but increased regulatory oversight of biometric recognition technology still seems a way off in the United States — despite the massive impact of the GDPR worldwide.


That’s it for this week! Remember: Think twice before uploading that Facebook picture!