
Ben Hancock

Ben Hancock is the Data Editor for ALM Media and Law.com. Based in San Francisco, he leads a newsroom initiative to produce insightful, data-driven journalism. Ben can be reached at bhancock@alm.com.


[Photo: European Commission]

SAN FRANCISCO - The European Commission is ramping up pressure on tech companies to more aggressively use automated filtering to scrub "illegal" content from the internet, a move that is drawing criticism from some lawyers and free speech activists in Silicon Valley.

In a communication issued Sept. 28, titled "Tackling Illegal Content Online" (https://ec.europa.eu/digital-single-market/en/news/communication-tackling-illegal-content-online-towards-enhanced-responsibility-online-platforms), the commission said it "strongly encourages online platforms to use voluntary, proactive measures" to pull down illegal content and to pour more money into "automatic detection technologies." Though the document is not a binding regulation or legislative proposal, the commission makes clear that it will monitor the tech industry's response to its call for action and may take further steps, including possible legislative measures, by May 2018.

"Lawyers should be emphatically paying attention," said Andrew Bridges, who represents tech firms in copyright disputes at Fenwick & West. "I think that any company that provides any kind of platform these days needs to be absolutely on top of this stuff."

Bridges and digital rights advocates argue that implementing the commission's proposal would be too costly for tech companies, especially smaller startups, and would chill free expression without effectively fixing the problems the EU is targeting.

The push by the EU seems to be part of a larger trend of placing more responsibility on online platforms, and not only in Europe. The U.S. Senate has also proposed carving out claims relating to sex trafficking from Section 230, which generally shields online intermediaries from liability over the content they host (http://www.therecorder.com/id=1202794633652/Will-New-Senate-Bill-Really-Break-the-Internet).

The focus of the EU communication is largely on hate speech and online material that incites terrorist violence, but it also explicitly references applying filtering technologies to material that infringes intellectual property rights, like pirated movies and music. European cities have been hit by a wave of terrorist violence in recent months, most recently in the U.K. and Spain. The release of the document by the commission, the EU's executive arm, comes after the heads of EU member state governments in late June adopted a statement saying they expect the industry to develop "new technology and tools to improve the automatic detection and removal of content that incites terrorist acts."

But Daphne Keller, a former senior lawyer at Google who is now the director of intermediary liability at Stanford's Center for Internet and Society, warns that the commission's proposal places too much confidence in the ability of technology to know what is "illegal." "The communication buys in wholeheartedly to the idea that expression can and should be policed by algorithms," Keller wrote in a blog post (http://cyberlaw.stanford.edu/blog/2017/10/problems-filters-european-commissions-platforms-proposal).
"The Commission's faith in machines or algorithms as arbiters of fundamental rights is not shared by technical experts."

Pointing to a March 2017 paper on the limits of online filtering (http://www.engine.is/the-limits-of-filtering), co-authored by experts from Princeton's Computer Science Department and the advocacy group Engine, Keller added: "In principle, filters are supposed to detect when one piece of content (an image or a song, for example) is a duplicate of another. In practice, they sometimes can't even do that."

The commission doesn't call out any companies by name, but it describes the online platforms it has in mind as "search engines, social networks, micro-blogging sites, or video-sharing platforms." Its proposal almost surely targets Google, Facebook, Twitter and YouTube. Representatives for Google and Facebook declined to comment directly on the commission's communication and instead pointed to public posts and comments previously made by company officials about fighting terrorism. Twitter did not respond to a request for comment.

To some degree, it appears that at least the major tech companies are already trying to respond to the call to filter the material they host more actively. Google General Counsel Kent Walker, in a speech to the UN on Sept. 20 (https://www.blog.google/topics/public-policy/working-together-combat-terrorists-online/), underscored the large volumes of footage uploaded to YouTube every hour and described efforts to pull down extremist videos more quickly, saying 75 percent of the videos removed in recent months "were found using technology before they received a single human flag."

Jeremy Malcolm, an attorney and senior global policy analyst at the Electronic Frontier Foundation, said the problem with a fully automatic filter is that determining whether online content is illegal often depends on context; the one general exception, he noted, is child pornography. Malcolm gave the example of scholars posting terrorist videos online so that other academics can analyze them: a filter would have a difficult time distinguishing that scholarly use from the propaganda itself. "We normally recommend that there should be a court order to take something down," Malcolm said, "and it should definitely not be an automated process doing that."

Keller worries especially that it would be too difficult to restore legitimate content once it's been taken down, and she cites data on how the counter-notice system has performed under the U.S. Digital Millennium Copyright Act. "A key takeaway is that while improper or questionable [takedown] notices are common (one older study found 31 percent questionable claims, another found 47 percent), reported rates of counter-notice were typically below 1 percent," she wrote (http://cyberlaw.stanford.edu/blog/2017/10/counter-notice-does-not-fix-over-removal-online-speech). "That's over 30 legally dubious notices for every one counter-notice."
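
To make the duplicate-detection point concrete, here is a minimal, purely illustrative sketch in Python; nothing in it is drawn from the commission's communication or from any platform's actual filtering system. It shows why the simplest kind of filter, exact-hash matching, fails on even trivially altered re-uploads.

```python
# Illustrative sketch only (not from the commission's communication or any
# platform named above): why exact-hash matching, the simplest form of
# "duplicate detection," misses trivially altered re-uploads.
import hashlib

def exact_fingerprint(data: bytes) -> str:
    """Exact-match fingerprint: it changes completely if any single byte changes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-in for the bytes of a previously flagged video or image.
original = b"bytes of a previously flagged upload"
# A re-encoded, cropped, or watermarked copy can differ by as little as one byte.
altered = original[:-1] + b"X"

# The altered copy no longer matches the known fingerprint, so a naive filter misses it.
print(exact_fingerprint(original) == exact_fingerprint(altered))  # prints: False
```

Production systems typically rely instead on perceptual or "robust" hashes that tolerate small changes, but matching then becomes approximate, which is where the real-world failures Keller describes come in.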


