A few years ago, The New York Times Magazine asked constitutional scholar and legal commentator Jeffrey Rosen to imagine the future of technology and law. His subsequent reporting produced a sense of how inadequate the U.S. Constitution is to deal with some of the challenges posed by technological advances in fields like genetic selection and surveillance. It also revealed the profound role of corporate legal departments—sometimes ahead of courts and judges—in making determinations about privacy and free speech.

Those ideas are at the fore of Constitution 3.0: Freedom and Technological Change, a new compilation of essays edited by Rosen and Benjamin Wittes and published by The Brookings Institution. Rosen, a law professor at George Washington University and legal affairs editor at The New Republic, spoke with CorpCounsel.com about the book, legal battles to come, and how in-house counsel can have more power than the Supreme Court.

Below is part one of an edited version of that conversation (see part two here).

CorpCounsel: Given how much you had written on this topic, what surprised you about the essays in the book?

Jeffrey Rosen: I was very struck when I read the finished collection, both by the creativity of the thinkers, and also by how much people disagreed about the appropriate regulatory, legal, and technological responses. Some people put a lot of emphasis on judicial doctrine as the best way to protect liberty. Others emphasized administrative regulations and statutes. Still others were more interested in technological changes.

CC: Where do corporate lawyers fit into this equation?

JR: I became very interested in the role of corporate lawyers in protecting liberty when I was sent by the Times Magazine to interview Nicole Wong, who was then the deputy general counsel at Google. I wrote a piece called ‘Google’s Gatekeepers,’ which argued that Nicole Wong and her colleagues in the deputy general counsel’s office had more power over the future of free speech and privacy than any king or president or Supreme Court justice, and I expanded on this theme in a chapter for the Constitution 3.0 book.

Nicole Wong resigned recently, but Google had entrusted her with ultimate power over content decisions on YouTube and the more than 100 country-specific search sites Google operates. That's why her colleagues jokingly referred to her as ‘The Decider,’ Kent Walker, the general counsel, told me. She was the ultimate authority on what stayed up or came down.

It’s just a dizzying range of problems she confronted. One example is the Greek football fans who posted YouTube videos saying that Kemal Atatürk, the founder of modern Turkey, was gay, which they like to do to rile up their Turkish rivals. This is illegal in Turkey, and Nicole Wong was woken up in the middle of the night and had to decide whether the videos were clearly illegal under Turkish law, in which case they would come down, or whether they might plausibly be protected as political speech, in which case she would keep them up. She ended up taking down some videos, but only in Turkey, which wasn’t enough to satisfy Turkish prosecutors. A judge then ordered Google blocked entirely in Turkey for a long time.

These are the kinds of decisions that we used to imagine governments making, and now that companies like Google and Facebook and Microsoft really determine the scope of free speech and privacy and many other values on the web, whether we like it or not, lawyers and corporate law departments are going to have to become interested in these issues.

CC: Did Wong give you an indication of any changes she made in the law department in order to accommodate those types of decisions?

JR: She had to set up a chain of command. So the first responders in making YouTube content decisions aren’t Nicole Wong, they’re a group of 22-year-olds in flip-flops and T-shirts at the YouTube headquarters near the San Francisco airport.

There are also first responders in Dublin and around Europe. They make the initial decisions based on flags placed by YouTube users suggesting that content isn’t appropriate. Then, if a decision seems hard, it gets filtered up the pipeline and eventually reaches Nicole Wong. These are really tough decisions.

CC: Have you found that these questions are more pressing abroad than in the U.S.?

JR: I think the problems are pressing both abroad and in the U.S. Abroad, of course, the consequences of making a hasty decision can be much more unfortunate. Yahoo got into trouble a few years ago when it turned over the email of a Chinese dissident who was later persecuted. But Google and Facebook and other tech companies are confronting tough choices in the U.S. all the time, too.

Recently, Sen. Joseph Lieberman, who has appointed himself as a kind of free speech prosecutor of the Senate, put pressure on Twitter to take down pro-Taliban feeds. And Twitter refused, saying that the feeds didn’t amount to criminal incitement of violence and were essentially news feeds for the Taliban.

Basically, Twitter—and to a lesser degree, Google—has embraced the U.S. free speech standard, which is the most protective in the world. It says that speech has to be protected unless it poses an imminent threat of serious lawless action. That’s a much more rigorous standard than even Europe has adopted.

And I can also think of a whole lot of areas where we’re about to see a dramatic clash between U.S. and European laws when it comes to privacy and reputation and defamation. And that’s going to make the job of corporate counsel even more challenging.

See also: “Privacy, Technology, and Preparing Corporate Lawyers for the Future,” CorpCounsel, December 2011.