Richard Raysman and Peter Brown

Search-engine results have become the lodestar of the Web for most users. Essentially any sort of knowledge, entertainment or other information is accessible via search-engine results. Whether the user constructs a narrowly tailored query designed to exclude inapplicable results, or a broad search designed as an introduction to a given topic, the algorithmic apparatus fueling search engines will usually produce pertinent information. Indeed, this article on search results was itself informed in part by results obtained from a search engine.

The seemingly ubiquitous presence of search-engine results within the milieu of Internet browsing has begun to implicate legal questions, including constitutional ones. Whether search-engine results constitute protected speech under the First Amendment has become the foremost area of debate, both within the courts and legal academia. This question implicates axiomatic questions about the Free Speech Clause. Can “bits” of information generated by an algorithm—an algorithm reflective of the subjective judgments of its designers—constitute speech warranting the highest level of First Amendment protection? Is a search engine and the results it produces a “publisher” or “speaker” of content, and thus entitled to this protection, or is a search engine a mere “conduit” or “facilitator” of the speech of others, a role to which heightened free speech protections do not apply?

This article will discuss these questions and the cases on point, including a recent decision in the Southern District of New York holding that a search engine had the right to exclude certain types of content from its search results, two other decisions holding that search-engine results are protected speech, and the thought-provoking debate among legal academics on the topic.

‘Baidu’

Baidu.com (Baidu) is a public Chinese Web services company headquartered in Beijing. Its search engine, a primary component of the company, provides an index of over 740 million Web pages and 80 million images. Additionally, the company purports to be the third largest search engine service provider in the world and the largest in China, with an estimated 70 percent of the Chinese-language market. According to one online newspaper that covers China, Baidu has a “long history of being the most proactive and restrictive online censor in the search arena.” This backdrop of censorship, particularly in the context of pro-democracy political movements in China, precipitated the case of Jian Zhang v. Baidu.com, — F. Supp. 2d —, 2014 WL 1282730 (S.D.N.Y. 2014).

Baidu involved a group of New York residents (plaintiffs) who advocate for increased democracy in China. The plaintiffs alleged that Baidu conspired to block political speech advocating democracy from appearing in its search results in the United States. Specifically, plaintiffs alleged that Baidu, at the behest of the Chinese government, censored and/or blocked any article, publication, video or audio in any format that dealt with the “Democracy Movement” in China. The same content would appear in the results of other search engines such as Bing or Google.

Plaintiffs proffered eight claims, including conspiracy to violate their civil rights under 42 U.S.C. §1985 and violation of civil rights on the basis of race pursuant to 42 U.S.C. §1981. Baidu responded that its search-engine results constituted protected speech under the First Amendment. At the beginning of its opinion, the court noted both the volume of scholarship on the question of whether search-engine results should be classified as protected speech, and the paucity of cases adjudicating it. In fact, as will be discussed in depth later in this article, only two other courts have addressed this question.

In the instant case, the Southern District of New York initially cited a number of Supreme Court precedents that contemplated the meaning and scope of the First Amendment, and found that these precedents supported Baidu’s argument. Citing Miami Herald Publishing v. Tornillo, 418 U.S. 241 (1974), the court in Baidu noted that a Florida statute requiring newspapers to provide political candidates with a right of reply infringed the newspapers’ First Amendment right to “exercise editorial control and judgment.” Similarly, the court in Baidu referenced Hurley v. Irish-American Gay, Lesbian, & Bisexual Group of Boston, 515 U.S. 557 (1995), in which the Supreme Court held that “a speaker has the autonomy to choose the content of his own message” and that “one important principle of free speech is that one who chooses to speak may also decide ‘what not to say.’” From these decisions, the court in Baidu derived several principles pertinent to its case: (1) the government may not interfere with the editorial judgments of private speakers on issues of public concern, even if the speaker is a business corporation; and (2) that the intention behind suppressing the speech may be “noble” is alone insufficient to shield a speech regulation from the purview of the First Amendment.

As such, the court in Baidu held that “there is a strong argument to be made that the First Amendment fully immunizes search-engine results from most, if not all, kinds of civil liability and government regulation,” and thus, “all but compels the conclusion that the plaintiffs’ suit must be dismissed.” Specifically, Baidu’s editorial judgment in crafting its search results is tantamount to the protected editorial judgments made by newspapers and websites. The court also dispensed with the two arguments that search engines fall outside the aegis of the First Amendment because they merely collect and communicate facts, and because they produce those facts via algorithms. It reasoned that because facts are the beginning point for essential speech, and because algorithms are created by human beings and incorporate the judgments and views of their creators, the Baidu-generated search-engine results epitomized protected speech.

Other Decisions

The two other notable cases to confront this question arrived at the same conclusion as Baidu in holding that search-engine results are protected speech. One similarly dealt with a website critical of the government of China. In Langdon v. Google, 474 F. Supp. 2d 622 (D. Del. 2007), the plaintiff (Langdon) operated two websites, one that claimed to expose fraud by the government of North Carolina, and another devoted to delineating “atrocities” committed by the Chinese government. Langdon, in filing a cause of action based in part on the First Amendment, alleged that Google refused to run advertisements for his websites. As to relief, Langdon requested an injunction forcing Google to place advertisements for his websites in prominent places within search-engine results. Google moved to dismiss this claim on First Amendment grounds, and the court granted the motion. As in Baidu, the court agreed that forcing Google to relinquish editorial control over its search-engine results would impermissibly burden its free speech rights.

In another action against Google, the Western District of Oklahoma held in an unreported case that a search engine’s subjective process of ranking websites according to the relative significance of the pages’ correspondence to search queries was protected by the First Amendment. See Search King v. Google Technology, No. CIV-02-1457-M, 2003 WL 21464568 (W.D. Okla. May 27, 2003). In response to an allegation that Google had decreased the “ranking” of an advertising site within its search-engine results, the court held that because the algorithm used to produce the rankings includes factors that are “fundamentally subjective in nature,” the resulting rankings were “constitutionally protected opinions.”

The cases on this topic are in accord that search-engine results are protected speech. However, as explained below, a debate among legal academics has arisen recently in which each side offers strenuous arguments.

Dispute Among Legal Academics

As the court in Baidu noted, law professors differ over whether search-engine results should be classified as protected speech under the First Amendment. As a doctrinal matter, one scholar against the classification of search-engine results as protected speech argues that such a decision would expand First Amendment protections significantly, thereby imposing a higher burden on regulators and increasing the cost of regulation as a whole. Another has even contended that “nonhuman or automated choices,” such as search-engine results produced by automated algorithms, should not be treated as protected speech for First Amendment purposes. Other scholars have advocated for a lower level of protection for search results based on the Supreme Court decision in Turner Broadcasting System v. F.C.C., 512 U.S. 622 (1994). In Turner, the Supreme Court applied intermediate scrutiny (a lower standard than normal for First Amendment claims) in deciding whether regulations requiring cable operators to carry the signals of a pre-ordained number of local broadcast television stations were constitutional. Interestingly, as a threshold matter, the Supreme Court held that the uncontroverted “initial premise” was identical to that in Baidu insofar as there was no disagreement that cable providers “exercised editorial discretion over which stations or programs to include in its repertoire,” and thus “engaged in and transmit[ted] speech” protected by the First Amendment.

Nonetheless, the court in Turner employed intermediate scrutiny to evaluate the First Amendment claims of the cable providers. It did so for a trio of reasons: (1) the providers were “conduit[s] for the speech of others in an unedited fashion”; (2) the providers had the ability to shut out some speakers, giving rise to a governmental interest in limiting monopolistic autonomy; and (3) the regulations in question were content-neutral. Though these factors were inapplicable to Baidu (the plaintiffs specifically argued that Baidu was not a mere conduit of the speech of others), they raise interesting questions about future jurisprudence on this subject. Although search engines would be largely unnecessary but for the content published by other speakers, it is difficult to imagine that an entity that creates an algorithm designed to choose, rank, and sort search results could be considered a mere conduit for others. A response could be that because search engines neither publish the content originally nor edit the content itself, their primary purpose is to serve as a facilitator and conduit of the content of unaffiliated third parties.

The second Turner factor seems to indicate that search-engine results will be subject to a higher level of scrutiny than intermediate scrutiny. Unlike cable providers, a lone search engine cannot effectively shut out access to the dissemination of content, as other search engines will likely provide the content banned by another search engine. As for the third Turner factor, even if a regulation were ostensibly content-neutral (e.g., no search results can be censored, irrespective of the content), it would invariably conflict with the precedents discussed above, which hold that exercising editorial judgment with respect to the organization and type of search results shown to the user is speech subject to the highest protection under the First Amendment. Additionally, a regulation providing that a search engine cannot filter any of the billions of results that include certain search terms would likely raise questions of overbreadth.

Conversely, another group of scholars has taken the position that search-engine results are protected speech under the First Amendment, thereby mooting the Turner intermediate-scrutiny factors and analysis. These academics defend search engines as speakers on a number of grounds. First, search engines sometimes convey information that they have themselves prepared, such as information about places appearing on a digitized map. More important in the context of search results as protected speech, the same scholars argue that in sorting results in a way tailored to provide users with the results believed to be the most helpful and useful, search engines are exercising judgment analogous to the judgments made by protected speakers such as newspapers and websites. So the argument goes, the judgments of search engines as manifested in their results are merely the 21st-century equivalent of newspapers’ judgments about which stories should go “above the fold.”

Of additional interest is that search-engine results presented entirely as commercial speech would likely be subject, although no direct precedent exists at present, to a lesser form of scrutiny such as intermediate scrutiny. See Cent. Hudson Gas & Elec. v. Pub. Serv. Comm’n of N.Y., 447 U.S. 557 (1980) (defining “commercial speech” as “expression related solely to the economic interests of the speaker and its audience”); see also United States v. United Foods, 533 U.S. 405 (2001) (stating that commercial speech is “usually defined as speech that does no more than propose a commercial transaction”). According to the court in Baidu, search-engine results are considered commercial speech if the results consist of advertisements displayed by a search engine. Additionally, the court stated that the “relaxed” commercial-speech standard “might even apply” to “search results shown to purposely advance an internal commercial interest of the search provider.” Legal academics, even those in favor of defining search results as protected speech under the First Amendment, appear to concede that commercial speech should be analyzed under a lesser standard.

Conclusion

Courts thus far have agreed that search results are protected speech under the First Amendment. As one scholar who disputes this conclusion argues, the First Amendment thereby protects “many areas of commerce and private concern without promoting the values [underlying it].” The example offered is that of a car alarm: an electronic device that utilizes an algorithm to decide when to communicate, and that sends a “particularized” message well understood by its audience. Nonetheless, though the precedents are few in number, courts are generally in agreement that search-engine results are protected speech, largely on the grounds that the search engines providing the results exercise editorial judgment insofar as they decide which results to present to the user, and the manner in which the user sees those results. Moreover, a search engine conveys results that it has itself prepared, such as customized maps.

Richard Raysman is a partner at Holland & Knight and Peter Brown is the principal at Peter Brown & Associates. They are co-authors of “Computer Law: Drafting and Negotiating Forms and Agreements” (Law Journal Press).