A spate of recent court and regulatory decisions, seemingly unconnected on the facts, point to the concept of privacy as the principal theme for regulation of the Internet. (Google Spain SL v. Agencia Española de Protección de Datos (AEPD), No. C‑131/12 (ECJ 05/13/14); In re Zynga Privacy Litigation, No. 11-18044 (9th Cir. 05/08/14); In re Snapchat, No. 132 3078 (FTC 05/08/14)) These decisions affect the most important aspects of Internet commerce: search engines, commercial data collection, and the marketing of services.
The Google case is a decision from the EU’s highest court, the European Court of Justice. It holds that because persons have a fundamental right to privacy with respect to the processing of personal data, Google has an obligation to remove objectionable hits upon request.
In the Zynga case, Zynga and Facebook were held blameless under the Wiretap Act and the Stored Communications Act for permitting dissemination of certain consumer information contained in HTTP requests sent from Facebook to Zynga after a consumer clicked a link to Zynga on an open Facebook page. The information included the request line, the resource identified by the request, and request header fields. The Zynga case was therefore a clarification, but not a victory, for consumer interests.
In re Snapchat is a settlement before the FTC regarding misrepresentation in the marketing of Snapchat’s services. In Snapchat, the company agreed, without admission of fault, that it had misrepresented the ephemerality of each “snap” placed on its site by consumers and that its statements about the instant demise of each communication were overstated.
The Google Case and the Necessity for a Search Engine to Remove Personal Data Upon Request
Google’s Spanish unit, as the operator of a search engine, located information published or included on the Internet by third parties, indexed it automatically, stored it temporarily, and made it available to Internet users according to a specified order of preference.
An individual plaintiff, supported by Spain, Italy, Austria, and Poland, complained that, in response to a search on his name, Google had listed a newspaper article which reported negatively on his debt status.
Google did not contest the fact that the information it had provided was personal data. However, it denied that it was a processor of that personal data.
Under EU law, privacy in the processing of personal data is a fundamental human right. An administrative directive defined such processing as “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organisation, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction.”
Another directive defined “controller of personal data” as “the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data.”
The Court held irrelevant that the personal data had been published before Google provided reference to the data, as the list of results given in response to a personal query is a structured overview of the information relating to an individual that enables the searcher to establish a more or less detailed profile of the data subject.
“[T]he very display of personal data on a search results page constitutes processing of such data.” Google both processed personal data and was the controller of such data when it provided previously published personal information in response to a request; therefore, a fundamental human right was implicated.
As the controller and processor of the information provided by a search request, Google had an obligation to further effective and complete protection of data subjects, in particular their right to privacy. Google therefore had to withdraw from its search results any objectionable item of information published by third parties, without addressing itself in advance or simultaneously to the owner of the web page on which that information is located.
The implementation of the Court ruling is left to the EU nations. In particular, “the national authority may directly order the operator of a search engine to withdraw from its indexes and intermediate memory information containing personal data that has been published by third parties . . .”
The Zynga Case and the Legitimacy of Marketing Consumer Data
Facebook users must provide their real names, email addresses, gender, and birth dates to Facebook in order to establish an account. Facebook then issues users a unique Facebook User ID.
To generate revenue, Facebook sells advertising to third parties who want to market their products to Facebook users. Advertisers do not receive users’ identifying data but do receive users’ demographic information.
Zynga has developed free social gaming applications accessed by links on Facebook’s platform. When users click on the Zynga link on the Facebook page, they are entering a URL address in the hypertext transfer protocol (HTTP) web address format. This tells the web browser which resources to request and where to find them.
The basic unit of HTTP communication is the message. A request message, such as one generated by clicking on the Zynga link, has several components, including a request line, the resource identified by the request, and request header fields.
When Facebook users clicked on a Zynga icon, the web browser sent an HTTP request to access the Zynga resource identified by the link. The HTTP request included a referer header (a misspelled version of “referrer” which is now a term of art) that provided both the user’s Facebook ID and the address of the Facebook webpage the user was viewing when the user clicked the link.
In response to the HTTP request, the Zynga server would load the game in an inline frame on the Facebook website. The inline frame allows a user to view one webpage embedded within another.
Zynga programmed its gaming applications to collect the information contained in the referer header, and then transmit this information to advertisers and other third parties. Thus, both Facebook and Zynga disclosed the information provided in the referer headers, such as users’ Facebook IDs and the address of the Facebook webpage the user was viewing when the user clicked the link, to third parties.
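The mechanics described above can be sketched in a few lines of Python. This is an illustrative reconstruction only: the URL format, domain names, and parameter name below are hypothetical stand-ins, not Facebook’s actual URL scheme at the time of the litigation.

```python
# Illustrative sketch of how a referer header can carry identifying
# information. The domains and the "id" query parameter are hypothetical.
from urllib.parse import urlparse, parse_qs

# A simplified HTTP request message like the one a browser sends when a
# user clicks a game link: a request line followed by header fields.
raw_request = (
    "GET /game/start HTTP/1.1\r\n"
    "Host: apps.example-game.com\r\n"
    "Referer: http://www.example-social.com/profile.php?id=123456789\r\n"
    "\r\n"
)

def extract_referer_info(request: str) -> dict:
    """Pull the user ID and page address out of the Referer header."""
    for line in request.split("\r\n"):
        if line.lower().startswith("referer:"):
            referer = line.split(":", 1)[1].strip()
            parsed = urlparse(referer)
            params = parse_qs(parsed.query)
            return {
                "page": f"{parsed.scheme}://{parsed.netloc}{parsed.path}",
                "user_id": params.get("id", [None])[0],
            }
    return {}

info = extract_referer_info(raw_request)
print(info["user_id"])  # prints 123456789
```

A downstream server that logs or forwards this header has, without any further action by the user, both the user’s ID and the page the user was viewing, which is precisely the disclosure at issue in Zynga.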
The Ninth Circuit held that disclosure of the information contained in referer headers to third parties was not barred by the Wiretap Act and the Stored Communications Act.
The Wiretap Act states that those (1) “providing an electronic communication service to the public” (2) “shall not intentionally divulge the contents of any communication” (3) “while in transmission on that service” (4) “to any person or entity other than an addressee or intended recipient of such communication or an agent of such addressee or intended recipient.” (18 U.S.C. § 2511(3)(a))
The “contents” of a communication are defined as “any information concerning the substance, purport, or meaning of that communication.” (Id. § 2510(8))
The Stored Communications Act covers access to electronic information stored in third party computers. (Id. §§ 2701–12) The relevant provision here imposes requirements on providers of remote computing services that are similar to the requirements of the Wiretap Act discussed above. Under the Stored Communications Act, “a person or entity” (1) “providing remote computing service to the public” (2) “shall not knowingly divulge to any person or entity the contents of any communication” (3) “which is carried or maintained on that service . . . on behalf of, and received by means of electronic transmission from (or created by means of computer processing of communications received by means of electronic transmission from), a subscriber or customer of such service” (4) “solely for the purpose of providing storage or computer processing services to such subscriber or customer.” (Id. § 2702(a)(2))
The Stored Communications Act incorporates the Wiretap Act’s definition of “contents.” (See id. § 2711(1)) It also differentiates between contents and record information. Section 2702(c)(6) permits an electronic communications service or remote computing service to “divulge a record or other information pertaining to a subscriber to or customer of such service (not including the contents of communications covered by [§ 2702](a)(1) or (a)(2)) . . . to any person other than a governmental entity.”
Although there is no specific statutory definition for “record,” the Stored Communications Act provides examples of record information in a different provision that governs the government’s power to require a provider of electronic communications service or remote computing service to disclose such information. (Id. § 2703(c))
According to § 2703(c), record information includes, among other things, the “name,” “address,” and “subscriber number or identity” of “a subscriber to or customer of such service,” but not “the contents of communications.” Id. § 2703(c)(2)(A), (B), (E). In other words, the Stored Communications Act generally precludes a covered entity from disclosing the contents of a communication, but permits disclosure of record information like the name, address, or client ID number of the entity’s customers in certain circumstances.
The Ninth Circuit held that Facebook and Zynga did not violate ECPA by disclosing the HTTP referer information to third parties because such information is not the “contents” of a communication for purposes of 18 U.S.C. §§ 2511(3)(a) and 2702(a)(2).
When it enacted ECPA, Congress amended the definition of “contents” to eliminate the words “identity of the parties to such communication,” indicating its intent to exclude such record information from its definition of “contents.” (See Pub. L. 99-508 § 101(a)(5))
Thus, the term “contents” refers to the intended message conveyed by the communication, and does not include record information regarding the characteristics of the message that is generated in the course of the communication.
In so holding, the Ninth Circuit made the leap from statute law to constitutional law in citing United States v. Reed, 575 F.3d 900, 917 (9th Cir. 2009) for the proposition that information about a telephone call’s “origination, length, and time” was not “contents” for purposes of § 2510(8), because it contained no “information concerning the substance, purport or meaning of [the] communication.”
Thus, the Ninth Circuit ultimately rested the Zynga case on the metadata ruling in Smith v. Maryland, 442 U.S. 735 (1979). “Under the Fourth Amendment, courts have long distinguished between the contents of a communication (in which a person may have a reasonable expectation of privacy) and record information about those communications (in which a person does not have a reasonable expectation of privacy).”
The Court’s reasoning is that the referer header information that Facebook and Zynga transmitted to third parties, which included the user’s Facebook ID and the address of the webpage from which the user’s HTTP request to view another webpage was sent, did not meet the definition of “contents” in either the Wiretap Act or the Stored Communications Act because these pieces of information are not the “substance, purport, or meaning” of a communication.
This syllogism ignores the nagging fact that the consumer transmission in question never had “content” at all and was never intended to transmit any substance, purport or meaning to anybody. Therefore, there exists the possibility that neither Act covered this form of HTTP message and that therefore any statutory protection accorded to content did not apply to this form of communication.
Put another way, just because statutes in these cases do not preclude the disclosure of personally identifiable information when coupled with a message, they may not govern the situation where there never was an intent to communicate anything other than the desire to access another site. While such statutes expressly allow release of the metadata to third parties when content is attached to the identifying information, they say nothing about release of metadata unconnected with any content.
The Snapchat Case and Honesty in Marketing
Snapchat had marketed its message service by claiming that “snaps,” or messages, disappeared forever after the sender-designated time period expired. (See http://blog.snapchat.com (05/09/13, 10/14/13)) However, there were several ways to save the snaps indefinitely.
Several apps sold in Internet app stores would permit viewing and saving snaps indefinitely. This worked because Snapchat’s deletion feature only functioned within the Snapchat app. According to the FTC complaint, Snapchat was warned about this by a security researcher and did nothing.
Snapchat stored video snaps unencrypted on the recipient’s device in a location outside the app’s “sandbox,” meaning that the videos remained accessible to recipients who simply connected their device to a computer and accessed the video messages through the device’s file directory.
It also told its users that the sender would be notified if a recipient took a screenshot of a snap. In fact, any recipient with an Apple device running an operating system pre-dating iOS 7 could use a simple method to evade the app’s screenshot detection, and the app would not notify the sender.
Numerous consumers complained that they had sent snaps to someone under the false impression that they were communicating with a friend. In fact, because Snapchat failed to verify users’ phone numbers during registration, these consumers were actually sending their personal snaps to complete strangers who had registered with phone numbers that did not belong to them.
Finally, Snapchat’s failure to secure its Find Friends feature resulted in a security breach permitting attackers to compile a database of 4.6 million Snapchat usernames and phone numbers. According to the FTC, the exposure of this information could lead to costly spam, phishing, and other unsolicited communications.
This case is part of a multi-national enforcement sweep on mobile app privacy by members of the Global Privacy Enforcement Network, a cross-border coalition of privacy enforcement authorities. (www.privacyenforcement.net)
Privacy Trends in the Regulation of Social Sites
The Google case is of tremendous scope as it affects all EU countries, a vital portion of Google’s empire. It is also final under EU procedural law.
The ruling affects Google’s business model because the search and search results are the key to marketing search engine processes. Advertisers pay for precedence in search results and for advertising on the search results screen, and information about the person making the query is sold to marketers. The value of all of these is diminished if the search results are skewed by the subject’s deletion of selected, and presumably unflattering, results.
The EU’s concept of privacy is carefully spelled out in its governing law, while US privacy is not even explicit in the Constitution. However, Google and other search engines would be foolish to ignore the US implications of the EU decision. Arguably, privacy protects personal data, and dissemination of personal data in the first instance of publication is clearly within the right of privacy. The interesting issue is whether a re-publisher of the data faces the same liability.
For example, a website which publishes booking or mug shots based on a name search and the payment of a fee may seek to avoid liability in the first instance because of the First Amendment. However, if a name search on a search engine yields the website or a display ad for the website, the search engine would have no more protection than the website and possibly less, simply because the search engine has related the name inquiry to a mugshot.
The Zynga litigation shows the assimilation of internet marketing practices within the federal regulation of communications. In fact, contrary to the Google case, its result assures that Facebook’s widely-touted monetization of its social function will continue with customers such as Zynga. This status quo ruling ensures that other social sites such as Twitter have a well-worn path to profitability.
As far as consumer privacy rights go, the Zynga opinion leaves the users of a social site no worse off than before. The business model which drives a social site is that the use of its services is free, but that the harvesting of marketing information, explicitly anonymized, is for sale to support the free services.
The Snapchat case illustrates the application of normal trade practices to the marketing of internet services, coupled with sweeping enforcement efforts, in this case by the Global Privacy Enforcement Network. In reviewing the FTC complaint, it is clear that Snapchat was guilty only of failing to think comprehensively about the ramifications of its success.
For example, only a start-up beginning at zero would fail to project that third-party apps would be created to modify or reverse the instant-erase feature it had, for the first time, successfully marketed. Thus, the FTC complaint was an expensive and embarrassing exercise in outside consultancy on Snapchat’s product. This vetting can only improve the service in the coming years.
These cases make a single point: privacy is the lever which can move economic giants such as Google, Zynga, Facebook, and Snapchat in every aspect of their business models.