Facebook allows companies to access your private data without consent. Uber has used big data to embarrass its enemies. Companies like Equifax and Yahoo have been hacked, and your private information is no longer private. However, a growing theory may offer a judicially derived solution, one that allows users not only to take steps to protect the data companies collect through their technological devices, but also to obtain redress should anything go wrong. Once this solution is accepted, plaintiffs attorneys should be prepared to see an influx of privacy-derived cases with tort remedies.
Currently, privacy law in the United States is a messy “hodgepodge” of laws, rules and treaties, with industry-specific statutes producing disjointed solutions. Included in this hodgepodge are state laws that are highly specific to industry but most commonly invoked for private remedies. See e.g., 6 Pa. Code Section 11.197 (concerning adults in daily living centers having access to information); 31 Pa. Code Section 146b.11 (disclosing health information); 35 Pa. Stat. Ann. Section 5636 (disseminating data on cases of cancer); 28 Pa. Code Section 28.5 (identifying when information about newborns is shared); 43 Pa. Stat. Ann. Sections 1321-1324 (inspecting personnel files); and 35 Pa. Stat. Ann. Sections 7601-7612 (keeping HIV-related information confidential).
The current framework, although troublesome when asserting a new claim or defending against liability, does not leave users completely vulnerable, see Solove, Daniel, “FTC and Privacy Common Law,” 114 Colum. L. Rev. 583, 587 (2014). The legislative branch does attempt to curb data breaches and invasions of privacy, but there are limits to this power. Congress has given the Federal Trade Commission (FTC) power to protect consumers by fining companies that mismanage user data, but the standard remains malleable for those users who then wish to recover individual damages under a broadly applied system of privacy law. See e.g., FTC v. Wyndham Worldwide, Civ. Action No. 13-1887 (D.N.J. Apr. 7, 2014) (granting the FTC authority to pursue a deception claim on behalf of a group).
Corporate data breach plaintiffs have been somewhat successful in pursuing tort remedies, specifically negligence claims, in the early stages of litigation, but courts are not united in finding a duty of care between provider and consumer, see Simpson, Michael D., “All Your Data Belongs to Us,” 87 U. Colo. L. Rev. 669, 686 (2016) (citing Lone Star National Bank v. Heartland Payment Systems, 729 F.3d 421, 426 (5th Cir. 2013) (reversing the district court's dismissal); Sovereign Bank v. BJ's Wholesale Club, 395 F. Supp. 2d 183, 194-95 (M.D. Pa. 2005) (holding that the direct relationship between a retailer and a card issuer favors imposition of a duty of care); BancFirst v. Dixie Restaurants, No. CIV-11-174-L, 2012 WL 12879, at *4 (W.D. Okla. Jan. 4, 2012) (holding that no special relationship existed between the parties from which a duty of care would arise)).
The FTC remains the primary federal data protection authority and offers persuasive decisions in privacy-related disputes; however, there is still no private cause of action under Section 5 of the FTC Act for consumers who are victims of an unfair or deceptive trade practice. The FTC is also understaffed, with roughly five employees dedicated to mobile privacy alone. It resolves violations through settlements and may then seek penalties of up to $16,000 per violation for breaches of those settlements; consumers do not receive these fines. The FTC's consent orders and settlements with Google, Facebook, Twitter and other companies for privacy violations allow a company to avoid admitting wrongdoing in exchange for remedial measures. Hardly any FTC cases go to court: because a Section 5 violation itself carries no threat of financial penalty, companies have little to no financial incentive to spend time fighting FTC complaints. The FTC is also limited to seeking equitable monetary relief.
Some settlement violations may include violations of Safe Harbor agreements with European countries, which have more consumer-friendly privacy policies, see General Data Protection Regulation (GDPR), https://www.eugdpr.org/the-regulation.html; Satariano, Adam, “G.D.P.R., a New Privacy Law, Makes Europe World's Leading Tech Watchdog,” The New York Times (May 24, 2018). The 28 countries of the European Union (EU) focus on protecting their citizens and require companies doing business with them to be clear about the collection and use of users' personal data, see Tiku, Nitasha, “Europe's New Privacy Law Will Change the Web, and More,” Wired (March 19, 2018) (including the collection of location, IP address, or identifiers that track web and app use on smartphones). The EU also requires companies to explain why they are collecting data and whether it will be used to create profiles on individuals for research and dissemination. Among other specific protections, consumers are essentially able to access their data at any point and to truly delete data if they request companies to do so. For any violation of the GDPR, companies face substantially higher fines.
Pennsylvania recognizes the four generalized common-law forms of invasion of privacy: intrusion upon seclusion; appropriation of name or likeness; publicity given to private life; and false light. Restatement (Second) of Torts, Sections 652B-652E. These torts are not directed toward modern-day privacy concerns. To this point, Pennsylvania tort law has not been helpful to plaintiffs who wished to recover damages for the collection and misuse of their data.
In Gabriel v. Giant Eagle, the plaintiff, on behalf of a class, sued a group of pharmacies for failing to stop third parties from illegally accessing prescription medications and charging victims' insurance companies. The plaintiff pursued claims of intrusion upon seclusion, misappropriation of name and conversion of identity, but the court ruled in favor of the defendants on each claim. The court could not find the defendant pharmacies liable because they had lawful access to the medical information; it was the person who stole the plaintiffs' identities who was at fault, so there was no causation. Further, the plaintiffs' identities and confidential information were considered intangible rights that were not connected to or identified with some sort of document, as a successful conversion claim requires.
There has also been no recovery in privacy law using a negligent infliction of emotional distress (NIED) claim. See e.g., Toney v. Chester County Hospital, 614 Pa. 98 (2011) (finding that if the plaintiff in a medical malpractice case had alleged a fiduciary relationship, the NIED claim could have succeeded); see also Emekekwue v. Offor, No. 1:11-CV-01747 (M.D. Pa. May 15, 2012). But see Okane v. Tropicana Entertainment, No. Civ. A. 12-6707 (E.D. Pa. Jan. 3, 2013) (finding that a schizophrenic woman seeking to prove distress had no fiduciary relationship with a casino because she did not have access to a tape of her having an episode).
There are two common scenarios that involve the potential misuse of information. First, third-party hackers may steal information from a provider that stores and maintains a user's digital information. Second, information can be collected and sold to data brokers after a company analyzes the data stored on a device. See Balkin, Jack M., “Information Fiduciaries and the First Amendment,” 49 U.C. Davis L. Rev. 1183, 1189-90 (2016) (contrasting providers such as Uber, which used sensitive data to embarrass, with Facebook, which used data to experiment for profit or science). Pursuing redress is difficult when information is stolen because the third-party hackers are often anonymous, and providers cannot be held accountable by law without proof of an intent to harm or causation. Courts have differed on whether the second potential misuse of information is unlawful.
Providers may be able to invoke the third-party doctrine in defense against plaintiffs. Under this doctrine, defendants are automatically without fault whenever the user assumed the risk that his or her information would be stolen. See e.g., United States v. Miller, 425 U.S. 435, 443 (1976) (the plaintiff had “taken the risk … that the information [would] be conveyed by that person to the government”). Defendants also invoke the First Amendment to defend the right to collect, analyze or sell data. The other defense is contractual, defining the relationship between provider and user by the terms of a contract, though consideration must be proven if the contract is challenged.
Clearly, there is a problem with the current legal framework. However, there is a solution that would cause little disruption at a time when the use of internet-connected devices has become unavoidable and the need to protect sensitive information remains. Congress' potential understanding of, and willingness to adopt, this solution began to surface with Mark Zuckerberg's testimony before the Senate Judiciary and Commerce committees three months ago. A noticeable theme ran through the senators' questions: in what capacity did Facebook owe a duty, one not formally established, to its users while maintaining their data, see Heller, Nathan, “We May Own Our Data, but Facebook Has a Duty to Protect It,” The New Yorker (Apr. 12, 2018).
Two years prior to this hearing, professor Jack M. Balkin wrote a law review article that would assign fiduciary duties to entities that store user information. Balkin argues that imposing an implied duty of care and a duty of loyalty on technology providers would allow them to be held accountable in a court of law when charged with the misuse of data. This expansion creates the duty that is a prerequisite for tort actions and levels the playing field between end-user and technology provider. Further, First Amendment rights, normally used to combat business regulation, can coexist with fiduciary duties.
By redefining the relationships between parties, rather than focusing on specific types of technology, judges would no longer need to issue narrow decisions out of fear of the speed at which technology progresses. Instead, there could be broad solutions that offer insightful guidance for all sides of the debate. Instability breeds resentment not only for providers, who spend time and money developing their own policies with no assurance of predictability, but also for end-users who feel insecure using a smart device. Under this newly created duty, that of the information fiduciary, any breach would allow users to recover tort damages, even punitive damages as corrective justice.
Correcting these inequities will compel providers to work more diligently to afford users meaningful protection, while at the same time giving providers predictability.
Caitlin Wilenchik is a first-year associate at Anapol Weiss working with the firm's mass tort team. She is a graduate of George Washington University Law School and is admitted to practice in Pennsylvania.