
Earlier this year, the New York Times reported that Madison Square Garden is using facial recognition technology for security purposes and to identify individuals entering the building. Kevin Draper, “Madison Square Garden Has Used Face-Scanning Technology on Customers,” N.Y. Times, March 13, 2018. The article did not address the full scope of MSG’s intended use of the technology, but it did make clear that the type of technology reportedly in use serves not only security, but also marketing and promotional purposes. Indeed, MSG has used FanCam facial recognition technology since 2011, which essentially captures a high-definition image of the entire audience and encourages attendees to tag themselves. See Jen Booton, “Report: MSG Adopts Facial Recognition at Arena Gates for Security,” SportTechie, March 14, 2018. Taking a picture of an open arena during a publicly broadcast event is nothing new. Nor is the use of biometrics at sports events in New York. See Daniel Roberts, “Tickets. (Check.) Glove. (Check.) Fingerprint scan. (Check. Wait—at a Ballgame?),” Fortune, Aug. 7, 2015 (discussing the Yankees’ use of CLEAR fingerprint identification technology to permit faster stadium access while still screening entrants for security). What is new is the ability to transform high-resolution images into a biometric identifier—whether in the form of facial geometry or otherwise—and the subsequent use of that biometric data for commercial marketing purposes, or its sale for use by a third party.

Biometric data typically refers to any information that is used to identify a natural person based upon unique physiological identifiers (e.g., fingerprint, face, eye, or voice). Therefore, facial recognition technology, hand geometry, and retinal or iris scans are all considered biometric data.

Developing Law

New York has yet to enact legislation regarding the use of biometric identifiers and information. However, a recent legislative proposal, and the statutes and regulations of other jurisdictions, provide valuable guidance to New York businesses regarding permissible practices for collection, use, storage, and destruction of biometric data consistent with individual privacy rights, the perceived need for increased security measures, and the potential future commercial purposes of the business.

The law has developed slowly around the collection and use of biometrics. In 2008, Illinois became the first state to adopt comprehensive legislation, the Biometric Information Privacy Act (BIPA), which remains the most aggressive biometric protection law in the country. BIPA broadly defines biometric information as “any information…based on an individual’s biometric identifier [i.e., a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry] used to identify an individual.” 740 ILCS 14/10. Under BIPA, private companies with biometric identifiers or biometric information:

  1. must implement a written biometric retention policy available to the public;
  2. cannot collect, capture, or otherwise obtain biometrics unless they first inform data subjects in writing that their biometrics are being collected and stored, disclose the purpose and length of term for which the biometrics will be used, and receive a written executed release from the data subjects authorizing them to do so;
  3. cannot sell or profit from biometrics;
  4. cannot disclose biometrics unless the disclosure is authorized by the data subject, completes a financial transaction requested by the data subject, is required by law, or is required in response to a valid court order; and
  5. must store biometrics using a reasonable standard of care that is at least as protective as the manner in which the company stores other confidential and sensitive information.

740 ILCS 14/15. Failure to adhere to these standards can subject a company to a private right of action, with recovery of up to the greater of actual damages or $5,000 per intentional or reckless violation. Private litigants can also recover attorney fees, costs (including expert fees and litigation expenses), and additional relief in the discretion of the court. 740 ILCS 14/20. Even negligent violations of the statute permit recovery of the greater of actual damages or $1,000 per violation. 740 ILCS 14/20(1). There are currently more than 30 cases pending that seek recovery of “damages” for alleged violations of BIPA.

In 2009, Texas adopted its own, less onerous legislation applicable to the capture and use of biometrics. Tex. Bus. & Com. Code §503.001. For example, the Texas law permits collection of biometrics for a commercial purpose based on informed consent of the data subject, as well as the sale or disclosure of biometric data under certain very limited circumstances. Id. at (b) and (c)(1). No private right of action exists, but the Texas attorney general can bring an action to recover against a company for up to $25,000 per violation. Id. at (d).

Several years later, in 2017, Washington implemented its own biometrics legislation. Although the Washington statute is the least onerous overall, it broadly defines a biometric identifier as “data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual.” Rev. Code Wash. (ARCW) §19.375.010(1). The Washington law requires companies to provide notice of collection for a commercial purpose, but passive consent of the individual is sufficient. Rev. Code Wash. (ARCW) §19.375.020(1). Sale or disclosure of biometrics is permitted in a variety of circumstances. Id. at (3). There is no private right of action, but violations of the statute may be prosecuted by the Washington attorney general, punishable by fines of up to $500,000. Rev. Code Wash. (ARCW) §19.375.030; Rev. Code Wash. (ARCW) §19.86.140.

Regulations

There is no controlling federal regulation of the collection and use of biometric information, although, in 2012, the Federal Trade Commission issued a staff report detailing recommended best practices for use of certain biometric information. See FTC, “Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies,” October 2012. Although lacking the force of law, the report underscored three key principles that companies should employ if using facial recognition technology: (a) privacy by design; (b) simplified consumer choice; and (c) transparency. Id. at p. 2.

In November 2017, then New York Attorney General Eric Schneiderman proposed the “Stop Hacks and Improve Electronic Data Security” (SHIELD) Act to require a number of data security enhancements under New York law, including the requirement of reasonable data security practices. Notably, the SHIELD Act seeks to change New York’s breach notification law to include biometric data within the definition of “private information,” so that, if biometric data is coupled with another personal identifier, any security compromise may trigger New York’s data breach notification obligations. See 2017 New York Senate-Assembly Bill S06933A, A08884A, Section 3, Subsection 1(b)(5). The New York attorney general had proposed similar changes to New York’s data breach notification law in January 2015, but the proposed amendments were not adopted. See 2016 New York Senate-Assembly Bill S06834B, A10475A.

Other Jurisdictions

Even though New York does not currently have a law that regulates the collection, use, and sale of biometric data, businesses in New York need to evaluate the legal landscape before engaging in biometric data collection and retention. Collecting biometric data on residents across the country may be regulated through extraterritorial application of the law of other jurisdictions. As an example, Facebook is facing suit in the Northern District of California for claims under Illinois’ BIPA for allegedly not obtaining appropriate consent before scanning users’ photographs with a facial recognition technology feature called Tag Suggestions. In that class action, In re Facebook Biometric Information Privacy Litigation, No. 3:15-cv-03747, Illinois Facebook users recently were granted class certification under BIPA. See RJ Vogt, “Facebook Users Win Class Cert. in Face Scan Privacy Row,” Law360, April 16, 2018. Shutterfly, Google, Snapchat, and Take-Two Interactive, among others, face similar litigation under BIPA.

The lesson of these suits—at least thus far—appears to warn against a company collecting biometric information on Illinois residents absent compliance with BIPA, regardless of where the company is located. For similar reasons, evaluating the need for compliance with Texas and Washington law may be warranted, including the potential for an enforcement action by those states’ attorneys general in the absence of a private right of action. Similarly, New York companies collecting biometric data “for the purpose of uniquely identifying a natural person” in the European Union will need to evaluate and comply with the strictures of the EU’s General Data Protection Regulation (GDPR). See GDPR, Article 9(1) (Processing of Special Categories of Personal Data) & Article 3 (Territorial Scope).

Conclusion

In light of the increasing collection and use of biometric data for commercial purposes, including security, marketing, and employment, and the evolving legislative and regulatory landscape across the country and globally, New York companies should, at a minimum, treat biometric data as private information under New York’s data breach notification statute, including providing reasonable data security protections and practices for biometric data. Moreover, while extraterritorial application of Illinois’ BIPA and similar state laws remains unsettled for now, compliance with these laws when collecting biometric data on Illinois, Texas, and Washington residents is an obvious “best practice.” By implementing such policies and procedures, New York companies will be able to pursue the ongoing commercial purposes and security needs of the business, while taking appropriate steps to accommodate the fundamental privacy rights of individual customers across the country.


John T. Wolak and Mitchell Boyarsky are co-chairs of the Gibbons data privacy and security task force. Randy A. Gray is an associate on the task force.