Most people know Facebook and Google can “read” a face and identify the person. Next-generation software goes much further: uncovering moods and emotions. Courts and trial counsel alike should consider now the implications of possible courtroom use.
1. Facial expression analysis
Identifying people based on their unique physical characteristics, or “biometrics,” is increasingly common. Facebook, for example, stores data like the geometry of a person’s face, including the distance between the eyes, nose, and ears, to help identify a person in a photo. Google offers a similar service many use to sort pictures on their phones.
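The geometric matching described above can be illustrated with a minimal sketch. This is a simplified illustration, not the algorithm any particular company uses: the landmark names, the normalization by inter-eye distance, and the tolerance threshold are all assumptions for demonstration. Production systems rely on learned face embeddings rather than hand-picked distances.

```python
import math

# Hypothetical landmark "signature": a few pairwise distances between facial
# landmarks (eyes, nose, ears), normalized so the signature is size-invariant.
def face_signature(landmarks):
    """landmarks: dict mapping names like 'left_eye' to (x, y) coordinates."""
    pairs = [("left_eye", "right_eye"), ("left_eye", "nose"),
             ("right_eye", "nose"), ("left_ear", "right_ear")]
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    raw = [dist(landmarks[p], landmarks[q]) for p, q in pairs]
    scale = raw[0] or 1.0  # normalize by the inter-eye distance
    return [d / scale for d in raw]

def same_person(sig_a, sig_b, tol=0.05):
    """Two faces 'match' if every normalized distance agrees within a tolerance."""
    return all(abs(a - b) <= tol for a, b in zip(sig_a, sig_b))
```

Because the signature is built from ratios, the same face photographed at a different size still matches; a face with different geometry does not.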
Next-generation software recognizes not just the static identity of a person, but which emotions the person is expressing at a given moment. This advancement stems from a body of research identifying a series of basic expressions all humans convey identically. Each expression can be broken down into individual movements of facial muscles. From this catalogue of facial cues, developers have made products that can glean insight into how a subject actually feels. The software was used this past fall on an audience watching the Republican presidential debate to determine the “winner.” McDonald’s has used it to determine how employees’ moods affected those of its customers. Companies see immense value in tailoring their products, marketing, or ad campaigns to please customers.
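The decomposition of expressions into individual muscle movements can be sketched in a few lines. The numbered “action units” below are drawn from facial-coding research (e.g., AU 12 is the lip-corner puller), but the specific combinations and the scoring rule are illustrative simplifications of mine, not any vendor’s validated coding scheme.

```python
# Illustrative map from facial "action units" (numbered muscle movements)
# to basic emotions. The combinations are simplified for demonstration.
EMOTION_SIGNATURES = {
    "happiness": {6, 12},        # cheek raiser + lip-corner puller
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper-lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid/lip tighteners
    "sadness":   {1, 4, 15},     # inner-brow raiser + lip-corner depressor
}

def classify_emotion(active_units):
    """Return the emotion whose signature action units are most fully present."""
    best, best_score = "neutral", 0.0
    for emotion, signature in EMOTION_SIGNATURES.items():
        score = len(signature & active_units) / len(signature)
        if score > best_score:
            best, best_score = emotion, score
    return best
```

Real products detect the muscle movements automatically from video frames and apply far richer statistical models, but the underlying idea — observed muscle activations mapped to a catalogue of expressions — is the same.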
2. Courtroom concerns
For litigators, effectively reading jurors’ emotional responses to cases is a tantalizing prospect. It is no wonder that lawyers during voir dire now often conduct Google, Facebook, and other social media searches of prospective jurors to gain a better sense of their attitudes and character. With much on the line, lawyers look for any opportunity to maximize persuasiveness in hopes of a favorable outcome. Attorneys could perfect their message by applying the software to determine which case theme is working or which line of questioning clearly is not. Lawyers can capture this potential by using the software on jurors in mock trials and focus groups. So long as these mock jurors consent, the software promises a powerful advantage. Emotient Analytics, a provider of emotion-reading software, already markets such a product toward lawyers.
It may not be long, however, before zealous advocates seek to use the technology in a real courtroom. Cameras have not typically been allowed in federal courts, but federal courts have recently piloted programs experimenting with trial recording, and many state courts allow video recording. People want to see the government at work, and the trend seems to favor expanding the use of cameras in the courtroom. Moreover, the ubiquity of powerful smartphone cameras and cloud computing makes courtroom use – even surreptitious use – a real possibility. Taking advantage of this reality, lawyers and jury consultants could obtain footage of a trial and apply the software retroactively. It is also not far-fetched to imagine clandestine use of the software in real time on camera phones or even Google Glass. Recent news accounts, for example, document an attorney caught recording a federal trial on his phone and posting the footage to social media.
In light of the potential for courtroom use, courts may want to consider the implications of emotion-reading software and the possibility of a disruptive effect on jury service and trial management.
a. Consent and Privacy Issues
Jurors are summoned by court order to report for jury duty and are subject to observation by courtroom attendees as part of that role. But consent to emotion reading does not follow, and jurors might object to that degree of scrutiny. Indeed, consent in the biometric realm is currently a hot topic: Facebook and Shutterfly have already been sued for deploying their face-recognition features without first obtaining users’ consent. The plaintiffs alleged violations of Illinois’ Biometric Information Privacy Act, which requires notice that an individual’s biometric data is being collected or stored, as well as written permission from the subject. The court in the Shutterfly case recently held that the Illinois law applies to data obtained from photographs.
Aside from the possible applicability of state biometric privacy laws, courts will have to grapple with jurors’ expectations of privacy. Our emotions are the foundation of individual privacy; they can betray our innermost thoughts. Emotion-reading software, however, can now analyze subconscious micro-movements of the face. A blank expression is no longer blank. Software has effectively inched closer to thought-reading.
The public may view this capability as invasive and damaging to personal privacy. Possible use of emotion-analytics software in a courtroom implicates federal and state privacy protections, and perhaps constitutional ones as well. Not surprisingly, laws protecting people’s control over their biological identifiers are explicitly linked to privacy, and emotion analysis will likely elicit the same concerns. Legal protections of privacy are therefore an important consideration.
b. Decorum, Distraction, and Danger
Another issue is the impact on jurors’ focus and freedom from harassment. Courts seek to protect the “solemnity of the proceeding” – the barring of cameras in the courtroom, for example, stems in part from a desire to avoid a “circus atmosphere” detracting from the soberness of trial. And jurors are removed to a private, guarded room to deliberate free from the public gaze. While emotion analysis could be applied in real time without disruptive cameras and wires, a juror’s mere awareness that he or she is being scrutinized could distract from the merits of the trial.
Emotion monitoring could also implicate juror safety and protection. Imagine, for instance, a criminal trial in which each juror’s attitudes toward the defendant are monitored by computer. Does such a prospect open the door for possible intimidation or harassment of jurors who have been identified as sympathetic to one side or the other? If a juror’s vote is swayed by intimidation or harassment, the result could be a mistrial or juror disqualification. Other mistrial motions could be rooted in asserted bias or other irregularities tied to jurors’ emotions. Both the bench and the bar may want to pause to consider that possibility and its potential impact on judicial resources.
Emotion-reading software may prove useful, and its application to mock trials and focus groups could be valuable and entirely acceptable given the subjects’ consent. However, there are perils associated with application to an actual jury in a real courtroom. The impact on privacy could dissuade potential jurors from serving and raise legal issues. The focused atmosphere of the courtroom could be upended by distracting use. As a result, courts may wish to get ahead of the technology by explicitly adopting rules governing its use and making litigants well aware of their stance before the technology has the chance to outpace deliberately set rules.
Rick Martinez is a partner at Robins Kaplan LLP and chair of the firm’s Privacy and Cyber Security Litigation practice. He can be reached at RMartinez@RobinsKaplan.com.