Stephen Treglia

Data-protection methodology has been with humans for thousands of years, but only recently has it become a ubiquitous part of our technology-driven lives. Statutes and regulations have evolved mandating its regular use in various arenas of the business world, and even the everyday consumer faces the reality that security hardware and software come with such technology already built in.

Inevitably, legal issues have begun to arise regarding this form of technology. Not surprisingly, search-and-seizure issues regarding law enforcement's attempts to circumvent data-protection methods are at the forefront. The first half of 2017 has produced some interesting results and court analyses.1

Non-Alteration Techniques

Prior to evaluating these court rulings, however, it is important to clarify the differences between the most common data-protection techniques. The legal system has not always been entirely clear in identifying the precise technique at issue in a decision, and differences in how data protection operates, and in what can be done to circumvent a particular technology, can have significant consequences.

Certain methods do nothing to alter the original data; they just make it difficult for a human to see it. The first type of such information protection dates back to at least the fifth century B.C. "Steganography," from the Greek for "covered writing," hides data in plain sight. The data is there, and it has not been changed. You just have to know how to find it.

The earliest version included tattooing instructions on a slave's head, letting the hair grow out, and then having the slave deliver the message personally. Another common method was to write the secret message onto a wooden tablet and cover it with dark wax into which an innocent message was carved.

More modern techniques utilized during the two World Wars included knitting Morse Code into fabric or hiding the message behind postage stamps. The Nazis developed technology permitting the shrinking of data into microdots that could be hidden within other items, such as a typed period or a slit cut into the edge of a postcard.

The most common data-protection technology familiar to the average consumer is the use of a "password" or "passcode" to gain access to a given device or a given set of data (such as a file or document), which is simply another form of access-protection technology. Passwords do nothing to change the protected data; it just cannot be accessed until the proper code is entered. And while this is considered a fundamentally necessary step in today's digital world, its use provides only a minimal level of cybersecurity.

People often use overly elementary passwords (birthdates, spouses' or children's names, everyday words, etc.) that are exceedingly easy to deduce. Moreover, their passwords often remain the same no matter what device or data is being protected. Crack the password on one device or collection of protected data, and you have access to everything.

Even when the most secure password practices are followed, passwords are still fairly easy for a hacker to uncover. For example, a method known as a "brute-force attack" can figure out even the most complicated password in short order.

Such an attack uses powerful computers to rapidly cycle through possible combinations of letters, numbers and symbols until it produces the correct combination to unlock the protected data. Think of the ending of the 1983 movie "WarGames," when the super-computer kept cycling through letters until it discovered the nuclear weapon launch codes.
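The mechanics are simple enough to show in a few lines of code. The sketch below is a minimal Python illustration using a hypothetical four-character target; a real attack applies the same idea at vastly greater speed, typically against a stolen password hash rather than a plaintext value.

```python
import itertools
import string

# Hypothetical target chosen purely for illustration.
TARGET = "cab1"

# Character set to cycle through: lowercase letters and digits.
CHARSET = string.ascii_lowercase + string.digits

def brute_force(target, max_length=4):
    """Systematically try every combination up to max_length characters."""
    for length in range(1, max_length + 1):
        for candidate in itertools.product(CHARSET, repeat=length):
            guess = "".join(candidate)
            if guess == target:
                return guess
    return None

print(brute_force(TARGET))  # prints "cab1" after exhausting the shorter guesses
```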

Coded Data and Encryption

Other types of data protection, however, don't just hide the data; they actually change it into a format unreadable to humans unless returned to its original version. Encryption is the most advanced form of this protection.

Coded data is the earliest of these methods. Probably the most familiar version is the cryptogram puzzle found in the puzzle/comic section of many newspapers, in which each letter of the message has been replaced by another letter of the alphabet. This, however, is a very simplistic form of disguise. A decoder need only focus on the short words, deduce the limited possibilities there, and then substitute each recurrence of that letter wherever it reappears in the coded message. As pieces of the message begin to form in their original format, it doesn't take long to deduce the message in its entirety, a kind of espionage "Wheel of Fortune."
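In code, such a fixed substitution takes only a few lines. The sketch below is a minimal Python illustration with an arbitrary, hard-coded substitution table; because each letter always maps to the same replacement, the pattern-spotting described above defeats it quickly.

```python
import string

# Arbitrary fixed substitution table chosen for illustration: every letter
# of the alphabet always maps to the same replacement letter.
PLAIN = string.ascii_lowercase
CIPHER = "qwertyuiopasdfghjklzxcvbnm"

ENCODE = str.maketrans(PLAIN, CIPHER)
DECODE = str.maketrans(CIPHER, PLAIN)

message = "meet me at noon"
coded = message.translate(ENCODE)    # "dttz dt qz fggf"
restored = coded.translate(DECODE)   # "meet me at noon"

print(coded, "->", restored)
```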

The most sophisticated form of coding is again credited to the Nazis. What happens if the coding for each letter changes each time it appears, through the application of a sophisticated mathematical formula known as an "algorithm"? An "a" might become a "t" the first time it appears, but a "p" the next time.

Nazi Germany perfected this approach with its creation of what was called "the Enigma machine," although Polish mathematicians did pioneering work against its codes between the World Wars. A letter was punched in via a keyboard and traveled through a sequence of circuits that changed it several times, applying the algorithmic formula on its journey to the final coded output. It took the renowned British cryptographer Alan Turing and his team, who developed a forerunner of the modern computer, to crack the code.2
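The core idea, a substitution that shifts with every letter, can be sketched briefly. The example below is a simplified, Vigenère-style Python illustration with a hypothetical repeating key; it is far cruder than the rotor wiring of an actual Enigma machine, but it shows why a repeated plaintext letter no longer maps to a single coded letter.

```python
KEY = "enigma"  # hypothetical repeating key, chosen purely for illustration

def shift(ch, k, sign=1):
    """Shift one lowercase letter by the value of one lowercase key letter."""
    return chr((ord(ch) - ord("a") + sign * (ord(k) - ord("a"))) % 26 + ord("a"))

def encode(text, key=KEY):
    # The key letter, and therefore the substitution, changes at every position.
    return "".join(shift(c, key[i % len(key)]) for i, c in enumerate(text))

def decode(text, key=KEY):
    return "".join(shift(c, key[i % len(key)], sign=-1) for i, c in enumerate(text))

coded = encode("attackatdawn")
print(coded)           # "egbgokeglgin" -- the repeated a's and t's no longer
                       # always come out as the same letter
print(decode(coded))   # "attackatdawn"
```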

Encryption's distinct feature is that it doesn't just change one letter for another. It turns data, again through the use of advanced algorithmic formulae, into total gibberish. The encrypted format can be seen by the computer user, but it is intelligible only to the computer.

In order to return the data to a readable format, the user must possess the necessary decryption code, often referred to as a "key." Applying the correct key to the encrypted data transforms it back to its original version.
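That key-based round trip can be demonstrated in a few lines. The sketch below is a minimal Python illustration using the third-party "cryptography" package; the choice of package is an assumption made for demonstration purposes and is not drawn from any of the decisions discussed here.

```python
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # the decryption "key"
cipher = Fernet(key)

token = cipher.encrypt(b"Meet me at noon")
print(token)                     # unreadable gibberish without the key

print(cipher.decrypt(token))     # b'Meet me at noon' once the key is applied
```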

The unique significance of encryption is that, when properly deployed, it is virtually impossible to decrypt the data without the necessary key. It is often said that if the most advanced computers worked together 24/7, it would take multiple human lifetimes to “brute force” the solution.
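Back-of-the-envelope arithmetic illustrates why. The figures below are assumptions chosen purely for illustration (a 128-bit key and an attacker testing one trillion keys per second), not numbers drawn from any particular case.

```python
# Rough arithmetic behind the "multiple lifetimes" claim, using assumed figures.
keyspace = 2 ** 128                    # possible 128-bit keys
guesses_per_second = 10 ** 12          # assumed one trillion guesses per second

seconds = keyspace / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.2e} years")            # on the order of 10**19 years
```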

An excellent demonstration occurred in the 1992 movie, “Sneakers.” About midway through, the characters portrayed by Dan Aykroyd, David Strathairn, and River Phoenix connect a computer to a chip allegedly containing a master decryption key that can transform any version of encrypted data. To test the validity of the chip, they visit the computer networks of the Federal Reserve Bank, the Air Traffic Controllers, and the North American Power Grid. Upon first arrival at each network, the data on their screen is total gibberish. Upon connecting the network to the master decryption chip, the data on the screen morphs into what authorized users of each network are permitted to see.

Issues in Recent Decisions

Turning to this year's relevant court decisions, one recurring issue is whether a computer user can be compelled by some form of legal process (e.g., subpoena, search warrant, court order) to reveal the access code, the decryption key or an unprotected version of the data sought by law enforcement to investigate criminal activity, or whether such compulsion implicates the computer user's Fifth Amendment right against self-incrimination. The obvious follow-up is what can be done if the subject refuses or claims an inability to comply.

So far, only one case has resulted in a published written decision this year. United States v. Apple MacPro Computer, __ F.3d __ (3d Cir. 2017),3 affirmed a civil contempt order issued by the Eastern District of Pennsylvania. The appellant had been ordered, pursuant to the All Writs Act, 28 U.S.C. §1651, originally enacted in 1789,4 to produce in unencrypted form two seized external hard drives believed to contain thousands of depictions of child pornography, and he was held in contempt by the district court when he failed to do so.5

The Third Circuit recognized past precedent that the compelled production of documents and evidence from a potential subject of a criminal investigation can implicate the subject's Fifth Amendment right to remain silent and refuse production. The act of production could, by itself, establish incriminating facts, such as the evidence's existence and authenticity and/or the subject's custody of it.

In the present case, however, the district court determined that the "foregone conclusion" exception to the act-of-production privilege applied. If the government can establish sufficient proof of the evidence's existence and the subject's ability to access it, any claim of self-incrimination arising from the compelled production fails. Here, the district court found that law enforcement had obtained a statement from the appellant's sister claiming he had shown her hundreds of child pornography depictions stored on the hard drives in question, thereby supporting a "foregone conclusion" holding. The Third Circuit concurred.

A pair of recent cases out of Florida demonstrates how different courts risk applying compelled-decoding directives unequally. A Broward Circuit Court judge ordered a defendant awaiting trial on child abuse charges to serve 180 days in jail for contempt after he refused to turn over to law enforcement a valid password to his iPhone. As he was removed from the courtroom in handcuffs, he loudly swore that he believed the password he had given was the currently active one.

In what could be considered a contrary ruling, a Miami-Dade Circuit Court judge held that an arrested couple, allegedly acting in concert in an online harassment/extortion case, could not be held in contempt for failing to provide the access codes to their digital media. Both claimed that the passage of nearly a year between their arrest and the requested disclosure had caused them to forget their respective passwords.

Conclusion

As expected, the Electronic Frontier Foundation and the American Civil Liberties Union have been active in following these issues and presenting, when permitted, amicus curiae briefs. In contrast, Manhattan District Attorney Cyrus R. Vance Jr. provided written testimony to Congress in March 2016, arguing that denying law enforcement the ability to circumvent data-protection technology could have dire consequences for crime-fighting and the prevention of terrorist acts.

It is safe to assume that neither side in this battle will back down anytime soon, which makes an eventual appearance before, and resolution by, SCOTUS all but inevitable.

Endnotes:

1. This article addresses only incidents that have occurred or been decided this year. The first reported decision to touch on law enforcement compelling a suspect to assist in decrypting protected data is believed to date back to 2009 and is beyond the scope of this article. For some of the earliest decisions on this topic, see Mohan & Villasenor, "Decrypting the Fifth Amendment: The Limits of Self-Incrimination in the Digital Era," University of Pennsylvania Journal of Constitutional Law, Vol. 15 (October 2012).

2. This became the subject of the 2014 movie “The Imitation Game,” but be aware the script both significantly changed the true story (Turing and his co-workers actually got along) and overwhelmingly simplified how the code was cracked.

3. The Third Circuit provided an excellent explanation of the basics of encryption technology in its first footnote.

4. The Third Circuit's analysis of the appellant's challenge to the validity of the All Writs Act order on subject matter jurisdiction grounds will not be addressed in this article.

5. It has been alleged in several media outlets that the appellant is former Philadelphia Police Sergeant Francis Rawls, who, at the time of the Third Circuit's decision, had already served 17 months in prison for failure to comply with the All Writs Act order, with no foreseeable release absent compliance with the order.