Editor’s note: This is the last in a three-part series.
In Part I of this article, we discussed how the restrictive conditions imposed by the Vermont Supreme Court upon the search of a PC and an iPad in In re Application for Search Warrant, No. 2010-479 (December 14, 2012), might be constitutionally permissible. In Part II, we began our review of those conditions; we conclude that review here in Part III.
Restricting Search Methods
The court next turned its attention to conditions (5) and (6), which require that analysts "use only ‘methods designed to uncover only information for which the state’ had probable cause" and prohibit them from using "sophisticated searching software" such as "specialized ‘hashing tools’ and ‘similar search tools’ without specific authorization of the court." These conditions are ambiguous and foolish, and do nothing to advance the cause of protecting privacy.
What "sophisticated searching software" is, in the context of a computer search, is impossible to fathom. Everything on a computer is sophisticated. The ambiguity of the condition is evidence that those who wrote it did not understand what they were trying to prohibit.
Consider the phrase "specialized ‘hashing tools’ and ‘similar search tools.’" It is impossible to fathom what a search tool "similar" to hashing tools would be, unless one were to think of all searching tools as similar to hashing tools. To explain, a "hash value" is an effectively unique value that can be computed for any piece of digital media, from the smallest fragment of a file to an entire server drive. Using a complex algorithm, the hashing application creates an alphanumeric string unique to the hashed media.
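The point can be illustrated in a few lines of Python, using the standard hashlib library. This is a generic sketch, not any particular forensic tool; SHA-256 is one common hashing algorithm, and the sample data is invented for illustration.

```python
import hashlib

# Two pieces of data differing by a single byte (illustrative values only).
original = b"contents of a file found on the drive"
altered = b"Contents of a file found on the drive"  # one byte changed

# The hashing algorithm reduces each to a fixed-length alphanumeric string.
hash_original = hashlib.sha256(original).hexdigest()
hash_altered = hashlib.sha256(altered).hexdigest()

print(hash_original)  # a 64-character alphanumeric string
# Changing even one byte of the input produces a completely different hash.
print(hash_original == hash_altered)
```

Note that the hash value reveals nothing about the content it was computed from; it is useful only for comparison.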
The way to use hashing in searching a computer is first to generate hash values for all files and then to compare those values (automatically, using a computer application) to a set of hash values for known files. When searching for child pornography, for example, analysts routinely draw upon a library of hash values for known child pornography, comparing those values to those of graphics files found on a computer drive. The entire procedure is done automatically, so that the analyst does not need to look at the contents of any individual file.
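The procedure described above can be sketched as follows. The function and variable names are hypothetical, and the known-hash library here contains a single illustrative value; real analysts draw on curated libraries of thousands of known-file hashes.

```python
import hashlib
from pathlib import Path

# Hypothetical library of hash values for known files (one illustrative entry:
# the SHA-256 hash of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Compute the SHA-256 hash of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def flag_matches(root: Path) -> list[Path]:
    """Hash every file under root and report those matching known values.

    The comparison is entirely automatic: the analyst never opens or
    views the contents of any file, only the list of hash matches.
    """
    return [p for p in root.rglob("*")
            if p.is_file() and file_hash(p) in KNOWN_HASHES]
```

The design reflects the point made above: any file returned by `flag_matches` is, by definition, a known file of interest, and nothing else on the drive is ever displayed to the examiner.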
Simply put, there are no tools "similar" to hashing tools. Furthermore, hashing a file tells the examiner nothing about it; it is only when the hash value is compared to values of known files that it can reveal anything. So, if the court is trying to prohibit anything, it would have to be hash comparisons.
There are, however, no search strategies "similar" to hash comparisons. The only way to make sense of the phrase "similar search tools" is to reduce it to an absurdity by asserting that all searches are similar. The most standard of search strategies, that is, using keywords to identify files that contain those words, would be "similar" in that the search would be conducted across the entire drive and any hits would be of interest. That description, however, fits pretty much every computer search.
The nonsensical meaning of "similar search tools" reveals that it is throwaway language, used to make the condition seem more embracing than it is. The true import of the condition is to prohibit hash comparisons.
Prohibiting hash comparisons, however, is both silly and dangerous. It is silly because it is based upon the fear of "Big Brother" that pervades the opinion. A hash comparison requires that the entire drive be searched. Such would be scandalous, save for the facts that: (1) the search warrant allows for it (law enforcement can search anywhere within the house, including the hard drives, where evidence of the crime may reside); and (2) hash comparisons are the least intrusive searches imaginable. Prior conditions prohibited plain-view searches because of fears that law enforcement eyes would see personal information irrelevant to the investigation; hash comparisons prevent such inadvertent looks at irrelevant data, because the analyst does not look at a file’s contents, only at the hash matches, and any file with a hash match is, by definition, evidence.
In other words, the court is, in one set of conditions, prohibiting plain-view searches and then, in the next, prohibiting (without additional court approval) a search strategy that avoids all of the problems presented when the plain-view doctrine is in force.
The court’s fear of using "sophisticated searching software" is, then, absurd. That the court would abrogate the plain-view exception with one set of conditions and then restrict usage of a search strategy that avoids all of the problems created by the plain-view exception reveals its inherent suspicion and dislike of computer searches, period.
Copying, Returning and Destroying Data
The last set of conditions, (7) through (10), instructs analysts to copy and provide to agents "only evidence ‘relevant to the targeted alleged activities,’" to return "nonresponsive data" at the conclusion of the analysis, to destroy the "remaining copies of electronic data absent judicial authorization otherwise" at the conclusion of the analysis, and to file a return "to indicate precisely what data was obtained, returned and destroyed." At their core, these conditions reveal a fundamental misunderstanding of, or antipathy to, digital forensics, and make it impossible to perform such analysis and to bring such evidence to court.
The court recognized that it is essential to the searching of any digital media that the media itself not be searched but that, instead, a bit-stream, forensic image of the media, sometimes referred to as a mirror image, be created and that it be searched. The court approved of this procedure when it wrote, "We do not conclude that instruction (7) or (8) prevents the segregated search persons from imaging the computer hard drive and other electronic storage media so that the computer and media can be returned to its owner. That procedure gives the state full search capacity while minimizing the interference with the activities of the computer owner. It was used in this case."
The problem with the conditions is that they fail to recognize the second crucial reason for making a forensic image: to authenticate evidence and provide a foundation for it should it be introduced into evidence. Once the forensic image is made, it is hash verified — another crucial use of hashing software in forensics — by creating the hash values for the original media and the image, then comparing them to ensure that they are identical. Hash verification is the proof that evidence found on the image is the same as evidence found on the original.
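Hash verification as described above can be sketched in simplified form, treating the original media and the forensic image as readable byte streams. This is a hypothetical illustration with invented function names; commercial forensic suites perform this verification automatically at acquisition time.

```python
import hashlib

def media_hash(stream, chunk_size: int = 65536) -> str:
    """Hash a readable byte stream (original drive or forensic image) in chunks."""
    h = hashlib.sha256()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

def verify_image(original_stream, image_stream) -> bool:
    """Hash verification: identical hash values are the proof that
    the forensic image is a true copy of the original media."""
    return media_hash(original_stream) == media_hash(image_stream)
```

If the image is later altered in any way, even by a single byte, re-hashing it will produce a value different from the verified one, which is precisely why preserving the intact image matters for authentication.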
If the state were challenged when introducing evidence found on the image, it could produce the forensic image and allow the defendant’s analyst to examine it to confirm that the hash verification was correct and nothing had been changed on the image since its creation (if even one byte of data had been changed, the hash value would have been totally different from the verified one). However, if, upon conclusion of analysis, all nonresponsive data had to be returned to the defendant and copies of it residing in the lab destroyed, that would mean that the nonresponsive data on the forensic image would have to be destroyed. With such destruction, the state would lose its ability to authenticate the evidence it had gathered.
Given this reality, the court’s position is untenable. The court, however, tries to keep hold of that position by supplying the caveat that it did not read applicable condition (9) "as prohibiting the maintenance of evidence for appeals, post-conviction relief and civil liability. … In circumstances where the state can show that digital information should be kept for a specific reason — for example, for an appeal of a dispute over the validity of or compliance with ex ante instructions — the instruction authorizes the state to seek a judicial authorization to delay destruction. Otherwise, the overall procedure leaves a sufficient record for future proceedings."
The problem with the court’s position is that such "circumstances" leading to an exception being made for condition (9) would always arise. When such is the case, the exception does not simply swallow the rule, but proves it foolish.
Computers can be scary to even the most sophisticated of users, and one can conjure up all sorts of Orwellian nightmares involving them. The conditions imposed by the magistrate and upheld by the court in the instant matter, however, do nothing to address any true concerns that computer usage and searching may present. Instead, those conditions, unwisely at a minimum and unconstitutionally at a maximum, work from an ignorance of what is involved in analyzing computers and play on false impressions of such analysis to frustrate law enforcement.
Leonard Deutchman is general counsel and vice president of LDiscovery LLC, a firm with offices in New York City, Philadelphia, Washington, D.C., Chicago, San Francisco and Atlanta that specializes in electronic digital discovery and digital forensics.