Editor’s note: This article is the second in a three-part series.
In Part I of the analysis of computer search warrants, we discussed how the restrictive conditions imposed by the Vermont Supreme Court upon the search of a PC and an iPad in In re Application for Search Warrant, No. 2010-479 (December 14, 2012), might be constitutionally permissible. In Parts II and III, we will review those conditions.
Ex Ante Conditions
Having rejected the state’s general objection to the warrant’s ex ante conditions, the court turned its attention to the reasonableness of the conditions themselves. The court grouped the conditions as follows:
• (1) relating to the plain view doctrine;
• (2), (3) and (4) requiring that the search be performed by third parties or police personnel segregated from the investigators and requiring that the information be segregated and redacted prior to disclosure;
• (5) and (6) requiring police to use focused search techniques and prohibiting the use of specialized search tools without prior court authorization;
• (7), (8), (9) and (10) pertaining to the copying, destruction and return of data.
Plain View Doctrine
Condition (1) abrogated the plain view doctrine, under which police may view and seize an object without a warrant if they are lawfully in a position from which to view it (e.g., when they are executing a properly issued search warrant), the object’s "incriminating character" is immediately apparent and the officers have a lawful right of access to the object. (See Minnesota v. Dickerson, 508 U.S. 366, 375 (1993).) The court found the first condition invalid for two reasons. First, it found the condition unnecessary. The court reasoned that the restrictions requiring that the search of the computers be done by personnel other than the case agents, and that only evidence relevant to the search warrant be released to the case agents, "obviate application of the plain view doctrine" and so render the restriction unnecessary.
Second, it found the condition was outside of the magistrate’s power to issue. The power of the judiciary, it reasoned, citing to McNabb v. United States, 318 U.S. 332, 340 (1943), "does not … go so far as to allow a judicial officer to alter what legal principles will or will not apply in a particular case."
Search Performed by Third Parties
Turning to conditions (2) through (4), the court found those conditions valid. They required that the search be performed by third parties or trained computer personnel separate from the investigators and operating behind a "firewall"; that those personnel provide the case agents with only "digital evidence relating to identity theft offenses," which evidence had to be "segregated and redacted from surrounding non-evidentiary data before being delivered to the case investigators," "no matter how intermingled" it was; and that they not disclose "their work to prosecutors or investigators."
In so doing, the court relied upon inapposite "precedent," and ignored reality.
The court found that the "application for the warrant in this case requested incredibly broad authorization." The affidavit explained in detail (as already summarized above) the need to search the entirety of each computer hard drive in the lab, and the need to search all digital media found in the residence.
As the court saw it, "the warrant application could not have requested a broader authorization." Because of such a "broad" request, it was, according to the court, "understandable" that the magistrate concluded that "the warrant application did not provide probable cause for such a wide ranging search." The "separation and screening instructions" were proper to "remedy this lack of particularity."
The court’s reasoning here is, simply put, nonsense. Searching a computer does not expose a person to a greater invasion of privacy than in the pre-computer days. Rather, it simply makes it easier for law enforcement to conduct the search. If the police have probable cause to search a house for drugs, drug proceeds, records, etc., putting aside looking into computers and other digital devices for communications and such, the police can look anywhere such evidence could be found, which is to say, everywhere in the house. If 10 people live there, they can look in 10 bedrooms, in every drawer and under every bed. They can read every letter, every diary, etc.
The sole difference between computer searches and physical searches is that computer searches are easier. But there is nothing in the Fourth Amendment’s guarantee of privacy that makes the difficulty of searching an attribute of privacy.
It should also be noted, as the dissent notes, that while the court rejected condition (1) because the magistrate lacked the authority to abrogate the plain view doctrine, it allowed conditions (2) through (4) even though they, de facto, "eviscerated the plain view doctrine." While the court admitted that "the practical consequences of the instructions may be comparable to an abrogation of the plain view doctrine," it asserted that instruction (1) was not the same as instructions (2) through (4) because "the mechanism was critically different," in that in condition (1), the police could not seize evidence they saw in plain view, while in (2) through (4) they were prevented from seeing the evidence, so there was nothing in "plain view."
The sophistry in such reasoning is shameless. One can only imagine the stance the ACLU would take if a governmental body chose not to abrogate the right to assembly, but simply to prevent individuals from entering any place where they could exercise such right. The distinction between "mechanisms," one would guess, would be lost to the ACLU and the court, and properly so.
The court also deflected the state’s objection that exposure of all information to be searched to "third parties" working for law enforcement is exposing the data to law enforcement, and so nothing is accomplished by conditions (2) through (4) in the way of protecting privacy, save for making law enforcement’s job more difficult. The court emphasized that a loss of privacy is measured not simply by whether information is disclosed, but to whom it is disclosed. "If an embarrassing or humiliating piece of personal information must be revealed to someone, it is surely worse to have it revealed to the neighborhood busybody or to one’s boss than it is to have it revealed to a stranger."
In support of its position, the court cited to four cases, none of which supported it and which, in fact, undercut it. In two of those cases, third parties were brought in to work with law enforcement to add expertise, not to screen information before passing it on to law enforcement. The other two cases involved matters where subpoenas were used to gather information. Obviously, the position a person is in when he or she has been subpoenaed is considerably different than when a search warrant for his or her property has been issued. Subpoenas allow the party subpoenaed to search his or her property to determine what is responsive and then make disclosures; they involve no physical seizure.
Using a court or special master to filter data that has been turned over by the party pursuant to a subpoena is far different from having law enforcement personnel filter data that has been seized before passing on the filtered data to other law enforcement personnel. In the cases cited, the filter involved was the judicial branch of government; with conditions (2) through (4), the data has already been seized by the government and the court is ordering that some law enforcement personnel not share findings with other law enforcement personnel. The court's reliance on these cases, then, is simply misplaced.
To try to bring home its point, the court returns to a familiar theme: that there "are just too many secrets on people’s computers, most legal, some embarrassing, and some potentially tragic in their implications, for loose liberality in allowing search warrants." This premise is, to use a word I have, unfortunately, overused in this article, nonsense.
Compare a search of a computer today for evidence of the rape of children to the search of a home (remember, the search of the computers in the instant matter is simply a part of the search of a residence) 30 years ago. In the latter search, every part of that house would have been searched, every writing read, every piece of clothing examined, every prescription drug looked at, etc. Any personal detail embarrassing to the occupants or others would have been discovered. The sole difference between the searches, to make the point again, is not that one is more invasive than the other, but that the computer search is much quicker and more thorough. Privacy, however, is not the right to be searched slowly and without attention to detail.
The court also rejected the state’s two practical objections: (1) that examination by an analyst without input from the case agents "may result in relevant evidence being missed"; and (2) "that the segregation requirement will prevent a dynamic investigation in which the search can be expanded based upon what information is uncovered." The court’s response to these objections is that because the state chooses the analysts, it can bring in people with sufficient skills and educate them regarding the case, and that the case agents and the analysts can have, over the "firewall" between them, a dynamic exchange of information regarding ongoing searches.
The court’s gloss over these practical issues is perhaps the most crucial and revealing error in its analysis. Anyone who has spent any time in law enforcement trying to investigate crimes involving computers (as I did for many years) knows that, at every level, law enforcement lacks the trained personnel and equipment to do the job as it should be done. There are many reasons for this: (1) law enforcement lacks resources generally; (2) digital forensics expertise is particularly expensive to maintain, because it requires (a) personnel who are educated from the start and whose education is constantly updated and (b) keeping equipment up to date, i.e., always buying new equipment because the old equipment is outdated, not worn out; (3) paying law enforcement analysts salaries even remotely competitive with what the private sector pays is impossible; and (4) people interested in forensic analysis are, for the most part, not interested in the culture of law enforcement.
The idea, then, that multiple investigators will be available to participate in "firewalled" investigations is like the unrealistic academic’s model company that Dr. Barbay proposes building in Back to School. When he asks his class where it should be built, realist and neophyte student Rodney Dangerfield, who has actually built companies, responds, "How about Fantasyland?"
Illustrative of the problems with the court’s picture of how an analysis would be conducted is that throughout its opinion, when discussing computer evidence, it consistently refers to "files," i.e., emails, Word docs, spreadsheets, PDFs, etc.
A considerable portion of forensic analysis, however, is devoted to looking at what might be considered "transactional data." That data includes metadata regarding files, particularly the dates and times files were created, last accessed and last modified; actively stored and deleted Internet history; the registry (on Windows operating systems), which records every application loaded onto a hard drive and the first and last attachment dates of every external device; most recently used (MRU) file lists (again, on a Windows operating system); LNK (pronounced "link") files, which indicate when files on a drive were accessed (very helpful when trying to determine whether a user accessed a file from an external device); and many, many more sources of data.
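To make concrete what "transactional data" looks like at the file level, the following is a minimal Python sketch of pulling a file's timestamp metadata through the operating system. It is purely illustrative (the function name and the throwaway file are my own, not anything used by the examiners in this case), and real forensic tools read these values directly from the file system structures rather than through the live OS, which can alter access times.

```python
import os
import tempfile
from datetime import datetime, timezone

def file_timestamps(path):
    """Return a file's timestamp metadata as UTC datetimes.

    Note: st_ctime is the creation time on Windows but the last
    metadata-change time on Unix-like systems.
    """
    st = os.stat(path)
    to_dt = lambda ts: datetime.fromtimestamp(ts, tz=timezone.utc)
    return {
        "modified": to_dt(st.st_mtime),  # last change to file contents
        "accessed": to_dt(st.st_atime),  # last read (may be coarse or disabled)
        "changed": to_dt(st.st_ctime),   # metadata change / creation (see docstring)
    }

# Demonstration on a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"sample file")
    name = f.name
stamps = file_timestamps(name)
for label, dt in stamps.items():
    print(f"{label}: {dt.isoformat()}")
os.unlink(name)
```

An examiner comparing these values across thousands of files (e.g., a "modified" date earlier than a "created" date, which suggests the file was copied from elsewhere) is reviewing exactly the kind of transactional data discussed above, not the personal contents of the files.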
Typically, the case agent would explain the type of information he or she is looking for and the analyst would then run the various searches for such data as he or she deems appropriate. The analyst would review the reports with the case agent and, between the two, come to understand the evidence — this is the "dynamic exchange of information" the state had sought to preserve.
Placing a firewall between the agent and the analyst, however, makes that exchange extremely difficult at best and impossible at worst and, in all cases, lengthens the time the exchange will take, which means not only that the case under investigation will take longer to investigate, possibly leading to dire consequences (e.g., the criminal gets away or commits other crimes in the interim), but also that fewer cases would be investigated.
As for the latter consequence, the only way to avoid it would be to hire more agents and analysts, and that is, practically speaking, not going to happen. In those situations where the case agent performs his or her own analysis, the agent would be forced to choose between roles, with another agent or analyst hired to fill the other — again, a step that, practically speaking, no agency will be able to afford to take.
It should be noted that examination of such transactional data does not lead to the perusing of medical information, sexually laced communications or other personal materials the court seems most eager to protect. The court’s uninformed understanding of how forensic analysis is conducted has led it to accept procedures that do nothing but frustrate the legitimate attempts of law enforcement to understand the evidence.
Finally, it must be emphasized that the court’s fascination with law enforcement examiners looking at the computer user’s personal data has distorted the plain view doctrine in two important ways.
First, while the plain view doctrine emphasizes that the criminal nature of the evidence must be immediately apparent, or the examiner must move on, the court imagines agents obsessing over lurid personal details, a scenario the doctrine itself forbids. Of course, the restraint dictated by the plain view doctrine could be ignored by the examiner, but that is not the issue at hand; any protocol, including the conditions set forth here, can be ignored.
The second distortion of the plain view doctrine, consistent with the first, is that the court makes it appear that the doctrine leads not to the discovery of evidence of other crimes, but only to the perusing of personal information. Imagine the analyst behind the firewall who, searching for evidence of identity theft, uncovers not racy personal emails but, rather, the next 9/11 plot.
If that is too dramatic, imagine he or she uncovers evidence of child abuse or rape, or merely evidence of a burglary ring. What is the analyst supposed to do with this evidence? It goes against not only human nature but the analyst’s oath of office not to inform the agents. Such would happen only in Fantasyland. The court’s solution, then, is not required by the Fourth Amendment, is not good public policy and is not even workable.
Leonard Deutchman is general counsel and vice president of LDiscovery LLC, a firm with offices in New York City, Philadelphia, Washington, D.C., Chicago, San Francisco and Atlanta that specializes in electronic digital discovery and digital forensics.