Ken Strutin

Not all data is created equal. The quality of information depends on how it’s collected, stored, retrieved and interpreted. At every step there is potential for error, human and mechanical. And misguided confidence in the infallibility of machine-generated information, or human judgment, renders probable cause improbable. Thus, in an era when databases have become commonplace extensions of human logic and memory, the data-heavy transactions of law enforcement require meaningful case-by-case scrutiny.

Data Stops

Traffic stops today are enriched by access to data collected from motor vehicle departments and law enforcement agencies and thus have become data stops. The latest decision concerning the accuracy, validity and reliability of such data comes from the Kentucky Court of Appeals in Willoughby v. Commonwealth, 2014 Ky. App. LEXIS 5 (Jan. 10, 2014).

In November 2010, James Willoughby was driving his Jeep Cherokee when he caught the attention of a patrol officer. After taking down the plate number, the officer used his Mobile Data Terminal to run a check through the state’s Automated Vehicle Information System (AVIS), among other sources. The check raised a red flag: “Verify proof of insurance.”

Experience told the officer that “more times than not,” this warning meant that the driver’s insurance had been cancelled. A county clerk, however, would later testify that aside from a lapse or cancellation, the flag could simply mean a change in coverage. Thus, there was already a built-in inconsistency in how different departments interpreted the same computerized data.

During the traffic stop, the officer shone a flashlight into the car and spotted an electric coffee bean grinder. And while Willoughby continued to search for his insurance card, the officer called in another database search for pseudoephedrine sales. It appeared that minutes earlier Willoughby and his female passenger had made such a purchase. The questioning continued.

Finally, the officer gave Willoughby a warning for not having insurance and then conducted a pat down, which uncovered two bags of white powder hidden in the back of his pants.

A search of his vehicle revealed cough medicine containing pseudoephedrine and plastic tubing with white residue. A field test of the bagged powder showed that it was methamphetamine. At this point, Willoughby was formally arrested.

At a pretrial suppression hearing, the traffic stop was upheld based on the arresting officer’s interpretation of the insurance coverage data. Willoughby was convicted after trial and sentenced to 10 years in prison.

The chief issue addressed by the appeals court was the basis for the traffic stop, i.e., the state motor vehicle information database.

The Kentucky Court of Appeals examined a handful of precedents from other jurisdictions that generally found that a negative insurance database warning alone was insufficient for a traffic stop.1 And at a minimum, a hearing was required in those cases to resolve the reliability of the data. Otherwise, a database error or ambiguity could transform innocent behavior into reasonable suspicion.

These courts could not exclude the reality that information might have been incorrectly entered, not updated promptly, incorrectly transmitted, or too indefinite to support a finding of illegality.

As for Kentucky’s AVIS database, the county clerk’s testimony showed that the “verify proof of insurance” indicator could apply to innocent as well as unlawful behavior, and as it turned out Willoughby was properly insured at the time.

No information about the accuracy rate of the database existed. And the officer’s yardstick, “more times than not,” was inadequate opinion evidence. Thus, the trial court’s findings about AVIS reliability were unsupported.

The appeals court understood that the accuracy of the database was crucial because this information impacted a person’s “right to be left alone” as well as the law enforcement function. Without announcing a bright-line rule, it remanded the case for a reliability hearing to decide, among other issues: “[W]hat the various indications provided by AVIS mean, both in theory and in practice; whether the database’s ‘match rate’ can be definitively determined; and how (in)frequently an indication of ‘verify proof of insurance’ indicates that a vehicle is uninsured.”

In the absence of standards, a data accuracy-validity-reliability hearing ought to be a staple of pre-trial probable cause and suppression proceedings.2

This will become increasingly important as new data streams continue to emerge3 and change from tell-tale to tattle-tale.4 And they will create new territories of forensic data that might be harvested for case investigation.5

‘The More You Look’

“For every fact there is an infinity of hypotheses. The more you look the more you see.”6 This is an apt description of what happens when old conclusions are viewed through the eyes of new science and new technology. And someone is always working on a better mousetrap or a better search engine.

Not too long ago, the FBI conducted a review of the country’s national DNA database and discovered 166 errors.7 Set against millions of profiles, that number might seem insignificant, but in each error lurks the possibility of a wrongful conviction or an overlooked lead in a cold case.

“The mistakes were discovered in July, when the F.B.I., using improved software, broadened the search parameters used to detect matches … . In 166 instances, the new search found DNA profiles in the database that were almost identical but conflicted at a single point.”

The catalog of problems resulting in such errors likely resembled that of any database. Indeed, the FBI believed that they might have been due to such common mistakes as typographical or transcription errors.8
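The broadened search described above, which flagged profiles “almost identical but conflicted at a single point,” can be pictured as a locus-by-locus comparison. The sketch below is illustrative only: the profile data are invented, and real CODIS searches use standardized STR typing, not this simplified dictionary lookup.

```python
# Hedged sketch: flagging near-identical DNA profiles that conflict at a
# single locus. Profile data here are invented for illustration.

def conflicting_loci(profile_a: dict, profile_b: dict) -> list:
    """Return loci recorded in both profiles whose alleles differ."""
    shared = profile_a.keys() & profile_b.keys()
    return sorted(locus for locus in shared if profile_a[locus] != profile_b[locus])

# Two database entries that agree everywhere except one locus -- the kind
# of near-match an exact-match search misses but a broadened search finds.
entry_1 = {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24)}
entry_2 = {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 23)}

print(conflicting_loci(entry_1, entry_2))  # a single conflicting locus
```

A single conflicting locus, as here, is exactly the ambiguity the FBI audit surfaced: it may reflect a typographical slip in one entry rather than two different contributors.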

The upshot of this study, as with Willoughby, is that error rates should be the foremost statistic:

In court, prosecutors often describe the strength of DNA evidence against a defendant with numbers that can run into the billions—expressing how unlikely it is that a person chosen at random would also have a DNA profile linked to the crime scene. But the rate of errors by a lab or a technician, a less dramatic topic, can be a much more relevant statistic, many defense lawyers and some scientists said.
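The arithmetic behind that observation can be sketched in a few lines. All figures below are illustrative assumptions, not statistics from any actual laboratory: the point is only that even a modest handling-error rate dwarfs a one-in-billions coincidental-match probability.

```python
# Hedged back-of-the-envelope sketch: a lab's error rate vs. the
# random-match probability quoted in court. Both figures are assumed
# for illustration, not drawn from any real laboratory's records.

random_match_prob = 1 / 1_000_000_000  # "one in a billion" coincidental match
lab_error_rate = 1 / 1_000             # assumed rate of handling/typing errors

# Probability that a reported match is wrong for either reason;
# the larger source of error dominates the sum.
p_false_report = random_match_prob + lab_error_rate - random_match_prob * lab_error_rate

print(f"Coincidental match probability: {random_match_prob:.1e}")
print(f"Assumed lab error rate:         {lab_error_rate:.1e}")
print(f"Lab error is ~{lab_error_rate / random_match_prob:,.0f} times more likely")
```

Under these assumed numbers, the chance of a false report is driven almost entirely by the error rate, which is why defense lawyers argue it is the more relevant statistic.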

Notably, six of these profiles came from New York crime scenes.9

Thus, data integrity and quality control are the red flags that will continue to appear so long as humans are involved in data processing and new math and software reveal unseen errors.10 Moreover, until reviews are conducted in all state databases, and routinely in every database, red flags will be difficult to spot.11

Ghost in the Machine

Forensic technology is not immune to the faults and failures common to all machines.

Recently, a series of Denver burglary cases had to be dismissed due to errors created by a malfunctioning DNA analytic device.12 The samples collected from 11 crime scenes were mis-linked, i.e., the DNA data were attributed to the wrong defendants. Thus, four prosecutions had to be dismissed in which all four defendants had confessed and three had pled guilty. By the time that the mismatched DNA had finally been sorted out, more than two years later, the statute of limitations had run.

It started with a simple, unassuming mechanical error: “The mistake happened after an $80,000 DNA processing machine ‘froze’ while running a tray of 19 DNA samples on June 13, 2011.” Because the directions for fixing the problem were misunderstood or misapplied, the DNA samples were re-sorted incorrectly.13

Such experiences chasten everyone. “‘This is regarding one run, on one day,’ [Lt. Matt] Murray[, the department's chief of staff] said, adding that the $36 million state-of-the-art lab handles more than 2,300 DNA samples a year. ‘DNA is part of the process. It’s often touted as the end-all-be-all in all criminal cases, and that’s just not true.’”

The criminal defense community as well as forensic experts expressed concern that this mechanical mismatching could call into question all state prosecution DNA evidence, notwithstanding the correctness of the analysis. It would be the human equivalent of an honest but mistaken eyewitness who identifies the wrong defendant.14

The most disturbing aspect of this case, and of all such errors, is the role that chance played in its resolution.

“‘Had the instrument not broken down a second time, they would have continued to think those samples processed in 2011 were correctly processed,’ [Phillip] Danielson[, a professor of molecular biology at the University of Denver] said. ‘Incidents of cross-contamination or sample mix-ups can occur and not be detected by the quality-control systems labs currently have in place.’”

Some years back, the German police learned a comparable lesson when they discovered that the Phantom of Heilbronn, a serial criminal linked by DNA to 40 crimes in a two-year period, never existed.15 It turned out that the cotton swabs for collecting the genetic material at each crime scene had been contaminated at the factory. Again, it was only by happenstance that this mistake had come to light.

‘The More We Learn’

“The more we learn about the world, and the deeper our learning, the more conscious, specific, and articulate will be our knowledge of what we do not know; our knowledge of our ignorance. For this indeed, is the main source of our ignorance—the fact that our knowledge can be only finite, while our ignorance must necessarily be infinite.”16

Information access and scientific thinking are, bit by bit, changing the landscape of criminal justice.17 So it is that the National Commission on Forensic Science has finally been convened to pursue its mission of creating professional codes for crime laboratories.18

At its first public meeting on Feb. 3, 2014, Judge Harry T. Edwards, erstwhile co-chair of the National Academy of Sciences Committee on Identifying the Needs of the Forensic Science Community, emphasized: “Judicial review, by itself, will not cure the infirmities of the forensic community.”19

Edwards went on to add: “The adversarial process is not suited to the task of finding ‘scientific truth.’ The judicial system is encumbered by judges and lawyers who generally lack the scientific expertise necessary to comprehend and evaluate forensic evidence in an informed manner. And the judicial system embodies a case-by-case adjudicatory approach that is not well suited to address the systematic problems in many of the forensic disciplines.”20

Indeed, without science, the law can never reach the truth by its own rules.

Imprisoning scientific inquiry within the boundaries of legal precedent serves no one, least of all those in prison. Edwards noted that the 2009 report, “Strengthening Forensic Science in the United States: A Path Forward,” has not penetrated courtroom walls as hoped.21 Thus, while reform offers the promise of improving due process going forward, the convicted, the incarcerated and the unrepresented must wait until these changes filter down to their post-conviction reality.22

Conclusion

The presumption of correctness in forensics, no less than in convictions, is misplaced and overused.23 Balancing the scales of justice requires judges, prosecutors and defense counsel to be vigilant of the ever-present likelihood of human and mechanical errors.24

In assessing forensic evidence of guilt, benefit of the doubt is not something that should be given to humans or machines.25 And luck is not a standard for preventing or uncovering misconduct, poor judgment or mechanical breakdowns.

To be sure, scrupulous new codes for all facets of post-conviction review are needed to make justice a reality for everyone. And in terms of judging guilt, the capacity for doubt should be at least as broad as the capacity for error.

Ken Strutin is director of legal information services at the New York State Defenders Association.

Endnotes:

1. Willoughby, 2014 Ky. App. LEXIS 5, at 4-5.

2. See generally Susan Haack, “What’s Wrong with Litigation-Driven Science? An Essay in Legal Epistemology,” 38 Seton Hall L. Rev. 1053, 1077 (2008).

3. See, e.g., In-Car Location-Based Services: Companies Are Taking Steps to Protect Privacy, but Some Risks May Not Be Clear to Consumers (GAO-14-81 Dec. 6, 2013).

4. See Hon Chu et al., “I Am a Smartphone and I Know My User Is Driving” (Microsoft Res. Jan. 7, 2014); Dave Smith, “Google’s Project Tango: 5 Things You Need to Know,” ReadWrite, Feb. 20, 2014 (smart phone that can see by mapping environments in 3D).

5. See, e.g., Google Glass Forensics Blog 1, Computer & Digital Forensics Blog, Jan. 18, 2014; Nick McCann, “Drug Test Database for Truck Drivers Planned,” Courthouse News Serv., Feb. 25, 2014.

6. Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance 171 (1974).

7. Joseph Goldstein, “F.B.I. Audit of Database That Indexes DNA Finds Errors in Profiles,” N.Y. Times, Jan. 25, 2014, at A15.

8. Id.

9. Id.

10. See Ken Strutin, “Math in Justice and the Calculus of Truth,” N.Y.L.J., Jan. 21, 2014, at 5; Ken Strutin, “Calculating Justice: Mathematics and Criminal Law,” LLRX, Dec. 8, 2013.

11. See, e.g., Denise Lavoie, “Mass. Legal Group Says Labs Need More Oversight,” SF Gate, Feb. 10, 2014. See generally Ken Strutin, “Databases, E-Discovery, and Criminal Law,” 15 Rich. J.L. & Tech. 6 (2009); Jason Kreag, Letting Innocence Suffer: The Need for Defense Access to the Law Enforcement DNA Database, SSRN (2014).

12. See Sadie Gurman, “Problem with DNA Robot Led to Denver Police DNA Mix-Up,” Denver Post, Jan. 10, 2014.

13. Id.

14. See generally Ken Strutin, “Forensic Due Process: Lawyering With Science,” N.Y.L.J., March 20, 2012, at 5.

15. See also Claudia Himmelreich, “Germany’s Phantom Serial Killer: A DNA Blunder,” Time, March 27, 2009.

16. See Karl Popper, “On the Sources of Knowledge and Ignorance,” Proceedings of the British Academy, 46, 69 (1960).

17. See Strengthening Forensic Science: A Progress Report (White House 2014).

18. See “New Forensics Commission to Develop Professional Code,” The Crime Rep., Jan. 13, 2013. See generally Ken Strutin, “Ethics and Experts: Courts Look to the Web to Find Professional Standards,” N.Y.L.J., Oct. 11, 2005, at 5.

19. Hon. Harry T. Edwards, Reflections on the Findings of the National Academy of Sciences Committee on Identifying the Needs of the Forensic Science Community at First Public Meeting of the National Commission on Forensic Science (OJP Feb. 3, 2014), at 2.

20. Id. at 5.

21. Id. at 2. See generally Ken Strutin, “Strengthening Forensic Science: The Next Wave of Scholarship,” LLRX, Nov. 23, 2009.

22. See, e.g., Rockefeller Introduces Bill to Advance Forensic Science Reform, U.S. Senate Comm. on Commerce, Science and Transportation Press Rel., Feb. 12, 2014 (Forensic Science and Standards Act of 2014).

23. See generally Exonerations in 2013 (Nat’l Reg. of Exonerations) (record-setting 87 documented exonerations in 2013).

24. See, e.g., N.Y. Inspector General Reports: Investigation into the New York City Office of Chief Medical Examiner: Department of Forensic Biology (2013); Report of Investigation of the Monroe County Public Safety Laboratory (2012); Investigation into the Nassau County Police Department Forensic Evidence Bureau (2011); Erie County Department of Central Police Services Forensic Laboratory Final Report (2009); Report of Investigation of the Trace Evidence Section of the New York State Police Forensic Investigation Center (2009).

25. See Martha Neil, “In Wake of Crime Lab Scandal, ACLU Asks Top State Court for Dismissal of 40,000 Cases,” ABA J. Law News Now, Jan. 9, 2014; Deborah Becker, “Despite Scandals, Nation’s Crime Labs Have Seen Little Change,” NPR, Jan. 3, 2014; Mark Hansen, “Crime Labs Under The Microscope After a String of Shoddy, Suspect and Fraudulent Results,” ABA J. Law News Now, Sept. 1, 2013.