As a private investigator, I’m interested in how our society reconciles notions of privacy with the collection and dissemination of data, some of which I use in my cases. Investigators aren’t generally purveyors of aggregate data, but in many ways the individualized data we use cuts to the heart of the debate about privacy when it comes to so-called big data. Investigators don’t work with anonymized data; if we use a database to retrieve information about someone we’re investigating, for example, that information is likely to include their date of birth and Social Security number.

To hear the latest in the debate over privacy and big data, I attended an event at Georgetown University’s McCourt School of Public Policy yesterday called “Privacy Principles in the Era of Massive Data.” The talk featured keynote speaker Maureen Ohlhausen, a Commissioner of the Federal Trade Commission (FTC), and a panel of distinguished academics from Georgetown Law Center, Georgetown University, the Brookings Institution, and the Future of Privacy Forum. My goal in attending the event was to feel the pulse of the privacy debate.

Referring to our time as the “Era of the Internet of Things” (a nod to technologist Kevin Ashton), Ohlhausen laid out the FTC’s chief concerns when it comes to privacy and big data:

• General privacy concerns that are not unique to big data

• Privacy concerns that might be exacerbated because of the nature of big data

• Issues that relate to fairness and discrimination

The first issue would include things like data breaches. The second would include things like conforming to the FTC’s Fair Information Practice Principles (FIPPs) as they relate to notifying consumers of the purpose their information will be used for, a difficult task when those collecting the data often have no idea themselves.

Regarding fairness and discrimination, Ohlhausen listed some of the limitations imposed by the Fair Credit Reporting Act (FCRA), pointing out, “The FCRA could be considered the first big data bill.” The FCRA establishes limitations on how certain information, such as credit, insurance, and employment information, can be used, and it also imposes limitations on “random background checks,” according to Ohlhausen.

Incidentally, the FIPPs are fundamentally at odds with how information is collected during investigations, and the private investigations industry has had its own bad experiences with the application of the FCRA. For a time a few years ago, the FTC’s position was that employees had to be notified in advance of any internal investigation, obviously a problem for many types of investigations. That position was later amended to allow some leeway in cases where employees are suspected of violating a law or an existing company policy.

It seems people are always trying to cram the round peg of the private investigation industry into the square hole of consumer privacy. This is in many ways similar to the fallacy of trying to apply lofty principles like those in the FIPPs to the so-called Era of the Internet of Things.

Following Ohlhausen’s remarks, Benjamin Wittes, a Senior Fellow in Governance Studies at the Brookings Institution, pointed out that using the term “privacy” as the chief value in the debate over the effects of the big data revolution “overpromises in terms of its scope of protection to the individual.” He likened it to how some describe the European privacy regime, “the right to be forgotten on the Internet”—a right that sounds terrific in theory but doesn’t exist in practice. He asked whether there could be a “breach of privacy” if users willingly give up their information for the benefit of services.

Wittes argued for a more reasonable and localized measure of “privacy gained,” as opposed to viewing the debate solely through the opaque prism of privacy. In explaining what he means by privacy gained, he gave as an example the days before the prevalence of Internet pornography, when young men buying porn had to do so from a cashier. Because of the Internet, young men (and women too) no longer have to endure the embarrassment of facing a live person to buy porn, the point being that the mere perception of monitoring inhibits our behavior, for better or worse. The same idea applies to searching for information about sexually transmitted diseases online or buying condoms at automated checkout lines in the grocery store. This is privacy gained in action, even if the transactions are still technically traceable.

The investigator in me loves Wittes’ measure of privacy gained, because it seems to cut to the heart of why privacy matters on a personal level without ceding much of the data that might become valuable later for investigations.

Julie Cohen, a Professor at the Georgetown Law Center, disagreed with Wittes, however, arguing that “privacy harms are more than just a creepy feeling.” Cohen described how people are less willing to take personal risks when there is a perceived need for conformity. Privacy allows for a “process of self-creation and experimentation,” she said. Cohen agreed with Wittes that “privacy” is perhaps too nebulous a concept to frame this debate, but she prefers to look at the issue in more structural terms: the “right to hide in the gaps, the right to have that breathing room for self-exploration.”

We need to “appreciate the unknown more,” she said, “not [consider the unknown] as something that should be eliminated.”

Even as a private investigator and a purveyor of information, I’m fine with the idea of embracing the unknown, but when something is made arbitrarily unknowable in a way that impedes the means to justice, I have a problem with that. People certainly have a right to hide in the gaps, but by the same token, given a legally sound reason, I should have a right to look for them there.

The way I see it, to the extent that laws like the FCRA curtail access to specific types of information for use under limited permissible purposes, that is privacy gained. On the other hand, efforts by legislatures (and academics) to install an overarching privacy framework atop today’s exceedingly complex data economy, thereby creating artificial gaps of unobtainable information, should be handled with, as Ohlhausen put it, “regulatory humility.”