Hey there, What’s Next readers! Hope you all had a great holiday. I’m still out myself, but wanted to share a fascinating conversation I had with none other than Stanford Law’s Jen King about privacy concerns surrounding commercial DNA collection (think 23andMe—a company King has researched extensively). We also have news on the SEC’s legal setback this week in its push to treat an initial coin offering as a security. My colleague Nate Robson has also been kind enough to wrangle some links to news you can use (and may have missed) around tech in law.
➤➤ Would you like to receive What’s Next as an email? Sign up here.
Does DNA Collection Spell Danger for Consumer Privacy?
While the U.S. has the Genetic Information Nondiscrimination Act to prohibit health insurance and employment discrimination based on a person’s genetic information, consumer privacy protections for anyone handing their DNA to companies like Ancestry.com remain slim.
Stanford Law’s Jen King is more than familiar with the potential policy pitfalls unique to this area of data collection. As director of consumer privacy for Stanford’s Center for Internet and Society, King conducted a study of 23andMe users examining their motivations for taking the test and their perception of risk. I called King to get her take on some of the legal and privacy issues around DNA data collection, where the law stands and what may be in store down the road.
Here are the highlights, condensed for clarity:
Q: What are some of the bigger privacy concerns people should have regarding sharing their DNA with companies like 23andMe?
DNA is unlike any other data that you share or have collected. It is uniquely identifiable, and it’s unchangeable. It’s yours. Forever. You could change your Social Security number. You could change your name. You can’t change your DNA. I think people are used to sharing and giving away a lot of info about themselves, but this is different. You could infer things about other people from your data. DNA is shared with your family. When you give that up, you potentially give up other people’s identifiability and privacy as well.
While we do have GINA, which protects the use of DNA, for this consumer-level DNA testing there are no other restrictions. No use purpose limitation. So while none of the sites today are selling that data, there’s no reason these companies couldn’t change their mind at some point and decide, “Hey, we’re going to sell it.” 23andMe has partnerships with very positive research concerns, but the GlaxoSmithKline partnership is one that I think people might raise a lot of questions around. There’s no guarantee that the findings they derive from this data that people paid to make available won’t be used to further enrich these companies.
Q: Let’s chat a bit more about the legal framework. What are some loopholes around DNA collection?
It’s huge. It’s not covered by HIPAA—there’s no doctor-patient relationship here—so outside of it going to an employer or health insurance company, there are effectively no limitations on what collecting companies can do with it. If they sell aggregated, anonymized data about their customers—data that includes a genetic predisposition—to a data broker, could a medical insurance company buy that? Probably, because it may not be identifiable in the form it’s sold, though under GINA they couldn’t use it to discriminate against the individual. But for the most part, there’s nothing in the U.S. to stop these companies from doing what they want with it.
Q: How does the U.S. stack up against other nations in terms of protections?
I don’t know a definitive answer. I don’t think the GDPR would affect direct-to-consumer genetic testing in any way other than the fact that you have to opt in to everything. There may be more subtleties in how they process the data, but for the most part these companies were not sharing and selling with partners to begin with, so I’m not sure anything in GDPR would make them change their business practices, other than the things related to just running the website.
I do know researchers in Europe have been looking at some of these issues, especially the data ownership one—something the DNA collecting companies dance around both here and the EU. I mean you own the saliva sample, you could probably argue, and you could ask them to destroy it, but who owns the raw genome mapping data? I think in the terms and conditions, to the extent they address that around research and scientific development based on the data, they basically say you have no claim without saying who owns it ultimately.
Q: What sort of suggestions do you have for a legal framework closing loopholes and tightening controls?
Limitations on its use. You probably don’t want to restrict this data in the biomedical context. There are a lot of important and useful reasons why people want to have this type of testing done beyond a recreational purpose. But I’m more concerned about the lack of limitations on commercial use. And certainly data brokerage. I’d recommend some limitations on consumer marketing usage, especially with individual data.
Also, restrictions on how DNA data is stored and repurposed, because technically, it’s identifiable. With precision medicine and drug development, there will be resistance to de-identifying the data because they want the highest fidelity possible. But maybe in a context where you can argue you could have a less rich sample, maybe more requirements on it being cryptographically signed or stored. Using differential privacy—introducing noise to the genetic sample so it can’t be traced back to you, but allows you to study the mutation of particular genes.
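The differential privacy idea King mentions can be made concrete with a minimal sketch: instead of releasing an exact count (say, how many study participants carry a particular variant), you release the count plus calibrated Laplace noise, so no single person's inclusion can be inferred. The function names and simulated data below are illustrative, not any company's actual pipeline.

```python
import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via inverse-transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(flags, epsilon):
    """Return a noisy count of True values.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon yields epsilon-differential privacy for the count.
    """
    true_count = sum(1 for f in flags if f)
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: 10,000 simulated participants, ~12% of whom
# carry some variant of interest.
carriers = [random.random() < 0.12 for _ in range(10_000)]
print(round(private_count(carriers, epsilon=0.5)))
```

Smaller `epsilon` means more noise and stronger privacy; researchers still see an accurate aggregate, but the reported figure no longer pins down any individual's genotype.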
Q: Is this something the law is dangerously falling behind on?
As we collectively, us and the Europeans and whomever else, engage in the process of future privacy law, we need to get past the world of just behavioral targeting and preferences, and observations, to this biometric level—facial recognition, iris scanning all fall into that bucket of identifiability. Having some hard discussions—are we going to put any limits on what we do with this data? Because I think there’s some really disturbing political, societal-level controlling and sorting uses of this data.
Q: Is there any comfort to be found in the FTC looking into the issue?
I think that’s important. There is a fair amount of misunderstanding out there about what is protected by HIPAA. There were people who just assumed, “Oh well, it’s health related. I think there’s a law. HIPAA can protect me.” Which is not true. I mean, GINA does exist, but no one in my research knew that law by name or could tell me what protections they thought they had.
Q: What sort of DNA disputes do you foresee?
Let’s take paternity tests. A man takes a 23andMe test and discovers his father is not his father because he has a bunch of cousins that aren’t related to any part of his family. Let’s say his father is still alive. Can he use it to sue his father or compel him to take a test? That’s probably more of what we might see. I can imagine someone saying, “Have my parents fraudulently kept information from me my whole life? That I was adopted, or one parent wasn’t my parent—and we have a bad relationship and I want to sue them.”
Q: And we can see how this ties into privacy concerns.
Certainly we know the companies are asked for information. I think a couple have said publicly that unless there’s a proper warrant or something similar, they usually quash all requests that come their way.
Q: What instance do you see being the wakeup call for consumers? What is that Cambridge Analytica moment for DNA?
There’s a third-party app scenario—you open an API to a DNA collector’s data in some form and it ends up being misused in some way. There’s also a flat-out breach, and it could be with a partner. Then the questions start to come into play: Well, how anonymized was it, and can you truly anonymize these samples even if they have no other kind of metadata associated with them? It’s still DNA. I can still take a sample and compare it against something similar in a database, and find a match.
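The matching King describes—re-identifying a sample with no name attached by comparing it against a reference database—can be illustrated with a toy sketch. Every identifier, SNP ID, and genotype below is made up for illustration; real matching works over hundreds of thousands of markers, which is exactly why "anonymized" genetic data is so hard to keep anonymous.

```python
def match_fraction(sample, reference):
    """Fraction of SNP positions present in both profiles where the
    genotype calls agree."""
    shared = set(sample) & set(reference)
    if not shared:
        return 0.0
    agree = sum(1 for snp in shared if sample[snp] == reference[snp])
    return agree / len(shared)

# A hypothetical reference database keyed by known identities.
database = {
    "person_a": {"rs4988235": "CT", "rs1805007": "CC", "rs53576": "AG"},
    "person_b": {"rs4988235": "TT", "rs1805007": "CT", "rs53576": "AA"},
}

# A leaked "anonymous" fragment: no name attached, but the genotype
# calls themselves still single out one database entry.
leaked = {"rs4988235": "CT", "rs1805007": "CC", "rs53576": "AG"}

best = max(database, key=lambda name: match_fraction(leaked, database[name]))
print(best)  # → person_a
```

The point of the sketch is that the data is the identifier: stripping the name from a genotype file removes nothing an attacker with a comparison database actually needs.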
SEC’s Crypto Insecurity
In what appears to be the first federal decision finding that a digital asset offered in an initial coin offering is not a security, a judge in San Diego has turned back a request from the U.S. Securities and Exchange Commission for a preliminary injunction against the backers of the Blockvest ICO.
U.S. District Judge Gonzalo Curiel of the Southern District of California, who previously granted the SEC’s ex parte request for a temporary restraining order and froze the assets involved in the ICO, found on Tuesday that the SEC couldn’t show that investors bought into the Blockvest offering with the expectation of making a profit from the efforts of others—part of the three-part “Howey” test for the definition of a security under the 1946 U.S. Supreme Court decision in SEC v. W.J. Howey Co.
Stanley Morris of Santa Monica’s Corrigan & Morris, who represents Blockvest and its founder, Ringgold, told my colleague Ross Todd it was obvious the judge studied the facts and the applicable precedent.
“It is an extraordinary challenge for defendants facing freeze orders and restraining orders obtained ex parte by the government,” Morris said. “They face a mountain of expedited discovery ordered by the court, with no money to pay professionals to respond.”
Read the full report here.