In a letter to a U.S. senator earlier this month, Amazon confirmed that when Alexa is spoken to, it's all ears. Specifically, the company said its smart speaker doesn't always delete transcripts of conversations, even when users manually delete the recordings.

To be sure, Alexa notifies users that it collects recordings and gives them the opportunity to change those settings, a practice that clears most privacy-law hurdles in the U.S., lawyers said. But if Alexa captures medical information or background conversations, the transcripts may fall within the scope of various state laws.
