Attack of the (Voice) Clones: Protecting the Right to Your Voice
A wide range of tools has been developed to perform vocal cloning, and vocal deepfakes have become a common source of scams and misinformation. These problems have only been exacerbated by the lack of laws and regulations adequate to rein in the use of AI and protect an individual's right to their own voice.
September 23, 2024 at 12:03 PM
8 minute read
In January 2023, AI speech synthesis company ElevenLabs, Inc. released a beta platform for its natural-sounding vocal cloning tool. Using this platform, a brief snippet of a person's voice could generate audio files of the target saying anything the uploader desired. The release triggered a spike in misappropriated vocal cloning, from viral rap songs to parodies of political figures. Recognizing that its software was being widely misused, ElevenLabs installed safeguards to ensure the company could trace generated audio back to a creator. But it was too late. Pandora's box was already open.
© 2024 ALM Global, LLC, All Rights Reserved.