Consumer Reports finds popular voice cloning tools lack safeguards
Several popular voice cloning tools on the market don’t have “meaningful” safeguards to prevent fraud or abuse, according to a new study from Consumer Reports.
Consumer Reports probed voice cloning products from six companies — Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify — for mechanisms that might make it more difficult for malicious users to clone someone’s voice without their permission. The publication found that only two, Descript and Resemble AI, took steps to combat misuse. Others required only that users check a box confirming that they had the legal right to clone a voice or make a similar self-attestation.
Grace Gedye, policy analyst at Consumer Reports, said that AI voice cloning tools have the potential to “supercharge” impersonation scams if adequate safety measures aren’t put in place.
“Our assessment shows that there are basic steps companies can take to make it harder to clone someone’s voice without their knowledge — but some companies aren’t taking them,” Gedye said in a statement.