The new Personal Voice feature will create a voice that sounds like the user.
It’s intended for users who are nonspeaking or at risk of losing that ability.
But the technology could also create confusion.
“There are many voice biometric-based authentication systems that are going to be in trouble.
What if someone uses a voice clone to gain unauthorized access to a bank account?
Or uses it to spread misinformation?”
The feature uses on-device machine learning to keep users' information private.
A Tool for Voice Cloning Scams?
Voice cloning scams are on the rise.
Criminals use AI to impersonate someone else and trick the victim into giving money or personal information.
The new voice cloning tech on iPhones could lead to unintended consequences, Iyengar said.
He pointed out that iPhone calls are often used as evidence in legal cases.
Personal Voice might blur the line between fiction and reality.
And, she noted, it is illegal to impersonate a law enforcement official or a federal employee, but the law and regulations are less clear beyond those cases.