Amazon is making Alexa speak in the voice of a deceased loved one, and that has security experts worried.
Connected speakers, and the digital voice assistants built into them, can carry out an impressive range of actions. The only limit seems to be developers' imagination. While most new features are genuinely useful, some are more questionable, or at the very least strange. That is the case with Amazon's latest idea for its voice assistant Alexa: giving it the ability to reproduce voices.
Amazon makes Alexa speak with the voice of a deceased loved one
During the Amazon re:MARS conference, Alexa vice president Rohit Prasad demonstrated a brand-new capability of the American giant's digital voice assistant: the ability to imitate voices. So far, there is no indication of when this feature will be available, or even whether it will ever reach the public.
Strangely enough, Amazon introduced this new ability as a way to honor departed loved ones. The company showed a video in which Alexa reads to a child in the voice of his recently deceased grandmother. Rohit Prasad explained that the company is looking for ways to make its artificial intelligence as personal as possible. "While AI can't take away the pain of loss, it can make memories last." An Amazon spokesperson told Engadget that the feature can create a synthetic voiceprint after being trained on as little as a minute of audio from the person it is meant to replicate.
And that worries security experts
Security experts have long warned that such audio tools, which use text-to-speech technology to create synthetic voices, could pave the way for new kinds of scams. Voice-cloning software has already enabled plenty of mischief, including a 2020 incident in the United Arab Emirates in which fraudsters tricked a bank manager into transferring $35 million by impersonating a company director's voice. That said, crimes involving deepfake audio remain uncommon, and the tools available are still relatively basic.