Amazon wants Alexa to speak with the voices of your deceased loved ones | TechBuzz

Amazon wants to make Alexa speak in the voice of a deceased loved one, and that has security experts worried.

Connected speakers, and the voice assistants that power them, can already perform an impressive range of actions; the only limit seems to be the developers' imagination. While most new features are genuinely useful, some are more questionable, or at the very least strange. That is the case with Amazon's latest idea for its voice assistant Alexa: giving it the ability to reproduce voices.

Amazon wants Alexa to speak with the voice of a deceased loved one

During Amazon's re:MARS conference, Rohit Prasad, vice president of Alexa, demonstrated a brand-new capability of the American giant's voice assistant: the ability to imitate voices. So far, there is no indication of when this feature will be available, or even whether it will ever reach the general public.

Strangely enough, Amazon introduced this new ability as a way to honor departed loved ones. The American firm showed a video in which Alexa reads to a child in the voice of the child's recently deceased grandmother. Rohit Prasad explained that the company is looking for ways to make its artificial intelligence as personal as possible: “While AI can’t take away the pain of loss, it can make memories last.” An Amazon spokesperson told Engadget that the feature can create a synthetic voiceprint after being trained on just a minute of audio from the person it is meant to replicate.

… and that worries security experts

Security experts have long warned that such audio tools, which use text-to-speech technology to create synthetic voices, could pave the way for new kinds of scams. Voice-cloning software has already enabled plenty of mischief, including a 2020 incident in the United Arab Emirates in which fraudsters used a cloned voice to impersonate a company executive and trick a bank manager into transferring $35 million. That said, crimes involving deepfake audio are still not widespread, and the tools available remain relatively basic.
