Will we one day be able to chat with a deceased loved one thanks to Amazon? | TechBuzz

At its re:MARS conference, Amazon unveiled a technology capable of mimicking a person's voice from just a few audio clips. The goal is to let its voice assistant adopt, for example, the voice of your grandmother.

Amazon is playing with fire. The subject of virtual resurrection is not new, but the American giant took a new step with a technology unveiled on June 22 at its re:MARS conference. The scientists behind Alexa, Amazon's voice assistant, have developed a system that can mimic a person's voice after listening to a handful of original audio clips. One minute of speech is enough for the AI to talk at length in a voice resembling the original speaker's.

While this technology could simply be used to improve voice modeling, Amazon wants to go much further. The company presented it as a way to resurrect a dead grandmother's voice, so that talking to Alexa feels like chatting with her. "We are unquestionably living in the golden age of AI, where our dreams and science fiction are becoming reality," said Rohit Prasad, the scientist in charge of Alexa.

Audio deepfakes


Technically, Amazon's feat lies in how little audio needs to be analyzed. Teaching a machine a voice used to take hours of recordings; the e-commerce giant now needs less than a minute. Still, what it does is not unprecedented: imitating a voice from original samples using deep learning (the machine analyzes many samples, relates them, and builds a model on its own) has never been especially difficult. Ethically, however, it poses a problem.
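Amazon has not published how its system works, but few-shot voice cloning is commonly described in three steps: encode short clips of the target speaker into fixed-size "speaker embeddings", average them into a single voice fingerprint, and condition a pre-trained text-to-speech model on that fingerprint. The following is a minimal illustrative sketch of the embedding step only; the encoder, embedding size, and all names here are assumptions for illustration, not Amazon's actual pipeline.

```python
import numpy as np

EMBEDDING_DIM = 256  # assumed embedding size, purely illustrative

def encode_clip(clip: np.ndarray) -> np.ndarray:
    """Stand-in for a trained speaker encoder: maps raw audio samples
    to a unit-length embedding. A real system would use a neural
    network here; this toy version just derives a deterministic
    pseudo-random vector from the clip's contents."""
    seed = int(abs(clip.sum()) * 1000) % (2**32)
    rng = np.random.default_rng(seed)
    emb = rng.standard_normal(EMBEDDING_DIM)
    return emb / np.linalg.norm(emb)

def speaker_embedding(clips: list[np.ndarray]) -> np.ndarray:
    """Average the per-clip embeddings into one voice 'fingerprint',
    then renormalize. A TTS model would be conditioned on this vector."""
    embs = np.stack([encode_clip(c) for c in clips])
    mean = embs.mean(axis=0)
    return mean / np.linalg.norm(mean)

# Simulate three 5-second clips at 16 kHz -- well under a minute of audio total.
sample_rate = 16_000
clips = [np.sin(np.linspace(0.0, f, 5 * sample_rate)) for f in (100.0, 200.0, 300.0)]
fingerprint = speaker_embedding(clips)
print(fingerprint.shape)  # (256,)
```

The point of the sketch is the data economics the article describes: once a speaker encoder has been trained on many voices, adapting to a *new* voice only requires enough audio to compute one embedding, which is why under a minute of speech can suffice.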


Since the imitated person has not consented, we enter the realm of deepfakes. A bit like Thierry Ardisson interviewing Dalida, Amazon plays with the border between the world of the living and the world of the dead, while setting aside the moral question of putting words in the mouth of someone who is no longer there to object. Granted, Alexa will never claim to be your grandmother, but the illusion could fool an unsuspecting user. The case of the Google engineer convinced that his AI is conscious is a good example of this technology's excesses: one can easily fall into the trap and no longer distinguish the real from the unreal. In short, Amazon is opening a Pandora's box. Another problem: the technology can, in principle, imitate anyone, and therefore put any words in a living person's mouth. Amazon risks making widely accessible a technology that is, for now, hard to control.

An Amazon Echo speaker. // Source: Amazon

Far be it from us to deny the benefits such a feat could bring to part of society. Many people who have lost a loved one would dream of being able to chat with them, if only artificially, from time to time. For them, Amazon's announcement undoubtedly raises hope. The concerns bear more on how Amazon would deploy such a system, if the technology ever reaches the market. The limit is no longer technical, but moral. If this step is really taken, what will be the next one?


Admin — techbuzz.asia
