Amazon uses a child’s dead grandmother in Alexa’s morbid audio deepfake demo

The fourth-generation Amazon Echo Dot smart speaker.

Amazon is figuring out how to let its voice assistant Alexa deepfake anyone’s voice, dead or alive, from a short recording. The company unveiled the feature in a demo at its re:MARS conference in Las Vegas on Wednesday, leaning on the emotional trauma of the ongoing pandemic and grief to drum up interest.

Amazon’s re:MARS conference focuses on artificial intelligence, machine learning, robotics, and other emerging technologies, with technical experts and industry leaders taking the stage. During the second-day keynote, Rohit Prasad, senior vice president and head scientist of Alexa AI at Amazon, demonstrated an in-development Alexa feature.

In the demo, a child asks, “Alexa, can Grandma finish reading me The Wizard of Oz?” Alexa replies, “Okay,” in her typical feminine, robotic voice. But then, the child’s grandmother’s voice comes out of the speaker to read L. Frank Baum’s story.

You can watch the demo below:

Amazon re:MARS 2022 – Day 2 – Keynote.

Prasad said only that Amazon is “working on” the Alexa feature; he did not specify what work remains or when/if it will be available.

He did, however, provide some minute technical detail.

“This required an invention where we had to learn how to produce a high-quality voice with less than a minute of recording, versus hours of recording in a studio,” he said. “The way we made it happen is by framing the problem as a voice conversion task and not a speech generation task.”

Prasad very briefly discussed how the feature works.
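Amazon didn’t go deeper than that, but the “voice conversion” framing Prasad described is a recognizable technique in speech research: condense a short reference clip into a fixed-size speaker embedding, synthesize the desired words in an ordinary stock voice, then convert the stock audio’s timbre toward the target speaker. The Python sketch below illustrates that general pipeline, not Amazon’s actual system; the resemblyzer speaker encoder is a real open source library, while synthesize_stock_voice and convert_timbre are hypothetical placeholders standing in for whatever TTS and voice-conversion models Amazon uses.

```python
# A sketch of the "voice conversion, not speech generation" framing Prasad
# described. Nothing here is Amazon's code: resemblyzer is a real open source
# speaker encoder; the two functions marked HYPOTHETICAL are placeholders.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav


def synthesize_stock_voice(text: str) -> np.ndarray:
    """HYPOTHETICAL: any off-the-shelf TTS that renders `text` in a generic
    stock voice would fill this role."""
    raise NotImplementedError("plug in a TTS system here")


def convert_timbre(audio: np.ndarray, speaker_embedding: np.ndarray) -> np.ndarray:
    """HYPOTHETICAL: a voice-conversion model that keeps the words and
    prosody of `audio` but maps its timbre toward `speaker_embedding`."""
    raise NotImplementedError("plug in a voice-conversion model here")


def read_in_cloned_voice(text: str, reference_clip: str) -> np.ndarray:
    # 1. A speaker encoder condenses a short recording (seconds, not hours
    #    in a studio) into a fixed-size embedding characterizing the voice.
    encoder = VoiceEncoder()
    target = encoder.embed_utterance(preprocess_wav(reference_clip))

    # 2. Generate the speech *content* in an ordinary stock voice...
    stock_audio = synthesize_stock_voice(text)

    # 3. ...then convert only the voice identity toward the target speaker.
    #    Because the words come from step 2, the system never has to learn
    #    to *generate* speech from the target speaker's tiny sample.
    return convert_timbre(stock_audio, target)
```

The point of the split is step 3: since the content is produced by an existing stock voice, the model only has to learn a mapping between voices, which is plausibly why under a minute of target audio can suffice.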

Of course, deepfakes have earned a controversial reputation. Still, there have been some efforts to use the technology as a tool rather than a means for creepiness.

Audio deepfakes in particular, as The Verge noted, have been used in media to compensate when, say, a podcaster flubs a line or when a project’s star dies suddenly, as happened with the Anthony Bourdain documentary Roadrunner.

There are even instances of people using AI to create chatbots that communicate as if they were a lost loved one, the publication noted.

Alexa wouldn’t even be the first consumer product to use deepfake audio to stand in for a family member who can’t be present in person. The Takara Tomy smart speaker, as pointed out by Gizmodo, uses AI to read children bedtime stories in a parent’s voice. Parents reportedly upload their voices, so to speak, by reading a script for about 15 minutes. This differs notably from Amazon’s demo, though, in that the product’s owner opts in by providing their own voice, rather than the product using the voice of someone who likely can’t give permission.

Beyond concerns about deepfakes being used for scams, rip-offs, and other nefarious activity, there are already some troubling things about how Amazon is framing the feature, which doesn’t even have a release date yet.

Before showing the demo, Prasad talked up Alexa as offering users a “companion relationship.”

“In this companion role, the human attributes of empathy and affection are critical to building trust,” the executive said. “These attributes have become even more important in these ongoing pandemic times, when so many of us have lost someone we love. While AI can’t eliminate the pain of loss, it can definitely make their memories last.”

Prasad added that the feature “allows for lasting personal relationships.”

It’s true that countless people are in serious need of human “empathy and affection” in response to the emotional distress brought on by the COVID-19 pandemic. But Amazon’s AI voice assistant isn’t the place to meet those human needs. Nor can Alexa enable “lasting personal relationships” with people who are no longer with us.

It’s not hard to believe that there are good intentions behind this in-development feature, and that hearing the voice of someone you miss can be a great comfort. We could even see ourselves having fun with a feature like this, in theory. Getting Alexa to make a friend sound like they said something silly is harmless. And, as discussed above, other companies are leveraging deepfake tech in ways similar to what Amazon demoed.

But framing an in-development Alexa capability as a way to revive a connection with late family members is a giant, unrealistic, problematic leap. Meanwhile, tugging at the heartstrings by invoking pandemic-related grief and loneliness feels gratuitous. There are some places Amazon doesn’t belong, and bereavement counseling is one of them.
