Sir David Attenborough says AI clone of his voice is ‘disturbing’
Now for a game of guess who. Can you tell the difference between one of the most famous voices in broadcasting, Sir David Attenborough, and his AI-generated clone? Well, here’s a clip of the real Sir David talking about his new series Asia:
“If you think you’ve seen the best the natural world has to offer, think again. There’s nowhere else on Earth with so many untold stories. Welcome then to Asia.”
Well, now just take a listen to this:
“If you think you’ve seen the best the natural world has to offer, think again. There’s nowhere else on Earth with so many untold stories. Welcome then to Asia.”
So, was that the same clip played twice, or was that an AI impersonating Sir David Attenborough? I wonder if we can listen to that second clip again? Ah, well, we can. Well, what do you think? It’s incredibly close. Could you tell the difference?
The clips you just heard were found on a website by colleagues here at the BBC, and there are several sites offering AI-generated versions of the trusted broadcaster’s voice. The BBC also contacted Sir David in light of all this, and here is his response:
“Having spent a lifetime trying to speak what I believe to be the truth, I’m profoundly disturbed to find that these days my identity is being stolen by others and greatly object to them using it to say whatever they wish.”
Well, this morning one of those websites posted another clip clarifying its stance, saying it wanted to set the record straight:
“Unless Mr. Attenborough has been moonlighting for us in secret and under an assumed name, with work authorization in the United States, he is not on our payroll. I am not David Attenborough. We are both male British voices, for sure; however, I am not David Attenborough. For anyone out there who might be confused.”
Incredibly close, isn’t it? You really have to do a double-take; if I didn’t know, I wouldn’t know. Let’s talk to Dr. Jennifer Williams, who researches AI audio at the University of Southampton. We’ve been talking about how closely those voices match, but there’s a really serious issue behind all of this. Even some of my colleagues have had AI-generated versions of their voices used for different purposes. Talk to us about how a clone can match so closely and what these voices are being used for.
Dr. Jennifer Williams: “So, there are actually a few different ways you can get a voice clone that matches so closely. One is to scrape the internet for recordings of a target—for example, Sir David Attenborough—collecting enough of his data to train a model of his voice and then, of course, putting words in his mouth. Another way is that it could happen accidentally: there are no safeguards in place to guarantee that a synthetic voice is distinct from any real person’s.”
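A note on the mechanics Dr. Williams describes: the pipeline of collecting recordings of a target, fitting a model of the voice, and then synthesizing arbitrary text is now packaged in freely available open-source toolkits. The sketch below is purely illustrative and assumes the Coqui TTS library and its XTTS v2 model, neither of which is named in the interview; it is meant only to show roughly how little code so-called zero-shot cloning from a short reference sample takes.

```python
# Purely illustrative sketch of the pipeline Dr. Williams outlines, assuming
# the open-source Coqui TTS toolkit and its XTTS v2 model (not named in the
# interview). Given a short reference recording, the model reads arbitrary
# text in a similar-sounding voice.
from TTS.api import TTS

# Load a pretrained multilingual voice-cloning model (downloaded on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# "reference_speaker.wav" is a hypothetical placeholder for a few seconds of
# (consented) audio of the target voice; the text is whatever the operator
# wants that voice to say.
tts.tts_to_file(
    text="There's nowhere else on Earth with so many untold stories.",
    speaker_wav="reference_speaker.wav",
    language="en",
    file_path="cloned_output.wav",
)
```

The point is not the specific tool but how low the barrier is: several other packages and web services offer similar one-call interfaces, which is what makes the misuse Dr. Williams goes on to describe so easy.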
When you say scraping the internet, what kind of mechanisms are used, and who’s doing it? We know there are websites out there using these AI-generated voices to get their messages across, for whatever reason, but what kind of organizations are doing it?
Dr. Williams: “Surprisingly, a lot. Before this interview I did a quick Google search and found tools you could go to right now and get a clone of a voice. I don’t know how they’re making them; they’re probably, of course, scraping the web. But anyone could make a clone of David Attenborough’s voice.”
That’s absolutely extraordinary. From Sir David’s response, it’s clear he’s disturbed and upset. What are the implications of this going forward? What could these different companies use these voices for, especially trusted voices like Sir David’s?
Dr. Williams: “I think some people probably see it as a creative outlet, doing something like humor or parody. Then, of course, there are the nefarious purposes: creating a synthetic voice and presenting it as that of an authoritative figure for misinformation or disinformation. It’s important to make the distinction between creative uses of voice-cloning technology and nefarious uses that falsely present as someone else.”
How aware is the public about the use of these AI-generated clones?
Dr. Williams: “It’s becoming more and more commonplace to talk about voice cloning. I don’t think we need to be in a state of fear and hide ourselves away from the internet, but it is important to be aware that this technology exists. It didn’t exist several years ago, and it wasn’t in everyone’s hands online for free. Raising awareness of the issue and considering legal and regulatory frameworks will help protect people.”
Dr. Jennifer, in terms of learning how to deal with this, what could we do?
Dr. Williams: “I recommend an approach called the SIFT method: stop, investigate the source, find other sources, and then think about the context. Anytime you see something that might be out of place, stop, examine it, and consider whether it’s supported by other evidence or information sources. Then, think about the context. Is this a political advert? What’s the topic?”
Let’s replay the original clip. Dr. Jennifer, bear with us: I’d like to play again the clip of Sir David talking about his new series. To reiterate, this is the real Sir David speaking:
“If you think you’ve seen the best the natural world has to offer, think again. There’s nowhere else on Earth with so many untold stories. Welcome then to Asia.”
For absolute transparency: when we said we were playing the AI-generated clip earlier, we did in fact play the same real clip twice. That’s how confused we were, and we were the ones who were meant to know what we were doing. Now I’m going to play the AI-generated clip. Let’s listen:
“Donald Trump has nominated Florida Congressman Matt Gaetz as the next Attorney General, a move that has generated significant controversy due to Gaetz’s legal history. The James Webb Space Telescope recently made a jaw-dropping discovery, catching sight of supermassive black holes from the early universe. NATO is preparing for the worst-case scenario—a large-scale evacuation of wounded troops in the event of a war with Russia.”
To stress, that was the AI-generated clone of Sir David’s voice. Dr. Jennifer, your reaction to how closely those voices match?
Dr. Williams: “It sounds like Sir David Attenborough to me. Hearing that second round of clips, I’m a little disgusted, actually. This is very serious to think about. When you have a trusted voice like Sir David Attenborough’s, recognized worldwide as an authority and a voice of truth, and then words are put in his mouth about war and politics, things he has never said and may not endorse, it’s very concerning.”
Dr. Jennifer Williams, who researches AI audio at the University of Southampton, thank you for your reaction and insights. We appreciate your analysis. For more information, visit our website.