Stephen Fry claims his voice was stolen using AI to narrate film


Stephen Fry’s voice was allegedly appropriated for a historical documentary – without his knowledge or consent.

Stephen Fry has claimed that his narration of the Harry Potter audiobooks was used as a training dataset for an AI clone of his voice, which an unnamed historical documentary then used, without his permission, to narrate the film.

Deadline reports that Fry, speaking at the CogX Festival in London last week, played a clip of the documentary before revealing: “I said not one word of that — it was a machine. Yes, it shocked me. They used my reading of the seven volumes of the Harry Potter books, and from that dataset an AI of my voice was created and it made that new narration.

“What you heard was not the result of a mash up. This is from a flexible artificial voice, where the words are modulated to fit the meaning of each sentence,” Fry said.

“It could therefore have me read anything from a call to storm parliament to hard porn, all without my knowledge and without my permission. And this, what you just heard, was done without my knowledge. So I heard about this, I sent it to my agents on both sides of the Atlantic, and they went ballistic — they had no idea such a thing was possible.”

We’d imagine the film in question is already facing legal action, but the incident brings into sharper focus the problems on the horizon for voice actors and voice workers, with AI presenting nothing less than an existential threat to their profession. Darth Vader, for instance, is now fully voiced by AI after James Earl Jones signed over the rights to his voice to Lucasfilm, but it seems that less scrupulous companies are appropriating the voices of talent without seeking permission or offering recompense.

The issue also raises the question of what happens when actors cannot identify or challenge the appropriation of their voices – perhaps because they are no longer with us, or because they lack the financial resources to fight this worrying new problem. Fry warns that this is just the beginning, adding: “You ain’t seen nothing yet. This is audio, it won’t be long until full deepfake videos are just as convincing.” Yikes. We’ll bring you more on this one as we hear it.

