Digital babies. Altered vocals. Deceased actors back from the grave. A list of major films that have used AI in their making:
NB: The following contains an unavoidable spoiler for Alien: Romulus. Do skip that entry if you haven't seen the film yet.
As the past few months have proven, AI is becoming an increasingly controversial topic both inside the film industry and outside it. As companies like OpenAI talk excitedly about the next generation of Artificial Intelligence replacing millions of jobs over the coming years, Oscar hopeful The Brutalist has sparked a backlash over its use of AI to adjust its actors’ performances.
The consternation has been such that the Academy is reportedly thinking about making it mandatory for filmmakers to disclose their use of AI when submitting their movies. The film industry, in essence, is racing to keep up with an area of technology that has only risen in the public’s consciousness over the past year or two.
In reality, though, filmmakers have been using AI since at least the start of the decade. Marvel made controversial use of generative AI to create the smearing opening title sequence for its TV series Secret Invasion in 2023, but Shang-Chi used a different form of AI for several of its shots two years earlier. As the list below shows, the use of AI has only accelerated between 2021 and the present, and the anger surrounding it has grown with it.
In 2022, the use of machine learning to create a digital Baby Thor went largely unnoticed. Today, the use of AI tools in such films as The Brutalist or Emilia Pérez is enough to harm their chances of winning an Oscar.
This might explain why the makers of The Fall Guy didn’t exactly advertise its (brief) use of AI tech, or why its presence in such films as Sonic The Hedgehog 3 or Deadpool & Wolverine has only recently come to light. (To date, we don’t know exactly how AI was used in those latter two films, hence why they don’t get an entry in this list.)
What follows, then, is a list of high-profile films that have used AI in some form, and how they used it. Of note are the particular forms of AI you will and won’t find here. Machine learning has become increasingly prevalent among VFX studios when it comes to de-aging and face replacement. What’s markedly uncommon – beyond one or two instances – is generative AI, which synthesises artists’ and writers’ work to create sentences or imagery. To date, we haven’t found a high-profile film that has used ChatGPT to write its script, for example.
It’s likely that the movies listed below are only the tip of a much larger iceberg, given that some entries exist only as a short commentary on a VFX studio’s website. As such, it’s a list we’ll keep updating as more examples emerge.
The Irishman (2019)
Where was AI used? Martin Scorsese’s epoch-spanning crime drama used cutting-edge CG trickery to de-age its stars, Robert De Niro, Al Pacino and Joe Pesci. Among the tools ILM used in its lengthy production process was a program called FaceSwap, which employs machine learning to comb through countless reference images of its actors in their earlier movies.
What they said: “We took a look at Casino, GoodFellas, Home Alone, and Godfather II, Godfather III, hundreds of titles,” VFX artist Pablo Helman told Vulture. “After we rendered their performances as younger actors, we would run these images through our database, through an artificial-intelligence program that would look for alike angles and lighting conditions. And the computer program would spew out frames that were similar to the ones we rendered.”
Shang-Chi And The Legend Of The Ten Rings (2021)
Where was AI used? The use of AI in Marvel’s martial arts fantasy adventure passed by largely without comment in 2021 – probably because the film came out before the technology was more widely discussed. A post on Australian VFX studio Rising Sun Pictures’ website, however, tells us that it expanded its “Artificial Intelligence capabilities” in its work on Shang-Chi.
Among the 300 shots Rising Sun worked on, six sequences used machine learning to replace stunt performers’ faces with those of the film’s actors. An early example of a form of ‘deep fake’ technology in a feature film, these sequences took in “30,000 face images across five characters, training five principal models that combined to over 4 million training iterations,” according to the studio.
“We used machine learning tools to train neural network models of principal actors and apply them to the faces of stunt doubles engaged in martial arts fighting,” a VFX expert at the studio writes. “The output is incredibly real and sets a new bar for believability.”
Keep Rising Sun’s name in mind – it’s one you’ll see pop up again elsewhere on this list.
What they said: In a sign of how rapidly machine learning technology is evolving, Rising Sun suggests that its use in Shang-Chi was almost brand new at the time of filming. “This would have been unheard [of] a couple of years ago,” the studio’s managing director Tony Clark said. “What would have been a costly, labour intensive process, is now in the hands of Machine Learning, with no special hardware.”
Thor: Love And Thunder (2022)
Where was AI used? Again, the use of AI in Taika Waititi’s antic superhero sequel has largely gone unnoticed by critics and the movie-going public. In Thor: Love And Thunder’s case, machine learning was used to help create a baby Thor seen in one prominent sequence. A real infant – said to be then-Disney boss Bob Chapek’s son – provided the reference imagery from which an all-digital Baby Thor was created.
What they said: “The advantage of this technique over standard ‘deep fake’ methods is that the performance derives from animation enhanced by a learned-library of reference material,” VFX producer Ian Cope said in a Rising Sun Pictures post. “The result is a full screen photo-real Baby Thor storming into battle.”
Ant-Man And The Wasp: Quantumania (2023)
Where was AI used? Here’s a piece of software we probably won’t hear discussed all that much in the coming years. The creation of Canadian VFX studio Monsters Aliens Robots Zombies (or MARZ for short), Vanity AI is a program designed to digitally de-age actors, adjust hairlines, or remove blemishes from their skin – in other words, precisely the kinds of things big-name stars wouldn’t want people to read about in Hollywood trade papers.
Nevertheless, The Hollywood Reporter revealed in 2023 that Marvel’s Ant-Man And The Wasp: Quantumania used Vanity AI in certain shots involving Cassie Lang, played by Kathryn Newton. The article doesn’t specify exactly how it was used, but it does mention that Vanity AI had also been employed in films and shows including Spider-Man: No Way Home, Stranger Things 4 and First Ladies. It’s one of those technologies that will likely be used in dozens of movies without audiences being any the wiser.
What they said: “An artist can sit down and on one frame mask the areas we want to fix,” said MARZ co-founder Lon Molnar. “So let’s say it’s under-eye bags. You can make that area look proper with that one frame and [Vanity AI] basically extrapolates that not only across the shot but sequences of shots.”
Indiana Jones And The Dial Of Destiny (2023)
Where was AI used? Director James Mangold’s belated Indy adventure opened with an extended action sequence featuring a conspicuously young Harrison Ford. Like so many movies released over the past few decades, the scene used cutting-edge VFX to de-age its star. In this case, ILM used a mixture of tools to make the then 78-year-old Ford look like the 40-something version of himself.
As on The Irishman, ILM’s artists used FaceSwap to comb through countless reference images of a younger Ford. It’s quite a technical feat, though the jury’s out on whether the process has become sufficiently advanced to pass for photoreal. The Uncanny Valley remains a challenge even an adventurer like Indiana Jones can’t quite overcome.
What they said: “We used ILM FaceSwap, which is a proprietary suite of tools rather than a single technology, to do the work,” VFX supervisor Andrew Whitehurst told Art of VFX. “It can use 3D facial tracking and retargeting, keyframe animation, machine learning-driven facial solves, use of the underlying plate, or full CG where appropriate… Part of the skill of the artists is in figuring out how to combine the tools to create the right recipe for each shot.”
The Fall Guy (2024)
Where was AI used? Director David Leitch’s homage to Hollywood’s stunt community was full of practical action set-pieces, but its filmmakers also used some AI trickery here and there. Rising Sun Pictures (we told you that name would pop up a lot here) used VFX quite a bit in a late sequence involving a helicopter – mostly to remove cranes and wires and add in background explosions.
The more eye-catching part of the company’s website, meanwhile, says that it used its Revize machine learning software to replace actors’ faces (much as they did on Shang-Chi) and even alter a line of dialogue. Its usage largely went unremarked at the time of The Fall Guy’s release.
What they said: Per RSP’s website: “In another scene, the producer in the film ‘Gail Meyer’ [played by Hannah Waddingham] says a line that the director wanted to change. Instead of a reshoot, RSP’s REVIZE technology – a fusion of machine learning and artistic approaches – was [used]. The result is Gail delivering a different line seamlessly. The studio also applied this technology to creating amazingly lifelike digital doubles and face swaps.”
Dune: Part Two (2024)
Where was AI used? You may remember that the residents of Arrakis, the Fremen, have icy blue eyes. In director Denis Villeneuve’s first Dune film, released in 2021, each set of eyes had to be manually added by VFX artists; for the sequel, effects studio DNEG used artificial intelligence to automatically detect actors’ eyes and change their colour.
What they said: Paul Lambert, VFX supervisor (via Befores And Afters): “We came up with a different technique, using what we’d learned before from the hundreds of blue eye shots in the first movie and creating a machine learning model, an algorithm trained from those ‘Dune’ shots to find human eyes in an image, which would then give us a matte for the different parts of the eye. We then used this multi-part matte to tint the eyes blue.”
Late Night With The Devil (2024)
Where was AI used? Having opened to largely warm reviews, low-budget horror Late Night With The Devil’s release was marred somewhat when it emerged that generative AI had been employed to create a small number of static interstitial images seen at certain points. The controversy was such that its filmmakers, Cameron and Colin Cairnes, eventually released a statement to address the matter.
What they said: “In conjunction with our amazing graphics and production design team,” the Cairnes siblings wrote in a statement, “all of whom worked tirelessly to give this film the 70s aesthetic we had always imagined, we experimented with AI for three still images which we edited further and ultimately appear as very brief interstitials in the film.”
A Complete Unknown (2024)
Where was AI used? Director James Mangold’s Bob Dylan biopic may look as analogue as its subject’s music sounds, but its 1960s setting required a smattering of VFX trickery to bring to life. When it comes to AI, A Complete Unknown reportedly used Revize (yep, it’s Rising Sun again) to replace a stuntman’s face with Timothée Chalamet’s features in a handful of shots.
Aside from the movies listed here, the range of films in which Rising Sun says Revize has featured is surprisingly broad. Superhero film The Flash (2023) and Baz Luhrmann’s Elvis are both mentioned, though at the time of writing we haven’t been able to confirm in which specific scenes the software was used.
What they said: “The technology was used [in A Complete Unknown] to assist in three brief wide shots on a motorcycle, not involving performance or creative enhancements,” an unnamed Searchlight Pictures spokesperson told Variety. “This technology is commonplace for making stunt people resemble their actors in films. The VFX facility implemented this specific methodology as a tool for the artists to use for only these 3 shots – these type of VFX stunt face replacement shots have been used for decades.”
Furiosa: A Mad Max Saga (2024)
Where was AI used? If you watched George Miller’s post-apocalyptic sequel and wondered how they made child actor Alya Browne look so much like her grown-up counterpart, Anya Taylor-Joy, AI is partly the answer. Revize (see above) was used to blend the two actresses’ features, with Furiosa’s face gradually looking closer to Taylor-Joy’s adult face as the narrative progresses.
The same machine learning process (albeit using another piece of software, Metaphysic) was also used to create a character called The Bullet Farmer, which essentially mixed the performance of actor Lee Perry with the likeness of Richard Carter, who appeared in Mad Max: Fury Road but died in 2019.
What they said: “For this iconic role,” said Metaphysic VFX supervisor Jo Plaete (via Befores And Afters), “our teams at Metaphysic.ai leveraged our neural performance toolset to bring the Bullet Farmer back for Mad Max: Furiosa, channeling Richard Carter’s likeness through the performance of stand-in actor Lee Perry. We meticulously trained our models on footage from Mad Max: Fury Road, ensuring the perfect fusion of Perry’s performance and Carter’s identity.”
Alien: Romulus (2024)
Where was AI used? Director Fede Alvarez made an admirable commitment to using largely traditional filmmaking techniques for his Alien sequel, from puppets to miniatures to physical sets. While plenty of digital VFX can also be found, Romulus has the feel of the original films, released in 1979 and 1986. That is, until the character Rook shows up. A mixture of animatronics and Metaphysic’s deepfake tech (see also Furiosa), Rook was based on the voice and likeness of the late Ian Holm.
Reactions to the resulting android character were mixed, and the effects shots were quietly revised for Alien: Romulus’s home release. Alvarez and his collaborators have pointed out that Ian Holm’s likeness was used with permission from his estate.
What they said: “If you fast forward to the way Rook was done […] it’s not perfect,” said VFX supervisor Eric Barba in an interview with Animation World Network. “The tools are being written and worked on as we speak, and every day are getting better. It’s so much faster, better results, and certainly more affordable for shows with modest budgets. The team at Metaphysic.ai did a great job in adding new tools.”
Here (2024)
Where was AI used? Like Furiosa, Robert Zemeckis’ single-location drama used Metaphysic to digitally de-age (and age up) its characters, most prominently those played by Tom Hanks and Robin Wright. The software was ‘trained’ on dozens of images and videos of the actors in their younger years, and the results were used as a kind of ‘digital makeup’ applied to their physical performances. We’ll leave it up to you to decide whether the results are convincing or mildly terrifying.
What they said: “It becomes part of the production process, rather than getting in the way of it,” VFX supervisor Kevin Baillie told Variety. “This technology doesn’t require dots on faces. It doesn’t require multiple witness cameras or any other sort of intrusive technology.”
“Traditional computer graphics techniques simply couldn’t have achieved what we accomplished,” an entry about the film on Metaphysic’s website reads. “By leveraging our proprietary AI technologies, we created a system that could dynamically transfer performances across different ages, maintaining the emotional nuance and physical characteristics of each actor.”
The Brutalist (2024)
Where was AI used? Given how extensively AI was used in other 2024 releases, it’s notable just how loud the backlash against The Brutalist has been. Perhaps it’s because it’s an Oscar frontrunner that wears its artistry on its sleeve, as opposed to a VFX-heavy blockbuster.
At any rate, the use of AI to refine the pronunciation of certain lines of Hungarian dialogue – uttered by leads Adrien Brody and Felicity Jones – caused such a furore that it might have harmed the film’s Oscar chances. In January 2025, editor Dávid Jancsó revealed that he’d used an AI tool made by a company called Respeecher to improve Brody and Jones’ use of Hungarian. Outrage ensued.
What they said: “Adrien and Felicity’s performances are completely their own,” director Brady Corbet said in the wake of the backlash. “They worked for months with dialect coach Tanera Marshall to perfect their accents. Innovative Respeecher technology was used in Hungarian language dialogue editing only, specifically to refine certain vowels and letters for accuracy. No English language was changed. This was a manual process, done by our sound team and Respeecher in post-production.”
Emilia Pérez (2024)
Where was AI used? The Brutalist isn’t the only Oscar front-runner to have been found to use artificial intelligence. Director Jacques Audiard’s trans gangster musical comedy drama also used Respeecher’s software to adjust star Karla Sofía Gascón’s vocal performance in its musical sequences. Though given the number of other controversies that have encircled Emilia Pérez, using software to make Gascón sound like late 1990s Cher is probably the least of its problems.
What they said: Word of Emilia Pérez’s use of AI came out at the Cannes Film Festival in May 2024, where sound mixer Cyril Holtz talked openly about the use of ‘machine learning voice cloning technology’ in a filmed interview. At first, a traditional vocal double was considered for Karla Sofía Gascón’s songs, before Holtz suggested using AI software to adjust her performance on certain notes that were outside her natural register. Just as Furiosa blended one face with another, Emilia Pérez merges two vocal performances into one – or, at least, that’s the process in layperson’s terms. The whole thing was “painstaking,” according to Holtz.
Film marketing
Finally, a quick entry on film marketing. This is one area where generative AI has popped up in the past, including a set of AI-generated landscape shots used in the promotion of Alex Garland’s war thriller, Civil War. Marvel was accused of using AI in the promotional work for Fantastic Four: First Steps, though the studio (via an anonymous source) insisted it hadn’t used the tech. But with AI being used increasingly in advertising (see last year’s nightmarish Coca-Cola commercials), it’s something we’re likely to see more of, both in future campaigns and across the rest of the film industry.
As VFX artist David Stripinis told The Hollywood Reporter last year: “There are tons of people who are using AI, but they can’t admit it publicly because you still need artists for a lot of work and they’re going to turn against you. Right now, it’s a PR problem more than a tech problem.”
Thank you for visiting! If you’d like to support our attempts to make a non-clickbaity movie website:
Follow Film Stories on Twitter here, and on Facebook here.
Buy our Film Stories and Film Junior print magazines here.
Become a Patron here.