Deadpool director Tim Miller suggests that CGI is on the cusp of crossing the uncanny valley – meaning digital characters may soon be indistinguishable from real ones.
Tim Miller has a new animated series coming to Amazon called Secret Level, which uses assets created in Unreal Engine 5 to digitally animate a different videogame world in each episode.
Much of the recent chatter about the series (since its splashy Gamescom announcement) has centred on the fact that one of the episodes focuses on Concord, Sony’s hyped online arena shooter, which launched on 23rd August and performed so poorly that it was cancelled just two weeks later.
While the Deadpool director has expressed hope that Secret Level may in some way help to revive Concord’s fortunes, he’s also excited about lots of other elements of the upcoming series, including the use of Unreal Engine 5 to create the show’s animation. Unreal Engine is becoming a more commonly used tool in Hollywood, both for ‘previz’ (pre-visualisation) work on live-action films and for animation.
The latest Unreal tech to be showcased was MegaLights, a tool that lets filmmakers add 1,000 dynamic lights to a digitally created scene.
Miller was on hand and spoke to Variety about how the use of Unreal could transform Hollywood’s traditional workflow, saying: “The tools that they’re adding to Unreal to allow you to preview your movie, to preview whole scenes, to build whole sequences of scenes at a level that is high quality – because the whole team sees everything. Everybody thinks, we’ll just previs it and everybody will do their thing. But in fact, the previs becomes a blueprint for the film in a way that people can always refer to it, and always do refer to it, almost as gospel.”
A common topic on this site and in the wider film world continues to be the cost of blockbusters and how, sometimes, the economics simply don’t add up. Should tools like Unreal (and its competitors) allow such colossal productions to fine-tune practical elements ahead of time, it could well deliver the kind of clever little savings that might make these tentpole films more viable.
There’s another potential benefit, too: lots of films don’t get any rehearsals, and previz might help fill that gap.
Miller also talked about the potential for the tool to be used within the finished film itself, rather than just as a pre-production application. What’s more, he was surprisingly upbeat about how far the technology has progressed, adding: “I think we need maybe just a little more, another level of iteration, another level of reality, to be able to integrate the Unreal content, digital humans, whatever, with live action. There’s still little ways to go. And as much as I want to say that we’re across the uncanny valley, we’re not. I would say that we’re on the slopes of the other side and climbing upward, but we’re not across it yet.”
Are we just one or two iterations away from being unable to tell the difference between a human actor and a digital double? Miller clearly thinks we’ve passed the halfway mark on the journey towards solving this issue, and he’s more familiar with the technology than most. But what do you think? Is Tim Miller right? Are we about to cross the uncanny valley? Let us know in the comments below.