An impressive demonstration of body and facial motion capture at SIGGRAPH showed a live actor’s movements transferred in real time to an avatar in a fully digitized scene. The demonstration shows how efficient professional tools are becoming at merging real and digital elements for videogame cutscenes. It could also be seen as a preview of the believable avatars we’ll have when VR headsets include facial tracking.
The system used Unreal Engine — the toolset behind many of the bigger-budget VR games. That same toolset is set to gain mixed reality features in its next update, immediately offering anyone with a green screen, a camera and a VR-ready system the chance to merge real and virtual footage in real time. The project at SIGGRAPH, though, is on a whole different level. It shows a performer transforming into a videogame character for a scene inspired by the upcoming game Hellblade: Senua’s Sacrifice by Ninja Theory.
“We’re looking to work with partners that are interested in using this new technique for their projects, whether this be in games, movies, VR or live performance,” Ninja Theory CEO Nina Kristensen is quoted as saying. “We see this new technology as a game changer.”
The project’s contributors, which include Epic Games, Cubic Motion, 3Lateral, House of Moves and others, see a much more efficient way of doing performance capture on set — reducing a cinematic-creation process that previously could take days or weeks down to a matter of minutes. The system includes facial and full-body movement tracking as well as a sequencer to layer in elements of the scene.
The facial tracking shown in the motion capture demo is still missing from current headsets while the underlying technology gets worked out. Still, some of the tools and techniques used could enhance peer-to-peer communication at home, according to Brian Rausch, head of motion capture and animation studio House of Moves.
“I think some of the tools and techniques used last night could enhance home experiences by adding facial recognition and CG face animation during peer to peer communication,” Rausch said in a prepared statement. “It’s one thing to get real-time running, but it’s another to be able to take director changes on set and have real-time reflect that. House of Moves has focused on production tools for use in our projects as well as stage tools to overhaul the way virtual production and real-time animation is being accomplished.”
Check out the behind-the-scenes video below.