How The Mill Employs Real-Time Tools For Character Animation

Getting animated performances onto the screen can take time. But Jeffrey Dates, a creative director at Mill+, part of visual effects and creative content studio The Mill, has been on a mission to use real-time tools to make it happen more quickly.

Dates is behind The Mill's 'real-time animation system,' or RTAS, which combines real-time tools including Epic's Unreal Engine and Leap Motion's hand-tracking tech.

The RTAS is still in development, but Dates shared with Cartoon Brew why he wants to implement real-time animation in his work – which is mostly in television commercials – and where he thinks it could go next.

Certainly, the use of real-time tools during production, especially motion capture and game engine systems, is not a new phenomenon. What Dates and The Mill are trying to achieve, however, is something artists can use at their desks, something they can experiment with, and something they can produce as 'final picture.'

So, what is the RTAS? “In technical terms it's a collage of hardwares and ideas melded together,” said Dates. “The system wholly is hardware agnostic, and really refers to a workflow and approach to live animated production. The RTAS has deployed Kinects, Leap Motion units, and the HTC Vive in various iterations of production. We're pulling it all together in Epic's Unreal where we're able to really push the look and quality.”
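
As a rough illustration of that hardware-agnostic idea (a sketch, not The Mill's actual code), a device-independent input layer might reduce each tracker's raw data to named control channels that a single puppet reads every frame. The device stubs and channel names below are made up for the example; they simply stand in for Kinect, Leap Motion, or Vive data feeding a game-engine rig.

// Hypothetical sketch of a hardware-agnostic input layer in the spirit of the RTAS.
// Device classes and channel names are illustrative stand-ins, not real SDK calls.
#include <cmath>
#include <iostream>
#include <map>
#include <memory>
#include <string>
#include <vector>

// A device reduces its raw tracking data to named scalar controls
// (e.g. "jaw_open"), so the puppet never touches hardware directly.
struct ControlFrame {
    std::map<std::string, float> channels;
};

class InputDevice {
public:
    virtual ~InputDevice() = default;
    virtual ControlFrame sample(double timeSeconds) = 0;  // polled once per frame
};

// Stand-in for a hand tracker: finger curl mapped to a jaw control.
class HandTrackerStub : public InputDevice {
public:
    ControlFrame sample(double t) override {
        ControlFrame f;
        f.channels["jaw_open"] = 0.5f + 0.5f * static_cast<float>(std::sin(t * 3.0));
        return f;
    }
};

// Stand-in for a depth-camera face reader driving brow controls.
class FaceReaderStub : public InputDevice {
public:
    ControlFrame sample(double t) override {
        ControlFrame f;
        f.channels["brow_raise"] = 0.5f + 0.5f * static_cast<float>(std::sin(t * 1.3));
        return f;
    }
};

int main() {
    std::vector<std::unique_ptr<InputDevice>> devices;
    devices.push_back(std::make_unique<HandTrackerStub>());
    devices.push_back(std::make_unique<FaceReaderStub>());

    // Per-frame loop: merge every device's channels into one control frame,
    // which an engine-side character rig would read to drive the puppet live.
    for (int frame = 0; frame < 5; ++frame) {
        double t = frame / 30.0;  // assume a 30 fps update
        ControlFrame merged;
        for (auto& d : devices) {
            for (auto& [name, value] : d->sample(t).channels) {
                merged.channels[name] = value;  // later devices override earlier ones
            }
        }
        std::cout << "frame " << frame;
        for (auto& [name, value] : merged.channels) {
            std::cout << "  " << name << "=" << value;
        }
        std::cout << "\n";
    }
}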

For a while now Dates has been experimenting with real-time digital puppets (see the video, below). He produced several in-house Mill demos before relying on the system to make some 'Vonster' spots for client Monster.com. These featured a very hairy creature that was puppeteered in real time doing all sorts of human-like behaviors.

“The Vonster – virtual monster – spots were RTAS' first commercial application,” said Dates. “The highlight was the 2.7 million self-shadowing, simulated real-time hairs on the same asset that was used for the television commercial. Even the groom was ported over to Unreal. 'Vonster' became our proof-of-concept to demonstrate how real-time animation is a performance medium closer to live-action than the traditional mindset of animation.”

Although the quality of the final animation out of the RTAS is crucial, Dates says the biggest challenge in making the system real-time was not in motion or rendering. Instead, it was the challenge of transcribing controls so that a single performer had enough to bring life to a character. “We worked closely with a professional puppeteer who was able to give us feedback on what makes it performance friendly,” explained Dates. “These subtleties of control are as important as animator controls on any traditional character rig. A happy animator means a better performance.”

The idea was also to make the system behave just like a live-action puppet. Leap Motion's hand-tracking provides the hand dexterity side of things, while the RTAS also enables the reading of faces for facial emoting. Since it is currently designed to be used at a desktop, full-body capture is still in the works (although Dates is certainly aware of the many markerless capture solutions out there). He says the big difference in his approach is not just transcribing motion capture, but generating a relatable character performance.

“Without that, I worry that just strapping a cg character to real-time motion capture will be stilted with the shortcomings of the late '90s when motion capture was thought to be an animation replacement. I firmly believe that full-body motion capture is great when you need performance by a human grounded in reality. However, the moment you want to tell Toy Story, you're going to need something more sophisticated.”
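
One way to picture that distinction (purely a sketch under assumed numbers, not a description of the RTAS internals) is a small performance layer between the capture data and the rig: rather than wiring raw values straight onto a character, the signal can be smoothed to remove jitter and exaggerated past its rest pose so the motion reads as a character choice rather than literal human movement. The filter constants here are illustrative only.

// Hypothetical performance layer: raw capture -> smoothed -> exaggerated control value.
#include <cmath>
#include <iostream>

// Exponential smoothing removes sensor jitter; the exaggeration step amplifies
// the departure from the rest pose, a crude stand-in for cartoon-style overshoot.
float stylize(float raw, float& smoothed, float smoothing, float restPose, float exaggeration) {
    smoothed += smoothing * (raw - smoothed);                 // low-pass the raw capture
    return restPose + exaggeration * (smoothed - restPose);   // push past the rest pose
}

int main() {
    float smoothed = 0.0f;
    for (int frame = 0; frame < 10; ++frame) {
        // Fake "raw capture": a small nod with a little sensor noise mixed in.
        float raw = 0.3f * std::sin(frame * 0.4f) + 0.05f * ((frame % 3) - 1);
        float styled = stylize(raw, smoothed, 0.4f, 0.0f, 1.6f);
        std::cout << "frame " << frame << "  raw=" << raw
                  << "  stylized=" << styled << "\n";
    }
}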

The Mill is also no stranger to real-time solutions. Last year, they partnered with Epic to use Unreal Engine for a Chevrolet virtual production demo called 'The Human Race.' It allowed a car covered in tracking markers to be filmed, and then a virtual CG car model of any description to be skinned over it – in real time.

Meanwhile, Dates notes that with the RTAS, apart from generating faster animated performances, he and other artists are enjoying the process. “I'm using it for final animation and embracing the style inherent to the system. Its strength is when you accept the limitations of live performance, you're able to really generate spontaneous and varying animations live – trying things out, joking around, all with an animated character. It's really a joy to play with and watch being performed.”