
Here are some of the ways one visual effects studio is using machine learning tools in production right now

And it’s not only with the dedicated pro VFX tools you might expect (it’s also with ones originally designed purely for social media use).

The topic on the top of so many minds in visual effects right now is artificial intelligence and machine learning. There are, quite simply, new developments every day in the area. But how are all these developments finding their way into VFX usage? befores & afters asked one studio owner during the recent VIEW Conference to find out what they are doing.

Wylie Co. founder and CEO Jake Maymudes started his visual effects studio in 2015. He had previously worked at facilities including The Mill, Digital Domain and ILM. Wylie Co. has in recent times contributed to Dune: Part One and Part Two, Alien: Romulus, Uglies, The Killer, Thor: Love and Thunder, The Last of Us and a host of other projects. The boutique studio works on final VFX, sometimes serving as the in-house VFX team, and commonly on aspects such as postvis.

The biggest change to visual effects that Maymudes has seen in recent times has come with the advent of new artificial intelligence (AI) and machine learning (ML) workflows. The studio has utilized deep learning, neural networks and generative adversarial networks (GANs) on its projects. Some of this relates to dedicated VFX tools; other work, as discussed below, was even done with tools intended purely for social media use.

Wylie Co. Postvis Reel 2023

In terms of the tools now available, Maymudes is adamant that AI and ML workflows will change (and already are changing) the way labor-intensive tasks like rotoscoping, motion capture and beauty work are done in VFX. “There’s so much efficiency to be had by using AI tools,” argues Maymudes. “I see it as really the only way to survive right now in VFX by taking advantage of these efficiencies. I think the whole world’s going to change in the next couple of years. I think it’ll change dramatically in five. I think it’ll change significantly in two. I could be wrong, it could be one.”

Wylie Co. has leapt into this AI/ML world in both small and large ways. On She-Hulk: Attorney at Law, for example, Wylie Co. was utilizing machine learning rotoscoping in 2021 for postvis work on the series. “Back then I wasn’t aware of a single other company that was diving into machine learning like we were,” says Maymudes. “And now, we’ve all had that capability for years.”

The blue eyes of the Fremen in ‘Dune: Part Two’.

A much larger way Wylie Co. used machine learning tools was on Dune: Part Two, tinting thousands of Fremen characters’ eyes blue. That task involved taking training data directly from the blue-tinting VFX work the studio had already done on Dune: Part One and feeding it into Nuke’s CopyCat node to help produce rotoscope mattes. Production visual effects supervisor Paul Lambert, who is also Wylie’s executive creative director, oversaw the training himself. “He’s deep into AI and AI research,” notes Maymudes. “He’s a technologist at heart.”
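CopyCat itself trains a convolutional network on paired frames inside NukeX, so its internals are not shown here. But the core paired frame-to-matte idea can be illustrated with a deliberately tiny stand-in. The sketch below is hypothetical, not Wylie Co.’s actual setup: it “trains” on frames that already have mattes by learning the mean colors of matted vs. unmatted pixels, then predicts a matte for a new frame by nearest-mean classification.

```python
import numpy as np

def fit_matte_model(frames, mattes):
    """Learn class mean colors from paired (frame, matte) training data.

    frames: list of (H, W, 3) float arrays; mattes: list of (H, W) arrays
    where 1.0 marks the region to be matted (e.g. blue-tinted eyes).
    """
    pix = np.concatenate([f.reshape(-1, 3) for f in frames])
    lab = np.concatenate([m.reshape(-1) for m in mattes])
    mean_in = pix[lab > 0.5].mean(axis=0)    # mean color inside the matte
    mean_out = pix[lab <= 0.5].mean(axis=0)  # mean color outside it
    return mean_in, mean_out

def infer_matte(frame, model):
    """Predict a matte for a new frame: 1.0 where a pixel is closer
    to the in-matte mean color than to the out-of-matte mean."""
    mean_in, mean_out = model
    p = frame.reshape(-1, 3)
    d_in = np.linalg.norm(p - mean_in, axis=1)
    d_out = np.linalg.norm(p - mean_out, axis=1)
    return (d_in < d_out).astype(np.float32).reshape(frame.shape[:2])
```

A real network generalizes far beyond mean colors, of course; the point is only the shape of the workflow: existing hand-done mattes become training pairs, and the trained model produces mattes for the remaining frames.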

[You can read more about Wylie Co.’s Fremen blue eyes visual effects in issue #23 of befores & afters magazine.]

Then, there’s a different kind of approach Wylie Co. has taken with AI and ML tools that were not perhaps initially intended to be used for high-end visual effects work. The example Maymudes provides here is in relation to the studio’s VFX for Uglies. On that film, visual effects supervisor Janelle Ralla tasked Wylie with a range of ‘beauty’ work to be done on the characters as part of the Ugly/Pretty story point. Ralla demonstrated a social media app—FaceApp—to Maymudes that she was using to concept the beauty work. The app lets users, on their smartphones, change their appearance.

Original frame inside FaceApp.

“She used this app to generate the images to convey what she wanted to see,” explains Maymudes. “The results were really good, even for those concepts. So, I researched it, and it was an AI-based app. It had used a neural network to do the beauty work. And it did it fast.”

That was an important consideration for Maymudes. The beauty work had to be completed to a limited budget and schedule, meaning the visual effects shots had to be turned around quickly.

After the FaceApp filter was applied.

Here’s what Wylie Co. did using the app as part of its workflow.

“We downloaded FaceApp, then brought in our plates,” says Maymudes. “I took the app and I made hero frames with the shots. Then I would take those hero frames into Nuke. I would create a dataset with these hero frames. Then I would train overnight on my Lenovo workstation with my NVIDIA GPUs for 12 hours. I’d come back in the morning, click a couple buttons, apply the inference, and it worked.”
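The shape of that workflow (hero frames processed once in the app, a model trained on the before/after pairs, inference applied to the rest of the sequence) can be sketched with a toy stand-in. The code below is not CopyCat and not the studio’s pipeline: it fits a simple least-squares affine color transform to the hero-frame pairs and applies it to new frames, purely to illustrate the train-overnight-then-infer pattern Maymudes describes.

```python
import numpy as np

def train_color_transform(src_frames, dst_frames):
    """Fit an affine color transform from paired hero frames.

    src_frames: original plates; dst_frames: the same frames after
    the app's beauty pass. Both are lists of (H, W, 3) float arrays.
    Returns a (4, 3) matrix mapping [r, g, b, 1] -> [r', g', b'].
    """
    src = np.concatenate([f.reshape(-1, 3) for f in src_frames])
    dst = np.concatenate([f.reshape(-1, 3) for f in dst_frames])
    A = np.hstack([src, np.ones((len(src), 1))])  # append bias term
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)   # least-squares fit
    return M

def apply_inference(frame, M):
    """Apply the learned transform to a new frame in the sequence."""
    p = frame.reshape(-1, 3)
    out = np.hstack([p, np.ones((len(p), 1))]) @ M
    return out.reshape(frame.shape)
```

A neural network learns a far richer, spatially varying mapping than a single color matrix, which is why consistent hero frames matter so much: the model reproduces exactly the pairing it was shown.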

Nuke node graph for the beauty work.

“We figured out a good workflow for this work through trial and error,” adds Maymudes. “You have to be very explicit with what you want to tell these neural networks because it’s one-to-one. You’re basically saying, ‘Please do exactly this.’ And if your dataset is messed up that you’re training with, your results are going to be either really bad or not great, but not perfect, no matter what because it’s so one-to-one. It’s so black and white. That’s why using FaceApp was great in this regard because it was so consistent between the hero frames.”

What excites Maymudes about this particular use of an AI/ML tool is that it was actually designed for something else: just a fun social media purpose. “But,” he says, “it has amazing facial tracking for face effects and gags. I mean, a lot of these tools do now. There’s a lot of R&D that has gone into these tools, especially ones relating to your face. Because of that, you can pick and pull little tools here and there to use in visual effects. And if you do that, you can find just insane efficiency. That’s why we used it.”

Original frame.
Final beauty work.

“What we do love at our company are tools that make us better artists,” continues Maymudes. “We have machine learning tools that do re-timing, and upscaling, and morph cuts, beauty work, matte work. All these little things that kind of take the grunt work out of it, which is nice. But I don’t think machine learning is going to stop there. It’s going to transform our industry. I don’t actually know where it’s going to go even with how much I research it and I think about it. Honestly, I think it’s completely unpredictable what visual effects or the world will look like in five years. But the stuff you can do now, well, it’s good, it’s useful. We use it.”
