
Emmy Award winner Margaret Dean is the Head of Studio for SKYBOUND, the home of Invincible and The Walking Dead, and is responsible for the production of original content and studio operations. Inspired at an early age by dramatic black-and-white films from the ’30s and ’40s, Marge discovered the moving image as an art student and delved into her passion for visual storytelling through animation. Known for building studios and animation pipelines, Marge has been responsible for the design or re-design of several studios, and as President of Women in Animation, she is a recognized global leader in advancing women in the field of animation.
As the head of a studio, you are responsible for creating and nurturing the culture, where everyone feels they belong. What I work to do is instill a space that embraces mentoring – not only to expand and diversify the workforce, but because it lends a strong sense of inspiration and community. The flow of shared experience, knowledge and support is critical to building a collaborative environment. Women in Animation’s mentoring program is our most successful initiative and demand continues to grow. What is truly exciting is that our formal mentorship matches planted the seeds to grow new networks. I don’t think you can make your way in this often-challenging industry without people who share their lessons learned, foster your talents and provide encouragement – and as someone who benefitted from great mentors, I’m proud to be in a position to pay this forward.
I was a single working parent early in my career, and the issue of balancing a career and family is highly personal. I was able to figure out a way where I did not have to sacrifice one for the other – but so many parents, particularly women, feel backed into making that tough choice. Women in Animation is focused on the enormous need to provide job flexibility and more support for working parents and caregivers. The number of women who have had to walk away from their jobs because of the high cost and lack of childcare and too few options for hybrid work schedules is startling – even more so due to COVID. We need to do better and we highly encourage partners to join our advocacy.
There is an enormous need to provide job flexibility and more support for working parents and caregivers.
Women in Animation wants to achieve 50/50 parity for women and underrepresented genders in the animated creative workforce by 2025 – and we believe the industry is already committed to that goal. What we’re focused on now is how to make that easier to do. We’ve created a searchable database of more than 6,000 women/diverse gender professionals to dispel that myth of ‘I can’t find anyone to hire.’ We are also working on breaking down barriers to build the pipeline, including creating pathways that do not require going to an expensive art school or college. I’m very excited about our ongoing work with the California Board of Education and The BRIC Foundation to build out training and apprenticeship programs to prepare people for a multitude of jobs and enrich our talent pool.
By CHRIS McGOWAN
Images courtesy of DNEG and Netflix.
A time jet getting ready to jump across decades. Height references included weather balloons and footage of Felix Baumgartner’s record freefall as a reference point for the curvature of the Earth.
Netflix’s The Adam Project is a family drama embedded in time-traveling, world-saving science fiction. To help with the sci-fi aspect, Overall Visual Effects Supervisor Alessandro Ongaro tasked DNEG London with conjuring up unique-looking wormholes, decades-hopping “time jets” and digi-double “time soldiers.”
The wormholes were a key visual effect in The Adam Project. It was decided that the wormholes would have some kind of funnel in the middle through which a jet could disappear. Then each wormhole had to disappear after the jet had gone through.
In the Shawn Levy-directed story, Adam (Ryan Reynolds) is a time pilot from 2050, on an illegal mission to rescue his wife Laura (Zoe Saldana). He crash-lands in 2022, and, as he heals himself and fixes his jet, enlists the help of his 12-year-old self (Walker Scobell). It is the year following the death of their father, Louis Reed (Mark Ruffalo), from which they have never recovered emotionally. Reed was a brilliant quantum physicist who accidentally invented time travel, which has been used by Reed’s former partner, Maya Sorian (Catherine Keener), to enrich herself and create a dystopian future. To undo this terrible timeline, Adam and his younger self travel to 2018 to seek out the help of their younger father. Once there, they must also find a way to make peace with his future absence. Jennifer Garner also stars in the film, portraying Adam’s mother Ellie Reed.
“The wormhole was the main pivot of the movie. DNEG has a great history of doing wormholes and black holes for movies, so we had the tricky task of coming up with something innovative that had not been done before.”
—Mike Duffy, VFX Producer, DNEG
The time jets needed to project velocity and urgency when leaving Earth’s environment. Vibration and camera shake were added to Adam’s jet to give the chase a more frantic feel.
The single most important effect in The Adam Project was, arguably, the wormholes that the time jets create for time jumping. Explains DNEG VFX Producer Mike Duffy, “The wormhole was the main pivot of the movie. DNEG has a great history of doing wormholes and black holes for movies, so we had the tricky task of coming up with something innovative that had not been done before.”
“There was also one instance where we had to digitally replace young Adam’s legs. There is a scene where they are plummeting so quickly towards Earth that they had to rig young Adam on wires to make him appear weightless, but to add to the comedy value we added [a boy’s] legs dangling behind him, which really helped to sell the gag of those couple of shots.”
—Alexander Seaman, Visual Effects Supervisor, DNEG
Time soldiers drop from the jet of Sorian (Catherine Keener) and begin pursuit on hoverboards. Stunt actors in time soldier outfits were digitally scanned on set, with the data used to create digital versions.
Older Adam (Ryan Reynolds), younger Adam (Walker Scobell) and Laura (Zoe Saldana) are in a classic GMC Jimmy truck as they flee the time soldiers through the forest.
DNEG Visual Effects Supervisor Alexander Seaman recalls, “The creation and animation of the wormhole was actually quite a simple 3D task with some fairly rudimentary 3D volumes and shapes which could then be easily animated to be scaled up and down.” The real challenge was working out the unique design of the wormhole, which had to disappear after the jet flew through it. Seaman adds, “Director Shawn Levy guided us towards optical flares and lens distortions as references. We looked at the way that different prisms behaved, and then ultimately decided that the wormhole needed to have some kind of funnel in the middle of it which the ship could disappear through.”
DNEG augmented the time jets’ original designs and designed their cloaking effect. The spectacular dogfights of the swift and agile jets – piloted by older Adam or Sorian’s head of security, Christos (Alex Mallari Jr.) – were “a fairly complex process,” according to Seaman. “We were provided with a few rounds of pre-visualization, as well as some aerial footage and plates of mountains from above the clouds in North America, which we then repurposed to create the same camera angles and speeds. Where this wasn’t possible, we digitally created the parts of the environment, including a digital valley, a digital rock surface and a digital cave. We would then block all of that out and animate the chase. Next, we would assess whether the scene was thrilling or fast enough and augment each shot accordingly. On some occasions the aerial footage wasn’t fast or high enough, so we had to look for ways to re-speed the plates we already had or simply replace it with a CG version of the same thing from a different perspective.”
A time jet in pursuit of the GMC Jimmy in the forest. DNEG had to replace forest, build forest extensions and blend it all with existing plate material.
Continues Seaman, “Once we had established how high and far away from the Earth they wanted to put the chase, we looked at references such as weather balloons and Felix Baumgartner’s world record freefall [in 2012]. This footage proved useful as a reference point for the curvature of the Earth and sense of serenity at that altitude. We also used the Hubble space telescope footage as a reference for how the clouds cast shadows onto the oceans and land masses. We then used some of our own proprietary tools to generate some of the atmosphere effects that you see from the Earth.”
The time jets needed to project velocity and urgency once they left Earth’s environment and reached the edge of space, where everything is calm and serene. Seaman explains, “We had to use a few film-making tricks, including adding a certain amount of vibration and camera shake to Adam’s jet in particular to give it a more frantic feel. We also used an element of ‘space dust’ through the air, which gave a sense of traveling through something that we could justify as water particles. Anytime that the jets got close to each other, we could justify haze or vapor from the jets washing past and over them. When the Sorian jet starts shooting at the time jet, we’ve got the tracers from the guns, which are able to convey a sense of speed and danger as well.”
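DNEG’s actual setup is proprietary, but the kind of vibration Seaman describes is commonly driven procedurally rather than keyframed. As a rough, hypothetical illustration only – none of these names or values come from the production – the Python sketch below layers a few sine waves of increasing frequency and decreasing amplitude to produce a smooth, non-repeating camera shake that can be dialed up for the more frantic jet.

```python
import math
import random

def make_shake(seed=0, layers=3, base_freq=1.5, amplitude=0.02):
    """Return a function t -> (dx, dy) of smooth, layered camera-shake offsets.

    Each layer is a sine wave with a random phase; higher layers are faster
    and weaker, which reads as vibration on top of a slower drift.
    """
    rng = random.Random(seed)
    phases = [(rng.uniform(0, math.tau), rng.uniform(0, math.tau)) for _ in range(layers)]

    def shake(t):
        dx = dy = 0.0
        for i, (px, py) in enumerate(phases):
            freq = base_freq * (2 ** i)   # each octave doubles the frequency
            gain = amplitude / (2 ** i)   # ...and halves the amplitude
            dx += gain * math.sin(freq * t + px)
            dy += gain * math.sin(freq * t + py)
        return dx, dy

    return shake

# Example: a stronger shake on the hero jet's camera during the chase.
jet_cam_shake = make_shake(seed=7, amplitude=0.04)
for frame in range(5):
    t = frame / 24.0   # 24 fps
    print(frame, jet_cam_shake(t))
```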
The older Adam and younger Adam in the cockpit at a dramatic yet comical moment during a chase sequence with Sorian’s jet pursuing them.
Adds Seaman, “One of the features of that sequence was a huge Earth that the environment team did a really good job of creating. If we had kept Earth in the correct position throughout that sequence, you would have only seen it in a couple of shots, so we had to really cheat where the Earth was in relation to the camera in order to keep some kind of visual anchor point as to where they were going and how fast they were moving. In some cases, we even cheated the scale of the Earth to make it feel like they were traveling faster away from it.”
“Once we had established how high and far away from the Earth they wanted to put the chase, we looked at references such as weather balloons and Felix Baumgartner’s world record freefall [in 2012]. This footage proved useful as a reference point for the curvature of the Earth and sense of serenity at that altitude. We also used the Hubble space telescope footage as a reference for how the clouds cast shadows onto the oceans and land masses. We then used some of our own proprietary tools to generate some of the atmosphere effects that you see from the Earth.”
—Alexander Seaman, Visual Effects Supervisor, DNEG
DNEG also worked on the truck chase sequences that involved the Adams and Laura fleeing Sorian in a classic GMC Jimmy. They drive along and through a forest with Sorian’s jet and flying time soldiers in hot pursuit. DNEG had to replace forest and build forest extensions and blend it all with existing plate material. Explains Seaman, “There was a real forest complete with various types of vegetation and a dirt road running through the middle of it. To make the sequence more thrilling, they wanted to replace the dirt road and instead show the [truck] weaving to and fro between various bushes. We used the on-set reference for what the trees and plants looked like and then had a very talented modeling team recreate the same vegetation, as well as a very good environment team effectively fill in the forest for the pieces that were absent.”
Older Adam pilots a time jet through a CGI canyon. Much of that environment was digitally created, including a digital valley, a digital rock surface and a digital cave.
When the time soldiers fly through the forest in pursuit of the truck, cutting between trees on their hoverboards, the sequence is reminiscent of the speeder chase through the forests of Endor in Star Wars: Return of the Jedi. Responds Seaman, “We also felt that there were some influences by Return of the Jedi in the style of the forest and the speed at which the heroes were being chased through it. But this wasn’t something that we were asked to match or reference.”
The time soldiers often required digi-doubles. Seaman notes, “There were real-life stunt actors in time soldier outfits that they digitally scanned on set. They then sent us the data, and we recreated digital versions of the stuntmen in their costumes. We did a couple of varieties of them holding their weapons in different ways with slightly different imperfections to their armor. We then modeled and rigged the hover platforms to their feet. The team had done a really good job of filming the stunt performers through the sequence, but sometimes they were not going quite fast enough. So, in a lot of cases, we digitally re-produced them, using the footage as a reference to see how they moved and how their costumes reacted to the environment, but ultimately digitally replacing them to make them go faster.”
Digi-doubles were also used as replacements for actors in aircraft, especially during the flying scenes. Comments Seaman, “There was also one instance where we had to digitally replace young Adam’s legs. There is a scene where they are plummeting so quickly towards Earth that they had to rig young Adam on wires to make him appear weightless, but to add to the comedy value we added [a boy’s] legs dangling behind him, which really helped to sell the gag of those couple of shots.”
“We had to use a few film-making tricks, including adding a certain amount of vibration and camera shake to Adam’s jet in particular to give it a more frantic feel. We also used an element of ‘space dust’ through the air, which gave a sense of traveling through something that we could justify as water particles. Anytime that the jets got close to each other, we could justify haze or vapor from the jets washing past and over them. When the Sorian jet starts shooting at the time jet, we’ve got the tracers from the guns, which are able to convey a sense of speed and danger as well.”
—Alexander Seaman, Visual Effects Supervisor, DNEG
Closeup of a time jet in the canyons. DNEG was tasked with adding extra detailing to the time jet exteriors and their cockpits.
DNEG has a history of creating wormholes and black holes, and was challenged to come up with something innovative for The Adam Project.
DNEG contributed more than 350 shots spread over eight sequences, out of the 1,432 total VFX shots in the movie. The other visual effects studios working on The Adam Project included Scanline VFX, Lola VFX, Supervixen Studios and Clear Angle Studios, and there was an in-house VFX team. Cameron Waldbauer was Special Effects Supervisor.
By TREVOR HOGG
Images courtesy of Netflix and Alt.vfx, except where noted.
Benedict Cumberbatch portrays malicious rancher Phil Burbank while Kodi Smit-McPhee takes on the role of his brother’s effeminate stepson Peter Gordon in The Power of the Dog. (Image courtesy of Netflix)
With the exception of the Red Mill Inn, the town of Herndon, Montana was CG.
While movie critics praised the performances of Benedict Cumberbatch, Kirsten Dunst, Jesse Plemons and Kodi Smit-McPhee in The Power of the Dog, nothing was ever mentioned about the visual effects work supervised by Jay Hawkins (Wolf Like Me) and produced by Alt.vfx which amounted to over 200 shots. The lack of awareness and recognition is not something that bothers Hawkins. “People ask me, ‘What did you do on The Power of the Dog? That’s not a visual effects film.’ I show them the breakdown and they’re always quite surprised, which makes me happy.” Digital doubles were made to increase the herds of cattle, set extensions were required for the ranch, a town had to be digitally constructed, CG wounds were placed on animals and actors, and the outline of a dog was etched into the rolling hills.
“People ask me, ‘What did you do on The Power of the Dog? That’s not a visual effects film.’ I show them the breakdown and they’re always quite surprised, which makes me happy.”
—Jay Hawkins, Visual Effects Supervisor
Based on the novel by Thomas Savage, the cinematic adaptation by Jane Campion (Bright Star) is set in 1920s Montana, where Phil (Benedict Cumberbatch) wages brutal psychological warfare on the family ranch against the new bride (Kirsten Dunst) and stepson (Kodi Smit-McPhee) of his brother, George (Jesse Plemons). New Zealand doubled for Montana during principal photography, which was conducted by Ari Wegner (Lady Macbeth), who received an Oscar nomination for her contributions. Campion had done some extensive scouting in Montana, where Thomas Savage lived. “I thought it was going to be alpine trees and big logging forests, but that wasn’t the look or terrain that Jane was going for. She wanted vast and open fields, which we found in New Zealand. In terms of changing New Zealand for Montana, we weren’t doing any of that.”
Part of a ranch house was built on a farm in the Hawkdun Range in Maniototo by Production Designer Grant Major (Mulan). “The house had to service all of these different story beats and lines of sight,” explains Hawkins. “On one of the early recces there was a small-scale 3D-printed model of the house. We walked out to the location, which wound up being used for the film, placed and rotated the model around in the light, and started thinking about where the rest of the buildings should be placed.” Extensive previs was utilized for the interior shots, as there was not a budget for big translights, and the preference was to avoid greenscreen or bluescreen. “We came up with this idea of vinyl backdrops [of which we had three],” notes Hawkins. “I did previs for what we would shoot outside of the window, what would be the set’s field of view and what would be the set’s horizon, given that we had a limited size for the backdrop that could be used outside of the window.”
Rocks were digitally constructed to integrate the railroad tracks into the landscape.
A drone captured aerial plate photography of the ranch. “We didn’t do a whole lot of drone footage on the show, and, on that day, it was the arrival of the governor for the dinner scene,” remarks Hawkins. “We had to make sure that the drone stayed at the right altitude so you could see enough of the top of the house, given the fact only half of it had been built. Some practical snow blankets were laid down while the cowboys are running to the front door. In the rough cut, before seeing the shot with the full house and snow, we weren’t sure, but when we started adding snow and rendered the house in post, it came alive.” Grant Major produced concept art for the fictional setting of Herndon, Montana. “While scouting, we couldn’t find something that spoke to Jane,” adds Hawkins, “so the only physically constructed building was the exterior of the Red Mill Inn, which was located a couple hundred meters from the ranch house. The rest of the town is CG.” Drone photogrammetry scans were taken of the ranch house and Red Mill Inn. “When we went to rebuild it,” he says, “we were able to take the real-world measurements of the photogrammetry scan, marry those with the original concept and build from there with texture reference from the practical build.”
“[T]he only physically constructed building was the exterior of the Red Mill Inn, which was located a couple hundred meters from the ranch house. The rest of the town is CG. When we went to rebuild it, we were able to take the real-world measurements of the photogrammetry scan, marry those with the original concept and build from there with texture reference from the practical build.”
—Jay Hawkins, Visual Effects Supervisor
Visual Effects Supervisor Jay Hawkins thought the terrain was going to be alpine trees and big logging forests, but director Jane Campion wanted vast and open fields.
In two different scenes, the shape of a dog was incorporated into the rolling hills. “There was a lot of time spent rotting in vans and discussing things,” recalls Hawkins. “One evening, we climbed this big hill which was being considered as a possible location for the picnic scene. We were watching the sunset on the hills behind the house, and there were these really long shadows that were winding around them. One looked like the face of a witch and another resembled a tiger. Ari and I were sitting in wonder watching the nose of the witch go from being perfect to abstract. We thought, ‘What if the dog was a shadow puppet like that on a hill?’ When I got back from that recce, I worked with my concept artist on a bunch of different versions of the dog. Maybe it would be sculptural or embedded into a rock formation in the hills. However, the shadow throw was so strong and powerful that Jane loved it. We kept refining that concept. You’re trying to sculpt a ridge line that is also a shadow receiver of the ideal shape that you want when the sun is at a certain part of the day. In the end it was a fully 2D effect.”
While the car was practical, the train was a CG asset. “The carriages were based upon the passenger carriages we were able to get for the train station platform shot when they’re arriving at the station,” states Hawkins. “That was captured by the drone unit during COVID-19. We had all of these different options of plates and found that one. Extensive relighting and reworking were required on the plate to get it to work.” Rocks were digitally constructed to integrate the railroad tracks into the landscape. “Initially,” notes Hawkins, “the shot of the people next to the tracks was supposed to have nothing around them. But it felt so naked with just the tracks and the cowboys standing there. We wound up putting in the stockyards, a section of town and additional elements until that shot itself felt correct.” Having the proper number of extras was not an issue. “Our bigger crowd scenes like at the railway station were shot pre-pandemic,” he adds, “and when we were on our interiors, New Zealand was in a fortunate situation where there were zero COVID-19 cases.”
“We thought, ‘What if the dog was a shadow puppet like that on a hill?’ When I got back from that recce, I worked with my concept artist on a bunch of different versions of the dog. Maybe it would be sculptural or embedded into a rock formation in the hills. However, the shadow throw was so strong and powerful that [director] Jane [Campion] loved it. We kept refining that concept. You’re trying to sculpt a ridge line that is also a shadow receiver of the ideal shape that you want when the sun is at a certain part of the day. In the end it was a fully 2D effect.”
—Jay Hawkins, Visual Effects Supervisor
Cattle were an important part of the visual storytelling. A cow library was built in Houdini of different groupable bovine behaviors.
Cattle were an important storytelling and visual element. “Before the film had even started, Ari had a cow breakdown for the different seasons and how many would logically be at the ranch,” remarks Hawkins. “For two or three days, we had real cattle with us. I did a massive texture and behavior study with as many witness cameras as I could. Then I worked with my team to construct a bunch of different groupable behaviors so that Chris Gardner, my technical director, could build them into his Houdini cow library. He had some nice anti-collision things, so if one cow stopped another it would walk around them. It took a while but was quite good. I’m looking forward to another cow film just so we can use it again!” Not everything could be procedural, he adds. “If they were clumped together in a mass, there was always heaps of art direction because we had to integrate it with what was happening in the plate.”
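Gardner’s Houdini cow library itself is proprietary, but the anti-collision behavior Hawkins mentions – one cow walking around another that has stopped – is essentially a steering problem. The Python sketch below is a hypothetical, simplified illustration of that idea (the function names and weights are invented for this example, not taken from the production): each agent blends its direction toward a goal with a push away from any neighbor inside an avoidance radius.

```python
import math

def steer(agent_pos, agent_goal, neighbors, avoid_radius=2.0, avoid_weight=1.5):
    """Blend a goal direction with a repulsion away from nearby agents.

    agent_pos, agent_goal: (x, y) tuples; neighbors: list of (x, y) positions.
    Returns a unit (x, y) direction the agent should walk this step.
    """
    gx, gy = agent_goal[0] - agent_pos[0], agent_goal[1] - agent_pos[1]
    glen = math.hypot(gx, gy) or 1.0
    dx, dy = gx / glen, gy / glen            # normalized goal direction

    for nx, ny in neighbors:
        ox, oy = agent_pos[0] - nx, agent_pos[1] - ny
        dist = math.hypot(ox, oy)
        if 0.0 < dist < avoid_radius:
            push = avoid_weight * (1.0 - dist / avoid_radius)
            dx += push * ox / dist           # push away from the blocker,
            dy += push * oy / dist           # stronger the closer it is
    length = math.hypot(dx, dy) or 1.0
    return dx / length, dy / length

# A cow headed toward the herd gate walks around a stopped cow in its path.
print(steer((0.0, 0.0), (10.0, 0.0), neighbors=[(1.5, 0.2)]))
```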
“Initially, the shot of the people next to the tracks was supposed to have nothing around them. But it felt so naked with just the tracks and the cowboys standing there. We wound up putting in the stockyards, a section of town and additional elements until that shot itself felt correct.”
—Jay Hawkins, Visual Effects Supervisor
Development of the ranch house, which was practically built and extended in CG.
The shape of a dog was etched into the natural landscape.
Only two shots used greenscreen. “The backgrounds were such a high contrast that I wouldn’t be able to get a nice clean roto, and as a result the shots would suffer if I didn’t use greenscreen,” states Hawkins. “Also, the lighting conditions allowed for it.” On set the wounds were done practically. “It was when we were in the edit that we realized more was needed,” Hawkins observes. “That became a fun exercise of Googling things like anthrax and wolf attacks on bears.” The dissection of the rabbit was CG because real animal parts were not allowed on set. As for atmospherics, extensive dust had to be digitally added. “That was fun too,” reveals Hawkins, “because Murray Smallwood, our Compositing Supervisor, was into experimenting with EmberGen as a kit to use inside of Nuke, and he got some wonderful results with that. We put dust into quite a lot of scenes to add life to them.” There were times that the skies had to be altered. “Everything that we did was based on things that were shot,” Hawkins says. “If I wasn’t shooting for visual effects, then I was capturing sky domes, reference out of the windows and time-lapse of clouds to build a library. In that part of New Zealand, we were blessed with so many potentially beautiful skies.”
By TREVOR HOGG
Images courtesy of Universal Pictures and DreamWorks Animation.
A character design plate with contributions from Julien Le Rolland, Taylor Krahenbuhl, Anthony Holden, Pierre Perifel and Jorge Capote.
For Australian author Aaron Blabey, the best way to describe The Bad Guys, a series of illustrated books depicting what are viewed to be despicable creatures trying to redeem themselves, was as “Tarantino for kids.” The cinematic adaptation found a home at DreamWorks Animation, overseen by producer Damon Ross and director Pierre Perifel, who was making his feature directorial debut. The vocal cast features Sam Rockwell as Mr. Wolf, Marc Maron as Mr. Snake, Craig Robinson as Mr. Shark, Anthony Ramos as Mr. Piranha and Awkwafina as Ms. Tarantula. The creative journey began for Perifel in March 2019, with the pandemic lockdown arriving halfway through preproduction.
A character expression sheet of Mr. Wolf with the model created by Hyun Huh and designed by Jorge Capote.
“The bad guys are in the warm colors and a cooler palette when they attempt to be good guys. The police moments would be the regular color of a police car, like deep reds, white and black. When it’s more the desperate moments, it would be desaturated, almost black and white. There is strong lighting in Los Angeles, so we have white skies and warm light.”
—Pierre Perifel, Director
“There is no way you can stick for the long run with something that you don’t like or feel drawn to,” admits Perifel. “The universe of the books struck a chord with me as it could be a heist movie by Quentin Tarantino or Steven Soderbergh. I added my own influences as an animator back in France. Underneath all of this is the journey of Wolf. The idea that people can change and figure out more meaning in their personal lives was something I connected a lot with for personal reasons.” The illustrations from the books had to be altered in order to be cinematic. “The art of Aaron Blabey is simple and efficient,” observes Perifel, “but we had to expand upon it to make a visual experience on the big screen. There are also limitations to his characters that you want to change or rework so you can have them actually moving. A shark without legs in our world would have been difficult to do. The same for Piranha.”
Sam Rockwell voices Mr. Wolf, who attempts to pull off his biggest con job.
Perifel wanted to create a new animation style which combined influences of Hayao Miyazaki and Ernest & Celestine. “The code of anime is that the posing of the characters has a lot to do with economical animation. Over the last few years at the studio, we had tended to rely on video reference and realism for our acting in animation. I didn’t want to forget that, but wanted to try something that was more stylized and illustrative.” A simple color theory was developed by production designer Luc Desmarchelier that reflected the mental state of the main characters. “The bad guys are in the warm colors and a cooler palette when they attempt to be good guys,” explains Perifel. “The police moments would be the regular color of a police car, like deep reds, white and black. When it’s more the desperate moments, it would be desaturated, almost black and white.” The location had an impact on the color palette, Perifel adds. “There is strong lighting in Los Angeles, so we have white skies and warm light.”
Pierre Perifel wanted to create a new animation style that combined influences of Hayao Miyazaki and Ernest & Celestine.
The storyboard by director Pierre Perifel and the final frame that appeared in the movie.
Storytelling drives the technology at DreamWorks Animation. “The head of layout, Todd Jansen, wanted to give us an anamorphic lens, which is what you usually do in live-action because it has a Los Angeles film vibe to it,” states J.P. Sans, Head of Character Animation for The Bad Guys. “We wrote tools to have this lens distortion whenever we needed to. The other tool that we had was a comic-book style, so there were a lot of drawing effects. We could draw motion blur and multiple legs for when a character was spinning around, instead of using rigs and CG elements. Everything felt handmade but still had that CG aspect, so it feels like a hybrid.” Animation tests involved copying 2D films frame by frame into CG, which were then shown to Perifel. “It was a great way to find our parameters of, ‘Are we close or are we too far off?’” states Sans. “The style that we found was removing some of that motion in CG and letting the mind fill in the blanks like you do in 2D.”
“We wrote tools to have this [anamorphic] lens distortion whenever we needed to. The other tool that we had was a comic-book style, so there were a lot of drawing effects. We could draw motion blur and multiple legs for when a character was spinning around, instead of using rigs and CG elements. Everything felt handmade but still had that CG aspect, so it feels like a hybrid.”
—J.P. Sans, Head of Character Animation
A color script by Luc Desmarchelier and Pierre Perifel for a dramatic car chase.
It was important to make Ms. Tarantula appealing rather than creepy. “The fur on tarantulas looks pointy and like it could stab you,” remarks Sans. “We wanted to bring a cuteness by making the fur feel soft. Because of going anthropomorphic, we added a torso and head that separates from the body so that it gives you a humanistic feel. We wanted her to feel like a spider based on the speed and how she moves around. But we didn’t overdo the legs, because if you have every leg doing something different or you can see every leg, you’re always going to remind people that she is a spider and some people don’t like spiders! It’s about visually simplifying the characters. At times we hid legs. Sometimes when Tarantula is running around, you only see four legs and visually it’s more appealing and easier to swallow than all of these eight limbs coming out of this body.” The vocal delivery of Marc Maron was a perfect fit for Mr. Snake. “Marc brought so much personality to that character and who he was that we wanted to visually maintain that sarcastic dry humor in his expressions. The actual visual recordings give us a lot of ideas on mannerisms that we could incorporate into the character animation,” Sans notes.
Concept art by Floriane Marchix that explores the white skies and warm light of Los Angeles.
“We wanted to bring a cuteness [to Ms. Tarantula] by making the fur feel soft. Because of going anthropomorphic, we added a torso and head that separates from the body so that it gives you a humanistic feel. We wanted her to feel like a spider based on the speed and how she moves around. But we didn’t overdo the legs, because if you have every leg doing something different or you can see every leg, you’re always going to remind people that she is a spider and some people don’t like spiders! It’s about visually simplifying the characters.”
—J.P. Sans, Head of Character Animation
Central to the technical process was figuring out the workflows and tools needed to allow digital artists to solve visual problems like an illustrator. “We wanted to come up with ways that would allow us to hide detail in the rigging so you could procedurally lose some of the detail on a per-shot basis depending on the angle of the light,” states Matt Baer, Visual Effects Supervisor for The Bad Guys. “We also wanted the ability to add additional linework later on to enhance the idea that the image looked handmade. If you look at Wolf, some of his linework is built into the rig. That allows the character animator to move these expression lines around. We also even painted some lines into his fur. That stuff is cooked into those renders.” Textures were strategically chosen. “Where we wanted detail to show up on each of those characters was where the highlight would transition into the mid-tones or where the mid-tones would transition into shadow,” details Baer. “Each of the characters came with their version of a base color and then a texture map. Based on where the light was sitting, we could dial in some of that texture in those transitional areas.”
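Baer’s point about dialing texture into the transitional areas can be pictured as a weight derived from the shading itself: detail is strongest where the light value crosses from highlight into mid-tone or from mid-tone into shadow, and fades out in flat regions. The Python sketch below is a hypothetical illustration of that logic, not DreamWorks’ tooling; the thresholds and function names are invented for the example.

```python
def smoothstep(edge0, edge1, x):
    """Standard smooth Hermite interpolation between 0 and 1."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def transition_weight(light, low=0.25, high=0.75, width=0.15):
    """Peak near the shadow and highlight boundaries, fade elsewhere."""
    near_shadow = smoothstep(low - width, low, light) * (1.0 - smoothstep(low, low + width, light))
    near_highlight = smoothstep(high - width, high, light) * (1.0 - smoothstep(high, high + width, light))
    return max(near_shadow, near_highlight)

def shade(base_color, texture_detail, light):
    """Blend painted texture detail into the base color only in transition zones."""
    w = transition_weight(light)
    return tuple(b * (1.0 - w) + t * w for b, t in zip(base_color, texture_detail))

# Detail shows up where light rolls from mid-tone into shadow or highlight, not in flat areas.
for light in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(light, shade((0.6, 0.45, 0.3), (0.4, 0.3, 0.2), light))
```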
A lighting key by Floriane Marchix for a scene when Ms. Tarantula hacks into a security camera system.
The Bad Guys was inspired by Australian author Aaron Blabey wanting to invert archetypal evil animals and place them in a story that would be ‘Quentin Tarantino for children.’
Genders were switched when creating Ms. Tarantula, voiced by Awkwafina.
The texture of the characters was influenced by the environmental lighting.
“We wanted to come up with ways that would allow us to hide detail in the rigging so you could procedurally lose some of the detail on a per-shot basis depending on the angle of the light. We also wanted the ability to add additional linework later on to enhance the idea that the image looked handmade. If you look at Wolf, some of his linework is built into the rig. That allows the character animator to move these expression lines around. We also even painted some lines into his fur. That stuff is cooked into those renders.”
—Matt Baer, Visual Effects Supervisor
A new tool called Doodle was created to help sell the illusion that the explosion was a 2D effect.
2D effects had to be created procedurally. “We built a big sprite library and created a bunch of procedural simulation techniques that could be rendered and composited in a way that you can mix and match the sprites with simulations,” remarks Baer. “The goal was to not know where one started and where one ended.” The massive explosion had to avoid appearing as a fluid simulation. “We wanted to represent the cooler and hotter areas of an explosion in a much more graphic way,” Baer explains. “A new tool called Doodle was created that allowed effects artists to essentially add additional 2D animated elements on top of the base explosion, which helped to sell the illusion of the whole thing being done as a 2D effect.” The same approach was adopted for environmental effects. “You’re trying to boil each of those components down to the necessary detail so that the audience can fill in the rest,” adds Baer. “For effects, we didn’t want a lot of detail inside. We needed just enough to sell the motion of what the effect was doing. We didn’t want to see every single leaf, but needed the ability to make it look like we took a dry brush and brushed it across the whole tree. When you are outside of the silhouette, those textures and speckles would appear as physically geometric leaves.”
It was important to have an anamorphic lens, so tools were written to create the corresponding lens distortion.
The characters had to be modified from the books, such as giving legs to Mr. Shark, voiced by Craig Robinson.
The tight pre-production schedule was the biggest challenge. “Preparing all of the assets and characters would have been fine if it was the regular style, but I wanted something that was different from what we’re usually doing and not just relying on PBR rendering, which is physically-based lighting,” notes Perifel. “It was to be more stylized with brush textures and linework. Figuring all of this out in six months was tricky. But once the team figured it out it went smoothly; that would be the hardest part of it. Then, of course, there was the transition to working from home technically.” Every scene was carefully crafted narratively and emotionally. “There are two action sequences in the second half of the film that are incredibly fun to look at,” touts Perifel. “There is a lot in this movie.”
To better handle the height differences of the characters Ms. Tarantula was often placed on the shoulder of Mr. Shark.
By TREVOR HOGG
Images courtesy of Columbia Pictures.
For the greater portion of the movie Dr. Michael Morbius (Jared Leto) walks around in his human form.
“We designed Morbius’ face to have some resemblance to Jared Leto and wanted to carry through all of his idiosyncrasies into what the monster does. That was achieved through standard techniques and leaning on machine learning techniques that we employ these days. … We marked up the face to have an accurate tracking of where his head was in 3D space, knowing that we would totally own the face underneath. We kept the hair and clothes; there were times that we didn’t. It’s a performance that leans on machine learning techniques that Digital Domain has put into place over the years.”
—Matthew Butler, Visual Effects Supervisor
Highlighting the rogues’ gallery of Spider-Man villains is the Sony Pictures Universe of Marvel Characters, which began with the malevolent symbiote Venom and has expanded to include Dr. Michael Morbius, a brilliant biochemist turned vampire. The origin story gets explored in Morbius, directed by Daniel Espinosa (Child 44), with Jared Leto playing the title character alongside Michael Keaton, Adria Arjona, Jared Harris and Matt Smith. Hired to handle the visual effects was Matthew Butler (Ready Player One), who collaborated with Digital Domain, One of Us, Lola VFX, Storm Studios, Sony Pictures Imageworks and NVIZ to visualize and create 1,000 shots.
Dr. Michael Morbius (Jared Leto) attempts to cure his rare blood disease with experimental vampire-bat science.
Making impossible things occur in a believable manner for audience members is tricky. “In this case, you have Jared Leto and Matt Smith playing vampires, and we see them walking around as humans for the greater portion of the movie, so we know their physical inertia,” explains Butler. “We tried to masquerade within almost gratuitous visual effects that justified where physical reality needed to be bent. The face was the most impressive work in the movie, but honoring the dynamics was much harder.” The vampire faces had to appear monstrous but still resemble the actors so as to retain their appeal. Comments Butler, “We designed Morbius’ face to have some resemblance to Jared Leto and wanted to carry through all of his idiosyncrasies into what the monster does. That was achieved through standard techniques and leaning on machine learning techniques that we employ these days. We let the director Daniel Espinosa run free with his actors on the day. We marked up the face to have an accurate tracking of where his head was in 3D space, knowing that we would totally own the face underneath. We kept the hair and clothes; there were times that we didn’t. It’s a performance that leans on machine learning techniques that Digital Domain has put into place over the years.”
After injecting himself with his cure, Dr. Michael Morbius (Jared Leto) gains superhuman strength but also transforms into a vampire.
“In this case, you have Jared Leto and Matt Smith playing vampires, and we see them walking around as humans for the greater portion of the movie, so we know their physical inertia. We tried to masquerade within almost gratuitous visual effects that justified where physical reality needed to be bent. The face was the most impressive work in the movie, but honoring the dynamics was much harder.”
—Matthew Butler, Visual Effects Supervisor
Prosthetic makeup was used strictly for the gaunt and sickly versions of Jared Leto and Matt Smith. “There are levels of madness,” notes Butler. “Phase seven ended up being the most dramatic and extreme. Most of the time, Morbius is at phase three. There was also a phase that we referred to as a balloon, where he would elastically go into this mode and come back again. Those shots are so subtle that they’re almost subliminal.” The first transformation happens off screen. “Morbius bursts out of the glass where he is being contained and is full monster. Now that we know what he’s going to become we can balloon into that, not go as far, and see the progression,” notes Butler. It was important to avoid the transformation appearing as a morph or dissolve. “We would have different parts of the face do various behaviors,” Butler adds. “Morbius has this pale skin, so we had to pull red blood cells out of him and accompany that with a fairly grotesque vein work and a quite translucent subsurface to the skin. When he reverts back from being a monster in the container ship, the last frame of that shot is a fully digital face of Jared. It’s quite an achievement by Digital Domain because it’s a lot easier to do a monster than a human who is your lead actor – and full frame.”
The face of Morbius was designed to have some resemblance to Jared Leto.
To receive the PG-13 rating, many shots had blood removed or made to look black.
“There are levels of madness. Phase seven ended up being the most dramatic and extreme. Most of the time, Morbius is at phase three. There was also a phase that we referred to as a balloon, where he would elastically go into this mode and come back again. Those shots are so subtle that they’re almost subliminal.”
—Matthew Butler, Visual Effects Supervisor
A unique ability possessed by Morbius is echolocation, where high-frequency sound pulses are emitted through the nose or mouth of a bat, which listens for the resulting echo. “How does one visualize something that is not visual?” asks Butler. “We had to show that he is seeing these surfaces by bouncing soundwaves against them. It didn’t have to be sound, but a wave-like phenomenon that has a particle-system response. I also wanted it to be beautiful. We see Morbius sending out these pulses and an energization of the surfaces which have inherent colors.” Not everything happens on the ground. “Flying was one of my biggest fears as it can quickly become hokey. Morbius has a certain mass and telegraphs that by the way he walks around and picks up and puts down a cup. You can’t suddenly [show] that he has shed his mass and is now a helium balloon. I love any natural phenomenon because it tends to be beautiful, and if someone has seen a real thing before, they are now clued into a certain believability. The technique that we used to hide some of the sins here was a cavitation of the volumes around them.”
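Butler’s description of echolocation as a wave-like phenomenon with a particle-system response can be reduced to a simple timing rule: a spherical pulse expands from the emitter, and each surface point brightens briefly as the front sweeps past it. The Python sketch below is a hypothetical illustration of that rule only – the speeds, thicknesses and names are invented for the example, not drawn from the production setup.

```python
import math

def pulse_intensity(point, emitter, t, speed=30.0, thickness=2.0, decay=0.3):
    """Brightness of a surface point as a spherical pulse front sweeps past it.

    The front sits at radius speed*t; a point lights up when the front is within
    `thickness` of its distance from the emitter, and the whole pulse fades over time.
    """
    dist = math.dist(point, emitter)
    front = speed * t
    band = max(0.0, 1.0 - abs(dist - front) / thickness)   # 1.0 exactly on the front
    return band * math.exp(-decay * t)

# Points farther from the emitter light up later, tracing the pulse across the set.
emitter = (0.0, 0.0, 0.0)
for t in (0.1, 0.5, 1.0):
    print(t, [round(pulse_intensity(p, emitter, t), 3)
              for p in ((3.0, 0.0, 0.0), (15.0, 0.0, 0.0), (30.0, 0.0, 0.0))])
```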
“We would have different parts of the face do various behaviors. Morbius has this pale skin, so we had to pull red blood cells out of him and accompany that with a fairly grotesque vein work and a quite translucent subsurface to the skin. When he reverts back from being a monster in the container ship, the last frame of that shot is a fully digital face of Jared. It’s quite an achievement by Digital Domain because it’s a lot easier to do a monster than a human who is your lead actor – and full frame.”
—Matthew Butler, Visual Effects Supervisor
Greenscreen and bluescreen were favored over LED walls. “There was a greenscreen shoot for the cave exterior for the opening sequence in Costa Rica, which was some of the hardest in the movie because that was shot indoors,” reveals Butler. “It was difficult to get that lighting to look real. The end of the movie is all bluescreen. We built a partial set with it being 90% synthetic. We knew what it was and could pull it up on the video feed there because we had already conceived it.” The third act completely changed from being situated during the day to a nighttime setting. “We shot for months in a park in England doubling for Central Park in New York City,” notes Butler, “and the action we did with Jared Leto, Matt Smith and Adria Arjona was all gone. The entire third act was reconceived digitally. We selectively re-shot little pieces of Jared, Matt and Adria, and the rest was CG.” The theatrical rating of Morbius impacted the blood and gore. “It is hard to do a PG vampire movie,” observes Butler. “There were so many shots where we went back and took blood out or made it black.” The stunt performances were impressive, in particular by Jared Leto’s stunt double, Greg Townley, who literally ran up walls sprayed with Coca-Cola for the subway scene. States Butler, “I came at this from what can we do practically first.” A major accomplishment occurs during the container ship sequence. “It is an elegant shot,” savors Butler, “where you see him go from full monster and become Jared Leto. I love that shot and am so proud of it. It looks like Jared, but that’s all synthetic.”
In the subway scene the walls were sprayed with Coca-Cola so that the stunt double for Jared Leto could actually run up the walls.
By TREVOR HOGG
Images courtesy of Warner Bros. Entertainment.
After establishing a relationship with director Matt Reeves on the Planet of the Apes prequel trilogy, Weta FX veteran Dan Lemmon moved into the role of Production Visual Effects Supervisor for the next blockbuster helmed by the filmmaker. The Batman takes place during the early days of the Caped Crusader and stars Robert Pattinson, Zoë Kravitz, Paul Dano, Colin Farrell, Jeffrey Wright, John Turturro and Andy Serkis. Handling the signature car chase between Batman and Penguin, as well as the Batcave and the memorial service at City Hall, was Weta FX, with key members of the team being Visual Effects Supervisor Anders Langlands, Animation Supervisor Dennis Yoo and Compositing Supervisor Beck Veitch, who collaborated on a total of 320 shots.
The workshop, gym equipment and Batmobile areas were practically constructed while the rest of the Batcave was a digital environment.
“[Cinematographer] Greig [Fraser] was putting globs of silicone sealant from a caulking gun onto a plate of glass in front of the lens to create these beautiful abstract lens flares, particularly in the vehicle shots and throughout the chase scenes. Initially, we were mystified as to what they were until Dan explained what Greig was actually doing there.”
—Anders Langlands, Visual Effects Supervisor
“It’s definitely exciting to be able to put your own spin on things,” notes Langlands. “This is a detective story that is a love letter to all of those old 1970s crime thrillers which I’m a huge fan of personally, like Chinatown, The French Connection, Taxi Driver and the paranoia trilogy [Klute, The Parallax View, All the President’s Men]. Greig Fraser [Dune] is a fantastic cinematographer, and the photography is stunning throughout. That combination of things made it an exciting journey to be part of.”
A CG cape was created to get the billowing effect that director Matt Reeves wanted for the shot.
Greig Fraser shot with two sets of ARRI Large Format Anamorphic lenses, with one being optically pristine and the other detuned so it could not focus on anything outside the center of the frame. He also had a unique approach to the lens filtration that refracted streetlights and car headlights into a spiderweb of light. Comments Langlands, “Greig was putting globs of silicone sealant from a caulking gun onto a plate of glass in front of the lens to create these beautiful abstract lens flares, particularly in the vehicle shots and throughout the chase scenes. Initially, we were mystified as to what they were until Dan explained what Greig was actually doing there. We did talk about generating some elements with effects to create procedural 2D stuff, but in the end decided to do the same thing that Greig did and shoot some elements for ourselves. Beck’s team was able to take those elements and construct a tool that emulated what Greig had achieved in the live action. I was definitely not cursing [Greig]. It was a lot of fun.”
Weta FX referred to its element library to get the necessary explosion effects that were subsequently graded and timed to be consistent with the plate photography.
“For the compositing team, our challenge was to deconstruct all of the things that happened to the detuned lenses and be able to replicate that for the set extensions and CG shots because it’s so distinctive,” notes Veitch. “We got the chase sequence turned over quite late and had to develop whole new tools and templates for Nuke to be able to implement rain interaction and wheel spray at speed. It was a mixture of simulated effects for hero cars and Eddy templates for background traffic.”
Extensive digital rain had to be created as director Reeves wanted the car chase to feel wet and dangerous the whole time.
The torrential rain was the major creative and technical task. “Matt wanted the car chase to feel wet and dangerous the whole time,” states Langlands. “We added digital rain to all of those shots, which is fairly simple, but when you’re flying through it, that’s a lot more complex. We were modeling how they oscillate and deform as they fall so you get those motion blur streaks from them. Our effects team was simulating hundreds of millions of raindrops in every shot. Then we were simulating all of the wheel spray coming off of the wheels. In some shots, we had a mix of 3D, but in most shots 2D solutions for all of the raindrops hitting the road surface. Getting the look of that right, making it feel believable, and being efficient enough that we could do it across that huge number of shots was a real challenge.”
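The raindrop behavior Langlands describes – drops that oscillate and deform as they fall, rendered as motion-blur streaks – can be approximated per particle without a full fluid solve. The Python sketch below is a hypothetical, heavily simplified illustration (the parameters and names are invented for the example, not DNEG’s effects setup): each drop falls with a slight sideways wobble, and its streak length is the distance it travels while the shutter is open.

```python
import math
import random

def simulate_rain(num_drops=5, frames=3, fps=24.0, shutter=0.5, seed=1):
    """Advance simple raindrops and emit (start, end) streak segments per frame.

    Each drop has a terminal velocity, a small sideways wobble (the 'oscillation'),
    and a streak spanning the distance it covers while the shutter is open.
    """
    rng = random.Random(seed)
    drops = [{
        "pos": [rng.uniform(-5, 5), rng.uniform(8, 12)],
        "vel": [rng.uniform(-0.5, 0.5), -rng.uniform(8.0, 10.0)],   # m/s, mostly downward
        "wobble_phase": rng.uniform(0, math.tau),
    } for _ in range(num_drops)]

    dt = 1.0 / fps
    for frame in range(frames):
        streaks = []
        for d in drops:
            t = frame * dt
            wobble = 0.2 * math.sin(12.0 * t + d["wobble_phase"])   # slight side-to-side deform
            vx = d["vel"][0] + wobble
            vy = d["vel"][1]
            d["pos"][0] += vx * dt
            d["pos"][1] += vy * dt
            # The streak spans the drop's travel during the open shutter interval.
            start = (d["pos"][0], d["pos"][1])
            end = (start[0] - vx * dt * shutter, start[1] - vy * dt * shutter)
            streaks.append((start, end))
        yield frame, streaks

for frame, streaks in simulate_rain():
    print(frame, streaks[0])
```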
Plates for the car chase were shot at several different locations, including the Dunsfold Aerodrome in Surrey, England.
Dennis Yoo figured out the timing, composition and action beats for the car chase by creating the postvis. “The great part about that was Matt Reeves shot everything,” remarks Yoo. “You add a CG car beside the practical one, then you have something to play the motion off of. It makes everything easier. What people don’t understand is that it’s a chaotic sequence, but there is also artistry in there with the composition and our motion so you understand the direction that you’re going. That was quite a challenge because it’s a mix of cars crashing into each other, but if it was all chaos no one would know what is going on. It was fun to do, and trying to keep that as realistic as possible was also a challenge.”
“For the compositing team, our challenge was to deconstruct all of the things that happened to the detuned lenses and be able to replicate that for the set extensions and CG shots because it’s so distinctive. We got the chase sequence turned over quite late and had to develop whole new tools and templates for Nuke to be able to implement rain interaction and wheel spray at speed. It was a mixture of simulated effects for hero cars and Eddy templates for background traffic.”
—Beck Veitch, Compositing Supervisor
As many as 11 plates had to be integrated for the memorial service scene at City Hall.
“[Director] Matt [Reeves] wanted the car chase to feel wet and dangerous the whole time. We added digital rain to all of those shots, which is fairly simple, but when you’re flying through it, that’s a lot more complex. … Our effects team was simulating hundreds of millions of raindrops in every shot. Then we were simulating all of the wheel spray coming off of the wheels. In some shots, we had a mix of 3D, but in most shots 2D solutions for all of the raindrops hitting the road surface. Getting the look of that right, making it feel believable, and being efficient enough that we could do it across that huge number of shots was a real challenge.”
—Anders Langlands, Visual Effects Supervisor
The Batmobile had to be photorealistic. “I worked on a movie that was all vehicles,” continues Yoo, “so we grabbed some of that tech, [and in doing so] the ground contacts and using the actual LiDAR from set to dynamically move the wheels for us allowed for more realism to be built into the rig. We had reference for everything, so we could look at the actual vehicle to understand what it was doing and then mimic that even though we’re changing the motion.” Batman had to come across as a skilled driver. “The Batmobile was bouncing off the trucks, and the Batmobile looks like it’s causing all of this mayhem. We didn’t want it to look like that at all. Matt was adamant about Penguin starting that whole pile-up, and the Batmobile was [more] in there riding the wave than causing more havoc,” adds Langlands.
Distinct lens flares created by cinematographer Greig Fraser required customized tools by Weta FX to digitally recreate them.
A dramatic upside-down shot is taken from the perspective of Penguin as Batman walks towards Penguin’s overturned vehicle. “That was a funny one because I saw someone on Twitter saying, ‘It’s the most beautiful shot without any CG,’ not realizing that Batman is mostly CG in that shot,” reveals Langlands. “They had a rain machine going, but when you get a big piece of material like a cape wet, it just wants to bunch up and hang down. Matt wanted to have it billowing out in the wind as he’s walking up, so we had to do a digital cape with a cloth simulation. In the plate there is a huge fireball behind him, and because we’re putting a dark object in front of something that is causing a lens flare we had to take the CG Batman and track it to the live-action Batman in compositing to patch bits of him.” Rain was not the only problematic natural element. “We raided our element library for every explosion that we had historically,” remarks Veitch. “That was a challenge to get their temperatures matching, because it was on a whole lot of different film stocks and digital formats. Then timing all of those so they came through and exploded at the right time, and patching when we needed to – that was a complex composite.”
“What people don’t understand is that [the car chase] is a chaotic sequence, but there is also artistry in there with the composition and our motion so you understand the direction that you’re going. That was quite a challenge because it’s a mix of cars crashing into each other, but if it was all chaos no one would know what is going on. It was fun to do, and trying to keep that as realistic as possible was also a challenge.”
—Dennis Yoo, Animation Supervisor
Situated in an abandoned neo-gothic subway station underneath Wayne Tower is the Batcave, a huge environment in which the workshop, exercise equipment and Batmobile areas were practically built. “Initially there was supposed to be a bat colony which was supposed to be simulated, but it kept building up more and more,” states Yoo. “It didn’t help that the environment was so big, so we had to cheat because they wanted bats in the foreground, but that foreground didn’t make any sense compared to the environment. We were having scale issues by having the bats quite close to the camera, which didn’t make sense for the bats that were further back.” Darkness prevails in the setting. “We’re adding little kicks and pings off of the superstructure in order to get texture in the background, which is fun because you’re placing little light sources around the place out of focus,” notes Langlands. “That ends up becoming like putting splashes of paint on the canvas.” The shots were simple to composite. “I enjoyed them,” states Veitch, “because we were playing around with a lot of proprietary defocus tools, and being able to compose the focus in those shots and try to make them look authentic. Those lenses are quite incredible and they gave us a lot of reference shots.”
Director Matt Reeves discusses a shot with Robert Pattinson while on the set of The Batman, which was inspired by the crime thrillers of the 1970s.
Crashing the memorial service for the mayor at City Hall is an SUV, at the behest of the Riddler, portrayed by Paul Dano. “We were combining anywhere between four to 11 plates,” remarks Veitch. “We had small crowds because of COVID, and then there was the careening car which was a safety issue. There were about eight plates for the top-down shots. The shots of the car coming towards the camera involved compositing the takes they wanted – the Riddler at the top in the mezzanine area, the car coming forward, smoke and dust coming off it – and trying to retain all of that. Very tricky shot.” Careful research went into matching the different plates with each other. “We had to figure out what section of the plate we had to use within those shots,” adds Veitch. “There was a massive amount of paint and roto to do before our compositing team even touched it. Then it’s making sure that we can retain as much of the plate as possible, and then adding atmospherics where we needed to help us cover up edges or replace atmospherics that we had to lose. It is quite incredible how much smearing and lensing artifacts you get with the detuned lensing that we had to match up. It was a lot of work on those shots even without the CG.”
By CHRIS McGOWAN
Netflix distributed Girls from Ipanema, a Brazil-produced series about four Brazilian female friends in Rio de Janeiro’s bossa nova scene in the late 1950s. Sao Paulo-based Quanta Post contributed to the VFX. (Image courtesy of Netflix)
The growth and globalization of the visual effects industry has resulted in worldwide interconnectivity and a vast workflow spanning the planet. There is more top-notch VFX in films and series than ever before, boosted by the growth in streaming content, episodic fare becoming more cinematic in terms of quality, and a continued evolution in VFX technology. Demand for VFX artists as a whole is also growing due to the surging video game industry, amusement park visual effects and the gradual ascension of VR and AR.
All of those factors have increased the work for VFX studios and the demand for skilled artists from Vancouver to London to Mumbai. Financial incentives in certain locations have helped globalize the VFX business for some time now. And the COVID-19 crisis further accelerated home entertainment demand and remote VFX work. “The pandemic has really kicked the globalization of the VFX industry into high gear, and now even more producers know what can be achieved with VFX,” says David Lebensfeld, Founding Partner and VFX Supervisor of Ingenuity Studios, which has offices in Los Angeles, New York and Vancouver.
Local productions outside North America, such as many series funded by Netflix, are spreading work across the planet in both film production and post-production. Fiona Walkinshaw, Framestore’s Global Managing Director, Film, comments, “The streamers have made no secret about their desire for regionally-focused content and how this feeds into their business strategies.
As of late 2021, the Korea-produced dystopian survival drama Squid Game was Netflix’s most watched series. Seoul-based Gulliver Studios supplied VFX. (Image courtesy of Siren Pictures and Netflix)
There’s a tremendous desire for stories that could only come from a certain city or country – shows like Squid Game or Money Heist, for example, which, like the Scandi noir boom, captivate viewers by dint of their freshness and unique cultural or geographical perspectives. This will inevitably mean our worlds become larger, as we work with storytellers, producers and below-the-line talent from all over the world. It’s an exciting prospect, and it will help us all grow and learn.” Walkinshaw adds, “In time I’m sure we’ll also see new VFX hotspots establishing themselves – you just have to look at the way the Harry Potter franchise helped turbocharge London’s VFX industry, or what the Lord of the Rings films did for New Zealand.” Framestore itself is quite globalized, with offices in London, Mumbai, Montreal, Vancouver, Melbourne, New York, Los Angeles and Chicago.
Visual effects studios have spread widely over the last two years across North America, Europe and Australia/New Zealand, and a growing number can also be found in Asia. Many facilities built initially for wire removal and rotoscoping have evolved into full-service VFX studios. BOT VFX, founded in 2008 in India, has expanded from an outsourcing facility in Chennai for rotoscoping and other detail work into a large and complete VFX business; it now has its headquarters in Atlanta and has worked on high-profile recent projects, including The Book of Boba Fett, Dune and Black Widow.
Just as Korea has grown into a movie/series global powerhouse, so too have its VFX studios expanded over the last 10 years.
Ragnarok, a Norwegian-language fantasy series from Copenhagen-based SAM Productions, is distributed by Netflix. Ghost VFX and Oslo-based Stardust Effects contributed VFX. (Image courtesy of Sam Productions and Netflix)
The Last Forest, Luiz Bolognesi’s movie about the Yanomami Indians of the Amazon rainforest, produced in Brazil, mixes documentary and staged scenes. It was distributed globally by Netflix. (Image courtesy of Netflix)
S.O.Z.: Soldiers or Zombies, an eight-episode horror-action TV series distributed by Prime Video, is a Mexican production created by Nicolas Entel and Miguel Tejada Flores. (Image courtesy of Prime Video)
Netflix globally distributed Invisible City, a Brazil-produced fantasy series about mythological creatures in the rain forest, created by Carlos Saldanha, the Brazilian director of various successful animated films, such as the Ice Age movies and Rio. (Image courtesy of Netflix)
“The pandemic has really kicked the globalization of the VFX industry into high gear, and now even more producers know what can be achieved with VFX.”
—David Lebensfeld, Founding Partner and VFX Supervisor, Ingenuity Studios
Gulliver Studios supplied VFX for the Netflix hit series Squid Game, while Dexter Studios contributed VFX work to Bong Joon-ho’s Parasite. Dexter and five other VFX studios worked on Space Sweepers, arguably Korea’s first high-production science fiction film. Korea’s 4th Creative Party helped with the VFX for Bong Joon-ho’s acclaimed Snowpiercer and Okja films (along with Method Studios). And Digital Idea worked on the hit zombie film Train to Busan.
VHQ Media, founded in 1987 in Singapore, has grown into a large film studio and claims to be Asia’s largest post-production house, working on both national and international productions. It also has studios in Beijing, Kuala Lumpur and Jakarta. Many international VFX firms have opened offices in Asia, including DNEG (four offices in India), The Third Floor (Beijing), Scanline VFX (Seoul), Method Studios (Pune), ILM (Singapore), Digital Domain (Taiwan, Hyderabad and four locations in China) and MPC (Bangalore).
“The global growth of the VFX industry and VFX as a tool of technology is limitless and boundless, to say the least,” says Merzin Tavaria, President, Global Production and Operations at DNEG. The London-based firm is another example of a VFX studio with offices spread across the globe. It was formed in 2014 by a merger between Prime Focus (India-based) and Double Negative (U.K.-based) and has studios in Los Angeles, Vancouver, Montreal, Toronto and London along with Mumbai, Bangalore, Chandigarh, and Chennai in India.
VFX BOOM
“There are some fantastic companies doing amazing work in all corners of the globe,” says Pixomondo CEO Jonny Slow, “and at the moment, they are all working to keep up with an unprecedented level of demand. Growing demand, driven by episodic content with a higher budget and huge creative ambition, is a big factor in all the trends affecting the market [this year] and beyond.” Pixomondo has offices in Vancouver, Toronto, Montreal, Los Angeles, Frankfurt, Stuttgart and London.
The streamers Netflix, Amazon, Hulu (majority owned by Disney) and Apple TV have added their VFX demand to that coming from traditional movie/TV companies and affiliated streaming services (HBO Max, Disney+, Peacock, Paramount+). Florian Gellinger, RISE Visual Effects Studios Co-Founder and Executive Producer, notes, “Right now, as the market is so saturated, the work is going to be globally distributed to whoever has availability and meets the required profile. So yes, clients will have to look increasingly globally for a good fit.” RISE has offices in Berlin, Cologne, Munich, Stuttgart and London.
Other VFX studios concur that business is booming. “We have too much work, which means we need more capacity, more artists and more supervisors. Right now, we’re ensuring that we continue to make our established clients happy while bringing in new clients,” says Tom Kendall, VFX Head of Business Development, Sales & Marketing for Ghost VFX, which has offices in Los Angeles, Copenhagen, London, Manchester, Toronto and Vancouver.
Executive Producer Måns Björklund of Stockholm-based Important Looking Pirates (ILP) notes, “There aren’t enough VFX companies in the world to do all the work. The demand for content has boomed, and the need for clients to seek new vendors around the world has increased.”
GLOBALIZED WORKFLOWS
DNEG is one of the pioneers in the globalization of VFX workflows. Tavaria comments, “With nine facilities working seamlessly together across three continents, I believe we’ve led by example, creating an ever-expanding global network that can deliver highly creative and compelling visual storytelling while introducing new norms of efficiency and flexibility.”
He adds, “The standardization of workflows, tools and capabilities across sites allows us to move work around our network to cater to the demands of our clients and to balance the load across locations to maximize utilization. We also take full advantage of time zone differences to create efficiencies in our production scheduling.”
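Tavaria’s description of balancing load across sites and exploiting time zone differences is, in scheduling terms, a “follow the sun” assignment problem. The toy sketch below is a hypothetical illustration only – not DNEG’s production system – and every site name, UTC offset and capacity figure is invented: each incoming shot is routed to whichever site still has spare capacity and whose next local working day starts soonest, so work hands off around the clock.

from datetime import datetime, timezone, timedelta

SITES = {  # illustrative numbers only: UTC offset and remaining artist-days
    "London":    {"utc_offset": 0,   "capacity": 3},
    "Mumbai":    {"utc_offset": 5.5, "capacity": 5},
    "Vancouver": {"utc_offset": -8,  "capacity": 2},
}

def hours_until_next_start(site, now_utc, start_hour=9):
    """Hours until the site's next 9 a.m. local start (toy granularity)."""
    local = now_utc + timedelta(hours=SITES[site]["utc_offset"])
    return (start_hour - local.hour) % 24

def assign(shots, now_utc):
    """Greedily route each shot to the open site that starts work soonest."""
    plan = {}
    for shot in shots:
        open_sites = [s for s in SITES if SITES[s]["capacity"] > 0]
        best = min(open_sites, key=lambda s: hours_until_next_start(s, now_utc))
        SITES[best]["capacity"] -= 1
        plan[shot] = best
    return plan

print(assign(["sh010", "sh020", "sh030"], datetime.now(timezone.utc)))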
Framestore recently opened a studio in Mumbai, which already has 130 on-site creatives. Walkinshaw comments, “Being able to set up in Mumbai and seamlessly integrate with our new colleagues there is an incredible advantage, especially given the tremendous talent pool there. Generally speaking, increased access to amazing talent is the main consequence of this worldwide connectivity.”
Another positive effect of globalization is that “exchanging work between companies has become much easier despite everyone running their own pipeline,” says Gellinger. Slow sees the globalization of VFX as a positive trend that creates stability for those companies who are prepared to evolve continuously and adapt to constant change. “It’s not the only trend in the VFX industry, but it’s a trend in response to demand for capacity and client requirements for speed and efficiency. But there is also a quality threshold. Quality output drives stability for VFX companies, wherever their artists are located.”
BEYOND BORDERS
Lebensfeld comments, “What certainly helps with this business becoming more globalized is access to talent that isn’t in your zip code, which backfills what already makes us competitive.” To open up to talent in another country, he says, “we already have the technology – the hardware, software and methodology – to work remotely. Anything else past that is just details. We look at it less like it is a global business and more as one that breaks down borders. One of the more exciting things is getting the chance to develop artists and give opportunities to people who would not have had them otherwise. They only need a computer, inherent artistic talent, and we work with them on training in a studio environment. I’m a big believer that a combination of local studio artists and international artists is the best way to go because the industry still relies on specific locations for some projects for a variety of reasons. The business still needs a large base of talent in certain production hubs.”
Teddy Roosevelt (Aidan Quinn) pulling an arrow from the arm of Brazilian explorer Cândido Rondon (Chico Diaz) in the HBO mini-series The American Guest. The 2021 Brazilian production showcased the work of Brazilian artists, including Orbtal Studios in Sao Paulo, which supplied visual effects. (Photo courtesy of HBO Max)
The Korean zombie apocalypse and coming-of-age series All of Us Are Dead is distributed internationally by Netflix. (Image courtesy of Film Monster Co., JTBC Studios/Kimjonghak Production Co. and Netflix)
Korean sci-fi mystery series The Silent Sea, written and directed by Choi Hang-yong, is distributed by Netflix globally. (Image courtesy of Artist Company and Netflix)
“[S]ince the workforce has become so flexible in where it settles, recruiting has become a lot harder and companies have to reach out further than they used to in order to meet their talent requirements. [Globalization] has solved a couple of these problems by having access to top talent across borders, not being limited to one’s own backyard.”
—Florian Gellinger, Co-founder and Executive Producer, RISE Visual Effects Studios
Ghost VFX worked on Shadow and Bone, a fantasy series distributed by Netflix and based on books by Israeli novelist Leigh Bardugo. Shadow and Bone was shot in Budapest and Vancouver. (Image courtesy of 21 Laps Entertainment and Netflix)
How I Fell in Love with a Gangster is a Polish crime drama distributed globally by Netflix. (Image courtesy of Netflix)
Money Heist is a Spanish heist drama that had a successful run of five seasons and is one of Netflix’s biggest international hits. (Image courtesy of Atresmedia/Vancouver Media and Netflix)
Gellinger observes that it has become easier for artists to find a job in their desired ‘adventure destination’ abroad. “And since the workforce has become so flexible in where it settles, recruiting has become a lot harder and companies have to reach out further than they used to in order to meet their talent requirements.” Yet globalization also “has solved a couple of these problems by having access to top talent across borders, not being limited to one’s own backyard.”
Walkinshaw adds, “From a production perspective it means juggling more time zones, currencies and teams, so this part of the business has become more complex, and there is a need for investment in both more people and technology solutions to make it easier for production to function. The role of a producer working for a company like Framestore on a project spread globally is far more complex and demanding than it used to be.”
Producers and supervisors now must be more patient and organized because of the time differences, and they have to schedule their work around that, according to Kendall. “The projects are shot in so many diverse locations, it’s about being able to address clients’ needs in a timely manner and be flexible in terms of how we work.”
Gellinger notes that the way that business is being distributed globally is “definitely creating stability, but only as long as companies keep investing in their talent. Flying in entire teams from abroad is not a business model. Investing in education and training is more important than ever.”
Slow comments, “We have seen a lot of these consequences [of globalization] playing out for the past few years. It has allowed the formation of larger, better funded, better organized companies that are becoming attractive investment propositions. This has been very positive for the industry – for growth to happen, investment is required.”
INCENTIVES
DARK BAY Virtual Production Studio is an example of how VFX globalization has been boosted by Netflix and by government help. Baran bo Odar and Jantje Friese, the creators of Netflix’s hit series Dark – a German science fiction thriller – built an LED stage in Potsdam-Babelsberg in part to shoot 1899, their next Netflix series. Odar and Friese’s production company Dark Ways holds a majority share in DARK BAY (Studio Babelsberg has a minority share). Funding from the state of Brandenburg in Germany and a long-term booking commitment from Netflix backed the venture.
Incentives continue to play a role in the globalization of VFX. Framestore’s Walkinshaw comments, “National or regional incentives have provided a huge boost for our industry and encouraged international collaboration.
They’ve been key to growth in the U.K. and Canada – to date our biggest sites for film and episodic work – and the willingness of studios to put work in these regions helps create a virtuous circle: it allows companies to invest in their talent, facilities and infrastructure, makes those places a magnet for established talent from elsewhere, and also helps schools and universities attract ambitious students. Take Montreal for example – Framestore was the first major studio to open a studio there [in 2013], and it’s now an established and hugely-respected hub for the global visual effects industry.”
THE PANDEMIC
“The pandemic forced studios like us to build a pipeline that works in remote environments,” says Lebensfeld. “We were able to leverage figuring out how to work remotely with talent that have previously been local to our studio locations. We have history and momentum with these artists, and we figured out processes that mirror what we have already been doing – just with remote capabilities.”
Already existing worldwide VFX interconnectivity helped DNEG to address the challenges of the pandemic, according to Tavaria. “The unprecedented speed with which our technology teams enabled global remote working was astounding, based on work that was already underway. It also, somewhat counter-intuitively, brought us closer together and enabled even more collaboration across our global teams,” he comments. “These advances have positioned us well to cater to the growth in demand for visual effects and animation work this year, driven by the increases in content production by TV and OTT companies, in addition to increased demand for our VFX and animation services from our studio clients.”
Walkinshaw comments, “The pandemic has definitely encouraged us to think outside the box, be this seeking workarounds for physical shoots, having colleagues working remotely from different countries or broadening our talent pool by making hires from different territories, because so much of the workforce has spent time outside the office. I imagine this will endure, especially as we continue to seek skills beyond the ‘traditional,’ particularly in areas such as technology and gaming.”
Slow says, “In our industry, technology and innovation are the fundamental drivers of changes like globalization. We are at a very interesting point – with technology driving once-in-a-lifetime changes in content distribution and production technique – and these trends have been accelerated by a major pandemic. The consequences are significant, and the impact will largely play out over the next five years.”
Lebensfeld concludes, “Pre-pandemic [film companies] went on location and brought on as many extras as needed. The scope of requests has expanded well beyond that. There’s a VFX solution for every aspect of a story. That’s a powerful thing.” He adds, “I think the VFX industry has transformed these past two years, with very positive changes overall. There’s no going back now. Our industry is global, and that’s here to stay.”
“We have seen a lot of these consequences [of globalization] playing out for the past few years. It has allowed the formation of larger, better funded, better organized companies that are becoming attractive investment propositions. This has been very positive for the industry – for growth to happen, investment is required.”
—Jonny Slow, CEO, Pixomondo
Amazon Studios produced the epic fantasy series The Wheel of Time, another international VFX effort involving Cinesite, MPC, Automatik VFX, Outpost VFX, Union Visual Effects and RISE Visual Effects Studios, among others. (Image courtesy of Sony Pictures Television and Amazon Studios)
HBO series Beforeigners is a science-fiction crime drama produced in Norway by Oslo-based Rubicon TV AS. (Photo courtesy of HBO Max)
By TREVOR HOGG
A scene from A Shaun the Sheep Movie: Farmageddon from Aardman Animations, which was originally founded as a stop-frame studio in 1972. (Image courtesy of Aardman Animations)
Being able to successfully manage an animation studio in 2022 and beyond requires foresight and the desire to be an industry leader. The landscape has been transformed by globalization, the dominance of streaming services, growing demand for content, and the development of other media platforms such as virtual reality. Flexibility has enabled Oscar-winning Aardman Animations – founded as a stop-motion studio in 1972 and responsible for Creature Comforts and Wallace & Gromit – to remain relevant, branching into CG features and AR projects. More recently there is Emmy-lauded Baobab Studios, which has specialized in interactive animation since 2015 and is creating The Witchverse anthology series for Disney+ based on Baba Yaga. Veteran animator and lecturer Ken Fountain has found himself in the middle of all of this, having worked on Megamind for DreamWorks Animation, Pearl for Google Spotlight Stories and Baba Yaga for Baobab Studios, as well as doing tutorials on SplatFrog.com.
There is still value in using real and tactile material for characters and world-building, believes Sarah Cox, Executive Creative Director at Aardman Animations. “It’s less about creating assets digitally and more about the way that technology can enable handcrafted processes to be more efficient, beautiful and effective.” Technology is embraced by Lorna Probert, Head of Interactive Production at Aardman Animations, who last year released the Wallace & Gromit AR adventure The Big Fix Up. “There is some exciting technology, like virtual production and the way that we can use real-time engines for previs and creating more flexible and iterative processes.”
Another major trend pointed out by Cox is the push for photorealism. “The distinction between live action and animation, what is real and what’s not real, is continually going to be blurred as we use live-action information in animation.” The filmmaking process is not entirely different. “Stop-frame is effectively like a live-action asset because it’s shot in-camera, and then you’re using those assets to mix with other bits of animation,” observes Cox. “What we did with our last Christmas special was shoot all of the effects in-camera. The snow was made out of puffs of wool, but then composited together in the most technically proficient, up-to-date way.”
Baba Yaga won the Daytime Emmy for Interactive Media and resulted in Disney+ partnering with Baobab Studios to create a Witchverse series for the streaming service. (Image courtesy of Baobab Studios)
Virtual reality has yet to become mainstream, partly hampered by being perceived strictly as a marketing tool. “You are still wearing what is not a comfortable thing on your face,” states Probert. “Until your interface with the content becomes more natural and comfortable, that suspension of reality is always going to be broken.” The communal aspect is presently missing. “A lot of our content design is for family viewing, so that whole being in your own space is quite contrary to what we do,” notes Cox. “It quadruples the production time [because various viewer narrative decisions have to be accounted for], and the other challenge is most of our work is comedy. If there is user interaction, you can’t time the gags in quite the same way.” It is important to recognize that each medium cannot be treated in the same way. “It’s creating content that plays to the strength of the format, and we’re doing lots of exploration,” remarks Probert. “The fact that you can, for the first time, be in one of our sets, be able to build and move things and explore the detail in VR is exciting. It’s exciting to be in the barn with Shaun the Sheep and see him reacting to you – that’s an interesting thing for us to play with.”
Veteran animator and lecturer Ken Fountain finds it to be exciting that the demand for animation content is allowing for experimentation such as Spider-Man: Into the Spider-Verse. (Image courtesy of Sony Pictures Entertainment and Marvel)
Netflix decided to bridge the gap between Season 1 and 2 of the live-action The Witcher by releasing an anime prequel Nightmare of the Wolf. (Image courtesy of Netflix)
The Very Small Creatures is a pre-school series produced by Aardman Animations for Sky Kids. (Image courtesy of Aardman Animations and Sky Kids)
The mandate for Baobab Studios is to make viewers feel that they are an essential part of the storytelling. (Image courtesy of Baobab Studios)
Animation is no longer just for families as illustrated by the adult-oriented anthology Love, Death + Robots. (Image courtesy of Netflix)
When co-founding Baobab Studios, CEO Maureen Fan combined her video game experience with the filmmaking expertise of CCO Eric Darnell and the technical leadership of CTO Larry Cutler. “The reason why something is a success isn’t because it’s stop-motion versus traditional animation. It is how good the story and characters are. The style is in support of that story. For Crow: The Legend and Baba Yaga, we created both for VR and 2D. Namoo was created within the VR tool Oculus Quill, where you are literally painting 360, but the project was meant to be a 2D output. The animation is different because the director is different. We brought in Erick Oh, and it’s more like stop-motion because in Quill there is no frame interpolation and rigging. Every single project that we’ve done has had completely different methods and tools, which is fun.” The mandate driving everything is making sure that the user feels like a protagonist. “Certain parts of Baba Yaga and Bonfire were straight animation, but our animators also built in a bunch of cycles that we fed into the AI engine. We built a character-emotive AI system similar to games, so whatever the audience did, the story would change and reorder itself, and the characters would do different things.”
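Fan’s description of feeding hand-animated cycles into a character-emotive AI engine resembles the behavior-selection logic common in games: the character keeps a simple internal state, and viewer actions steer which pre-built cycle plays next. The snippet below is a heavily simplified, hypothetical sketch of that idea – the action names, cycle names, durations and thresholds are invented, and Baobab’s actual system is undoubtedly far richer.

# Hypothetical emotive state machine: the character picks one of several
# pre-animated cycles based on what the viewer just did and on an internal
# "affection" value that those choices update over time.
CYCLES = {"wave": 2.0, "hide": 1.5, "approach": 3.0, "idle": 1.0}  # seconds

class EmotiveCharacter:
    def __init__(self):
        self.affection = 0.0

    def react(self, viewer_action):
        if viewer_action == "offer_item":
            self.affection += 1.0
            return "approach" if self.affection > 1.5 else "wave"
        if viewer_action == "sudden_move":
            self.affection -= 0.5
            return "hide"
        return "idle"

crow = EmotiveCharacter()
for action in ["sudden_move", "offer_item", "offer_item"]:
    cycle = crow.react(action)
    print(f"viewer: {action:12s} -> play cycle '{cycle}' ({CYCLES[cycle]}s)")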
“There is a real-time revolution that is going to come, but not everybody has embraced it yet,” remarks Fan. “Even when the big studios adopt real-time, unless you’re having the product completely change where it’s interactive, you would still animate the same way. I don’t think that animators need to change any time soon. But if you’re interested in doing interactive animation, your skillset needs to be embracing the real-time aspect.”
The visual language for VR is still being developed. “The fun thing about VR is no one knows what they’re doing! You don’t need a lot of previous experience. It is about finding the best animator and rigger – one who has the flexibility to try different things and not always do things the way they did previously.” Globalization of the industry has had an impact on more than just the workforce. “You will notice that all of our projects have specifically cast minorities and women,” adds Fan. “That’s because I’m a female minority and feel like if I don’t do it, who will? Crow: The Legend was inspired by a Native American legend, and it was one of the first indigenous-themed stories in animation. Instead of a hero’s journey, they’re much more about the community. Baba Yaga is based on a character well-known in Eastern European literature. I’m excited about the different types of stories that we can tell with globalization.”
Previously an animation supervisor at Baobab Studios and currently working as Animation Supervisor for DNEG Animation, Ken Fountain points out that VR is rooted in an artform that predates cinema, which is theatre. “If you’re talking about going into the AR/VR space, the filmmaking language is totally changing because you can’t rely on an editor anymore,” observes Fountain. “The editor is the one standing with the headset and making the choices of where to look. The way that you build a story and performance, and attract the eye and create compositions, is based around theatrical approaches rather than cinematic ones. You have to be procedural and theatrical.” As much as it is exciting for a user to be able to choose their own narrative, there is also respect for boundaries.
Netflix paid over $100 million to purchase the rights to The Mitchells vs. the Machines. (Image courtesy of Netflix)
“Ultimately, it’s up to whoever is creating the experience to decide if they’re giving the user room to do literally whatever they want. Personally, I like the engineered outcomes,” adds Fountain, as parameters have a positive impact on the creative process. “You make your best work when you have limitations, and that translates into writing stories for this open universe. Unless you give people a box, the outcome is not going to be as good.”
With the growing reliance on AI and machine learning, could there come a time when animation is literally created in real-time? “That seems so far away,” remarks Fountain. “The first wall to get over for AI is it being able to create empathy in animation.” Animators are not going to be replaced any time soon by machines, but the required skillset has changed somewhat, according to Fountain. “Because there’s so much demand for content, as an artist you have to be able to do so many more different things stylistically, which wasn’t always the case. Also, because there are so many start-up companies, your technical generalist knowledge is way more valuable now.” Streaming has provided a platform for projects that are not strictly family entertainment. “Streaming has eclipsed everything right now,” states Fountain. “The Netflix release of Arcane is another bit of proof that people are craving adult-based animation, because the production value of that series is amazing. The Disney-Pixar model has had too much of a grip for too long.”
Different animation styles and techniques are being melded together. Comments Fountain, “Even in VR, we used 2D-animated effects in our engine at Baobab Studios. It’s the same thing that Arcane is doing by combining CG animation, 2D effects and rough 2D-composited motion graphics. Something like Love, Death + Robots has so many new techniques and combinations of things that are sometimes hit and miss. I am so glad that people are shooting for those things because it’s making the artform better.”
The Big Fix Up is an augmented and mixed reality adventure starring Wallace and Gromit. (Image courtesy of Aardman Animations and Fictioneers Ltd.)
The animation style of Namoo was dictated by Korean filmmaker Erick Oh and the story he wanted to tell. (Image courtesy of Baobab Studios)
By CHRIS McGOWAN
The VR rhythm game Beat Saber involves slicing through cubes to the beat of popular hits from artists like Billie Eilish, available in DLC releases. Beat Saber has sold more than 4 million copies across all VR platforms and earned more than $100 million in total revenue. (Image courtesy of Beat Games and Oculus Studios/Meta)
VR is going mainstream next year. VR is going nowhere. AR will be bigger than VR.
There is no consensus on where virtual reality and augmented reality are headed and how soon they will get there. But although the virtual reality and augmented reality platforms are still far from mass acceptance, certain positive signs indicate that they really will become large, viable businesses, growing steadily over the next several years.
Tuong H. Nguyen, Senior Principal Analyst for Gartner, Inc., comments, “AR and VR have been hyped as being ready for widespread adoption for a long time, but the technology, use cases, content and ecosystem readiness have yet to live up to the hype.” Nguyen believes that VR in particular will go mainstream when it has three Cs – content, convenience and control. He notes, “While we’ve made progress on all these fronts, we’re still far from reaching the point where each of those aspects are sufficiently mature to make VR go mainstream. It will become mainstream, but I don’t expect it to happen until five to 10 years from now.”
Others are pessimistic that VR will ever become mainstream. “The answer is never. Sorry. Here’s why. People don’t like wearing stuff on their face and getting sick doing it, and having to pay a lot of money for the privilege,” says Jon Peddie, President of Jon Peddie Research and author of the book Augmented Reality, Where We Will All Live. “The VR market has bifurcated into industrial and scientific – where it started in the 1990s – and consumer. The consumer portion is a small group of gamers and a few – very few – people who watch 360 videos.”
On the other hand, in the opinion of Maze Theory CEO Ian Hambleton, the point has passed for people to doubt VR’s future. “With over 10 million active headsets on Oculus Quest sold now, it’s an active ecosystem. The 10 million unit number is often cited as a crucial stepping point.” Maze Theory developed the VR title Doctor Who: The Edge of Time.
You may find yourself in a digital living room using a HP Reverb G2 VR headset, which boasts a resolution of 2,160 x 2,160 pixels per eye and a 114-degree field of view. (Image courtesy of Hewlett Packard)
In November, Qualcomm CEO Cristiano Amon announced at the company’s 2021 investor day that Meta had sold 10 million Oculus Quest 2 headsets worldwide (Qualcomm’s Snapdragon XR2 chipset powers the Quest 2). A Qualcomm spokesperson later clarified that the number wasn’t meant to be official and came from market-size estimates from industry analysts. But as Qualcomm obviously knows how many Snapdragon XR2 chips it has sold to Meta, the cat seemed to be out of the bag.
Meta’s Oculus VR app marked another key milestone at the end of 2021, when it was the most popular download on Apple’s App Store on Christmas Day. The Oculus app beat out long-standing leaders like TikTok, YouTube, Snapchat and Instagram for having the most downloads.
And the category’s size is bigger than Quest 2. There are also Sony PlayStation VR, HP Reverb G2, Valve Index VR, HTC Vive Pro 2 and HTC Vive Cosmos, among other headsets. And PlayStation’s Next Generation VR (NGVR) is also joining the mix.
Research firm Statista estimates that the total cumulative installed base of VR headsets worldwide reached 16.4 million units in 2021 and that the cumulative installed base will surpass 34 million in 2024. And Statista predicts the global VR market size will grow from $5 billion in 2021 to $12 billion by 2024. Another firm, Reportlinker.com, foresees 62 million units shipped by 2026.
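For context, those Statista projections imply steep but plausible compound annual growth. A quick back-of-envelope calculation, using only the figures quoted above:

# Compound annual growth implied by Statista's estimates: a $5B VR market
# in 2021 growing to $12B by 2024, and an installed base of 16.4M headsets
# growing past 34M over the same three years.
def cagr(start, end, years):
    return (end / start) ** (1.0 / years) - 1.0

print(f"Market size CAGR, 2021-2024: {cagr(5.0, 12.0, 3):.1%}")      # ~33.9%
print(f"Installed base CAGR, 2021-2024: {cagr(16.4, 34.0, 3):.1%}")  # ~27.5%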
Hambleton thinks the launch of the next-generation Sony PlayStation VR (sometimes called NGVR) will give a major boost to VR. “[It’s] really important. NGVR will be huge. That’s our prediction. It’s got some great new features and sorted out many of the issues of the previous headset, including inside-out tracking and much better controllers. So long as they ensure there’s a strong pipeline of content for NGVR, we think it will do really well.” The latter support is likely – more than 500 VR games and experiences have been released for the current PlayStation VR since the format’s debut in October 2016.
Hewlett Packard’s HP Reverb G2 VR headset, developed in collaboration with Windows and Valve. (Image courtesy of Hewlett Packard)
Peaky Blinders: The King’s Ransom is a narrative-driven VR adventure developed and published by Maze Theory. (Image courtesy of Maze Theory)
Hewlett Packard’s HP Reverb G2 VR headset with handheld controllers. (Image courtesy of Hewlett Packard)
A “home environment” view inside the standalone Oculus (now Meta) Quest 2 headset. More than 10 million Quest 2s had been sold as of November, according to estimates. (Image courtesy of Oculus Studios and Facebook/Meta)
Another indication of a growing market came when Oculus announced in February 2021 that the rhythm game Beat Saber had sold over four million copies across all platforms and over 40 million songs from paid DLCs. In an October Oculus blog, it was revealed that Beat Saber had surpassed $100 million in gross lifetime revenue on the Quest platform alone.
In February 2021, Facebook announced that more than 60 VR games available for Oculus Quest and Quest 2 had garnered over $1 million since the beginning of 2020, with six topping $10 million, including Skydance Interactive’s The Walking Dead: Saints and Sinners, released on Oculus in January 2020. The latter title has grossed more than $50 million across all platforms, it was announced by Skydance late last year.
“VR is definitely at an inflection point. It’s starting to look like the early PC evolution, which expanded from just hardcore enthusiasts and tinkerers and hobbyists to everyday use for a whole lot of people. That’s happening with VR now — people beyond the initial core believers are buying headsets and making it a regular part of their lives,” says Johanna Peace, Manager of Technology Communications at Meta, formerly Facebook.
She continues, “A lot of that is thanks to Quest 2. Before Quest 2, there hadn’t been a high-resolution, all-in-one form factor headset at that price point yet, so when we built it, people really responded. A big part of this is because the headset is so intuitive and approachable. With a small and portable form factor, no wires or external sensors, anyone can pick it up and in seconds they’ll be immersed in a VR experience. That simplicity is incredibly powerful, and it removes big barriers to adopting VR.” She adds, “Quest 2 sales are strong and have surpassed our expectations, and we’re thrilled to see the community’s response to Quest 2.”
Peace notes that other genres growing in popularity in VR include fitness/wellness/meditation such as Supernatural and FitXR, multiplayer/social games like POPULATION: ONE and adventure games like Star Wars: Tales from the Galaxy’s Edge.
Vicki Dobbs Beck, ILMxLAB Vice President of Immersive Content Innovation, believes that VR has already begun its breakout and will continue to be adopted by a more mainstream audience given the headsets’ (such as Meta Quest 2) accessibility and ease of use. She comments, “In addition to a robust array of premium game titles, new content categories are further helping to drive growth.” ILMxLAB has shown its interest in the format with its production of the Star Wars: Tales from the Galaxy’s Edge interactive VR game titles, compatible with the Oculus Quest systems.
Beck also sees social VR sites having a positive effect on the acceptance of VR. She comments, “Against the backdrop of the pandemic and the desire to reconnect across geographies, we’ve seen a rise in engagement through social VR sites like VRChat and AltSpaceVR. Whether to experience immersive theater, remote viewing parties, engage in collaboration or just ‘be’ with friends, I expect such use will meaningfully increase in the year ahead.” Other popular social VR sites include Rec Room, Bigscreen VR, and Meta’s Horizon Home and Horizon Worlds.
The Metaverse, a predicted global network of immersive virtual worlds, is expected to boost virtual reality, one of its key components. “VR’s greatest strengths are the power of ‘being there’ and the power of ‘connection.’ While there is neither a single definition of the Metaverse nor a universal strategy for engagement, I believe that VR will be one of the most compelling ways to explore and experience new worlds [and] emerging stories and establish relationships with characters,” says Beck.
Peace adds, “VR will be one of many entry points into the Metaverse, similar to how phones, laptops and other devices are entry points to the Internet today. The Metaverse won’t happen overnight, and it will take years to be fully realized. But VR today shows a glimpse of the immersive, social experiences that can be possible in the Metaverse, and these experiences will continue to develop as VR hardware advances and as the building blocks of the Metaverse are built.”
Augmented Reality has also not yet hit the mainstream, but it has plenty of believers, as it only requires special glasses or goggles, not headsets. Peddie comments, “AR has the biggest potential long-term. AR properly done will change our lives – for the better. When done right, it will be like wearing sunglasses or normal corrective lenses. It won’t be conspicuous or obnoxious, and it won’t take you out of the now – it expands it. AR has nothing in common with VR.”
Participating actively with the Oculus (now Meta) Quest 2 headset. The Quest 2 is credited with taking virtual reality closer to being a mainstream business. (Image courtesy of Oculus Studios and Facebook/Meta)
A Dalek from Doctor Who: The Edge of Time, a VR adventure developed by Maze Theory and published by Playstack. (Image courtesy of the BBC, Maze Theory and Playstack)
The Valve Index is a tethered high-end VR system, which comes with headset, hand controllers and base stations, and connects to your PC. (Image courtesy of Valve)
Skydance Interactive’s The Walking Dead: Saints and Sinners was one of the best-selling VR game titles as of late 2021. (Image courtesy of Skydance Interactive)
“With over 10 million active headsets on Oculus Quest sold now, it’s an active ecosystem. The 10 million unit number is often cited as a crucial stepping point.”
—Ian Hambleton, CEO, Maze Theory
Nguyen comments, “AR continues to mature. Much of the maturation and adoption are driven by the enterprise frontline work of AR. Companies like Snap and Niantic, as well as use cases like virtual try-on, and Google AR dinosaurs, animals and [Google] Maps arrow [Live View], have raised the profile for consumer AR, but we’re still far from mainstream.”
Beck notes that while there is growing interest in AR and some novel applications, the pivotal shift will come with the introduction of compelling AR glasses. She comments, “The kinds of experiences we can create when people do not have to hold a phone or tablet could be truly transformational. A key will be the seamless blend of our digital and physical realities.”
Mobile AR is already widely used with smart phones, tablets and other mobile devices. The most notable example is the Niantic game Pokémon Go, developed and published in collaboration with Nintendo and The Pokémon Company for iOS and Android devices. Pokémon Go has over 150 million worldwide active users (its peak was 232 million users in 2016) and has passed one billion downloads, according to Niantic. Pokémon Go’s in-app purchases account for a large proportion of consumer mobile AR spending, according to Statista, which predicts that the number of mobile AR users will grow from 800 million in 2021 to 1.7 billion by 2024.
Zombies go boom in Skydance Interactive’s The Walking Dead: Saints and Sinners, which has grossed more than $50 million across all VR platforms, according to a Skydance announcement in October. (Image courtesy of Skydance Interactive)
Microsoft HoloLens 2, Magic Leap One, Google Glass Enterprise Edition 2, Ray-Ban Stories and Vuzix Blade are examples of AR glasses on the market. Apple is expected to launch either AR or MR (mixed reality) glasses by early 2023. Statista forecasts AR glasses sales rising from 410,000 units in 2021 to 3.9 million units in 2024. The firm predicts that enterprise spending on AR glasses will rise from $2 billion in 2021 to almost $12 billion in 2024.
Nguyen concludes, “AR and VR will continue to see forward momentum. The pace and trajectory of AR and VR haven’t changed. The difference will be the level of hype and the mismatch between that hype and the reality. Regardless, it’ll take five to 10 years.”
By KEVIN H. MARTIN
Images courtesy of Warner Bros. and DNEG.
DNEG referenced Geof Darrow’s original sketches from the first film in helping to recreate and enhance various environments being revisited in The Matrix Resurrections. These locales include the gel-filled pod containing Neo’s living body.
If the original Star Wars stood as the iconic representation of visual effects innovation for the last decades of the 20th Century, then 1999’s The Matrix surely had taken up that standard in the intervening years. Two sequels – also written and directed by the Wachowskis – may have diluted some of its impact, but the sheer excitement of the first film’s innovative bullet-time scenes raised the bar on VFX in a no-going-back-from-here way that was much aped, but rarely equaled.
Working alone this time, writer-director Lana Wachowski’s return to the franchise with The Matrix Resurrections effectively subverts expectations of both fans and casual moviegoers, choosing to focus on character over flash while still managing to expand on both ‘real’ and virtual realms. For visual effects, this meant revisiting some creatures and environments from the original trilogy, but endowing them with greater verisimilitude; in equal parts, this called for creating new characters from a combination of on-set motion capture and CGI.
As a veteran of the original two Matrix sequels, Visual Effects Supervisor Dan Glass offered an informed perspective on the new film’s place in the Matrix pantheon. “There was some trepidation,” he admits, “in terms of meeting audience expectations. But the goal was always to treat the film as its own entity; while it was written as a continuation, Resurrections also presented its own solid story. Visually and aesthetically, there was a deliberately chosen departure in its approach, something quite apart from its predecessors.”
Part of that visual distinction came from Wachowski’s intent to shoot more in the real world and rely less on static compositions that harken back to graphic novels. “We used previs on a limited basis, mainly for the fully CG shots, so we understood how they laid out and how a camera might move through them,” states Glass. “There were also previs studies to figure out logistics in bigger sequences, but the main stylistic difference comes from being out on location rather than being studio-based, and Lana wanting to have tremendous freedom with the camera. So this one has a lot more fluidity; not necessarily a documentary look, but with more organic aspects of shooting out in the world. It was justified from a story standpoint because we’re depicting a different and more advanced version of the Matrix, where they’ve learned lessons about what constitutes reality. So our approach, to incorporate as much reality into things as possible, permitted a lot of impressive visuals, like seeing our two lead actors jump off the 40th story of a skyscraper.”
The Dojo sequence was a callback to the Neo/Morpheus fight in the first film, and stylistically differed from the look of the real world and the matrix world.
“[T]he main stylistic difference comes from being out on location rather than being studio-based… So this one has a lot more fluidity; not necessarily a documentary look, but with more organic aspects of shooting out in the world. It was justified from a story standpoint because we’re depicting a different and more advanced version of the Matrix, where they’ve learned lessons about what constitutes reality. So our approach, to incorporate as much reality into things as possible, permitted a lot of impressive visuals, like seeing our two lead actors jump off the 40th story of a skyscraper.”
—Dan Glass, Senior Visual Effects Supervisor, DNEG
Instead of simply revisiting bullet-time, Resurrections features scenes that essentially show the process through the other end of the scope, with time dilation being used against Neo. “To portray these moments, we built scenes using multiple frame rates for various characters and events,” Glass explains. “This involved referencing underwater photography to get ideas for how things might look and feel as Neo tries desperately to react to events transpiring faster than he can effectively defend against. We used lots of complex layering of CG over original photography, and in a few scenes employed a very subtle use of fluid dynamics, but it wasn’t about creating a big visual moment, and instead really focused on the emotional content and the acting.”
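Glass’s point about building scenes with multiple frame rates amounts to sampling each character’s animation on its own clock. The following is a minimal sketch of the idea – not DNEG’s actual tools – in which a single motion curve is evaluated at different time scales for Neo and an attacker, so within one rendered frame the two advance through their actions at different rates. The curve, rates and frame count are invented for illustration.

import numpy as np

def sample_curve(times, values, t):
    """Linearly interpolate an animation curve at time t (seconds)."""
    return np.interp(t, times, values)

times = np.linspace(0.0, 1.0, 25)     # a 1-second, 25-key curve
punch_curve = np.sin(times * np.pi)   # stand-in for a limb rotation

fps = 24.0
neo_rate, attacker_rate = 0.25, 3.0   # Neo slowed down, attacker sped up
for frame in range(4):
    t = frame / fps
    neo_pose = sample_curve(times, punch_curve, t * neo_rate)
    attacker_pose = sample_curve(times, punch_curve, t * attacker_rate)
    print(f"frame {frame}: neo={neo_pose:.3f} attacker={attacker_pose:.3f}")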
Even with the focus often being on the intimate, there were still action set pieces, which range from a bullet-train battle to a chase through the streets of San Francisco. “Neo and Trinity race a motorcycle while trying to evade menacing throngs trying to stop them,” says Glass. “In terms of complexity, it was immense. We shot on location in the city with the principals on a real bike – rigged to a platform. Digitally, we went in and added even more figures to the thousands in the crowd while removing rigging.”
DNEG added environmental effects to provide nuances such as rippling water and falling leaves.
Numerous hoses plugged into Neo’s body were created in CG, and had to be tracked to actor Keanu Reeves’ moves and to his underlying musculature. The hoses also had to appear to interact with the red gel and various microbot creatures maintaining the pod.
Pre-production artwork suggested a path for realizing the environment surrounding the pods of Neo and Trinity, with production building a 10-meter-high set piece including a practical pod.
New visual sensibilities – rendered with latticeworks and curved forms – contrast with the previously established brutalist look, suggesting how the faction of Machines working and living with the humans have created a new hybrid aesthetic.
For the first time with their feature work, DNEG relied heavily throughout the scene on Epic’s Unreal engine for real-time rendering at 4K resolution.
The film utilized work from several vendors, including Framestore, Rise, One of Us, BUF, Turncoat Pictures and Studio C, with DNEG drawing a significant number of shots and sequences, including the Dojo sequence in which Neo and a new incarnation of Morpheus square off. Though DNEG created the CG dojo with Clarisse, DNEG Visual Effects Supervisor Huw Evans reports this was the first occasion when the company used real-time rendering with Epic Games’ Unreal engine for finals-quality renders at 4K resolution. “That was a big deal for us,” he avows. “Lana and Epic crafted the scene based on a place called Devil’s Bridge in Germany. After Epic built that out, we took it off their hands, adding extra detail such as rippling water and falling leaves, then starting lighting with it before running our CG cameras and match-moving the real cameras.
“The big challenge at the time in running it all through Unreal was the limitations of version 4.25,” Evans elaborates. “That was missing a lot of features we’d normally use in dealing with imagery, like OCIO color support, that can help get us into comp with all the extra sprinkles. So after we got these beautiful images from Epic, we’d always put things into our pipeline to have the ability to tweak individual grades, such as varying bloom or emphasizing some aspect of a visual effect. We could have output straight from Unreal, but it would have been missing that 10% we always strive to achieve.”
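Evans’ point about pulling the Unreal renders back into DNEG’s own pipeline for “the extra sprinkles” is essentially about layering per-shot adjustments onto a linear frame after it leaves the engine. Purely as a toy illustration of that kind of tweak – not the studio’s comp setup, and with made-up values throughout – the sketch below applies an exposure trim, a simple gamma adjustment and a weighted bloom pass to a linear render.

import numpy as np

def grade(linear_rgb, exposure_stops=0.0, gamma=1.0, bloom=None, bloom_gain=0.0):
    """Apply a per-shot trim to a linear render: exposure in stops, an
    optional additive bloom pass with its own gain, and a gamma adjustment."""
    out = linear_rgb * (2.0 ** exposure_stops)
    if bloom is not None:
        out = out + bloom * bloom_gain
    return np.clip(out, 0.0, None) ** (1.0 / gamma)

render = np.random.rand(270, 480, 3).astype(np.float32)      # stand-in frame
bloom_pass = np.random.rand(270, 480, 3).astype(np.float32) * 0.1
final = grade(render, exposure_stops=0.3, gamma=1.1,
              bloom=bloom_pass, bloom_gain=0.5)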
Glass notes that the decision to handle that sequence in Unreal was made in part because the scene represents a construct within the matrix. “There was an aesthetic choice for that to be different in look from the rest of the film, but internally consistent.”
With the film’s new machine version of Morpheus, live on-set facial capture was used. “The real benefit here was getting all that interaction between the actors live so the eyelines were maintained naturally during the conversations,” Glass acknowledges. “The biggest R&D components on the project connected to the newer body-method capture, which used AI and machine learning to reconstruct from data these three-dimensional representations of actors. It didn’t require a CG head or do the modeling and lighting, because you get all that baked in from the original photography. That was something Lana really wanted because it went along with how she might want to pan rapidly from one direction to the other. The AI/machine learning tech was also used when two actresses are supposed to be occupying the same space – occupying the same avatar, if you will – so we recorded each of them, then used machine learning to superimpose one over another while they appear to be fully synchronized with all their actions.”
Determining just how different this new Morpheus would appear and behave was an iterative process for DNEG. “While he looks human when inside the matrix, his digital look flows and changes as the situation demands,” says Evans.
The resistance’s new belowground digs, Io, represented a departure from their older Zion enclave, suggesting a much more expansive abode, with factories and delineated residential blocs.
“The big challenge at the time in running it all through Unreal was the limitations of version 4.25. That was missing a lot of features we’d normally use in dealing with imagery, like OCIO color support, that can help get us into comp with all the extra sprinkles. So after we got these beautiful images from Epic, we’d always put things into our pipeline to have the ability to tweak individual grades, such as varying bloom or emphasizing some aspect of a visual effect. We could have output straight from Unreal, but it would have been missing that 10% we always strive to achieve.”
—Huw Evans, Visual Effects Supervisor, DNEG
“The earliest attempts were very free-flowing, and I really quite liked the casually arrogant way he could let parts of himself flow ahead before snapping back into the human configuration. But in going down that route, we found that it was harder to relate to this major character when he was all over the place; your eye didn’t always know where to look when he was in such extreme motion. I decided that if he was looking at you, his face and arms would be mostly human, but you’d see this almost seagrass-like effect visible on his back, and a different level of flowing when his muscles were in use, like when he was climbing.”
Separate muscle and skin passes were required before the flowing character effect went in. “There’s a lot of nuance to the performance that started with what the face camera got, in order to convey how he doesn’t have the fidelity of a real human,” states Evans. “When the effects team took over, they tried to proceduralize it as much as possible, but with so much custom work to make the character, there’s still a lot remaining.”
DNEG was also responsible for aspects involving locales recognizable from the first film, including the pod containing Neo’s inert form as he lay in a pool of red gel, literally plugged into the Matrix through various hoses. “The look of that environment started with a beautiful piece of concept art that gave us the broad strokes,” recalls Evans. “Then we did countless designs on the specifics of the pods and the huge turbines around them. Production actually built a massive set piece for the live action that measured 10 meters high and included Neo’s pod – we reused that for Trinity’s, too – which contained all that practical goo. We extended the turbine and built a whole chamber around it in CG, along with the pit below.
For a Machine incarnation of Morpheus, the film utilized live on-set facial capture. AI and machine learning permitted reconstruction of the character from three-dimensional presences on set. Separate passes for skin and muscle helped finesse the look, but Morpheus’ ability to fluctuate his form went through many iterations before filmmakers settled on a subdued rippling to avoid distracting or confusing viewers.
A flashback illustrating the war between two factions of Machines provided an opportunity to deliver exposition through a pair of expansive visual cuts.
The Machines’ armada reflected some souping-up from their earlier forms, while the opposition – operating in what were dubbed ‘squid tanks’ – used geometry made up of familiar detail bits from the trilogy.
The faction of Machines working and living with the humans have created a new hybrid aesthetic.
These constructs were only slightly ‘upgraded’ – both to reflect the changes in the nature of the matrix as well as CG developments over the intervening years separating Resurrections from the original trilogy.
“It was a fairly standard build, but to give the scene life and a tie-in back to the original film, we added a bunch of little microbot creatures for additional texture and detail. We referenced Geof Darrow’s original sketches from the first film [of the machine ecosystem] to keep these things looking familiar and appropriate, but we got to embellish things by giving them a sense of character when Neo unexpectedly wakes – they are surprised and get away really quickly!”
Evans found one of the most challenging parts of the sequence to be a fairly invisible aspect. “The hoses that attach to their bodies were all digital,” he explains. “We tracked all those ports on Neo’s back individually – that meant not only matching the movement, but also understanding and reproducing the musculature beneath those ports, including how his skin moves and reacts. These CG cables also had to interact with the practical goo that was in the pod and dripping off Neo, plus those microbot creatures as they splashed around in the goo; all of this amounted to incredibly detailed work that was often painful to get just right. None of this is really in-your-face stuff, but the invisible work is often at least as challenging. It made for just a ton of fine-detail balancing.”
A throwaway line of dialog led to a short but memorable sequence featuring combat between two factions of warring machines. “The machine battle was originally just referred to in conversation, and then it was decided to do a single big shot of the action, which sounded exciting,” admits Evans. “What was even better was how it eventually became two massive shots. This started life as another piece of concept art that roughly blocked out what was going on. We used the armada ships from the original trilogy on one side of frame, updated with new detail, flying through the air, while on the other side we had a group in what we called squid tanks. They were a hybrid using familiar bits of geometry from the original, including elements from the harvester and sentinel, but with new weird dreadlocked bits that resembled sentinel legs. Lana was keen to not include the harvesters intact, because they are farming creatures, not built for firing lasers and doing battle. The Environments team did an amazing job with all the fire and smoke and debris, which was especially important in creating the spectacle, because, with just two shots, we didn’t want to go crazy modeling all these forms and mainly built to camera whenever possible.”
Among the familiar ‘faces’ from earlier Matrix entries are the Machines’ attack Sentinels and Io’s hovercraft vessels, along with Harvesters used by the Machines to service fetus fields.
DNEG extended the turbine section and filled in the rest of the environment digitally.
Belowground in Io.
Environments, along with the matte painting team, also contributed heavily to Io, the new city that succeeds Zion as home to the human resistance. “If Zion represented a town, then Io is a mega-city – massive, with towering buildings, factories and delineated residential blocks within a sprawling cave environment,” remarks Evans. “We tried to generate stuff procedurally, when possible, but when there were specific story points, like being able to see people moving around in their residences and interacting with the machines on their side, there had to be a great deal of detail work. Lana was very keen that this environment show how people and machines working together reflected a very different visual sensibility from that of Zion, which was very run-down and clearly just a product of human minds. With the help of this faction of machines, you go beyond just a Brutalist grouping of buildings into details featuring 3D-printed lattices and curved shapes made from unusual, exotic materials.”
To facilitate the film’s less-regimented approach to shooting, DNEG always used the production imagery as a reference. “Whenever we had a fully CG shot, especially one that went between two production shots, we made sure our CG cameras could match the moving and sometimes handheld look of the production shoot,” says Evans. That meant duplicating the exact kind of camera shake and bounce, which was absolutely necessary to make everything live together in the cut.
“Everybody loved what the first Matrix brought to cinema,” Evans concludes. “But if Lana had taken the easy way out and done the same exact thing over again 20 years later, it would have been treading on familiar territory that so many other projects have already leveraged off. This one defies a lot of conventional thinking about sequels, while having meta kinds of fun with the whole notion of sequels. That may not be what everybody expected it to be, but it is very different, and I found that aspect, along with getting to be part of the Matrix history, to be extremely worthwhile.”