By MATT HURWITZ
Images courtesy of Warner Bros.


Dwayne Johnson “floating” out of the Rock of Eternity on set at Trilith Studios in Atlanta, lifted by one of the special effects department’s robotic arms.
Watching director Jaume Collet-Serra’s Black Adam, audiences are easily convinced that the Warner Bros./HBO Max saga was shot in a Middle Eastern city, nowhere near the Atlanta, Georgia set on which it was filmed. “We’re always most proud of things no one ever thinks are visual effects,” notes Oscar-winning Visual Effects Supervisor Bill Westenhofer (Life of Pi). “The goal is to work yourself out of any recognition.”
The film was lensed by DP Lawrence Sher (Joker) with production design by Tom Meyer. Its primary visual effects vendor was Wētā FX under the production supervision of Westenhofer in tandem with Wētā VFX Supervisor Sheldon Stopsack. Additional VFX work was provided by Digital Domain, Scanline VFX, DNEG, Rodeo FX, Lola Visual Effects, Tippett Studio, Clear Angle Studios, Effetti Digitali Italiani (EDI) and UPP. Special Effects Supervisors were Lindsay MacGowan and Shane Mahan for Legacy Effects.

Dwayne Johnson battling with Aldis Hodge’s Hawkman in the Sunken City exterior set at Trilith Studios. The athletic Hodge was suspended by wires, his wings – as well as the set extensions beyond the ground-level set – added later by Wētā FX.
The story takes place in fictional Shiruta, the modern-day version of Kahndaq, where 5,000 years prior, Teth-Adam (Dwayne “The Rock” Johnson), the people’s hero with great superpowers, had been imprisoned in the Rock of Eternity for apparently misusing his powers. He is released by a rebel (by utterance of the word “Shazam”) and brought back to battle the people’s modern-day oppressors, Intergang. While he initially also battles the four members of the Justice Society of America – Doctor Fate, Hawkman, Atom Smasher and Cyclone – they end up fighting Intergang together, eliminating the threat posed not only by that group, but by Sabbac, who also arises from the darkness of old Kahndaq to attempt to claim the throne. By the end of the film, Teth-Adam has eliminated the threat and is renamed Black Adam.
“[S]ince [director] Jaume [Collet-Serra] and [DP] Larry [Sher] were actively participating in creating the previs, they felt ownership. So, when we got to set, they knew that the previs was theirs and that was the path they were going to follow, as opposed to getting there and going, ‘Oh, there’s that previs – forget that, we’re gonna do our own thing.’ It’s amazing – you can look at the previs and look at the shots and they’re incredibly close.”
—Bill Westenhofer, Production Visual Effects Supervisor
Development of Black Adam began in 2019, with Westenhofer being brought on not long after Collet-Serra came on to helm the project. By that time, the director had worked with storyboard artists to flesh out his ideas. Then, they met to decide the best state-of-the-art methods to create the imagery the director had in mind. “LED walls were hot at the time, as was volume capture, and we ended up dabbling in all of them,” Westenhofer remarks. The production was intended as a full-scale virtual production, developed initially by Tom Meyer in ZBrush. “We had a motion capture stage setup and had motion capture performers, and we had real-time controls,” Westenhofer adds. “We were due to start on March 17, 2020 – and then the world closed down.”

Johnson in a completed “flying” shot. He was first captured lying flat in an Eyeline Studios volume capture stage, with the rig later removed and extensive background animation added.
Over the pandemic hiatus and through Fall 2020, L.A.-based Day For Nite continued work creating previs for the scenes, importing the Maya storyboard files into Unreal Engine. “Right away, we can see things that are working and ones that are not,” Westenhofer states. What read in the script as “He comes out, they fight, he flips over a tank” was soon developed into fully-realized scenes.
At the same time, DP Sher began setting cameras and lighting, working with Day For Nite and Collet-Serra via Zoom. “It was great because since Jaume and Larry were actively participating in creating the previs, they felt ownership. So, when we got to set, they knew that the previs was theirs and that was the path they were going to follow, as opposed to getting there and going, ‘Oh, there’s that previs – forget that, we’re gonna do our own thing.’ It’s amazing – you can look at the previs and look at the shots and they’re incredibly close.”


Johnson “floats” down the stairs of an apartment set, standing on the small platform of an industrial automobile robotic rig.
Deciding which locations seen in the previs would be practical sets and which would be CGI was an important step. “You can look at the previs,” Westenhofer explains, “and you can see if Jaume wants to be looking in a specific direction most of the time, in which case we would build that part as a set. But as soon as that set has more than one story to it, construction costs start to go up. So, for things like the city, Shiruta, I told Tom just to focus on, say, the first story, store level, and we’ll take care of the rest.” The same goes for which characters would be digital and which would be captured in-camera. “I always try to favor scenes where there are people – humans not flying around and who aren’t superheroes. But we have a movie where there are five superheroes and four of them fly in some form. So, they’re going to be mostly digital,” Westenhofer declares.
Building a City
When filming finally began, Meyer constructed the ground-level set of Shiruta on the back lot at Trilith Studios in Atlanta, notably its Central Market or the “Sunken City” where a great amount of action in the film takes place. “It actually doubles for many places in the city,” Westenhofer explains. “We had a roundabout area and several cross-streets, and if you look in one direction, that would be the area around Adriana’s (Sarah Shahi) apartment, and if you look the other way, it was where the palace would be. And when they’re seemingly driving through town, they’re really going in circles, but by changing the set extension it felt like you were traveling through the city.”
Wētā’s Assets Department, which includes its Art Department as well as modelers, texturers and shading specialists, were responsible for crafting the city, rooted in Meyer’s design for the practical set. “Tom did a magnificent job of fleshing out the tone and feel of Shiruta,” Stopsack states. “So, a lot of the groundwork was done already. We engaged with Tom quite early. Then we spent a fair amount of time designing the architecture and the whole feeling of the city square.”


A floating Dwayne Johnson, suspended by an industrial automobile robotic arm, does away with two bad guys.
“[Black Adam] not only flies, he floats. In the comic books, he says he doesn’t want to share the ground with lesser beings. So, he feels like he should float. But we wanted Dwayne to be in the scene, and we didn’t want to have him always be bluescreen, having to shoot him looking at tennis balls. [Director] Jaume [Collet-Serra] wanted it to be super smooth, not having to expend any effort, just floating.”
—Bill Westenhofer, Production Visual Effects Supervisor
The look established by Meyer, Stopsack notes, “as he often described to us, was like Middle East meets Hong Kong. It needed to be dry, somewhat monochromatic and reasonably high. A lot of chaotic streetlamps, wires and air conditioning units everywhere.” Much of that was introduced to Wētā early in concept art, and then it was up to them to flesh out the environment. Adds Stopsack, “We had the luxury of photography of Tom’s set on the backlot, which gave us a starting point via plate photography, which we then extended.” That task required extension to show the entire city of Shiruta, to allow creation of high wide shots, which would include the Central Market and Palace in the extended terrain. “We knew whatever we would start building around the Sunken Street ultimately would be utilized for propagating the wider city.”
The Asset Team’s approach was essentially to create modular building blocks for the different architectural levels and stories of each building. “We had to interchange them, dress them differently and stack the buildings up to a height that Tom deemed necessary,” Stopsack explains. “For each building we would ask, ‘Would you like this to be 15 stories high? 10 stories? Is this a round building? Do we see filigree here?’ So, we had a lot of engagement with him to make sure that the look and feel was what he envisioned.” They took advantage of any assets the Art Department had available, including packages of art, signage and other items, to lean into the same language Meyer’s teams had developed. “It’s an endless chase of getting the level of detail that you’re after.”
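As a rough illustration of that modular idea – a toy sketch only, with invented asset names, and not Wētā's actual pipeline – stacking interchangeable story modules and scattering dressing per floor might look like this:

```python
import random

MODULES = ["plain_story", "balcony_story", "arched_story"]  # hypothetical module library
DRESSING = ["ac_unit", "wires", "signage", "awning"]        # hypothetical clutter props

def assemble_building(stories: int, seed: int = 0) -> list[str]:
    """Stack interchangeable story modules, then scatter dressing on each floor."""
    rng = random.Random(seed)
    building = ["ground_floor_storefront"]  # the level matching the practical set
    for _ in range(stories - 1):
        story = rng.choice(MODULES)
        clutter = rng.sample(DRESSING, k=rng.randint(1, 3))
        building.append(f"{story}+{'+'.join(clutter)}")
    return building

print(assemble_building(stories=15, seed=7))  # e.g. "Would you like this 15 stories high?"
```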
Wētā’s attention to detail translated to a construct that looks like a true city and not a visual effect. At the same time, constructing the entire city digitally – including the entirety of Meyer’s Sunken City area sets – gave Wētā valuable flexibility for creating scenes of mayhem which otherwise would have required destruction of the practical set. “The beauty of approaching it that way,” Stopsack observes, “is that we were left with an all-digital representation of the practical set pieces that were built. So, in the fight between Black Adam and Hawkman, if Black Adam is punched and smashes down the side of the building, those shots could be created fully digital. The entire environment was fleshed out, so we could inject these all-digital shots in between.”
In order to develop a true city grid seen in high wides, Wētā’s layout team utilized OpenStreetMap data, accessing real-world locations as the basis for Shiruta’s street layout. Comments Stopsack, “We looked at Middle Eastern cities around the globe to look at each city’s grid to study the general density of population and buildings and the buildings’ heights. A lot of data can be sourced, and we used that to lay a foundation for what Shiruta became.”
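To give a sense of how such data can be sourced, here is a minimal sketch using the open-source osmnx package – our choice for illustration, not necessarily the tooling Wētā used – to pull building footprints and tagged story counts for a real Middle Eastern city:

```python
import osmnx as ox
import pandas as pd

place = "Amman, Jordan"  # a stand-in for the kinds of cities the team studied
buildings = ox.features_from_place(place, tags={"building": True})

# Many OpenStreetMap buildings carry a "building:levels" tag, giving a
# statistical basis for block density and story counts in a fictional grid.
if "building:levels" in buildings:
    levels = pd.to_numeric(buildings["building:levels"], errors="coerce").dropna()
    print(levels.describe())
```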


Pierce Brosnan on set wearing his “faux cap” suit with optical markers, holding his helmet. The remainder of his costume was created digitally, as seen in the final shot.
“[The look established by Tom Meyer] as he often described to us, was like Middle East meets Hong Kong. It needed to be dry, somewhat monochromatic and reasonably high. A lot of chaotic streetlamps, wires and air conditioning units everywhere. We had the luxury of photography of Tom’s set on the backlot, which gave us a starting point via plate photography, which we then extended. We knew whatever we would start building around the Sunken Street ultimately would be utilized for propagating the wider city.”
—Sheldon Stopsack, VFX Supervisor, Wētā FX
Moving Black Adam
As lead effects vendor, it fell to Wētā to develop the character animation models and movement, which were then shared with the other vendors for creation of their own scenes. “We were engaged fairly early on, when Bill asked us to start doing motion studies – even before building any of the Shiruta environments,” Stopsack explains. “These were done, in part, to inform how they would be shot on the practical set, like how they engaged in flying action or how Hawkman would land.”
Hawkman actor Aldis Hodge did quite a few stunts himself, such as his dives into the Central Market on a wire, touching down. “We had him rehearse with counterweights attached to his costume to give him a sense of what the wings would feel like, informing his performance,” Westenhofer notes. He was also given lightweight cloth cutouts of the wings to give the set team an understanding of their size and articulation, to let DP Sher plan space for them in his frame, and to give the future digital wings a home.
The motion studies also helped Wētā work with Costume Design and the Art Department to nail down costume design and motion. Says Stopsack, “Some characters that were not completely digital had costumes that needed to be practically built, such as Hawkman and Black Adam – Hawkman’s wing design, for instance, looking at their mechanics, how do the wings unfold? Things like that.” Other designs, like Dr. Fate’s costume, were completely digital, requiring more creative input from Wētā.

Director Jaume Collet-Serra, left, discusses a scene on set at Trilith Studios in Atlanta.
A key part of Black Adam’s motion involves his simple floating movement within a scene. “He not only flies, he floats,” Westenhofer explains. “In the comic books, he says he doesn’t want to share the ground with lesser beings. So, he feels like he should float. But we wanted Dwayne to be in the scene, and we didn’t want to have him always be bluescreen, having to shoot him looking at tennis balls. Jaume wanted it to be super smooth, not having to expend any effort, just floating.”
Special Effects Supervisor J.D. Schwalm was tasked with offering practical methods to accomplish the float. The main mechanism was provided by an industrial automobile robot used in car manufacturing. “That was the coolest one,” continues Westenhofer, “which we had mounted on the set and could be programmed to pick him up and float him and move him around,” with Johnson standing on the rig’s platform, his legs being replaced later and the rig removed. “It allowed him to act. When he’s floating down the stairs, passing the kid, he could do back and forth banter and actually be in the scenes with other characters. That was really important.”
For simpler shots, Schwalm provided a small robotic cart about 2½ feet by 2½ feet, with a robotic hydraulic arm containing a saddle and a small foot platform, allowing Johnson to be raised or lowered up to four feet versus the industrial robot, which could lift him as high as 15 feet. “These sorts of things could also be done using wires, but Dwayne found this really comfortable, and it allowed him to interact naturally,” Westenhofer notes.
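The “super smooth” quality Collet-Serra asked for is, at bottom, an easing problem. As a purely illustrative sketch – invented numbers, not J.D. Schwalm's actual control software – a quintic “smootherstep” profile reaches zero velocity and zero acceleration at both ends of a lift, which is what makes a programmed rise read as effortless floating:

```python
def smootherstep(t: float) -> float:
    """Quintic ease from 0 to 1 with zero endpoint velocity and acceleration."""
    t = max(0.0, min(1.0, t))
    return t * t * t * (t * (6.0 * t - 15.0) + 10.0)

def float_height(time_s: float, duration_s: float, lift_ft: float) -> float:
    """Platform height during a lift from ground level to lift_ft."""
    return lift_ft * smootherstep(time_s / duration_s)

# Sample an 8-second, 15-foot lift (the industrial robot's reported maximum):
for t in range(9):
    print(f"t={t}s  height={float_height(t, 8.0, 15.0):.2f} ft")
```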
“[The Central Market on the set of Shiruta] actually doubles for many places in the city. We had a roundabout area and several cross-streets, and if you look in one direction, that would be the area around Adriana’s (Sarah Shahi) apartment, and if you look the other way, it was where the palace would be. And when they’re seemingly driving through town, they’re really going in circles, but by changing the set extension it felt like you were traveling through the city.”
—Bill Westenhofer, Production Visual Effects Supervisor

Dwayne Johnson in the Sunken City set during a fight scene. Production Designer Tom Meyer’s Central Market was replicated by Wētā FX in set extensions.
For his flying sequences, the VFX team made use of a volume capture system provided by Eyeline Studios, a division of Scanline VFX. The system allowed Johnson’s performance to be captured in a method quite different from motion capture. The actor would lie flat on the rig, surrounded by an array of hi-res video cameras (versus the infrared cameras used in mocap). Eyeline then processes the data in its proprietary system and provides an extrapolated mesh and a set of textures. Explains Stopsack, “When the data comes to us, we then have the geometry of his performance, of his head, and we have the texture that maps onto it.”
“[The industrial automobile robot was] mounted on the set and could be programmed to pick [Johnson] up and float him and move him around [with Johnson standing on the rig’s platform]. It allowed him to act. When he’s floating down the stairs, passing the kid, he could do back and forth banter and actually be in the scenes with other characters. That was really important.”
—Bill Westenhofer, Production Visual Effects Supervisor
Wētā took the process a step further to retain Johnson’s natural head motion. “We took Eyeline’s mesh and tried to incorporate that into our full-blown Black Adam digital double,” Stopsack remarks. “We could then take their head motion data and combine that onto our puppet so that the head motion would track perfectly with our digital asset, with our digital head motion. But volume capture gives you limited body motion. If you have pretty intricate body motion, your head motion can quickly go off what the volume capture would allow, such as if the head goes backwards and you want an extreme that it won’t permit. So, our animators would then see those constraints and work within them to see how far we could bend the head back without going beyond what volume capture could support,” preventing the bend from appearing too rubbery, unlike a real person’s movement. Stopsack adds, “We used the technology for a small number of shots, but it was great when you needed the unmistakable likeness of the actor.”
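In code terms, that constraint amounts to clamping retargeted head rotations to the envelope the capture supports. A minimal sketch – the angle limits below are invented for illustration, not Wētā's values:

```python
from dataclasses import dataclass

@dataclass
class HeadLimits:
    """Rotation range (degrees) the volume capture plausibly supports."""
    pitch: tuple = (-35.0, 25.0)   # hypothetical limits
    yaw: tuple = (-60.0, 60.0)
    roll: tuple = (-20.0, 20.0)

def clamp_head_rotation(pitch: float, yaw: float, roll: float,
                        limits: HeadLimits = HeadLimits()) -> tuple:
    clamp = lambda v, bounds: max(bounds[0], min(bounds[1], v))
    return (clamp(pitch, limits.pitch),
            clamp(yaw, limits.yaw),
            clamp(roll, limits.roll))

# A head thrown too far back is held at the supported extreme:
print(clamp_head_rotation(pitch=-50.0, yaw=10.0, roll=0.0))  # (-35.0, 10.0, 0.0)
```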
In addition to Eyeline’s cameras, the actor was surrounded by LED walls playing back material created in Unreal, working early on with Scanline and Digital Domain, which provided interactive lighting on Johnson’s costume. The backgrounds, of course, were replaced later. LED walls came in handy for other sequences, such as filming the cockpit scenes in the Hawk Cruiser as it crashes into the Central Market. “The cockpit set was too big to place it on a gimbal,” Westenhofer reveals. “Instead, we had the content playing back on the LED screen, which was designed as being from the point of view of the cockpit so they could see themselves flying through space and crashing, and it gave them enough inspiration to sway and move as the craft was bucking in space.” For lighting, he says, “It worked really well inside the cockpits. We did replace some backgrounds, but the interactive light worked really well.”

Aldis Hodge as Hawkman. The character’s wings were added digitally, though Hodge was provided lightweight cloth cutouts to allow the actor and on-set teams an idea of the space that would be taken up by the finished digital product.
Using LED walls is not something to do frivolously, Westenhofer notes. “A lot of people come to this and hope to get what they call ‘final pixel,’ meaning you film it and the shot is done. There needs to be a fair bit more work done to get LEDs to the point where that’s really successful. You need a lot more time in prep to build these CG backgrounds, but then no one can change their mind afterwards. If you do that, it’s baked into the sauce.”
Towards the end of the film, we see Teth-Adam’s backstory in a flashback revealing the death of his son, Hurut, before he became the “The Rock”-sized superhero. For those scenes, a much slimmer double (Benjamin Patterson) was used onto which Johnson’s face was later applied. “We’d have Dwayne do a pass just to get the performance, and then the double would come in and repeat the same timing and performance. So it would be his body,” Westenhofer explains.
Later, after the scene was cut together, Johnson’s head and face were captured by Lola Visual Effects using their “Egg” setup, a system somewhat similar to volume capture. “Dwayne would sit down in a chair surrounded by several cameras,” Westenhofer describes. “Lola had studied our footage and set up lighting timed to replicate interactively the way the light on set interacted with the double throughout the shot, using colored LEDs. They could tell Dwayne to ‘Look this way’ or ‘Get ready to turn your head over here,’ and they would time the playback so he’d give the performance and move his head, give the dialogue matching what we captured on set from the other actor. Then, that head is projected onto a 3D model and locked into the shot itself, so you have Dwayne’s head and the skinny actor’s body.”


Before and after shots of a battle sequence in the Sunken City show the extent of Wētā FX’s detailed design work in set extensions and effects animation.
For Pierce Brosnan’s character, Dr. Fate, it was the opposite case. Brosnan’s own performance was filmed on the set and his body was replaced. “When he’s flying, it’s all CGI,” says Westenhofer. “But when he’s on the ground interacting with other characters, his costume has more life than a practical costume would have, so the costume is digital.”
Instead of using motion capture where Brosnan would have been filmed alone on a mocap stage, a “faux cap” system was used. Brosnan appeared on set in a simple gray tracking suit. Explains Stopsack, “It doesn’t have a full-blown active marker setup as a motion capture setup would have. The suit is simply peppered with optical markers, which are not engaged with any computer system but simply photographed with witness cameras. Our Match Move Department then uses some clever algorithms to triangulate their location and extract motion. We needed to see Pierce’s performance, his persona as an actor on set engaging with all of these characters. Then the superhero suit followed after.”
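At its core, that marker solve is classic multi-view geometry. A bare-bones sketch of the principle – toy camera matrices, not the Match Move Department's actual solver – using linear (DLT) triangulation of one marker seen by two calibrated witness cameras:

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point from two 3x4 camera matrices."""
    (u1, v1), (u2, v2) = uv1, uv2
    A = np.array([u1 * P1[2] - P1[0],
                  v1 * P1[2] - P1[1],
                  u2 * P2[2] - P2[0],
                  v2 * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

# Toy setup: one camera at the origin, a second one meter along X.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
marker = np.array([0.2, 0.1, 5.0, 1.0])           # ground-truth marker position
uv1 = (P1 @ marker)[:2] / (P1 @ marker)[2]        # observed pixel coordinates
uv2 = (P2 @ marker)[:2] / (P2 @ marker)[2]
print(triangulate(P1, P2, uv1, uv2))              # ≈ [0.2, 0.1, 5.0]
```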



By TREVOR HOGG
Images courtesy of HBO.

Having the 10 scripts essentially written before shooting commenced assisted in deciding which sets needed to be built practically and which digitally.
Unlike Game of Thrones, the prequel House of the Dragon, which revolves around the decline of Targaryen rule, has to deal with the expectations set by its predecessor, pushing the boundaries of high-end episodic visual effects to achieve filmic quality. The first season, consisting of 10 episodes, was able to take advantage of the new virtual production stage at Warner Bros. Leavesden Studios, with showrunners Ryan Condal and Miguel Sapochnik collaborating with Visual Effects Supervisor Angus Bickerton (The King’s Man) to achieve the necessary size and scope, and many more dragons, for the epic fantasy HBO series. (Sapochnik has since moved to an executive producer role.)

The goal was to create a dirtier, grungier and dustier environment than Game of Thrones, which occurs 130 years later.
Bickerton joined the project back in September 2020, and at that point the scripts for the 10 episodes were essentially written. “That’s an important thing to say because as we know all TV productions are still evolving as they’re going along. You need to have settled scripts in order to say, ‘These sequences are going to be done in the volume.’ If we wanted to shoot the interior of Storm’s End Castle in Episode 110, instead of 12 weeks in post to do that environment, we needed 12 weeks prior to shooting to build it in Unreal Engine for texturing, lighting, doing test plays in the volume, to make sure it was coming out right, and working with the DPs and art department to decide which bits we were going to put on the screens and what would be sets.”

Around 2,800 visual effects shots were created for the 10 episodes, ranging from tiny birds in the frame to dragons.

A key principle for dragons is that they keep growing.
Some of the street scenes were captured in Spanish and Portuguese locations, but the rest were shot either on the virtual production stage or on the backlot at Leavesden Studios. “We had an oval space with a 270-degree wraparound screen, and it’s about 65 to 70 feet wide by 85 feet deep,” Bickerton explains. “We hung doors to block off the rest of the oval so we could make an almost 360-degree volume. Above that, we have our ceiling, which was on panels so we could raise and lower them. Normally, you drop that ceiling just inside the wall height. Our screen was 25 feet high. When you’re inside and look up, the ceiling blends into the wall. It’s a balancing act. You have to find a position where it’s slightly inside the wall height, but where the 40 tracking cameras arranged around the screen can still get a view of the camera, in order to track it in real time and create the interactive environment on the screen.”
“Once you’ve built this beautiful cathedral, the last thing you want is to start blowing smoke and have hot flames melt the LED panels. But we wanted candles, flame bars, driving rain and smoke. The first thing that we did was to concede some of the screens to create ventilation space for smoke. The screen was lifted above the flame bar element to get it further away from the flame.”
—Angus Bickerton, Visual Effects Supervisor
As with Game of Thrones, House of the Dragon features extensive smoke, fire and rain, which meant that special effects had to occur within the virtual production stage. “Once you’ve built this beautiful cathedral, the last thing you want is to start blowing smoke and have hot flames melt the LED panels,” Bickerton notes. “But we wanted candles, flame bars, driving rain and smoke. The first thing that we did was to concede some of the screens to create ventilation space for smoke.” Additional ventilation was placed under the screens so the air was constantly moving. “The screen was lifted above the flame bar element to get it further away from the flame,” Bickerton adds. “When it came to storm sequences, we had to figure out the orientation of our motion base so we could blow the smoke and rain atmosphere past the actors and it would go across the screen. We could have separate fans blowing it away from the screen as well as off-camera.”

Sunrises and sunsets can be shot over the course of days on a virtual production stage with the same lighting conditions being maintained.

An iconic prop that makes an appearance in House of the Dragon is the Iron Throne.
“[For the flying dragon shot in Episode 110], The Third Floor did the previs that was animated with much simpler dragon assets to make sure that we were doing the right dragon motion. The Third Floor’s simulation files were given to Pixomondo, which tweaked and revised the animation that was then given back to The Third Floor, which rebuilt it for the motion base, volume and camera, and we worked out what camera moves that we had to do with the actors to match the previs.”
—Angus Bickerton, Visual Effects Supervisor
Special Effects Supervisor Michael Dawson and his team built a new motion base that could bank, pitch and rotate. “The motion base exceeded our expectations,” Bickerton remarks. “We got fast movement, good angle changes, could throw the actors around quite considerably and get shakes in their bodies. The Wirecam was more of a challenge to move around fast because you have to ramp up to speed, fly past an actor and ramp down again. [For the flying dragon shot in Episode 110], The Third Floor did the previs that was animated with much simpler dragon assets to make sure that we were doing the right dragon motion. The Third Floor’s simulation files were given to Pixomondo, which tweaked and revised the animation that was then given back to The Third Floor, which rebuilt it for the motion base, volume and camera, and we worked out what camera moves that we had to do with the actors to match the previs.”

Concept art by Kirill Barybin showing the scale of Prince Lucerys Velaryon and Arrax, which is a 14-year-old dragon.

The 2D concept art of Arrax was translated into a 3D blockout by Kirill Barybin.
“[The dragons] ultimately can’t bear their own weight. Vhagar, which is chasing Arrax, is meant to be 103 years old whereas Arrax is 14 years old. Whenever a new member of the Targaryen family is born a dragon is put in the crib with the child so that they develop a symbiosis. But there is only so much control that you have over these dragons. The shot where you see the big silhouette of Vhagar above Arrax was a signature image that we wanted going into the sequence to show the size of him. In terms of how the motion base moved, Arrax is flappier and smaller, so it has more aggressive motions whereas Vhagar is a huge beast and the motions are a lot more general.”
—Angus Bickerton, Visual Effects Supervisor
A narrative principle is that dragons keep on growing. “They ultimately can’t bear their own weight,” Bickerton notes. “Vhagar, which is chasing Arrax, is meant to be 103 years old whereas Arrax is 14 years old. Whenever a new member of the Targaryen family is born a dragon is put in the crib with the child so that they develop a symbiosis. But there is only so much control that you have over these dragons. The shot where you see the big silhouette of Vhagar above Arrax was a signature image that we wanted going into the sequence to show the size of him. In terms of how the motion base moved, Arrax is flappier and smaller, so it has more aggressive motions whereas Vhagar is a huge beast and the motions are a lot more general.”

A dramatic action sequence is when Prince Lucerys Velaryon and Arrax are chased by Aemond Targaryen and Vhagar.
There were no static 2D matte paintings as the camera always had to be fluid. “The trick was to always have atmosphere-like particles in the air,” Bickerton reveals. “I remember working on our first environment and asked, ‘Should we add some birds?’ And it worked. There were birds all over the place. They were small in frame but were a key element in bringing life to the shot. Miguel wanted it to be dirtier, dustier, grungier than Game of Thrones because we are taking place 130 years before, so there was a lot of smoke, and King’s Landing has a nastier look.” Bickerton was given an eight-terabyte drive of assets from Game of Thrones by HBO that included the Red Keep and King’s Landing. Explains Bickerton, “They had been built by different facilities for each season, so we had about five or six different variations of the Red Keep and King’s Landing. Our Visual Effects Art Director, Thomas Wingrove, brought in the different models, and we came up with our own fully-realized 3D environment because we wanted to be able to come back to it and know where everything was. In Game of Thrones, they tended to add in bits when needed for each episode.”
“[Showrunner/director] Miguel [Sapochnik] wanted it to be dirtier, dustier, grungier than Game of Thrones because we are taking place 130 years before, so there was a lot of smoke, and King’s Landing has a nastier look. They had been built by different facilities for each season, so we had about five or six different variations of the Red Keep and King’s Landing. Our Visual Effects Art Director, Thomas Wingrove, brought in the different models, and we came up with our own fully-realized 3D environment because we wanted to be able to come back to it and know where everything was. In Game of Thrones, they tended to add in bits when needed for each episode.”
—Angus Bickerton, Visual Effects Supervisor

A signature shot is of the shadow of Vhagar flying above Arrax.

2D and 3D techniques were combined to create the disfigured face of King Viserys I Targaryen.
Around 2,800 visual effects shots were produced for the 10 episodes. “If you’re going to have a character who is 1/10th the screen size of a dragon, then it’s a digital double,” Bickerton states. “We used digital doubles for some of the fast action; otherwise it’s an element of someone on a motion base, if it’s dragon-riding. We tried to shoot an element for everything. There was quite a lot of face replacement for action and storm sequences.” All of the actors were scanned to various degrees, depending on how much of their performance is needed. Comments Bickerton, “In the tournament at the beginning of Episode 101, there are numerous face replacements. We had to do CG for half the face of King Viserys I Targaryen in Episode 108, towards the end of his final days. We did a lot of 2D warping and distortion to make his neck thinner and get his face to be gaunt. The bit I love is the sheer diversity of the work. There are so many different environments and dragon characters. That’s what I like.”
By TREVOR HOGG
Images courtesy of Prime Video and ILM.



The Martian terrain traveled by Oppy was given a reddish tint while the setting inhabited by Spirit had a bluish tint.
Considering the *batteries not included vibe, it is not surprising to learn that Amblin Entertainment was involved in producing the Prime Video documentary Good Night Oppy, which chronicles NASA’s successful development and launch of Mars rovers Opportunity and Spirit in 2003, with the former defying expectations by going beyond the 90-day mission and surviving for 15 years.
To re-enact what happened to the two rovers on the Red Planet, filmmaker Ryan White turned to ILM Visual Effects Supervisors Abishek Nair and Ivan Busquets to, in essence, produce an animated film to go along with present-day interviews and archival footage. “Ryan White wanted to make a real-life version of WALL·E [a small waste-collecting robot created by Pixar] in some ways, and mentioned during the project that E.T. the Extra-Terrestrial was his favorite film growing up and wanted to bring that emotion into it,” Nair explains. “For us, it was trying to maintain that fine balance of not going too Pixar, doing justice to the engineers who worked on the rover missions and forming a connection so that the viewers feel the same thing that the engineers went through when they were working with Opportunity and Spirit.”


Amongst the 34 minutes of full CG animation was the landing of the rovers on Mars.
“For us, it was trying to maintain that fine balance of not going too Pixar, doing justice to the engineers who worked on the rover missions and forming a connection so that the viewers feel the same thing that the [NASA] engineers went through when they were working with Opportunity and Spirit.”
—Abishek Nair, Visual Effects Supervisor, ILM
Creating a sense of a face was important in having the rovers be able to emote. “Early on in the show, Ryan was interested in exploring a range of emotions for these rovers and was doing that in parallel in sound and visual effects,” Busquets states. “He was trying to come up with a library of plausible looks so that we were not making a caricature. Even when animating the rovers, we observed the limitations of the joints and what the range of movement is. We did cycles of, what does sad or older-looking-moving Oppy look like? It was all based on, ‘let’s use what’s in there.’ The most obvious example was using the pan cameras as eyeballs because from a physical position, they do resemble the eyeballs on a person.”


ILM created a view of Mars from outer space.
Data was provided by the NASA Jet Propulsion Laboratory. “The rovers themselves are the most accurate versions of Opportunity and Spirit,” Nair observes. “We would send turntables of the rovers to the JPL and they would point out certain things that felt a little off, like how the robotic arm would bend and including the decals/details on the rover itself. We built up the rovers with some of the stickers that were on the prototypes and those were taken off when the rovers went to Mars. We had to keep all of those things in mind. It was a good symbiotic process. The engineers at JPL were excited that we were breathing life into the rovers.” The models for Opportunity and Spirit were the same but treated differently. “We respected the story, like when they needed to compensate for how Spirit was to be driven after one of the wheels broke,” Busquets states. “All of those animation cues were respected, so we did animate Spirit differently than Oppy. Then there are differences as to the environments that they were in, and those were kept realistic and true.”
“Early on in the show, [director] Ryan [White] was interested in exploring a range of emotions for these rovers and was doing that in parallel in sound and visual effects. He was trying to come up with a library of plausible looks so that we were not making a caricature. Even when animating the rovers, we observed the limitations of the joints and what is the range of movement. We did cycles of, ‘what does sad or older-looking-moving Oppy look like?’ It was all based on, ‘let’s use what’s in there.’ The most obvious example was using the pan cameras as eyeballs because from a physical position, they do resemble the eyeballs on a person.”
—Ivan Busquets, Visual Effects Supervisor, ILM
Both environments did not share the exact same color palette. “The Spirit side of the planet had more of a bluish hue to it whereas the Oppy side was redder,” reveals Nair. “Also, whenever you see the split-screen, Oppy is on screen left and Spirit is on screen right, and that was maintained throughout the documentary. There was always this visual reference as to who was where, who is doing what and even the direction that they move. Oppy would always go left to right while Spirit was right to left. We built in these little cues to psychologically know that right now you’re looking at Spirit not Oppy. As the story progressed, Spirit had a broken wheel so that helped.”



Adding to the drama was having the rovers get stuck in sandpits and trying to get out.
“The Spirit side of the planet had more of a bluish hue to it whereas the Oppy side was redder. Also, whenever you see the split-screen, Oppy is on screen left and Spirit is on screen right, and that was maintained throughout the documentary. There was always this visual reference as to who was where, who is doing what and even the direction that they move. Oppy would always go left to right while Spirit was right to left. We built in these little cues to psychologically know that right now you’re looking at Spirit not Oppy. As the story progressed, Spirit had a broken wheel so that helped.”
—Abishek Nair, Visual Effects Supervisor, ILM
Four major dust variants were created for Spirit and Oppy. “As the shots progressed, we started running effects simulations and dust maps on it so we could turn them up or down depending on the shots themselves,” Nair notes. There was not a lot of room for creative license. “Normally we would go with what makes for a more cinematic shot, but with this being a documentary we kept it grounded in reality as much as possible,” Busquets states. “A place where we did make a concession was when it came to the speed. The maximum speed of the rovers was something like two inches per second. It became obvious when we started animating that we were not going anywhere. How are we going to tell a story with that?”
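The arithmetic behind that concession is easy to check. A back-of-envelope sketch (our numbers, using the roughly two-inches-per-second figure quoted above):

```python
INCHES_PER_METER = 39.37
top_speed_m_s = 2.0 / INCHES_PER_METER   # ~0.05 m/s

for distance_m in (1, 10, 100):
    minutes = distance_m / top_speed_m_s / 60.0
    print(f"{distance_m:>4} m of travel -> {minutes:6.1f} minutes at top speed")
```

Even a 10-meter move takes more than three minutes of real time, which is why the shots could not simply play the rovers at their true speed.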




A critical part of making the imagery believable was incorporating photographic aberrations such as lens flares.
Since visual effects was a new area for Ryan White, ILM produced storyboards and previs that also aided editorial. “The documentary style of filmmaking is different from feature film,” Nair observes. “We had to make sure that we get some fairly detailed storyboards going for key shots at least and rough storyboards for the rest that we would be doing which would then inform us in terms of the beats, length of the shots and how it’s sitting in the edit. When it came to the particular shot of Oppy getting her wheel stuck in the sand, we had some fairly detailed storyboards, but then we went through quite a bit of postvis animation to get the idea across of the wheel spinning. We also had to work with some clever camera angles that would tell the story. We were working within a timeframe and budget and trying to make sure that visually it was telling the story that was supposed to be told there. There were pockets of sand simulation that we did early on to show the wheel spinning and kicking out of the sand. We showed that to Ryan who was excited about it, and then we brought in all of those little animation cues of Oppy struggling trying to go in reverse and get out of that sandpit.”




The pan cameras on the rovers were treated as if they were eyes, which helped to give them a personality.
Sandstorms had to be simulated. “We had photographic reference of sandstorms on Mars, so we knew exactly what it would look like,” Nair explains. “We’ve done sandstorms before on various movies, but we had to make sure that these would actually happen on Mars: the little electrical storms that happen within them that have bolts of lightning. That’s where we could bring a little bit of drama into the whole thing by having the bolts of lightning and closeups of Oppy staring up at the sandstorm and lightning flashes on her face. There were tons of auxiliary particles flying around the area around her and tons of sand bleeding off her face and solar panels. We did run that through layers of simulations and then threw the whole kitchen sink at it and started peeling back to see what we could use and omit to bring the storytelling back into the whole thing.”
“The number of unique locations, from their landing sites to the journeys, to the different craters that they visit, the amount of nuance and rocks and different type of terrain, everybody involved felt there was something special about building something not based on concept art but scientific data. However, you want to make it as photographic and exciting as possible. There was a lot of pride I saw in the team in doing that.”
—Ivan Busquets, Visual Effects Supervisor, ILM
The edit was a work in progress. “What was challenging and unique about this project was being involved from an early stage and they hadn’t finished all of their interviews,” Busquets remarks. “Ryan had some ideas for the chapters that he wanted to cover. We helped to inform the edit as much as the edit helped to inform our work. It made things a bit slower to progress, and we had to rely on rough animation and previs to feed editorial.”




Four major dust variants were created for Spirit and Oppy.
No practical plates were shot for the 34 minutes of full CG animation. “We asked to be sent to Mars to shoot some plates and were told that it would be too expensive!” laughs Busquets. “We did get a ton of data from NASA including satellite images from orbiters that have been sent to Mars. It was the equivalent of Google Earth but at a lower resolution. All of the environments that you see in the documentary are based on the real locations the rovers visited.” ILM had to fill in the gaps and could not use the actual imagery because it was not of high enough resolution for 4K. A cool moment to create was Oppy taking a selfie. “It was a fun sequence to do, and we followed the same arc of the cameras so Oppy could actually take the photographs,” Nair comments. “We did have reference of the separate images that were stitched together. We got our snapshots within that particular shot very close to what was actually taken. In the documentary we made it black and white and grainier compared to the other shots.”
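The real rover selfies were mosaics assembled from overlapping arm-camera frames, and panorama stitching illustrates the idea. A rough sketch using OpenCV's high-level Stitcher – hypothetical file names, and in no way ILM's pipeline:

```python
import cv2

# Hypothetical overlapping stills from an arm-mounted camera sweep.
paths = ["arm_frame_01.jpg", "arm_frame_02.jpg", "arm_frame_03.jpg"]
images = [cv2.imread(p) for p in paths]

stitcher = cv2.Stitcher.create(cv2.Stitcher_PANORAMA)
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    # Echo the documentary's treatment: black-and-white and grainier.
    gray = cv2.cvtColor(mosaic, cv2.COLOR_BGR2GRAY)
    cv2.imwrite("rover_selfie.png", gray)
```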




Electrical storms had to be incorporated inside of the sandstorms that occur on Mars.
One of the most complex shots was depicting the solar flares hitting the spacecraft as it travels to Mars. “As an idea, it was storyboarded in a simple manner, and when we started looking at it we figured out that it wasn’t going to show the scale and the distance that these flares would travel or the danger that the rovers were in,” Nair states. “Working the timing of the camera move to the sun with the burst of flare energy… The camera takes over from there, follows the flare energy hitting the spacecraft and swivels around. That whole thing took a bit to plan out. It was a leap of faith as well because Ryan didn’t want it to look too much like Transformers in a way. We had to keep things still believable but at the same time play around a little bit and have some fun with the whole thing. It’s one of our longest shots in the show as well. As for the other challenges, it was a documentary format where the edit was fluid, and we had to make sure it would conform with our timeline and the scope of work that was left to do.”
The environmental work was extensive. “The number of unique locations, from their landing sites to the journeys, to the different craters that they visit, the amount of nuance and rocks and different type of terrain, everybody involved felt there was something special about building something not based on concept art but scientific data,” Busquets remarks. “However, you want to make it as photographic and exciting as possible. There was a lot of pride I saw in the team in doing that.”
By TREVOR HOGG
Images courtesy of Marvel Studios and Digital Domain.

Production Special Effects Supervisor Daniel Sudick and his special effects teams built a 30- to 40-foot section of the boat deck that was 15 to 20 feet up in the air.
Third acts are never easy, as this is what the audience has been waiting for, and when it comes to the Marvel Cinematic Universe there has been a plethora of epic battles, making it even more difficult to come up with something that has not been seen before. In Black Panther: Wakanda Forever, the Wakandans take a ship out into the ocean and successfully lure the underwater-dwelling Talokanil into a massive confrontation while the newly crowned Black Panther does single combat with Namor in a desert environment. States Hanzhi Tang, VFX Supervisor at Digital Domain, “I knew this movie was important, and having met [director] Ryan Coogler on set, you want him to succeed as he’s the nicest person. I’ve known [Marvel Studios VFX Supervisor] Geoffrey Baumann for a long time, so we had already a good working relationship; he trusted us with trying to help him navigate whatever surprises would come up.”

The rappelling of the Dora Milaje was influenced by a dance troupe.

A back-and-forth between Digital Domain and Wētā FX ensured that their shots were seamlessly integrated with each other.
“We started off in the Atlantic Ocean and shared some parts with Wētā FX, which had already figured out the underwater and deep ocean looks. Digital Domain kept to above the surface and a couple of shots where characters had to go in and out of the water. There was a back and forth between us so as to synchronize with each other as to the camera and the location of the water plane. Then we would do everything from the water surface and upwards. Then one of us had to do the final composite and blend the two together. Luckily, when the camera hit that water plane it acts like a wipe.”
—Hanzhi Tang, VFX Supervisor, Digital Domain
“We started off in the Atlantic Ocean and shared some parts with Wētā FX, which had already figured out the underwater and deep ocean looks,” Tang explains. “Digital Domain kept to above the surface and a couple of shots where characters had to go in and out of the water.” For some of the underwater shots, Wētā FX provided the virtual camera as a first pass. “There was a back and forth between us so as to synchronize with each other as to the camera and the location of the water plane,” Tang details. “Then we would do everything from the water surface and upwards. Then one of us had to do the final composite and blend the two together. Luckily, when the camera hit that water plane it acts like a wipe.” A giant set piece was constructed for the boat. “A 30- to 40-foot section of the boat deck was built that was 15 to 20 feet up in the air,” reveals Tang. “It was built as a rig that could tilt up to 45 degrees, because there is a point in the movie where the boat gets attacked and almost rolls over. People could slide down the deck. [Production Special Effects Supervisor] Dan Sudick and his special effects team had built one big in-ground tank to film people in the water, and separately this deck. As far as the water interaction on the deck, it was all CG.”
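As a toy illustration of that hand-off – stand-in arrays, not Digital Domain's compositing setup – the water plane acting as a wipe is essentially a matte that chooses between the above-surface and underwater renders per pixel:

```python
import numpy as np

h, w = 4, 6                              # tiny stand-in frame
above = np.full((h, w, 3), 0.8)          # above-surface render (Digital Domain side)
below = np.full((h, w, 3), 0.2)          # underwater render (Wētā FX side)

waterline_row = 2                        # where the water plane crosses the frame
matte = (np.arange(h) < waterline_row)[:, None, None]
final = np.where(matte, above, below)    # the plane acts as a wipe between renders
print(final[:, 0, 0])                    # [0.8 0.8 0.2 0.2]
```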

The Talokanil were supposed to have bare feet, which were added digitally because performers had to wear footwear for safety.

A major task was digitally adding the rebreather masks worn by the Talokanil.
Plates were shot for the foreground elements with various bluescreens placed in the background. “All the way back was a set extension that was blended into the foreground,” Tang remarks. “Everyone in the background is a digital double.” The rappelling of the Dora Milaje was influenced by a dance troupe. Describes Tang, “There is a vertical wall where everyone does dance moves on cables that was the inspiration for the Dora Milaje being suspended. The whole thing was shot horizontally with them dangling off of cables. It was incredible.” The skies were art directed. “There was a lot of picking and choosing of the type of day and clouds,” Tang comments. “It ended up being a combination of CG and matte-painted clouds. The style of the on-set lighting by Autumn Durald Arkapaw, the cinematographer, was soft, and she would wrap the lighting around characters and give them a lovely sheen on their skin.”
“A 30- to 40-foot section of the boat deck was built that was 15 to 20 feet up in the air. It was built as a rig that could tilt up to 45 degrees, because there is a point in the movie where the boat gets attacked and almost rolls over. People could slide down the deck. [Production Special Effects Supervisor] Dan Sudick and his team built one big in-ground tank to film people in the water, and separately this deck. As far as the water interaction on the deck, it was all CG.”
—Hanzhi Tang, VFX Supervisor, Digital Domain

Shuri transports a captured Namor to a desert environment where they engage in single combat.
Blue-skinned characters, such as the Talokanil, against bluescreen are always a fun challenge, Tang reports. “Greenscreen would have been worse with the amount of spill, given that it was meant to be a blue-sky reflection,” he states. “We ended up doing roto on everything. The set is 20 feet in the air, people are being sprayed down with water, and there are all of these cables that need to be painted out. When the Talokanil board, you have 20 stunt people climbing the boat, and there’s no perimeter fence around this thing. For safety reasons, everyone had to wear decent footwear, and these characters were meant to be barefoot. They did not do the rubber feet that Steve Rogers wears in Captain America: The First Avenger, so we ended up tracking and blending CG for feet replacements. We also had to track and replace rebreather masks because the Talokanil wear them when they’re out of the water. It fits over the mouth and the gills on the neck. Those were impractical to wear, be running around and trying to perform the stunts in.”
“All the way back [for the rappelling of the Dora Milaje sequence] was a set extension that was blended into the foreground. Everyone in the background is a digital double. There is a vertical wall where everyone does dance moves on cables that was the inspiration for the Dora Milaje being suspended. The whole thing was shot horizontally with them dangling off of cables. It was incredible.”
—Hanzhi Tang, VFX Supervisor, Digital Domain

Bluescreen made more sense than greenscreen as it provided the correct blue spill that would have been caused by the sky.
Namor (Tenoch Huerta) is captured and Shuri flies him off into the desert because he gains his power from the ocean. “They have a one-on-one fight, and there were a lot of set extensions and cleanup of the background,” Tang remarks. “We put the sky and the sun in the right place.” A flamethrower was utilized on set for the desert explosion. “But it wasn’t anywhere near the size of the actual explosion in the movie. It was used for exposure, color and scale reference of how that size flame appears through the camera,” Tang says. The flying shots of Namor were sometimes the most difficult to achieve, he adds. “In some of the shots we would have captured Tenoch Huerta on bluescreen, and he’ll do some closeup acting,” Tang observes. “We tried some wirework that looked too much like wirework and ended up doing a half-body replacement from the chest down. They captured a lot of shots with him in a tuning fork and being pulled around the set, so it was a lot of paint-out for the tuning fork and all of the gear on it. It’s suitable for waist-up shots. Tenoch just had the pair of shorts, which means there’s not much to hide, and when doing extensive paint-out on skin, you can end up with large patches that can be easily seen.”
By IAN FAILES

Several sequences featuring the Giganotosaurus in Jurassic World Dominion made use of an animatronic head section on set. (Image courtesy of Universal Pictures and ILM)

For final shots, ILM would often retain the entire animatronic head of the dinosaur and add in the rest of the dinosaur body. (Image courtesy of Universal Pictures and ILM)
How do you put yourself into the shoes, or feet, of a Giganotosaurus? What about an advanced chimpanzee or a bipedal hippo god? And how do you tackle a curious baby tree-like humanoid? These are all computer-generated characters with very different personalities featured in films and shows released in 2022, and ones that needed to be brought to life in part by teams of animators. Here, animation heads leading the charge at ILM, Wētā FX, Framestore and Luma Pictures share how their particular creature was crafted and what they had to do to find the essence of that character.
When your antagonist is a Giganotosaurus
When Jurassic World Dominion Animation Supervisor Jance Rubinchik was discussing with director Colin Trevorrow how the film’s dinosaurs would be brought to the screen, he reflected on how the animals in the first Jurassic Park “weren’t villains, they were just animals. For example, the T-Rex is just curious about the jeep, and he’s flipping it over, stepping on it and biting pieces off of it. He’s not trying to kill the kids. I said to Colin, ‘Can we go back to our main dinosaur – the Giganotosaurus – just being an animal?’ Let’s explore the Giga being an animal and not just being a monster for monster’s sake. It was more naturalistic.”
With Trevorrow’s approval for this approach, Rubinchik began work on the film embedded with the previs and postvis teams at Proof Inc., while character designs also continued. This work fed into both the animatronic builds by John Nolan Studio and Industrial Light & Magic’s CG Giganotosaurus. “Early on, we did lots of walk cycles, run cycles and behavior tests. I personally did tests where the Giga wandered out from in between some trees and was shaking its head and snorting and looking around.”
Another aspect of the Giganotosaurus was that ILM would often be adding to a practical/animatronic head section with the remainder of the dinosaur in CG. For Rubinchik, it meant that the overall Giga performance was also heavily influenced by what could be achieved on set. Comments Rubinchik, “What I didn’t want to have were these practical dinosaurs that tend to be a little slower moving and restricted, simply from the fact they are massive hydraulic machines, that then intercut with fast and agile CG dinosaurs. It really screams, ‘This is what is CG and this is what’s practical.’”

A full-motion CG Giganotosaurus crafted by ILM gives chase. (Image courtesy of Universal Pictures and ILM)

Animation Supervisor Jance Rubinchik had a hand in ensuring that the movement of animatronic dinosaurs made by John Nolan Studio, as shown here, was matched by their CG counterparts. (Image courtesy of Universal Pictures and ILM)
“Indeed,” Rubinchik adds, “sometimes as animators, you have all these controls and you want to use every single control that you have. You want to get as much overlap and jiggle and bounce and follow through as you can because we’re animators and that’s the fun of animating. But having something that introduced restraint for us, which was the practical on-set dinosaurs, meant we were more careful and subtler in our CG animation. There’s also a lot of fun and unexpected things that happen with the actual animatronics. It might get some shakes or twitches, and that stuff was great. We really added that stuff into Giga wherever we could.”

For Pogo shots in season 3 of The Umbrella Academy, on-set plates featured actor Ken Hall. (Image courtesy of Netflix and Wētā FX)

Voice performance for Pogo was provided by Adam Godley (right), while Wētā FX animators also contributed additional performance capture. (Image courtesy of Netflix and Wētā FX)

The final Pogo shot. (Image courtesy of Netflix and Wētā FX)
“What I didn’t want to have were these practical dinosaurs that tend to be a little slower moving and restricted, simply from the fact they are massive hydraulic machines, that then intercut with fast and agile CG dinosaurs. It really screams, ‘This is what is CG and this is what’s practical.’ … But having something that introduced some restraint for us, which was the practical on-set dinosaurs, meant we were more careful and subtler in our CG animation. There’s also a lot of fun and unexpected things that happen with the actual animatronics. It might get some shakes or twitches, and that stuff was great. We really added that stuff into Giga wherever we could.”
—Jance Rubinchik, Animation Supervisor, Jurassic World Dominion
This extended even to the point of replicating the animatronic joint placements from the John Nolan Studio creatures into ILM’s CG versions. “All of the pivots for the neck, the head, the torso and the jaw were in the exact same place as they were in the CG puppet,” Rubinchik outlines. “It meant they would pivot from the same place. I was so happy with how that sequence turned out with all the unexpected little tics and movements that informed what we did.”
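To make the pivot-matching idea concrete, here is a minimal sketch of how surveyed animatronic hinge points might be applied to a CG puppet using Maya’s Python API; the node names and coordinate values are hypothetical placeholders, not ILM’s actual rig data.

```python
# Hypothetical sketch: snap a CG puppet's pivot transforms to world-space
# hinge positions surveyed from the animatronic build (placeholder values, cm).
import maya.cmds as cmds

ANIMATRONIC_PIVOTS = {
    "giga_neck_pivot":  (0.0, 412.0, 95.0),
    "giga_head_pivot":  (0.0, 455.0, 160.0),
    "giga_torso_pivot": (0.0, 380.0, -40.0),
    "giga_jaw_pivot":   (0.0, 440.0, 175.0),
}

for node, position in ANIMATRONIC_PIVOTS.items():
    if cmds.objExists(node):
        # Place the rig's pivot transform at the animatronic's hinge point
        # so both puppets rotate through identical arcs.
        cmds.xform(node, worldSpace=True, translation=position)
```

With both rigs hinging from the same world-space points, animatronic plates and CG takeovers swing through matching arcs, which is part of what lets the two intercut invisibly.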
Pogo reimagined
The advanced chimpanzee Pogo is a CG character viewers first met in Seasons 1 and 2 of Netflix’s The Umbrella Academy, both as an assistant to Sir Reginald Hargreeves and as a baby chimp. The most recent Season 3 of the show sees Pogo appear in an alternative timeline as a ‘cooler’ version of the character who even becomes a biker and tattoo artist. Wētā FX created each incarnation of Pogo, drawing upon the voice of Adam Godley and the on-set performances of Ken Hall and other stunt performers and stand-ins to make the final creature.
Having ‘lived’ with Pogo in his older, more frail form in the past seasons, Wētā FX Animation Supervisor Aidan Martin and his team now had the chance to work on a character who was capable of a lot more physically, including Kung Fu. “All of a sudden, Pogo’s been in the gym. He’s juiced up. He’s doubled his shoulder muscle mass and his arms are a lot bigger, so the way that he carries himself is completely different. His attitude has changed, too. He’s more gnarled and he’s a lot more jaded about the world,” Martin says.
From an animation point of view, Wētā FX animators took that new physicality into the performance and reflected it in postures and movements. “It was even things like the way he looks at somebody now,” Martin explains. “Early on in Season 1, when he looks at people, he’s very sincere. He was like a loving grandfather. Now, he’s a bit fed up with it all and he’s not looking at you with good intentions. He thinks you’re an idiot and he doesn’t have time for it. That’s where he’s coming from behind the mask.”

Pogo is a grittier character in this latest season, even working as a tattoo artist. (Image courtesy of Netflix and Wētā FX)
“All of a sudden, Pogo’s been in the gym. He’s juiced up. He’s doubled his shoulder muscle mass and his arms are a lot bigger, so the way that he carries himself is completely different. His attitude has changed, too. He’s more gnarled and he’s a lot more jaded about the world.”
—Aidan Martin, Animation Supervisor, Wētā FX
One of the VFX studio’s toughest tasks on this new Pogo remained the character’s eyes. “Eyeline is everything, especially with chimps,” says Martin, who also had experience on the Planet of the Apes films at Wētā FX. “When you’re trying to do a more anthropomorphized performance, chimps with their eyelines and brows do not work very well compared to humans because their eyes are just so far back and their brows sit out so far. For example, as soon as you have the head tilt down and then try to make them look up, you can lose their eyes completely. Balancing the eyeline and the head angle is really difficult, especially on chimps.”
“Even once you’ve got that working, getting the mouth shapes to read properly is also tricky,” Martin continues. “There are some really tricky shapes, like a ‘V’ and an ‘F,’ that are incredibly hard on a chimp versus a human. Their mouths are almost twice as wide as our mouths. Humans look really good when they’re talking softly, but getting a chimp to do that, it looks like they’re either just mumbling or they get the coconut mouth, like two halves clacking together, and everything’s just too big. We used traditional animation techniques here, basically a sheet of phoneme expressions for Pogo’s mouth.”
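To illustrate what such a phoneme sheet can amount to in practice, here is a small, hypothetical Python sketch; the shape names and weights are invented for the example and are not Wētā FX’s rig data.

```python
# Hypothetical phoneme "sheet" as data: each phoneme maps to blendshape
# weights tuned for a chimp's much wider mouth (all values illustrative).
PHONEME_SHEET = {
    "V":  {"lipFunnel": 0.20, "lowerLipBiteIn": 0.70, "jawOpen": 0.10},
    "F":  {"lipFunnel": 0.15, "lowerLipBiteIn": 0.65, "jawOpen": 0.10},
    "AA": {"jawOpen": 0.45, "mouthWide": 0.20},  # kept small to avoid "coconut mouth"
    "M":  {"lipsTogether": 1.00, "jawOpen": 0.00},
}

def pose_for_phoneme(phoneme: str, strength: float = 1.0) -> dict:
    """Return the sheet's blendshape weights, scaled and clamped to [0, 1]."""
    base = PHONEME_SHEET.get(phoneme, {})
    return {shape: max(0.0, min(1.0, weight * strength))
            for shape, weight in base.items()}

print(pose_for_phoneme("V", strength=0.8))
```

Keeping the jaw values deliberately low on the hard consonants is one way a sheet like this can counteract the “two halves clacking together” look Martin describes.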
Going hyper (or hippo) realistic
Finding the performance for a CG-animated character often happens very early on in a production, even before any live action is shot. In the case of the Marvel Studios series Moon Knight’s slightly awkward hippo god Taweret, it began when Framestore was tasked with translating the casting audition of voice and on-set performer Antonia Salib into a piece of test animation.

Actor Antonia Salib performs the role of hippo god Taweret on a bluescreen set. (Image courtesy of Marvel and Framestore)

Final shot of Taweret by Framestore. (Image courtesy of Marvel and Framestore)
“The Production Visual Effects Supervisor, Sean Andrew Faden, asked us to put something together as if it was Taweret auditioning for the role,” relates Framestore Animation Supervisor Chris Hurtt. “We made this classic blooper-like demo where we cut it up and had the beeps and even a set with a boom mic. We would match to Antonia’s performance with keyframe animation just to find the right tone. We would later have to go from her height to an eight- or nine-foot-tall hippo, which changed things, but it was a great start.”

Salib wore an extender stick during the shoot (here with Oscar Isaac) to reach the appropriate height of Taweret. (Image courtesy of Marvel and Framestore)

Framestore had to solve both a hippo look in bipedal form and the realistic motion of hair and costume for the final character. (Image courtesy of Marvel and Framestore)
“We looked at a lot of reference of real hippos and asked ourselves, ‘What can we take from the face so that this doesn’t just feel like it’s only moving like a human face that’s in a hippo shape?’ We found there were these large fat sacks in the corners that we could move, and it made everything feel a little more hippo-y and not so human-y. Probably the biggest challenge on her was getting Taweret to go from a human to a hippo.”
—Chris Hurtt, Animation Supervisor, Framestore
During filming of the actual episode scenes, Salib would perform Taweret in costume with the other actors and with an extender stick and ball markers to represent the real height of the character. As Hurtt describes, Framestore took that as reference and looked to find the right kind of ‘hippoisms’ on Salib. “We looked at a lot of reference of real hippos and asked ourselves, ‘What can we take from the face so that this doesn’t just feel like it’s only moving like a human face that’s in a hippo shape?’ We found there were these large fat sacks in the corners that we could move, and it made everything feel a little more hippo-y and not so human-y.”
“Probably the biggest challenge on her was getting Taweret to go from a human to a hippo,” adds Hurtt, who also praises the Framestore modeling, rigging and texturing teams in building the character. “The main thing for animation was that we had to observe what the muscles and the FACS shapes were doing on Antonia, and then map those to the character. Still, you’re trying to hit key expressions without it looking too cartoony.”
To help realize the motion of Taweret’s face shapes in the most believable manner possible, Framestore’s animators relied on an in-house machine learning tool. “The tool does a dynamic simulation like you would with, say, hair, but instead it would drive those face shapes,” Hurtt notes. “It’s not actually super-noticeable, but it’s one of those things that, if you didn’t have it there, particularly with such a huge character, she would’ve felt very much like papier-mâché when she turned her head.”
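The underlying effect – face-shape weights that lag and settle like flesh rather than snapping to their keyed values – can be approximated with a damped spring per shape. The Python below is a conceptual sketch under that assumption, not Framestore’s tool, which uses machine learning rather than an explicit spring model.

```python
import numpy as np

def simulate_shape_weights(targets, stiffness=120.0, damping=22.0, dt=1.0 / 24.0):
    """targets: (frames, shapes) array of animator-keyed face-shape weights.
    Each weight chases its key like a damped spring, adding lag and settle."""
    targets = np.asarray(targets, dtype=float)
    weights = targets[0].copy()           # current simulated weight per shape
    velocity = np.zeros_like(weights)
    out = np.empty_like(targets)
    for frame, target in enumerate(targets):
        accel = stiffness * (target - weights) - damping * velocity
        velocity += accel * dt            # semi-implicit Euler step
        weights += velocity * dt
        out[frame] = np.clip(weights, 0.0, 1.0)
    return out
```

On a fast head turn, the simulated weights briefly overshoot and settle, which is exactly the secondary motion that keeps a huge head from reading as papier-mâché.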
The enduring, endearing allure of Groot
The Marvel Studios Guardians of the Galaxy films have introduced several CG-animated characters, one of the most beloved being Baby Groot. He now stars in his own series of animated shorts called I Am Groot, directed by Kirsten Lepore, with visual effects and animation by Luma Pictures. The fully CG shorts started with a script and boarding process driven by Lepore, according to Luma Pictures Animation Director Raphael Pimentel.

Luma Pictures Animation Director Raphael Pimentel donned an Xsens suit (and Baby Groot mask) for motion capture reference at Luma Pictures during the making of I Am Groot. (Image courtesy of Luma Pictures)

The behavior settled on for Baby Groot, which had been featured in previous Marvel projects, was always ‘endearing.’ (Image courtesy of Marvel and Luma Pictures)
“There were scripts early on showing what the stories were going to be about. These quickly transitioned into boards. Then, Kirsten would provide the boards to us with sound. She would put them to music as well, which was important to get the vibe. These would then be turned over to us as an animatic of those boards with the timing and sound that Kirsten envisioned, which was pretty spot-on to the final result.”

Baby Groot’s mud bath in one of the shorts required extensive cooperation between the animation and FX teams at Luma Pictures. (Image courtesy of Marvel and Luma Pictures)

Baby Groot still delivers only one line: “I am Groot.” (Image courtesy of Marvel and Luma Pictures)
In terms of finding the ideal style of character animation for Groot in the shorts, Luma Pictures shot motion capture as reference for its animators, which was used in conjunction with video reference that Lepore also provided, and vid-ref shot by the animators themselves. The motion capture mainly took the form of Pimentel performing in an Xsens suit. “We went to Luma and identified the key shots that we wanted to do for every episode,” Pimentel recalls. “We would do one episode each day. As we were going through those key shots, we ended up shooting mocap for everything. Kirsten was there telling me the emotions that she wanted Groot to be feeling at that specific point in time. And we said, ‘Let’s keep going, let’s keep going.’ Next thing you know, we actually shot mocap for everything to provide reference for the animators.”
In one of the shorts, “Groot Takes a Bath,” a mud bath results in the growth of many leaves on the character, which he soon finds ways to groom in different styles. This necessitated a close collaboration between animation and effects at Luma. “That was a technical challenge for us,” Pimentel discusses. “In order for Kirsten to see how the leaves were behaving, she would usually have to wait until the effects pass. We built an animation rig that was very robust that would get the look as close to final as possible through animation.”
The final behavior settled on for Baby Groot in the shorts was “endearing,” Pimentel notes. Despite Groot’s temper tantrums and ups and downs, he was still kept sweet at all times. From an animation standpoint, that meant ensuring the character’s body and facial performances stayed within limits. “It’s easy to start dialing in the brows to be angry, but we had to keep the brows soft at all times. And then his eyes are always wide to the world. Regardless of what’s happening to him, his eyes are always wide to the world, much like a kid is.”
By TREVOR HOGG
Images courtesy of MUBI.





Partial set build, background plate photography, 3D model of mountaintop and depth pass are combined to create an aerial shot.
“The mountain [where Seo-rae’s husband falls to his death] is 100% CG, but the background where that peak is situated is a real scene that we shot. There are a ton of mountains in Korea, so it’s a composite of these two. The two main locations of the mountains and sea have this unique form that goes up and down and up and down. We wanted to repeatedly show such up and down patterns, like a wave in an ocean or the landscape of the mountain range, but at the same time we didn’t want to make it too obvious for the audience to say, ‘Ah-ha! I see that.’”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Ever since the release of Oldboy, Lee Joen-hyoung, who serves as the CEO and VFX Supervisor at Korean VFX studio 4th Creative Party, has been collaborating with filmmaker Park Chan-wook. Decision to Leave, which revolves around a detective becoming infatuated with a murder suspect, might seem a less likely candidate for extensive digital augmentation because of its subject matter, but this was not the case. “This was the easiest read of all of director Park’s screenplays,” Lee recalls. “However, in the end, the work that I had to do was the toughest because, unlike The Handmaiden or Oldboy, for which I was able to imagine the visuals and mise-en-scène straightaway, Decision to Leave was so ambiguous.” About 580 shots were created over a period of six months. “We were done at the end of 2021, but then we had some time left before Cannes and the actual release, so we did some detailing work with only a handful of people to make it even more perfect,” Lee remarks.


Invisible effects include adding photographs to the wall devoted to unsolved crimes, created by Hae-joon.
Unwanted natural elements had to be removed from the finale, which took place in a beach environment that was, in reality, three different locations combined. “Jang Hae-joon’s portion of it was shot at the end of fall, entering into the winter season, so we started to have some snow,” Lee states. “For the sake of continuity, we had to take out the snow and also had to work on the mountain that you can see from afar. Even though we had to remove the snow and wind, Tang Wei, the actor, still felt those harsh conditions, which reflected the emotional state of Hae-joon. With Son Seo-rae’s portion, there was no problem because the time of day was different.” Atmospherics were also digitally added into shots. “We had to have mist in the latter part of the film because it’s set against Ipo, which is famous for mist and being humid all of the time,” Lee adds. “Mist had to be present in almost all of the outdoor scenes, but we had to define how much for a particular scene.”
“Director Park likes to use insects in his movies, such as the ants in Oldboy or the mosquitoes in Lady Vengeance or the ladybug in I’m a Cyborg, But That’s OK. I knew even before that he was going to put some kind of insect in this film, too. We already have a vast library filled with insects and their forms and movements, so we were well-equipped to execute that.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party




Insects are always featured in the films of Park Chan-wook, with CG ants crawling over the face of Seo-rae’s dead husband.
“Hae-joon tries to replicate what Seo-rae would have done to kill her husband, and he goes up the mountain, lies down and looks up. Then Seo-rae’s hand comes in and they hold hands together. Director Park told me that the audience should be able to see the callus on her palm because it’s evidence that she is already an expert climber. It was difficult to visually make that happen because when those two hands meet together the palm becomes a little bit dark, so we had to do several retakes. That one scene was the most challenging for me.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
The mountain where Seo-rae’s husband falls to his death was partially built on a backlot set surrounded by bluescreen. “The mountain is 100% CG, but the background where that peak is situated is a real scene that we shot,” Lee reveals. “There are a ton of mountains in Korea, so it’s a composite of these two. The two main locations of the mountains and sea have this unique form that goes up and down and up and down. We wanted to repeatedly show such up and down patterns, like a wave in an ocean or the landscape of the mountain range, but at the same time we didn’t want to make it too obvious for the audience to say, ‘Ah-ha! I see that.’” Ants crawl over the face of the deceased spouse. “Director Park likes to use insects in his movies, such as the ants in Oldboy or the mosquitoes in Lady Vengeance or the ladybug in I’m a Cyborg, But That’s OK. I knew even before that he was going to put some kind of insect in this film, too. We already have a vast library filled with insects and their forms and movements, so we were well-equipped to execute that,” Lee notes.




The x-ray of an arm and hand transitions into the arm and hand of Hae-joon, emphasizing that he is still thinking of Seo-rae even when having an intimate moment with his wife.
A clever shot transition moves from the x-ray of a hand to the one belonging to Hae-joon as he is having sex with his wife in bed. “I have already accumulated so much experience with Park Chan-wook-esque transitions!” Lee laughs. “We knew what the output should look like because it was worked out in the storyboarding phase and we subsequently shot the source material. That transition was a nod to what we had already done in I’m a Cyborg, But That’s OK. As a long-time collaborator, I already knew what color director Park likes for the x-ray and the timing for the movement of the hand. That transition was the symbol of how Hae-joon is really with Seo-rae even though he is physically next to his wife.” The growing emotional bond between the detective and the murder suspect is visually emphasized in the interrogation scenes. “We wanted the audience to see something happen that is not physically possible,” Lee describes. “For that we had four characters because there were two in front and two in the reflection of [the mirror]. We shot the real people, then the reflection pass, and composited these two together so that we were able to control the focusing and defocusing in order to fully realize director Park’s intention and vision for the scene.”
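As a rough illustration of that two-pass approach, the sketch below blends a separately shot reflection plate over the primary plate, with each pass defocused independently. It is a conceptual Python/NumPy example with invented names, not 4th Creative Party’s actual compositing setup.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def defocus(image, sigma):
    # Blur only when asked; per-axis sigma leaves the color channels untouched.
    return gaussian_filter(image, sigma=(sigma, sigma, 0)) if sigma > 0 else image

def comp_interrogation(plate, reflection_plate, mirror_matte,
                       plate_blur=0.0, reflection_blur=4.0):
    """plate, reflection_plate: float HxWx3 images; mirror_matte: HxWx1 in [0, 1].
    Rack focus between the room and the mirror by swapping the two blur values."""
    foreground = defocus(plate, plate_blur)
    reflection = defocus(reflection_plate, reflection_blur)
    # Inside the mirror matte we see the reflection pass; outside, the room.
    return reflection * mirror_matte + foreground * (1.0 - mirror_matte)
```

Animating the two blur values against each other over the shot is what produces the physically impossible focus pull between the characters and their reflections.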
“That transition [from the x-ray of a hand to the one belonging to Hae-joon as he is having sex with his wife in bed] was a nod to what we had already done in I’m a Cyborg, But That’s OK. As a long-time collaborator, I already knew what color director Park likes for the x-ray and the timing for the movement of the hand. That transition was the symbol of how Hae-joon is really with Seo-rae even though he is physically next to his wife.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party

Bunam Beach in Samcheok, Hakampo Beach and Magumpo Beach in Taean were combined to create the environment that appears in the finale.

Reflections and monitors were manipulated during the interrogation scenes to visually show that Seo-rae and Hae-joon are becoming emotionally closer to each other.
Driving shots are commonplace for Korean television series and films. “For our film, we wanted to make sure that the windshield of the car and the reflections on the car and how the lights will change inside of the space would be recognizable to the audience,” Lee remarks. “We had to make sure that the lighting and reflections worked perfectly; that was our full intention. Since our actors are inside the car, we also wanted to make a realistic look for the interior shot.” An unlikely shot proved to be difficult. Reveals Lee, “Hae-joon tries to replicate what Seo-rae would have done to kill her husband, and he goes up the mountain, lies down and looks up. Then Seo-rae’s hand comes in and they hold hands together. Director Park told me that the audience should be able to see the callus on her palm because it’s evidence that she is already an expert climber. It was difficult to visually make that happen because when those two hands meet together the palm becomes a little bit dark, so we had to do several retakes. That one scene was the most challenging for me.”
By IAN FAILES

Cinesite’s Montreal and Vancouver facilities took on Paws of Fury: The Legend of Hank after the film had already spent several years in development. (Image courtesy of Paramount Pictures)
A common credit on a CG animated feature film or show is ‘visual effects supervisor.’ But wait, don’t VFX supervisors work just in live-action? This is, of course, not so. Indeed, on a CG-animated project, a visual effects supervisor is a crucial role, often helping to formulate the ‘look of picture’ as well as solve many of the technical and artistic hurdles along the way – not so dissimilar from a VFX supervisor working in live-action.
In this roundtable, visual effects supervisors from Walt Disney Animation Studios, Pixar, DreamWorks Animation, Sony Pictures Imageworks and Cinesite Studios explain their tasks on recent animated films and shows and share their thoughts on the key trends hitting their field right now.

Cinesite is one of a limited number of VFX studios that also deliver full CG-animated features and other animated projects. (Image courtesy of Paramount Pictures)
VFX supervisor in live-action versus animation
Alex Parkinson (Visual Effects Supervisor, Cinesite): Often the difference between VFX supervisors in live-action and animation depends on the kind of VFX show you are talking about. Sometimes entire sequences in movies are CG with no live-action aspects at all. In that case, the workflow and the job would be very similar. But mostly the differences between the two jobs reflect the differences between the two mediums. In animation, you tend to have more creative ownership over the final product and way more freedom. Live-action VFX is a more technical and precise process. It is harder in a lot of ways, because you must match existing elements and every shot is put through more scrutiny.
Marlon West (Visual Effects Supervisor, Walt Disney Animation Studios, on Iwájú): While the visual effects supervisor on a live-action film is tasked with leading the team to create images that they can’t go out and capture live, for animation every image is created from ‘scratch.’ So, they are tasked with leading the charge of creating every image technically and creatively.
“Multiple time zones were our main challenge on Iwájú. We have artists in Los Angeles, London, Lagos, Montreal and Vancouver. At one point we had artists in Uganda, Kenya and Zimbabwe as well. While not hugely technical, the biggest challenge was initially getting story, art and editorial teams, who have worked primarily with our internal tools, to work with outside partners.”
—Marlon West, Visual Effects Supervisor, Walt Disney Animation Studios
Of all the tech trends that abound in CG animation right now, Cinesite Visual Effects Supervisor Alex Parkinson believes that real-time game engines have the most potential to revolutionize the industry, particularly in relation to CG cinematography.
“Let’s take a typical shot, the villain reveal. The villain walks towards the camera through darkness and mist, their cape billowing in the wind. As they move into the light more of their form is revealed, and then at the last moment they lift their face towards the light,” Parkinson describes.
“In a traditional CG animation pipeline, this would be created in a serial manner,” Parkinson continues. “The camera would be created in layout using some very rough blocked animation. It would be animated without the cape, which would be added in CFX. FX would then do the mist interaction, then the whole thing would be passed to lighting to make it work. However, what if it doesn’t work? What if the timing is off or lighting cannot make the animation work for the face reveal? The shot goes all the way back down the pipeline for a re-do, then round and round until ultimately we run out of time and have to go with what we have.”
Parkinson believes real-time game engines will change this process. “We will be able to work much more like a live-action shoot does. We will be able to assemble all the pieces we have at any time, see them all in context, and work more in parallel, tweaking each element so they work together harmoniously. The potential for a quality increase in our filmmaking is huge.”

Kylie Kuioka voices Emiko in Paws of Fury: The Legend of Hank. (Image courtesy of Paramount Pictures)
Jane Yen (Visual Effects Supervisor, Pixar, on Lightyear): I see my role at Pixar as overseeing all of the technical work that needs to happen in computer graphics to produce the film visuals. Pixar has historically been at the very forefront of computer graphics and creating CG imagery, so a lot of the Pixar history and the roles that used to be called supervising technical director, and now VFX supervisor, were based on developing new technology to make it even possible.
Matt Baer (Visual Effects Supervisor, DreamWorks Animation, on The Bad Guys): At the creative leadership level, there are more peer relationships for the animation VFX supervisor. The head of story is my peer. The head of layout is my peer. The head of animation is my peer. For example, I’m responsible for making sure our head of animation is set up with the necessary workflows and technologies. Ultimately, the head of animation is creatively responsible for the character animation. I consult during animation development and shot work so our animators have context as to how their work fits into the bigger picture.

A layout frame from a hyperjump sequence in Pixar’s Lightyear. (Image courtesy of Disney/Pixar)

Animation pass on Buzz Lightyear. (Image courtesy of Disney/Pixar)
R. Stirling Duguid (Visual Effects Supervisor, Sony Pictures Imageworks, on The Sea Beast): At Imageworks, compared to other animation companies that are vertically integrated, we have a client/vendor relationship. My primary role is to represent Imageworks to the client as well as the director, the production designer, the art director and their producer. That’s the first step, representing the company. Then it’s about building the team and the framework for the entire production – how we go from storyboards to final composite, laying that out and making sure that we have the right people in charge for each of those departments.

Lighting, FX and rendering are the final steps in producing the final frame. (Image courtesy of Disney/Pixar)
Solving the art and tech and pipeline, in animation
Matt Baer: On The Bad Guys, one of our key visual goals was to pay homage to illustration and 2D animation. Our design philosophy was to use simplification to achieve our stylized and sophisticated look. Anyone who has worked in CG knows this is the opposite of what many of our tools are designed to do! We needed to replace the realistic details of traditional CG techniques with the wonderful hand-drawn imperfections seen in illustrations. Taking this to scale on a feature film required us to build new workflows for every department, allowing them to create images that look handmade – removing superfluous CG details while keeping just enough visual information to guide the eye towards the most important aspects of the shot. Once the image was reduced, our artists added custom line work, textures and 2D effects to every shot in the film.

A first look image from Walt Disney Animation Studios’ Iwájú series. (Image courtesy of Disney)
R. Stirling Duguid: For The Sea Beast, the big technical hurdle was ropes. We had thousands of them to do. Our Animation Supervisor, Joshua Beveridge, said, ‘We have to start from the ground up and build an awesome rope rig because we’re not going to make it through the movie without that.’ We came up with a procedural solution that was designed to be animation-friendly. The idea is that the length of the rope would always stay the same – usually it stretches or is cheated, but ours had the proper hang and everything. Ropes are in so many shots.
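The heart of a non-stretching rope can be shown with a simple constraint solve: treat the rope as a chain of points and repeatedly push neighboring points back to their rest spacing. The Python below is a minimal sketch of that general technique (position-based distance constraints), not Imageworks’ actual animation rig.

```python
import numpy as np

def enforce_rope_length(points, rest_len, iterations=20, pinned=(0,)):
    """points: (n, 3) rope samples; rest_len: target spacing between neighbors.
    Repeatedly corrects each segment so total rope length stays constant."""
    pts = np.asarray(points, dtype=float).copy()
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            segment = pts[i + 1] - pts[i]
            dist = np.linalg.norm(segment)
            if dist < 1e-9:
                continue
            correction = (dist - rest_len) * segment / dist  # stretch error
            if i in pinned:                 # an anchored end absorbs nothing
                pts[i + 1] -= correction
            else:                           # otherwise split the fix evenly
                pts[i] += 0.5 * correction
                pts[i + 1] -= 0.5 * correction
    return pts
```

Run after gravity or animator offsets are applied, a solve like this keeps the length honest, which is what gives a rope its proper hang instead of a rubbery cheat.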
Jane Yen: Lightyear was Pixar’s largest FX film to date. Almost 70% of the film has FX elements in it. As the VFX Supervisor on an animated film, I had to look at every single component of the film, not just FX but also set building and modeling, set dressing, character modeling, building articulation, tailoring, cloth simulation, hair grooming – and that’s just the asset-building side. Then we have lighting and shading. I’m sure there’s some element in there I missed, but you can kind of get the picture that on an animated film, every single component of every visual thing that is on the screen, we had to account for.

A frame from Walt Disney Animation Studios’ Encanto, on which Marlon West served as Head of Effects Animation. (Image courtesy of Disney)
Among the many technical hurdles Sony Pictures Imageworks had to conquer in The Sea Beast was realizing the distinctive crease lines and wrinkles on several of the characters’ faces. Usually, such wrinkle-like features are modeled into the mesh or painted in as texture detail.
Looking to capitalize on earlier work done at Imageworks with inklines on Spider-Man: Into the Spider-Verse, Visual Effects Supervisor R. Stirling Duguid and his team developed a tool called CreaseLines that gave animators the ability to easily, dynamically create and control curves on faces to define the right emotive facial creases.
“Normally if you model things like that, it requires a high-density mesh, or you have to use displacement maps, which are hard for an animator to visualize,” Duguid explains. “Our Animation Supervisor, Joshua Beveridge, had this idea for crease lines, which came from Spider-Verse. It was about thinking in the animator’s shoes, dealing with the director, getting a note and finding the quickest way to address the note.
“CreaseLines gave us the flexibility to move those lines and not be constrained by the topology, as far as how dense the mesh was. This let us directly drive displacement of the face meshes. It was a real win.”
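The concept as Duguid describes it can be sketched briefly: animator-placed curve samples push nearby vertices in along their normals with a soft falloff, independent of how dense the mesh is. The NumPy example below is a hypothetical illustration of that idea, not the actual CreaseLines code.

```python
import numpy as np

def crease_displacement(verts, normals, curve_pts, depth=0.15, radius=0.6):
    """verts, normals: (n, 3) arrays; curve_pts: (m, 3) samples of a crease curve.
    Returns vertices pushed inward near the curve to carve a facial crease."""
    verts = np.asarray(verts, dtype=float)
    normals = np.asarray(normals, dtype=float)
    curve_pts = np.asarray(curve_pts, dtype=float)
    # Distance from every vertex to its nearest curve sample.
    dists = np.linalg.norm(verts[:, None, :] - curve_pts[None, :, :], axis=2).min(axis=1)
    falloff = np.clip(1.0 - dists / radius, 0.0, 1.0) ** 2   # soft shoulder
    return verts - normals * (depth * falloff)[:, None]      # crease digs inward
```

Because the curve, not the topology, drives the displacement, moving a crease in response to a director’s note is a matter of sliding the curve samples rather than re-modeling the face.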
“While animation has always sought to create new and imaginative worlds, the last few years have seen a rapid increase in the variety of visual styles produced across the industry. Despite this increase, we’ve only barely cracked open the visual possibilities in animation, which really excites me for the future.”
—Matt Baer, Visual Effects Supervisor, DreamWorks Animation
Marlon West: Multiple time zones were our main challenge on Iwájú. We have artists in Los Angeles, London, Lagos, Montreal and Vancouver. At one point, we had artists in Uganda, Kenya and Zimbabwe as well. While not hugely technical, the biggest challenge was initially getting story, art and editorial teams, who have worked primarily with our internal tools, to work with outside partners.
Alex Parkinson: Being an independent studio, we must find ways to match the quality of big studio movies as closely as we can, and that is a bar that is constantly moving, so our tools and workflow must constantly evolve to keep up. Often our problem is stylization. For example, three of our recent movies, Riverdance: The Animated Feature, Paws of Fury: The Legend of Hank and Hitpig, have featured sequences with fast-flowing water. Because we use tools developed primarily for live-action FX, they tend to produce photorealistic results. They don’t fit in our world, so we must find ways to make natural elements feel more ‘cartoony.’

Mirabel, voiced by Stephanie Beatriz, in Encanto. The fireworks were crafted as effects animation. (Image courtesy of Disney)
Stylization: a major trend in animation
Alex Parkinson: The use of non-photorealistic rendering, or NPR, exploded after Spider-Man: Into the Spider-Verse. That showed the potential for what a CG-animated movie could be. I see it as part of the maturing of our industry. If you think about 2D animation and all the looks and styles that it covers, from an episode of The Simpsons to crazy anime action, to the work of Cartoon Saloon, it is so varied and creative. CG animation is a very young art form, and up until now has tended to stay within the styles it was born from, like Toy Story, Shrek, etc. That is to say, more photoreal. 3D animation is branching out, experimenting, and developing all new NPR techniques – that’s very exciting.
R. Stirling Duguid: Spider-Man: Into the Spider-Verse was a pretty big splash for Imageworks as far as doing a whole movie like that. Then look at the evolution into The Mitchells vs the Machines, where we also took it to a really nice place. And, actually, it was quite different. You can clearly see the difference between Mitchells and Spider-Verse, but it was using a lot of the same technology.

A storyboard frame from a driving sequence in DreamWorks Animation’s The Bad Guys. (Image courtesy of Universal Pictures and DreamWorks Animation)

After storyboard, the studio moves to previs and early animation. (Image courtesy of Universal Pictures and DreamWorks Animation)

The final rendered frame. On The Bad Guys, DreamWorks Animation injected an enhanced level of stylized look and feel to the final images, partly to reflect the source material. (Image courtesy of Universal Pictures and DreamWorks Animation)
Jane Yen: I think we are now going to see movies come out that have a much more stylized and ‘pushed’ look. I think you’ll see that in our next feature film, Elemental. Even though Lightyear may not have pushed that spectrum, specifically, I think the industry is leaning that way, and I’m excited to see what comes out of it.
“Being an independent studio, we must find ways to match the quality of big studio movies as closely as we can, and that is a bar that is constantly moving, so our tools and workflow must constantly evolve to keep up. Often our problem is stylization. For example, three of our recent movies, Riverdance: The Animated Feature, Paws of Fury: The Legend of Hank and Hitpig have featured sequences with fast-flowing water. Because we use tools developed primarily for live-action FX, they tend to produce photorealistic results. They don’t fit in our world, so we must find ways to make natural elements feel more ‘cartoony.’”
—Alex Parkinson, Visual Effects Supervisor, Cinesite Studios

For The Sea Beast, Imageworks developed a new animation rig to deal with the many different kinds of ropes seen in the film, allowing them to stretch and hang and be animated as realistically as possible. (Image courtesy of Netflix and Sony Pictures Imageworks)

A face-crease-lines tool helped the studio dynamically create and control curves on faces to define the right emotive facial creases. (Image courtesy of Netflix and Sony Pictures Imageworks)
Marlon West: Stylization, supporting production design and character animation, has been very important for us at Walt Disney Animation Studios. We have endeavored to share 2D sensibilities with team members who started their careers creating images in CG. Those classic animation principles – overlap, staging, anticipation and so on – are just as valued by Walt Disney effects animators as they are by character animators.

A final frame from The Sea Beast. (Image courtesy of Netflix and Sony Pictures Imageworks)
Matt Baer: While animation has always sought to create new and imaginative worlds, the last few years have seen a rapid increase in the variety of visual styles produced across the industry. Despite this increase, we’ve only barely cracked open the visual possibilities in animation, which really excites me for the future.
By TREVOR HOGG

Visual Effects Supervisor Ryan Tudhope honored and preserved the messiness of the practical aerial photography for Top Gun: Maverick, which in turn made the 2,400 visual effects shots seamless. (Image courtesy of Paramount Pictures)
There are strange variables in play in the 2023 Oscar race for Best Visual Effects. For one, filmmaker Taika Waititi and actress Tessa Thompson did a scene breakdown video for Vanity Fair and made fun of some of the visual effects work in Thor: Love and Thunder – without mentioning the groundbreaking camera and lighting techniques utilized by Marvel Studios Visual Effects Supervisor Jake Morrison to create six separate lighting passes simultaneously for the Moon of Shame sequence without interrupting principal photography. In an interesting twist, the comments sparked an Internet frenzy about the unreasonable demands and deadlines that digital artists have to contend with on a daily basis, with the backlash possibly having a ripple effect on the other MCU contenders, Doctor Strange in the Multiverse of Madness and Black Panther: Wakanda Forever.
Then, there is the matter of Top Gun: Maverick, where the filmmakers and studio are marketing how everything was done practically. However, to achieve the desired cinematic scope, over 2,000 visual effects shots were seamlessly integrated into the remarkable aerial plate photography, such as in the opening Darkstar scene; that in itself should make the blockbuster, which received critical acclaim and earned $1.37 billion worldwide as of mid-August, a favorite to win. Curiously, though, the visual effects team led by Visual Effects Supervisor Ryan Tudhope has been grounded from promoting the film, and there is unlikely to be any campaign support from the VFX team to add fuel to the nomination fire. Nevertheless, there is a strong possibility that no perceived lack of VFX team publicity can stop Top Gun: Maverick from topping the field, as demonstrated by Dunkirk’s Oscar win in 2018.

The major technical innovations for Avatar: The Way of Water have been facial capture, the ability to do performance capture underwater and the recreation of realistic CG water. (Image courtesy of 20th Century Studios)
Avatar: The Way of Water is certainly getting studio support, with the original Avatar (2009) being re-released to theaters to remind audiences of the highest-grossing film of all time. It is never wise to bet against director James Cameron, who knows how to push and invent technology to enhance his storytelling, in this case facial capture. With Cameron out to build on the spectacle of Avatar, the visual effects for The Way of Water are sure to be amazing. Some theatergoers will be stunned by the visuals, but repeat viewings will likely depend on the story development, which Cameron did not rush, as he understands the multi-film endeavor is pointless without a solid narrative foundation.
As for those filmmakers who showed ingenuity and a unique perspective on how to incorporate visual effects, two in particular stand out. To begin, director Jordan Peele is having a major impact on redefining the horror genre and having it be a mirror that reflects the beauty and ugliness of society. Nope addresses the issue of spectacle and elevates the UFOs of B-movies into an aerial creature wreaking havoc on the world below. Massive wind machines were needed, so a helicopter was brought in to generate practical dust for the shots. Day-for-night photography consisted of a 3D rig that synced a color film camera with an infrared digital camera under the guidance of innovative Cinematographer Hoyte van Hoytema and Production Visual Effects Supervisor Guillaume Rocheron. And as an added bonus, there is the rampaging monkey brought to life by performance capture legend Terry Notary.
Some might argue that the visual effects are not Oscar caliber, but one has to be impressed by how filmmakers Dan Kwan and Daniel Scheinert were able to depict a believable multiverse without an MCU budget for Everything Everywhere All at Once. That in itself is award-worthy. The visual effects team consisted mainly of five digital artists, who produced 80% of the over 500 bespoke shots, including an acrobatic juggling hibachi chef who in reality was an actor pantomiming his actions, with the culinary tools and ingredients added later in CG. Also, there was a case where a character had to be removed from an entire scene. Shots like the first “verse-jumping” shot of Michelle Yeoh’s character benefited from the extra time afforded by the pandemic. Michelle Yeoh is an amazing practical effect in herself, as she at one time overshadowed another martial arts icon, Jackie Chan, who actually rejected the lead role of what has become the first A24 movie to earn over $100 million worldwide.

Every single shot in Guillermo del Toro’s Pinocchio required digital augmentation, from rig cleanup to water simulations to digital skies, while retaining the handcrafted stop-motion aesthetic. (Image courtesy of Netflix)

Stunning CG environments will be the hallmark of Black Panther: Wakanda Forever, with the futuristic African homeland being the centerpiece. (Image courtesy of Marvel Studios)

Doctor Strange in the Multiverse of Madness features artistic visual effects such as a twisted orchard ravaged by magic rather than fire. (Image courtesy of Marvel Studios)

Feathered dinosaurs like the Pyroraptor make a debut in Jurassic World Dominion, with ILM having to develop a new feather system to make it possible. (Image courtesy of Universal Pictures)

By combining practical ingenuity and virtual production methodology, Bullet Train was able to create the impression of high-speed travel through Japan when, in fact, the principal photography took place on a soundstage in Los Angeles. (Image courtesy of Columbia Pictures)
Speaking of multiverses, director Sam Raimi brings his own sense of dimensional mayhem with Doctor Strange in the Multiverse of Madness, which stands to be the top contender for the MCU. Raimi has a distinct blend of horror and comedy, which is appropriate for a story that centers around the egotistical and sardonic Master of the Mystic Arts portrayed by Benedict Cumberbatch. An incursion occurs that sees two dimensions disintegrate, a mirror trap is sprung with shards of glass, and a magical, pristine orchard is revealed to be a twisted forest conjured out of a Brothers Grimm fairy tale. Found in the heart of darkness is an evil doppelganger, run-amok scarlet witchery and a beloved 1973 Oldsmobile Delta 88.
Black Panther: Wakanda Forever looks as impressive as the original film and carries the legacy of Chadwick Boseman. If anyone can give his late colleague a worthy send-off, it is filmmaker Ryan Coogler, who returned to direct the sequel. The world-building has made the futuristic African land of Wakanda a wonder to behold. And let us not forget all of the amazing technological toys that can be created using Vibranium, which rivals anything Q has produced for English compatriot James Bond.
If visual mayhem is what you seek, there is Moonfall, which sees gravity go sideways as the lunar neighbor is revealed to house a dwarf star and becomes the target of a nanobot AI determined to get its vengeance against humanity. When it comes to destroying Earth, no one does it more creatively and consistently than director Roland Emmerich. Very few sets were built physically, but a shuttle cockpit was brought in from a museum to assist with the flying scenes. Inflicting massive damage with an arsenal of weaponry rather than a cosmic event are the Russo siblings, Anthony and Joe, with The Gray Man, which ups the ante for assassins with no sense of covert activity, their missions making news headlines as entire city blocks get decimated in the effort to kill one person!
Ryan Reynolds enters the fray with The Adam Project, where he gets to encounter his younger, sarcastic self and attempts to destroy the invention of time travel, much to the chagrin of author H.G. Wells. There is cool tech involved for some blockbuster flying and badass hand-to-hand combat sequences, but the visual effects do not capture the innovation of the zanier Free Guy, which is also the product of Reynolds partnering with fellow Canadian director Shawn Levy. As for award-worthy video game adaptations, Uncharted emerges from development hell with Ruben Fleischer shepherding the big-screen adaptation. Tom Holland’s acrobatic antics as Nathan Drake would even impress his most famous character, Spider-Man, especially the aerial daisy-chain sequence, and there is a different spin on a naval battle. While two long-lost 16th century ships are being transported in the air by heavy-duty cargo helicopters, the opposing forces swing on ropes going from one seafaring vessel to another.
Battling for supremacy in the DC Universe will be The Batman and Black Adam. Black Adam hopes to dodge the critical and visual effects backlash of The Scorpion King, which also starred Dwayne Johnson getting involved with Egyptian god shenanigans. No doubt the technology has greatly improved since then, but the rush to finish the visual effects on time hopefully won’t undermine quality. Johnson is such a likable person that it will be interesting to see him portray an antihero. In The Batman, the rain-soaked car chase and the flooding of the streets of Gotham, crafted by Wētā FX and Scanline VFX, are standout environmental moments when it comes to visual effects. Watch out for The Batman punching and grappling his way to a nomination.
Sony continues to spotlight comic book villains with the doctor turned bloodsucking creature of the night in Morbius, which reveals that the cure is worse than the disease. The vampire faces are the most impressive digital work, but the toughest task was honoring physical dynamics. The entire third act was rewritten and had to be reconstructed with bluescreen. Sonic the Hedgehog 2 was nearly derailed when the restart after the pandemic lockdown led to a digital artist talent shortage and capacity issues with different vendors around the world. Fortunately, no character redesigns were required this time around, so the focus could be on introducing new characters and environments. Jim Carrey is let loose once more as the evil Dr. Robotnik, twirlable mustache included, aided by a gigantic mech robot, with Idris Elba channeling the adversarial Knuckles.

A crowning accomplishment for Thor: Love and Thunder was the ability to capture six different lighting passes simultaneously and not interrupt principal photography during the Moon of Shame sequence. (Image courtesy of Marvel Studios)

Sonic the Hedgehog 2 embraces its video game and cartoon heritage, demonstrating that not everything has to be photorealistic. (Image courtesy of Paramount Pictures and Sega of America)

Among the environmental work in Elvis is visiting the Graceland estate over three decades during three different seasons. (Image courtesy of Warner Bros. Pictures)

Nope features a creature of the sky, major dust simulations, day-for-night photography and a raging monkey, all done in a photorealistic manner. (Image courtesy of Universal Pictures and Monkeypaw Productions)

A signature action scene for Uncharted made use of gimbals, wirework and CG to produce an aerial daisy chain of cargo crates. (Image courtesy of Columbia Pictures)

Unreal Engine and virtual production were indispensable tools for the art department, cinematography, stunts, special effects and visual effects for The Batman. (Image courtesy of Warner Bros. Pictures)

The recreation of ancient Egypt mixed with superpowers led to stunning visuals in Black Adam that could only be accomplished with the support of CG. (Image courtesy of Warner Bros. Pictures)

The high tech that goes along with time travel gets an imaginative spin in The Adam Project. (Image courtesy of Netflix)
Elba gets to literally punch a malevolent lion in Beast, which features a CG antagonist in the vein of the infamous grizzly bear attack in The Revenant. Think Jaws on safari. Icelandic filmmaker Baltasar Kormákur has gained a reputation for being able to shift between blockbusters like Everest and indie films such as The Oath; he is aware of the importance of visual effects and of using them wisely, as reflected by his ownership of RVX, the visual effects arm of RVX Studios, and his ongoing collaboration with Framestore. Other creatures unleashing havoc on the human population are the prehistoric ones brought back to life in Jurassic World Dominion, which ties back to the seminal Jurassic Park, the film that achieved groundbreaking photorealistic digital effects. Production Designer Kevin Jenkins, Visual Effects Supervisor David Vickery and Creature Effects Supervisor John Nolan worked closely together to ensure that dinosaurs were anatomically correct and that as much of the animatronic work as possible could be maintained. The major innovation was finally introducing dinosaurs with feathers, like the Pyroraptor, to the franchise, which required ILM to build a new feather system.
Curiously, the live-action version of Pinocchio by director Robert Zemeckis and starring Tom Hanks is going directly to Disney+, so it will not qualify for the Oscars. However, there will be a brief theatrical run before Guillermo del Toro’s Pinocchio takes up permanent residence on Netflix. It might seem a stretch to include a stop-motion animated feature on the contender list, but the feat was actually achieved by Kubo and the Two Strings. The realm of Limbo and the interior of the dogfish are two major CG environments, and there is also a minor fully-CG character, while atmospherics range from realistic mist to snow given a paper-like quality, as well as surrealistic skies, set extensions, and plenty of clean-up resulting from set shifts, light flickers, dust and hair. All of this is done while maintaining a live-action approach to both the camerawork and the animation of the puppets.
Other award-worthy possibilities are the prequel Fantastic Beasts: The Secrets of Dumbledore, which casts spells of good and evil and features various magical creatures from the imagination of J.K. Rowling. Bullet Train recreated Japan and a speeding locomotive with LED screens and a soundstage in Los Angeles. Idris Elba appears as a wish-fulfilling genie in Three Thousand Years of Longing, directed by George Miller. Three decades of the life of Elvis Presley get the Baz Luhrmann treatment in Elvis, which is basically a film about a showman by a showman. The visual effects are faithful to the period while also having an element of hyper-realism when depicting Graceland in its various stages and seasons across the 1950s, 1960s and 1970s. The winner for Best Visual Effects at the 95th Academy Awards, being held on March 12, 2023 at the Dolby Theatre in Los Angeles, is not a foregone conclusion, and that will make an interesting change from last year, when Dune was the runaway favorite and the only question was which runner-up films would get nominated.
By CHRIS McGOWAN

DNEG employees benefit from a variety of training resources available through the firm’s intranet. (Image courtesy of DNEG)
In these days of rapidly developing VFX technologies and processes, it can be hard to keep up in your field, let alone master new realms. Fortunately, there are many options for continuing education – on the job or via schools or online tutorials. “Since a majority of VFX workflows are closely allied with emerging computing technologies, VFX artists, production management and pipeline staff need to constantly keep abreast of change in workflows and tools resulting from new technologies,” says Shish Aikat, Global Head of Training for DNEG. “Depending on the depth of commitment a VFX artist has to a specific topic or workflow, the sources of learning can be anywhere from a series of tutorials on YouTube, online courses at portals such as fxphd or Gnomon Online [to] an undergraduate/graduate program in VFX at a degree/diploma-granting institution.”
Oftentimes, one need not go further than one’s job to find new training. Dijo Davis, Senior Training Manager for DNEG, comments, “Once the artists make it into DNEG, a variety of in-house training resources is available for their reference via our intranet. In addition to this, custom ‘upskilling’ training programs are organized and conducted to support employees and keep the existing and upcoming shows in line.”
Here is a look at several upskilling paths for the VFX artist.

A clay sculpting master class is one of the benefits at Framestore. (Image courtesy of Framestore)

TCS and its studios have a trainer for each department who specializes in the tools of that department and its pipeline. (Image courtesy of Technicolor Creative Studios)

The Unreal Fellowship is a free 30-day blended experience for learning Unreal Engine. (Image courtesy of Epic Games)
“Thanks to a lot of open-source tools for VFX, a lot of tools and learning materials are available, and going by the views and comments on VFX tutorials on YouTube, it appears that a surging number of VFX artists are taking advantage of these courses. Game engines, GPU rendering and real-time technologies have piqued the interest of a multitude of VFX artists. Companies like Epic Games and Unity offer hundreds of courses and channels for artists to interact, and VFX artists are thronging in large numbers to these sites and channels.”
—Shish Aikat, Global Head of Training, DNEG
ONLINE RESOURCES
A vast variety of visual effects courses is available on the internet. For starters, “The tutorials you can find on independent sites or YouTube are super helpful to help fill in any targeted foundational gaps an artist may have. I would say the companies and artists that provide these tutorials are champions for these applications, and they are a great way to learn focused skills,” says Matthew Cruz, Global Creative Training and Development Manager for Technicolor Creative Studios (TCS).
Aikat adds, “Thanks to a lot of open-source tools for VFX, a lot of tools and learning materials are available, and going by the views and comments on VFX tutorials on YouTube, it appears that a surging number of VFX artists are taking advantage of these courses. Game engines, GPU rendering and real-time technologies have piqued the interest of a multitude of VFX artists. Companies like Epic Games and Unity offer hundreds of courses and channels for artists to interact, and VFX artists are thronging in large numbers to these sites and channels.”
Supported by Netflix, the VES Virtual Production Resource Center offers access to free educational resources and information on the latest trends and technologies. Aimed at both current and future VFX professionals, the center is an effort of the Visual Effects Society in collaboration with the VES Technology Committee and the industry’s Virtual Production Committee (https://www.vesglobal.org/virtual-production-resources).
“When it comes to tools, there’s a tremendous amount of autodidactic training that the artists do on their own on their personal time with both paid or free master classes and a lot of online videos,” comments Sylvain Nouveau, Rodeo FX Head of FX. “VFX artists spend a lot of time sharing their knowledge with each other both during and after work hours – scouring for videos online and posting in relevant forums, checking Reddit.”

A demo inside the Virtual Production Stage at FMX 2022. (Image courtesy of David Schaefer)

Many students attend New York City’s School of Visual Arts (SVA) to get a next-level job or change their career path. (Image courtesy of SVA)

A virtual production class at Escape Studios at London’s Pearson College. (Image courtesy of Pearson College)
SOFTWARE COMPANY TUTORIALS
Some software company tutorials (for VFX, animation and related creative tools) charge a fee, but many are free. Autodesk offers thousands of free tutorials across YouTube and AREA, according to a company spokesperson. Its YouTube channels include Maya Learning Channel, 3ds Max Learning Channel, Flame Learning Channel and Arnold Renderer. And there are AREA tutorials and courses at https://area.autodesk.com.
“SideFX offers free lessons about Houdini and Houdini Engine, which plugs into applications such as Unreal, Unity, Autodesk Maya and 3ds Max,” says SideFX Senior Product Marketing Manager Robert Magee. The SideFX website (https://www.sidefx.com) hosts 2,758 lessons designed to support self-learning within the community. Magee explains, “To help navigate all of these lessons, the SideFX site has 18 curated ‘Learning Path’ pages, which highlight the best lessons to explore for a variety of topics.” The 450 lessons created and published by SideFX are all free; the other 2,300+ tutorials are “submitted by the community.”
The Unity Learn platform (https://learn.unity.com/) offers “over 750 hours of content, both free live and for-fee on-demand learning content, for all levels of experience,” according to the company. There are hundreds of Red Giant tutorials on the Red Giant YouTube channel (https://www.youtube.com/c/Redgiant) and free Maxon webinars at https://www.youtube.com/c/MaxonTrainingTeam. Foundry’s site, learn.foundry.com, offers Nuke, Modo, Flix and Katana tutorials, among others.
CONFERENCES
Conferences such as SIGGRAPH and FMX also help keep artists current. DNEG’s Aikat notes, “Many smaller VFX houses do not have the resources to house an R&D department, and they often lean on technologies and workflows seeded through academic research. Conferences like ACM SIGGRAPH are opportunities for VFX artists to learn about the products of such research. Many VFX studios send delegates to these technology conferences to soak in the future trends, workflows and solutions predicated by emerging technology.”
FMX, held in Stuttgart each spring, draws thousands of professionals and students. “At FMX, we curate conference tracks on the latest and greatest developments in projects and processes as well as in hardware, software and services,” says Mario Müller, FMX Project Manager. “Conferences play an important part in continuing education [about] the art, craft, business and technology of visual effects. Compared to the readily available information by vendors on the internet, a conference can provide a more concentrated, curated and neutral perspective as well as a community platform.”
The RealTime Conference, founded in 2020 by Jean-Michel Blottière, is fully virtual and free. The live gathering explores real-time technologies with live demos, panels, workshops, classes and keynote speeches (https://realtime.community/conference).
VIEW Conference (https://www.viewconference.it), another notable event, is set in Turin (Torino), Italy, and focuses on computer graphics, interactive techniques, digital cinema, 2D/3D animation, VR and AR, gaming and VFX.
VFX/ANIMATION SCHOOLS
Schools offer various levels of courses, which can be utilized by VFX artists currently working or between jobs. “VFX artists are constantly learning,” says Colin Giles, Head of School of Animation at Vancouver Film School (VFS). “As the industry changes, it is important that people who work as VFX artists have the outlets to keep learning while they’re working. This is why, in addition to our courses, [we offer] workshops – called VFS Connect – to help people learn part-time without having to leave their jobs.”
For VFX artists to keep up-to-date at work, “some of the bigger studios have quite extensive internal training programs,” while “smaller studios and freelancers depend on online resources [free and paid for] and upskilling via training providers like ourselves,” says Dr. Ian Palmer, Vice Principal at Escape Studios, part of Pearson College in London.
According to Scott Thompson, Co-founder of Think Tank Training Center (TTT) in Vancouver, “In some cases, studios do work with schools like Think Tank to increase their understanding of new ideas. As an example, Think Tank had Mari well before the studios did, so our grads often became teachers to catch them up.”
Escape Studios provides continuing education for VFX professionals “through our short courses, most of which are in-person. We do short daytime courses for those who can take a break from their day job and evening courses for those who are fitting around other commitments. Many of our evening classes are available online,” Palmer says.

Framestore offers its employees classes in everything from software mastery to life drawing to clay sculpting. (Image courtesy of Framestore)

Students on set in the on-campus greenscreen room at Vancouver Film School. (Image courtesy of Vancouver Film School)

Technicolor Creative Studios, through MPC and its other studios, has a training academy to educate new employees in specific VFX tools. (Image courtesy of Technicolor Creative Studios)

The Unreal Fellowship teaches about virtual production and the mechanics of using Unreal Engine for storytelling. (Image courtesy of Epic Games)
VFX artists are often looking for an upgrade. Palmer explains, “Sometimes it’s just to get a fresh skill set to enhance their career. We also have people that want to change direction and need some guidance in that.”
A lot of students at New York City’s School of Visual Arts (SVA) “are working professionals,” says SVA Producer Adam Meyers. “When you have that allotted time after work hours each week, it seems easier than fitting it in at work. Most of them are looking to get training that is more focused.” Meyers adds that his current continuing education classes have been online since COVID. Asked if he often sees visual effects artists going back to school for an upgrade – to get a “next level” job or to change their career path – Meyers responds, “Every semester. Education is about growth. Artists evolve just like the software.”
At the Savannah College of Art and Design in Georgia, students in the visual effects department take on assignments that reflect the most current working studio practices, such as virtual production. Dan Bartlett, SCAD Dean of the School of Animation and Motion, notes, “The LED volumes in Atlanta and in Savannah are industry standard, so what we’re able to do in these spaces [is] create learning experiences that are absolute mirrors of what they would be doing if they were working for a major studio on features or on long-form television. Students work on everything from negotiating the production design components to developing both the digital and the physical assets that go into a shoot, to working in-engine – in our case Unreal Engine – to build the virtual cameras and the virtual lighting setups in order to bring those on-set shoots to life.”
At some VFX studios, like Framestore, visual effects artists do extensive in-house training and also benefit from some specialized classes. “Framestore has a real ‘melting pot’ of learning styles and preferences,” Simon Devereux, Framestore Director, Global Talent Development, says. “Historically, their studios have always offered their employees everything from life drawing and clay sculpting masterclasses to software mastery and behavioral skills training.”
Framestore recently hired a new Global Head of Content & Curriculum, Chris Williams, “who now leads the development of technical and software training, building new learning pathways, and he will ultimately develop the teaching capabilities of all our employees in order to support our global network of offices,” Devereux says. “That, along with a dedicated Unreal and 3D trainer and two production trainers, means we’re in a unique position to build on what is already an incredible investment in the personal and professional growth of our colleagues. In addition to this, we invest in a range of technical tutorial-based training platforms that are accessible across all of our studios.”
Aikat adds, “Constant in-house upskilling with customized training programs and keeping a close eye on the new trends in the market is the key to success.” DNEG also supplements its training curriculum with talks from experts on a range of topics that cover technical tools, product technologies and creative pursuits.
Rodeo FX helps broaden its employees’ horizons with evening classes (paid for by Rodeo) at official partner schools and a few technical colleges. But the greatest learning may come from other artists. “Fifty percent of what we learn comes from others – whether it is new talent arriving from other companies or Rodeo employees sharing their knowledge with other studios,” Nouveau comments. He adds, “In many ways, it’s all of these exchanges that keep the industry moving forward. With an average of about two years in a studio, there’s a lot of information flowing.” Marie-Denise Prud’homme, Rodeo FX Manager of Learning and Development, remarks, “Formally, artists share information with each other [tutorials, video how-tos] through an information-sharing platform we use called Confluence. But shared learning can even be as simple as arranging meetings and calls on the fly.”

Behind the scenes of a fireside chat on the Virtual Production Stage at FMX 2022 with, from left: Hasraf ‘HaZ’ Dulull, HaZimation; Shelley Smith, DNEG Animation; Mikko Matikainen, The Mill; Paul Debevec, VES, Netflix; and David Sheldon-Hicks, Territory Studios. (Image courtesy of Dominique Brewing)

Many students at New York City’s School of Visual Arts (SVA) are working professionals looking to get more focused training. (Image courtesy of SVA)

An Escape Studios virtual production class taking place at the LED volume of MARS, an Escape partner, in West London. (Image courtesy of Pearson College)
For on-the-job training, TCS has had a training academy for years “for college leavers to be trained on photorealism and specific VFX tools and pipeline for features,” Cruz comments. Recently, it started a new initiative. “We have a trainer for each department who [is a specialist] in the tools of that department and pipeline. For example, for the FX department we have a dedicated FX trainer to help the department on the floor, so they would be an expert in Houdini, for example, and FX sims.” (TCS’s network of studios includes The Mill and MPC.)
Ron Edwards, TCS Global Head of Commercial Development – L&D, notes, “We encourage our artists to master cutting-edge technology and tools so they can produce content at the highest caliber. The company will also sponsor these efforts and provide employees the chance to learn from accredited institutes and alongside their peers to further their careers.” Edwards adds, “We always want to stay on top of the latest tech to future-proof ourselves and our students.”
Escape’s Palmer comments, “It’s always an exciting field to work in. Just when you think you’ve seen it all, something comes along to amaze you. The industry is full of very bright people with a thirst for knowledge, so while it’s challenging to keep up, that’s what makes it so interesting. Long may it continue!”