By OLIVER WEBB
Images courtesy of The Yard VFX.
The Yard VFX added morning smoke over the water to enhance the atmospheric grayness of Victorian London.
Following the events of the first film, Enola Holmes 2 follows Enola on her quest to unravel the mystery of a missing girl, assisted by her close associates and her older brother, Sherlock. Spin VFX and The Yard VFX worked together to provide the majority of the visual effects for the film, with around 550 VFX shots required overall. “Working directly with the movie VFX supervisor, we had to deliver photorealistic period images and scenery, staying true to the spirit and tone of Enola Holmes as laid out in the first installment,” says Edward Taylor, VFX Supervisor for Spin VFX on the film.
“The conversation started with Helen Judd, Mike Ellis and myself around two years ago,” remarks Laurens Ehrmann, Senior VFX Supervisor and Founder of The Yard VFX. “They asked me if The Yard would like to work on Enola Holmes 2. Of course, it was a great opportunity for us,” Ehrmann explains. “They showed me some concept art and explained the project, and we started from there. In the same period, I had some conversations with Aymeric Auté, and I proposed he join us for the show. Then we started sharing our thoughts with Mike, regarding the different environments that we had to craft, especially the outside of the matchstick factory and this big aerial shot over the Thames, where the camera is flying over the dock to reach Enola in the street.” According to Ehrmann, there weren’t a lot of images for the time period. “We tried to build a bunch of references. The main references from Mike were Jack the Ripper and murky London with lots of pollution, as well as Peaky Blinders,” he says.
The Yard VFX relied on images of Victorian London for reference when it came to constructing the concept of the dockyards. A gray sky lingers over the Big Smoke.
“There weren’t many pictures, as it was the end of the 19th century and beginning of the 20th century,” adds Auté, who served as VFX Supervisor. “We tried to go a little further in time, around 1930, as there are more pictures, but we had to take care as it was more industrial in this time period. We tried to get as many references as possible, and we created a mood board with Mike. They sent us what they wanted architecture-wise and details of windows, for example, to really fit with the time period.”
The Spin VFX team was meticulous when it came to the final details. “The harsh smoke-filled realities of Victorian England were present but moderately gentrified in order to create an accessible vision in which the story could be presented,” Taylor says. “Period correct was always on our minds as we scoured through Lost London: 1870-1945, London’s Lost Riverscape: A Photographic Panorama, as well as Spitalfieldslife.com, nationalarchives.gov.uk and numerous other websites. The devil was in the details much of the time, as any new photography is littered with television cables, security alarms and cameras, power boxes/junctions, etc. All of these had to be digitally removed in order to sell the turn of the [19th to 20th] century time frame.”
The Yard started with their own concepts for every shot “with a big mood board with lots of pictures we had gathered, just to give them our feelings and thoughts regarding the look of the shot,” Ehrmann details. “After that we spoke about how we would craft and build the shots. For example, for the aerial drone shot, the initial on-location plate over the river with the dock, we were supposed to keep the water as it is, but to ease the connection between our CG environment and the water, we decided to recreate everything, including the water. With the CG water we were able to manage the connection with the wooden docks, to add this kind of morning smoke over the water. We were asked to connect the street with the water with a drain, but an inclined plane, where the water is flowing over the bricks. So, the idea to recreate everything was a good one, as it gave us the ability to connect everything. We added a few extras, which were shot onstage in front of greenscreen, just to add people to the background. A lot of simulation for all of the smoke coming out of the chimneys.”
The Yard VFX created 63 shots for the film.
Overall, The Yard created 63 shots for the film, notes Ehrmann, “18 outside the matchstick factory and 45 shots inside the matchstick factory. They shot a room, which we duplicated in depth. Sebastien Fauchère, Comp Supervisor, created a setup to recreate this second room and to be able to craft every shot really fast. We didn’t have to go to the CG render; everything was inside Nuke. It was pretty fast to develop on every shot.”
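The Yard hasn’t published that Nuke template, but the underlying idea (re-projecting the photographed room onto a card pushed deeper along Z so a second, hazier copy sits behind the first) can be sketched with Nuke’s Python API. This is a minimal sketch; node choices and values are hypothetical stand-ins, not the studio’s actual setup.

```python
# Hypothetical Nuke (Python) scaffold for a "duplicate the room in depth" comp:
# the plate of one room is re-projected onto a card pushed deeper along Z.
# The Yard's production template is proprietary; this only sketches the idea.
import nuke

def build_depth_duplicate(plate_path, room_depth=12.0):
    """Read a plate and place a graded copy of it one room deeper."""
    read = nuke.nodes.Read(file=plate_path)

    # Knock the duplicate back so it reads as a farther, hazier room.
    grade = nuke.nodes.Grade(inputs=[read], name="hazeGrade")
    grade["multiply"].setValue(0.85)

    # Card carrying the graded plate for the duplicated back room.
    card = nuke.nodes.Card2(inputs=[grade], name="backRoomCard")
    card["translate"].setValue([0.0, 0.0, -room_depth])

    # Camera matching the plate; values would come from the matchmove solve.
    cam = nuke.nodes.Camera2(name="plateCam")
    cam["focal"].setValue(32)

    # Render the card through the plate camera, over the original plate.
    scene = nuke.nodes.Scene(inputs=[card])
    return nuke.nodes.ScanlineRender(inputs=[read, scene, cam])
```

Keeping the whole trick inside Nuke’s 3D system is what makes per-shot iteration fast: each new shot only needs a new camera solve, not a fresh CG render pass.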
Workers outside the matchstick factory, with the fog over the Thames in the background. The Yard VFX worked hard to recreate old London, paying close attention to the architecture of the period.
The establishing shot proved to be particularly challenging for The Yard to create. “It was a really big environment,” Auté says. “Because it’s a drone shot, we can’t do this on matte painting. We have to really build it in 3D to manage all the parallax to recreate the mood of old London with this kind of move. We started from really wide, and we have to go to almost close-up inside the streets, with the water on the grounds. It was the most challenging in terms of 3D.”
“We started from wider and then went close to the actress, meaning that every single aspect needed to be really defined. We did have time to craft it because the project’s duration was nearly six months. Six months for 63 shots is a lot of time to develop and build all of the assets and do the layouts. It was challenging but not exhausting,” adds Ehrmann.
The Yard VFX decided to recreate everything to ease the connection between the CG environment and the water. Enola leads the crowd of workers along the South Bank.
The trickiest locations for Spin VFX usually involved cameras traversing the area. As Taylor explains, “The theatre stage interior and the rooftops and their scope, blending CG with practical, were generally the most challenging. The cameras and created geometry have to be in complete sync to achieve the desired illusion, not to mention lighting and any fluid dynamic elements such as water and smoke. Trying to make seamless transitions for such large and varied vistas was truly a challenge.”
In terms of the most difficult effect to realize, Taylor notes, “As happens from time to time, creating invisible effects can be perceived as simple, but can often be challenging. The carriage explosion was exceedingly tricky, being nestled in the branches of trees, and blending practical elements alongside CG, as well as the various panoramas of London from a hillside and out a window.”
Millie Bobby Brown as Enola Holmes.
“For the whole project, we had to almost set up in parallel, as a lot of shots were inside the factory and they were more 2D centric,” Auté says. “We also had a few shots where we had to build specific assets, such as cranes or the docks building. We started to build these assets in 3D, but at the same time create the layout to place the camera and see where we need to add detail and what we can render in 2D. When we had all this established, we started to refine the texture of the lighting. Or, in some cases we only did matte painting to extend a street. We tried to spread the work between the 3D and 2D team to have everything go together at the end. They have a kind of ownership of the shots, and sometimes it is good to have someone work on one piece. It’s creating a kind of energy, and people are more involved.”
45 of the visual effects shots provided by The Yard VFX were inside the matchstick factory. A room was shot, which The Yard then duplicated in depth.
Concludes Taylor, “Our VFX Producer, Brandy Drew [with Spin VFX], created an environment where daily conversations with artists kept collaboration moving forward and information flowing, no question too small, no concern to go unrecognized. Organized by level of difficulty and availability of resources, the plan was created and the workload balanced. We delivered on time and on budget.”
By TREVOR HOGG
Images courtesy of Paramount Pictures.
Along with replacing the environments, the actual jets had to be digitally reskinned to represent the proper aircraft.
Getting declassified in time for the Oscars are the 2,400 visual effects shots found in Top Gun: Maverick, which were the responsibility of Ryan Tudhope and the artists at Method Studios, MPC, Lola VFX and Blind. The goal was to adhere to the imperfections that go along with shooting live-action aerial photography and to produce photorealistic CG that enhanced the storytelling and believability, rather than drew attention to itself. Before any of the F-18 scanning data was released to the vendors, the U.S. Navy had to give its approval.
“That extended to the choices we were making as filmmakers in terms of how the jets moved,” Tudhope explains. “The U.S. Navy was there every step of the way. The authenticity, time and effort our team put into getting all of those things perfect were driven by both [director] Joseph Kosinski’s and Tom Cruise’s desire to get all of those details right, and that was accomplished through our friends in the U.S. Navy.”
Armaments like missiles were digitally added so as not to hinder the actual performances of the jets.
An area that does not get enough credit is the graphic work produced by Blind. “They were responsible for all of the heads-up displays that you see from the F-18 or Tomcat point of view and a lot of the story-driven graphics that are done throughout the film in the aircraft carrier,” Tudhope notes.
Some old-fashioned techniques assisted in choreographing aerial sequences for key story beats. “Through all of that we had these F-18s on sticks,” Tudhope states. “That process was more at the front end where we were trying to translate what the Naval pilots were trying to explain to us about how things would work, as we filmed those in the hallways to try to capture their notes as to what would essentially happen, and go from there. Once we had a sense of Joe’s vision for these sequences, taking into account all of this information we were getting from the pilots, then it became a process of how to execute it.” The imperfections of the aerial photography were retained. “The difficulty of capturing that material is a fingerprint that carries all the way through to the end of the work,” Tudhope adds.
The cockpit of the full-scale Darkstar came off and went onto a gimbal that was put onstage in order to get the desired plasma effect of being in the stratosphere.
A one-to-one replacement was not possible for the aircraft. “The L-39 is a much smaller and less capable aircraft than the Su-57, which is a fifth-generation fighter that has thrust vectoring, so it can do these maneuvers that we feature in the film where it literally goes up on its nose and turns around in midair and comes back down,” Tudhope remarks. “Those things were not possible with anything that we had at our disposal. In those particular situations, our animators under the leadership of Marc Chu [Animation Supervisor at Method Studios] pushed to get all of the flaps to do what they were supposed to do and take the shot to the next level, so it was interesting from an audience standpoint.”
The blue environment of the stratosphere was influenced by aerial photography taken by weather balloons and SR-71 flights.
The imperfections of the canopy glass had to be matched in the CG versions. “There might be a situation where we had an explosion in the distance that wasn’t there and the way that those bright highlights are used through the canopy glass, which has almost like scratches on it, but in a swirling motion; it was important for us to get all of that swirling,” Tudhope explains. “We also added in a ton of armaments across the film. But once you add training munitions or bombs to the wings, it lowers the performance characteristics of the jet, and you want the jets to be doing the full-on performance. We were able to add a lot of those armaments and deal with the continuity across all of the different sequences and take that off of the requirements of the Navy to find all of that stuff for us. But these wings are alive. The wings are constantly fluctuating from the air pressure, and the flaps are moving, and there is complicated lighting moving across.”
Reskinning of the jets was reliant upon the original proxy version captured during principal photography. “We had a lighting reference in the case of the Navy jets. They are matte grey, which is perfect for us, so we were able to see what the lighting characteristics were,” notes Tudhope, who used a combination of tracking markers, GPS from the camera aircraft and corresponding USGS data to get the lighting correct. Another major process was constructing the digital versions of the jets. “We get up close especially to the F-18s where they have all kinds of little dents, imperfections, bolts and rivets, been painted over a couple of times – there is a lot of detail that you want to try to capture. We were able to have a real turntable of an F-18, and we put our CG turntables right next to that. We were able to make sure that they were matching.”
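Tudhope doesn’t spell out the solve itself, but one standard way to turn the camera jet’s GPS position and a timestamp into a CG sun direction is a solar-position lookup. The sketch below is illustrative only: pvlib is an assumed library choice, and the coordinates and time are invented.

```python
# Illustrative: deriving a CG sun vector from GPS coordinates and a timestamp,
# the kind of data Tudhope describes combining with USGS terrain.
# pvlib is an assumed third-party dependency, not the production tool.
import numpy as np
import pandas as pd
from pvlib import solarposition

def sun_direction(lat, lon, when_utc):
    """Return a unit vector pointing toward the sun in east/north/up space."""
    times = pd.DatetimeIndex([when_utc], tz="UTC")
    pos = solarposition.get_solarposition(times, lat, lon)
    az = np.radians(pos["azimuth"].iloc[0])            # clockwise from north
    el = np.radians(pos["apparent_elevation"].iloc[0])
    return np.array([np.cos(el) * np.sin(az),   # east
                     np.cos(el) * np.cos(az),   # north
                     np.sin(el)])               # up

# e.g. a morning pass over the Cascades (hypothetical values):
print(sun_direction(48.35, -120.7, "2019-06-14 16:30"))
```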
The VFX team ran simulations from lighting and atmospheric standpoints and match-moved the real jet relative to the terrain so the digital jets were moving at the same rate of speed.
One way to alter aerial missions was by adding digital aircraft into shots. “There are many sequences that feature four jets, two teams of two, that are flying into the final mission or various training missions,” Tudhope states. “Typically, we shot those with one or two F-18s and added the other F-18s in those formations. It gave us the ability to get more material practically, and since we always had a real jet in there as reference it was a huge help in matching the look and lighting. It was really fun and nerdy because we would have a real jet doing a maneuver and add a CG jet doing a similar maneuver following behind, or in the shot where they all come through the valley and the vapor trails are going off and rush under the camera. That was one jet, and we added multiple jets doing the same thing. What is fun about that is you have this perfect thing that you’re trying to match.”
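At its core, the wingman trick Tudhope describes is a time-shifted, offset copy of the real jet’s match-moved path. A minimal sketch of that idea follows; the frame delay, offset values and data layout are assumptions, not production numbers.

```python
# Sketch of the formation trick: one real, match-moved jet, with CG wingmen
# following the same path slightly behind and offset. Purely illustrative.
import numpy as np

def clone_flight_path(path, delay_frames=12, offset=(-15.0, 0.0, 2.0)):
    """path: (N, 3) world positions per frame for the real jet.
    Returns a same-length path for a CG wingman: time-shifted and offset."""
    n = len(path)
    # Delay: the wingman flies where the lead was `delay_frames` ago.
    idx = np.clip(np.arange(n) - delay_frames, 0, n - 1)
    wingman = path[idx].copy()
    # Constant offset (world axes for brevity; a production version would
    # offset in the lead jet's local frame, re-evaluated per frame).
    return wingman + np.asarray(offset)

lead = np.cumsum(np.random.randn(240, 3) * 0.5, axis=0)  # stand-in solve
wing = clone_flight_path(lead)
```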
Appearing in the opening is a fictional stealth aircraft inspired by the hypersonic strategic reconnaissance UAV (Unmanned Aerial Vehicle) Lockheed Martin SR-72. “For most of the scenes on the ground we filmed the practical Darkstar and removed the towing vehicle digitally, and added all of the heat, haze and exhaust as if it was moving under its own power,” Tudhope remarks. “There are also moments where we shot real F-18s doing those taxiing maneuvers and takeoffs. We had all of our camera mounts inside the F-18, so in one or two sequences where Maverick is literally taking off and you see the world receding behind him, we shot those back plates on the F-18’s internal cameras without someone sitting in the seat. The cockpit component of our full-scale jet came off and went onto one of [Special Effects Coordinator] Scott Fisher and his team’s gimbals, which we were able to put onstage.”
One of the trickiest elements to recreate was the canopy glass with all of its imperfections.
The stratosphere had to be recreated, explains Tudhope. “We were able to find amazing reference of weather balloons and SR-71 flights where cameras had been taken to those altitudes. All of this had to be created as a digital environment.” An emerging technology is a pivotal part of the Darkstar narrative. “As you’re seeing our sequence unfold,” he continues, “the altitude that we’re conveying and the things that occur, the look of all that from a physical standpoint is based on the data that Lockheed Martin [gave us on the scramjet engine].” A certain amount of disbelief was required when it came to the camera mounts. “When you get to the training missions and the exterior camera mounts that [Cinematographer] Claudio Miranda engineered with the Navy to place real cameras on F-18s, that process of placing of real cameras on real aircraft was extended early on in the film to Darkstar, even though it was digital, and also later into the final battle – that was the DNA to what we were doing.”
Digital jets were added for safety reasons and to get the desired formation and shot composition.
Leading the way were the cameras and lenses. “Rather than design shots where we would have to modify the mounts or change lenses,” Tudhope reveals, “we determined where the mounts were going to be and what lenses would be on those particular frames and create a composition that Joe was after. We took creative liberty where we were putting that camera on the digital jet versus a real jet. We were given a large toolbox of mounts and camera platforms to try to create shots with, and the process was, ‘What is the best way to film a plate to do this shot?’ Sometimes it was a one-to-one match and other times we would modify what we filmed in order to accomplish the shot.”
Locations had to be digitally augmented, especially for the third act battle. Comments Tudhope, “We spent a lot of time scouting up in the Cascade Mountain Range in Washington State for this snowy environment and worked out of Naval Air Station Whidbey Island. In the film, there is an enemy base situated at the bottom of a bowl. We found half of what we wanted and augmented real footage to get what we needed. We had an amazing locations team and pilots from the Naval Air Station who would go out with GoPros in jets and fly some of these runs for us and show us what it might look like.”
Actress Monica Barbaro and Tom Cruise on set with a special camera rig developed by Cinematographer Claudio Miranda and the U.S. Navy.
One of the essential collaborators was editor Eddie Hamilton and his team. “The editing and going through all of this footage to try to put these shots together was a huge component,” Tudhope says. “We would come across shots that we had nothing for, and it might be just a storyboard. In those situations, I would work with our Visual Effects Editor Latham Robertson and pore through the material that we had captured and find different options for shots and background plates, get Joe to sign off on that or get him to choose what he wanted and go from there. We went through an extensive postvis process, so we worked very loose and fast. What missiles they have remaining was a big thing, especially on the Tomcat because there are two Sidewinders. That stuff was all tracked. Once the cut started to settle down and we felt that we’ve got this sequence coming together, then we would turn over the shots to Method Studios or MPC, and they would execute all of the beautiful work that they did.”
By MATT HURWITZ
Images courtesy of Warner Bros.
Dwayne Johnson “floating” out of the Rock of Eternity on set at Trilith Studios in Atlanta, lifted by one of the special effects department’s robotic arms.
Watching director Jaume Collet-Serra’s Black Adam, audiences are easily convinced that the Warner Bros./HBO Max saga was shot in a Middle Eastern city, nowhere near the Atlanta, Georgia set on which it was filmed. “We’re always most proud of things no one ever thinks are visual effects,” notes Oscar-winning Visual Effects Supervisor Bill Westenhofer (Life of Pi). “The goal is to work yourself out of any recognition.”
The film was lensed by DP Lawrence Sher (Joker) with production design by Tom Meyer. Its primary visual effects vendor was Wētā FX under the production supervision of Westenhofer in tandem with Wētā VFX Supervisor Sheldon Stopsack. Additional VFX work was provided by Digital Domain, Scanline VFX, DNEG, Rodeo FX, Weta Digital, Lola Visual Effects, Tippett Studio, Clear Angle Studios, Effetti Digitali Italiani (EDI) and UPP. Special Effects Supervisors were Lindsay MacGowan and Shane Mahan for Legacy Effects.
Dwayne Johnson battling with Aldis Hodge’s Hawkman in the Sunken City exterior set at Trilith Studios. The athletic Hodge was suspended by wires, his wings – as well as the set extensions beyond the ground-level set – added later by Wētā FX.
The story takes place in fictional Shiruta, the modern-day version of Kahndaq, where 5,000 years prior, Teth-Adam (Dwayne “The Rock” Johnson), the people’s hero with great superpowers, had been imprisoned in the Rock of Eternity for apparently misusing his powers. He is released by a rebel (by utterance of the word “Shazam”) and brought back to battle the people’s modern-day oppressors, the Intergang. While he initially also battles the four members of the Justice Society of America – Doctor Fate, Hawkman, Atom Smasher and Cyclone – they end up fighting Intergang together, eliminating the threat posed not only by that group but also by Sabbac, who rises from the darkness of old Kahndaq to attempt to claim the throne. By the end of the film, Teth-Adam has succeeded in eliminating the threat, and the hero is renamed Black Adam.
Development of Black Adam began in 2019, with Westenhofer being brought on not long after Collet-Serra came on to helm the project. By that time, the director had worked with storyboard artists to flesh out his ideas. Then, they met to decide the best state-of-the-art methods to create the imagery the director had in mind. “LED walls were hot at the time, as was volume capture, and we ended up dabbling in all of them,” Westenhofer remarks. The production was intended as a full-scale virtual production, developed initially by Tom Meyer in ZBrush. “We had a motion capture stage setup and had motion capture performers, and we had real-time controls,” Westenhofer adds. “We were due to start on March 17, 2020 – and then the world closed down.”
Johnson in a completed “flying” shot. He was first captured lying flat in an Eyeline Studios volume capture stage with the rig later removed and extensive background animation added.
Over the pandemic hiatus and through Fall 2020, L.A.-based Day For Nite continued work creating previs for the scenes, importing the Maya storyboard files into Unreal Engine. “Right away, we can see things that are working and ones that are not,” Westenhofer states. What read in the script as “He comes out, they fight, he flips over a tank” was soon developed into fully-realized scenes.
At the same time, DP Sher began setting cameras and lighting, working with Day For Nite and Collet-Serra via Zoom. “It was great because Jaume and Larry were actively participating in creating the previs, so they felt ownership,” Westenhofer says. “When we got to set, they knew that the previs was theirs and that was the path they were going to follow, as opposed to getting there and going, ‘Oh, there’s that previs – forget that, we’re gonna do our own thing.’ It’s amazing – you can look at the previs and look at the shots and they’re incredibly close.”
Johnson “floats” down the stairs of an apartment set, standing on the small platform of an industrial automobile robotic rig.
Deciding which locations seen in the previs would be practical sets and which would be CGI was an important step. “You can look at the previs,” Westenhofer explains, “and you can see if Jaume wants to be looking in a specific direction most of the time, in which case we would build that part as a set. But as soon as that set has more than one story to it, construction costs start to go up. So, for things like the city, Shiruta, I told Tom just to focus on, say, the first story, store level, and we’ll take care of the rest.” The same goes for which characters would be digital and which would be captured in-camera. “I always try to favor scenes where there are people – humans not flying around and who aren’t superheroes. But we have a movie where there are five superheroes and four of them fly in some form. So, they’re going to be mostly digital,” Westenhofer declares.
Building a City
When filming finally began, Meyer constructed the ground-level set of Shiruta on the back lot at Trilith Studios in Atlanta, notably its Central Market or the “Sunken City” where a great amount of action in the film takes place. “It actually doubles for many places in the city,” Westenhofer explains. “We had a roundabout area and several cross-streets, and if you look in one direction, that would be the area around Adriana’s (Sarah Shahi) apartment, and if you look the other way, it was where the palace would be. And when they’re seemingly driving through town, they’re really going in circles, but by changing the set extension it felt like you were traveling through the city.”
Wētā’s Assets Department, which includes its Art Department as well as modelers, texturers and shading specialists, were responsible for crafting the city, rooted in Meyer’s design for the practical set. “Tom did a magnificent job of fleshing out the tone and feel of Shiruta,” Stopsack states. “So, a lot of the groundwork was done already. We engaged with Tom quite early. Then we spent a fair amount of time designing the architecture and the whole feeling of the city square.”
A floating Dwayne Johnson, suspended by an industrial automobile robotic arm, does away with two bad guys.
The look established by Meyer, Stopsack notes, “as he often described to us, was like Middle East meets Hong Kong. It needed to be dry, somewhat monochromatic and reasonably high. A lot of chaotic streetlamps, wires and air conditioning units everywhere.” Much of that was introduced to Wētā early in concept art, and then it was up to them to flesh out the environment. Adds Stopsack, “We had the luxury of photography of Tom’s set on the backlot, which gave us a starting point via plate photography, which we then extended.” That task required extension to show the entire city of Shiruta, to allow creation of high wide shots, which would include the Central Market and Palace in the extended terrain. “We knew whatever we would start building around the Sunken Street ultimately would be utilized for propagating the wider city.”
The Asset Team’s approach was essentially to create modular building blocks for the different architectural levels and stories of each building. “We had to interchange them, dress them differently and stack the buildings up to a height that Tom deemed necessary,” Stopsack explains. “For each building we would ask, ‘Would you like this to be 15 stories high? 10 stories? Is this a round building? Do we see filigree here?’ So, we had a lot of engagement with him to make sure that the look and feel was what he envisioned.” They took advantage of any assets the Art Department had available, including packages of art, signage and other items, to lean into the same language Meyer’s teams had developed. “It’s an endless chase of getting the level of detail that you’re after.”
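Wētā’s asset tooling is in-house, but the modular idea Stopsack describes, interchangeable story modules stacked to an art-directed height, reduces to a small procedural routine. In this hypothetical sketch the module names and categories are invented:

```python
# Hypothetical sketch of modular building assembly: interchangeable story
# modules stacked to an art-directed height, as Stopsack describes.
import random

MODULES = {
    "ground": ["storefront", "arcade", "market_stall_row"],
    "mid":    ["apartments_a", "apartments_b", "offices"],
    "top":    ["flat_roof_ac_units", "domed_roof", "water_tanks"],
}

def stack_building(stories, seed=None, round_plan=False):
    """Assemble one Shiruta-style building as an ordered list of modules."""
    rng = random.Random(seed)
    levels = [rng.choice(MODULES["ground"])]
    levels += [rng.choice(MODULES["mid"]) for _ in range(max(stories - 2, 0))]
    levels.append(rng.choice(MODULES["top"]))
    return {"plan": "round" if round_plan else "rect", "levels": levels}

# "Would you like this to be 15 stories high?"
print(stack_building(15, seed=42))
```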
Wētā’s attention to detail translated to a construct that looks like a true city and not a visual effect. At the same time, constructing the entire city digitally – including the entirety of Meyer’s Sunken City area sets – gave Wētā valuable flexibility for creating scenes of mayhem which otherwise would have required destruction of the practical set. “The beauty of approaching it that way,” Stopsack observes, “is that we were left with an all-digital representation of the practical set pieces that were built. So, in the fight between Black Adam and Hawkman, if Black Adam is punched and smashes down the side of the building, those shots could be created fully digital. The entire environment was fleshed out, so we could inject these all-digital shots in between.”
In order to develop a true city grid seen in high wides, Wētā’s layout team utilized OpenStreetMap data, accessing real-world locations as the basis for Shiruta’s street layout. Comments Stopsack, “We looked at Middle Eastern cities around the globe to study each city’s grid, the general density of population and buildings, and the buildings’ heights. A lot of data can be sourced, and we used that to lay a foundation for what Shiruta became.”
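Assuming the OpenStreetMap reading, a layout TD could pull a reference city’s street graph and building footprints with a library such as osmnx, an assumed tool choice since Wētā’s actual pipeline isn’t documented; the city and tags below are illustrative.

```python
# Illustrative: sourcing a real city's street grid and building footprints
# as a layout starting point, per Stopsack's description of the approach.
# osmnx is an assumed library choice; the reference city is hypothetical.
import osmnx as ox

place = "Amman, Jordan"  # hypothetical Middle Eastern reference city

# Drivable street network -> the road graph / block structure.
streets = ox.graph_from_place(place, network_type="drive")

# Building footprints, with height tags where mapped -> density and massing.
buildings = ox.features_from_place(place, tags={"building": True})
heights = buildings.get("height")
tagged = int(heights.notna().sum()) if heights is not None else 0
print(f"{len(buildings)} footprints, {tagged} with height tags")
```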
Pierce Brosnan on set wearing his “faux cap” suit with optical markers, holding his helmet. The remainder of his costume was created digitally, as seen in the final shot.
Moving Black Adam
As lead effects vendor, it fell to Wētā to develop the character animation models and movement, which were then shared with the other vendors for creation of their own scenes. “We were engaged fairly early on, when Bill asked us to start doing motion studies – even before building any of the Shiruta environments,” Stopsack explains. “These were done, in part, to inform how they would be shot on the practical set, like how they engaged in flying action or how Hawkman would land.”
Hawkman actor Aldis Hodge did quite a few stunts himself, such as his dives into the Central Market on a wire, touching down. “We had him rehearse with counterweights attached to his costume to give him a sense of what the wings would feel like, informing his performance,” Westenhofer notes. He was also given lightweight cloth cutouts of the wings so the set team could understand their size and articulation, and so DP Sher could plan space in his frame for the future digital wings to have a home.
The motion studies also helped Wētā work with Costume Design and the Art Department to nail down costume design and motion. Says Stopsack, “Some characters that were not completely digital had costumes that needed to be practically built, such as Hawkman and Black Adam – Hawkman’s wing design, for instance, looking at their mechanics, how do the wings unfold? Things like that.” Other designs, like Dr. Fate’s costume, were completely digital, requiring more creative input from Wētā.
Director Jaume Collet-Serra, left, discusses a scene on set at Trilith Studios in Atlanta.
A key part of Black Adam’s motion involves his simple floating movement within a scene. “He not only flies, he floats,” Westenhofer explains. “In the comic books, he says he doesn’t want to share the ground with lesser beings. So, he feels like he should float. But we wanted Dwayne to be in the scene, and we didn’t want to have him always be bluescreen, having to shoot him looking at tennis balls. Jaume wanted it to be super smooth, not having to expend any effort, just floating.”
Special Effects Supervisor J.D. Schwalm was tasked with offering practical methods to accomplish the float. The main mechanism was provided by an industrial automobile robot used in car manufacturing. “That was the coolest one,” continues Westenhofer, “which we had mounted on the set and could be programmed to pick him up and float him and move him around,” with Johnson standing on the rig’s platform, his legs being replaced later and the rig removed. “It allowed him to act. When he’s floating down the stairs, passing the kid, he could do back and forth banter and actually be in the scenes with other characters. That was really important.”
For simpler shots, Schwalm provided a small robotic cart about 2½ feet by 2½ feet, with a robotic hydraulic arm containing a saddle and a small foot platform, allowing Johnson to be raised or lowered up to four feet versus the industrial robot, which could lift him as high as 15 feet. “These sorts of things could also be done using wires, but Dwayne found this really comfortable, and it allowed him to interact naturally,” Westenhofer notes.
Dwayne Johnson in the Sunken City set during a fight scene. Production Designer Tom Meyer’s Central Market was replicated by Wētā FX in set extensions.
For his flying sequences, the VFX team used a volume capture system provided by Eyeline Studios, a division of Scanline VFX. The system once again allowed Johnson’s performance to be captured in a method quite a bit different from motion capture. The actor would lie flat on the rig, surrounded by an array of hi-res video cameras (versus infrared, as would be used in mocap). Eyeline then processes the data in its proprietary system and provides an extrapolated mesh and a set of textures. Explains Stopsack, “When the data comes to us, we then have the geometry of his performance, of his head, and we have the texture that maps onto it.”
Wētā took the process a step further to retain Johnson’s natural head motion. “We took Eyeline’s mesh and tried to incorporate that into our full-blown Black Adam digital double,” Stopsack remarks. “We could then take their head motion data and combine that onto our puppet so that the head motion would track perfectly with our digital asset, with our digital head motion. But volume capture gives you limited body motion. If you have pretty intricate body motion, your head motion can quickly go off what the volume capture would allow, such as if the head goes backwards and you want an extreme that it won’t permit. So, our animators would then see those constraints and work within them to see how far we could bend the head back without going beyond what volume capture could support,” preventing the bend from appearing too rubbery, unlike a real person’s movement. Stopsack adds, “We used the technology for a small number of shots, but it was great when you needed the unmistakable likeness of the actor.”
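The constraint Stopsack describes, animators staying inside the envelope the volume capture can support, amounts to clamping the digital double’s head rotation. A minimal sketch follows; the per-axis limits and rotation convention are assumptions for illustration.

```python
# Minimal sketch: keep the digital double's head rotation inside the envelope
# the volume capture supports, so extremes don't go "rubbery".
# Limits and the (pitch, yaw, roll) convention are assumed, not Weta's values.
import numpy as np

CAPTURE_LIMITS = np.array([[-25.0, 40.0],   # pitch: limited tipping back
                           [-60.0, 60.0],   # yaw
                           [-30.0, 30.0]])  # roll

def clamp_head_rotation(euler_deg):
    """Clamp an animated head rotation (pitch, yaw, roll) to capture limits."""
    e = np.asarray(euler_deg, dtype=float)
    return np.clip(e, CAPTURE_LIMITS[:, 0], CAPTURE_LIMITS[:, 1])

# An animator asks for a hard tip back; the rig only supports part of it.
print(clamp_head_rotation([55.0, 10.0, 0.0]))  # -> [40. 10.  0.]
```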
In addition to Eyeline’s cameras, the actor was surrounded by LED walls playing back material created in Unreal, developed early on with Scanline and Digital Domain, which provided interactive lighting on Johnson’s costume. The backgrounds, of course, were replaced later. LED walls came in handy for other sequences, such as filming the cockpit scenes in the Hawk Cruiser as it crashes into the Central Market. “The cockpit set was too big to place it on a gimbal,” Westenhofer reveals. “Instead, we had the content playing back on the LED screen, which was designed as being from the point of view of the cockpit so they could see themselves flying through space and crashing, and it gave them enough inspiration to sway and move as the craft was bucking in space.” For lighting, he says, “It worked really well inside the cockpits. We did replace some backgrounds, but the interactive light worked really well.”
Aldis Hodge as Hawkman. The character’s wings were added digitally, though Hodge was provided lightweight cloth cutouts to allow the actor and on-set teams an idea of the space that would be taken up by the finished digital product.
Using LED walls is not something to do frivolously, Westenhofer notes. “A lot of people come to this and hope to get what they call ‘final pixel,’ meaning you film it and the shot is done. There needs to be a fair bit more work done to get LEDs to the point where that’s really successful. You need a lot more time in prep to build these CG backgrounds, but then no one can change their mind afterwards. If you do that, it’s baked into the sauce.”
Towards the end of the film, we see Teth-Adam’s backstory in a flashback revealing the death of his son, Hurut, before he became the “The Rock”-sized superhero. For those scenes, a much slimmer double (Benjamin Patterson) was used onto which Johnson’s face was later applied. “We’d have Dwayne do a pass just to get the performance, and then the double would come in and repeat the same timing and performance. So it would be his body,” Westenhofer explains.
Later, after the scene was cut together, Johnson’s head and face were captured by Lola Visual Effects using their “Egg” setup, a system somewhat similar to volume capture. “Dwayne would sit down in a chair surrounded by several cameras,” Westenhofer describes. “Lola had studied our footage and set up lighting timed to replicate interactively the way the light on set interacted with the double throughout the shot, using colored LEDs. They could tell Dwayne to ‘Look this way’ or ‘Get ready to turn your head over here,’ and they would time the playback so he’d give the performance and move his head, give the dialogue matching what we captured on set from the other actor. Then, that head is projected onto a 3D model and locked into the shot itself, so you have Dwayne’s head and the skinny actor’s body.”
Before and after shots of a battle sequence in the Sunken City show the extent of Wētā FX’s detailed design work in set extensions and effects animation.
For Pierce Brosnan’s character, Dr. Fate, it was the opposite case. Brosnan’s own performance was filmed on the set and his body was replaced. “When he’s flying, it’s all CGI,” says Westenhofer. “But when he’s on the ground interacting with other characters, his costume has more life than a practical costume would have, so the costume is digital.”
Instead of using motion capture where Brosnan would have been filmed alone on a mocap stage, a “faux cap” system was used. Brosnan appeared on set in a simple gray tracking suit. Explains Stopsack, “It doesn’t have a full-blown active marker setup as a motion capture setup would have. The suit is simply peppered with optical markers, which are not engaged with any computer system but simply photographed with witness cameras. Our Match Move Department then uses some clever algorithms to triangulate their location and extract motion. We needed to see Pierce’s performance, his persona as an actor on set engaging with all of these characters. Then the superhero suit followed after.”
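The matchmove math beneath the “faux cap” workflow is standard multi-view geometry: given two calibrated witness cameras that both see a marker, triangulate its 3D position. The sketch below uses OpenCV as a stand-in; Wētā’s in-house solver is not public, and the calibration values are toy numbers.

```python
# Triangulating a suit marker's 3D position from two calibrated witness
# cameras: the standard operation beneath the "faux cap" matchmove step.
# OpenCV stands in for Weta's in-house solver; calibration values are toys.
import numpy as np
import cv2

def triangulate_marker(P1, P2, uv1, uv2):
    """P1, P2: 3x4 projection matrices from witness-camera calibration.
    uv1, uv2: the marker's pixel coordinates in each view."""
    pts1 = np.asarray(uv1, dtype=float).reshape(2, 1)
    pts2 = np.asarray(uv2, dtype=float).reshape(2, 1)
    Xh = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4x1 homogeneous
    return (Xh[:3] / Xh[3]).ravel()

# Toy rig: two cameras a meter apart, both looking down +Z.
K = np.array([[1000.0, 0, 960], [0, 1000.0, 540], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
print(triangulate_marker(P1, P2, (980, 550), (780, 550)))  # ~ [0.1, 0.05, 5.0]
```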
I’ve been asked which of these characteristics that describe me (disabled with spinal muscular atrophy, Chinese-American, woman) have posed the biggest challenge in moving forward, and I’d say being a woman in this business. Even to this day, when I show up to set as a VFX supervisor, the first question I’m asked is “who are you here visiting?” It’s an everyday thing that will change with time. The more women are seen and empowered in senior roles, the less these trivial questions will come up. I took a leap of faith in starting my own company, and I am committed to achieving greater equity and opportunity for everyone in VFX.
I was a single working parent early in my career, and the issue of balancing a career and family is highly personal. I was able to figure out a way where I did not have to sacrifice one for the other – but so many parents, particularly women, feel backed into making that tough choice. Women in Animation is focused on the enormous need to provide job flexibility and more support for working parents and caregivers. The number of women who have had to walk away from their jobs because of the lack of childcare, its staggering cost and not enough options for hybrid work schedules is startling, and that has all been exacerbated by COVID. We need to do better and lift up this advocacy movement.
Growing up amidst war in the Democratic Republic of the Congo in central Africa, I made a decision to pursue art to inject life into something I drew with my own hands and give it back to the people. I created The Third Pole initiative, a CG education program, to work with youth in my home country and give them the tools and the mentorship to be powerful visual storytellers. We know how Western and Asian cultures tell their stories, but not as much how Africa would tell theirs and contribute to our collective global storytelling. It’s so important to be able to preserve our oral histories; our legends are vanishing in our own time.
The lack of female visual effects supervisors is definitely the result of a lack of opportunity and unconscious bias – and that is fixable. Earlier in my career I was told that the goal was to promote the male supervisors, and I watched as guys who had worked under my VFX supervision were promoted up the ranks and given opportunities on large VFX shows. It never occurred to me that my gender would hold me back, and I was always surprised when it did. I am a strong believer in diversity and inclusion, not just because I am a bi-racial woman, but because I believe that greater diversity leads to freer thinking and greater creativity.
Creating my film Mila was a life-changing experience, inspired by the stories my mother told me about how she felt as a child during the bombings of Trento in World War II. I fully embrace the power of animation. Hollywood might applaud socially relevant features, but it still views animation as essentially little more than “entertainment.” It has enormous potential to effect fundamental change in how we approach each other and how we deal with societal challenges. I believe that stories told through the magic of animation can move people and influence our future generations like nothing else can.
By TREVOR HOGG
Images courtesy of HBO.
Having the 10 scripts essentially written before shooting commenced helped in deciding which sets needed to be built practically and which digitally.
Unlike Game of Thrones, the prequel House of the Dragon, which revolves around the decline of Targaryen rule, has to live up to the expectations set by its predecessor, pushing the boundaries of high-end episodic visual effects to achieve filmic quality. The first season, consisting of 10 episodes, was able to take advantage of the new virtual production stage at Warner Bros. Leavesden Studios, with showrunners Ryan Condal and Miguel Sapochnik collaborating with Visual Effects Supervisor Angus Bickerton (The King’s Man) to achieve the necessary size and scope and many more dragons for the epic fantasy HBO series. (Sapochnik has since moved to an executive producer role.)
The goal was to create a dirtier, grungier and dustier environment than Game of Thrones, which occurs 130 years later.
Bickerton joined the project back in September 2020, and at that point the scripts for the 10 episodes were essentially written. “That’s an important thing to say because as we know all TV productions are still evolving as they’re going along. You need to have settled scripts in order to say, ‘These sequences are going to be done in the volume.’ If we wanted to shoot the interior of Storm’s End Castle in Episode 110, instead of 12 weeks in post to do that environment, we needed 12 weeks prior to shooting to build it in Unreal Engine for texturing, lighting, doing test plays in the volume, to make sure it was coming out right, and working with the DPs and art department to decide which bits we were going to put on the screens and what would be sets.”
Around 2,800 visual effects shots were created for the 10 episodes, ranging from tiny birds in the frame to dragons.
A key principle for dragons is that they keep growing.
Some of the street scenes were captured in Spanish and Portuguese locations, but the rest were shot either on the virtual production stage or in the backlot at Leavesden Studios. “We had an oval space with a 270-degree wraparound screen, and it’s about 65 to 70 feet wide by 85 feet deep,” Bickerton explains. “We hung doors to block off the rest of the oval so we could make an almost 360-degree volume. Above that, we have our ceiling, which was on panels so we could raise and lower them. Normally, you drop that ceiling just inside the wall height. Our screen was 25 feet high. When you’re inside and look up, the ceiling blends into the wall. It’s a balancing act. You have to find a position where it’s slightly [below] the wall height, but the 40 tracking cameras arranged around the screen still need to be able to get a view of the camera in order to real-time track the camera, in order to create the interactive environment on the screen.”
As with Game of Thrones, House of the Dragon features extensive smoke, fire and rain, which meant that special effects had to occur within the virtual production stage. “Once you’ve built this beautiful cathedral, the last thing you want is to start blowing smoke and have hot flames melt the LED panels,” Bickerton notes. “But we wanted candles, flame bars, driving rain and smoke. The first thing that we did was to concede some of the screens to create ventilation space for smoke.” Additional ventilation was placed under the screens so the air was constantly moving. “The screen was lifted above the flame bar element to get it further away from the flame,” Bickerton adds. “When it came to storm sequences, we had to figure out the orientation of our motion base so we could blow the smoke and rain atmosphere past the actors and it would go across the screen. We could have separate fans blowing it away from the screen as well as off-camera.”
Sunrises and sunsets can be shot over the course of days on a virtual production stage with the same lighting conditions being maintained.
An iconic prop that makes an appearance in House of the Dragon is the Iron Throne.
Special Effects Supervisor Michael Dawson and his team built a new motion base that could bank, pitch and rotate. “The motion base exceeded our expectations,” Bickerton remarks. “We got fast movement, good angle changes, could throw the actors around quite considerably and get shakes in their bodies. The Wirecam was more of a challenge to move around fast because you have to ramp up to speed, fly past an actor and ramp down again. [For the flying dragon shot in Episode 110], The Third Floor did the previs that was animated with much simpler dragon assets to make sure that we were doing the right dragon motion. The Third Floor’s simulation files were given to Pixomondo, which tweaked and revised the animation that was then given back to The Third Floor, which rebuilt it for the motion base, volume and camera, and we worked out what camera moves that we had to do with the actors to match the previs.”
Concept art by Kirill Barybin showing the scale of Prince Lucerys Velaryon and Arrax, which is a 14-year-old dragon.
The 2D concept art of Arrax was translated into a 3D blockout by Kirill Barybin.
A narrative principle is that dragons keep on growing. “They ultimately can’t bear their own weight,” Bickerton notes. “Vhagar, which is chasing Arrax, is meant to be 103 years old whereas Arrax is 14 years old. Whenever a new member of the Targaryen family is born a dragon is put in the crib with the child so that they develop a symbiosis. But there is only so much control that you have over these dragons. The shot where you see the big silhouette of Vhagar above Arrax was a signature image that we wanted going into the sequence to show the size of him. In terms of how the motion base moved, Arrax is flappier and smaller, so it has more aggressive motions whereas Vhagar is a huge beast and the motions are a lot more general.”
A dramatic action sequence is when Prince Lucerys Velaryon and Arrax are chased by Aemond Targaryen and Vhagar.
There were no static 2D matte paintings as the camera always had to be fluid. “The trick was to always have atmosphere-like particles in the air,” Bickerton reveals. “I remember working on our first environment and asked, ‘Should we add some birds?’ And it worked. There were birds all over the place. They were small in frame but were a key element in bringing life to the shot. Miguel wanted it to be dirtier, dustier, grungier than Game of Thrones because we are taking place 130 years before, so there was a lot of smoke, and King’s Landing has a nastier look.” Bickerton was given an eight-terabyte drive of assets from Game of Thrones by HBO that included the Red Keep and King’s Landing. Explains Bickerton, “They had been built by different facilities for each season, so we had about five or six different variations of the Red Keep and King’s Landing. Our Visual Effects Art Director, Thomas Wingrove, brought in the different models, and we came up with our own fully-realized 3D environment because we wanted to be able to come back to it and know where everything was. In Game of Thrones, they tended to add in bits when needed for each episode.”
“[Showrunner/director] Miguel [Sapochnik] wanted it to be dirtier, dustier, grungier than Game of Thrones because we are taking place 130 years before, so there was a lot of smoke, and King’s Landing has a nastier look. They had been built by different facilities for each season, so we had about five or six different variations of the Red Keep and King’s Landing. Our Visual Effects Art Director, Thomas Wingrove, brought in the different models, and we came up with our own fully-realized 3D environment because we wanted to be able to come back to it and know where everything was. In Game of Thrones, they tended to add in bits when needed for each episode.”
—Angus Bickerton, Visual Effects Supervisor
A signature shot is of the shadow of Vhagar flying above Arrax.
2D and 3D techniques were combined to create the disfigured face of King Viserys I Targaryen.
Around 2,800 visual effects shots were produced for the 10 episodes. “If you’re going to have a character who is 1/10th the screen size of a dragon, then it’s a digital double,” Bickerton states. “We used digital doubles for some of the fast action; otherwise it’s an element of someone on a motion base, if it’s dragon-riding. We tried to shoot an element for everything. There was quite a lot of face replacement for action and storm sequences.” All of the actors were scanned to various degrees, depending on how much of their performance was needed. Comments Bickerton, “In the tournament at the beginning of Episode 101, there are numerous face replacements. We had to do CG for half the face of King Viserys I Targaryen in Episode 108, towards the end of his final days. We did a lot of 2D warping and distortion to make his neck thinner and get his face to be gaunt. The bit I love is the sheer diversity of the work. There are so many different environments and dragon characters. That’s what I like.”
By TREVOR HOGG
Images courtesy of Prime Video and ILM.
The Martian terrain traveled by Oppy was given a reddish tint while the setting inhabited by Spirit had a bluish tint.
Considering the *batteries not included vibe, it is not surprising to learn that Amblin Entertainment was involved in producing the Prime Video documentary Good Night Oppy, which chronicles NASA’s successful development and launch of Mars rovers Opportunity and Spirit in 2003, with the former defying expectations by going beyond the 90-day mission and surviving for 15 years.
To re-enact what happened to the two rovers on the Red Planet, filmmaker Ryan White turned to ILM Visual Effects Supervisors Abishek Nair and Ivan Busquets to, in essence, produce an animated film to go along with present-day interviews and archival footage. “Ryan White wanted to make a real-life version of WALL·E [a small waste-collecting robot created by Pixar] in some ways, and mentioned during the project that E.T. the Extra-Terrestrial was his favorite film growing up and wanted to bring that emotion into it,” Nair explains. “For us, it was trying to maintain that fine balance of not going too Pixar, doing justice to the engineers who worked on the rover missions and forming a connection so that the viewers feel the same thing that the engineers went through when they were working with Opportunity and Spirit.”
Amongst the 34 minutes of full CG animation was the landing of the rovers on Mars.
“For us, it was trying to maintain that fine balance of not going too Pixar, doing justice to the engineers who worked on the rover missions and forming a connection so that the viewers feel the same thing that the [NASA] engineers went through when they were working with Opportunity and Spirit.”
—Abishek Nair, Visual Effects Supervisor, ILM
Creating a sense of a face was important in having the rovers be able to emote. “Early on in the show, Ryan was interested in exploring a range of emotions for these rovers and was doing that in parallel in sound and visual effects,” Busquets states. “He was trying to come up with a library of plausible looks so that we were not making a caricature. Even when animating the rovers, we observed the limitations of the joints and what the range of movement is. We did cycles of, ‘What does sad or older-looking-moving Oppy look like?’ It was all based on, ‘let’s use what’s in there.’ The most obvious example was using the pan cameras as eyeballs because from a physical position, they do resemble the eyeballs on a person.”
ILM created a view of Mars from outer space.
Data was provided by the NASA Jet Propulsion Laboratory. “The rovers themselves are the most accurate versions of Opportunity and Spirit,” Nair observes. “We would send turntables of the rovers to the JPL and they would point out certain things that felt a little off, like how the robotic arm would bend, and the decals/details on the rover itself. We built up the rovers with some of the stickers that were on the prototypes, and those were taken off when the rovers went to Mars. We had to keep all of those things in mind. It was a good symbiotic process. The engineers at JPL were excited that we were breathing life into the rovers.” The models for Opportunity and Spirit were the same but treated differently. “We respected the story, like when they needed to compensate for how Spirit was to be driven after one of the wheels broke,” Busquets states. “All of those animation cues were respected, so we did animate Spirit differently than Oppy. Then there are differences as to the environments that they were in, and those were kept realistic and true.”
“Early on in the show, [director] Ryan [White] was interested in exploring a range of emotions for these rovers and was doing that in parallel in sound and visual effects. He was trying to come up with a library of plausible looks so that we were not making a caricature. Even when animating the rovers, we observed the limitations of the joints and what is the range of movement. We did cycles of, ‘what does sad or older-looking-moving Oppy look like?’ It was all based on, ‘let’s use what’s in there.’ The most obvious example was using the pan cameras as eyeballs because from a physical position, they do resemble the eyeballs on a person.”
—Ivan Busquets, Visual Effects Supervisor, ILM
Both environments did not share the exact same color palette. “The Spirit side of the planet had more of a bluish hue to it whereas the Oppy side was redder,” reveals Nair. “Also, whenever you see the split-screen, Oppy is on screen left and Spirit is on screen right, and that was maintained throughout the documentary. There was always this visual reference as to who was where, who is doing what and even the direction that they move. Oppy would always go left to right while Spirit was right to left. We built in these little cues to psychologically know that right now you’re looking at Spirit not Oppy. As the story progressed, Spirit had a broken wheel so that helped.”
Adding to the drama was having the rovers get stuck in sandpits and trying to get out.
“The Spirit side of the planet had more of a bluish hue to it whereas the Oppy side was redder. Also, whenever you see the split-screen, Oppy is on screen left and Spirit is on screen right, and that was maintained throughout the documentary. There was always this visual reference as to who was where, who is doing what and even the direction that they move. Oppy would always go left to right while Spirit was right to left. We built in these little cues to psychologically know that right now you’re looking at Spirit not Oppy. As the story progressed, Spirit had a broken wheel so that helped.”
—Abishek Nair, Visual Effects Supervisor, ILM
Four major dust variants were created for Spirit and Oppy. “As the shots progressed, we started running effects simulations and dust maps on it so we could turn them up or down depending on the shots themselves,” Nair notes. There was not a lot of room for creative license. “Normally we would go with what makes for a more cinematic shot, but with this being a documentary we kept it grounded in reality as much as possible,” Busquets states. “A place where we did make a concession was when it came to the speed. The maximum speed of the rovers was something like two inches per second. It became obvious when we started animating that we were not going anywhere. How are we going to tell a story with that?”
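Nair doesn’t spell out the mechanics, but the “turn them up or down” idea maps naturally to pre-authored dust masks scaled by a per-shot dial. A minimal sketch in Python, assuming a simple texture-space blend; the function, array layout and variant count are illustrative assumptions, not ILM’s actual pipeline:

```python
# Hypothetical "dialable" dust layering, loosely inspired by the approach
# described above. All names and structures are illustrative only.
import numpy as np

def apply_dust(base_albedo, dust_masks, dust_tints, dials):
    """Blend pre-authored dust variant masks over a clean base texture.

    base_albedo: (H, W, 3) float array, the clean rover texture.
    dust_masks:  list of (H, W) float masks in [0, 1], one per dust variant.
    dust_tints:  list of (3,) float arrays, the dust color of each variant.
    dials:       list of floats in [0, 1], per-shot strength of each variant.
    """
    result = base_albedo.copy()
    for mask, tint, dial in zip(dust_masks, dust_tints, dials):
        weight = (mask * dial)[..., None]                  # scale mask by the dial
        result = result * (1.0 - weight) + tint * weight   # lerp toward dust color
    return np.clip(result, 0.0, 1.0)
```

Animating those dials across a sequence would then let artists age the rovers gradually without reworking the underlying maps per shot.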
A critical part of making the imagery believable was incorporating photographic aberrations such as lens flares.
Since visual effects was a new area for Ryan White, ILM produced storyboards and previs that also aided editorial. “The documentary style of filmmaking is different from feature film,” Nair observes. “We had to make sure that we get some fairly detailed storyboards going for key shots at least and rough storyboards for the rest that we would be doing, which would then inform us in terms of the beats, length of the shots and how it’s sitting in the edit. When it came to the particular shot of Oppy getting her wheel stuck in the sand, we had some fairly detailed storyboards, but then we went through quite a bit of postvis animation to get the idea across of the wheel spinning. We also had to work with some clever camera angles that would tell the story. We were working within a timeframe and budget and trying to make sure that visually it was telling the story that was supposed to be told there. There were pockets of sand simulation that we did early on to show the wheel spinning and kicking out of the sand. We showed that to Ryan, who was excited about it, and then we brought in all of those little animation cues of Oppy struggling trying to go in reverse and get out of that sandpit.”
The pan cameras on the rovers were treated as if they were eyes, which helped to give them a personality.
Sandstorms had to be simulated. “We had photographic reference of sandstorms on Mars, so we knew exactly what it would look like,” Nair explains. “We’ve done sandstorms before on various movies, but we had to make sure that these would actually happen on Mars: the little electrical storms that happen within them that have bolts of lightning. That’s where we could bring a little bit of drama into the whole thing by having the bolts of lightning and closeups of Oppy staring up at the sandstorm and lightning flashes on her face. There were tons of auxiliary particles flying around the area around her and tons of sand bleeding off her face and solar panels. We did run that through layers of simulations and then threw the whole kitchen sink at it and started peeling back to see what we could use and omit to bring the storytelling back into the whole thing.”
“The number of unique locations, from their landing sites to the journeys, to the different craters that they visit, the amount of nuance and rocks and different type of terrain, everybody involved felt there was something special about building something not based on concept art but scientific data. However, you want to make it as photographic and exciting as possible. There was a lot of pride I saw in the team in doing that.”
—Ivan Busquets, Visual Effects Supervisor, ILM
The edit was a work in progress. “What was challenging and unique about this project was being involved from an early stage, when they hadn’t finished all of their interviews,” Busquets remarks. “Ryan had some ideas for the chapters that he wanted to cover. We helped to inform the edit as much as the edit helped to inform our work. It made things a bit slower to progress, and we had to rely on rough animation and previs to feed editorial.”
Four major dust variants were created for Spirit and Oppy.
No practical plates were shot for the 34 minutes of full CG animation. “We asked to be sent to Mars to shoot some plates and were told that it would be too expensive!” laughs Busquets. “We did get a ton of data from NASA including satellite images from orbiters that have been sent to Mars. It was the equivalent of Google Earth but at a lower resolution. All of the environments that you see in the documentary are based on the real locations the rovers visited.” ILM had to fill in the gaps and could not use the actual imagery because it was not high enough resolution for 4K. A cool moment to create was Oppy taking a selfie. “It was a fun sequence to do, and we followed the same arc of the cameras so Oppy could actually take the photographs,” Nair comments. “We did have reference of the separate images that were stitched together. We got our snapshots within that particular shot very close to what was actually taken. In the documentary we made it black and white and grainier compared to the other shots.”
Electrical storms had to be incorporated inside of the sandstorms that occur on Mars.
One of the most complex shots was depicting the solar flares hitting the spacecraft as it travels to Mars. “As an idea, it was storyboarded in a simple manner, and when we started looking at it we figured out that it wasn’t going to show the scale and the distance that these flares would travel or the danger that the rovers were in,” Nair states. “Working the timing of the camera move to the sun with the burst of flare energy… The camera takes over from there, follows the flare energy hitting the spacecraft and swivels around. That whole thing took a bit to plan out. It was a leap of faith as well because Ryan didn’t want it to look too Transformers in a way. We had to keep things still believable but at the same time play around a little bit and have some fun with the whole thing. It’s one of our longest shots in the show as well. As for the other challenges, it was a documentary format where the edit was fluid, and we had to make sure it would conform with our timeline and the scope of work that was left to do.”
The environmental work was extensive. “The number of unique locations, from their landing sites to the journeys, to the different craters that they visit, the amount of nuance and rocks and different type of terrain, everybody involved felt there was something special about building something not based on concept art but scientific data,” Busquets remarks. “However, you want to make it as photographic and exciting as possible. There was a lot of pride I saw in the team in doing that.”
By TREVOR HOGG
Images courtesy of Marvel Studios and Digital Domain.
Production Special Effects Supervisor Daniel Sudick and his special effects teams built a 30- to 40-foot section of the boat deck that was 15 to 20 feet up in the air.
Third acts are never easy as this is what the audience has been waiting for, and when it comes to the Marvel Cinematic Universe there has been a plethora of epic battles, making it even more difficult to come up with something that has not been seen before. In Black Panther: Wakanda Forever, the Wakandans take a ship out into the ocean and successfully lure the underwater-dwelling Talokanil into a massive confrontation while the newly crowned Black Panther does single combat with Namor in a desert environment. States Hanzhi Tang, VFX Supervisor at Digital Domain, “I knew this movie was important, and having met [director] Ryan Coogler on set, you want him to succeed as he’s the nicest person. I’ve known [Marvel Studios VFX Supervisor] Geoffrey Baumann for a long time, so we already had a good working relationship; he trusted us with trying to help him navigate whatever surprises would come up.”
The rappelling of the Dora Milaje was influenced by a dance troupe.
A back-and-forth between Digital Domain and Wētā FX ensured that their shots were seamlessly integrated with each other.
“We started off in the Atlantic Ocean and shared some parts with Wētā FX, which had already figured out the underwater and deep ocean looks. Digital Domain kept to above the surface and a couple of shots where characters had to go in and out of the water. There was a back and forth between us to synchronize with each other as to the camera and the location of the water plane. Then we would do everything from the water surface and upwards. Then one of us had to do the final composite and blend the two together. Luckily, when the camera hit that water plane it acts like a wipe.”
—Hanzhi Tang, VFX Supervisor, Digital Domain
“We started off in the Atlantic Ocean and shared some parts with Wētā FX, which had already figured out the underwater and deep ocean looks,” Tang explains. “Digital Domain kept to above the surface and a couple of shots where characters had to go in and out of the water.” For some of the underwater shots, Wētā FX provided the virtual camera as a first pass. “There was a back and forth between us to synchronize with each other as to the camera and the location of the water plane,” Tang details. “Then we would do everything from the water surface and upwards. Then one of us had to do the final composite and blend the two together. Luckily, when the camera hit that water plane it acts like a wipe.” A giant set piece was constructed for the boat. “A 30- to 40-foot section of the boat deck was built that was 15 to 20 feet up in the air,” reveals Tang. “It was built as a rig that could tilt up to 45 degrees, because there is a point in the movie where the boat gets attacked and almost rolls over. People could slide down the deck. [Production Special Effects Supervisor] Dan Sudick and his special effects team had built one big in-ground tank to film people in the water, and separately this deck. As far as the water interaction on the deck, it was all CG.”
The Talokanil were supposed to have bare feet, which were inserted digitally for safety reasons.
A major task was digitally adding the rebreather masks worn by the Talokanil.
Plates were shot for the foreground elements with various bluescreens placed in the background. “All the way back was a set extension that was blended into the foreground,” Tang remarks. “Everyone in the background is a digital double.” The rappelling of the Dora Milaje was influenced by a dance troupe. Describes Tang, “There is a vertical wall where everyone does dance moves on cables that was the inspiration for the Dora Milaje being suspended. The whole thing was shot horizontally with them dangling off of cables. It was incredible.” The skies were art directed. “There was a lot of picking and choosing of the type of day and clouds,” Tang comments. “It ended up being a combination of CG and matte-painted clouds. The style of the on-set lighting by Autumn Durald Arkapaw, the cinematographer, was soft, and she would wrap the lighting around characters and give them a lovely sheen on their skin.”
“A 30- to 40-foot section of the boat deck was built that was 15 to 20 feet up in the air. It was built as a rig that could tilt up to 45 degrees, because there is a point in the movie where the boat gets attacked and almost rolls over. People could slide down the deck. [Production Special Effects Supervisor] Dan Sudick and his team built one big in-ground tank to film people in the water, and separately this deck. As far as the water interaction on the deck, it was all CG.”
—Hanzhi Tang, VFX Supervisor, Digital Domain
Shuri transports a captured Namor to a desert environment where they engage in single combat.
Blue-skinned characters, such as the Talokanil, against bluescreen are always a fun challenge, Tang reports. “Greenscreen would have been worse with the amount of spill, given that it was meant to be a blue-sky reflection,” he states. “We ended up doing roto on everything. The set is 20 feet in the air, people are being sprayed down with water, and there are all of these cables that need to be painted out. When the Talokanil board, you have 20 stunt people climbing the boat, and there’s no perimeter fence around this thing. For safety reasons, everyone had to wear decent footwear, and these characters were meant to be barefoot. They did not do the rubber feet that Steve Rogers wears in Captain America: The First Avenger, so we ended up tracking and blending CG for feet replacements. We also had to track and replace rebreather masks because the Talokanil wear them when they’re out of the water. It fits over the mouth and the gills on the neck. Those were impractical to wear while running around and trying to perform the stunts.”
“All the way back [for the rappelling of the Dora Milaje sequence] was a set extension that was blended into the foreground. Everyone in the background is a digital double. There is a vertical wall where everyone does dance moves on cables that was the inspiration for the Dora Milaje being suspended. The whole thing was shot horizontally with them dangling off of cables. It was incredible.”
—Hanzhi Tang, VFX Supervisor, Digital Domain
Bluescreen made more sense than greenscreen as it provided the correct blue spill that would have been caused by the sky.
Namor (Tenoch Huerta) is captured and Shuri flies him off into the desert because he gains his power from the ocean. “They have a one-on-one fight, and there was a lot of set extension and cleanup of the background,” Tang remarks. “We put the sky and the sun in the right place.” A flamethrower was utilized on set for the desert explosion. “But it wasn’t anywhere near the size of the actual explosion in the movie. It was used for exposure, color and scale reference of how that size flame appears through the camera,” Tang says. The flying shots of Namor were sometimes the most difficult to achieve, he adds. “In some of the shots we would have captured Tenoch Huerta on bluescreen, and he’ll do some closeup acting,” Tang observes. “We tried some wirework that looked too much like wirework and ended up doing a half-body replacement from the chest down. They captured a lot of shots with him in a tuning fork and being pulled around the set, so it was a lot of paint-out for the tuning fork and all of the gear on it. It’s suitable for waist-up shots. Tenoch just had a pair of shorts, which means there’s not much to hide, and when doing extensive paint-out on skin, you can end up with large patches that can be easily seen.”
By IAN FAILES
Several sequences featuring the Giganotosaurus in Jurassic World Dominion made use of an animatronic head section on set. (Image courtesy of Universal Pictures and ILM)
For final shots, ILM would often retain the entire animatronic head of the dinosaur and add in the rest of the dinosaur body. (Image courtesy of Universal Pictures and ILM)
How do you put yourself into the shoes, or feet, of a Giganotosaurus? What about an advanced chimpanzee or a bipedal hippo god? And how do you tackle a curious baby tree-like humanoid? These are all computer-generated characters with very different personalities featured in films and shows released in 2022, and ones that needed to be brought to life in part by teams of animators. Here, animation heads leading the charge at ILM, Wētā FX, Framestore and Luma Pictures share how their particular creature was crafted and what they had to do to find the essence of that character.
When your antagonist is a Giganotosaurus
When Jurassic World Dominion Animation Supervisor Jance Rubinchik was discussing with director Colin Trevorrow how the film’s dinosaurs would be brought to the screen, he reflected on how the animals in the first Jurassic Park “weren’t villains, they were just animals. For example, the T-Rex is just curious about the jeep, and he’s flipping it over, stepping on it and biting pieces off of it. He’s not trying to kill the kids. I said to Colin, ‘Can we go back to our main dinosaur – the Giganotosaurus – just being an animal?’ Let’s explore the Giga being an animal and not just being a monster for monster’s sake. It was more naturalistic.”
With Trevorrow’s approval for this approach, Rubinchik began work on the film embedded with the previs and postvis teams at Proof Inc., while character designs also continued. This work fed into both the animatronic builds by John Nolan Studio and Industrial Light & Magic’s CG Giganotosaurus. “Early on, we did lots of walk cycles, run cycles and behavior tests. I personally did tests where the Giga wandered out from in between some trees and was shaking its head and snorting and looking around.”
Another aspect of the Giganotosaurus was that ILM would often be adding to a practical/animatronic head section with the remainder of the dinosaur in CG. For Rubinchik, it meant that the overall Giga performance was also heavily influenced by what could be achieved on set. Comments Rubinchik, “What I didn’t want to have were these practical dinosaurs that tend to be a little slower moving and restricted, simply from the fact they are massive hydraulic machines, that then intercut with fast and agile CG dinosaurs. It really screams, ‘This is what is CG and this is what’s practical.’”
A full-motion CG Giganotosaurus crafted by ILM gives chase. (Image courtesy of Universal Pictures and ILM)
Animation Supervisor Jance Rubinchik had a hand in ensuring that the movement of animatronic dinosaurs made by John Nolan Studio, as shown here, were matched by their CG counterparts. (Image courtesy of Universal Pictures and ILM)
“Indeed,” Rubinchik adds, “sometimes as animators, you have all these controls and you want to use every single control that you have. You want to get as much overlap and jiggle and bounce and follow through as you can because we’re animators and that’s the fun of animating. But having something that introduced restraint for us, which was the practical on-set dinosaurs, meant we were more careful and subtler in our CG animation. There’s also a lot of fun and unexpected things that happen with the actual animatronics. It might get some shakes or twitches, and that stuff was great. We really added that stuff into Giga wherever we could.”
For Pogo shots in season 3 of The Umbrella Academy, on-set plates featured actor Ken Hall. (Image courtesy of Netflix and Wētā FX)
Voice performance for Pogo was provided by Adam Godley (right), while Wētā FX animators also contributed additional performance capture. (Image courtesy of Netflix and Wētā FX)
The final Pogo shot. (Image courtesy of Netflix and Wētā FX)
“What I didn’t want to have were these practical dinosaurs that tend to be a little slower moving and restricted, simply from the fact they are massive hydraulic machines, that then intercut with fast and agile CG dinosaurs. It really screams, ‘This is what is CG and this is what’s practical.’ … But having something that introduced some restraint for us, which was the practical on-set dinosaurs, meant we were more careful and subtler in our CG animation. There’s also a lot of fun and unexpected things that happen with the actual animatronics. It might get some shakes or twitches, and that stuff was great. We really added that stuff into Giga wherever we could.”
—Jance Rubinchik, Animation Supervisor, Jurassic World Dominion
This extended even to the point of replicating the animatronic joint placements from the John Nolan Studio creatures into ILM’s CG versions. “All of the pivots for the neck, the head, the torso and the jaw were in the exact same place as they were in the CG puppet,” Rubinchik outlines. “It meant they would pivot from the same place. I was so happy with how that sequence turned out with all the unexpected little ticks and movements that informed what we did.”
Pogo reimagined
The advanced chimpanzee Pogo is a CG character viewers first met in Seasons 1 and 2 of Netflix’s The Umbrella Academy as an assistant to Sir Reginald Hargreeves, and as a baby chimp. The most recent Season 3 of the show sees Pogo appear in an alternative timeline as a ‘cooler’ version of the character who even becomes a biker and tattoo artist. Wētā FX created each incarnation of Pogo, which drew upon the voice of Adam Godley, the on-set performance of Ken Hall and other stunt performers and stand-ins to make the final creature.
Having ‘lived’ with Pogo in his older, more frail form in the past seasons, Wētā FX Animation Supervisor Aidan Martin and his team now had the chance to work on a character who was capable of a lot more physically, including Kung Fu. “All of a sudden, Pogo’s been in the gym. He’s juiced up. He’s doubled his shoulder muscle mass and his arms are a lot bigger, so the way that he carries himself is completely different. His attitude has changed, too. He’s more gnarled and he’s a lot more jaded about the world,” Martin says.
From an animation point of view, Wētā FX animators took that new physicality into the performance and reflected it in postures and movements. “It was even things like the way he looks at somebody now,” Martin explains. “Early on in Season 1, when he looks at people, he’s very sincere. He was like a loving grandfather. Now, he’s a bit fed up with it all and he’s not looking at you with good intentions. He thinks you’re an idiot and he doesn’t have time for it. That’s where he’s coming from behind the mask.”
Pogo is a grittier character in this latest season, even working as a tattoo artist. (Image courtesy of Netflix and Wētā FX)
“All of a sudden, Pogo’s been in the gym. He’s juiced up. He’s doubled his shoulder muscle mass and his arms are a lot bigger, so the way that he carries himself is completely different. His attitude has changed, too. He’s more gnarled and he’s a lot more jaded about the world.”
—Aidan Martin, Animation Supervisor, Wētā FX
One of the VFX studio’s toughest tasks on this new Pogo remained the character’s eyes. “Eyeline is everything, especially with chimps,” says Martin, who also had experience on the Planet of the Apes films at Wētā FX. “When you’re trying to do a more anthropomorphized performance, chimps with their eyelines and brows do not work very well compared to humans because their eyes are just so far back and their brows sit out so far. For example, as soon as you have the head tilt down and then try to make them look up, you can lose their eyes completely. Balancing the eyeline and the head angle is really difficult, especially on chimps.”
“Even once you’ve got that working, getting the mouth shapes to read properly is also tricky,” Martin continues. “There are some really tricky shapes, like a ‘V’ and an ‘F,’ that are incredibly hard on a chimp versus a human. Their mouths are almost twice as wide as our mouths. Humans look really good when they’re talking softly, but getting a chimp to do that, it looks like they’re either just mumbling or they get the coconut mouth, like two halves clacking together, and everything’s just too big. We used traditional animation techniques here, basically a sheet of phoneme expressions for Pogo’s mouth.”
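Martin’s “sheet of phoneme expressions” is a traditional lip-sync device: a lookup from phonemes to mouth poses. A toy version in Python, assuming blendshape-style controls; every shape name and weight below is invented for illustration, not Wētā FX’s rig:

```python
# Illustrative phoneme-to-viseme table in the spirit of the "sheet of
# phoneme expressions" described above. Shape names and weights are made up.
VISEME_TABLE = {
    "AA":   {"jaw_open": 0.8, "lips_wide": 0.2},
    "EE":   {"lips_wide": 0.7, "jaw_open": 0.2},
    "FF":   {"lip_lower_tuck": 0.9},                  # the tricky 'F' shape
    "VV":   {"lip_lower_tuck": 0.8, "jaw_open": 0.1},
    "MM":   {"lips_closed": 1.0},
    "REST": {},
}

def pose_for_phoneme(phoneme, strength=1.0):
    """Return blendshape weights for a phoneme, scaled for a softer read.

    Keeping strength below 1.0 is one plausible way to avoid the exaggerated
    'coconut mouth' on a mouth twice as wide as a human's.
    """
    base = VISEME_TABLE.get(phoneme, VISEME_TABLE["REST"])
    return {shape: weight * strength for shape, weight in base.items()}
```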
Going hyper (or hippo) realistic
Finding the performance for a CG-animated character often happens very early on in a production, even before any live action is shot. In the case of the Marvel Studios series Moon Knight’s slightly awkward hippo god Taweret, it began when Framestore was tasked with translating the casting audition of voice and on-set performer Antonia Salib into a piece of test animation.
Actor Antonia Salib performs the role of hippo god Taweret on a bluescreen set. (Image courtesy of Marvel and Framestore)
Final shot of Taweret by Framestore. (Image courtesy of Marvel and Framestore)
“The Production Visual Effects Supervisor, Sean Andrew Faden, asked us to put something together as if it was Taweret auditioning for the role,” relates Framestore Animation Supervisor Chris Hurtt. “We made this classic blooper-like demo where we cut it up and had the beeps and even a set with a boom mic. We would match to Antonia’s performance with keyframe animation just to find the right tone. We would later have to go from her height to an eight- or nine-foot-tall hippo, which changed things, but it was a great start.”
Salib wore an extender stick during the shoot (here with Oscar Isaac) to reach the appropriate height of Taweret. (Image courtesy of Marvel and Framestore)
Framestore had to solve both a hippo look in bipedal form and the realistic motion of hair and costume for the final character. (Image courtesy of Marvel and Framestore)
“We looked at a lot of reference of real hippos and asked ourselves, ‘What can we take from the face so that this doesn’t just feel like it’s only moving like a human face that’s in a hippo shape?’ We found there were these large fat sacks in the corners that we could move, and it made everything feel a little more hippo-y and not so human-y. Probably the biggest challenge on her was getting Taweret to go from a human to a hippo.”
—Chris Hurtt, Animation Supervisor, Framestore
During filming of the actual episode scenes, Salib would perform Taweret in costume with the other actors and with an extender stick and ball markers to represent the real height of the character. As Hurtt describes, Framestore took that as reference and looked to find the right kind of ‘hippoisms’ on Salib. “We looked at a lot of reference of real hippos and asked ourselves, ‘What can we take from the face so that this doesn’t just feel like it’s only moving like a human face that’s in a hippo shape?’ We found there were these large fat sacks in the corners that we could move, and it made everything feel a little more hippo-y and not so human-y.”
“Probably the biggest challenge on her was getting Taweret to go from a human to a hippo,” adds Hurtt, who also praises the Framestore modeling, rigging and texturing teams in building the character. “The main thing for animation was that we had to observe what the muscles and the FACS shapes were doing on Antonia, and then map those to the character. Still, you’re trying to hit key expressions without it looking too cartoony.”
To help realize the motion of Taweret’s face shapes in the most believable manner possible, Framestore’s animators relied on an in-house machine learning tool. “The tool does a dynamic simulation like you would with, say, hair, but instead it would drive those face shapes,” Hurtt notes. “It’s not actually super-noticeable, but it’s one of those things that, if you didn’t have it there, particularly with such a huge character, she would’ve felt very much like papier-mâché when she turned her head.”
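Framestore’s tool is proprietary and machine-learning driven, but the behavior Hurtt describes, face shapes that lag and settle the way simulated hair does, can be approximated with something as simple as a damped spring follower on each blendshape weight. A rough stand-in with made-up constants, not the studio’s implementation:

```python
# Minimal damped-spring follower: the animated face-shape weight becomes the
# spring target, and the simulated value lags and overshoots slightly,
# adding secondary motion when the head turns. Constants are arbitrary.
def simulate_shape_weight(targets, dt=1.0 / 24.0, stiffness=90.0, damping=12.0):
    """Run a damped spring over per-frame target weights for one face shape."""
    value, velocity = targets[0], 0.0
    out = []
    for target in targets:
        accel = stiffness * (target - value) - damping * velocity
        velocity += accel * dt          # semi-implicit Euler integration
        value += velocity * dt
        out.append(value)
    return out
```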
The enduring, endearing allure of Groot
The Marvel Studios Guardians of the Galaxy films have borne several CG-animated characters, one of the most beloved being Baby Groot. He now stars in his own series of animated shorts called I Am Groot, directed by Kirsten Lepore, with visual effects and animation by Luma Pictures. The fully CG shorts started with a script and boarding process driven by Lepore, according to Luma Pictures Animation Director Raphael Pimentel.
Luma Pictures Animation Director Raphael Pimentel donned an Xsens suit (and Baby Groot mask) for motion capture reference at Luma Pictures during the making of I Am Groot. (Image courtesy of Luma Pictures)
The behavior settled on for Baby Groot, which had been featured in previous Marvel projects, was always ‘endearing.’ (Image courtesy of Marvel and Luma Pictures)
“There were scripts early on showing what the stories were going to be about. These quickly transitioned into boards. Then, Kirsten would provide the boards to us with sound. She would put them to music as well, which was important to get the vibe. These would then be turned over to us as an animatic of those boards with the timing and sound that Kirsten envisioned, which was pretty spot-on to the final result.”
Baby Groot’s mud bath in one of the shorts required extensive cooperation between the animation and FX teams at Luma Pictures. (Image courtesy of Marvel and Luma Pictures)
Baby Groot still delivers only one line: “I am Groot.” (Image courtesy of Marvel and Luma Pictures)
In terms of finding the ideal style of character animation for Groot in the shorts, Luma Pictures shot motion capture as reference for its animators, which was used in conjunction with video reference that Lepore also provided, and vid-ref shot by the animators themselves. The motion capture mainly took the form of Pimentel performing in an Xsens suit. “We went to Luma and identified the key shots that we wanted to do for every episode,” Pimentel recalls. “We would do one episode each day. As we were going through those key shots, we ended up shooting mocap for everything. Kirsten was there telling me the emotions that she wanted Groot to be feeling at that specific point in time. And we said, ‘Let’s keep going, let’s keep going.’ Next thing you know, we actually shot mocap for everything to provide reference for the animators.”
In one of the shorts, “Groot Takes a Bath,” a mud bath results in the growth of many leaves on the character, which he soon finds ways to groom in different styles. This necessitated a close collaboration between animation and effects at Luma. “That was a technical challenge for us,” Pimentel explains. “In order for Kirsten to see how the leaves were behaving, she would usually have to wait until the effects pass. We built an animation rig that was very robust that would get the look as close to final as possible through animation.”
The final behavior settled on for Baby Groot in the shorts was “endearing,” Pimentel notes. Despite Groot’s temper tantrums and ups and downs, he was still kept sweet at all times. From an animation standpoint, that meant ensuring the character’s body and facial performances stayed within limits. “It’s easy to start dialing in the brows to be angry, but we had to keep the brows soft at all times. And then his eyes are always wide to the world. Regardless of what’s happening to him, his eyes are always wide to the world, much like a kid is.”
By TREVOR HOGG
Images courtesy of MUBI.
Partial set build, background plate photography, 3D model of mountaintop and depth pass are combined to create an aerial shot.
“The mountain [where Seo-rae’s husband falls to his death] is 100% CG, but the background where that peak is situated is a real scene that we shot. There are a ton of mountains in Korea, so it’s a composite of these two. The two main locations of the mountains and sea have this unique form that goes up and down and up and down. We wanted to repeatedly show such up and down patterns, like a wave in an ocean or the landscape of the mountain range, but at the same time we didn’t want to make it too obvious for the audience to say, ‘Ah-ha! I see that.’”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Ever since the release of Oldboy, Lee Joen-hyoung, who serves as the CEO and VFX Supervisor at Korean VFX studio 4th Creative Party, has been collaborating with filmmaker Park Chan-wook. Decision to Leave, which revolves around a detective becoming infatuated with a murder suspect, seems to be a less likely candidate for extensive digital augmentation because of the subject matter, but this was not the case. “This was the easiest read of all of director Park’s screenplays,” Lee recalls. “However, in the end, the work that I had to do was the toughest because unlike The Handmaiden or Oldboy, for which I was able to come up with the imagination straightaway in terms of the visuals and mise-en-scène, Decision to Leave was so ambiguous.” About 580 shots were created over a period of six months. “We were done at the end of 2021, but then we had some time left before Cannes and the actual release, so we did some detailing work with only a handful of people to make it even more perfect,” Lee remarks.
Invisible effects include adding photographs to the wall devoted to unsolved crimes, created by Hae-joon.
Unwanted natural elements had to be removed from the finale, which took place in a beach environment that was, in reality, three different locations combined. “Jang Hae-joon’s portion of it was shot at the end of fall, entering into the winter season, so we started to have some snow,” Lee states. “For the sake of continuity, we had to take out snow and also had to work on the mountain that you can see from afar. Even though we had to remove the snow and wind, Tang Wei, the actor, still felt those harsh conditions, which reflected the emotional state of Hae-joon. With Seo-rae’s portion, there was no problem because the time of day was different.” Atmospherics were also digitally added into shots. “We had to have mist in the latter part of the film because it’s set against Ipo, which is famous for mist and being humid all of the time,” Lee adds. “Mist had to be present in almost all of the outdoor scenes, but we had to define how much for a particular scene.”
“Director Park likes to use insects in his movies, such as the ants in Oldboy or the mosquitoes in Lady Vengeance or the ladybug in I’m a Cyborg, But That’s OK. I knew even before that he was going to put some kind of insect in this film, too. We already have a vast library filled with insects and their forms and movements, so we were well-equipped to execute that.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Insects are always featured in the films of Park Chan-wook, with CG ants crawling over the face of Seo-rae’s dead husband.
“Hae-joon tries to replicate what Seo-rae would have done to kill her husband, and he goes up the mountain, lies down and looks up. Then Seo-rae’s hand comes in and they hold hands together. Director Park told me that the audience should be able to see the callus on her palm because it’s evidence that she is already an expert climber. It was difficult to visually make that happen because when those two hands meet together the palm becomes a little bit dark, so we had to do several retakes. That one scene was the most challenging for me.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
The mountain where Seo-rae’s husband falls to his death was partially built on a backlot set surrounded by bluescreen. “The mountain is 100% CG, but the background where that peak is situated is a real scene that we shot,” Lee reveals. “There are a ton of mountains in Korea, so it’s a composite of these two. The two main locations of the mountains and sea have this unique form that goes up and down and up and down. We wanted to repeatedly show such up and down patterns, like a wave in an ocean or the landscape of the mountain range, but at the same time we didn’t want to make it too obvious for the audience to say, ‘Ah-ha! I see that.’” Ants crawl over the face of the deceased spouse. “Director Park likes to use insects in his movies, such as the ants in Oldboy or the mosquitoes in Lady Vengeance or the ladybug in I’m a Cyborg, But That’s OK. I knew even before that he was going to put some kind of insect in this film, too. We already have a vast library filled with insects and their forms and movements, so we were well-equipped to execute that,” Lee notes.
The x-ray of an arm and hand transitions into the arm and hand of Hae-joon, emphasizing that he is still thinking of Seo-rae even when having an intimate moment with his wife.
A clever shot transition moves from the x-ray of a hand to the one belonging to Hae-joon as he is having sex with his wife in bed. “I have already accumulated so much experience with Park Chan-wook-esque transitions!” Lee laughs. “We knew what the output should look like because it was worked out in the storyboarding phase and we subsequently shot the source material. That transition was a nod to what we had already done in I’m a Cyborg, But That’s OK. As a long-time collaborator, I already knew what color director Park likes for the x-ray and the timing for the movement of the hand. That transition was the symbol of how Hae-joon is really with Seo-rae even though he is physically next to his wife.” The growing emotional bond between the detective and the murder suspect is visually emphasized in the interrogation scenes. “We wanted the audience to see something happen that is not physically possible,” Lee describes. “For that we had four characters because there were two in front and two in the reflection of [the mirror]. We shot the real people, then the reflection pass, and composited these two together so that we were able to control the focusing and defocusing in order to fully realize director Park’s intention and vision for the scene.”
“That transition [shot of the x-ray of a hand to the one belonging to Hae-joon as he is having sex with his wife in bed] was a nod to what we had already done in I’m a Cyborg, But That’s OK. As a long-time collaborator, I already knew what color director Park likes for the x-ray and the timing for the movement of the hand. That transition was the symbol of how Hae-joon is really with Seo-rae even though he is physically next to his wife.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Bunam Beach in Samcheok, Hakampo Beach and Magumpo Beach in Taean were combined to create the environment that appears in the finale.
Reflections and monitors were manipulated during the interrogation scenes to visually show that Seo-rae and Hae-joon are becoming emotionally closer to each other.
Driving shots are commonplace in Korean television series and films. “For our film, we wanted to make sure that the windshield of the car and the reflections on the car and how the lights will change inside of the space would be recognizable to the audience,” Lee remarks. “We had to make sure that the lighting and reflections worked perfectly; that was our full intention. Since our actors are inside the car, we also wanted to make a realistic look for the interior shot.” An unlikely shot proved to be difficult. Reveals Lee, “Hae-joon tries to replicate what Seo-rae would have done to kill her husband, and he goes up the mountain, lies down and looks up. Then Seo-rae’s hand comes in and they hold hands together. Director Park told me that the audience should be able to see the callus on her palm because it’s evidence that she is already an expert climber. It was difficult to visually make that happen because when those two hands meet together the palm becomes a little bit dark, so we had to do several retakes. That one scene was the most challenging for me.”
By IAN FAILES
Cinesite’s Montreal and Vancouver facilities took on Paws of Fury: The Legend of Hank after the film had already spent several years in development. (Image courtesy of Paramount Pictures)
A common credit on a CG animated feature film or show is ‘visual effects supervisor.’ But wait, don’t VFX supervisors work just in live-action? This is, of course, not so. Indeed, on a CG-animated project, a visual effects supervisor is a crucial role, often helping to formulate the ‘look of picture’ as well as solve many of the technical and artistic hurdles along the way – not too dissimilar at all from a VFX supervisor working in live-action.
In this roundtable, visual effects supervisors from Walt Disney Animation Studios, Pixar, DreamWorks Animation, Sony Pictures Imageworks and Cinesite Studios explain their tasks on recent animated films and shows and share their thoughts on the key trends hitting their field right now.
Cinesite is one of a limited number of VFX studios that also deliver full CG-animated features and other animated projects. (Image courtesy of Paramount Pictures)
VFX supervisor in live-action versus animation
Alex Parkinson (Visual Effects Supervisor, Cinesite): Often the difference between VFX supervisors in live-action and animation depends on the kind of VFX show you are talking about. Sometimes entire sequences in movies are CG with no live-action aspects at all. In that case, the workflow and the job would be very similar. But mostly the differences between the two jobs reflect the differences between the two mediums. In animation, you tend to have more creative ownership over the final product and way more freedom. Live-action VFX is a more technical and precise process. It is harder in a lot of ways, because you must match existing elements and every shot is put through more scrutiny.
Marlon West (Visual Effects Supervisor, Walt Disney Animation Studios, on Iwájú): While the visual effects supervisor on a live-action film is tasked with leading the team to create images that they can’t go out and capture live, for animation every image is created from ‘scratch.’ So, they are tasked with leading the charge of creating every image technically and creatively.
“Multiple time zones were our main challenge on Iwájú. We have artists in Los Angeles, London, Lagos, Montreal and Vancouver. At one point we had artists in Uganda, Kenya and Zimbabwe as well. While not hugely technical, the biggest challenge initially was getting story, art and editorial teams, who have worked primarily with our internal tools, to work with outside partners.”
—Marlon West, Visual Effects Supervisor, Walt Disney Animation Studios
Of all the tech trends that abound in CG animation right now, Cinesite Visual Effects Supervisor Alex Parkinson believes that real-time game engines have the most potential to revolutionize the industry, particularly in relation to CG cinematography.
“Let’s take a typical shot, the villain reveal. The villain walks towards the camera through darkness and mist, their cape billowing in the wind. As they move into the light more of their form is revealed, and then at the last moment they lift their face towards the light,” Parkinson describes.
“In a traditional CG animation pipeline, this would be created in a serial manner,” Parkinson continues. “The camera would be created in layout using some very rough blocked animation. It would be animated without the cape, which would be added in CFX. FX would then do the mist interaction, then the whole thing would be passed to lighting to make it work. However, what if it doesn’t work? What if the timing is off or lighting cannot make the animation work for the face reveal? The shot goes all the way back down the pipeline for a re-do, then round and round until ultimately we run out of time and have to go with what we have.”
Parkinson believes real-time game engines will change this process. “We will be able to work much more like a live-action shoot does. We will be able to assemble all the pieces we have at any time, see them all in context, and work more in parallel, tweaking each element so they work together harmoniously. The potential for a quality increase in our filmmaking is huge.”
Kylie Kuioka voices Emiko in Paws of Fury: The Legend of Hank. (Image courtesy of Paramount Pictures)
Jane Yen (Visual Effects Supervisor, Pixar, on Lightyear): I see my role at Pixar as overseeing all of the technical work that needs to happen in computer graphics to produce the film visuals. Pixar has historically been at the very forefront of computer graphics and creating CG imagery, so a lot of the Pixar history and the roles that used to be called supervising technical director, and now VFX supervisor, were based on developing new technology to make it even possible.
Matt Baer (Visual Effects Supervisor, DreamWorks Animation, on The Bad Guys): At the creative leadership level, there are more peer relationships for the animation VFX supervisor. The head of story is my peer. The head of layout is my peer. The head of animation is my peer. For example, I’m responsible for making sure our head of animation is set up with the necessary workflows and technologies. Ultimately, the head of animation is creatively responsible for the character animation. I consult during animation development and shot work so our animators have context as to how their work fits into the bigger picture.
A layout frame from a hyperjump sequence in Pixar’s Lightyear. (Image courtesy of Disney/Pixar)
Animation pass on Buzz Lightyear. (Image courtesy of Disney/Pixar and Pixar)
R. Stirling Duguid (Visual Effects Supervisor, Sony Pictures Imageworks, on The Sea Beast): At Imageworks, compared to other animation companies that are vertically integrated, we have a client/vendor relationship. My primary role is to represent Imageworks to the client as well as the director, the production designer, the art director and their producer. That’s the first step, representing the company. Then it’s about building the team and the framework for the entire production – how we go from storyboards to final composite, laying that out and making sure that we have the right people in charge for each of those departments.
Lighting, FX and rendering are the final steps in creating the final frame. (Image courtesy of Disney/Pixar)
Solving the art and tech and pipeline, in animation
Matt Baer: On The Bad Guys, one of our key visual goals was to pay homage to illustration and 2D animation. Our design philosophy was to use simplification to achieve our stylized and sophisticated look. Anyone who has worked in CG knows this is the opposite of what many of our tools are designed to do! We needed to replace the realistic details of traditional CG techniques with the wonderful hand-drawn imperfections seen in illustrations. Taking this to scale on a feature film required us to build new workflows for every department, allowing them to create images that look handmade – removing superfluous CG details while keeping just enough visual information to guide the eye towards the most important aspects of the shot. Once the image was reduced, our artists added custom line work, textures and 2D effects to every shot in the film.
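Baer doesn’t detail the line-work tools, but one common way to derive hand-drawn-style ink lines from CG renders is to detect discontinuities in the depth and normal AOVs and composite them over the image. A bare-bones sketch under that assumption; the thresholds and function names are invented, not DreamWorks’ actual workflow:

```python
# Guess at the general technique: find edges in depth and normal render
# passes and use them as an ink-line mask for stylized compositing.
import numpy as np

def edge_mask(buffer, threshold):
    """True where neighboring pixels differ sharply (simple gradient test)."""
    gx = np.abs(np.diff(buffer, axis=1, prepend=buffer[:, :1]))
    gy = np.abs(np.diff(buffer, axis=0, prepend=buffer[:1, :]))
    return (gx + gy) > threshold

def ink_lines(depth, normals, depth_thresh=0.05, normal_thresh=0.3):
    """Combine depth and normal edges into a single line-work mask.

    depth:   (H, W) float depth pass.
    normals: (H, W, 3) float normal pass.
    """
    lines = edge_mask(depth, depth_thresh)
    for c in range(3):                  # silhouettes plus interior creases
        lines |= edge_mask(normals[..., c], normal_thresh)
    return lines
```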
A first look image from Walt Disney Animation Studios’ Iwájú series. (Image courtesy of Disney)
R. Stirling Duguid: For The Sea Beast, the big technical hurdle was ropes. We had thousands of them to do. Our Animation Supervisor, Joshua Beveridge, said, ‘We have to start from the ground up and build an awesome rope rig because we’re not going to make it through the movie without that.’ We came up with a procedural solution that was designed to be animation-friendly. The idea is that the length of the rope would always stay the same – usually it stretches or is cheated, but ours had the proper hang and everything. Ropes are in so many shots.
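Imageworks hasn’t published the rig’s internals, but a rope whose length “would always stay the same” is the textbook case for position-based dynamics: after each integration step, every segment is repeatedly projected back to its rest length. A bare-bones constraint solve, with gravity and integration omitted and all names assumed:

```python
# Sketch of a fixed-length rope constraint in the position-based-dynamics
# style. This is the generic technique, not Imageworks' actual rope rig.
import numpy as np

def solve_rope(points, rest_length, pinned, iterations=20):
    """Project rope segments back to rest length.

    points:      (N, 3) array of rope vertex positions (modified in place).
    rest_length: target length of each segment.
    pinned:      set of vertex indices that must not move (e.g. tied ends).
    """
    for _ in range(iterations):
        for i in range(len(points) - 1):
            delta = points[i + 1] - points[i]
            dist = np.linalg.norm(delta)
            if dist < 1e-9:
                continue
            correction = (dist - rest_length) * delta / dist
            if i in pinned and (i + 1) not in pinned:
                points[i + 1] -= correction        # move only the free end
            elif (i + 1) in pinned and i not in pinned:
                points[i] += correction
            elif i not in pinned and (i + 1) not in pinned:
                points[i] += 0.5 * correction      # share the correction
                points[i + 1] -= 0.5 * correction
    return points
```

With gravity applied before the solve, the constraint iterations are what give the rope a natural hang rather than a stretchy, cheated one.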
Jane Yen: Lightyear was Pixar’s largest FX film to date. Almost 70% of the film has FX elements in it. As the VFX Supervisor on an animated film, I had to look at every single component of the film, not just FX but also set building and modeling, set dressing, character modeling, building articulation, tailoring, cloth simulation, hair grooming – and that’s just the asset-building side. Then we have lighting and shading. I’m sure there’s some element in there I missed, but you can kind of get the picture that on an animated film, every single component of every visual thing that is on the screen, we had to account for.
A frame from Walt Disney Animation Studios’ Encanto, on which Marlon West served as Head of Effects Animation. (Image courtesy of Disney)
Among the many technical hurdles Sony Pictures Imageworks had to conquer in The Sea Beast was realizing the distinctive crease lines and wrinkles on several of the characters’ faces. Usually, such wrinkle-like features are modeled into the mesh or baked in as texture detail.
Looking to capitalize on earlier work done at Imageworks with inklines on Spider-Man: Into the Spider-Verse, Visual Effects Supervisor R. Stirling Duguid and his team developed a tool called CreaseLines that gave animators the ability to easily, dynamically create and control curves on faces to define the right emotive facial creases.
“Normally if you model things like that, it requires a high-density mesh, or you have to use displacement maps, which are hard for an animator to visualize,” Duguid explains. “Our Animation Supervisor, Joshua Beveridge, had this idea for crease lines, which came from Spider-Verse. It was about thinking in the animator’s shoes, dealing with the director, getting a note and finding the quickest way to address the note.
“CreaseLines gave us the flexibility to move those lines and not be constrained by the topology, as far as how dense the mesh was. This let us directly drive displacement of the face meshes. It was a real win.”
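CreaseLines itself is internal to Imageworks, so the following is only a guess at the core operation it implies: measure each vertex’s distance to an animator-drawn curve and push nearby vertices inward along their normals with a smooth falloff, independent of mesh density. All function names and constants are assumptions:

```python
# Speculative sketch of curve-driven crease displacement, inspired by the
# description above; not the actual CreaseLines implementation.
import numpy as np

def point_to_segment(p, a, b):
    """Distance from point p to the segment from a to b."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def crease_displacement(verts, normals, curve, depth=0.15, falloff=0.5):
    """Carve a crease by pushing vertices near the curve inward.

    verts:   (N, 3) vertex positions.
    normals: (N, 3) unit vertex normals.
    curve:   (M, 3) polyline points drawn by the animator.
    """
    out = verts.copy()
    for i, (v, n) in enumerate(zip(verts, normals)):
        d = min(point_to_segment(v, curve[j], curve[j + 1])
                for j in range(len(curve) - 1))
        weight = np.exp(-(d / falloff) ** 2)    # Gaussian falloff from the curve
        out[i] = v - n * depth * weight         # displace inward along the normal
    return out
```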
“While animation has always sought to create new and imaginative worlds, the last few years have seen a rapid increase in the variety of visual styles produced across the industry. Despite this increase, we’ve only barely cracked open the visual possibilities in animation, which really excites me for the future.”
—Matt Baer, Visual Effects Supervisor, DreamWorks Animation
Marlon West: Multiple time zones were our main challenge on Iwájú. We have artists in Los Angeles, London, Lagos, Montreal and Vancouver. At one point, we had artists in Uganda, Kenya and Zimbabwe as well. While not hugely technical, the biggest challenge initially was getting story, art and editorial teams, who have worked primarily with our internal tools, to work with outside partners.
Alex Parkinson: Being an independent studio, we must find ways to match the quality of big studio movies as closely as we can, and that is a bar that is constantly moving, so our tools and workflow must constantly evolve to keep up. Often our problem is stylization. For example, three of our recent movies, Riverdance: The Animated Feature, Paws of Fury: The Legend of Hank and Hitpig, have featured sequences with fast-flowing water. Because we use tools developed primarily for live-action FX, they tend to produce photorealistic results. They don’t fit in our world, so we must find ways to make natural elements feel more ‘cartoony.’
Mirabel, voiced by Stephanie Beatriz, in Encanto. The fireworks were crafted as effects animation. (Image courtesy of Disney)
Stylization: a major trend in animation
Alex Parkinson: The use of non-photorealistic rendering, or NPR, exploded after Spider-Man: Into the Spider-Verse. That showed the potential for what a CG-animated movie could be. I see it as part of the maturing of our industry. If you think about 2D animation and all the looks and styles that it covers, from an episode of The Simpsons to crazy anime action, to the work of Cartoon Saloon, it is so varied and creative. CG animation is a very young art form, and up until now has tended to stay within the styles it was born from, like Toy Story, Shrek, etc. That is to say, more photoreal. 3D animation is branching out, experimenting, and developing all new NPR techniques – that’s very exciting.
R. Stirling Duguid: Spider-Man: Into the Spider-Verse was a pretty big splash for Imageworks as far as doing a whole movie like that. Then look at the evolution into The Mitchells vs. the Machines, where we also took it to a really nice place. And, actually, it was quite different. You can clearly see the difference between Mitchells and Spider-Verse, but it was using a lot of the same technology.
A storyboard frame from a driving sequence in DreamWorks Animation’s The Bad Guys. (Image courtesy of Universal Pictures and DreamWorks Animation)
After storyboard, the studio moves to previs and early animation. (Image courtesy of Universal Pictures and DreamWorks Animation)
The final rendered frame. On The Bad Guys, DreamWorks Animation injected an enhanced level of stylized look and feel to the final images, partly to reflect the source material. (Image courtesy of Universal Pictures and DreamWorks Animation)
Jane Yen: I think we are now going to see movies come out that have a much more stylized and ‘pushed’ look. I think you’ll see that in our next feature film, Elemental. Even though Lightyear may not have pushed that spectrum, specifically, I think the industry is leaning that way, and I’m excited to see what comes out of it.
“Being an independent studio, we must find ways to match the quality of big studio movies as closely as we can, and that is a bar that is constantly moving, so our tools and workflow must constantly evolve to keep up. Often our problem is stylization. For example, three of our recent movies, Riverdance: The Animated Feature, Paws of Fury: The Legend of Hank and Hitpig have featured sequences with fast-flowing water. Because we use tools developed primarily for live-action FX, they tend to produce photorealistic results. They don’t fit in our world, so we must find ways to make natural elements feel more ‘cartoony.’”
—Alex Parkinson, Visual Effects Supervisor, Cinesite Studios
For The Sea Beast, Imageworks developed a new animation rig to deal with the many different kinds of ropes seen in the film, allowing them to stretch and hang and be animated as realistically as possible. (Image courtesy of Netflix and Sony Pictures Imageworks)
A face-crease-lines tool helped the studio dynamically create and control curves on faces to define the right emotive facial creases. (Image courtesy of Netflix and Sony Pictures Imageworks)
Marlon West: Stylization, supporting production design and character animation, has been very important for us at Walt Disney Animation Studios. We have endeavored to share 2D sensibilities with team members who started their careers creating images in CG. While classic animation principles such as overlap, staging and anticipation are important to character animators, they are just as valued by Walt Disney effects animators.
A final frame from The Sea Beast. (Image courtesy of Netflix and Sony Pictures Imageworks)
Matt Baer: While animation has always sought to create new and imaginative worlds, the last few years have seen a rapid increase in the variety of visual styles produced across the industry. Despite this increase, we’ve only barely cracked open the visual possibilities in animation, which really excites me for the future.