By IAN FAILES
Several sequences featuring the Giganotosaurus in Jurassic World Dominion made use of an animatronic head section on set. (Image courtesy of Universal Pictures and ILM)
For final shots, ILM would often retain the entire animatronic head of the dinosaur and add in the rest of the dinosaur body. (Image courtesy of Universal Pictures and ILM)
How do you put yourself into the shoes, or feet, of a Giganotosaurus? What about an advanced chimpanzee or a bipedal hippo god? And how do you tackle a curious baby tree-like humanoid? These are all computer-generated characters with very different personalities featured in films and shows released in 2022, and ones that needed to be brought to life in part by teams of animators. Here, animation heads leading the charge at ILM, Wētā FX, Framestore and Luma Pictures share how their particular creature was crafted and what they had to do to find the essence of that character.
When your antagonist is a Giganotosaurus
When Jurassic World Dominion Animation Supervisor Jance Rubinchik was discussing with director Colin Trevorrow how the film’s dinosaurs would be brought to the screen, he reflected on how the animals in the first Jurassic Park “weren’t villains, they were just animals. For example, the T-Rex is just curious about the jeep, and he’s flipping it over, stepping on it and biting pieces off of it. He’s not trying to kill the kids. I said to Colin, ‘Can we go back to our main dinosaur – the Giganotosaurus – just being an animal?’ Let’s explore the Giga being an animal and not just being a monster for monster’s sake. It was more naturalistic.”
With Trevorrow’s approval for this approach, Rubinchik began work on the film embedded with the previs and postvis teams at Proof Inc., while character designs also continued. This work fed into both the animatronic builds by John Nolan Studio and Industrial Light & Magic’s CG Giganotosaurus. “Early on, we did lots of walk cycles, run cycles and behavior tests. I personally did tests where the Giga wandered out from in between some trees and was shaking its head and snorting and looking around.”
Another aspect of the Giganotosaurus was that ILM would often be adding to a practical/animatronic head section with the remainder of the dinosaur in CG. For Rubinchik, it meant that the overall Giga performance was also heavily influenced by what could be achieved on set. Comments Rubinchik, “What I didn’t want to have were these practical dinosaurs that tend to be a little slower moving and restricted, simply from the fact they are massive hydraulic machines, that then intercut with fast and agile CG dinosaurs. It really screams, ‘This is what is CG and this is what’s practical.’
A full-motion CG Giganotosaurus crafted by ILM gives chase. (Image courtesy of Universal Pictures and ILM)
Animation Supervisor Jance Rubinchik had a hand in ensuring that the movement of animatronic dinosaurs made by John Nolan Studio, as shown here, was matched by their CG counterparts. (Image courtesy of Universal Pictures and ILM)
“Indeed,” Rubinchik adds, “sometimes as animators, you have all these controls and you want to use every single control that you have. You want to get as much overlap and jiggle and bounce and follow through as you can because we’re animators and that’s the fun of animating. But having something that introduced restraint for us, which was the practical on-set dinosaurs, meant we were more careful and subtler in our CG animation. There’s also a lot of fun and unexpected things that happen with the actual animatronics. It might get some shakes or twitches, and that stuff was great. We really added that stuff into Giga wherever we could.”
For Pogo shots in Season 3 of The Umbrella Academy, on-set plates featured actor Ken Hall. (Image courtesy of Netflix and Wētā FX)
Voice performance for Pogo was provided by Adam Godley (right), while Wētā FX animators also contributed additional performance capture. (Image courtesy of Netflix and Wētā FX)
The final Pogo shot. (Image courtesy of Netflix and Wētā FX)
“What I didn’t want to have were these practical dinosaurs that tend to be a little slower moving and restricted, simply from the fact they are massive hydraulic machines, that then intercut with fast and agile CG dinosaurs. It really screams, ‘This is what is CG and this is what’s practical.’ … But having something that introduced some restraint for us, which was the practical on-set dinosaurs, meant we were more careful and subtler in our CG animation. There’s also a lot of fun and unexpected things that happen with the actual animatronics. It might get some shakes or twitches, and that stuff was great. We really added that stuff into Giga wherever we could.”
—Jance Rubinchik, Animation Supervisor, Jurassic World Dominion
This extended even to the point of replicating the animatronic joint placements from the John Nolan Studio creatures into ILM’s CG versions. “All of the pivots for the neck, the head, the torso and the jaw were in the exact same place in the CG puppet as they were in the animatronic,” Rubinchik outlines. “It meant they would pivot from the same place. I was so happy with how that sequence turned out with all the unexpected little tics and movements that informed what we did.”
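Pivot-matching of this kind is a simple scripting exercise in most animation packages. As a minimal, hypothetical sketch (not ILM’s actual pipeline), surveyed animatronic pivot positions could be applied to a CG rig’s controls in Maya like this; the control names and coordinates are invented for illustration:

from maya import cmds

# World-space pivot positions surveyed from the animatronic puppet
# (illustrative values, in centimeters).
ANIMATRONIC_PIVOTS = {
    "neck_ctrl":  (0.0, 412.0, -38.5),
    "head_ctrl":  (0.0, 468.0, 22.0),
    "jaw_ctrl":   (0.0, 455.0, 61.0),
    "torso_ctrl": (0.0, 310.0, -120.0),
}

def match_pivots(pivots):
    """Move each CG control's pivots to the surveyed world-space position."""
    for ctrl, pos in pivots.items():
        if not cmds.objExists(ctrl):
            cmds.warning("Missing control: %s" % ctrl)
            continue
        # Set both rotate and scale pivots in world space so the CG puppet
        # hinges from the same points as the hydraulic machine.
        cmds.xform(ctrl, worldSpace=True, rotatePivot=pos)
        cmds.xform(ctrl, worldSpace=True, scalePivot=pos)

match_pivots(ANIMATRONIC_PIVOTS)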
Pogo reimagined
The advanced chimpanzee Pogo is a CG character viewers first met in Seasons 1 and 2 of Netflix’s The Umbrella Academy, both as an assistant to Sir Reginald Hargreeves and as a baby chimp. The most recent Season 3 of the show sees Pogo appear in an alternative timeline as a ‘cooler’ version of the character who even becomes a biker and tattoo artist. Wētā FX created each incarnation of Pogo, drawing upon the voice of Adam Godley and the on-set performances of Ken Hall and other stunt performers and stand-ins to make the final creature.
Having ‘lived’ with Pogo in his older, more frail form in the past seasons, Wētā FX Animation Supervisor Aidan Martin and his team now had the chance to work on a character who was capable of a lot more physically, including Kung Fu. “All of a sudden, Pogo’s been in the gym. He’s juiced up. He’s doubled his shoulder muscle mass and his arms are a lot bigger, so the way that he carries himself is completely different. His attitude has changed, too. He’s more gnarled and he’s a lot more jaded about the world,” Martin says.
From an animation point of view, Wētā FX animators took that new physicality into the performance and reflected it in postures and movements. “It was even things like the way he looks at somebody now,” Martin explains. “Early on in Season 1, when he looks at people, he’s very sincere. He was like a loving grandfather. Now, he’s a bit fed up with it all and he’s not looking at you with good intentions. He thinks you’re an idiot and he doesn’t have time for it. That’s where he’s coming from behind the mask.”
Pogo is a grittier character in this latest season, even working as a tattoo artist. (Image courtesy of Netflix and Wētā FX)
“All of a sudden, Pogo’s been in the gym. He’s juiced up. He’s doubled his shoulder muscle mass and his arms are a lot bigger, so the way that he carries himself is completely different. His attitude has changed, too. He’s more gnarled and he’s a lot more jaded about the world.”
—Aidan Martin, Animation Supervisor, Wētā FX
One of the VFX studio’s toughest tasks on this new Pogo remained the character’s eyes. “Eyeline is everything, especially with chimps,” says Martin, who also had experience on the Planet of the Apes films at Wētā FX. “When you’re trying to do a more anthropomorphized performance, chimps with their eyelines and brows do not work very well compared to humans because their eyes are just so far back and their brows sit out so far. For example, as soon as you have the head tilt down and then try to make them look up, you can lose their eyes completely. Balancing the eyeline and the head angle is really difficult, especially on chimps.”
“Even once you’ve got that working, getting the mouth shapes to read properly is also tricky,” Martin continues. “There are some really tricky shapes, like a ‘V’ and an ‘F,’ that are incredibly hard on a chimp versus a human. Their mouths are almost twice as wide as our mouths. Humans look really good when they’re talking softly, but getting a chimp to do that, it looks like they’re either just mumbling or they get the coconut mouth, like two halves clacking together, and everything’s just too big. We used traditional animation techniques here, basically a sheet of phoneme expressions for Pogo’s mouth.”
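A phoneme sheet like the one Martin describes is conceptually just a lookup from dialogue phonemes to predefined mouth poses. Here is a minimal sketch of the idea; the shape names and weights are invented for illustration rather than taken from Wētā FX’s rig:

# Phoneme-to-mouth-shape table (the "sheet"); weights are blendshape values.
PHONEME_SHAPES = {
    "AA": {"jawOpen": 0.8, "lipsWide": 0.3},
    "F":  {"lowerLipUnderTeeth": 0.9, "jawOpen": 0.1},
    "V":  {"lowerLipUnderTeeth": 0.7, "jawOpen": 0.15},
    "M":  {"lipsPressed": 1.0},
}

def keys_for_dialogue(phoneme_track):
    """Turn (frame, phoneme) pairs into (frame, shape, weight) keyframes."""
    keys = []
    for frame, phoneme in phoneme_track:
        for shape, weight in PHONEME_SHAPES.get(phoneme, {}).items():
            keys.append((frame, shape, weight))
    return keys

# Example: keyframes for a short "F...AA...M" mouth movement.
print(keys_for_dialogue([(101, "F"), (104, "AA"), (109, "M")]))

On a chimp, the payoff of such a table is that the tricky shapes Martin names, like ‘F’ and ‘V’, are authored once by hand and reused consistently, rather than re-solved shot by shot.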
Going hyper (or hippo) realistic
Finding the performance for a CG-animated character often happens very early on in a production, even before any live action is shot. In the case of the Marvel Studios series Moon Knight’s slightly awkward hippo god Taweret, it began when Framestore was tasked with translating the casting audition of voice and on-set performer Antonia Salib into a piece of test animation.
Actor Antonia Salib performs the role of hippo god Taweret on a bluescreen set. (Image courtesy of Marvel and Framestore)
Final shot of Taweret by Framestore. (Image courtesy of Marvel and Framestore)
“The Production Visual Effects Supervisor, Sean Andrew Faden, asked us to put something together as if it was Taweret auditioning for the role,” relates Framestore Animation Supervisor Chris Hurtt. “We made this classic blooper-like demo where we cut it up and had the beeps and even a set with a boom mic. We would match to Antonia’s performance with keyframe animation just to find the right tone. We would later have to go from her height to an eight- or nine-foot-tall hippo, which changed things, but it was a great start.”
Salib wore an extender stick during the shoot (here with Oscar Isaac) to reach the appropriate height of Taweret. (Image courtesy of Marvel and Framestore)
Framestore had to solve both a hippo look in bipedal form and the realistic motion of hair and costume for the final character. (Image courtesy of Marvel and Framestore)
“We looked at a lot of reference of real hippos and asked ourselves, ‘What can we take from the face so that this doesn’t just feel like it’s only moving like a human face that’s in a hippo shape?’ We found there were these large fat sacks in the corners that we could move, and it made everything feel a little more hippo-y and not so human-y. Probably the biggest challenge on her was getting Taweret to go from a human to a hippo.”
—Chris Hurtt, Animation Supervisor, Framestore
During filming of the actual episode scenes, Salib would perform Taweret in costume with the other actors and with an extender stick and ball markers to represent the real height of the character. As Hurtt describes, Framestore took that as reference and looked to find the right kind of ‘hippoisms’ on Salib. “We looked at a lot of reference of real hippos and asked ourselves, ‘What can we take from the face so that this doesn’t just feel like it’s only moving like a human face that’s in a hippo shape?’ We found there were these large fat sacks in the corners that we could move, and it made everything feel a little more hippo-y and not so human-y.”
“Probably the biggest challenge on her was getting Taweret to go from a human to a hippo,” adds Hurtt, who also praises the Framestore modeling, rigging and texturing teams in building the character. “The main thing for animation was that we had to observe what the muscles and the FACS shapes were doing on Antonia, and then map those to the character. Still, you’re trying to hit key expressions without it looking too cartoony.”
To help realize the motion of Taweret’s face shapes in the most believable manner possible, Framestore’s animators relied on an in-house machine learning tool. “The tool does a dynamic simulation like you would with, say, hair, but instead it would drive those face shapes,” Hurtt notes. “It’s not actually super-noticeable, but it’s one of those things where, if you didn’t have it there, particularly with such a huge character, she would’ve felt very much like papier-mâché when she turned her head.”
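Framestore’s tool itself is proprietary, but the underlying idea of treating face-shape weights as dynamic values can be sketched with a generic damped spring. Everything below is an assumption-laden toy, not the studio’s implementation:

def simulate_shape_weights(targets, dt=1.0 / 24.0, stiffness=120.0, damping=14.0):
    """Filter per-frame target weights through a damped spring so each
    shape eases in and carries a little overshoot (secondary motion)."""
    value, velocity = targets[0], 0.0
    out = []
    for target in targets:
        # Spring pulls the simulated weight toward the animated target.
        accel = stiffness * (target - value) - damping * velocity
        velocity += accel * dt  # semi-implicit Euler step
        value += velocity * dt
        out.append(value)
    return out

# A shape keyed from 0 to 1 now settles over several frames instead of popping.
print(simulate_shape_weights([0.0] * 5 + [1.0] * 20))

The effect is exactly what Hurtt describes: nearly invisible frame to frame, but it keeps a massive character’s face from reading as rigid when the head moves.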
The enduring, endearing allure of Groot
The Marvel Studios Guardians of the Galaxy films have produced several CG-animated characters, one of the most beloved being Baby Groot. He now stars in his own series of animated shorts called I Am Groot, directed by Kirsten Lepore, with visual effects and animation by Luma Pictures. The fully CG shorts started with a script and boarding process driven by Lepore, according to Luma Pictures Animation Director Raphael Pimentel.
Luma Pictures Animation Director Raphael Pimentel donned an Xsens suit (and Baby Groot mask) for motion capture reference at Luma Pictures during the making of I Am Groot. (Image courtesy of Luma Pictures)
The behavior settled on for Baby Groot, which had been featured in previous Marvel projects, was always ‘endearing.’ (Image courtesy of Marvel and Luma Pictures)
“There were scripts early on showing what the stories were going to be about. These quickly transitioned into boards. Then, Kirsten would provide the boards to us with sound. She would put them to music as well, which was important to get the vibe. These would then be turned over to us as an animatic of those boards with the timing and sound that Kirsten envisioned, which was pretty spot-on to the final result.”
Baby Groot’s mud bath in one of the shorts required extensive cooperation between the animation and FX teams at Luma Pictures. (Image courtesy of Marvel and Luma Pictures)
Baby Groot still delivers only one line: “I am Groot.” (Image courtesy of Marvel and Luma Pictures)
In terms of finding the ideal style of character animation for Groot in the shorts, Luma Pictures shot motion capture as reference for its animators, which was used in conjunction with video reference that Lepore also provided, and vid-ref shot by the animators themselves. The motion capture mainly took the form of Pimentel performing in an Xsens suit. “We went to Luma and identified the key shots that we wanted to do for every episode,” Pimentel recalls. “We would do one episode each day. As we were going through those key shots, we ended up shooting mocap for everything. Kirsten was there telling me the emotions that she wanted Groot to be feeling at that specific point in time. And we said, ‘Let’s keep going, let’s keep going.’ Next thing you know, we actually shot mocap for everything to provide reference for the animators.”
In one of the shorts, “Groot Takes a Bath,” a mud bath results in the growth of many leaves on the character, which he soon finds ways to groom in different styles. This necessitated a close collaboration between animation and effects at Luma. “That was a technical challenge for us,” Pimentel discusses. “In order for Kirsten to see how the leaves were behaving, she would usually have to wait until the effects pass. We built an animation rig that was very robust that would get the look as close to final as possible through animation.”
The final behavior settled on for Baby Groot in the shorts was “endearing,” Pimentel notes. Despite Groot’s temper tantrums and ups and downs, he was still kept sweet at all times. From an animation standpoint, that meant ensuring the character’s body and facial performances stayed within limits. “It’s easy to start dialing in the brows to be angry, but we had to keep the brows soft at all times. And then his eyes are always wide to the world. Regardless of what’s happening to him, his eyes are always wide to the world, much like a kid is.”
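One simple way to enforce that kind of performance limit is to clamp facial control values before they reach the rig. The sketch below is hypothetical; the control names and ranges are invented, not Luma’s actual setup:

# Per-control limits that keep the performance "endearing."
CONTROL_LIMITS = {
    "browAngry": (0.0, 0.25),  # brows stay soft, never fully angry
    "eyeWide":   (0.6, 1.0),   # eyes always wide to the world
}

def clamp_controls(values, limits=CONTROL_LIMITS):
    """Clamp each animated control value into its allowed range."""
    out = {}
    for name, value in values.items():
        lo, hi = limits.get(name, (0.0, 1.0))
        out[name] = min(max(value, lo), hi)
    return out

# An "angry" pose comes back softened: brow capped at 0.25, eyes kept wide.
print(clamp_controls({"browAngry": 0.9, "eyeWide": 0.2}))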
By TREVOR HOGG
Images courtesy of MUBI.
Partial set build, background plate photography, 3D model of mountaintop and depth pass are combined to create an aerial shot.
“The mountain [where Seo-rae’s husband falls to his death] is 100% CG, but the background where that peak is situated is a real scene that we shot. There are a ton of mountains in Korea, so it’s a composite of these two. The two main locations of the mountains and sea have this unique form that goes up and down and up and down. We wanted to repeatedly show such up and down patterns, like a wave in an ocean or the landscape of the mountain range, but at the same time we didn’t want to make it too obvious for the audience to say, ‘Ah-ha! I see that.’”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Ever since the release of Oldboy, Lee Joen-hyoung, who serves as the CEO and VFX Supervisor at Korean VFX studio 4th Creative Party, has been collaborating with filmmaker Park Chan-wook. Decision to Leave, which revolves around a detective becoming infatuated with a murder suspect, seems to be a less likely candidate for extensive digital augmentation because of the subject matter, but this was not the case. “This was the easiest read of all of director Park’s screenplays,” Lee recalls. “However, in the end, the work that I had to do was the toughest because unlike The Handmaiden or Oldboy, for which I was able to come up with the imagination straightaway in terms of the visuals and mise-en-scène, Decision to Leave was so ambiguous.” About 580 shots were created over a period of six months. “We were done at the end of 2021, but then we had some time left before Cannes and the actual release, so we did some detailing work with only a handful of people to make it even more perfect,” Lee remarks.
Invisible effects include adding photographs to the wall devoted to unsolved crimes, created by Hae-joon.
Unwanted natural elements had to be removed from the finale, which took place in a beach environment that was, in reality, three different locations combined. “Jang Hae-joon’s portion of it was shot at the end of fall, entering into the winter season, so we started to have some snow,” Lee states. “For the sake of continuity, we had to take out snow and also had to work on the mountain that you can see from far. Even though we had to remove the snow and wind, Tang Wei, the actor, still felt those harsh conditions, which reflected the emotional state of Hae-joon. With Son Seo-rae’s portion, there was no problem because the time of day was different.” Atmospherics were also digitally added into shots. “We had to have mist in the latter part of the film because it’s set against Ipo, which is famous for mist and being humid all of the time,” Lee adds. “Mist had to be present in almost all of the outdoor scenes, but we had to define how much for a particular scene.”
“Director Park likes to use insects in his movies, such as the ants in Oldboy or the mosquitoes in Lady Vengeance or the ladybug in I’m a Cyborg, But That’s OK. I knew even before that he was going to put some kind of insect in this film, too. We already have a vast library filled with insects and their forms and movements, so we were well-equipped to execute that.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Insects are always featured in the films of Park Chan-wook, with CG ants crawling over the face of Seo-rae’s dead husband.
“Hae-joon tries to replicate what Seo-rae would have done to kill her husband, and he goes up the mountain, lies down and looks up. Then Seo-rae’s hand comes in and they hold hands together. Director Park told me that the audience should be able to see the callus on her palm because it’s evidence that she is already an expert climber. It was difficult to visually make that happen because when those two hands meet together the palm becomes a little bit dark, so we had to do several retakes. That one scene was the most challenging for me.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
The mountain where Seo-rae’s husband falls to his death was partially built on a backlot set surrounded by bluescreen. “The mountain is 100% CG, but the background where that peak is situated is a real scene that we shot,” Lee reveals. “There are a ton of mountains in Korea, so it’s a composite of these two. The two main locations of the mountains and sea have this unique form that goes up and down and up and down. We wanted to repeatedly show such up and down patterns, like a wave in an ocean or the landscape of the mountain range, but at the same time we didn’t want to make it too obvious for the audience to say, ‘Ah-ha! I see that.’” Ants crawl over the face of the deceased spouse. “Director Park likes to use insects in his movies, such as the ants in Oldboy or the mosquitoes in Lady Vengeance or the ladybug in I’m a Cyborg, But That’s OK. I knew even before that he was going to put some kind of insect in this film, too. We already have a vast library filled with insects and their forms and movements, so we were well-equipped to execute that,” Lee notes.
The x-ray of an arm and hand transitions into the arm and hand of Hae-joon, emphasizing that he is still thinking of Seo-rae even when having an intimate moment with his wife.
A clever shot transition moves from the x-ray of a hand to the one belonging to Hae-joon as he is having sex with his wife in bed. “I have already accumulated so much experience with Park Chan-wook-esque transitions!” Lee laughs. “We knew how the output should look because it was worked out in the storyboarding phase and we subsequently shot the source material. That transition was a nod to what we had already done in I’m a Cyborg, But That’s OK. As a long-time collaborator, I already knew what color director Park likes for the x-ray and the timing for the movement of the hand. That transition was the symbol of how Hae-joon is really with Seo-rae even though he is physically next to his wife.” The growing emotional bond between the detective and the murder suspect is visually emphasized in the interrogation scenes. “We wanted the audience to see something happen that is not physically possible,” Lee describes. “For that we had four characters because there were two in front and two in the reflection of [the mirror]. We shot the real people, then the reflection pass, and composited these two together so that we were able to control the focusing and defocusing in order to fully realize director Park’s intention and vision for the scene.”
“That transition [shot of the x-ray of a hand to the one belonging to Hae-joon as he is having sex with his wife in bed] was a nod to what we had already done in I’m a Cyborg, But That’s OK. As a long-time collaborator, I already knew what color director Park likes for the x-ray and the timing for the movement of the hand. That transition was the symbol of how Hae-joon is really with Seo-rae even though he is physically next to his wife.”
—Lee Joen-hyoung, CEO & VFX Supervisor, 4th Creative Party
Bunam Beach in Samcheok, Hakampo Beach and Magumpo Beach in Taean were combined to create the environment that appears in the finale.
Reflections and monitors were manipulated during the interrogation scenes to visually show that Seo-rae and Hae-joon are becoming emotionally closer to each other.
Driving shots are commonplace for Korean television series and films. “For our film, we wanted to make sure that the windshield of the car and the reflections on the car and how the lights will change inside of the space would be recognizable to the audience,” Lee remarks. “We had to make sure that the lighting and reflections worked perfectly; that was our full intention. Since our actors are inside the car, we also wanted to make a realistic look for the interior shot.” An unlikely shot proved to be difficult. Reveals Lee, “Hae-joon tries to replicate what Seo-rae would have done to kill her husband, and he goes up the mountain, lies down and looks up. Then Seo-rae’s hand comes in and they hold hands together. Director Park told me that the audience should be able to see the callus on her palm because it’s evidence that she is already an expert climber. It was difficult to visually make that happen because when those two hands meet together the palm becomes a little bit dark, so we had to do several retakes. That one scene was the most challenging for me.”
By IAN FAILES
Cinesite’s Montreal and Vancouver facilities took on Paws of Fury: The Legend of Hank after the film had already spent several years in development. (Image courtesy of Paramount Pictures)
A common credit on a CG animated feature film or show is ‘visual effects supervisor.’ But wait, don’t VFX supervisors work just in live-action? This is, of course, not so. Indeed, on a CG-animated project, a visual effects supervisor is a crucial role, often helping to formulate the ‘look of picture’ as well as solve many of the technical and artistic hurdles along the way – not so dissimilar from a VFX supervisor working in live-action.
In this roundtable, visual effects supervisors from Walt Disney Animation Studios, Pixar, DreamWorks Animation, Sony Pictures Imageworks and Cinesite Studios explain their tasks on recent animated films and shows and share their thoughts on the key trends hitting their field right now.
Cinesite is one of a limited number of VFX studios that also deliver full CG-animated features and other animated projects. (Image courtesy of Paramount Pictures)
VFX supervisor in live-action versus animation
Alex Parkinson (Visual Effects Supervisor, Cinesite): Often the difference between VFX supervisors in live-action and animation depends on the kind of VFX show you are talking about. Sometimes entire sequences in movies are CG with no live-action aspects at all. In that case, the workflow and the job would be very similar. But mostly the differences between the two jobs reflect the differences between the two mediums. In animation, you tend to have more creative ownership over the final product and way more freedom. Live-action VFX is a more technical and precise process. It is harder in a lot of ways, because you must match existing elements and every shot is put through more scrutiny.
Marlon West (Visual Effects Supervisor, Walt Disney Animation Studios, on Iwájú): While the visual effects supervisor on a live-action film is tasked with leading the team to create images that they can’t go out and capture live, for animation every image is created from ‘scratch.’ So, they are tasked with leading the charge of creating every image technically and creatively.
“Multiple time zones were our main challenge on Iwájú. We have artists in Los Angeles, London, Lagos, Montreal and Vancouver. At one point we had artists in Uganda, Kenya and Zimbabwe as well. While not hugely technical, the biggest challenge was initially getting story, art and editorial teams, who have worked primarily with our internal tools, to work with outside partners.”
—Marlon West, Visual Effects Supervisor, Walt Disney Animation Studios
Of all the tech trends that abound in CG animation right now, Cinesite Visual Effects Supervisor Alex Parkinson believes that real-time game engines have the most potential to revolutionize the industry, particularly in relation to CG cinematography.
“Let’s take a typical shot, the villain reveal. The villain walks towards the camera through darkness and mist, their cape billowing in the wind. As they move into the light more of their form is revealed, and then at the last moment they lift their face towards the light,” Parkinson describes.
“In a traditional CG animation pipeline, this would be created in a serial manner,” Parkinson continues. “The camera would be created in layout using some very rough blocked animation. It would be animated without the cape, which would be added in CFX. FX would then do the mist interaction, then the whole thing would be passed to lighting to make it work. However, what if it doesn’t work? What if the timing is off or lighting cannot make the animation work for the face reveal? The shot goes all the way back down the pipeline for a re-do, then round and round until ultimately we run out of time and have to go with what we have.”
Parkinson believes real-time game engines will change this process. “We will be able to work much more like a live-action shoot does. We will be able to assemble all the pieces we have at any time, see them all in context, and work more in parallel, tweaking each element so they work together harmoniously. The potential for a quality increase in our filmmaking is huge.”
Kylie Kuioka voices Emiko in Paws of Fury: The Legend of Hank. (Image courtesy of Paramount Pictures)
Jane Yen (Visual Effects Supervisor, Pixar, on Lightyear): I see my role at Pixar as overseeing all of the technical work that needs to happen in computer graphics to produce the film visuals. Pixar has historically been at the very forefront of computer graphics and creating CG imagery, so a lot of the Pixar history and the roles that used to be called supervising technical director, and now VFX supervisor, were based on developing new technology to make it even possible.
Matt Baer (Visual Effects Supervisor, DreamWorks Animation, on The Bad Guys): At the creative leadership level, there are more peer relationships for the animation VFX supervisor. The head of story is my peer. The head of layout is my peer. The head of animation is my peer. For example, I’m responsible for making sure our head of animation is set up with the necessary workflows and technologies. Ultimately, the head of animation is creatively responsible for the character animation. I consult during animation development and shot work so our animators have context as to how their work fits into the bigger picture.
A layout frame from a hyperjump sequence in Pixar’s Lightyear. (Image courtesy of Disney/Pixar)
Animation pass on Buzz Lightyear. (Image courtesy of Disney/Pixar)
R. Stirling Duguid (Visual Effects Supervisor, Sony Pictures Imageworks, on The Sea Beast): At Imageworks, compared to other animation companies that are vertically integrated, we have a client/vendor relationship. My primary role is to represent Imageworks to the client as well as the director, the production designer, the art director and their producer. That’s the first step, representing the company. Then it’s about building the team and the framework for the entire production – how we go from storyboards to final composite, laying that out and making sure that we have the right people in charge for each of those departments.
Lighting, FX and rendering are the final steps in creating the final frame. (Image courtesy of Disney/Pixar)
Solving the art and tech and pipeline, in animation
Matt Baer: On The Bad Guys, one of our key visual goals was to pay homage to illustration and 2D animation. Our design philosophy was to use simplification to achieve our stylized and sophisticated look. Anyone who has worked in CG knows this is the opposite of what many of our tools are designed to do! We needed to replace the realistic details of traditional CG techniques with the wonderful hand-drawn imperfections seen in illustrations. Taking this to scale on a feature film required us to build new workflows for every department, allowing them to create images that look handmade – removing superfluous CG details while keeping just enough visual information to guide the eye towards the most important aspects of the shot. Once the image was reduced, our artists added custom line work, textures and 2D effects to every shot in the film.
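The ‘reduce, then re-add line work’ idea can be illustrated with a toy image-processing pass: derive lines from a render’s luminance gradients, then composite them back over the simplified image. This is a generic sketch, not DreamWorks’ toolset:

import numpy as np

def line_work(luma, threshold=0.12):
    """Build a 0/1 line mask from luminance edges via finite differences."""
    gx = np.zeros_like(luma)
    gy = np.zeros_like(luma)
    gx[:, 1:] = np.diff(luma, axis=1)
    gy[1:, :] = np.diff(luma, axis=0)
    return (np.hypot(gx, gy) > threshold).astype(luma.dtype)

def ink_over(rgb, lines, ink=0.0):
    """Darken the simplified render wherever the line mask is on."""
    mask = lines[..., None]  # broadcast (H, W) mask over RGB channels
    return rgb * (1.0 - mask) + ink * mask

In production the line work Baer describes was drawn and art-directed by artists per shot; an automated edge pass like this only shows why the simplified base image matters, since clean regions keep the derived lines from turning into noise.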
A first look image from Walt Disney Animation Studios’ Iwájú series. (Image courtesy of Disney)
R. Stirling Duguid: For The Sea Beast, the big technical hurdle was ropes. We had thousands of them to do. Our Animation Supervisor, Joshua Beveridge, said, ‘We have to start from the ground up and build an awesome rope rig because we’re not going to make it through the movie without that.’ We came up with a procedural solution that was designed to be animation-friendly. The idea is that the length of the rope would always stay the same – usually it stretches or is cheated, but ours had the proper hang and everything. Ropes are in so many shots.
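Constant-length ropes are commonly built as chains of distance constraints that are relaxed iteratively. The sketch below shows that general technique (position-based dynamics) rather than Imageworks’ actual animation rig; the function and parameter names are invented:

def solve_rope(points, segment_length, pinned=(0,), iterations=10):
    """Relax [x, y, z] points so neighbors sit segment_length apart.
    Indices in `pinned` (say, a hand or a cleat) do not move."""
    for _ in range(iterations):
        for i in range(len(points) - 1):
            a, b = points[i], points[i + 1]
            delta = [b[k] - a[k] for k in range(3)]
            dist = max(1e-9, sum(d * d for d in delta) ** 0.5)
            # Signed fraction by which this segment is off its rest length.
            corr = 0.5 * (dist - segment_length) / dist
            for k in range(3):
                if i not in pinned:
                    a[k] += delta[k] * corr
                if (i + 1) not in pinned:
                    b[k] -= delta[k] * corr
    return points

Because each correction restores segment length rather than scaling it, the rope hangs naturally instead of stretching, which matches the behavior Duguid describes.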
Jane Yen: Lightyear was Pixar’s largest FX film to date. Almost 70% of the film has FX elements in it. As the VFX Supervisor on an animated film, I had to look at every single component of the film, not just FX but also set building and modeling, set dressing, character modeling, building articulation, tailoring, cloth simulation, hair grooming – and that’s just the asset-building side. Then we have lighting and shading. I’m sure there’s some element in there I missed, but you can kind of get the picture that on an animated film, every single component of every visual thing that is on the screen, we had to account for.
A frame from Walt Disney Animation Studios’ Encanto, on which Marlon West served as Head of Effects Animation. (Image courtesy of Disney)
Among the many technical hurdles Sony Pictures Imageworks had to conquer in The Sea Beast was realizing the distinctive crease lines and wrinkles on several of the characters’ faces. Usually, such wrinkle-like features are modeled or textured in as fixed detail.
Looking to capitalize on earlier work done at Imageworks with inklines on Spider-Man: Into the Spider-Verse, Visual Effects Supervisor R. Stirling Duguid and his team developed a tool called CreaseLines that gave animators the ability to easily, dynamically create and control curves on faces to define the right emotive facial creases.
“Normally if you model things like that, it requires a high-density mesh, or you have to use displacement maps, which are hard for an animator to visualize,” Duguid explains. “Our Animation Supervisor, Joshua Beveridge, had this idea for crease lines, which came from Spider-Verse. It was about thinking in the animator’s shoes, dealing with the director, getting a note and finding the quickest way to address the note.
“CreaseLines gave us the flexibility to move those lines and not be constrained by the topology, as far as how dense the mesh was. This let us directly drive displacement of the face meshes. It was a real win.”
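The core of a curve-driven crease tool can be sketched as displacement with a smooth falloff around an artist-drawn curve. The snippet below is purely illustrative and stands in for, rather than reproduces, CreaseLines:

import numpy as np

def crease_displace(verts, normals, curve_pts, depth=0.02, radius=0.05):
    """Push verts inward along their normals near a sampled crease curve."""
    out = verts.copy()
    for i, v in enumerate(verts):
        d = np.min(np.linalg.norm(curve_pts - v, axis=1))  # nearest sample
        if d < radius:
            t = 1.0 - d / radius
            falloff = t * t * (3.0 - 2.0 * t)  # smoothstep: deepest at curve
            out[i] = v - normals[i] * depth * falloff
    return out

Driving displacement directly from the curve is what frees the crease from mesh topology: an animator can slide the curve anywhere on the face and the dent follows, without needing a dense mesh or hand-painted maps.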
“While animation has always sought to create new and imaginative worlds, the last few years have seen a rapid increase in the variety of visual styles produced across the industry. Despite this increase, we’ve only barely cracked open the visual possibilities in animation, which really excites me for the future.”
—Matt Baer, Visual Effects Supervisor, DreamWorks Animation
Marlon West: Multiple time zones were our main challenge on Iwájú. We have artists in Los Angeles, London, Lagos, Montreal and Vancouver. At one point, we had artists in Uganda, Kenya and Zimbabwe as well. While not hugely technical, the biggest challenge was initially getting story, art and editorial teams, who have worked primarily with our internal tools, to work with outside partners.
Alex Parkinson: Being an independent studio, we must find ways to match the quality of big studio movies as closely as we can, and that is a bar that is constantly moving, so our tools and workflow must constantly evolve to keep up. Often our problem is stylization. For example, three of our recent movies, Riverdance: The Animated Feature, Paws of Fury: The Legend of Hank and Hitpig, have featured sequences with fast-flowing water. Because we use tools developed primarily for live-action FX, they tend to produce photorealistic results. They don’t fit in our world, so we must find ways to make natural elements feel more ‘cartoony.’
Mirabel, voiced by Stephanie Beatriz, in Encanto. The fireworks were crafted as effects animation. (Image courtesy of Disney)
Stylization: a major trend in animation
Alex Parkinson: The use of non-photorealistic rendering, or NPR, exploded after Spider-Man: Into the Spider-Verse. That showed the potential for what a CG-animated movie could be. I see it as part of the maturing of our industry. If you think about 2D animation and all the looks and styles that it covers, from an episode of The Simpsons to crazy anime action, to the work of Cartoon Saloon, it is so varied and creative. CG animation is a very young art form, and up until now has tended to stay within the styles it was born from, like Toy Story, Shrek, etc. That is to say, more photoreal. 3D animation is branching out, experimenting, and developing all new NPR techniques – that’s very exciting.
R. Stirling Duguid: Spider-Man: Into the Spider-Verse was a pretty big splash for Imageworks as far as doing a whole movie like that. Then look at the evolution into The Mitchells vs the Machines, where we also took it to a really nice place. And, actually, it was quite different. You can clearly see the difference between Mitchells and Spider-Verse, but it was using a lot of the same technology.
A storyboard frame from a driving sequence in DreamWorks Animation’s The Bad Guys. (Image courtesy of Universal Pictures and DreamWorks Animation)
After storyboard, the studio moves to previs and early animation. (Image courtesy of Universal Pictures and DreamWorks Animation)
The final rendered frame. On The Bad Guys, DreamWorks Animation injected an enhanced level of stylized look and feel to the final images, partly to reflect the source material. (Image courtesy of Universal Pictures and DreamWorks Animation)
Jane Yen: I think we are now going to see movies come out that have a much more stylized and ‘pushed’ look. I think you’ll see that in our next feature film, Elemental. Even though Lightyear may not have pushed that spectrum, specifically, I think the industry is leaning that way, and I’m excited to see what comes out of it.
“Being an independent studio, we must find ways to match the quality of big studio movies as closely as we can, and that is a bar that is constantly moving, so our tools and workflow must constantly evolve to keep up. Often our problem is stylization. For example, three of our recent movies, Riverdance: The Animated Feature, Paws of Fury: The Legend of Hank and Hitpig have featured sequences with fast-flowing water. Because we use tools developed primarily for live-action FX, they tend to produce photorealistic results. They don’t fit in our world, so we must find ways to make natural elements feel more ‘cartoony.’”
—Alex Parkinson, Visual Effects Supervisor, Cinesite Studios
For The Sea Beast, Imageworks developed a new animation rig to deal with the many different kinds of ropes seen in the film, allowing them to stretch and hang and be animated as realistically as possible. (Image courtesy of Netflix and Sony Pictures Imageworks)
A face-crease-lines tool helped the studio dynamically create and control curves on faces to define the right emotive facial creases. (Image courtesy of Netflix and Sony Pictures Imageworks)
Marlon West: Stylization, supporting production design and character animation, has been very important for us at Walt Disney Animation Studios. We have endeavored to share 2D sensibilities with team members who started their careers creating images in CG. Classic animation principles such as overlap, staging and anticipation are important to character animators, and they are just as valued by Walt Disney effects animators, too.
A final frame from The Sea Beast. (Image courtesy of Netflix and Sony Pictures Imageworks)
Matt Baer: While animation has always sought to create new and imaginative worlds, the last few years have seen a rapid increase in the variety of visual styles produced across the industry. Despite this increase, we’ve only barely cracked open the visual possibilities in animation, which really excites me for the future.
By TREVOR HOGG
Visual Effects Supervisor Ryan Tudhope honored and preserved the messiness of the practical aerial photography for Top Gun: Maverick, which in turn made the 2,400 visual effects shots seamless. (Image courtesy of Paramount Pictures)
There are strange variables in play in the 2023 Oscar race for Best Visual Effects. For one, filmmaker Taika Waititi and actress Tessa Thompson did a scene breakdown video for Vanity Fair and made fun of some visual effects work in Thor: Love and Thunder, without mention of the groundbreaking camera and lighting techniques utilized by Marvel Studios Visual Effects Supervisor Jake Morrison to create six separate lighting passes simultaneously for the Moon of Shame sequence without interrupting principal photography. In an interesting twist, the comments sparked an Internet frenzy about the unreasonable demands and deadlines that digital artists have to contend with on a daily basis, with the backlash possibly having a ripple effect on other MCU contenders Doctor Strange in the Multiverse of Madness and Black Panther: Wakanda Forever.
Then, there is the matter of Top Gun: Maverick, where the filmmakers and studio are marketing how everything was done practically. However, to achieve the desired cinematic scope, there are over 2,000 visual effects shots that have been seamlessly integrated into the remarkable aerial plate photography, such as the opening Darkstar scene; that in itself should make the blockbuster, which received critical acclaim and earned $1.37 billion worldwide as of mid-August, a favorite to win. Curiously, though, the visual effects team led by Visual Effects Supervisor Ryan Tudhope has been grounded from promoting the film, and there is unlikely to be any campaign support from the VFX team to add fuel to the nomination fire. Nevertheless, there is a strong possibility that no perceived lack of VFX team publicity can stop Top Gun: Maverick from topping the field, as demonstrated by Dunkirk’s Oscar wins in 2018.
The major technical innovations for Avatar: The Way of Water have been facial capture, the ability to do performance capture underwater and the recreation of realistic CG water. (Image courtesy of 20th Century Studios)
Avatar: The Way of Water is certainly getting studio support, with the original Avatar (2009) being re-released to theaters to remind audiences of the highest-grossing film of all time. It is never wise to bet against director James Cameron, who knows how to push and invent technology to enhance his storytelling, in this case facial capture. With Cameron out to build on the spectacle of Avatar, the visual effects for The Way of Water are sure to be amazing. Some theatergoers will be stunned by the visuals, but repeat viewings will likely depend on the story development, which Cameron did not rush, as he understands the multi-film endeavor is pointless without a solid narrative foundation.
As for those filmmakers who showed ingenuity and a unique perspective towards how to incorporate visual effects, two in particular stand out. To begin, director Jordan Peele is having a major impact on redefining the horror genre and having it be a mirror that reflects the beauty and ugliness of society. Nope addresses the issue of spectacle and elevates the UFOs of B-movies into an aerial creature wreaking havoc on the world below. Massive wind machines were needed, so a helicopter was brought in to generate practical dust for the shots. Day-for-night photography consisted of a 3D rig that synced a color film camera with an infrared digital camera under the guidance of innovative Cinematographer Hoyte van Hoytema and Production Visual Effects Supervisor Guillaume Rocheron. And as an added bonus, there is the rampaging monkey, brought to life by performance capture legend Terry Notary.
Some might argue that the visual effects are not Oscar caliber, but one has to be impressed by how filmmakers Dan Kwan and Daniel Scheinert were able to depict a believable multiverse without an MCU budget for Everything Everywhere All at Once. That in itself is award-worthy. The visual effects team consisted mainly of five digital artists producing 80% of the over 500 bespoke shots, which include an acrobatic juggling hibachi chef who in reality was an actor pantomiming his actions with the culinary tools and ingredients added later in CG. Also, there was a case where a character had to be removed from an entire scene. Shots like the first “verse-jumping” shot of Michelle Yeoh’s character benefited from the extra time afforded by the pandemic. Michelle Yeoh is an amazing practical effect in herself, as she at one time overshadowed another martial arts icon, Jackie Chan, who actually rejected the lead role of what has become the first A24 movie to earn over $100 million worldwide.
Every single shot required digital augmentation, from rig cleanup to water simulations to digital skies, while retaining the stop-motion handcrafted aesthetic for Guillermo del Toro’s Pinocchio. (Image courtesy of Netflix)
Stunning CG environments will be the hallmark of Black Panther: Wakanda Forever, with the futuristic African homeland being the centerpiece. (Image courtesy of Marvel Studios)
Doctor Strange in the Multiverse of Madness features artistic visual effects such as a twisted orchard ravaged by magic rather than fire. (Image courtesy of Marvel Studios)
Feathered dinosaurs like the Pyroraptor make a debut in Jurassic World Dominion, with ILM having to develop a new feather system to make it possible. (Image courtesy of Universal Pictures)
By combining practical ingenuity and virtual production methodology, Bullet Train was able to create the impression of high-speed travel through Japan when, in fact, the principal photography took place on a soundstage in Los Angeles. (Image courtesy of Columbia Pictures)
Speaking of multiverses, director Sam Raimi brings his own sense of dimensional mayhem with Doctor Strange in the Multiverse of Madness, which stands to be the top contender for the MCU. Raimi has a distinct blend of horror and comedy, which is appropriate for a story that centers around the egotistical and sardonic Master of the Mystic Arts portrayed by Benedict Cumberbatch. An incursion occurs that sees two dimensions disintegrate, a mirror trap is sprung with shards of glass, and a magical, pristine orchard is revealed to be a twisted forest conjured out of a Brothers Grimm fairy tale. Found in the heart of darkness is an evil doppelganger, run-amok scarlet witchery and a beloved 1973 Oldsmobile Delta 88.
Black Panther: Wakanda Forever looks as impressive as the original film and carries the legacy of Chadwick Boseman. If anyone can give his late colleague a worthy send-off, it will be filmmaker Ryan Coogler, who returned to direct the sequel. The world-building has made the futuristic African land of Wakanda a wonder to behold. And let us not forget all of the amazing technological toys that can be created using Vibranium, which rivals anything Q has produced for English compatriot James Bond.
If visual mayhem is what you seek, there is Moonfall, which sees gravity go sideways as the lunar neighbor is revealed to house a dwarf star and becomes the target of a nanobot AI determined to get its vengeance against humanity. When it comes to destroying Earth, no one can do it better creatively and consistently than director Roland Emmerich. Very few sets were built physically, but a shuttle cockpit was brought in from a museum to assist with the flying scenes. Inflicting massive damage with an arsenal of weaponry rather than a cosmic event are the Russo siblings, Anthony and Joe, with The Gray Man, which ups the ante for assassins with no sense of covert activity, as their missions make news headlines and entire city blocks get decimated in the effort to kill one person!
Ryan Reynolds enters the fray with The Adam Project, where he gets to encounter his younger sarcastic self and attempts to destroy the invention of time travel much to the chagrin of author H.G. Wells. There is cool tech involved for some blockbuster flying and badass hand-to-hand combat sequences, but the visual effects do not capture the innovation of the zanier Free Guy, which is also the product of Reynolds partnering with fellow Canadian director Shawn Levy. As for award-worthy video game adaptations, Uncharted emerges from development hell with Ruben Fleischer shepherding the big-screen adaptation. Tom Holland’s acrobatic antics as Nathan Drake would even impress his most famous character, Spider-Man, especially the aerial daisy-chain sequence, and there is a different spin on a naval battle. While two long-lost 16th century ships are being transported in the air by heavy-duty cargo helicopters, the opposing forces swing on ropes going from one seafaring vessel to another.
Battling for supremacy in the DC Universe will be The Batman and Black Adam. Black Adam hopes to dodge the critical and visual effects backlash of The Scorpion King, which also starred Dwayne Johnson getting involved with Egyptian god shenanigans. No doubt the technology has greatly improved since then, but the rush to finish the visual effects on time hopefully won’t undermine quality. Johnson is such a likable person that it will be interesting to see him portray an antihero. In The Batman, the rain-soaked car chase and the flooding of the streets of Gotham, handled by Wētā FX and Scanline VFX, are standout environmental moments when it comes to visual effects. Watch out for The Batman punching and grappling his way to a nomination.
Sony continues to spotlight comic book villains with the doctor turned bloodsucking creature of the night: Morbius reveals that the cure is worse than the disease. The vampire faces are the most impressive digital work, but the toughest task was honoring physical dynamics. The entire third act was rewritten and had to be reconstructed with bluescreen. Sonic the Hedgehog 2 was nearly derailed when the restart after the pandemic lockdown led to a digital artist talent shortage and capacity issues with different vendors around the world. Fortunately, no character designs were required this time around, so the focus could be on introducing new characters and environments. Jim Carrey is let loose once more as the evil Dr. Robotnik with the twirlable mustache, aided by a gigantic mech robot, with Idris Elba channeling the adversarial Knuckles.
A crowning accomplishment for Thor: Love and Thunder was the ability to capture six different lighting passes simultaneously and not interrupt principal photography during the Moon of Shame sequence. (Image courtesy of Marvel Studios)
Sonic the Hedgehog 2 embraces its video game and cartoon heritage, demonstrating that not everything has to be photorealistic. (Image courtesy of Paramount Pictures and Sega of America)
Among the environmental work in Elvis is a visit to the Graceland estate over three decades and three different seasons. (Image courtesy of Warner Bros. Pictures)
Nope features a creature of the sky, major dust simulations, day-for-night photography and a raging monkey, all done in a photorealistic manner. (Image courtesy of Universal Pictures and Monkeypaw Productions)
A signature action scene for Uncharted made use of gimbals, wirework and CG to produce an aerial daisy chain of cargo crates. (Image courtesy of Columbia Pictures)
Unreal Engine and virtual production were indispensable tools for the art department, cinematography, stunts, special effects and visual effects for The Batman. (Image courtesy of Warner Bros. Pictures)
The recreation of ancient Egypt mixed with superpowers leads to stunning visuals in Black Adam that could only be accomplished with the support of CG. (Image courtesy of Warner Bros. Pictures)
The high tech that goes along with time travel gets an imaginative spin in The Adam Project. (Image courtesy of Netflix)
Elba gets to literally punch a malevolent lion in Beast, which features a CG antagonist in the vein of the infamous grizzly bear attack in The Revenant. Think Jaws on safari. Icelandic filmmaker Baltasar Kormákur has gained a reputation for being able to shift between blockbusters like Everest and indie films such as The Oath; he is aware of the importance of visual effects and of using them wisely, as reflected by his ownership of RVX, the effects arm of RVX Studios, and his ongoing collaboration with Framestore. Other creatures unleashing havoc on the human population are the prehistoric ones brought back to life in Jurassic World Dominion, which ties back to the seminal Jurassic Park, a film that achieved groundbreaking photorealistic digital effects. Production Designer Kevin Jenkins, Visual Effects Supervisor David Vickery and Creature Effects Supervisor John Nolan worked closely together to ensure that the dinosaurs were anatomically correct and that as much of the animatronic work as possible could be maintained. The major innovation was finally introducing dinosaurs with feathers, like the Pyroraptor, to the franchise, which required ILM to build a new feather system.
Curiously, the live-action version of Pinocchio by director Robert Zemeckis and starring Tom Hanks is going directly to Disney+, so it will not qualify for the Oscars. However, there will be a brief theatrical run before Guillermo del Toro’s Pinocchio takes up permanent residence on Netflix. It might seem a stretch to include a stop-motion animated feature as part of the contender list, but the feat was actually achieved by Kubo and the Two Strings. The realm of Limbo and the interior of the dogfish are two major CG environments, and there is also a minor fully-CG character, while atmospherics range from realistic mist to snow that is given a paper-like quality, along with surrealistic skies, set extensions and plenty of clean-up resulting from set shifts, light flickers, dust and hair. All of this is done while maintaining a live-action approach to both the camerawork and the animation of the puppets.
Other award-worthy possibilities include the prequel Fantastic Beasts: The Secrets of Dumbledore, which casts spells of good and evil and features various magical creatures from the imagination of J.K. Rowling. Bullet Train recreated Japan and a speeding locomotive with LED screens and a soundstage in Los Angeles. Idris Elba appears as a wish-fulfilling genie in Three Thousand Years of Longing, directed by George Miller. Three decades of the life of Elvis Presley get the Baz Luhrmann treatment in Elvis, which is basically a film about a showman by a showman. The visual effects are faithful to the period while also having an element of hyper-realism to them when depicting Graceland in the various stages and seasons of the 1950s, 1960s and 1970s. The winner for Best Visual Effects at the 95th Academy Awards, being held on March 12, 2023 at the Dolby Theatre in Los Angeles, is not a foregone conclusion, and that will make an interesting change from last year, when Dune was the runaway favorite and the only question was which runner-up films would get nominated.
By CHRIS McGOWAN
DNEG employees benefit from a variety of training resources available through the firm’s intranet. (Image courtesy of DNEG)
In these days of rapidly developing VFX technologies and processes, it can be hard to keep up in your field, let alone master new realms. Fortunately, there are many options for continuing education – on the job or via schools or online tutorials. “Since a majority of VFX workflows are closely allied with emerging computing technologies, VFX artists, production management and pipeline staff need to constantly keep abreast of change in workflows and tools resulting from new technologies,” says Shish Aikat, Global Head of Training for DNEG. “Depending on the depth of commitment a VFX artist has to a specific topic or workflow, the sources of learning can be anywhere from a series of tutorials on YouTube, online courses at portals such as fxphd or Gnomon Online [to] an undergraduate/graduate program in VFX at a degree/diploma-granting institution.”
Oftentimes, one need not go further than one’s job to find new training. Dijo Davis, Senior Training Manager for DNEG, comments, “Once the artists make it into DNEG, a variety of in-house training resources is available for their reference via our intranet. In addition to this, custom ‘upskilling’ training programs are organized and conducted to support employees and keep the existing and upcoming shows in line.”
Here is a look at several upskilling paths for the VFX artist.
A clay sculpting master class is one of the benefits at Framestore. (Image courtesy of Framestore)
TCS and its studios have a trainer for each department who specializes in the tools of that department and its pipeline. (Image courtesy of Technicolor Creative Studios)
The Unreal Fellowship is a free 30-day blended experience for learning Unreal Engine. (Image courtesy of Epic Games)
“Thanks to a lot of open-source tools for VFX, a lot of tools and learning materials are available, and going by the views and comments on VFX tutorials on YouTube, it appears that a surging number of VFX artists are taking advantage of these courses. Game engines, GPU rendering and real-time technologies have piqued the interest of a multitude of VFX artists. Companies like Epic Games and Unity offer hundreds of courses and channels for artists to interact, and VFX artists are thronging in large numbers to these sites and channels.”
—Shish Aikat, Global Head of Training, DNEG
ONLINE RESOURCES
A vast variety of visual effects courses is available on the internet. For starters, “The tutorials you can find on independent sites or YouTube are super helpful to help fill in any targeted foundational gaps an artist may have. I would say the companies and artists that provide these tutorials are champions for these applications, and they are a great way to learn focused skills,” says Matthew Cruz, Global Creative Training and Development Manager for Technicolor Creative Studios (TCS).
Aikat adds, “Thanks to a lot of open-source tools for VFX, a lot of tools and learning materials are available, and going by the views and comments on VFX tutorials on YouTube, it appears that a surging number of VFX artists are taking advantage of these courses. Game engines, GPU rendering and real-time technologies have piqued the interest of a multitude of VFX artists. Companies like Epic Games and Unity offer hundreds of courses and channels for artists to interact, and VFX artists are thronging in large numbers to these sites and channels.”
Supported by Netflix, the VES Virtual Production Resource Center offers access to free educational resources and information on the latest trends and technologies. Aimed at both current and future VFX professionals, the center is an effort of the Visual Effects Society in collaboration with the VES Technology Committee and the industry’s Virtual Production Committee (https://www.vesglobal.org/virtual-production-resources).
“When it comes to tools, there’s a tremendous amount of autodidactic training that the artists do on their own personal time with both paid and free master classes and a lot of online videos,” comments Sylvain Nouveau, Rodeo FX Head of FX. “VFX artists spend a lot of time sharing their knowledge with each other both during and after work hours – scouring for videos online, posting in relevant forums and checking Reddit.”
A demo being shown inside the Virtual Production Stage at FMX 2022. (Image courtesy of David Schaefer)
Many students attend New York City’s School of Visual Arts (SVA) to get a next-level job or change their career path. (Image courtesy of SVA)
A virtual production class at Escape Studios at London’s Pearson College. (Image courtesy of Pearson College)
SOFTWARE COMPANY TUTORIALS
Some software company tutorials (for VFX, animation and related creative tools) charge a fee, but many are free. Autodesk offers thousands of free tutorials across YouTube and AREA, according to a company spokesperson. Its YouTube channels include Maya Learning Channel, 3ds Max Learning Channel, Flame Learning Channel and Arnold Renderer. And there are AREA tutorials and courses at https://area.autodesk.com.
“SideFX offers free lessons about Houdini and Houdini Engine, which plugs into applications such as Unreal, Unity, Autodesk Maya and 3ds Max,” says SideFX Senior Product Marketing Manager Robert Magee. The SideFX website (https://www.sidefx.com) is host to 2,758 lessons designed to support self-learning within the community. Magee explains, “To help navigate all of these lessons, the SideFX site has 18 curated ‘Learning Path’ pages, which highlight the best lessons to explore for a variety of topics.” The 450 lessons created and published by SideFX are all free. The other 2,300+ tutorials are “submitted by the community.”
The Unity Learn platform offers “over 750 hours of content, both free live and for-fee on-demand learning content, for all levels of experience,” according to the company (https://learn.unity.com/). There are hundreds of Red Giant tutorials on the company’s YouTube channel (https://www.youtube.com/c/Redgiant), and free Maxon webinars are available at https://www.youtube.com/c/MaxonTrainingTeam. Foundry’s site, learn.foundry.com, has Nuke, Modo, Flix and Katana tutorials, among others.
CONFERENCES
Conferences such as SIGGRAPH and FMX also help keep artists current. DNEG’s Aikat notes, “Many smaller VFX houses do not have the resources to house an R&D department, and they often lean on technologies and workflows seeded through academic research. Conferences like ACM SIGGRAPH are opportunities for VFX artists to learn about the products of such research. Many VFX studios send delegates to these technology conferences to soak in the future trends, workflows and solutions predicated by emerging technology.”
FMX, held in Stuttgart each spring, draws thousands of professionals and students. “At FMX, we curate conference tracks on the latest and greatest developments in projects and processes as well as in hardware, software and services,” says Mario Müller, FMX Project Manager. “Conferences play an important part in continuing education [about] the art, craft, business and technology of visual effects. Compared to the readily available information by vendors on the internet, a conference can provide a more concentrated, curated and neutral perspective as well as a community platform.”
The RealTime Conference, founded in 2020 by Jean-Michel Blottière, is fully virtual and free. The live gathering explores real-time technologies with live demos, panels, workshops, classes and keynote speeches (https://realtime.community/conference).
VIEW Conference (https://www.viewconference.it), another notable event, is set in Turin (Torino), Italy, and focuses on computer graphics, interactive techniques, digital cinema, 2D/3D animation, VR and AR, gaming and VFX.
VFX/ANIMATION SCHOOLS
Schools offer various levels of courses, which can be utilized by VFX artists currently working or between jobs. “VFX artists are constantly learning,” says Colin Giles, Head of School of Animation at Vancouver Film School (VFS). “As the industry changes, it is important that people who work as VFX artists have the outlets to keep learning while they’re working. This is why we offer courses and workshops – called VFS Connect – to help people learn part-time without having to leave their jobs.”
For VFX artists to keep up-to-date at work, “some of the bigger studios have quite extensive internal training programs,” while “smaller studios and freelancers depend on online resources [free and paid for] and upskilling via training providers like ourselves,” says Dr. Ian Palmer, Vice Principal at Escape Studios, part of Pearson College in London.
According to Scott Thompson, Co-founder of Think Tank Training Center (TTT) in Vancouver, “In some cases, studios do work with schools like Think Tank to increase their understanding of new ideas. As an example, Think Tank had Mari well before the studios did, so our grads often became teachers to catch them up.”
Escape Studios provides continuing education for VFX professionals “through our short courses, most of which are in-person. We do short daytime courses for those who can take a break from their day job and evening courses for those who are fitting around other commitments. Many of our evening classes are available online,” Palmer says.
Framestore offers its employees classes in everything from software mastery to life drawing to clay sculpting. (Image courtesy of Framestore)
Students on set in the on-campus greenscreen room at Vancouver Film School. (Image courtesy of the Vancouver Film School)
Technicolor Creative Studios, through MPC and its other studios, has a training academy to educate new employees in specific VFX tools. (Image courtesy of Technicolor Creative Studios)
The Unreal Fellowship teaches about virtual production and the mechanics of using Unreal Engine for storytelling. (Image courtesy of Epic Games)
VFX artists are often looking for an upgrade. Palmer explains, “Sometimes it’s just to get a fresh skill set to enhance their career. We also have people that want to change direction and need some guidance in that.”
A lot of students at New York City’s School of Visual Arts (SVA) “are working professionals,” says Adam Meyers, a Producer at the school. “When you have that allotted time after work hours each week, it seems easier than fitting it in at work. Most of them are looking to get training that is more focused.” Meyers adds that “my current continuing education classes are online since COVID.” Asked if he often sees visual artists leaving the industry and going to school for an upgrade to get a “next-level” job or to change their career path, Meyers responds, “Every semester. Education is about growth. Artists evolve just like the software.”
At the Savannah College of Art and Design in Georgia, students in the visual effects department take on assignments that reflect the most current working studio practices, such as in virtual production. Dan Bartlett, SCAD Dean of the School of Animation and Motion, notes, “The LED volumes in Atlanta and in Savannah are industry standard, so what we’re able to do in these spaces is create learning experiences that are absolute mirrors of what they would be doing if they were working for a major studio on features or on long-form television. Students work on everything from negotiating the production design components to developing both the digital and the physical assets that go into a shoot, to working in-engine – in our case Unreal Engine – to build the virtual cameras and the virtual lighting setups in order to bring those on-set shoots to life.”
At some VFX studios, like Framestore, visual effects artists do extensive in-house training and also benefit from some specialized classes. “Framestore has a real ‘melting pot’ of learning styles and preferences,” Simon Devereux, Framestore Director, Global Talent Development, says. “Historically, their studios have always offered their employees everything from life drawing and clay sculpting masterclasses to software mastery and behavioral skills training.”
Framestore recently hired a new Global Head of Content & Curriculum, Chris Williams, “who now leads the development of technical and software training, building new learning pathways, and he will ultimately develop the teaching capabilities of all our employees in order to support our global network of offices,” Devereux says. “That, along with a dedicated Unreal and 3D trainer and two production trainers, means we’re in a unique position to build on what is already an incredible investment in the personal and professional growth of our colleagues. In addition to this, we invest in a range of technical tutorial-based training platforms that are accessible across all of our studios.”
Aikat adds, “Constant in-house upskilling with customized training programs and keeping a close eye on the new trends in the market is the key to success.” DNEG also supplements its training curriculum with talks from experts on a range of topics that cover technical tools, product technologies and creative pursuits.
Rodeo FX helps broaden its employees’ horizons with evening classes (paid for by Rodeo) at official partner schools and a few technical colleges. But the greatest learning may come from the other artists. “Fifty percent of what we learn comes from others – whether it is new talent arriving from other companies or Rodeo employees sharing their knowledge with other studios,” Sylvain Nouveau, Rodeo FX Head of FX, comments. He adds, “In many ways, it’s all of these exchanges that keep the industry moving forward. With an average of about two years in a studio, there’s a lot of information flowing.” Marie-Denise Prud’homme, Rodeo FX Manager of Learning and Development, remarks, “Formally, artists share information with each other [tutorials, video how-tos] through an information-sharing platform we use called Confluence. But shared learning can even be as simple as arranging meetings and calls on the fly.”
Behind the scenes of a fireside chat on the Virtual Production Stage at FMX 2022 with, from left: Hasraf ‘HaZ’ Dulull, HaZimation; Shelley Smith, DNEG Animation; Mikko Matikainen, The Mill; Paul Debevec, VES, Netflix; and David Sheldon-Hicks, Territory Studios. (Image courtesy of Dominique Brewing)
Many students at New York City’s School of Visual Arts (SVA) are working professionals looking to get more focused training. (Image courtesy of SVA)
An Escape Studios virtual production class taking place at the LED volume of MARS, an Escape partner, in West London. (Image courtesy of Pearson College)
For on-the-job training, TCS has had a training academy for years “for college leavers to be trained on photorealism and specific VFX tools and pipeline for features,” Cruz comments. Recently it started a new initiative. “We have a trainer for each department who is a specialist in the tools of that department and pipeline. For example, for the FX department we have a dedicated FX trainer to help the department on the floor, so they would be an expert in Houdini, for example, and FX sims.” (TCS’s network of studios includes The Mill and MPC.)
Ron Edwards, TCS Global Head of Commercial Development – L&D, notes, “We encourage our artists to master cutting-edge technology and tools so they can produce content at the highest caliber. The company will also sponsor these efforts and provide employees the chance to learn from accredited institutes and alongside their peers to further their careers.” Edwards adds, “We always want to stay on top of the latest tech to future-proof ourselves and our students.”
Escape’s Palmer comments, “It’s always an exciting field to work in. Just when you think you’ve seen it all, something comes along to amaze you. The industry is full of very bright people with a thirst for knowledge, so while it’s challenging to keep up, that’s what makes it so interesting. Long may it continue!”
By TREVOR HOGG
Kate Winslet as Ronal, left, and Sigourney Weaver as Dr. Grace Augustine taking part in underwater and surface performance capture; the two passes were stitched together into a singular performance. (Photo: Mark Fellman)
Images courtesy of 20th Century Studios
After recreating a famous nautical disaster that became the highest-grossing film of all time, filmmaker James Cameron turned his focus towards the skies and imagined what galactic colonialism would look like if humans discovered other inhabitable planets and moons. Avatar went on to unseat Titanic at the box office and is now being expanded into a franchise consisting of four planned sequels, the first of which, Avatar: The Way of Water, takes place 14 years after the original movie, in which wheelchair-bound mercenary Jake Sully (Sam Worthington) leaves behind his crippled body for a genetically engineered human/Na’vi hybrid body to live with the indigenous Neytiri (Zoe Saldana) on the lunar setting of Pandora.
“Visual effects allow us to put up on the screen compelling and emotive characters that could not be created with makeup and prosthetics, and cannot be as engaging with robotics, and it allows us to present a world that doesn’t exist in a photoreal way as if that world really exists,” Producer Jon Landau remarks. “Those two things, combined with the story that we have, creates a compelling cinematic experience.”
The biggest technical advancement has been in facial capture. “On the first movie, we recorded facial performance with a single standard-definition head rig,” Landau details. “This time around we’re using two high-definition head rigs. We’re capturing quadruple, or more, the amount of data to drive the performance. Wētā FX has a smart learning algorithm that trains on what the actors do after we put them through a FACS session.”
A template is constructed before principal photography commences. “We probably have 160 people here in Los Angeles who build these files, which are a slightly cruder representation of the movie but describe exactly what it is we want to achieve,” Production Visual Effects Supervisor Richard Baneham explains. “We acquire our performances, go through an editorial process selecting the preferred performances, and put them into what we call camera loads, which are moments in time as if they are happening. A scene might be made up of 10 ‘camera loads,’ or just one depending on the consistency of the performances that we need to deal with and the intended cutting pattern. It is a true representation of our intended lighting, environments and effects, to the point that, when we’re done with it, Jim often asks, ‘Does it match the template?’”
Fourteen years have gone by since we last saw Neytiri and Jake Sully, who have gone on to have a family that includes Kiri, Neteyam, Lo’ak and Tuk.
Returning as the main vendor to the franchise is Wētā FX, which handled the vast majority of the 3,200 visual effects shots, with ILM contributing 36. The performance capture took into account that a number of scenes take place above and below water. “When actors are performing underwater, almost always you can use the body capture, which is helpful for getting that sense of how characters would be behaving in water,” Joe Letteri, Senior Visual Effects Supervisor for Wētā FX, states. “The facial gets more complicated because the face rig that we can use above ground won’t hold up below ground. We used a lighter system underwater – a single GoPro – rather than the pair of high-resolution stereo cameras that we would normally use. If there was a particular emotional beat where we were on the characters and Jim needed their performance, as soon as he got the takes done in the water, he would have them come up, put on the normal head rig and repeat the performance for their facial capture above the water. Then we would stitch the two together later.”
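For a sense of the mechanics of such a stitch, here is a minimal Python sketch: a facial take recorded separately above water is re-timed onto the underwater body take’s timeline and its channels merged in. The Take structure, channel names and sync offset are illustrative assumptions, not Wētā FX’s actual pipeline.

```python
# Minimal sketch (not Wētā FX's pipeline) of stitching an underwater body take
# with a facial take repeated above water. Channel names, frame numbers and
# the sync offset are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Take:
    start_frame: int   # frame on the master timeline where this take begins
    channels: dict     # channel name -> list of per-frame values

def stitch(body: Take, face: Take, face_offset: int) -> dict:
    """Re-time the facial take onto the body take's timeline and merge channels."""
    merged = {name: dict(enumerate(vals, body.start_frame))
              for name, vals in body.channels.items()}
    for name, vals in face.channels.items():
        for i, v in enumerate(vals):
            # Shift the repeated facial performance so it lines up with the
            # editorially selected underwater body performance.
            merged.setdefault(name, {})[face.start_frame + face_offset + i] = v
    return merged

body = Take(1001, {"spine.rx": [0.0, 0.1, 0.2, 0.3]})
face = Take(2001, {"jaw.open": [0.0, 0.4, 0.7, 0.2]})
print(stitch(body, face, face_offset=-1000))  # facial frames land at 1001-1004
```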
Adopted into the Sully family is the human character of Spider, portrayed by Jack Champion. “We had Jack onstage for performance capture and captured everything with him in scale, which meant building multiple scale sets and getting the interaction from character to character in a nonuniform scale, which is a hard thing to do,” Baneham remarks. “Hopefully that culminates in a singular performance that has an integrated representation of Spider at the right scale. Then, Jim is able to take the cameras that he or I did and repeat it on set with the live-action crew. Jack as Spider live-action is often heavily informed by what he did in the capture session, but there is lots of room to maneuver or make choices on set that are different from what we did. Exterior forces impinging on a character are actually how you trick an audience into believing that something is really integrated when in fact it’s a full CG representation of Quaritch and a live-action representation of Spider. Pantomime is your enemy when it comes to integration.”
Even when the scripts were being developed, James Cameron had his team of concept artists at Lightstorm Entertainment visualizing settings, such as a bio reef from a high angle.
From left: Sigourney Weaver as Dr. Grace Augustine, James Cameron and Joel David Moore as Norm Spellman rehearse a scene in one of the many practical sets. (Photo: Mark Fellman)
Ferns were seen as a plant familiar to audiences, and one that could be hardy enough to live in other planetary environments.
Sigourney Weaver actually plays two characters, Na’vi teenager Kiri and Dr. Grace Augustine.
James Cameron enjoys being in the middle of the action when conducting principal photography.
Each sequel will introduce new cultures, with the coastal-dwelling Metkayina making an appearance in Avatar: The Way of Water. “The Metkayina skin is slightly greener and has stripes that have more of an aquatic feel, while the forest Na’vi are bluer and have a stripe pattern based on tiger stripes,” Letteri explains. “The Metkayina have a stronger dorsal-ventral coloring, where they are lighter in the front and darker in the back, more similar to what you would see with aquatic creatures. And they are evolved for swimming, so they have tails that have a wide end to them that can be used to help propel them through the water. The Metkayina have what we call stripes on their arms and legs that are like thicker skin, almost like flanges that can be used for propulsion. There were a number of design changes to make them distinctly adaptive to their environment, and that is part of the story. When Jake and his clan go to visit, they have to adapt, despite not being physically built for the environment.”
Marine life has been incorporated into the world-building of Pandora. “You figure that something like a fern would probably have evolved on another planet,” Letteri notes. “There are a lot of plants that we spread out to give you that familiarity, and interspersed are what we call the exotic plants, which are the Pandora-only plants. That got set up during the daylight scenes, and then for the nighttime scenes we turned on the bioluminescence, which was on both plants [ferns and exotic], which gave it that distinction. That was definitely inspired by bioluminescence underwater, and bringing it above ground and having it illuminate this whole forest. Underwater we took a similar approach by having different types of coral – like fan or tree corals – mixed with new ones that give it more of that Pandora feel, and adding bioluminescence.”
Everything is based on reality. Even the stripe pattern on the Na’vi was inspired by tiger stripes.
Returning as an antagonist is the character of Miles Quaritch, played by Stephen Lang.
From left: Ronal, Tonowari and the Metkayina clan have a skin tone that is greener than the bluer forest-dwelling Na’vi.
“Visual effects allow us to put up on the screen compelling and emotive characters that could not be created with makeup and prosthetics, and cannot be as engaging with robotics, and it allows us to present a world that doesn’t exist in a photoreal way as if that world really exists. Those two things combined with the story that we have creates a compelling cinematic experience.”
—Jon Landau, Producer
A sentient creature called Payakan has a pivotal role. “Don’t call him a whale!” Baneham laughs. “He’s a Tulkun. What we do is try to never allow design to be for design’s sake. The kinematic structures, environmental aspects of how and where they live, what they would eat and how they would hunt, all inform the design, which is the outward expression. What is bringing them to life is the motion side of things. The motion design is understanding how something locomotes, emotes and expresses itself on a physical level. We may need to make changes to where the fins are and how big they are. Then you ask, ‘What would make sense for this creature and character?’ You can start to manipulate the design into its final end stage. I always try to ground everything in terrestrial reference, and Jim is the same. Even though we know that it’s not real, our job is to make the audience walk away feeling that place and those characters can exist [on Pandora].”
On set was Wētā FX Senior Visual Effects Supervisor Eric Saindon, who had a close working relationship with Cinematographer Russell Carpenter. “In The Hobbit, there were a few shots in the troll sequence where we put a background plate behind a bluescreen and did a quick Ncam situation so you could see what was going on,” Saindon remarks. “In this film, every live-action shot was captured with Simulcam using depth-based compositing within the live-action camera. When you are looking through the lens, our live-action and CG elements are composited together at the correct depth and placed in the proper space. We know exactly where Spider is walking in and amongst five or six Na’vi. You know where the background falls and where the eyelines of the live-action characters could be. It’s not even using the bluescreen.”
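Saindon’s description of depth-based compositing boils down to a per-pixel nearest-wins merge of the live-action plate and the CG element. A toy numpy sketch of that idea (the arrays are stand-ins, not ILM’s Simulcam code):

```python
# A toy illustration of depth-based compositing: for each pixel, keep whichever
# element (live-action plate or CG render) is closer to camera. A production
# Simulcam rig does this per frame in the viewfinder; these arrays are stand-ins.
import numpy as np

def depth_composite(plate_rgb, plate_depth, cg_rgb, cg_depth):
    """plate_depth/cg_depth hold per-pixel camera-space distance (smaller = nearer)."""
    near_mask = (cg_depth < plate_depth)[..., np.newaxis]  # CG wins where it is nearer
    return np.where(near_mask, cg_rgb, plate_rgb)

h, w = 4, 4
plate = np.full((h, w, 3), 0.2); plate_z = np.full((h, w), 5.0)
cg    = np.full((h, w, 3), 0.8); cg_z    = np.full((h, w), 3.0)
cg_z[:, :2] = 9.0   # left half of the CG element sits behind the plate
out = depth_composite(plate, plate_z, cg, cg_z)
```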
Carpenter found his third collaboration with Cameron different from lensing True Lies and Titanic. “On a regular film, the director of photography is working with the production designer from the word ‘go,’” Carpenter observes. “Here, I came in and had to make sure that the language is already in the virtual world, because I was bringing live lighting to that. Towards the beginning of the film, there were definitely scenes populated by humans that had hardly anything virtual about them, but as the story branches out into the Na’vi world, then you’re into this world of compositing. That’s the point where it gets painstaking. Especially painstaking are the jungles, because what kind of light is coming down through the canopy? We don’t have a canopy onstage, but we do have branches that we can put that are fairly close. And if you’re running through a jungle, your actor is hit by this kind of light, and he goes 15 feet and it’s a different kind of light, and then there is something else. Not only does the light have to happen at the right time, but it also has to be the proper color temperature and right quality.”
Jack Champion rehearsed his human character of Spider on the performance capture stage to make his interactions with the CG characters more believable.
Action takes place out at sea. “That big Picador we actually built with two HamiltonJet engines in it,” Saindon reveals. “The boats were put on the same gimbal that we use for flying, but at a much bigger scale. We did simulations of those boats in the water so we got the proper motion of them going over the waves, how the bow goes up over the waves, and the way they jump the waves and move through.” Water interaction was achieved by utilizing spray cannons. “We could have faked it, because we had to add water anyway for the bow of the boat,” Saindon notes, “but what we wanted was Mick Scoresby [played by Brendan Cowell] up on the bow being hit by a wave, clearing his mask, being absolutely drenched, and knowing how to act if he got hit by a wave. We hit him with four spray cannons that just about knocked him over, because it was a lot more water than he was expecting! It puts you into the shot.”
Coordinating everything was a massive enterprise. “What was difficult for me, though it was satisfying watching what it eventually became, was not having that payoff at the end of the day when you’ve done a live-action film and go, ‘That was such a great scene. I loved watching the actor do that,’” Carpenter states. “The payoff comes much later when you’ve seen the whole thing done by Wētā FX.” The important aspect is making sure that the technology does not overshadow the storytelling. “That’s the thing,” Carpenter emphasizes. “This goes back to Jim. The miracle is that there were five and a half years of blood, sweat and tears that went into this, and there is a ton of technology, and when I look at the scenes, I don’t see any of the technology. It’s just an immersive experience. I see Jim as a North Pole explorer. He tries some challenges that he doesn’t know how to do, but is betting that he can do it by the time the film is finished.”
Jake Sully rides a Skimwing into battle.
Jake Sully and Neytiri take flight, riding the Mountain Banshee.
Aquatic creatures make an appearance, with a central one being a sentient species called Tulkuns. A key relationship is between Lo’ak and the Tulkun known as Payakan.
The human character role of Spider, portrayed by Jack Champion, was expanded by James Cameron, resulting in extensive integration between the live-action and animated performances. (Photo: Mark Fellman)
Unlike the marker-based facial capture used by Robert Zemeckis for Beowulf and The Polar Express, James Cameron decided to go the image-based route for Avatar. “It was an intense learning curve,” Cameron admits. “When we finished Avatar, what I requested was for the studio to continue to pay everybody for a couple of months so that we would have time to do a full and complete download and debrief. We did a three-day retreat during which I asked everybody to bring a white paper and notes of their three or four years of work and R&D so we could figure out how to do it better this time. Better in two ways: the end result on the screen looking better and a process that was more straightforward, efficient, intuitive and user friendly for artists.”
Along with merging their pipelines, Lightstorm Entertainment (a production company founded by Cameron and producer Lawrence Kasanoff) focused on the performance capture in water while Wētā FX looked after the CG water. “Wētā FX was responsible for developing all of the tools necessary for the computational fluid dynamic simulations that were necessary to simulate water and figuring out exactly the layers of simulation technology that would be required for that,” Cameron remarks. “In the early stages, my in-house team had to develop the methodology for capturing in the water. Our air volume used infrared. However, infrared doesn’t propagate underwater. We wanted something from a nonvisible spectrum so that our reference camera lighting didn’t interfere with the performance-capture camera system. We tested ultraviolet and that turned out to work quite well. Then we had to create the code to knit the two volumes together in real-time so that we would solve for the entire figure.”
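As a rough illustration of what “knitting the two volumes together” could involve, the sketch below blends a marker position reported by the infrared (air) solver with the one from the ultraviolet (underwater) solver, crossfading across a band at the water surface. The function name and the blend band are invented for illustration; the production system is far more involved.

```python
# Hedged sketch of "knitting" two capture volumes: near the surface both the
# infrared (air) and ultraviolet (underwater) systems may see a marker, so
# blend them by how far the marker sits below the waterline. Names and the
# blend band are assumptions for illustration only.
def knit_marker(air_pos, water_pos, depth_below_surface, band=0.25):
    """Return a single 3D position from the two solvers.

    depth_below_surface: metres below the waterline (negative = above water).
    band: width of the crossfade region straddling the surface.
    """
    if air_pos is None:
        return water_pos
    if water_pos is None:
        return air_pos
    # 0.0 -> trust the air system fully, 1.0 -> trust the underwater system fully.
    w = min(max((depth_below_surface + band / 2) / band, 0.0), 1.0)
    return tuple(a * (1 - w) + b * w for a, b in zip(air_pos, water_pos))

print(knit_marker((0.0, 1.0, 0.0), (0.02, 0.98, 0.0), depth_below_surface=0.05))
```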
James Cameron and actor Sam Worthington return to the Avatar franchise. (Photo: Mark Fellman)
James Cameron onboard one of the variety of military vehicles belonging to the human enterprise on Pandora known as the Research Development Administration. (Photo: Mark Fellman)
Water tanks were built specifically for the underwater performance capture. (Photo: Mark Fellman)
“When people see this film, they’ll be captivated by the world-building, the visuals, and all of the things that we couldn’t do with the camera. But what they’ll be astonished by is the sense of connection to the characters.”
—James Cameron
James Cameron describes visual effects as being the fabric of his craft when it comes to making Avatar movies. (Photo: Mark Fellman)
Edie Falco joins the Avatar franchise as General Ardmore. (Photo: Mark Fellman)
For James Cameron, visual effects are part of the image-making process.
Ocean conditions had to be recreated to get the proper performance capture interactions. “We created an underwater wind tunnel so we could have actors riding on their creatures and acting and interacting with each other where we could shoot the reference camera so that we could see their facial and body performances clearly,” Cameron states. “Capture data is useless if you don’t have great reference, because that reference footage is what we cut with as editors and is how we ground truth the dataset. It’s like a triangulation process. In the end, we used that quite extensively in the animation phase.” The technology enhances rather than hinders performances. “In live-action, an actor has to maintain a performance across multiple times, whereas with performance capture they only have to do it once. When we do additional takes, those takes are exploratory, they’re not about matching. When people see this film, they’ll be captivated by the world-building, the visuals, and all of the things that we couldn’t do with the camera. But what they’ll be astonished by is the sense of connection to the characters.”
Helping to drive the storytelling is a group of in-house concept artists at Lightstorm Entertainment. “On the sequels, I started by working for six months generating 800 pages of notes on the characters and the general ideas of the story,” Cameron explains. “I gave those notes to my co-writers and was already in the design process. It was a parallel processing between the story-building and the world-building. I knew that they were going to need creatures to ride, so I came up with two creatures – an Ilu and a Skimwing. I said, ‘Start playing with these ideas.’ Meanwhile, I’ve started to write characters jumping on an Ilu and doing this and that. It emerged that the Skimwing would be more of a warrior’s mount and the Ilu is more of a local horse-like creature, not in appearance but in behavior. We have this idea from Hitchcock on down that auteurs already have the movie running in their head and it’s just a process of communicating it to everybody else. It’s not like that at all. You have this fuzzy picture. That’s why I work so well with all of these artists, because they know that there’s something there and that their input will be the final thing that people see.”
“As I was coming up as a director, there were the regular sequences and then there were the visual effects sequences,” Cameron observes. “An Avatar movie works differently. The Way of Water is three hours long, and there is not one second of that which is not a visual effect. It plays more by the rules of an animated film, except the end result looks like photography. We used to call it special effects; however, when they’re not special anymore, what do you call them? To me, they’re not visual effects anymore, but the image-making process. My live-action cast of Jack Champion, Brendan Cowell and Edie Falco worked in the performance-capture volume first, roughing in the scene so that later they wouldn’t be disoriented when shooting in front of a bluescreen and a partial set. When you ask me, ‘What is my attitude towards visual effects now?’ It’s my craft. Do I fantasize about just going out and grabbing a camera because I love to handhold it and shoot a live-action film down and dirty? Yeah, I love that. But the truth is that I get to do that within the greater story of an Avatar production.”
By JIM McCULLAUGH
Iwájú, a comics-style animated series set in a futuristic Lagos, Nigeria, is the first production to pair Disney Animation with an outside studio, African animation company Kugali. CG animation is provided by Cinesite. (Image courtesy of Disney+)
What does the road ahead look like for the VFX and animation industries? The subjects that are most top of mind for those at the nexus of VFX and animation include: real-time, virtual production, LED volumes, AI, machine learning, AR, the Cloud, hybrid working, tangible effects of the pandemic, global expansion and the search for talent. With VFX-infused movies now reaching the world at bullet-train speed via streaming and related platforms, a global cross-section of industry leaders meets in this VFX Voice virtual roundtable to discuss the outlook for the new year.
Paul Salvini, Global Chief Technology Officer, DNEG
When I think of global trends happening in the VFX community, the one that excites me most at DNEG is how real-time technology is establishing itself within our film pipeline. With the determination to provide our artists with the best possible content creation tools, DNEG’s UX (User Experience) and R&D teams have been working closely with our artists to find innovative and better ways of working by leveraging the power and immediate feedback of real-time technologies.
Over the last year, we completed several successful projects using our new hybrid real-time pipeline: an animated short film (“Mr. Spam Gets a New Hat”), final pixel environments for a major feature film (The Matrix Resurrections), and digital backgrounds for various virtual production projects. Thanks to the immediate feedback that real-time technologies provide, artists have more time to iterate and explore creative possibilities. The results speak for themselves. The quality of real-time output today is impressive.
Johnny Fisk, President, FuseFX
This new renaissance of entertainment touches every one of us. We’ve seen our industry explode in all directions as the use of VFX continues to proliferate throughout all aspects of the market. High-end content is now being produced for all platforms in media. With the growing work, we, too, are stepping into the next generation of VFX. We’ve only just begun scratching the surface of how emerging tools and techniques can be utilized to tell bigger and more engaging stories moving forward. As artists, we’re forging new territory, such as utilizing real-time software and deep learning technology in our imagery and workflows. Bringing innovation to the table is bringing a renewed energy to all of our work. I don’t think there’s a more exciting time to be working in VFX than right now.
“The most recent example of a game-changer for us was the way our artists leveraged our latest AI face-swapping tools as an element to create the youthful Skywalker in The Book of Boba Fett. Combining the best of our digital facial animation techniques with the latest in machine learning really lets us achieve a combination of likeness and detail that wouldn’t have been possible just a few years ago.”
—Rob Bredow, Senior Vice President & Chief Creative Officer, ILM
Mission: Impossible – Dead Reckoning Part One is the seventh entry in the series. Visual effects and animation are supplied by ILM London.
Shazam! Fury of the Gods follows in the wake of 2019’s Shazam! SFX vendors include DNEG, OPSIS, RISE Visual Effects Studios, Scanline VFX and Weta Digital. (Image courtesy of Warner Bros. Pictures)
Christopher Edwards, Founder & CEO, The Third Floor
Regarding location-based entertainment, experiential entertainment and the real promise of the Metaverse… At The Third Floor we have always been storytellers at heart. We love helping visionaries take audiences on visceral, emotional journeys, but this isn’t strictly limited to linear media. Modern audiences are increasingly obsessed with quality interactive and immersive experiences that can take their sense of engagement to the next level.
For over 18 years, The Third Floor team has crafted cinematic moments for AAA video games and world-class theme parks, including Super Nintendo World at Universal Studios Japan. The pandemic sequestered so many people for so long that there has been a societal shift towards appreciating communal events. Whether this is a physical gathering, such as a live concert or a trip to an amusement park, or a virtual gathering in an MMO (Massive Multiplayer Online) game, there is nothing quite as satisfying as a shared experience with friends and family. So, content creators are beginning to adapt and expand their IPs to formats that complement traditional media formats and encourage social engagement and viral marketing.
Tram Le-Jones, Vice President of Solutions, ftrack
The pandemic has encouraged a lot of new thinking, which is very exciting for us as an industry. With the pivot to working from home, we’ve realized that making changes wasn’t as hard as we thought. The pandemic disrupted our norms, made us realize what’s really important in our lives, and forced us to do things personally and professionally we hadn’t done before. We’re making new connections that have opened us to adjacent and nonadjacent industries. It isn’t an entirely new concept, but the pandemic has accelerated it. Not only are we learning from others, but also they are learning from us. We’re working more collaboratively and finding that we all have much more in common than we originally thought. We’re going to see a lot more from this intersection.
“Things that would have taken months of cooking in the animation process can now take seconds. We can switch style sheets instantly, and a machine can re-code entire projects instantly as opposed to hours of manual, frame-by-frame labor. AI is an amazing way to start a conversation or to drive inspiration.”
—Andrew Melchior, Executive Vice President/Global Director of Brand Experience, The Mill
Mario (voiced by Chris Pratt) travels through an underground labyrinth in The Super Mario Bros. Movie. Illumination Studios Paris provides character animation and computer graphics. (Image courtesy of Universal Pictures)
The Little Mermaid goes live-action. VFX/SFX vendors include Framestore, ILM, DNEG, Rodeo FX, MPC, Lifecast and Clear Angle Studios. (Image courtesy of Walt Disney Studios)
Kim Libreri, Chief Technology Officer, Epic Games
Real-time technology will continue to have significant impacts on filmmaking and entertainment. As we’ve seen with the explosion of virtual production and in-camera visual effects in particular, VFX crews are becoming more and more of an integral part of the primary on-set filmmaking process. VFX artists are now joining the ranks of cinematographers, production designers, costume designers and other roles that shape production from its earliest stages. As the VFX process becomes more immediate and tactile to key creatives, artists are collaborating and iterating more with other departments. This dynamic reduces miscommunication and repetition, as creative decisions can be made interactively while in production.
Furthermore, emerging tools and workflows are starting to make transmedia production a reality. With Unreal Engine, for example, you only need to create your content once, and then you can easily deploy it across film, games, immersive experiences and other forms of art and multimedia. Real-time production is making it easier than ever to completely rethink how and where your IP can be consumed. As filmmakers become more comfortable with this new reality, adapting game content for film, and vice versa, will become the norm.
Michael Ford, Chief Technology Officer, Sony Pictures Imageworks
I’m incredibly excited about the continued industry adoption and participation in open source software initiatives. With the leadership and structure provided by the Academy Software Foundation (ASWF), the VFX and animation industry is making great strides to work together as a community to build software, libraries and processes that benefit us all. At Imageworks, we like to say that open source is the “engine of innovation” that allows us to leverage not just our talents, but also the talents of an entire industry. Open source also allows us to reach a more diverse group of people that might otherwise not have the opportunity to work in our industry, and we need this more than ever in order to build and strengthen our global workforce.
With the expanding use of game engines and faster compute via GPUs and distributed CPU rendering, the industry is moving towards a real-time future where creative decision-making is being made at a much higher rate. ICVFX (in-camera visual effects), animation and VFX workflows are all being influenced by these enabling technologies. I think the next few years are really going to change the way we think about computer graphics, especially when we look to the future of generating new and innovative looks via AI and machine learning.
Mathieu Boucher, Vice President of Operations, Hybride/Ubisoft
In the last few years, remote work has proved to be very efficient for many studios in the VFX industry, including Hybride. It opened the door to new possibilities and access to a broader and more diverse talent pool. For instance, since 2020, Hybride has managed to significantly grow its workforce with team members working from all over Quebec. I am interested to see how the VFX industry will continue to adapt to this new reality.
In 2023, we will start to see a democratization of the virtual production pipeline. The technical complexity and extensive pre-production are becoming easier to manage while industry expertise and know-how are increasing immensely. This will open new creative opportunities for a wider spectrum of productions. AI is increasingly part of our processes, but I think it will also lead to impactful technological evolutions.
Frank Montero, Managing Director/President, ROE Visual US, Inc.
The exploration of virtual production borrows from the sentiment ‘With great risk often comes great reward.’ While advancements are continuously taking place, a certain amount of ambiguity is naturally associated with this budding technology. In truth, the concept of virtual production in film is far from revolutionary; however, the ways in which it has expanded to include digital artwork and LED displays are. Today, VP techniques facilitate production in a myriad of departments throughout studios worldwide. Most prominently, the use of LED displays for backgrounds on set simplifies the work for VFX teams in post-production while increasing real-time engagement for the cast and crew. The dynamic nature of the digital content ensures production can move forward in the desired direction while on-set modifications to the LED canvas can take place at any point in the process.
Adrianna ‘AJ’ Cohen, Senior Vice President/Global Head of Production, Mikros Animation
The technology in animation is improving at an incredible rate. Creating uniquely beautiful animation is getting easier and more accessible for animators. This allows previously unattainable ideas to come to life (e.g., mocap, animation/live-action hybrids, 2D and 3D content, etc.), and studios can take more chances on ideas they couldn’t previously afford to.
In addition to the advancement in technology, the movement to streaming providers has created a demand never seen before. Everything is changing. The need for resources will drive opportunities for every artist across the globe, which is an excellent opportunity for Mikros Animation and everyone working in the field. The challenge as a studio, however, will be to figure out how to attract and train talent, and make them part of our family.
Wayne Brinton, Business Development Director, Rodeo FX
The consumers’ high standards are not just a question of visuals. It’s about getting the same emotional experience you had when you consumed the content on your screen the first time – no matter the platform or the format. The sheer amount of content consumption in the past few decades has created expectations of fidelity in visual effects. When they/we don’t get that in an experience (ads, movies or even Snapchat filters), the experience becomes less than what it “should have been.” Like trying to redo the dragons from Game of Thrones in a Snapchat filter. Of course, users are going to be disappointed.
Consumers are expecting a very high standard in terms of imagery and visuals, yes, but that’s not really what the expectation is rooted in – they expect great storytelling.
“The benefits of virtual production are driving the growth, with the number of LED volumes likely to double in the next three to four years. While the need for post-production will shrink, digital pre-production work will increase in order to optimize the on-set shoot. Novel uses for LED volumes beyond virtual production will also help drive the growth of LED volumes.”
—Kim Davidson, President & CEO, SideFX Software
The Oscar-winning Spider-Verse saga opens a new chapter with Spider-Man: Across the Spider-Verse, Part One. Sony Pictures Imageworks provides imagery and animation. (Image courtesy of Columbia Pictures and Marvel Entertainment)
John Wick: Chapter 4 is the fourth installment of the one-man-takes-on-the-entire-underworld series. Primary VFX vendor is The Yard VFX, with contributions from One of Us and Mavericks VFX. (Image courtesy of Lionsgate)
“Due to the increased demand for animated series, traditional animation will have to investigate leaner techniques, such as bypassing the 2D process altogether and jumping straight into 3D, using game engine technology to be able to scale creative output.”
—Mariana Acuña Acosta, Senior Vice President, Global Virtual Production, Technicolor
Dungeons & Dragons: Honor Among Thieves deploys the latest VFX to honor the franchise. ILM, MPC, Clear Angle Studios and Legacy Effects contribute special effects. (Image courtesy of Hasbro and Paramount Studios)
Trek across post-pandemic America with Joel and Ellie in the HBO Max TV series The Last of Us. Primary VFX/SFX vendors are DNEG and RISE Visual Effects Studios. (Image courtesy of HBO Max)
Dennis Kleyn, NVX, Founder/CEO/VFX Creative Director, Planet X
Virtual production is gaining noticeable traction in the Netherlands as well as in this part of Europe in general. Our Dutch film industry is relatively small and not on the most progressive/innovative side, so it feels a bit like most producers have just caught up with considering ‘traditional VFX’ as a creative department within the filmmaking process (rather than a problem-fixing one), and now an even newer technique is on the doorstep. Planet X has been involved in the founding of the first VP/ICVFX studio in the Netherlands: ReadySet Studios.
Rob Bredow, Senior Vice President & Chief Creative Officer, ILM
We’re very fortunate at ILM to get to work with world-class innovative filmmakers and showrunners who push us to invent new techniques on nearly every new show. Just yesterday, I was on set on one of our shows with a talented director of photography who was inventing new workflows on that day, and seeing our StageCraft team respond with just the right artistic and technical solutions just in time to shoot. It was inspiring.
AI and machine learning techniques are transforming the way software is written – and the way our artists interact with our tools. The most recent example of a game-changer for us was the way our artists leveraged our latest AI face-swapping tools as an element to create the youthful Skywalker in The Book of Boba Fett. Combining the best of our digital facial animation techniques with the latest in machine learning really lets us achieve a combination of likeness and detail that wouldn’t have been possible just a few years ago.
Andrew Melchior, Executive Vice President/Global Director of Brand Experience, The Mill
AI is such a sophisticated and wild beast. There is always the question of whether the thing you’ve created will stay within the black boxes, and there is a lot of concern about that. One thing is for sure: AI certainly creates a stir and an interest.
Regarding AR, VR and AI and their relationship to the evolution of VFX/animation, from a visual effects point of view, we have been having lots of conversations. Things that would have taken months of cooking in the animation process can now take seconds. We can switch style sheets instantly, and a machine can re-code entire projects instantly as opposed to hours of manual, frame-by-frame labor. AI is an amazing way to start a conversation or to drive inspiration.
We can address and scale down huge 3D geometries and make them real-time assets that can run on local devices. With NeRF [Neural Radiance Fields] – instead of taking very detailed geometry and point clouds – you can now take 2D photos, and machine learning can take them and build 3D models on the fly. This means you can easily create characters with true-to-life shadows and textures that would have taken ages before. The price of entry used to be the restricting factor, but now we can access these technologies in the browser. It has completely democratized the process, which will change the game and open accessibility to everyone. However, there will always be a market for hand-made content. It is still obvious when humans make something versus a machine. When we automate everything, it does run the risk of all looking the same or similar. Hand-made content will still be considered ‘magical’ and ‘special’ because of its uniqueness and visceral qualities. It will be a smaller industry, but it will always exist.
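The photos-to-3D trick Melchior describes rests on the standard volume-rendering sum used in NeRF-style methods: densities predicted along each camera ray are converted into compositing weights, and the weighted colors yield the pixel. A textbook-style numpy sketch of that sum (not any vendor’s implementation):

```python
# The core of NeRF-style rendering, shown as the standard volume-rendering sum:
# densities sampled along a camera ray become compositing weights, and the
# weighted colors give the pixel. Sample values below are arbitrary.
import numpy as np

def render_ray(densities, colors, deltas):
    """densities: (n,) sigma at each sample; colors: (n, 3); deltas: (n,) spacing."""
    alpha = 1.0 - np.exp(-densities * deltas)                       # per-segment opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha]))[:-1]   # light surviving so far
    weights = trans * alpha
    return (weights[:, None] * colors).sum(axis=0)                  # final pixel color

sigma = np.array([0.0, 0.5, 3.0])
rgb = np.array([[0, 0, 0], [0.2, 0.4, 0.9], [1, 1, 1]], dtype=float)
pixel = render_ray(sigma, rgb, deltas=np.full(3, 0.1))
```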
Kim Davidson, President and CEO, SideFX Software
Emerging and evolving technologies, such as AI/ML or AR/VR, will not replace current solutions. Rather, they will continue to complement and improve current approaches. In digital modeling, for example, VR and ML are nice complements to procedural and interactive modeling techniques. By combining these technologies, modeling in the future will be more versatile, interactive and intuitive. At SideFX, we look to incorporate ML into future releases of Houdini as a complement to procedural workflows in modeling, environment generation, layout, animation, character effects and lighting.
Virtual production is rapidly changing current production pipelines. The benefits of virtual production are driving the growth, with the number of LED volumes likely to double in the next three to four years. While the need for post-production will shrink, digital pre-production work will increase in order to optimize the on-set shoot. Novel uses for LED volumes beyond virtual production will also help drive the growth of LED volumes. Who needs the “Metaverse” when you can spend time with friends inside a giant LED half-dome? The shortage of talent, particularly technical, is the biggest issue our industry is currently facing, and it is unlikely to subside over the next three to five years.
“Bringing innovation to the table is bringing a renewed energy to all of our work, and I don’t think there’s a more exciting time to be working in VFX than right now.”
—Johnny Fisk, President, FuseFX
A toy company robotics engineer builds a life-like, almost-living doll in M3GAN. Concept design and specialty props are provided by Wētā Workshop. (Image courtesy of Universal Pictures)
The excesses of early Hollywood are on full display in Babylon. ILM handles VFX and animation. (Image courtesy of Paramount Pictures)
Danny Turner, Executive Producer, Yannix Co., Ltd.
The industry has recovered in a big way and, in turn, Yannix enjoyed unprecedented demand for our services in 2021-22, particularly our Character Rotomation (RotoAnim) service for which we’ve seen exponential increases in client demand. There are no signs of things slowing down any time soon.
Throughout the global health crisis, Yannix remained open for business without any interruption by implementing a “seven-days-a-week” work strategy. By splitting our teams into two separate shifts, Yannix complied with social-distancing guidelines and, most importantly, we kept our people safe and healthy. Through it all, we never had a need to implement a “work from home” strategy. As the industry and consumers adjusted to the “new normal,” we focused on keeping the lines of communication open with our clients and strived to remain ready.
Mathieu Raynault, Founder & CEO, Raynault VFX
With the need for VFX productions at an all-time high, the landscape of the cinema and television industries is changing at an unprecedented pace. VFX companies have to reinvent and streamline their pipelines and technologies to account for both remote work and labor shortages. All this movement creates engaging challenges, thrilling opportunities and, without a doubt, uncertainty. We believe that keeping the human aspect of our business at the center of the VFX conversation is key to surfing this wave in the future. Raynault’s bet is to create a model where artists have greater ownership over their work, shots and assets. Our team thrives on overcoming their most complex tasks while maintaining a healthy work environment and VFX/life balance. This concept may sound cliché, but it’s actually at the core of our philosophy now and for the many years to come.
“The pressure to deliver large amounts of high-quality shots around tight deadlines [for high-end episodic productions] has set recruitment teams on fire – with no signs of slowing down anytime soon.”
—Gaurav Gupta, Managing CEO, FutureWorks Media Limited
Supernatural chiller The Pope’s Exorcist is the portrayal of a real-life Vatican exorcist. (Image courtesy of Sony/Screen Gems)
Vecna returns to terrify in the fifth and final season of Stranger Things. Vecna’s CG-animated hand was crafted by DNEG. VFX vendors include Alchemy 24, AB VFX Studios, BOT VFX, DNEG, Lola Visual Effects, Rodeo FX and Spin VFX. (Image courtesy of DNEG and Netflix)
Steve Read, Head of Studio and Executive Producer, Versatile Media Company Ltd.
Creative drives technology, and it all begins with a great story. The purpose of virtual production is to align both under one clear vision: to see results in real-time, in-camera and at the hands of key creatives.
Lensing shots on LED volumes allows directors and DPs to obtain immediate results and keep full control of both the practical and digital elements. This level of collaboration naturally brings the principal photography and VFX components together under the control of one creative drive. It promotes a transparent workflow in real-time to capture the best results possible. Our industry has evolved. Audiences today have a tremendous appetite for more quality content. The distribution and platforms are also quickly evolving.
Patrick Davenport, President, Ghost VFX
Talent remains our key priority. There’s always a shortage of artists, but this is exacerbated in the current climate. So, the focus has to be on employee retention, not just recruiting, but paying fairly (including overtime) and providing a supportive work environment and culture. The industry is still in a dizzying state of flux, with so many companies for sale, being acquired or merged. We would like to get on with the work and enjoy a stable environment, especially with everything going on in the world.
Nearly three years since the start of the pandemic, it feels like the shift to full-time WFH (Work from Home) or hybrid has become permanently embedded in our industry, which means studios have to optimize the employees’ remote work experience through technology and enhanced people support to maintain creative collaboration and productivity.
Mariana Acuña Acosta, Senior Vice President, Global Virtual Production, Technicolor
Traditional VFX pipelines will continue to evolve, moving to the Cloud, enabled by machine learning for automated data wrangling. Automation will continue to be a key piece of the VFX puzzle as it relates to performance transfers, keying, rotoscoping, de-noising, data clean-up, etc. Due to the increased demand for animated series, traditional animation will have to investigate leaner techniques, such as bypassing the 2D process altogether and jumping straight into 3D, using game engine technology to be able to scale creative output.
Given the challenges of on-set virtual production, standardization is key, which is why there’s a movement towards SMPTE 2110 (this is the equivalent of when physical tapes moved to digital files for content storage). Virtual production will continue to grow in other areas than just film and episodic. VP will see wider adoption in animation and advertising.
Hitesh Shah, Founder and CEO, BOT VFX
VFX embraces the gig economy – more seriously this time. Three years ago, most facilities (with rare exceptions) could not conceive of artists doing their work from remote locations because of the technical constraints of the infrastructure itself, let alone other factors. Then the pandemic forced a reluctant embrace of PCoIP (PC over IP) technology out of sheer desperate necessity. Soon, this tool of necessity became the transformative tool for facilities to free themselves of geographic constraints in hiring artist talent.
Pipelines and infrastructure built in one city could now leverage artist talent in far-off places without much incremental cost, setup time or process changes. Concurrent with this new enabling technology were two other factors: the surge in VFX service demand that exceeded the readily available industry capacity, and the wider social movement normalizing work-from-home. Facilities have embraced these changes by changing their operating model. They have begun recruiting remote talent to augment their base office teams.
The Quantum Realm and its strange creatures challenge Ant-Man in Ant-Man and the Wasp: Quantumania. VFX vendors include Digital Domain, ILM, Method Studios and Sony Pictures Imageworks. (Image courtesy of Marvel Studios and Walt Disney Studios)
Christophe Casenave, Head of Category Management and Sales Cinema Products, Carl Zeiss AG
Film productions in general, and VFX productions in particular, are striving for efficiency, driven by the high demand for streaming content. The well-established VFX workflows are being updated with new technologies like virtual production, which allow for the final production of pixels on set and offer new levels of flexibility. One of the major challenges these teams are encountering is the matching of the look of the CGI with the look of the physical lens, especially when the glass shows very strong characteristics, as seen with popular vintage lenses. Matching the look is mainly achieved with a lot of manual work in post-production and relies on guesswork to reproduce lens characteristics, which makes it difficult to scale and to use with real-time tools. Optimizing virtual production processes and making the images produced even more cinematic will be the major challenge in the near future.
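To make concrete what “matching the look” of the glass involves: lens character is often approximated with a parametric distortion model such as Brown-Conrady, whose measured coefficients can then be applied to a clean CG render. Below is a minimal sketch of that idea using OpenCV; the coefficient values are placeholders for illustration, not measurements of any real lens.

```python
# Illustrative only: warp a clean CG render with Brown-Conrady radial
# distortion so it better matches footage shot through a characterful lens.
# The k1/k2 values used below are placeholders, not real lens measurements.
import cv2
import numpy as np

def distort_like_lens(cg_frame: np.ndarray, k1: float, k2: float) -> np.ndarray:
    h, w = cg_frame.shape[:2]
    # Idealized pinhole intrinsics: principal point at center, focal ~ width.
    K = np.array([[w, 0, w / 2],
                  [0, w, h / 2],
                  [0, 0, 1]], dtype=np.float64)
    dist = np.array([k1, k2, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

    # For every output (distorted) pixel, find which undistorted source pixel
    # the virtual lens would bend onto it, then resample the CG render there.
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    pts = np.stack([xs.ravel(), ys.ravel()], axis=-1).reshape(-1, 1, 2)
    src = cv2.undistortPoints(pts, K, dist, P=K).reshape(h, w, 2)
    map_x = np.ascontiguousarray(src[..., 0])
    map_y = np.ascontiguousarray(src[..., 1])
    return cv2.remap(cg_frame, map_x, map_y, cv2.INTER_LINEAR)

# Example: a gentle barrel distortion applied to a synthetic gray frame.
frame = np.full((540, 960, 3), 128, np.uint8)
warped = distort_like_lens(frame, k1=-0.12, k2=0.02)
```

A single radial profile is, of course, exactly the kind of simplification Casenave is describing: vintage glass also brings field-varying breathing, flaring and falloff, which is why matching it today still involves so much manual work.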
Markus Manninen, Managing Director, Goodbye Kansas Studios
At Goodbye Kansas Studios, we see a continued high demand from clients to get visual effects work done during 2023. In particular, the episodic segment is continuing the trend of setting higher expectations of visual complexity and quality, with the effect being that clients are reaching out earlier to secure resources that are able to accomplish complex shots and scenes. We expect to see more strategic relationships between vendors and clients as a result.
Open-source core tools and capabilities will become a much more integral part of next-generation workflows and processes in 2023. Virtual production is clearly here to stay, even as on-location work continues to grow during 2023.
David Patton, CEO, Jellyfish Pictures
As we navigate the expanding VFX landscape, 2023 will see us continue to build new global collaborative workflows – not only to break down the geographical barriers when it comes to sourcing talent, but also to allow us to build more speed, scale, agility and sustainability in delivering new projects.
With this in mind, Jellyfish Pictures has been harnessing the power of Cloud technology, working closely with providers such as Microsoft Azure, Hammerspace and HP to optimize our internal pipelines and boost productivity. Modern Cloud-based solutions empower our artists to create the same standard of work as they would within a studio environment, no matter where they are in the world. Meanwhile, the rapid rise of virtual production has taken the industry by storm.
Gaurav Gupta, CEO, FutureWorks Media Limited
This year, we’ll be back to ‘business as usual,’ though the world we live in is not the same. The deep transformation our lives and our industry have been through will continue to permeate throughout 2023.
Thanks to the new ways in which consumers access content, there’s never been so much demand for our services. I’m talking not only about traditional feature films, but also high-end episodic productions. The pressure to deliver large amounts of high-quality shots around tight deadlines has set recruitment teams on fire – with no signs of slowing down anytime soon.
By TREVOR HOGG
Animator Peggy Arel repositions the Geppetto puppet on the doctor’s office set.
Images courtesy of Netflix.
When Italian writer Carlo Collodi published the novel The Adventures of Pinocchio in 1883, about a wooden marionette who desires to become a real boy, the movie industry did not exist. The story has since taken on a life of its own on the big screen, most famously in the 1940 Disney animated classic, but that did not deter filmmaker Guillermo del Toro (The Shape of Water) from putting forth his own version of the fairy tale utilizing stop-motion animation, partnering with co-director Mark Gustafson (The PJs) and Netflix. “The biggest challenge is that it took almost two decades to get this made,” del Toro notes. “It was a completely new approach to the material that made the pilgrimage through the world of financing and logistics difficult.”
In Guillermo del Toro’s Pinocchio, the title character does not transform into flesh and blood. “That was clear to me from the start,” del Toro explains. “In a way, it’s about subverting and finding new meaning on the themes of Pinocchio. One is that disobedience is actually the beginning of thought and conscience, which goes against the idea that a good boy is a boy who obeys. The second one is the idea that you don’t have to transform yourself to be loved. You can be loved for exactly who you are.”
Character traits influenced the design of the puppets. “There are practical considerations because they do physically exist,” Gustafson notes. “Some of those limitations can play to your advantage. Pinocchio is a perfect character to do in stop-motion because he is a puppet. There is something simpler about Pinocchio that makes the audience lean in and connect with him. His face isn’t all over the place. We wanted him to be made of wood, and that brings a power once you figure out this language.”
Handling the production design duties were Guy Davis and Curt Enderle. “One of my favorite characters I got to design with Guillermo was Death,” Davis reveals. “Death was fun because she was mythic like the Wood Sprite, and we designed both of them as sisters. Death and the Wood Sprite are mythology come to life. They’re their own thing. We originally thought of Death being portrayed as a Chimera mythical beast, and then she was more sphinx-like as far as the body, not as the culture. We started with the idea of the Greek mask for her face. It gave us a lot of freedom to come up with our own mythology. The Wood Sprite went through a couple of changes, too. Guillermo had a definite idea in mind as far as angels with the eyes on the wings. Death went through a lot of iterations getting her to where she was ready to be a puppet, and the same with the Wood Sprite. Even Pinocchio, as we first had him, was based on the original Gris Grimly design, but then there are other things, like Black Rabbits, that clicked from the first design pass and carried over from 2012 with not any changes to the concept art.”
Surrealism prevailed with the skies. “We went through the show and asked, ‘How many unique skies do we need?’” Art Director Robert DeSue remarks. “There is a journey montage, night and day requirements, and considerations for mood to help reinforce happier times versus somber times. It ended up being in the neighborhood of 38, maybe 40 unique skies. We decided to do a keyframe painting for each one of those skies, and we made these sheets: ‘Here is the storyboard and the shot this applies to,’ so we had the composition information. The director of photography, production designer and myself went through to decide upon the type of sky, like cloud forms and color.” Del Toro did a course correction. “He said, ‘I want you to look at the skies by Grant Wood, Georgia O’Keeffe and the Group of Seven,’” DeSue describes. “‘The Italian ones, I like the colors and atmosphere, but that style is too realistic.’ We had keyframes, two images and a painting. That helped the skies get a nice runway. The first time Guillermo would see a sky, he might make a color correction. But in terms of style, fluffy clouds should look like that cotton batting that you use in pillows because they have a level of artificiality that looks handmade.”
TOP TO BOTTOM: A color concept of Death and the realm of Limbo, practical Death sculpture, and Pinocchio in Limbo where he goes upon dying and is subsequently revived by Death.
Pinocchio (voiced by Gregory Mann) becomes a performer at a carnival run by Volpe (voiced by Christoph Waltz), which plays upon the idea of Pinocchio being an actual puppet.
A memo was circulated consisting of eight rules of animation. “Mark and I discussed, ‘What are we going to do differently?’” del Toro recalls. “We said, ‘We’re going to try to give a depth to the acting style of this puppet that makes them become human.’ The goal is this: 15 to 20 minutes into the movie, you would forget that they are puppets. You’re just watching actors act and humans think and feel. We always emphasized with the animators to do micro-gestures, things that are brief and changing, because most of the animation is key poses and pantomime. It’s characters looking defiant, skeptical and happy. It’s all about emojis! Hayao Miyazaki said, ‘If you animate the ordinary, it will be extraordinary.’ It’s about stopping the plot and allowing life to enter the film.” Animators were cast. “We found that some of them were much better at doing characters or they had a real affinity for it,” Gustafson states. “That was useful to get some sort of consistency out of them, too. We tried to keep them in scenes as much as possible, as a sense of ownership is important. They can come away from this film going, ‘That sequence is mine.’ That feels really good.”
Printed faces for the puppets were something that Animation Supervisor Brian Leif Hansen wanted to avoid. “When you are working with the printed face, you’ve got a limited stack of money and a limited stack of faces, therefore your facial expressions would probably be on the stiffer side of things, with the budget that we had,” Hansen notes. “A silicone face you can move around all of the time. All of our main characters had the mechanical silicone face, but Pinocchio has a printed face, which is a stroke of genius because it keeps him in the hard world.”
ABOVE THREE: A color study of the village, set build and the final frame.
Various puppets were built for Sebastian J. Cricket to get the proper size and scale for each shot.
TOP AND BOTTOM: Concept art for Volpe’s wagon and the practical prop.
Sebastian J. Cricket getting captured in glass by Pinocchio was accomplished practically. “It’s so wild,” Hansen describes. “There are seven different composite layers in it, because the paper is big and because Cricket needs to walk on it. Pinocchio draws a sun on the paper. Pinocchio’s hand size was shot in a different plate. The drawing itself was a different plate. The glass is a different plate as well. Cricket needed to be animated inside the glass. We couldn’t have the glass there. We animated Cricket first and animated the glass afterwards. Cricket is pounding on the glass. It works brilliantly. It was fun to put the shot together without [anything] other than the technology of stop-motion.”
Visual effects were mainly utilized for atmospherics. “It’s easier to do some of the smoke, skies, fire embers, and even then, we did a lot of it with physical embers, miniatures, and silk to recreate a physical haze in the forest,” del Toro states. “When it’s a landscape made of water, that’s going to be rendered faster in CG. Then the trick is to art direct it not like a real piece of water. You have to make it artificial in order to match the world.”
There were 1,374 visual effects shots, created by MR. X and an in-house team, with additional support provided by BOT VFX. “The benefit of stop-motion is the fact that there is not another take,” observes Visual Effects Producer Jeffrey Schaper. “You basically have the shot and have passes for it. We all used ShotGrid as our shot-tracking software, from the stages all the way through post, editorial and effects. We would generally speak with our first AD scheduler and make sure that everything was approved. As soon as it was approved, we would pile the shots and try to turn it over. At first, it was once every month, but then it became every two weeks to every week, to try to get shots out the door as quickly as we could. The good thing is that you have a shot that is turned over to the full length, and we work with 10 frame handles and let editorial decide what they’re going to use of those handles.”
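For readers unfamiliar with the turnover convention Schaper mentions, “10 frame handles” means each delivered shot is padded by 10 extra frames on either side of the editorial cut, so the edit can slip later without a redelivery. A minimal sketch of the arithmetic, with hypothetical function and variable names rather than anything from the production’s actual ShotGrid setup:

```python
# Illustrative only: the "handles" arithmetic behind a shot turnover.
# Names here are hypothetical, not from the show's ShotGrid schema.
def turnover_range(cut_in: int, cut_out: int, handles: int = 10) -> tuple[int, int]:
    """Frame range a vendor delivers for a given editorial cut range."""
    return cut_in - handles, cut_out + handles

# Editorial cut is frames 1001-1048; with 10-frame handles the vendor
# delivers 991-1058, and editorial later decides how much of that to use.
work_in, work_out = turnover_range(1001, 1048)
print(work_in, work_out)  # 991 1058
```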
Geppetto and Pinocchio trapped inside of the dogfish, which was a digital environment.
Two major visual effects environments were the interior of the dogfish and the realm of Limbo inhabited by Death. “There was a practical boat and half of a lighthouse,” Visual Effects Supervisor Aaron Weintraub reveals. “Those were the pieces of the set that the characters would interact with, and everything else was always planned to be digital inside of a dogfish. The idea was to create a feeling of this dank, wet cavern. The dogfish swallows them, and they do this water-slide trip down the throat and come out into the inside and fall down these various levels, trudge through the goo, find the ship and make their camp there.” The inside of the aquatic creature had to feel tactile. “We had little organic, fungal growths and things like that scattered throughout,” Weintraub adds. “There was a hanging mist in there too, as well as streams of standing water and weird temperature changes. Because it’s organic, like skin, when the light hits it, it had to feel like there is a subsurface scattering. It was always a question of how thick the skin is to the outside world, so there is a red glow from the sun that is coming from the outside that breathes through some of the diffused ambient light in there when the lighthouse isn’t shining directly on something.”
Originally, Limbo was not intended to be CG. “They were going to build these shelves that were filled with hundreds of hourglasses made out of laser-cut acrylic, but then COVID-19 hit and there was a massive acrylic shortage because it became sneeze guards in banks and post offices,” Weintraub explains. “We were always doing the sky dome in there, which is like planetarium-type space.”
TOP AND BOTTOM: A lightning shape study and the composited digital effect.
“[Co-director] Mark [Gustafson] and I discussed, ‘What are we going to do differently?’ We said, ‘We’re going to try to give a depth to the acting style of this puppet that makes them become human.’ The goal is this: 15 to 20 minutes into the movie, you would forget that they are puppets. You’re just watching actors act and humans think and feel.”
—Guillermo del Toro
TOP TWO: Lead Animator Jan-Erik Maas works with Pinocchio on the Birch Woods set, and the final result.
A greenscreen test of Geppetto’s boathouse set.
Many of the sets made use of motion-control cameras, such as this scene between Volpe and Spazzatura.
Early tests were conducted to get the look of the flowing sand. “We figured out the right speed and frame rate,” Weintraub recounts. “What they had on set was the wood frame of the hourglass, and we would do the glass insert, composite it inside, and render all of the reflections of the environment and characters moving around. We had a CG version of Pinocchio for all of the collisions and reflections.” The Death puppet had ping pong balls placed where the eyes were supposed to go. “We replaced each of those with animated eyeballs,” Weintraub says, “so we had to rotomate the wings so that our models would line up exactly and the eyeballs would track in, do the animation, and match the performance for pupils following [the action] and blinking at the right moment.”
One character is fully digital. “Before the Wood Sprite becomes a form, she has small eyes, and they essentially make up her wings,” reveals On-set and In-house Visual Effects Supervisor Cameron Carson. “The eyes float through a couple of scenes and interact with things. We wanted them to feel as close to our stop-motion puppets as we could. We built a couple of practical eyes out of polyurethane and did a couple of different things, which we then scanned and sent over to MR. X to try to replicate that, as well as add the ethereal trail that follows behind them. There was a lot of back and forth with that in terms of the look and how that is supposed to feel in the movie. That was probably one of our biggest ones in terms of designing because it was a little more of an unknown.” The fully-formed Wood Sprite is luminescent. “She has practical lights behind her eyes, which is helping to cause that glow,” Carson states. “In most of her shots, the Wood Sprite is captured on greenscreen, and that enabled us to separate her out and give some digital glow and atmospherics to her so it feels like she is ethereal and moving through the space.”
As many as 56 sets were shot simultaneously, though not all of them had motion-control cameras. “When our camera is six inches away from the set, the smallest motion reads massively onstage,” Carson remarks. “We will shoot for the day, come back in the morning and the camera would have moved. Just the slightest atmospheric change of temperature, or the lights coming on and warming up the set, will actually swell or shrink the wood, and it creates micro-stutters in our tracks. Almost every single shot in our production had to be stabilized, and we have to worry about light flicker.” Dust is a big problem. “It’s small particles on the set that, when an animator touches something on the puppet, all of it chatters or moves around. You see that in Fantastic Mr. Fox: characters with fur chatter. That movement comes from animators touching the puppet and moving their finger, and the pieces don’t get back to the same spot,” Carson adds. “We’re trying to strike a balance of removing things that are distracting to the viewer while leaving as much charm or stop-motion aesthetic as possible.”
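As a rough illustration of the stabilization Carson describes, the drift between each frame and a reference frame can be estimated and then shifted back out. This is a minimal, translation-only sketch using OpenCV phase correlation; a real stop-motion pipeline would track locked-off set points and handle rotation and light flicker as well.

```python
# Illustrative only: translation-only stabilization of a stop-motion plate.
# Real pipelines track locked-off points and also correct rotation/flicker;
# this sketch just shows the core register-and-shift-back idea.
import cv2
import numpy as np

def stabilize(frames: list[np.ndarray]) -> list[np.ndarray]:
    ref = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY).astype(np.float32)
    out = [frames[0]]
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        # Sub-pixel estimate of how far this frame drifted from the reference
        # (verify the sign convention against your OpenCV build).
        (dx, dy), _response = cv2.phaseCorrelate(ref, gray)
        h, w = frame.shape[:2]
        M = np.float32([[1, 0, -dx], [0, 1, -dy]])  # shift the drift back out
        out.append(cv2.warpAffine(frame, M, (w, h)))
    return out
```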
By TREVOR HOGG
Visual Effects Producer Diana Giorgiutti.
Images courtesy of Diana Giorgiutti
Being a veteran visual effects producer, Diana Giorgiutti is used to managing time zones for an industry that literally works 24/7, which means her workday begins at the ungodly hour of 4:30 a.m. while in production on the role-playing game adaptation Dungeons & Dragons: Honor Among Thieves, which stars Chris Pine, Michelle Rodriguez, Hugh Grant and Jason Wong, and is directed by John Francis Daley and Jonathan Goldstein. “That’s about as early as I can get up!” laughs Giorgiutti, who is working from her hometown of Sydney, Australia. “2 a.m. is like 9 a.m. in Los Angeles, so I’m usually coming into the day two or three hours behind them all. I’m not an early bird! Never have been. I’m a night owl.”
Art and math were the subjects that appealed to her most as a student. “Hence the art and producer combination!” Giorgiutti notes. “My parents are both Italian, and they came to Sydney in 1961 when Australia was appealing to immigrants from Europe to help build the country. A year later, I was born [followed by three sisters]. Sydney is wonderful. The industry was small here, so I knew that if I wanted to grow in visual effects and learn, I would have to go to England. The U.S. wasn’t obtainable. I have Italian citizenship, so Europe was much easier. Leaving Sydney behind was hard, and I had always planned to come back.”
Star Wars was what caused Giorgiutti to become involved with the visual effects industry. “I was 15 years old in 1977 and could only talk one of my sisters into going with me [to see Star Wars]. I walked out of the cinema saying, ‘I don’t know what it is and how they do it, but that’s what I want to do!’ The next couple of years, I was investigating it through newspapers or magazines because there was no Internet.” The first job in the film and television business for the high school graduate was as a production assistant for a tiny company called [Sydney] Panorama Productions, which did a local TV show called Variety Italian Style. “There was a five-minute cooking segment that we shot and some local adverts. Being in Newcastle at a TV station watching all of the cameras, that was my love: the tech of it all.”
Getting hired as a tape operator at VideoLab, thanks to a personal recommendation of a colleague, was the big career break for Giorgiutti. It exposed her to Grass Valley Vision Mixers, Bosch FGS-4000, Softimage 3D and a Quantel Paintbox. However, the real boom in visual effects was happening in London. “I joined an editor friend and we bought one of the first Avids in London. Running that business was not my thing and didn’t keep me very busy, so I sent a few letters out to six of the top visual effects houses. Rushes offered me a three-month freelance gig as a visual effects producer, which then became five years. I did mostly pop videos because none of the other producers liked doing them [as commercials were more profitable]. I still call London the music capital of the world. I was going to gigs all of the time. I worked on some great pop videos during my time, such as ‘Frozen’ by Madonna. We did a shot where she falls backwards and turns into hundreds of black crows.”
On set shooting the Neo/Agent Smith crater scene from The Matrix Revolutions – “a joyous wet and muddy shoot,” according to Giorgiutti.
Artists back then were generalists doing everything from previs and lighting to compositing. “It would be one artist who would take the shot all of the way through,” Giorgiutti reflects. “The visual effects producer back then had more creative involvement, but over time, when it became more departmentalized, you lose touch with that. Also, the projects got bigger, and it became more about money management and working with the vendor doing the work.” The paradigm has shifted back to the creative side for the blockbuster visual effects productions that are too big for a visual effects supervisor to handle alone. “It’s important for visual effects shows to have the supervisor and producer because they balance each other well. Whereas the supervisor is more creative and technical, the producer is creative and technical, too. We have to be creative with numbers at times, but it’s good for the supervisor to have someone else to go, ‘How do you think we’ll get this shot finalled by our director quicker?’ There are all of these strategies and plans we have to discuss and come up with. If it was somebody doing that on their own, it wouldn’t be as successful.”
During The Matrix Revolutions (2003) and The Matrix Reloaded (2003), shooting with the ‘yak’ rig, so named “because the stunt guys were barely able to keep their breakfast down,” Giorgiutti recalls.
Two significant films that Giorgiutti worked on early in her career won Oscars for Best Visual Effects, Babe and The Matrix. “[Writer/Producer] George Miller always intended to shoot with real animals and some animatronics, but also knew that he needed visual effects, so he waited for the technology. At that time, I was at VideoLab as a telecine colorist, which is now what a DI person is but much more analog, so no DaVinci Resolve. A film roll would log in, I would put it up and if I thought it looked good, I would call George and get him to have a look. Every now and then, he would ring to check if particular companies had sent something and ask me what I thought. One day, this box came from Rhythm & Hues [Studios]. I finished whatever commercial I was doing, put it up and went, ‘Ahhhhhh.’ I called George right away and he came zooming over. I was in the room when he called them to say, ‘This is mind-blowing.’ Then onward and upwards! Babe was made.”
In The Oracle’s kitchen set during filming of The Matrix Revolutions (2003) and The Matrix Reloaded (2003), taking chrome/grey ball notes with Visual Effects Supervisors Kim Libreri and Dan Glass (off to the side), 1st Assistant Director James McTeigue and DP Bill Pope, far left.
On location in New Zealand with Visual Effects Supervisor Sean Faden while shooting Mulan (2020).
This is the Destroyer on location about 30 minutes outside Santa Fe, New Mexico, during shooting for the first Thor (2011). He had to be shipped via truck in two halves.
For Giorgiutti, shooting with the Bolt camera was a fun part of the Ant-Man/Yellowjacket fight scene in the little girl’s bedroom for Ant-Man (2015).
A personal recommendation by Sally Goldberg (Computer Animation Supervisor at DFilm) led to Giorgiutti becoming involved with The Matrix, which was in a state of turmoil made worse by the visual effects producer leaving the production. “The Matrix was a whole bunch of mind-blowing things. My big job during the shooting of that film was managing the Bullet Time shots, and we had no idea what we were doing. The guys setting up the rig would go, ‘This is what we’re going to need on the day.’ I said, ‘Okay, I’ll come up with the chart.’ On the day when they took the photos, it was my job to run around to all of the 120 still cameras and record what frame they landed on, and do it quick enough because the actor would be itching to do the next take. I had all of my precious rolls of film that I had to take over to the lab to get processed. This is hundreds of thousands of dollars in the making! I like to say that I was one of the world’s first data wranglers.”
For a brief time, Giorgiutti returned to working for a visual effects company. “Luma Pictures was one of the vendors I used on every single show that I did for Marvel. The owner is a smart guy and thought having someone like myself to represent the company would help expand beyond Marvel and into other things like its own content. I was enticed over and it was great, but I was only there for a year and a half in the end because I missed the studio side. The overall management is what I love about what I do, being able to wrangle everything and everyone which you don’t do as a vendor. You just have your patch of shots, and if you’re lucky you’ll get time with the director here and there. Mostly you deal with the supervisor.” The visual effects industry has been significantly impacted by the streaming services. “The advent of Netflix has changed visual effects in another way,” Giorgiutti observes. “There is too much work and not enough people like myself with good solid experience, so a lot of green individuals are being thrown into managing these shows. I’m already talking to vendors for my potential next film, which doesn’t start shooting until May 2023.”
Key skills for a visual effects producer are mathematics, communication and counseling. “I am protective of the vendors because the studio side does not do the shots,” Giorgiutti remarks. “I’ve always felt if you look after people, they’ll look after you. If we’re going to need them at the eleventh hour to pump out the extra shots, give us a few freebies or throw in that extra effort, you make them feel like an important part of the process, which they are. Then it all fits and works nicely.” Being able to delegate responsibilities is important. “I’m not a micromanager,” she says. “I copy the coordinators on a lot of my emails so they can learn and see how things are handled. I’m always giving them lessons. If you delegate and have trust in your team, it’s wonderful. Going back to, ‘there is too much work and not enough experience,’ some of our crew on Dungeons & Dragons haven’t done much of the role before, but you get enough time in the early beginnings of post to teach, train and hopefully instill some of the better ideas and ways of doing things.”
Virtual production excels in specific situations, she says. “If you are doing The Mandalorian-type stuff that only really works if the filmmakers are prepared to essentially post the movie to a degree before they shoot, all of the environments have to be designed and not change.” Audiences are a lot savvier about visual effects. Giorgiutti adds, “If you read any of the [film fan] blogs or Reddit things, there are all these people out there giving their own opinion on why it looks so shitty. Interestingly, most of them say it’s probably because of not having enough time. How do they know this stuff? I blame the Internet. There are so many behind-the-scenes [articles] where they reveal a lot of our stuff, so these people are learning from all of that. It starts with having and shooting a solid script. D&D and even Mulan were solid scripts, and we improved both films by doing a bit of additional photography. Each of them came in on budget. A lot of movies barely have a script, or the script is too long and they go in shooting. Unless the filmmakers plan better and have better scripts, it’s going to be more of the same. Maybe films won’t be successful because of the visuals now, especially if they’re visual effects heavy.”
The Matrix Revolutions and The Matrix Reloaded crew on set at Fox stages in Sydney. A full-sized APU – Armored Personnel Unit – was deployed for fighting against the Sentinels.
With Visual Effects Supervisor Sean Faden at the top of one of the huge sand dunes in the Xinjiang desert while location scouting for Mulan.
With Marvel Studios VFX Supervisor Jake Morrison on set during Ant-Man.
Shooting on the street on a cold night in Atlanta, Georgia, for a sequence of Ant-Man flying on an ant.
On set in Santa Clarita, California, for Mulan additional photography, where Mulan’s mates fight the Shadow Warriors in a back alley as she escapes to save the Emperor.
On a New Zealand glacier, on a ride with one of the pilots who did helicopter flying for Mulan.
Giorgiutti discussing the action with one of the crew while shooting The Matrix films.
One shot took 117 versions to get approved, Giorgiutti reveals. “It was all about a character being yanked into a wall, falling down, then that comical Roadrunner moment, and the dust falling a beat after he falls. We could not make them happy on the dust! I’ve got a lot of those kinds of fond memories.” After over 40 years in the business, The Matrix remains a career highlight. “Back then, you always had a family in post, because the post-production team – director, editing, sound and visual effects – was a group of 20 to 30 people. Shooting these days, your crews go up to 500 to 600 and even more if you have a lot of extras involved. It becomes too many people. On The Matrix, I got to know all of the names of the grips and electrics. A lot of the guys doubled up with their jobs because that’s how it was back then. There weren’t that many people in the industry, certainly in Australia.”
“I think I have five more years in me,” Giorgiutti reflects. “I have seen it all. I was sticky-taping two-inch videotape together, and I remember the one-inch machines coming in that were vacuum operated so they would move really fast. Any of us girls with long hair had to wear it in a ponytail because ‘whoosh’ your hair could get caught. Now, of course, it’s digital, volume and metaverse! I’ve got to look that up!” As for what avatar she would choose for herself, Giorgiutti responds, “I would be a woman on a unicorn. Nobody has ever asked me that before, so that immediately came to my head! I love horses. If only we could all have unicorns in our lives!”
By TREVOR HOGG
Filmmaker Peter Weir. (Image courtesy of Peter Weir)
Master and Commander production images courtesy of Disney and Nathan McGuinness. Fearless images courtesy of Flash Film Works.
One does not think about spectacle when watching the movies of filmmaker Peter Weir, who believes in infiltrating the subconscious with subtle visual and sonic cues rather than overloading the senses with eye candy to create the desired mood and atmosphere. Even when digital and practical effects play an integral role in achieving the necessary epic scope, there is an organic quality to the image being presented on the screen. “The most work that I did with special effects or CGI was Master and Commander: The Far Side of the World, and that was quite something to make [ocean scenes] look real. Most of the oceans are composites. We were only at sea for 10 days,” notes Weir, who received an Honorary Oscar at the Governors Awards for a career spanning 13 films and six Academy Award nominations, and for being a key member of the Australian New Wave that turned a cinematic hinterland into an internationally renowned film industry.
Shooting in the water tank built for Titanic in Rosarito, Mexico, was the preferred option for Master and Commander, which takes place during the Napoleonic Wars. “It was probably one of the most difficult films I ever worked on,” remarks Russell Boyd, who reunited with Weir after two decades and received an Oscar for Best Cinematography. “There were an awful lot of mechanical special effects in it like water explosions, all that fun. I remember about six weeks before shooting started, we all looked at each other and asked, ‘How are we going to make this picture?’ because there were so many variables. Everything was scaled up. The visual effects certainly played a part. It was a huge learning curve for all of us, but in the end, we stuck to our guns. Peter made the most genius call by commissioning these huge 1/6th-scale models to be built at Wētā Workshop, which I believe the studio didn’t want to do. They wanted to digitally create the boats, and it honestly would have been a disaster because those models turned out to be fantastic.”
Good fortune occurred when shooting plate footage. “The Endeavour replica was sailing around Cape Horn from west to east during our pre-production time, so I managed to get a cameraman on board,” Weir remarks. “It is hard to buy 35mm or high-quality visual shots of the ocean with swells or huge waves all around. People shoot on 16mm, [camera operators did] in the old days anyway, or on inferior digital video cameras. He didn’t get the storm but did get some fabulous ocean plates with big swells. Those were valuable for Asylum VFX [the visual effects company].” The only time CG ships appeared was in the wide shots. “Peter was so precise in what he wanted,” recalls Nathan McGuinness, former Creative Director/Visual Effects Supervisor of Asylum, who received an Oscar nomination for Best Visual Effects. “He came in early in the meetings with an oil painting of that period ship in a storm. Peter goes, ‘I want the storm to look like this.’ That’s exactly what I used as my reference.”
Weir standing in front of the HMS Surprise, which was shot in the same water tank that was built for Titanic.
“It was a monolithic compositing show,” McGuinness states. “We had 17 Flames running. Everyone was doing their roto and sitting there sifting through all of the ocean plates we had and blending. What we did was to create two or three master shots that were exactly what Peter wanted and that then helped us to control the look. [Editor] Lee Smith and his team were in my building, so we were together. That took the communication [concern] of not knowing what the editors are doing completely away.” Digital double work was minimal, McGuinness observes. “We did shoot libraries of doubles so that we could stack them onto the ship, especially for the models and anything that wasn’t live. Also, we were able to take pieces from what we shot off of the live shoot with the crew on and put that on the models.” There was a limit to what could be done on set with the actors. “We mapped it all out, had the actors go everywhere, recreated that layout with the explosions going off, and drop-comped it all in,” McGuinness describes. “We would pick up the elements that were needed, like the wood, embers and the cannon fire. We had libraries of footage that the compositors could grab and add in. We had a lot of atmospherics.”
Miniature Effects Supervisor Richard Taylor examines one of the miniatures constructed for Master and Commander at Wētā Workshop.
Visual Effects Supervisor Nathan McGuinness shares a moment with Weir while producing the visual effects for Master and Commander at Asylum.
A real-life plane crash was recreated for Fearless, but Weir does not view it in the same light as his nautical adventure. “Oh, that was a different thing,” Weir explains. “That was in the earlier days before CGI. There was a company in Los Angeles [Introvision International] that specialized in making these plates for you, and they were very good. When Jeff Bridges is standing on the roof of the building, they built the actual corner of the building at the studio and then made up plates of the traffic in the background and seamlessly married them. I could see it through the camera in that case, so we were able to fold in the background into the viewfinder. For the plane crash, I bought materials that had been used for a plane cabin, and we created a lot of it in the studio because it was mostly interior, so you could create the chaos and debris.”
Orchestrating the special effects, which included dump tanks to add to the realism.
Ocean plates and live action footage of Russell Crowe are composited together.
Water cannons were utilized to get the proper interaction between the ocean and ship.
Some shots were captured using a gimbal and bluescreen rather than in the water tank.
Water tanks in action during principal photography of Master and Commander.
Introvision International altered a model that had been previously used for the television movie Miracle Landing. “We shot it upside down so that anything breaking off would fall down to help make it look like it was blowing away,” remarks William Mesa, who was the Visual Effects Supervisor on Fearless. “All kinds of wires were connected to little parts of the plane body so when the camera started rolling, we could pull off luggage compartments or different windows to make it look as if it’s being ripped away. Then we shot many different plates from various angles with Jeff Bridges and the boy next to him.” Plates were shot outdoors. “We had a truck with a VistaVision high-speed camera mounted on the roof and crashed through a cornfield as fast as we could possibly go until it got all clogged up underneath.” Weir wanted the passengers to see that the plane was out of control. “For a lot of the shots, we used a Sabreliner jet, took the door off and mounted a VistaVision camera in it,” Mesa states. “Then we made runs back towards Bakersfield and literally took that plane upside down and then back up again. I almost got sick doing it!”
The roof of a 12-story building in San Francisco was the location for when Jeff Bridges’ character stands on a ledge. “We couldn’t get certain angles there because the set was way inland to the actual building,” Mesa explains. “Plus, the ledge was much higher than the real one. Jeff could climb up on top of that and be totally safe. The biggest concern was for me to shoot the plates hanging over the end of the building and looking down. It was a safety concern for the camera. But we worked that rig out.” A number of onlookers appeared in the surrounding windows. “Tons of women would put up their phone number saying, ‘Call me tonight,’” Mesa comments. “The production manager had to go over to the building and tell them we couldn’t do anything because all of this stuff was in the background; that ended up delaying us for a while.” The set was reconstructed inside a studio. “You go through a certain process of having to get what you call ambient light,” Mesa adds. “[DP] Allen Daviau left it up to me to light a lot of that, then he would go in and tweak the lighting on the face to be the look that he wanted it to be.”
An intricate series of wires was connected to the miniature to make sure that the pieces came off at the right time for Fearless.
Experimenting with visual and sonic trickery, such as camera speeds and earthquake sounds, dates back to the adaptation of Picnic at Hanging Rock, in which, at the turn of the 20th century, a group of schoolgirls disappears upon entering a mysterious volcanic formation in Australia. “Not only did the film not have an ending, it was a whodunnit with no ending,” Weir observes. “I had to somehow strive to make it so that you want to live in the mystery. But to do that I had to make it dreamy and not overemphasize the investigation from the police.” [Cinematographer] Russell Boyd finds it inspiring to work with a director who is adventurous with camera angles. “Peter has always liked to experiment with using different lenses and different heights with the lenses in getting a shot, and the speed of the camera. He likes slow motion just to heighten one little mannerism or a little movement. In Picnic at Hanging Rock, when the girls were crossing the river, we shot in 32 frames, which gives it that slight motion effect.” Lens distortion was utilized to create an impression that a magnetic field might be present. “There was a bit of that,” remarks John Seale, who was the Cinematographer on the film, as well as on The Last Wave and Gallipoli. “Also, the use of the rock formations, finding a face and having it quietly sitting in the top left corner where the audience might suddenly say, ‘Did you see that?’” There are voyeuristic shots. “It’s as though the rock is watching and preying on them,” Seale says. “The simplicity of girls walking up to a rock enhanced by Peter Weir is something awesome to watch.”
A miniature plane engine was placed outside of a helicopter and shot from the perspective of a passenger for Fearless.
Visual Effects Supervisor William Mesa with Weir for the studio shoot of the high-rise sequence in Fearless when Jeff Bridges stands on the ledge, which involved Introvision plate projection.
Preparing a shot of the interior of the miniature plane used for the crash sequence in Fearless.
The desert crossing scene in Gallipoli was practically shot with Weir accompanied by 1st AD Mark Egerton as he talks to his lead actors Mark Lee and Mel Gibson. (Image courtesy of Peter Weir)
Weir on the set of Picnic at Hanging Rock with actress Rachel Roberts, where he experimented with camera speeds and sound effects, such as earthquake rumbles, to create a sense of otherworldliness. (Image courtesy of Peter Weir)
Cinematographer Russell Boyd composes a camera angle for The Last Wave starring Richard Chamberlain, which was actually shot in the underground caverns situated below Sydney. (Image courtesy of Peter Weir)
Weir directing Lukas Haas for the train station scene in Witness. (Image courtesy of Peter Weir)
Weir made his Hollywood directorial debut with Witness, where he bonded with Harrison Ford and Kelly McGillis. (Image courtesy of Peter Weir)
Visions of an impending disaster dominate the narrative of The Last Wave, where a criminal defense lawyer (Richard Chamberlain) discovers that he has a mystical connection to his Aboriginal clients. “At the time,” Weir states, “I was influenced by reading the works of Immanuel Velikovsky, who believed the world is changed often or several times by catastrophes, which were [impacts] with bodies from space at one time or another, like from asteroid collisions or coming close to stars that moved out of their alignment. I also wanted to talk to these Aboriginal elders, which was the most interesting part of making the film.” Seale believes that Weir always looks for that ethereal sense he can get out of a normal emotion. “The Last Wave was full of that right from the word ‘go’ because it was from the imagination of a man that comes into reality,” Seale explains. “Peter invented a tracking shot we called the ‘imperceptible move.’ The camera moves in almost subliminally so that the audience sitting in the theater would feel as though they were leaning forward, that something is going to happen. The two grips would sit down on opposite wheels of the dolly and turn them by hand so the camera was so gently moving forward. Once it got awkward for one of the grips to turn the wheel without taking his hands off of it, the other guy on the other side would take over, keeping it moving while he re-positioned his hands for the next bit.”
Gallipoli was all about honoring the memory of the Australian soldiers slaughtered during the infamous World War I battle that left a permanent scar on the nation’s psyche. “When I went, you could walk around the battlefield,” Weir recalls. “There are bullets and broken bayonets. The trenches had fallen in, but you could still see trench lines. Having gone through the experience of that day at Gallipoli on my own with no one around, that was it. I swore that I would make this picture for them as a sort of war memorial.” Special effects handled the explosions and guns. “It was a bit of Australian ‘what if?’ because when we were in the trenches, the shell explosions were coming down the hill towards us,” Seale remembers. “There were actually giant packets of dynamite in the ground and we all got shell shocked! The trench walls were shaking and starting to collapse. We had to have earplugs in because of the compression.”
Weir pauses a moment to reflect while making The Way Back. (Photo: Simon Varsano)
Boom operator Jeffrey A. Humphries stands by 1st AD/Executive Producer Alan Curtis, who is shouting “Action!” next to Russell Boyd and Weir while shooting in the Galápagos Islands for Master and Commander. (Image courtesy of Peter Weir)
The silo death scene in Witness was achieved practically with a stunt double and a hidden oxygen tank and mask. “The farm belonged to a family called the Krantzes,” Weir states. “I asked Mr. and Mrs. Krantz, ‘What’s that?’ They said, ‘It’s a grain silo. We store it up top, open the lever and drop what we require at various times.’ Then I said something like, ‘Can you go in through this door?’ They replied, ‘Yes, but I wouldn’t want to be in there with the door closed. If any grain fell, you’d suffocate. You have to wear a mask or something because the dust is harmful to the lungs.’ I thought, ‘My god, what a weapon!’ We quickly reconfigured it all, and Harrison Ford loved it.”
McGuinness describes his Australian countryman as a legend. “Super in the moment. You spent a lot of time in awe of the experience and nature that he had. Peter was calm and astute. You felt good being around him, which was always the case. He always thought of everybody. Peter always remembered everyone’s names. Always respected everybody from top to bottom. You could see the human side of Peter as well as commanding a full production under a lot of pressure. It was a high-pressure movie with a lot of pressure coming from all sides. Peter pushed through it.” Mesa was intrigued by how Weir works. “Before we started anything in pre-production, he interviewed everybody,” Mesa observes. “There were people who did not make it on the show because he knew that they were going to be problematic to him. And what that does is make for a great experience in making the movie.”
Reflecting on his attitude towards filmmaking, Weir states, “My approach had been that I wanted [to do to] the audience [what] other filmmakers [had done] to me: [to make them feel] they really were there. I really believed it. And so, when I walked outside, I used to joke and say, ‘I don’t know where I parked the car. I’m so lost in the movie, I can’t remember real life.’”
A gimbal used for the HMS Surprise during the making of Master and Commander. (Image courtesy of Peter Weir)
Weir directs Jeff Bridges on top of a high rise in San Francisco during the making of Fearless.
Weir and Cinematographer John Seale sweat it out in the jungles of Belize while shooting The Mosquito Coast. (Image courtesy of John Seale)