By CHRIS McGOWAN
Images courtesy of DNEG and Sony Pictures Entertainment.
Tom Holland as treasure hunter Nathan Drake negotiates a daisy-chain of crates falling from a C-17 cargo plane in a complex mix of practical and visual effects from DNEG.
Ferdinand Magellan was a Portuguese explorer who led a Spanish expedition of five ships in 1519 to seek a western route to the Moluccas (Spice Islands). Magellan perished along the way and only one ship made it back, in 1522, but it was the first craft to circumnavigate the world. Flash forward five hundred years, and Ruben Fleischer’s Uncharted spins a fictional tale about a present-day search for two lost treasure-laden ships from Magellan’s fleet. The Sony Pictures movie is a prequel of sorts to the tremendously popular Uncharted video game series, developed by Naughty Dog and published by Sony Interactive Entertainment. The film’s treasure hunters included Nathan Drake (Tom Holland) and Victor Sullivan (Mark Wahlberg), along with Chloe Frazer (Sophia Ali) and Santiago Moncada (Antonio Banderas).
The On-Set and Overall VFX Supervisor was Chas Jarrett. DNEG was the primary VFX vendor, completing 739 shots over 23 sequences, with teams led by Visual Effects Supervisor Sebastian von Overheidt (DNEG Vancouver) and Visual Effects Supervisor Benoit de Longlee (DNEG Montreal). Other contributing VFX vendors included The Third Floor, RISE Visual Effects Studios, Soho VFX and Savage Visual Effects.
The crates falling from the C-17 cargo plane were part of a continuous 90-second ‘oner’ sequence that mixed bluescreen, wire rigs, robot arms and digi-doubles.
Holland reaches out while a KUKA robot arm holds a crate and large fans supply the wind for the shoot. Live-action filming for the sequence took place at Studio Babelsberg in Potsdam, Germany, outside Berlin.
DNEG was tasked with handling various jaw-dropping sequences, including a 90-second shot in which Nate and Chloe – along with cargo crates and a Mercedes Gullwing car – fall out of a C-17 cargo plane while flying over the South China Sea. Von Overheidt considered the shot “a fun challenge. We called this sequence ‘the oner’ because it’s constructed as one continuous 90-second shot.”
“[For the falling out of a C-17 cargo plane scene] we had several practical elements with the actors hanging on wires and interacting with a stand-in car prop. We combined the practical elements with long stretches of full-CG moments. Some sections required either close-up digi-doubles to hold up, or even a transition between plate and digi-double right in camera with nowhere to hide. Mix that with the disorienting camera, and you have quite a complex puzzle to solve.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Von Overheidt adds, “We had several practical elements with the actors hanging on wires and interacting with a stand-in car prop. We combined the practical elements with long stretches of full-CG moments. Some sections required either close-up digi-doubles to hold up, or even a transition between plate and digi-double right in camera with nowhere to hide. Mix that with the disorienting camera, and you have quite a complex puzzle to solve.”
To begin creating the sequence, von Overheidt reveals, “We received LiDAR scans and HDR photography of each individual cargo crate and all the other props like the Mercedes Gullwing, as well as a full scan of the C-17 interior, which was built as a set. From there we built the entire daisy-chain of crates and the C-17 interior. At the same time, we also worked on a fully digital version of the Gullwing and the C-17 exterior model with some custom modifications compared to a standard model. Ruben had asked us to create a billionaire’s version of the well-known plane.”
The exterior model of the C-17 cargo plane was built with some custom modifications befitting a super-billionaire’s souped-up version of the plane.
“[Tom Holland] indeed got thrown around quite a bit. All the crates on the exterior were mounted on top of KUKA robot arms so that they could move on a full gimbal in a programmed sequence. They were also modified with extra padding or using softer materials, so that Tom Holland and the stuntmen could jump in between them, holding onto the netting of crates. It gave a great realistic-looking reaction for most of the shots, so we got away with a lot of head replacements on the shots with Holland’s stuntman. In quite a few shots we still went for a full digital-double solution because we wanted the performance even more violent or the camera to be more dynamic than what was shot.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Once camera, object and body tracking were done, Layout Supervisor Kristin Pratt and DFX Supervisor Gregor Lakner and their teams blocked the entire sequence out, “which is also the crucial step where we’d analyze each shot and figure out what CG extensions need to be added,” von Overheidt says. This also involved finding solutions for any discrepancies between the 3D-scanned crates and the ones used on set. “Our job was to piece this all together while finding the best transitions into CG and amp up the action and movement.” There were also some entirely full CG shots. He adds, “The environment was stitched based on multi-camera array footage shot at around 7,500 feet and then augmented to look a bit more desolate in terms of islands. All the clouds and wind FX and debris are CG.”
Lighting in the open sky was a challenge. “The plates were shot on a soundstage with stationary lighting, but our characters fall tumbling through an environment with only one light-source, the sun,” von Overheidt explains. “DFX Supervisor Daniel Elophe and the team broke this mammoth puzzle down into manageable sub-sections which were assembled to one long shot in compositing at the end.” The team around Lighting Supervisors Sonny Sy and Chris Rickard and Compositing Supervisor Francesco Dell’Anna kept track of changing light directions and found creative solutions to make it all work with the plates, while allowing for a free choreography of the camera and the animation, done by Layout Lead Steve Guo and Senior Animator Patrick Kalyn. “The result works really well and we ended up getting the best of both,” von Overheidt says, “seeing the sun rotating on high-action free-fall moments while coming back into a more character-focused lighting when there is dialogue and we’re locked into practical photography.”
Greenscreens assisted with the construction of a 500-year-old Magellan ship. The ships were highly detailed and complex assets built for every form of action called for in the making of Uncharted.
Tom Holland got his share of shaking and stirring thanks to a robot arm. Von Overheidt comments, “He indeed got thrown around quite a bit. All the crates on the exterior were mounted on top of KUKA robot arms so that they could move on a full gimbal in a programmed sequence. They were also modified with extra padding or using softer materials, so that Tom and the stuntmen could jump in between them, holding onto the netting of crates.” They were thrown around randomly by the robot arms to get the sense of snaking of the daisy-chain. Von Overheidt adds, “It gave a great realistic-looking reaction for most of the shots, so we got away with a lot of head replacements on the shots with Holland’s stuntman. In quite a few shots we still went for a full digital-double solution because we wanted the performance even more violent or the camera to be more dynamic than what was shot.”
To build Magellan’s two ships, sets were split into different stages, LiDAR scanned, pieced together and combined with the overall design.
The scenes with Magellan’s ships (the Trinidad and the Concepción) and the huge helicopters carrying them required extensive VFX, but the sequence wasn’t created entirely in CG. Von Overheidt notes, “There was actually a lot of great footage shot on big sets. This sequence really had everything in it. The scenes were shot on several stages resembling different parts of the ships, which we were extending with CG. The helicopters we had designed are based on some classic cargo helicopters, but even beefier.”
In the case of the Concepción, the set was split into four different stages – the stern, the main deck including helm, the bow and the crow’s nest with a partial mast, according to von Overheidt. “Our CG Supervisor Ummi Gudjonsson and Build Supervisors Chris Cook and Rohan Vaz started by assembling the various on-set stages for which we had received LiDAR scans, piecing them together, lining them up to each other and combining them with the overall design of the ship.”
Von Overheidt continues, “The same process went into the Trinidad and any other set, like the helicopters. Throughout the boat battle sequence we picked about a dozen hero shots based on the criteria [of] which ones would reveal the most problems, and we would constantly check whether our model of the ships lined up to those shots. The tricky part is that practical sets aren’t perfect. They may not be symmetrical, or the same section may have different dimensions across the different sets. In addition to that initial step, it then requires careful planning and a lot of work to get to the detail level of a good practical set. The ships were highly detailed, and complex assets were built for every form of action, including total destruction. Both ships were fully-rigged sailing ships with ropes, cloth banners, sails, flexing masts and yardarms, flapping doors, all the cannons, etc. [There were] a lot of moving parts which helped to bring across some of the crazy movements and crashes the ships would do in the sequence.”
“There was actually a lot of great footage shot on big sets [for the scenes with Magellan’s ships and helicopters carrying them]. This sequence really had everything in it. The scenes were shot on several stages resembling different parts of the ships, which we were extending with CG. The helicopters we had designed are based on some classic cargo helicopters, but even beefier.”
—Sebastian von Overheidt, Visual Effects Supervisor
Between the two ships and helicopters, around 20 mercenaries, Braddock (Tati Gabrielle), Hugo (Pingi Moli) and the Scotsman (Steven Waddington) all become part of different fights which were augmented with head replacements or full digi-doubles. Von Overheidt explains, “The journey of the flight was across [some] 330 shots, so we built a massive environment that we used to block out the sequence. Ruben wanted an action-packed sequence. Especially, the shots where we see the boats and helicopters moving through the Karst landscape had to be dynamic and exciting, and we wanted to feel their weight and impact on the helicopter’s flight dynamics.”
Von Overheidt adds, “Now, real-world physics obviously weren’t a priority on this sequence to begin with, but we still aimed towards that feel for a plausible animation and also staging the camera in a way that it would guide the audience through the disorienting action and make the ships look massive at the same time. We basically had to stick to real-world physics while also constantly breaking it at the same time. The entire sequence was a close collaboration between our layout team and the animation team led by Animation Supervisor Jason Fittipaldi and Animation Lead Konstantin Hubmann, and [On-Set and Overall] VFX Supervisor Chas Jarrett, whose own roots are in animation.”
The CGI helicopters were based on classic cargo helicopters but made beefier. They had unusually heavy loads to carry – Magellan’s ships – across the South China Sea, with footage shot in Thailand serving as the South China Sea.
“Generally speaking, working with big practical sets is great for visual effects because you have real references to match to – the real material, the real lighting and how the camera captures it. Even if you end up replacing parts of it anyway, it’s a great start. Actors feel more comfortable interacting with a real environment as well. The trade-off is that matching into complex practical sets can be quite the puzzle for visual effects.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Magellan’s ships, carried by helicopters, waged battle in the air.
“For the South China Sea environment, we had received extensive footage from a practical shoot in Thailand. Film Production mounted a multi-camera array under a helicopter and flew through the landscape also shooting at different lighting conditions during the day,” von Overheidt says. The original plan was to use this material as practical backgrounds and only extend plates or create specific shots full CG. “As we were creating a digital version of the environment, we soon realized that our team, led by Environment Supervisor Gianluca Pizzaia and Environment Lead Matt Ivanov, was able to create one big environment which would cover the entire flight path throughout the sequence. And straight out of rendering it looked pretty much photorealistic. We presented our results to Ruben, who got excited about it. Everyone was confident that this would be the way to do it. It gave us and Ruben so much more freedom to find great cameras and shot composition that we decided to go full CG on the environment all the way through.”
Von Overheidt continues, “It allowed us to move the camera anywhere we wanted and fully customize the environment to our needs. It made the whole process a lot more efficient as well. Throughout the third act, there is also a progression in lighting from afternoon to sunset. Compositing Supervisor Kunal Chindarkar and Compositing Lead Ben Outerbridge made sure we transitioned seamlessly into these different lighting conditions and moods as we reached the final shot of the Concepción sinking and Nate and Sully flying into the sunset.”
Asked if the filmmakers let the look of the Uncharted video games influence the visuals of the movie, von Overheidt comments, “Not from a visual effects perspective, no. I can’t speak for the Production Art Department though. I used to game quite a bit but never played Uncharted before, so when I joined the show, it was actually the first time I checked it out, mainly to understand the characters and some of the main levels. My main influence for creating images comes from photography and graphic design. I get most of my inspiration from actually being outdoors. We had some great artwork from the production team and the Thailand footage to look at. We would also often look at references for all kinds of scenes, like crazy skydiving stunts or video footage of heavy-lifting helicopters.”
Looking at the melding of the big-scale practical and digital in Uncharted, von Overheidt concludes, “Generally speaking, working with big practical sets is great for visual effects because you have real references to match to – the real material, the real lighting and how the camera captures it. Even if you end up replacing parts of it anyway, it’s a great start. Actors feel more comfortable interacting with a real environment as well. The trade-off is that matching into complex practical sets can be quite the puzzle for visual effects.”
With the help of bluescreen, Pingi Moli (Hugo), Tati Gabrielle (Braddock) and Steven Waddington (The Scotsman) appear to walk down the ramp of the C-17’s cargo bay onto a busy operations base.
By TREVOR HOGG
Images courtesy of Netflix.
“Bad Travelling” is the animation directorial debut of David Fincher.
There are sinister underpinnings to human nature which are mined narratively to create stories filled with destructive conflict and satirical humor for the Emmy-winning Netflix animated anthology Love, Death + Robots, executive produced by filmmakers David Fincher (The Social Network) and Tim Miller (Terminator: Dark Fate). The nine shorts curated for Love, Death + Robots Vol. 3 are examples of drastically different visual styles from the likes of Patrick Osborne, David Fincher, Emily Dean, Robert Bisi and Andy Lyon, Jennifer Yuh Nelson, Tim Miller, Carlos Stevens, Jerome Chen and Alberto Mielgo, with animation provided by Pinkman.tv, Sony Pictures Imageworks, Axis Studios, Blur Studio, Titmouse, BUCK, Polygon Pictures and Blow Studio.
“In Vaulted Halls Entombed” is a military adventure that descends into Lovecraftian horror.
“When 3D animation came out, it allowed us to do certain things that we couldn’t do in 2D animation. The same with a lot of the game engines. You are able to express an entire world, adjust things in real-time and change the light if you want. It’s not baked into things like it is usually.”
—Jennifer Yuh Nelson, Supervising Director
“Jibaro” is the only episode that is not based on pre-existing material.
Returning as the supervising director from her previous outing on Vol. 2 is Jennifer Yuh Nelson (Kung Fu Panda 2 & 3), who worked with a mixture of new and veteran collaborators as well as making her own contribution with the muscle-flexing action adventure “Kill Team Kill.” Notable first-time participants are David Fincher, making his animation directorial debut with the monstrous seafaring tale “Bad Travelling,” and Patrick Osborne, helming the macabre-funny, post-apocalyptic sequel “Three Robots: Exit Strategies.” Returnees include visual effects veteran Jerome Chen helming “In Vaulted Halls Entombed,” where a special forces team encounters an ancient evil, and Oscar-winner Alberto Mielgo envisioning a fatal romance between a deaf Renaissance knight and a lethal siren in “Jibaro.” Inventive animation styles are found in “Night of the Mini Dead,” which uses tilt-shift photography to make everything look tiny, in the Moebius- and psychedelic-flavored “The Very Pulse of the Machine,” and in the painterly impressionism of “Jibaro.”
As to whether real-time technology and game engines are impacting the type of stories being told, Nelson does not believe this to be the case. “I don’t know if it’s types of stories that it has affected,” she explains. “It’s the look and how much you can deal with certain levels of complexity. When 3D animation came out, it allowed us to do certain things that we couldn’t do in 2D animation. The same with a lot of the game engines. You are able to express an entire world, adjust things in real-time and change the light if you want. It’s not baked into things like it is usually.” The impact of game engines like Unreal and Unity cannot be ignored. “I’m so old that I was on the cusp of the desktop revolution, and it used to be when I started in the business you had to have a lot of money to be able to do 3D animation,” recalls Miller. “Then desktop technology and software came along and it democratized the process, which allowed us to start Blur, borrowing $20,000. I thought that was amazing, but game engine technology is going to be a paradigm shift again. You don’t need heavy machines to render. Even lots of cheap PCs are still expensive and need some technical infrastructure. Now guys can do minutes-long shorts in their basements at home and you can see it on the web. You see a lot of interesting artists doing great things by themselves or with small teams. Game engine technology is super freaking exciting. I feel like I’ve been waiting for it a while, but now it’s here.”
“Kill Team Kill” is a kindred spirit of Predator, Commando and Escape from New York.
“[G]ame engine technology is going to be a paradigm shift again. You don’t need heavy machines to render. Even lots of cheap PCs are still expensive and need some technical infrastructure. Now guys can do minutes-long shorts in their basements at home and you can see it on the web. You see a lot of interesting artists doing great things by themselves or with small teams. Game engine technology is super freaking exciting. I feel like I’ve been waiting for it a while, but now it’s here.”
—Tim Miller, Director
“Mason’s Rats” revolves around a Scottish farmer battling with weapon-wielding rats determined to steal his crops.
“Night of the Mini Dead” was created by using tilt-shift photography which makes everything look tiny.
When it comes to her own short, where a squad of soldiers in Afghanistan encounter a CIA experiment gone horribly wrong, Nelson decided to channel a fondness for a particular cinematic era that made action icons out of Arnold Schwarzenegger, Sylvester Stallone, Bruce Willis and Jean-Claude Van Damme. “For ‘Kill Team Kill,’” she says, “my inspiration was cartoons from the 1990s and action movies from that time, like Predator, Commando, and G.I. Joe cartoons. They were good fun at the time, and the story by Justin Coates had that feel to it, so that’s where that came from.” Handling the animation was the studio responsible for The Boys Presents: Diabolical and The Legend of Vox Machina. “I got to work with Titmouse, and they’re an amazing studio with a wide variety of different styles. I got to work with Antonio Canobbio and Benjy Brooke who helped to find this look. It’s a 2D style, so it has to be animatable. The character designs themselves are covered with veins and packets of ammo which are hard to animate, but we got the benefits of amazing animators from all over the world, and you can see that level of expertise in it.”
“[For ‘Jibaro’] we used real scans of armor that you might see in museums. When you see the armor, it feels almost unbelievable that you can fit a person inside. The cool thing about this is we don’t actually need to fit a person inside because these aren’t real characters. You can just have their neck. We were using real Renaissance armor. We were redesigning it a little bit, but the cool thing is that we’re seeing something that is historically accurate. I feel that is extremely new and fresh.”
—Alberto Mielgo, Director
“Swarm” was adapted by Miller from a short story by Bruce Sterling, and revolves around human factions with conflicting views as to whether advancement should be achieved through genetic manipulation or cybernetic enhancement and technology. Adding further complications is the discovery of an insectoid race that may be of a higher intelligence than humanity. “We have a set of eight-sided dice and roll them!” laughs Miller when describing how he decides upon the animation style, character design and world-building. “It was interesting that we had this short which is almost entirely in zero-G, but we were still going to do some motion capture for that,” notes Miller. “Then the pandemic hit and motion capture was not an option anymore. I didn’t want to get caught in the uncanny valley either, so I decided to stylize the characters to a certain degree, which helps the story not be quite as horrible as it would be otherwise. I loved making the show. It was a challenge to think about the physics of how people move through zero-G, and anything with lots of creatures is a good time. I get a lot of vicarious enjoyment from knowing the animators and creature designers are going to enjoy the process of making this.”
Mocap was combined with CG keyframe animation to produce “Swarm.”
“[For ‘Kill Team Kill’] it’s a 2D style, so it has to be animatable. The character designs themselves are covered with veins and packets of ammo which are hard to animate, but we got the benefits of amazing animators from all over the world, and you can see that level of expertise in it.”
—Jennifer Yuh Nelson, Supervising Director
“The Very Pulse of the Machine” is a love letter to French comics great Jean “Moebius” Giraud.
“Three Robots: Exit Strategies” features the neurotic XBOT 4000, dimwitted and enthusiastic K-RVC, and the brilliant and deadpan 11-45-G examining the demise of humanity.
Self-taught as an artist, Mielgo (The Windshield Wiper) utilizes the principles of painting, in particular lighting, when producing animated shorts such as “Jibaro.” “I create a simple image by removing what is not necessary for the eye to understand,” he says. Themes rather than the premise influence the animation style. “In terms of the girl, I wanted her to be a walking treasure, and in order to do that I was doing research on folklore jewelry from Northern Africa, China, India and Pakistan. In the case of the guys, I prefer the Renaissance rather than the Medieval in terms of design. We did something interesting, which is we used real scans of armor that you might see in museums. When you see the armor, it feels almost unbelievable that you can fit a person inside. The cool thing about this is we don’t actually need to fit a person inside because these aren’t real characters. You can just have their neck. We were using real Renaissance armor. We were redesigning it a little bit, but the cool thing is that we’re seeing something that is historically accurate. I feel that is extremely new and fresh.”
Sheena Duggal is an acclaimed visual effects supervisor and artist whose work has shaped numerous studio tent-pole and Academy Award-nominated productions. Most recently, Duggal was Visual Effects Supervisor on the box office blockbuster Venom: Let There Be Carnage and was a BAFTA nominee this year for Best Special Effects for her work on the Oscars VFX-shortlisted Ghostbusters: Afterlife. Sheena is the only woman VFX Supervisor to earn that level of recognition from the Academy this awards season. She was the first woman to be honored with the VES Award for Creative Excellence, bestowed in 2020.
The lack of female visual effects supervisors is definitely the result of a lack of opportunity and unconscious bias – and that is fixable. Earlier in my career, I was told that the goal was to promote the male supervisors, and watched as guys who had worked under my VFX supervision were promoted up the ranks and given opportunities on large VFX shows. It never occurred to me that my gender would hold me back, and I was always surprised when it did. I am a strong believer in diversity and inclusion, not just because I am a bi-racial woman, but because I believe that greater diversity leads to freer thinking and greater creativity.
Good girls get nowhere. Be disobedient, be persistent, never take disrespect thrown your way… be smart and graceful and remember you are equal.
Never stop fighting for the right to be the best you can be. Women spend too much time being congenial, and it’s time for us to speak up about our achievements and the opportunities we’ve created for ourselves. We’re talented, we’re here, and we’re ready.
Even if women break through the glass ceiling, they end up on a glass cliff where they can be pushed off, because there is no cadre of women to cheerlead in support that is equivalent to a “boys’ club.” We need to be building an industry culture and a structure that supports women in the field and sets them up for success. I take my opportunity to be a role model and a voice for other women seriously; I want to not just open doors, but bust through them.
Change can happen fast if everyone is motivated. We need to do it now.
In having this inevitable conversation, we can’t exclude men or accuse them if we want to create the change we want to see. We must do it together. Women are almost always expected to solve the systemic problems we did not create or perpetuate in a patriarchal culture. A lot of well-meaning people lack self-awareness or fail to understand their role in enabling sexism or great inequities. If meritocracy fails to work, then uplifting women needs to be a conscious choice. I would ask all men in VFX to go through implicit bias training and be active problem-solvers and advocates for women, because people still give men’s voices more credibility. It takes a lot of people to create success for an outlier.
By TREVOR HOGG
Images courtesy of Sky and HBO.
Michelle de Swarte portrays Natasha who has a fateful encounter with a mysterious baby seeking to control her life.
Upon reading the synopsis for the HBO and Sky horror comedy The Baby, one gets a distinct impression that anxiety about motherhood drives the narrative created by Lucy Gayme and Siân Robins-Grace. The summary states, “Controlling, manipulative and with violent powers, the baby twists Natasha’s life into a horror show. Where does it come from? What does it want? And what lengths will Natasha have to go to in order to get her life back? She doesn’t want a baby. The baby wants her.” When this observation gets mentioned to VFX Producer Anne Akande and Visual Effects Supervisor Owen Braekke-Carroll, both of them laugh in agreement. “It’s certainly a dissection of many angles of motherhood!” states Braekke-Carroll. “There is symbolism, and there are scenes that absolutely tap into practical and real fears of breastfeeding and abandonment. We were tasked with bringing some of the juicier parts of the script into the visual medium. It’s quite literal in many ways.”
Bobbi (Amber Grappy), Natasha (Michelle de Swarte) and Mrs. Eaves (Amira Ghazalla) stand in horror at the violent chaos that ensues in The Baby.
“We weren’t pushing [visual effects] beyond anything because the show was one that we knew early on was grounded in reality. The baby is a baby. There are a lot of misconceptions about what this baby is and what his agenda is. There are a few moments where we have some heightened reality and he is still a baby, but a bit different.”
—Anne Akande, VFX Producer
Gayme and Robins-Grace had a clear and descriptive vision of the reality and tone of the series. “Siân and Lucy were keen from the outset on getting a realistic and grounded tone throughout the series, and this influenced how we then approached the body of work,” remarks Akande. “We were involved early in the process to ensure that the shoot methodology would be effective and give visual effects enough material to pull off some of the more dramatic scenes. Beyond giving that guidance, they were also collaborative, open and willing to take feedback on the best way forward via visual effects to hit each story point.” The visual effects work for the eight episodes consisted of just under 650 shots by Framestore, Jellyfish Pictures, Freefolk and Goldcrest. “We weren’t pushing it beyond anything because the show was one that we knew early on was grounded in reality,” notes Akande. “The baby is a baby. There are a lot of misconceptions about what this baby is and what his agenda is. There are a few moments where we have some heightened reality and he is still a baby, but a bit different.”
A key location is a seaside cottage at the base of a cliff, directly fronting the shoreline.
This beachside cliff was LiDAR scanned, recreated through DMP/CG, then combined with plate photography.
Nicole Kassell helmed the pilot, Faraz was responsible for three episodes, and Stacey Gregg and Ella Jones each directed two episodes. “It’s always interesting working with different directors across a series, and in this case they did all have different approaches to handling the visual effects,” states Akande. “Nicole Kassell had a lot of experience in visual effects and had a hands-on approach from storyboards, concept, previs through to execution. Others brought their comedy experience to help drive the storytelling beats, and there was also some experimentation using different shooting techniques and machinery on set. All of this brought an interesting mix that fused with the tone of the show, creating a unique place for The Baby in the comedy/horror genre.” Storyboarding and concept art were produced for all of the key creative beats.
“We definitely knew that we needed a digital asset. By casting twins, we were able to double our shooting hours. The babies absorbed the nature of the set quickly, and we saw them grow up over the course of six months of shooting with them. That left us with a strange, hybridized methodology over time, whether it be face replacements from plates with a CG body, a stand-in prosthetic baby with a head replacement being pushed around in a pram, or one digital arm, plate head and a prosthetic body. There was also an army of stand-in babies.”
—Owen Braekke-Carroll, Visual Effects Supervisor
The cottage and immediate gardens were built on the site of a small quarry which provided the immediate base of the cliff and surroundings.
“To compensate for [the unpredictability of the babies] on set, we ended up treating almost every frame with the baby cast in it as a potential visual effects shot. This included taking large volumes of data and notation for most scenes and essentially treating them as a CG creature in the scene.”
—Anne Akande, VFX Producer
“Concept art for key moments, such as our Demon Baby nightmare scene, was developed by the Framestore art department and was crucial in helping settle the creative vision as much as possible before shot execution,” remarks Braekke-Carroll. “From the storyboards, some key shots were turned into previs.” Scripts for the eight episodes were broken down to determine what shots required visual effects. “We worked closely with the art department throughout the shoot to help find the right combination of set, location and bluescreen,” explains Akande. “A key location in the script that we return to many times is a seaside cottage at the base of a cliff, directly fronting the shoreline. Locations were unable to find a site that hit all the required points, so visual effects were tapped to make this work. The cottage and immediate gardens were built on the site of a small quarry, which gave us the immediate base of the cliff and surrounds. A secondary location along the site of a dramatic coastline in Newhaven [England] was the basis for the extension. This beachside cliff was LiDAR scanned, recreated through DMP/CG, then combined with plate photography to bring the two locations together.” Deaths are plentiful throughout the story, but the focus is on the aftermath rather than the actual act of violence. “There is an implied causal link between the baby and a death,” states Braekke-Carroll. “But he’s not necessarily physically holding the knife.”
Due to the unpredictable nature of the babies on set, every shoot day could wildly deviate from the plan and the visual effects team would be required to help.
“The sheer nature of the amount of time that we were going to have a baby onscreen and on set meant that a lot of things we had planned for would sometimes go flawlessly without any help from us,” notes Braekke-Carroll. “On another occasion, the entire day might need to be completely changed and require our input for all sorts of reasons.” Identical twins were cast in the title role. “We definitely knew that we needed a digital asset,” remarks Akande. “By casting twins, we were able to double our shooting hours. The babies absorbed the nature of the set quickly, and we saw them grow up over the course of six months of shooting with them.” Digital doubles were avoided as much as possible. “That left us with a strange, hybridized methodology over time, whether it be face replacements from plates with a CG body, a stand-in prosthetic baby with a head replacement being pushed around in a pram, or one digital arm, plate head and a prosthetic body,” states Braekke-Carroll. “There was also an army of stand-in babies. When it comes to performance with our hero twins, that became a hybridized process where we used a combination of digital passes, keying tools, reprojections and face tracking from source plates. Then also leaning on machine learning and additional 2D layering to change the performance.”
“It’s always interesting working with different directors across a series, and in this case they did all have different approaches to handling the visual effects. Nicole Kassell had a lot of experience in visual effects and had a hands-on approach from storyboards, concept, previs through to execution. Others brought their comedy experience to help drive the storytelling beats, and there was also some experimentation using different shooting techniques and machinery on set. All of this brought an interesting mix that fused with the tone of the show, creating a unique place for The Baby in the comedy/horror genre.”
—Anne Akande, VFX Producer
The eyes were difficult to get right. “The animation of the performance of the baby isn’t quite straightforward,” remarks Braekke-Carroll. “The eyes are quite loose and gaze differently. We took parts of plates for the area around the eyes for the micro-performance and combined that with CG or machine learning layers.” A wealth of material was gathered from reference photography. “We could be working on something in Episode 104 and there’s a performance that nails it in Episode 102,” states Akande. “Everybody on set was invested in getting us the material. It could be the first two seconds before the take, and that was needed for the face replacements. We also learned about which baby is good at being still or restless. The one thing that we tried to educate people on is that the babies are a member of the cast. If you replace a cast member with a stand-in for 30 shots, that becomes visual effects. Every time a baby was in a shot, the first port of call was our hero baby. The real performance will always be better than the alternative. CG was the last resort, and that was what we let the showrunners and executive producers know from the beginning.”
Face-generation camera setups were orchestrated that proved useful as animation reference and for machine learning. “Anytime we were using a digital baby performance, we would also be running a machine learning output as well,” explains Braekke-Carroll, “rather than treating that as a facial replacement solution, we had it as an additional layer setup that could be incorporated partially or fully in with the other renders for the other parts. A lot of the shots that you will see won’t necessarily be a machine learning output, but there will be parts of the lips, eyes or cheek that will give it an extra degree of photographic verisimilitude that you get from that output.” One of the most difficult visual effects tasks was to have a baby falling asleep or sleeping. “We had to find a bunch of solutions and ended up shooting a lot of high-frame-rate plates of the baby and played them back at normal speed,” adds Braekke-Carroll. “We looked for a nice section where it felt like they were sleeping. A machine learning dataset was built just of the baby’s eyes. The high-frame-rate photography gave it a gentle effect, rather than trying to animate too much micro eye movement.”
The show had one primary asset, which was the baby digital double. The bulk of the baby work was handled by Framestore.
In some cases, shots were a combination of high-frame-rate plate photography, digital-double parts and a machine learning layer on top.
Point-of-view shots take place within the birth canal and womb. “We had a free remit to take B camera, get all of the jars of Vaseline, PVC tubing, probe lens, lights, blood, sputum and pus,” reveals Braekke-Carroll. “We took all of the bits and pieces and gelled them up. We got some nice closeup photography inspired by the scenes in The Tree of Life. I was pushing against building a CG interior because, tonally, I didn’t think that it fit the episode. From that photography, visual effects added a layer of fine particulate and endoscopic lensing. There is also the diffusion of the water and cloudiness. But the actual content of the walls was practical photography.”
“The one thing that we tried to educate people on is that the babies are a member of the cast. If you replace a cast member with a stand-in for 30 shots, that becomes visual effects. Every time a baby was in a shot, the first port of call was our hero baby. The real performance will always be better than the alternative. CG was the last resort, and that was what we let the showrunners and executive producers know from the beginning.”
—Owen Braekke-Carroll, Visual Effects Supervisor
The unpredictability of the on-set babies posed the biggest challenge across the series. “To compensate for this on set, we ended up treating almost every frame with the baby cast in it as a potential visual effects shot,” explains Akande. “This included taking large volumes of data and notation for most scenes and essentially treating them as a CG creature in the scene.” The best moment was literally saved for last. “We’re looking forward to the final sequence underwater,” states Braekke-Carroll. “It’s a beautiful and unexpected scene that wraps up the story and bookends the series nicely.”
By TREVOR HOGG
South Korea-based Gulliver Studios created the dramatic effects for the surprise Netflix hit series Squid Game, which returns for a second season in 2024. (Image courtesy of Netflix)
With The Mandalorian taking a breather after winning two consecutive Emmy Awards for Outstanding Special Visual Effects in a Season or a Movie, it will be up to The Book of Boba Fett to continue the winning streak as the iconic bounty-hunter-turned-crime-lord series has been described as The Mandalorian 2.5. Whether Boba Fett receives a nomination, or more, will be revealed when the Primetime Emmy Awards takes center stage on September 12, 2022 at the Microsoft Theater in Los Angeles. The other category is Outstanding Special Visual Effects in a Single Episode, which last year was awarded to Star Trek: Discovery, another contender from a storied science fiction franchise which will be trying to repeat the feat once again.
An interesting bellwether is the 2022 Visual Effects Society Award nominations, which place Loki and Foundation at the forefront, with both being singled out for their stunning environmental work for Lamentis and Trantor. “We were asked to create meteor effects from scratch,” states Digital Domain Visual Effects Supervisor Jean-Luc Dinsdale when discussing Lamentis and its moon Lamentis-1. “We went through multiple versions of providing the meteors, the impacts, and the dust and debris that flies around them. That was then tweaked and populated throughout the episode because the meteors are a constant threat, but not always the focus of the sequence.”
Trantor is literally 50 different cities stacked on top of each other. “Every level was built hundreds of years before the next one, so there was a lot of concepting and architectural research that went into how Trantor and its multilevel structure was designed,” explains DNEG Visual Effects Supervisor Chris Keller. “We created all of these interstitial elements between buildings, like bridges, platforms, megastructures spanning 1,000 meters, through the sky ceiling of a certain level into the next level. Then you’ll see hyperloop trains and, if you look carefully, flying vehicles. All of that had a certain logic.”
Part of the futuristic appeal of Star Trek: Discovery is the amount of attention and detail put into creating believable UI. (Image courtesy of Paramount+)
When it comes to photorealistic CG characters, Leshy-infected Eskel and Nivellen from The Witcher and Ampersand from Y: The Last Man were also nominated for VES Awards. “There has been real growth on the monster side,” explains Andrew Laws, Production Designer for The Witcher. “We work in ZBrush from the ground up to understand the movement and how the creature is going to take shape in all dimensions. It’s a much more fluid process. Once we have established a ZBrush model that has an organic shape, we’ll do some overpainting to get the mood of the creature. When it is agreed upon how that is going to work, then the 3D model goes out to visual effects and the vendors to bring in the detail and movement.”
Originally, Ampersand was going to be real but was changed to CG because Disney has a ‘no primate’ rule. “Stephen Pugh and Jesse Kawzenuk, our amazing visual effects supervisors, made it so easy for me,” recalls cinematographer Catherine Lutes. “I was constantly laughing at the puppet Amp that we had. It helped with the way that the light was falling, and that’s a good reference as well for visual effects. Stephen said that camera shouldn’t do things that a monkey wouldn’t do. If the camera is a little bit stilted or doesn’t move smoothly, that’s great because that’s what would happen if you were trying to follow an actual monkey running or moving.”
One question is whether The Book of Boba Fett can carry on the Emmy-winning ways of The Mandalorian. (Image courtesy of Lucasfilm)
Nostalgia reigns supreme as Ewan McGregor and Hayden Christensen reprise their roles from the Star Wars prequels for Obi-Wan Kenobi. (Image courtesy of Lucasfilm)
Oscar Isaac becomes a part of the MCU for the first time, along with Ethan Hawke, in the Disney+ series Moon Knight. (Image courtesy of Disney)
Raised by Wolves is seen as a better exploration of an Alien-inspired universe than the prequels directed by Ridley Scott. (Image courtesy of HBO)
There is no shortage of monsters to be found in The Witcher, such as a powerful vampire known as a Bruxa. (Image courtesy of Netflix)
The Wheel of Time features a wide gamut of visual effects from creatures, magic and world-building done in a grounded fashion. “One thing that was important for me from the beginning was that this world feel authentic and real,” explains The Wheel of Time creator, executive producer and showrunner Rafe Judkins, “even for the actors and crew, trying to go to places, as much as we can put stuff in-camera, even if we end up augmenting or enhancing it later with visual effects.”
The fact that the sixth season is the grand finale for The Expanse may see Emmy voters finally honor the body of work with a nomination. “The most challenging thing is wrapping your head around things that may not sound that difficult initially, like deorbiting maneuvers where you slow going forward to be able to drop,” notes Bret Culp, Senior Visual Effects Supervisor of The Expanse. “We’ve done a good job and, as a result, it has been made clear to us that we are favorites with a lot of people at NASA and have an open invitation to visit the JPL [Jet Propulsion Laboratory].”
The usual suspects include Lost in Space, which has been rightly lauded for being able to turn practical locations into alien worlds and making biomechanical robotic beings that are empathetic and menacing. “The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship,” remarks Lost in Space Visual Effects Supervisor Jabbar Raisani. “The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”
“The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship. The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”
—Jabbar Raisani, Visual Effects Supervisor, Lost in Space
For those looking for major robot battles, Season 3 of Lost in Space will not disappoint. (Image courtesy of Netflix)
The Battle of New York scene from 2012’s The Avengers was used as a flashback in Hawkeye, which was released as a limited series in 2021. (Image courtesy of Disney)
A welcome return to the world created by Gene Roddenberry is Patrick Stewart reprising his signature role of Jean-Luc Picard in Star Trek: Picard. (Image courtesy of Paramount+)
Returning for sophomore seasons are Star Trek: Picard and Raised by Wolves, with the former mining the fan adoration for the Starfleet officer portrayed by Patrick Stewart and the latter infusing Alien mythology into the android survival tale produced by legendary filmmaker Ridley Scott. “The hardest sequence to design, create and execute for Raised by Wolves was the outer space sequence between Mother and the Necro serpent in Episode 208,” reveals Raised by Wolves Visual Effects Supervisor Raymond McIntyre Jr. “The flying Necro serpent is lured away from killing Campion by Mother, who leads the serpent into outer space in order to attempt to kill it. This scene was added deep into postproduction, and visual effects was tasked with designing an entire sequence from scratch as no live-action footage existed. Visual effects designs included the flying serpent, lighting design in outer space, nebulas, the planet Kepler 22B seen from this viewpoint, Mother’s new kill scream and a visualization of the failure of the EMF dome protecting this area of the planet. Execution involved creating realistic camera motion for each shot, and beauty lighting with sun flares, allowing for dirt on the lens to show up during flares, all while rendering fully CG shots.”
Making their debuts are Obi-Wan Kenobi, which has Ewan McGregor reprising his role as the legendary Jedi Master from the Star Wars prequel trilogy, and Star Trek: Strange New Worlds, an exploration of life on the USS Enterprise under the command of Captain Christopher Pike; both of them serve as prologues to the original movie and television series and have the best chances to get nominations for their respective franchises, especially if a proper balance is struck between nostalgia and canon expansion.
Then there is the matter of art imitating life, which will resonate with some while being too close to the bone for others: the viral mayhem portrayed is even more devastating and required extensive invisible effects to paint out modern-day life. In Sweet Tooth, a pandemic causes hybrid babies that are part human and part animal, with the adolescent protagonist being half deer, while Station Eleven focuses on humanity trying to rebuild society after a virus has decimated the population, and See envisions a future where blindness has reached epidemic proportions.
A favorite to win at the Emmys is Foundation, which features stellar environments throughout the Apple TV+ series. (Image courtesy of Apple TV+)
A planet gets destroyed amongst the purple haze in the Disney+ series Loki. (Image courtesy of Marvel Studios)
A surreal situation for the cast and crew of Station Eleven was shooting a story about a pandemic during one. (Image courtesy of HBO)
Animal/human hybrids populate the world of Sweet Tooth because of a deadly virus. (Image courtesy of Netflix)
Serving as dark social commentary on the growing financial divide is Squid Game, which combines elements of The Most Dangerous Game, childhood games and Dickensian debt into a ratings sensation for Netflix, and is a strong contender to upset the voting establishment. “The game spaces in Squid Game were unique and something we had never experienced before,” states Cheong Jai-hoon, Visual Effects Supervisor of Squid Game. “What we wanted to achieve from the settings of Squid Game was a fabricated yet realistic look, and it was quite challenging to balance the two conflicting characteristics. Especially in Episode 107, characters play the game of Glass Stepping Stones from high above the ground, and we had to create an environment that would immerse the viewers in the fear and tension. We put the most effort into deciding the depth from the stepping stones to the ground and the overall scale of the whole setting. We could have easily exaggerated, but we strived to find the right balance between what seemed fake and realistic, as it was more difficult than we thought.”
Also present is the author outdone only by the Bard himself when it comes to the number of film and television adaptations of his works: Lisey’s Story was conceived by prolific horror maestro Stephen King and has supernatural unrest intersecting with personal trauma. Comic book adaptations are not in short supply. A superhero who has a sharp wit and archery skills is paired with a like-minded protégé in Hawkeye, which channels Shane Black’s penchant for Christmas, action sequences and odd-ball comedic pairings. For those wanting an irreverent take on the genre, James Gunn helms the small screen adaptation of Peacemaker, where an extremist murderer embarks on a quest for peace. Moon Knight introduces the Marvel Studios equivalent of Batman, but with an Egyptian god reincarnation twist that raises questions about the sanity of the main character.
Superman & Lois reimagines The Daily Planet colleagues as a married couple trying to balance domestic life and a rogues’ gallery of high-flying adversaries. “If Superman is fighting someone in the air where they would both be horizontal, it was much more time efficient and easier on the actors if they can be vertical,” states cinematographer Stephen Maier, who added a physical camera shake for the sake of realism. “The stunt team will often go away to design or rehearse something, do their previs that they film on their iPhones, cut it together and show it to us. We have a close collaboration with special effects in regards to atmospheric smoke and haze. The gags that they come up with help to exemplify the strength of Superman, such as him lifting a car.”
The growing demand for content and the acceptance of visual effects as a primary tool of potential nominees reflect how far the production quality of television and streaming shows has come in expanding the scope for creatives with a theatrical sensibility. It is because of this that the Primetime Emmy Awards have become as fascinating to watch as the Academy Awards, as both showcase the very best of what can be achieved when talented digital artists get to contribute to the storytelling. Undoubtedly, the eventual winner will encapsulate the highest level of creative and technical ingenuity achievable under current circumstances and will serve as a building block for what is to follow.
By TREVOR HOGG
MPC Episodic created a post-apocalyptic environment for The Witcher. (Image courtesy of MPC and Netflix)
When it comes to witnessing what is achievable with visual effects, no longer does one have to go to a theater, as high-end episodic has essentially become a long-form cinematic experience that can be enjoyed by turning on a television or mobile device. This is not going to change with streamers spending billions of dollars to create content to stand apart from their like-minded competitors. The result is an impressive array of shows that are not lacking in storytelling ambition, whether it be The Wheel of Time, The Witcher, Foundation or The Book of Boba Fett. Virtual production has become synonymous with The Mandalorian, but this innovative methodology is only an aspect of the visual effects landscape which continues to evolve technologically. What does the future look like for the visual effects industry and episodic productions in the pandemic and post-pandemic era? This is a question that we try to answer by consulting the players responsible for producing the wealth of content that is available for viewers to watch.
Robin Hackl, Visual Effects Supervisor & Co-founder, Image Engine
“The requirements of television work are identical to feature film work in many ways. But back then there was much less resolution involved in the final output. Interestingly, we became known as a television visual effects house, and that precluded us from actually doing feature film work. It came with a stigma back in those days and was a large barrier that we had to break through. District 9 was a tipping point of recognition of us being able to execute on large-scale work.
“Shawn Walsh [General Manager and Executive Producer, Image Engine] has done a good job of holding the line. Placing the value on what we deliver to the client and making them understand what that value is and why it is of value. The shortened timelines have been the long-term progression ever since I could remember. Coupled on top of that are the demands. Now the expectations are far greater than what they were. Where is that breaking point? It is up to us to hold the line as best as we can and inform our clients what our capabilities and capacities are in order to avoid that.”
Image Engine, which contributed to The Mandalorian, was originally seen as a television visual effects studio, making it difficult to garner film work, but that paradigm no longer exists. (Image courtesy of Image Engine and Lucasfilm)
“Now the expectations are far greater than what they were. Where is that breaking point? It is up to us to hold the line as best as we can and inform our clients what our capabilities and capacities are in order to avoid that.”
—Robin Hackl, Visual Effects Supervisor & Co-founder, Image Engine
Drew Jones, Chief Business Development Officer, Cinesite
“You’re fine-tuning the teams of people attached to particular projects, ensuring that you have the right personalities dealing with the right style of work so that you can shortcut the processes and still meet the quality threshold that it needs to. You haven’t got the luxury of time to develop an idea across many months for the most part.
“Vendors having concept artists and art departments in-house definitely makes for a quicker, more cost-effective way to get closer to an answer within the visual effects post-production environment. We will often use conceptual artists to build imagery quickly to present an option to a production rather than go through a long gestation period of a CG build and compositing to get an idea across.
“There is more exploration into ideas through streamers. The projects, scripts and series are often filled with quite fantastical ideas that may have never seen the light of day on the big screen. The content I don’t think has changed. I don’t feel like we’re doing anything outrageously different. All visual effects have a complexity component to them, and at the end of the day it comes down to how far the directors want to push their thoughts and ideas.”
ILM had fun dealing with the Loki variants, including an alligator, for Marvel Studios and Disney+ series Loki. (Image courtesy of ILM and Marvel Studios)
Serving as a bridge between Seasons 2 and 3 of The Mandalorian is The Book of Boba Fett. (Image courtesy of ILM and Disney)
Final graded image by DNEG that was shot against greenscreen for Star Trek: Discovery. (Image courtesy of DNEG and Paramount+)
“The area that’s getting the most attention at the moment … is facial replacement work, with articles and papers going in-depth about how AI and computational analysis are making those kinds of computer-generated content far more photographic than before. It’s definitely an area that could lead to some very different approaches as to how visual effects are fundamentally implemented.”
—Paul Riddle, Executive VFX Supervisor, DNEG
A massive tarantula was created by Image Engine for the reimagining of The Twilight Zone. (Image courtesy of Image Engine and Paramount+)
Janet Muswell Hamilton, Senior Vice President, Visual Effects, HBO
“Right now, you have a lot of executives and post executives who have been doing a good job of producing the visual effects, who needed additional help because their slates were busy. The industry has just exploded. Being able to take work off of their plates to help them find heads of departments and facilities, and to get their heads around budgets – that was the first thing I did. But what I needed was processes in place in order to make it easier for me so I wasn’t working in 10 different ways, because HBO has been a bespoke studio. I am a fan of tools and processes that help us with the creative process.
“House of the Dragon is utilizing the LED screens at Warner Bros. Studios Leavesden for a whole bunch of sequences. It’s 360 [degrees] and has a retractable roof. The ability to shoot magic hour for about a week is incredible. Yes, you need to do it upfront. Yes, you need a director who is willing to go that way. Ultimately, when you start to see the results, how beautiful things look and the stories you can tell that you couldn’t tell before because you couldn’t go there or afford it – it’s going to revolutionize how we do things. It’s a technology that is here to stay. It was an unexpected benefit of the lockdown. My biggest desire is to never ever shoot another greenscreen driving shot!”
Vision starts to disintegrate courtesy of Digital Domain for the Marvel Studios and Disney+ series WandaVision. (Image courtesy of Digital Domain and Disney)
Alex Hope, Co-CEO, beloFX
“The biggest barrier to growth for many visual effects companies for many years has been finding talented artists. There is a finite global talent pool, but it is one we are all working hard to build. We’re all making huge efforts to train and develop visual effects talent at every stage, whether that’s in college, at entry level into the industry or once people have gotten into the industry. Many visual effects companies are getting behind career development for artists. In the U.K. we are helped by organizations like ScreenSkills, who standardize training at various levels and ensure that the industry is working to support the education sector in bringing new talent in.
“Visual effects is perhaps the fastest-growing component of the film and television industry. It’s fantastic that we’ve seen an explosion in content creation of all types, and we’ve seen a consequent growth in demand for visual effects, so certainly the money spent on content creation is coming through to all parts of the industry, including visual effects. As we see more localized production for streamers, it’s going to be really interesting to see what opportunities this provides for partnerships between local visual effects companies and those companies in more established centers, like the U.K. and Canada, and that’s very exciting and interesting to us at beloFX.”
Lucy Ainsworth-Taylor, CEO & Co-Founder, BlueBolt
“With the global demand and need to get shows finished, work is being spread everywhere, often disregarding the rebate. We still cannot compete with the Indian prices, but the flip side is that the talent coming out of India now adds to the international remote marketplace. Netflix purchasing Scanline VFX is not a game-changer at all. Studios have purchased facilities before, and as long as they can keep feeding the work into the facility, it will work. With the content Netflix is making at the moment, it makes sense, but I would assume they should probably buy many more facilities for the amount of work they require! Scanline is a well-respected visual effects house; does this mean they will now only work on Netflix shows?”
Bubbles were digitally created by Milk VFX for the outer space suffocation scene in Intergalactic. (Image courtesy of Milk VFX and Sky One)
The robot battle scene in Season 3 of Lost in Space. (Image courtesy of Digital Domain and Netflix)
An actual helicopter was used to create the impression of a spacecraft landing on the water in a scene from Foundation. (Images courtesy of Important Looking Pirates and Apple TV+)
The big screen gets adapted for the small screen with Image Engine bringing the world of Snowpiercer to life. (Image courtesy of Image Engine and TNT)
John Fragomeni, Global President, Digital Domain
“Working on award-winning projects like WandaVision, Lost in Space, Carnival Row and Loki was basically like making six to eight mini-films. We use the same tools on episodics that we use on features, and often the same team of artists. That has helped to accelerate our development on some of the tools we use, giving us the ability to handle the volume of work while still delivering quality.
“Some builds tend to lend themselves to features. For instance, the Free City game world we made for Free Guy or the 2.5 miles of New York City that we recreated for Spider-Man: No Way Home. But that doesn’t mean that you couldn’t do that for an episodic, given enough time and budget.
“One thing we are seeing more and more of on the episodic side is that the productions are coming to us with a detailed vision of what they want for the entire season. This helps us forecast schedules more finitely and identify breaking points when it comes to tight deadlines. From that, we can determine with the production where we can best serve the visuals, then coordinate with any other visual effects vendors the production may bring in. One of the more interesting by-products of the rise of elevated quality effects in episodics is that studios that used to compete for the same projects are now partners. As the demands for effects grow, we’ll probably see more groups involved.”
Rising Sun Pictures was part of the visual effects team on the live-action remake of Cowboy Bebop for Netflix. (Image courtesy of Rising Sun Pictures and Netflix)
Michelle Martin, Chief Business Development Officer, Milk VFX
“The speed with which streaming content has grown globally has given VFX houses the opportunity to raise the bar in VFX, to create high-end content for tentpole series and feature-length projects, in turn giving a wider range of artists the opportunity to work on interesting projects. Standards have certainly been raised.
“The networks and studios are engaging us earlier and are keen to discuss capacity with us, as well as share more information regarding their up-and-coming slates. There’s a keenness to share information and artists are being block-booked ahead of productions starting, which is where we should be to help develop and visualize the storytelling. We are seeing a very different landscape to where we were 10 years ago.”
Christopher Gray, Global Managing Director, Episodic, MPC
“In the short term, we’re already seeing the wider application of these techniques [virtual production, real-time, machine learning]. Every show we are working on in episodic employs at least one of these toolsets in some capacity, but I think the greatest opportunity, as silicon begins to catch up, is the ability to iterate more quickly, particularly in animation. In the next five years, we’ll see more widespread adoption of real-time and near real-time GPU rendering for final pixel. The technology is close, but it’s the development of existing workflows and the continued widening of the knowledge base that need to happen to capitalize on this moment.
“We’re seeing the rise of great new prospects for counter-programming – film and episodic projects that would have struggled to find an audience five years ago – and we’re seeing more and more of them now thanks to strategic work in this space by Amazon and Apple leading the charge, and Netflix particularly so, with its commitment to international and local-language film and series and limited theatrical releasing. The exciting aspect is that as studio operations become more integrated and these two mediums converge, film production can benefit greatly from these efficiency gains, and episodic production can benefit from a knowledge base carved at the highest level.”
David Conley, Executive VFX Producer, Weta FX
“The bidding process has changed due to sheer demand, and streamers have a completely different greenlight process than the traditional studio system. Where once we had the luxury of bidding over several weeks against a schedule that was fairly developed, and you could bid down to the crew weeks, we are now being asked to turn around more bids in less time, days even. To drive confidence in our bidding system, we’re relying on a more robust set of analytics to help drive the bidding process. That said, this means we really rely on the perception and skills of our bidding team because no project is similar, and analytics and performance metrics can never replicate the creative process. We rely on our bidding team to have great creative skills when reading and breaking down a script before applying metrics based on analytics.
“I would say there are greater narrative risks being taken in episodic, but more technical and creative risks in film. That said, the bar is raised across the board for both episodic and theatrical. There’s more being spent in episodic [than previously] but at a lower price point per shot, with the expectation that results are feature-level quality. To date, features are still where, as an industry, we are being asked to produce groundbreaking visual effects. However, streaming services mean that mid-range projects now have direct access to wide audiences, so our challenge becomes leveraging our emerging technologies from the feature side to help produce high-end-caliber visual effects across multiple episodes, within the schedule demands of episodic, at a viable price point that works for our industry.”
“There is more exploration into ideas through streamers. The projects, scripts and series are often filled with quite fantastical ideas that may have never seen the light of day on the big screen.”
—Drew Jones, Chief Business Development Officer, Cinesite
Fantasy has become a prominent genre on the streaming services, with DNEG taking part in Shadow & Bone for Netflix. (Image courtesy of DNEG and Netflix)
Making use of extensive virtual production is the HBO prequel House of the Dragon, which stars Emma D’Arcy as Princess Rhaenyra Targaryen and Matt Smith as Prince Daemon Targaryen. (Image courtesy of HBO)
Måns Björklund, Executive Producer, Important Looking Pirates
“Virtual production, real-time and machine learning are becoming more common, or even standards nowadays. With more work moving into prep instead of post, visual effects becomes even more involved before anything has been shot. For sure, there is more work around than ever. However, finding artists is harder than ever, and costs have also risen. We spend a lot of time finding artists and developing them in-house. Instead of just having a few clients doing high-end work, nowadays the demand is almost like doing 10 mid-to-high-end features per season of episodic work. Due to shorter turnarounds, there is more of a ‘going direct to the goal instead of trying all possible versions’ approach. The room for experimentation depends mostly on when you get involved with a project and how long a schedule you have. I feel there is more room to rebid and not have ‘fixed’ bids. It’s a constant discussion and collaboration with clients to get the best results within the budget and time. Important Looking Pirates don’t have facilities around the world. We are trying to do our thing and focus on the quality of our work.”
Paul Riddle, Executive VFX Supervisor, DNEG
“One of the main considerations in approaching our episodic work at the moment is the diversity within that work, whether that’s the creative requirements, the timescales involved or the financial requirements of the production. There’s so much scope for projects of varying sizes and complexities within episodic that there’s a real need for a degree of specialist skills and creative approaches in our artists across the globe.
“There has been an interest lately in ‘deep fake’ AI and machine learning, and how those things will eventually come to be utilized within our industry. There’s definitely an interest there and a buzz around it, and we’ve seen clients wanting to understand how it can be utilized sensibly without doing it just for its own sake, using the technology for a real creative impact.
“The area that’s getting the most attention at the moment, it seems, is facial replacement work, with articles and papers going in-depth about how AI and computational analysis are making those kinds of computer-generated content far more photographic than before. It’s definitely an area that could lead to some very different approaches as to how visual effects are fundamentally implemented.”
“[T]he ‘feast-or-famine’ nature of the visual effects industry has somewhat dissipated to allow for visual effects facilities to have a more stable financial footing and thus provide more stability for their employees.”
—Stefan Drury, Executive Producer, ILM TV
What was originally meant to be practical became a CG Ampersand created by ILM for Y: The Last Man. (Image courtesy of Hulu and ILM)
Expanding upon the Vikings franchise for Netflix is Vikings: Valhalla, with visual effects produced by MPC Episodic. (Image courtesy of MPC and Netflix)
ILM was recruited to produce Nivellen, which was a combination of practical and digital effects, for The Witcher. (Image courtesy of ILM and Netflix)
Stefan Drury, Executive Producer, ILM TV
“It’s been the busiest I’ve seen the industry in the 24 years I’ve been in it! More importantly, it’s also consistent, with a steady stream of projects in development and post at almost all times. It certainly feels like we’ve been able to better forward-plan for our crew, and the ‘feast-or-famine’ nature of the visual effects industry has somewhat dissipated to allow for visual effects facilities to have a more stable financial footing and thus provide more stability for their employees.
“There is still room for experimentation, especially through close collaboration with writers/director/showrunners in the episodic format. We’ve been involved with several episodic shows in which we’ve been part of the development, pitch and greenlight process, which have involved working closely with the clients to find creative methodologies to make projects possible. That said, this experimentation generally has to happen early in the production, agreed upon by all and adhered to, as the waterfall nature of episodic delivery and the sheer volume of material to be reviewed means post schedules leave little room for misdirection.”
Meredith Meyer-Nichols, Head of Production, Rising Sun Pictures
“In 2017, 100% of RSP’s work was on theatrical releases, and in 2021 we were 50/50 streaming to theatrical. As we move into 2022, this trend continues. Streaming projects are predominantly series in nature, at about 35%. From our perspective, on the projects that we’re working on, they have large-scale budgets and demand the same kind of quality that RSP is known for. They’re essentially gigantic movies with thousands of visual effects shots and hours of content. RSP, in partnership with the University of South Australia, is delivering accredited courses in visual effects. In classrooms set up to mirror real-world production environments and with instructors who are working professionals, RSP rigorously trains students in the technologies and techniques they’ll need to succeed in an expanding global film industry. We have done a remarkable job of turning out job-ready graduates and have found that our graduates are in high demand with most major visual effects studios.”
By IAN FAILES
A final shot from Mulan, on which Diana Giorgiutti was Visual Effects Producer. (Image courtesy of Walt Disney Pictures)
It’s been a couple of highly unusual and disrupted years in the visual effects industry. Among the many weathering the storm have been VFX producers, those responsible for managing projects, undertaking and reviewing bids, tracking VFX shot delivery and so many other aspects of the visual effects process.
Here, several visual effects producers – some operating for film studios, some working as independent contractors, some at VFX studios, and some who handle both VFX supervision and producing – discuss the biggest challenges they’re currently facing.
Our roundtable of producers includes: Diana Giorgiutti, currently on Dungeons & Dragons, after having worked on Mulan; Terron Pratt, who recently finished three seasons of Lost in Space before moving to post on Season 4 of Stranger Things; Hal Couzens, in post on Beast, with past credits including F9 and Dumbo; Karen Murphy-Mundell, whose recent films include Blade Runner 2049 and Gemini Man, in post on Black Adam; Mark Spatny, experienced both as a VFX supervisor (the Lethal Weapon and Station 19 series) and as a VFX producer (currently on The Peripheral); Scott Coulter, a VFX supervisor and producer for independent features, most recently Reagan; Annie Normandin, a VFX producer at Rodeo FX on Jungle Cruise, Shang-Chi and the Legend of the Ten Rings and Season 5 of Better Call Saul; and Anouk L’heureux, Vice President of Production at Rodeo FX, with VFX producing experience at several other VFX studios.
AN INTERESTING CHALLENGE: SO MUCH WORK
Diana Giorgiutti: “For me, the explosion of streaming content alongside theatrical releases has now created a situation where there is too much work and not enough crew to cover everything. This in turn leads to a lot of crew being thrown into positions they simply are not really experienced or qualified to do. The other key and equally important factor is that there are not enough VFX facilities to easily do all the VFX work across all the varying release timelines. You really have to be on your game to make sure you are doing deals well ahead to guarantee VFX capacity.
“For my current project, we awarded the work to our vendors well ahead of shooting, which is something I had not done before. And we awarded with only script pages as reference. There were little to no visuals at this point, so the award bids were very early and based loosely on words off the page. In turn, this has led to a lot of changes, from the award to actual shot turnovers. A lot changes from the script through prep as things are fleshed out leading up to shooting, and then shooting itself. Not to mention the many adjustments that happen getting a film greenlit.”
A greenscreen stuffy version of the baby elephant on the film Dumbo. Hal Couzens was VFX Producer. (Image courtesy of Walt Disney Pictures)
“Not only does one have a diplomatic role to play straddling a few fences to ensure the right info is given at the right time and in the right way, one also gets to see all stages of the project from development, prep, shoot, post and wrap, occasionally resulting in a lovely moment – being ignored – on a red carpet.”
—Hal Couzens, VFX Producer
Mark Spatny: “Hands down, the biggest problem for VFX producers right now is the global glut of work filling up every VFX facility in the world, thanks to the explosion of high-production-value streaming shows piling in on top of the normal big tentpole features. Prior to March 2020, VFX vendors were constantly knocking on doors looking for work and undercutting each other to get it. In today’s climate, I’ve had AAA multi-national vendors and small boutiques alike turn down $2 million of work with a relatively easy eight-month schedule, simply because there aren’t enough artists and in-house supervisors to get the work done. Even simple roto and paint work that previously would have been completed in a week by an outsource vendor has to be planned months in advance. And prices have soared accordingly.”
Terron Pratt: “Most recently, one of the biggest challenges we’re facing is limited artist resources. With so much content being created right now, vendors all over the world are booked for months, even years out. On Season 3 of Lost in Space, thankfully, we were one of the first shows to get back to production, which gave us a slight edge for booking talent when it came time for post. We could see it coming and built a season-wide plan for distributing the work as early as we could. Even with that, we still felt the pinch toward the back half of post, forcing us to spread the work a bit more than was originally planned.”
On Blade Runner 2049, Murphy-Mundell VFX-produced effects ranging from holograms to vehicles and digital humans. (Image courtesy of Warner Bros. Pictures)
A greenscreen plate of actor Damon Wayans in the TV series Lethal Weapon, on which Mark Spatny was Visual Effects Supervisor. (Image courtesy of Mark Spatny)
A final visual effects shot by Rodeo FX for Season 2 of The Witcher. (Image courtesy of Netflix)
A final visual effects shot on Season 3 of Lost in Space. (Image courtesy of Netflix)
PANDEMIC VFX, THE RISE AND RISE OF WFH, AND NEW TECH
Scott Coulter: “Dealing with COVID has been the biggest challenge for me, namely staffing. I have had to rethink every aspect of working in film. From set work to post, we really have taken the idea of remote work to heart. The primary lesson for me is that specialization is not as valuable as it was before. Now you have to wear many hats simply because there are fewer people on set.”
Annie Normandin: “Now that remote work is becoming a new reality, we’ve had to rethink the ways we approach projects, and stay together and cohesive as a team while continuing to offer the clients new possibilities. I mean, let’s take Shang-Chi, for example. When we were tasked to do the scaffolding fight scene in Macau, the client had planned for it to be a matter of sending a team on location, getting footage, and then we would composite it in the background. But with the travel bans, the production had to change the approach, and they asked us to digitally recreate the entire city. This was a completely new process and required a lot of collaboration with the client to get the best possible outcome. Adaptation is key.”
Scott Coulter was Visual Effects Producer on this Layton’s “Mystery Journey” commercial featuring a CG hamster. (Image courtesy of Scott Coulter)
Anouk L’heureux: “I would add that establishing a good partnership with the different clients is key to the success of our shows. To listen to their needs, manage expectations and work with them to find creative solutions. I also think that today one of the other main challenges is how to keep that sense of community and togetherness as a team when everybody is alone at home. As a producer, we also have to deal with the human aspects of our work.”
Hal Couzens: “During the pandemic, a lot of films went to tell stories in locations requiring few, if any, extras. My most recent [project] had us in South Africa on the borders of Zimbabwe and Namibia in a tented camp. Naturally, the pressure of this landed mainly on the unit and transport departments. However, being so remote there was almost no cellphone connectivity or internet, and base camp and set were far apart. As a unit, we had 40Mbps for the entire operation. No, not 40Mbps per person or department. What internet there was, we needed for production to run the operation, including getting the rushes back to the editors in the U.K.
“It was back to the days of runners delivering messages as even sending WhatsApp messages was a significant challenge, let alone running a previs operation back in the U.K. Given our distance and the need to remain extremely tight during a pandemic, we worked without an actual on-set server for the first seven weeks of a data-intensive shoot. This created a number of expected and unexpected issues. One doesn’t appreciate multiple users operating on the same system together until one can’t! Herculean efforts in data management with rather long hours from our coordinators got us through. Not an experiment I intend to repeat soon!”
“The primary lesson for me is that specialization is not as valuable as it was before. Now you have to wear many hats, simply because there are fewer people on set.”
—Scott Coulter, VFX Producer/Supervisor
A scene from Black Sails. Terron Pratt worked on the show as Visual Effects Producer. (Image courtesy of Starz)
Gray and chrome balls and a Macbeth chart are captured on the set of Season 3 of Lost in Space. (Image courtesy of Netflix)
The entire Macau surrounds were synthetic in this Rodeo FX sequence from Shang-Chi and the Legend of the Ten Rings. (Image courtesy of Walt Disney Pictures)
Karen Murphy-Mundell: “Technology-wise, for me, the latest challenge is weighing the pros and cons of the latest LED walls and on-set virtual camera system technologies. We have to determine the benefits in quality of the final shots as well as the cost of using the technology compared to old school/traditional methods. Putting up an LED wall right now is an expensive venture. There is real pressure in determining what you can save in post and quantifying the value of being able to provide temp shots quicker and screen a more complete film earlier.”
WHAT YOU MIGHT NOT KNOW ABOUT VFX PRODUCING
Spatny: “The VFX producer is an equal partner to the VFX supervisor on a project and is every bit as responsible for its success – or failure – as the supervisor. At a facility, many artists just think of producers as the timecard police who pinch pennies on every shot. They aren’t aware that every project is an intricate, constantly moving puzzle that the producer has to solve, with a million variables and moving parts all over the world. It’s frankly embarrassing that one of our major industry awards doesn’t recognize and include the VFX producer. I’m glad the Emmys and the VES Awards are more progressive in that way, and I’m proud to say I had a hand in making that happen for both organizations.”
On the set of Station 19. (Image courtesy of Mark Spatny)
L’heureux: “I think people might see a producer as someone who takes care of business and the admin part. But when working with a VFX supervisor, you need to be as creative as they are so that you can help them, provide the tools they need and understand what the client has in mind. And through that creativity, a symbiosis between the VFX supervisor and the VFX producer occurs.”
Normandin: “The VFX supervisor and the VFX producer are a real duo. We really have to work in sync with a great team around us to bring a sequence to life, so you need to know your partner and trust them.”
Coulter: “What people don’t always know about VFX producing is that people come from all sorts of areas. My background originally came from practical effects. Over time, this has proven to be a fantastic training ground for visual effects. Even in today’s productions, I am always offering a practical solution, such as using a translight instead of a greenscreen. On a large project like Automata, I proposed full-scale practical puppets instead of the planned pure CG robots. This saved production countless dollars and provided a superior result.”
Murphy-Mundell: “I think that people outside film communities aren’t aware that the job as a VFX producer involves evaluating the daily changes in all departments while in film production and how it affects the VFX plan. Decisions made on set in stunts, art department, costumes, construction and script can have big consequences for VFX costs and methodology many months into post.”
Giorgiutti: “As a studio-side VFX producer, most people probably don’t realize that we are on from the very beginning of a film to the very end. A lot of the time the VFX producer starts even ahead of the VFX supervisor. Other than the director and producers, VFX is the only department that is on a film all the way. Bearing this in mind, I often say that my job as a VFX producer is 50% budget and management of filmmakers and our VFX teams, with the other 50% being counseling. This counseling is totally tied into the plethora of politics we have to navigate to keep a good balance with the filmmakers, studios, VFX crew and VFX facilities.”
Couzens: “Indeed, often the VFX team and the VFX producer are the longest-running crew members on a production, aside from director, producers and occasionally an accountant. The result of this is that the VFX producer provides a continuity that runs the ‘full length of the counter’ and can thus provide support in making choices, creative and otherwise, to all sides of the filmmaker/studio/editorial/finance and facility equations. Not only does one have a diplomatic role to play, straddling a few fences to ensure the right info is given at the right time and in the right way, one also gets to see all stages of the project from development, prep, shoot, post and wrap, occasionally resulting in a lovely moment – being ignored – on a red carpet.”
Pratt: “VFX producing doesn’t start in post. I think it’s important that the VFX team is involved as early as possible, even at the script stage, including the VFX producer. The conversations early on are not solely creative but also involve securing resources [production team, on-set team, previs artists, budgeting and scheduling], understanding which vendors are going to be available when and engaging them early. The job is as much about developing relationships and building a team as it is about hitting a deadline. The only way to hit that deadline is with the right teammates.”
By CHRIS McGOWAN
Pixar’s Turning Red is a coming-of-age fantasy/comedy featuring 3D animation with anime influences, directed by Domee Shi. (Image courtesy of Pixar/Disney)
Animated features and series of all types and stripes are launching this year, thanks to growing global demand, an anime gold rush and the impact of the streamers. The diverse array of titles includes both high-grossing sequels to franchises aimed mostly at children and fare for teens and/or adults that pushes the animation envelope in terms of content and style.
Animated movies have already established themselves as a significant part of the movie business, having achieved formidable box office grosses. Four animation titles in the top 25 films released in the past four years have grossed over $1 billion worldwide, led by Disney’s Incredibles 2 (2018), The Lion King (2019), Toy Story 4 (2019) and Frozen II (2019), according to Box Office Mojo. More than 50 animated titles have topped the $500 million mark. Animation has also established a strong streaming presence.
“There has been an explosion of animation over the last few years in all areas and styles,” says David Prescott, Senior Vice President, Creative Production of DNEG Animation, which co-produced the Disney/20th Century Studios hit Ron’s Gone Wrong in 2021 and worked on Paramount’s Under the Boardwalk for this year and Alcon/Sony’s Garfield project for 2024.
Kane Lee, Head of Content for Baobab Studios, comments that “there has been a real sea change in supply and demand for animation content, both in general and across genres.”
According to Ingrid Johnston, Animal Logic Head of Production, “Right now, we have a great opportunity to see a wide range of animation styles and stories told in animation. Flee [the animated Danish docudrama] being nominated for this year’s Oscars is a great example of this. The success of films like [Sony’s] Spider-Man: Into the Spider-Verse is showing that audiences are engaged in different styles of animation. We have already seen an increase in the amount of animated content for adults, such as Love, Death + Robots, and filmmakers are seeing animation as a way of telling more diverse stories.” Animal Logic co-produced Sony’s Peter Rabbit 2: The Runaway (2021) and is working with Netflix Animation on The Magician’s Elephant and Warner on Toto, the latter two due in 2023 and 2024, respectively.
High-profile 2022 titles include Paramount’s Blazing Samurai, Universal/Illumination’s Minions: The Rise of Gru, Universal/DreamWorks’ The Bad Guys and Puss in Boots: The Last Wish, Sony’s Spider-Man: Across the Spider-Verse (Part One), Warner’s DC League of Super-Pets, Disney’s The Ice Age Adventures of Buck Wild and Strange World, Disney/20th Century’s The Bob’s Burgers Movie and Disney/Pixar’s Turning Red and Lightyear.
Plus, there are animated series bowing in 2022 that will join several dozen already available. New arrivals include Dan Harmon’s Krapopolis for Fox, Amazon’s The Legend of Vox Machina and The Boys Presents: Diabolical, and the Disney sequel The Proud Family: Louder and Prouder.
The streamers have jumped into animation in a big way, either as producers or distributors. Netflix has had the biggest footprint, acquiring or producing numerous animated films. Several 2022 Netflix releases have auteurs at the helm: Guillermo del Toro’s Pinocchio, Henry Selick’s Wendell and Wild, Richard Linklater’s Apollo 10 1/2: A Space Age Childhood, Nora Twomey’s My Father’s Dragon and the stop-motion horror anthology The House, written by Enda Walsh. Netflix is also distributing Rise of the Teenage Mutant Ninja Turtles: The Movie, The Sea Beast and Riverdance: The Animated Adventure this year. Meanwhile, Sony Pictures Animation’s Hotel Transylvania: Transformania is distributed by Amazon Studios and Paramount/Skydance Animation’s Luck by Apple TV+.
Lee comments, “As we move into streaming and other new platforms, the playing field is more level, and we have more ready access to global content than ever before. So, the audience’s perception of what animation is, can be and who it’s for – especially here in the U.S. – is changing.” Lee’s firm, Baobab Studios, makes both animated films and interactive animation, often releasing titles on multiple platforms.
Twenty-seven years after Toy Story, Pixar used computer animation tools in Lightyear that were “vastly superior in terms of scale and complexity” to what was possible in 1995. (Image courtesy of Pixar/Disney)
Emiko (Kylie Kuioka) in Paramount’s Blazing Samurai, animated by Cinesite. (Image courtesy of Paramount Pictures)
Matt Groening (The Simpsons) created and co-developed Disenchantment, which was produced by Rough Draft Studios and the ULULU Company. (Images courtesy of Netflix)
WIDE VARIETY, DAZZLING DIVERSITY
VFX firms Weta FX and RISE Visual Effects Studios have expanded into animated film production, joining the likes of Animal Logic and Cinesite, who are well established in producing animated features. Weta FX CEO Prem Akkaraju comments, “Weta Animated has been something that has been discussed for years within Weta. We have such a wealth of storytelling talent within the company that creating a business structure around them to help generate original content really felt like the logical next step. Weta has also developed a robust pipeline of tools over the years that give artists and directors a broad palette to work from in creating a style that best suits their creative project. Now is the perfect time for us to make this move.”
Johnston notes, “Factors such as an increase in animation studios, with traditional VFX studios starting to make animated films, and audience demand for content have expanded the types of animated films being created. Also, storytellers and filmmakers are seeing that not only can you tell stories beyond traditional family films, but that there’s also an audience who want to see them. The idea of traditional animated content is really being challenged, and we’re also seeing how it can work alongside other forms of content to tell rich, complex stories.”
“There has been an explosion of stylistic exploration in computer feature animation in recent years which I find super exciting,” says David Ryu, Vice President, Tools for Pixar Animation Studios. “Projects big and small are experimenting with looks, and I love seeing the range of looks projects are finding. There’s borrowing from so many influences: 2D hand-drawn animation, stop-motion, live-action film, so many lineages of 2D traditional art. And the ways we see the principles of all these things being put together to make something new is exciting and inspiring. This is going in so many directions, and I’m excited to see what looks arise over the next few years and what that means in terms of the technologies and pipelines we use to make them.”
Sony Pictures Imageworks contributed to the visual effects on Hotel Transylvania: Transformania. (Image courtesy of Amazon Studios)
Elfo and Bean cross a bridge in Disenchantment, which was animated by Rough Draft Studios (Futurama). (Image courtesy of Netflix)
Animal Logic worked on Warner’s DC League of Super-Pets. (Image courtesy of Warner Bros. Pictures)
Cinesite Head of Animation Eamonn Butler points to many titles with NPR (Non-Photorealistic Rendering) styles, for example The Mitchells vs. the Machines (produced by Sony and distributed by Netflix), Sony’s Spider-Man: Into the Spider-Verse and Arcane (produced by Riot Games and Fortiche and distributed by Netflix), which utilize 3D-manipulated renders combined with hand-drawn 2D techniques, painterly lighting and clever experimentation with surfacing and form. Cinesite has itself created a painterly NPR look for Hitpig, an Aniventure film from author Berkeley Breathed. Butler says that it’s exciting to experiment “with design, animation style and lighting” to create unique and appealing looks for Cinesite’s movies. Cinesite also teamed with Aniventure on Blazing Samurai and Riverdance: The Animated Adventure.
NEW STORYTELLERS TELLING DIFFERENT STORIES
Looking at animation history, Weta FX’s Senior Animation Supervisor Sidney Kombo-Kintombo comments, “Animation was at first an art for the initiated only. It used to be expensive and only a very small group of people had the required expertise. But nowadays, animation is a very accessible door to producing and sharing a story. Thanks to online tutorials, student licenses for professional software and the generosity of studios such as Weta, knowledge and professional tools are being put at the disposal of whoever wants to learn, even in remote regions where the use of internet is still a luxury. Thanks to that, we have witnessed the emergence of new talents and storytellers that create content based on remote cultures, stories and legends. These changes have a refreshing and enriching effect on the entire animation industry. The art is getting richer with more diverse artists and storytellers representing a wider range of cultures, [and] the world is opening up even more to the fact that there is more than one way of animating.”
Animal Logic approaches each film it works on as a new opportunity to evolve artistically and technically. “We first consider the story and then we look for the best way to represent it visually,” according to Johnston. “This has allowed each of the films we’ve worked on to have their own distinctive look.” This includes the Warner-distributed Legend of the Guardians: Owls of Ga’Hoole, with richly detailed feathers and foliage, Sony’s Peter Rabbit films with realistic rabbits integrated with live-action plates, and Warner’s LEGO Movie franchise with its unique stop-motion style. “Even within the LEGO universe, each film explored what elements of real world or stop-motion would best suit the requirements of the film. And DC League of Super-Pets has a whole new look of its own, too.”
ANIME CONTINUES TO GROW AND EXPAND
Anime accounts for a growing portion of the global animation business. Demon Slayer the Movie: Mugen Train and Spirited Away have globally earned over $503 million and $396 million at the box office, respectively, and 20-odd titles are nearing or above $100 million, while innumerable series and movie sequels contribute to large totals for anime franchises.
Netflix has invested heavily in anime acquisitions and original programming. Hulu also has a large selection. Sony owns Crunchyroll, which, as of March, had more than 40,000 episodes, or 16,000+ hours, of a wide range of anime, according to Rahul Purini, Crunchyroll Chief Operating Officer. Looking back, he observes, “Animation in the West has primarily focused on children or comedy and the growth of anime and video games has helped create a generation that is much more comfortable with adult dramatic animation.” He adds, “Anime is not new, but it has grown exponentially over the last decade or so with the expansion of streaming platforms and expanded international rights and distribution. Many don’t understand that anime is not a genre in itself – there are many styles within it, like fantasy, action, adventure, comedy and more. And as investment in the anime ecosystem and industry increases, you will see the storytelling growing and expanding in all directions.”
NEW TECHNOLOGY UNLEASHES CREATIVITY
Pixar’s Toy Story (1995) was the first feature-length computer-animated film. This year, Pixar will launch its latest spinoff, Lightyear, about which Ryu comments, “It’s interesting, in terms of software, that we’re actually using lots of spiritually similar stuff! We are still using RenderMan, and our animation system Presto shares DNA with our old ‘Menv’ system used in those days. Of course, RenderMan and Presto are light years ahead of what they were in those days. We’re in a different universe in terms of the scale and complexity of what we can do. And looking at where we’re at now vs. where we were, it’s cool to see the sea change in terms of artist interactivity and how far we’ve come in terms of reducing the technical barriers to entry. Both of these are showing up on the screen in terms of the complexity and quality of what we’re making.”
Lighthouse Studios, which animated The Cuphead Show!, is based in Kilkenny, Ireland and specializes in 2D animation. (Image courtesy of Netflix)
Cinesite animated Riverdance: The Animated Adventure, directed by Eamonn Butler and Dave Rosenbaum. (Image courtesy of Netflix)
Turning Red has quickly become another top-notch addition to Pixar’s growing library of classic animated films. (Image courtesy of Pixar/Disney)
Kranz (Zachary Levi) in Mission Control in Richard Linklater’s Apollo 10 ½: A Space Age Childhood, which blends a unique combination of hand-drawn animation, live-action and CGI. (Image courtesy of Netflix)
Warner Animation Group, Animal Logic, DC Entertainment and Seven Bucks Productions teamed on the production of Warner’s DC League of Super-Pets. (Image courtesy of Warner Bros. Pictures)
Hank (Michael Cera) and Jimbo (Samuel L. Jackson) in Blazing Samurai. (Image courtesy of Paramount Pictures)
Prescott comments, “New technology always has an interesting effect on any form of creative storytelling. With animation, it is allowing filmmakers the chance to tell stories they were not able to really approach before. It is also allowing each project to have a look and feel that suits that particular project. We’re integrating real-time workflows and machine learning to develop new and cutting-edge interactive experiences for our artists, which is changing the animation production process, all designed to let creativity thrive.”
Cinesite Chief Technology Officer Michelle Sciolette, speaking of game engines such as Unreal Engine and Unity, notes that, “In the past few years, major technical advancements in the graphics-processing unit [GPU] of computers have enabled such engines to render production-quality imagery while maintaining their real-time speed. Cinesite, along with its production partner Aniventure, is currently developing an animated feature film utilizing game engine technology.”
Akkaraju adds, “The most exciting technology development is the inclusion of AI and machine learning techniques in the animation workflow. It has the potential to affect the artists’ day-to-day workflows as much as the transition to having computers handle the in-betweens. There is still much of the animation workflow that is manual and repetitive – tasks that are not aiding the creative process.”
ADAPTING TO THE PANDEMIC
The growth of animation was accelerated by adaptations to COVID-19. “With the arrival of the pandemic, live-action filmmaking was largely put on hold while the demand for animation accelerated – our mostly digital pipelines could adapt to remote work and such. So, in 2022 we’re going to see the fruits of that labor, and it’s only just the beginning,” says Lee.
During the pandemic, “live-action directors and studios were able to consider animated scripts, and now we’re seeing a large demand for animated films. The industry has never been busier,” comments Johnston.
“Audience demand for animation has also grown substantially over the last few years and people are consuming more and more,” observes Akkaraju. “One interesting by-product of this trend has been an expanding of audience perception of what animated storytelling can be. We’re seeing a wider range of stories and storytellers be embraced by the mainstream, and that’s great for everyone.”
Concludes Johnston, “The beauty of animation is that anything you can imagine can be created, so there’s a lot more freedom and scope within the world of animation.”
By TREVOR HOGG
Kristen Prahl, VFX Producer for Ghost VFX in Copenhagen, Denmark. (Image courtesy of Kristen Prahl)
Even though Copenhagen, Denmark, has been home for a decade, Ghost VFX Visual Effects Producer Kristen Prahl was born in the American South and raised in the Midwest, first living in Kentucky, then moving to Troy, Ohio, which is just north of Dayton. Her mother was an art major but decided to go into teaching because she thought there would be better career opportunities with a degree in education than in art. Childhood hobbies were shared amongst the siblings.
“My older sister and I were very involved in 4-H growing up. It was more ‘artsy’ though, and we’d do tons of craft projects. I made it to State a couple of years; otherwise, my early claim to fame was at the local county fair.” Movies were also part of her adolescent life. “Hands down, the movie that has left a lasting impression on me is Jurassic Park [1993]. I was only 11 then, so naturally I was that girl in the theater who screamed and jumped out of my seat during the Velociraptor kitchen scene. Another special movie for me is the original Lion King [1994]. I made my dad take me to the theater twice. My parents got the hint and took me on a studio tour at MGM where I got to see animators at work. From that day, I swore to my parents that that was what I was going to do when I grew up.”
Disappointment arose when, after high school, Prahl attended Ohio University, where she studied in the School of Art + Design with hopes of earning a major in Graphic Design. “Unfortunately, I didn’t make the cut, and I told my parents I was thinking of art history as my new major. My parents gave me the option of pursuing art history or choosing any school [within reason] that I wanted, without majoring in art history. I took them up on their offer and chose Savannah College of Art and Design (SCAD). After three years, I graduated with a BFA in Visual Effects.”
California soon beckoned. “With prospects looking slightly better in the early 2000s, I got to make the leap, and I owe my parents everything for that,” says Prahl. “During my last year of college, I went on a school trip to Los Angeles and managed to get an internship at Zoic Studios. Initial industry highlights include being a lip double on CSI and painting ooze on a ‘dead’ body for a forensics scene.”
An interesting early job for the aspiring artist was as a dust-busting lead at Digital Domain. “It’s not far off from regular housekeeping,” notes Prahl. “You essentially paint out any dust that might have gotten caught when film is scanned. Typically, you’ll just clone pixels from another frame or from another part of the same frame. Most films today are shot digitally, so dust-busting isn’t really needed all that much, but I had some great years at Digital Domain, which I’ll always cherish.” She then transitioned to be a rotoscope artist on Speed Racer and Star Trek. “I haven’t been on the box as an artist for some time now,” she admits, “but obviously the technology has gotten better. Nevertheless, the process of rotoscoping is still user driven, and so, until an AI learns this art form, it can still be extremely time-consuming.”
A temporary move to Europe became more permanent. “My boyfriend at the time [now husband] is Danish and wanted to try life a bit closer to home as he’d been stateside for over 10 years,” recalls Prahl. “I was totally onboard to give Copenhagen a try. We had only planned to stay for a year, but 10 years later and we’re still here. I was quite worried about being able to find the same type of work, but I was able to get my foot in the door at Ghost in 2011 as a freelance roto/paint artist. I bounced around at a couple of other companies, but always found myself back at Ghost. The atmosphere was similar to what I knew from Digital Domain, and a giant plus was that the working language was English.”
VFX Supervisor Ivan Kondrup Jensen, Prahl and Creative Director Martin Gårdeler representing the Emmy-winning Star Trek: Discovery Ghost VFX team on the red carpet. (Photo courtesy of Ghost VFX)
“Often, success is a perplexing combination of hard work and chance. Remember that you need both, and trust that you’ll stumble upon what you need, when you need it.”
—Kristen Prahl, VFX Producer, Ghost VFX
Ghost VFX had humble origins. “Ghost was originally founded by a few ex-LEGO employees working out of a garage [aka trailer],” explains Prahl. “They did mostly commercial work at first, but over the years local features were added, then Hollywood blockbusters. I remember when we were awarded work on Rogue One: A Star Wars Story. It was a milestone for the company and also one of my favorite shows to have been part of. Production has been at a ludicrous speed ever since, and today high-end streaming shows make up the lion’s share of what we do.”
Becoming a visual effects producer was a natural transition. “I’ve always been a bit compulsive in terms of organizing and planning,” Prahl acknowledges, “so when Ghost looked to expand their production group, I threw my name in the hat. I started out as an assistant on various commercials and local features, then moved into my first real producer role on Legendary’s feature Krampus and their first season of the TV show Colony. In more recent years, I’ve been primarily working as Ghost’s VFX Producer on Star Trek: Discovery.”
Prahl celebrates winning the 2021 Primetime Creative Arts Emmy for the Star Trek: Discovery episode “Su’Kal.” (Photo: Anna-Lene Riber. Courtesy of Kristen Prahl)
A milestone for Ghost VFX was being awarded work on Rogue One: A Star Wars Story. (Image courtesy of Ghost VFX and Lucasfilm)
One of the favorite projects Prahl worked on was Rogue One: A Star Wars Story. (Image courtesy of Ghost VFX and Lucasfilm)
A number of the Ghost VFX artists who worked on Rogue One: A Star Wars Story grew up with the Star Wars franchise. (Image courtesy of Ghost VFX and Lucasfilm)
Rogue One: A Star Wars Story was the first film in the franchise to deviate from the Skywalker family storyline. (Image courtesy of Ghost VFX and Lucasfilm)
As technology advances, the role of the visual effects producer has essentially remained the same, according to Prahl. “Obviously, shows come in many shapes and sizes, but I think that being a producer at its core is about understanding team dynamics and keeping everyone’s focus on the [hopefully] shared end goal. On the client side, it is about building trust, creating transparency and clear communication. Internally, it’s often about trying to predict future challenges and never assuming anything.” She has endured a few tough shows over the years. “The hardest of shows also make you realize that you are a part of an amazing team of very talented artists, and that you can take on any curveball the client might throw your way. I’ve managed to be incredibly lucky to have so many talented people working alongside me.”
Working on Rogue One: A Star Wars Story was memorable for Prahl. “Rogue One was a blast because most artists [at Ghost VFX] grew up with Star Wars, and everyone at the company wanted to help out with any small task just to be able to say to their friends or parent, ‘I worked on Star Wars!’ To be honest, I didn’t see A New Hope until I got to college. My dad was a big Star Trek fan, and I’ve seen every old [and new] Star Trek movie and all of Next Generation multiple times.” Over the past six years the company’s focus has been more on high-end episodic content. “The biggest difference is schedule and pace,” Prahl observes. “Movie shot production can span many months, even years, depending on where in the chain you start. Here you have the ability to work on looks for months before rolling it out to your hero shots, then all shots. Episodic shows, on the other hand, always have new assets or effects for every episode. We still run through all the same steps, but much faster and often with overlapping episodes as these typically are spaced out a few weeks apart.”
From Lost in Space. Prahl believes that a VFX producer should understand team dynamics and keep everyone focused. (Image courtesy of Ghost VFX and Netflix)
“[B]eing a producer at its core is about understanding team dynamics and keeping everyone’s focus on the [hopefully] shared end goal. On the client side, it is about building trust, creating transparency and clear communication. Internally, it’s often about trying to predict future challenges and never assuming anything.”
—Kristen Prahl, VFX Producer, Ghost VFX
There is a convergence occurring between visual effects for television and film. “Right now, we are finding the streaming schedules to be an excellent fit, but we always try to push for as much visual complexity and realism as we can, and so any additional time is always greatly appreciated,” states Prahl. “There’s a high demand right now, and for many vendors not having enough capacity is becoming a real challenge. I do think we’ll see this trend continue in the years to come with heightened competition for viewership and market share between all the studios. However, as we move from growth to a more mature streaming market, we’ll see the usual suspects assume their dominant role, and it will be up to us smaller shops to be lean enough to compete.” In regard to Netflix buying Scanline VFX, Prahl notes, “We’ve seen studios in-source in the past, but typically this has been through organic growth with varying success. If demand continues this crazy upward trend, it’s likely we’ll see more studios secure capacity through retainers and acquisitions.”
There is a convergence occurring between visual effects for television and film, as illustrated by the Netflix production of Lost in Space. (Image courtesy of Ghost VFX and Netflix)
Prahl has served as the production VFX Producer for Star Trek: Discovery since Season 2. (Image courtesy of Ghost VFX and Paramount+)
“This year’s Oscar nominees, for instance, were all male. However, women are well represented in production, and at Ghost we are also starting to see an uptick of more young women coming through our doors. We still have a way to go, and it would definitely be fantastic to see more women in every discipline and at all levels.”
—Kristen Prahl, VFX Producer, Ghost VFX
In recent years, Prahl has been primarily working as Ghost’s VFX Producer on Star Trek: Discovery. (Image courtesy of Ghost VFX and Paramount+)
Prahl has traversed multiple galaxies in her career, from Rogue One: A Star Wars Story (2016), one of her favorite shows she has worked on, to Star Trek: Discovery, pictured here. (Image courtesy of Ghost VFX and Paramount+)
Prahl is proud of what the Ghost VFX team has been able to accomplish on Star Trek: Discovery and their capacity to handle the high demand for content. (Image courtesy of Ghost VFX and Paramount+)
Overall, the bidding process has remained the same. “But we see a bit more time put into previs, which is extremely helpful,” remarks Prahl. “Time zones are not necessarily important for the client, but can be a huge advantage for vendors with solid global pipelines. However, rebates have historically been a requirement for getting a seat at the initial bidding table.” Sharing shots and assets amongst vendors has become easier. “As the industry has matured, off-the-shelf software and open-source formats have come to play a central role at most facilities. As a result, this also means that most companies can now easily share geometry, textures and shader assignments. Disciplines further down the pipeline are still often too entangled in proprietary code, so things like rigs and shading still typically require a full rebuild, albeit with a turntable or similar as reference.”
Inspiration can be found in the application of real-time graphics and video game technology. “Especially virtual production, both as we saw on The Lion King and with ‘the volume’ on The Mandalorian, has really taken the industry by storm,” states Prahl. “At Ghost, we’re starting to implement some of these approaches, but mainly with game engines and real-time renders as another tool in the ‘traditional’ visual effects pipeline. It will be exciting to follow how/if the game and visual effects industry will merge.” The visual effects industry remains male dominated, especially on the artist side. “This year’s Oscar nominees, for instance, were all male. However, women are well represented in production, and at Ghost we are also starting to see an uptick of more young women coming through our doors. We still have a way to go, and it would definitely be fantastic to see more women in every discipline and at all levels.”
Prahl expresses the enjoyment she receives working in the visual effects industry. “First, I really love my job and what I do, and this is a common factor for anyone who’s had a long career in any industry. I’ve poured my heart and soul into this industry and learned from ‘Dory’ to just keep swimming when faced with adversity.”
A particular career highlight would appeal to her father. “Nothing beats seeing your name on the big screen for the first time, but honestly, I’m still on an all-time high from our recent Emmy win for Outstanding Special Visual Effects in a Single Episode for Star Trek: Discovery (“Su’Kal”). I got nominated alongside Ivan Kondrup Jensen, Ghost’s VFX Supervisor on the show, and I’m so proud of what our team was able to accomplish. A favorite quote comes from Peter Pan: ‘All you need is a little faith, trust and pixie dust.’ Often, success is a perplexing combination of hard work and chance. Remember that you need both, and trust that you’ll stumble upon what you need, when you need it.”
By TREVOR HOGG
Images courtesy of Disney/Pixar.
Each time Buzz Lightyear attempts to achieve hyperspace during a test flight, he freezes in time while those around him grow older.
Buzz Lightyear experiences identity issues throughout the Toy Story franchise, and the beloved animated character takes on a new persona in Lightyear, as director Angus MacLane (Finding Dory) wanted to make the movie that inspired Andy to buy the toy. But do not expect a carbon-copy interpretation since the demands and intention of the project were entirely different. “The key for us was to capture the elements of what people love about Buzz,” states Galyn Susman, Producer of Lightyear. “In the Toy Story world, Buzz is more defined in relationship to Woody. We are now making a feature where Buzz is the protagonist, so obviously some of the things that make Buzz endearing as a sidekick aren’t substantive enough to necessarily carry through a feature film. The thing that we came up with that we love about Buzz is that he is out of step with reality.”
The theme was built into the narrative structure. “Buzz and his compatriots are stranded on a planet and need to develop a fuel that will help them to reach hyperspeed so that they can get back to Earth,” explains Susman. “What they all discover is every time he goes on a test flight, because he’s approaching the speed of light, time passes slowly for him. He ends up spending act one like a skipping stone through time. His disconnect with reality is that he’s frozen in a time that doesn’t exist anymore, and everybody else on the planet is moving on with their lives. It’s much more serious than Toy Story. You can’t have a sci-fi action epic adventure kind of movie if you don’t feel like you have real stakes.”
The lighting was complex for spaceship cockpit shots.
Concept art developed by Bill Zahn exploring what hyperspace travel might look like from a cosmic perspective.
A major source of inspiration was the blockbuster films of the 1980s and 1990s. “It would need to be emotional, relatable and funny,” remarks director MacLane. “But the core idea was how do we make this movie so that it has the excitement potential of the sci-fi movies that people of my age grew up with that gave you ‘that was awesome’ feeling. We achieved that with a combination of elements. Some of it is limitation. You could only afford one probe droid or one [additional prop element], but that gave a clarity and simplicity to things that I enjoyed. Creatively, I wanted to chase a look that was cartoony enough to be animated but realistic enough to be concerned about the character’s safety, and allowed the characters who sit in that universe to feel like they belong there. I wanted to make a film that was making clear decisions visually about what the audience is seeing. I wanted the art direction to be something that feels clunky, substantial and manufactured.”
There is a major reason why Lightyear does not look like previous Pixar movies. “Pixar has a great library called ‘The Backlot,’” explains Tim Evatt, Production Designer of Lightyear. “It’s just a library of pieces that have already been used in previous existing films, and I knew that in order for Lightyear to have its own language we almost need to not use anything from The Backlot. We needed to replenish and make our own backlot. The strength of having a modeling art department is that we were able to replenish our pieces and make a new movie.” The art directors were proficient with 3D modeling, which eased the transition of 2D concepts. “They were able to get the shape language into 3D as soon as possible, start building things in 3D and distribute those pieces to the other departments,” adds Evatt. “We weren’t having to talk about what is the shape language.”
“I knew that in order for Lightyear to have its own language we almost need to not use anything from The Backlot [Pixar shot library]. We needed to replenish and make our own backlot. The strength of having a modeling art department is that we were able to replenish our pieces and make a new movie.”
—Tim Evatt, Production Designer
Lightyear provides a clever twist on the signature line, ‘To infinity and beyond!’
Buzz Lightyear stares at an experimental fuel cell that he uses for a number of test flights, which results in a surprising side effect.
The physicality of Sox was inspired by animatronics.
Certain poses such as this one pay homage to the Toy Story franchise.
There was an extra dimension of complexity in the lighting, especially for the spaceship cockpit shots. “When Buzz is flying out in deep space, we relied on a lot of the self-illuminated buttons in his cockpit,” states Ian Megibben, Cinematographer – Lighting on Lightyear. “On our past movies we have tracked whether a light is on or off all the way, from our modeling department through animation and into lighting and rendering. But it was far more complex with this because, in the past, it was whether a car had its headlights on or off. Here Buzz has 300 different buttons in his cockpit, and they all had to be something that the animator could animate on and off, and we’re going to see that reflected in his helmet.” An approach was adopted that was similar to using LED panels for The Mandalorian. Comments Megibben, “We would capture probes of our environments, and a lot of times, for the sake of optimization, the set dressing department would say, ‘We’re not going to dress anything behind the camera.’ And I said, ‘I actually need those! Because we’re going to see that reflected in Buzz’s helmet.’ This is the first time that we leaned into something that nerdy and specific.”
When it came to the animation rigs, Buzz was treated differently when wearing the Space Ranger suit. “When he’s in the Space Ranger suit, his body is not in it. It’s just a hard-rigged suit,” notes David DeVan, Animation Supervisor of Lightyear. “When he’s wearing soft goods in other scenes, that is his body underneath the fabric. We made a model of Buzz, and then made the hard-suit Buzz.” The hard suit has inherent challenges. “You have to cheat everything because you can’t get his arms in front of him,” observes DeVan. “We had to accept the limitations. Part of it is accepting that he is in this big barrel thing, and that’s part of how he moves. We wanted to incorporate that into the motion and feeling of things. The shots where Buzz dives and rolls in the fight scene were exciting and tactile because they incorporate the limitations and physicality of what’s there.”
An LED screen-style setup was utilized to get the proper reflections and refractions on the space helmet visor worn by Buzz Lightyear.
The animation of the adversarial Emperor Zurg and his robots followed a ‘less is more’ principle. “We always talk about how Yoda doesn’t have to do anything,” remarks DeVan. “The harder you show them working, the less powerful they must be.” The mechanical limitations of feline robot companion Sox were played to full advantage. “We went through the gamut of how cat-like is she?” DeVan says, boiling it down to what’s funny. “The challenge with Sox early on was it had to have rotational joints on a complex shape and it took some time to figure out how it was going to work.”
Effects like smoke plumes should be photoreal enough to be believable but also be able to sit seamlessly in a stylized animation environment. “The process for me is always, let’s get as much reference as we possibly can of the real-world thing that we’re looking at and figure out what it means to hit that,” explains Bill Watral, FX Supervisor of Lightyear. “You start a simulation and target that. Then you start taking away levels of detail or adding stylized silhouettes to things. You also get reference of really stylized stuff like Japanese anime of space shuttles launching and put that side by side with SpaceX footage. What is the difference between these two things? It’s the levels of frequency and details. Then you try to find this happy balance between the level of detail that feels right and makes you believe that this is the phenomenon you’re witnessing, but not so much detail that your eye goes there and you’re starting to scrutinize it in relationship to all of the other work around you.”
Emperor Zurg (James Brolin) appears to be an invincible adversary of Buzz Lightyear (Chris Evans).
Tim Evatt and Ian Megibben created concept art in an effort to develop the aesthetic of hyperspace travel.
A real-life approach was taken when re-envisioning the Space Ranger outfit from the Toy Story franchise for Lightyear.
Grant Alexander explores the silhouette and poses of Buzz Lightyear wearing his iconic Space Ranger outfit.
Producing a sun went faster than originally thought. “That was one of those effects I wish I had six months to work on, and we got it done in about a month,” remarks Watral. “It was challenging because that’s one of those [where] we could have made a realistic sun, but it just didn’t fit in the world. We really stylized the heck out of that one. It lent itself to the time frame that we had to work on it, too. [We had] Enrique Vila, the effects artist, and the compositor on that. They worked tightly together and we added layers as needed. Originally, we thought that we were going to add a lot more detail into the sun to see it, but Angus embraced this idea from Sunshine where everything is really freaking blown out, which added to the sense of danger when Buzz slingshots around the sun and there is this heatshield that comes on. The way to sell that was to bloom things out. We created a library of arcs and magnetic loops. In the end, when we started to bloom it, we realized that we could back off on some of that detail because we didn’t need it, and it was actually introducing too much chatter on the images, drawing our eye away. We’re always dropping detail away from where we don’t want you to be looking.”
Atmospherics were essential in creating the various biomes found on the lunar-locked planet of T’Kani Prime. “When I first got onto the film, that was the first thing we tackled,” remarks Watral. “On previous films, we had a thing called dress effects, which is basically an effect that you can dress in at lighting time in our lighting software Katana. But that was relatively limited to mostly semi-homogeneous volumes to fill the air a little bit. We knew in this film that we were going to need big plumes and big vistas with plumes dressed out all over the place and all over the planet. We took a month and went in and rewired that system to be more robust. We added all of these new simulations at a much larger scale and squirreled those away on disks. We made a handshake deal with the lighting artists where we said, ‘We have these simulations that can be cached out at a regular speed, half speed, quarter speed, and a static version. You can choose any one of those versions you want on this pulldown in Katana. You pick the silhouette you want, the speed for the scale, then place it in the scene and dress it around.’”
Getting access to extra render cores was factored into the budget. “We had to store this data to begin with,” Watral continues. “Storage is not expensive, but also not cheap. The render time is a huge thing. We originally explored ways at render time to re-rasterize these grids into voxels that are larger further away while the close-up voxels are smaller. But in the end, what we found is if we let RenderMan do its thing, it was mostly okay as long as we split the layers separately and lighting had control over it. We could iterate on those independently and had enough time. Lightspeed is the optimization section of the lighting department, and they go in and turn all of the knobs and optimizations to try to get things to render the best. We’re trying for somewhere between 30 and 40 hours per frame. Some of these with lots of volumes in them will be a lot heavier than that. But that’s where we’re at right now. We’re in the thick of it.”