
By TREVOR HOGG
Images courtesy of Digital Domain and Marvel Studios.
Digital Domain had to work out how the effects simulations would look and behave to show what happens when two universes collide with each other.
Living up to its title is Doctor Strange in the Multiverse of Madness, where filmmaker Sam Raimi (Spider-Man) infuses his horror sensibilities into the Marvel Cinematic Universe for the sequel that stars Benedict Cumberbatch, Elizabeth Olsen, Rachel McAdams, Chiwetel Ejiofor and Benedict Wong. Digital Domain was hired by Marvel Studios Visual Effects Supervisor Janek Sirrs (The Matrix) for 100 shots that consist of the aftermath of two universes colliding with each other, a Sinister Universe version of the Sanctum Sanctorum, and the revelation of a hex conjured by Wanda and the attempt to capture her in the Mirror Universe. “For this show, it was a standard pipeline,” Digital Domain Visual Effects Supervisor Joel Behrens states. “We didn’t have to go outside of using our typical tools. We did V-Ray for lighting and rendering, Houdini for our effects, Nuke for compositing and Maya for asset builds.”
“We had the ground level of buildings built as set pieces and dressed accordingly per universe, and then they would do a LiDAR scan of that. We would get the geometry from the LiDAR data, build out the actual extensions based off the ground-level buildings and maintain that architectural look that was already established with the ground level, and run our destruction simulation, which in this case was a pseudo anti-gravity destruction.”
—Joel Behrens, Visual Effects Supervisor, Digital Domain
Filmmaker Sam Raimi brings his signature comedic spin on the horror genre to the MCU with the release of Doctor Strange in the Multiverse of Madness.
High-resolution digital doubles of Wanda and Doctor Strange were created by Digital Domain and shared with other vendors.
The Mirror Trap proved to be the most challenging look to develop and required 20 other versions of Wanda to appear in the various reflections.
On their way to find the Sinister Sanctum Sanctorum inhabited by an antagonistic, alternate version of Strange, Doctor Strange (Benedict Cumberbatch) and Christine Palmer (Rachel McAdams) traverse two different realities of New York City that have collided together and are ripping each other apart. “There were a lot of effects simulations and visual design that needed to happen to come up with that look,” Behrens reveals. “On a backlot location in London, there was a basic intersection and street that you see multiple times throughout the movie in various universes. We had the ground level of buildings built as set pieces and dressed accordingly per universe, and then they would do a LiDAR scan of that. We would get the geometry from the LiDAR data, build out the actual extensions based off the ground-level buildings and maintain that architectural look that was already established with the ground level, and run our destruction simulation, which in this case was a pseudo anti-gravity destruction.”
Plates were shot in Iceland, while portions of the grand staircase and foyer were practically built for the Sinister Sanctum Sanctorum.
“You can’t have it [the set of destroyed buildings] drifting away too quickly. Otherwise your buildings are gone by the end of the shot. It was finding that balance of how to keep interesting motion up there, along with these large, smoky, viscous-liquid tendrils that pull away from the buildings as well as the pieces of debris. Then, on top of that you’ve got the incursion from the other universe colliding and floating through space, whether in real-time or in a slightly slow-motion fashion to heighten the interest of the motion.”
—Joel Behrens, Visual Effects Supervisor, Digital Domain
Animation and motion tests were conducted for the debris. “You can’t have it drifting away too quickly. Otherwise your buildings are gone by the end of the shot,” Behrens remarks. “It was finding that balance of how to keep interesting motion up there, along with these large, smoky, viscous-liquid tendrils that pull away from the buildings as well as the pieces of debris. Then, on top of that you’ve got the incursion from the other universe colliding and floating through space, whether in real-time or in a slightly slow-motion fashion to heighten the interest of the motion.” As with all Sam Raimi movies, a certain automobile is seen floating in the background. “Some of the floating objects like the Oldsmobile Delta 88 were hung off of a large crane arm on wires to have them slightly spinning and drifting through a scene,” Behrens remarks. “It was taking a look at that and matching it. Sam and Janek wanted everything to have a hyper-real, more than slightly slower than real-time look in this universe from the incursion. We tended to go with a heavier slow-motion, high-frame-rate-shutter look for these elements like the Brooklyn Bridge or the subway that is crashing through the ground behind them or the Empire State Building.”
Plate photography of Iceland’s black sand beaches was used to create an otherworldly feel for the Sinister Sanctum Sanctorum.
A car that cameos in the films of Sam Raimi is an Oldsmobile Delta 88.
Different iconic landmarks appear in the Sinister Universe, such as the Brooklyn Bridge in a Victorian Steampunk Gothic style sliding through the ocean.
The Sinister Sanctum Sanctorum was treated as if it was a haunted house.
A haunted house aesthetic was given to the original model of the Sanctum Sanctorum to create the Sinister Universe version. “We didn’t modify the silhouette or the shape of the building, but changed the materials it was made out of, such as using large, weathered pieces of stone that had large cracks in them,” Behrens states. “We had vines that went up the side, and it looked like it had been a 16th century mausoleum that had seen a lot of age and wear over time.” The building is slowly being pulled apart by a massive dark void. “You have this large chunk in the upper right-hand corner that is slowly pulling rocks apart from the construction along with these sinewy, inky tendrils that come off of it,” Behrens adds. “Then, once you go inside the Sanctum there is some weird universe stuff going on. Inside the foyer you’ve got the Grand Staircase, and behind it is this odd otherworldly beach where the surf is rolling up and [there’s] a gigantic, red ominous moon.” Plates were shot in Iceland, while on set Cumberbatch walked on a mirror that had some actual sand on it. “It was a fairly complex simulation to tie in with the surf edge as waves break on the beach,” Behrens says, “and we had these tributaries around the staircase that we wanted water to pool in so Doctor Strange would walk through the water. They played him dry for wet, so we also had to try to make his boots look like they were getting wet from the water he is walking through.”
“[O]nce you go inside the Sanctum there is some weird universe stuff going on. Inside the foyer you’ve got the Grand Staircase, and behind it is this odd otherworldly beach where the surf is rolling up and [there’s] a gigantic, red ominous moon. … It was a fairly complex simulation to tie in with the surf edge as waves break on the beach, and we had these tributaries around the staircase that we wanted water to pool in so Doctor Strange would walk through the water. They played him dry for wet, so we also had to try to make his boots look like they were getting wet from the water he is walking through.”
—Joel Behrens, Visual Effects Supervisor, Digital Domain
Doctor Strange encounters Wanda in an orchard that is an illusion, disguising a darker reality that builds upon the Marvel Studios television series WandaVision. “WandaVision had that digitized, almost pixelated television look for the hex as it got dropped and revealed what was actually happening in the world,” Behrens notes. “For this one they wanted it to be more natural and organic. We decided to tie it more into her smoky red magic and the dark hole magic. The idea of this wall of her magic that reveals the universe was tricky because we didn’t want it to be like a simplified wipe-away or dissolve. We had this swirling stuff that, as it wiped over certain branches of the trees, the smoke would cling and almost pull away, so you got this interaction between the wall and the objects that it intersects with as it was revealing that.” The trees appear significantly different in the post-hex world. Comments Behrens, “They went from fairly normal, healthy trees to these twisted, decayed trees.” Practical trees were constructed around the actors, with the rest being a digital set extension. “We struggled with drifting fog that didn’t look too much like smoke because they didn’t want it to feel like a burnt-out forest that the characters were standing in,” Behrens says. “There is a lot of talking back and forth in that scene. We ended up doing background extensions and atmospheric simulations as well as the CG dark hole.”
“Some of the floating objects like the Oldsmobile Delta 88 were hung off of a large crane arm on wires to have them slightly spinning and drifting through a scene. … [Director] Sam [Raimi] and [Marvel Visual Effects Supervisor] Janek [Sirrs] wanted everything to have a hyper-real, more than slightly slower than real-time look in this universe from the incursion. We tended to go with a heavier slow-motion, high-frame-rate-shutter look for these elements like the Brooklyn Bridge or the subway that is crashing through the ground behind them or the Empire State Building.”
—Joel Behrens, Visual Effects Supervisor, Digital Domain
A swamp gas-like fog that has a pinkish-orange color drifts around the trees.
A portion of the post-hex orchard was practically built around the characters.
It was important to avoid the real orchard appearing as a forest ravaged by a wildfire.
In order to imprison Wanda in the Mirror Universe, Doctor Strange conjures a Mirror Trap. “There were all sorts of permutations, such as being glassier, but we went for it being more mirror-like and reflecting a digital environment as well as a digital-double version of our actress,” Behrens explains. “When shooting the footage on greenscreen for Wanda, we had a few witness cameras for some of those reflections; however, because these mirror pieces were at different angles and magnifications of her, we ended up using a digital double for quite a bit of that.” The decision was to have a lot less distortion in the mirror images. “We needed to see clearly what is happening with Wanda,” Behrens explains. “We had spikes built out of this mirrored glass as well so you could see her reflections, in which we used a combination of both digital-double and plate reflections. It was tricky coming up with the look because you didn’t want it to feel like a flat chrome surface. So we were trying to put imperfections on the front glass, smudges, a bit of dust and dirt, scratches, small cracks, things that were grounded in reality, because the initial passes did not look good when they were strictly a purely reflective material. It looked incredibly CG. It’s always about those imperfections and realism that you can add to those pieces that hopefully fit it in there and make it feel realistic.”
By TREVOR HOGG
Images courtesy of The CW.
A shot taken from the pilot episode outside the Fortress of Solitude.
A creative partnership and friendship forged 28 years ago at MGM has Visual Effects Supervisor John Gajdecki (Stargate: Atlantis) and Visual Effects Producer Matthew Gore (Battlestar Galactica) working together again on the second season of The CW production Superman & Lois. Co-creator and showrunner Todd Helbing (The Flash) has produced a unique spin on the superhero genre where at the heart of the story are the family struggles of Clark Kent (Tyler Hoechlin), Lois Lane (Elizabeth Tulloch) and their twin sons (Jordan Elsass, Alex Garfin), who have inherited their father’s supernatural abilities.
“We know that the artists are good and the art is built into the people. This team manages the process so tightly that we can deliver shots without panic two days before they’re on TV.”
—John Gajdecki, Visual Effects Supervisor
Superman & Lois is a testament to the logistical prowess of the visual effects team, which manages without panicking to deliver shots two days before they are on television.
Even though Superman & Lois is spun off of Supergirl, the look of the series is uniquely its own.
Superman (Tyler Hoechlin) must not only fly, he must fly through elements such as smoke, fog, clouds and dust. Bigger ambitions resulted in a 25% increase in visual effects shots in Season 2.
A major part of the visual language are Cooke Xtal Xpress lenses, which were emulated in the visual effects shots.
It was important to be able to produce feature-quality visual effects shots within a television schedule.
Water was poured onto Tyler Hoechlin when shooting the scene where Superman rescues a submarine in Episode 201. The submarine was created in Maya, with Houdini water and lots of clouds and smoke from the stock library added in Nuke.
The Kent family farmhouse is dilapidated and the barn has burned down in the Bizarro World version. Even though there are assets and locations that carried over from Season 1, modifications still had to be made.
A particular expression comes to mind for Gajdecki when describing how the show operates. “You hear people saying, ‘Armchair generals talk strategy, but the pros talk logistics.’ ‘Can we get the weapons to the front? Will there be food for the soldiers when they get there?’ We know that the artists are good and the art is built into the people,” Gajdecki states. “This team manages the process so tightly that we can deliver shots without panic two days before they’re on TV.”
“At the beginning of the season, we said that we wanted to take this up another level and try to get as close to feature quality as we could, knowing our limitations. But every time Superman does something, there is usually a CG component to it that adds to the schedule and budget issues. John mentioned to me that he had an in-house team on Project Blue Book, and so we proposed that. The in-house team has shined and been great in helping us to get this on the air.”
—Matthew Gore, Visual Effects Producer
Necessity led to a creative solution for Season 2 of Superman & Lois. “We have a tight network schedule, so it’s tough to try to do what we’re trying to do,” Gore notes. “At the beginning of the season, we said that we wanted to take this up another level and try to get as close to feature quality as we could, knowing our limitations. But every time Superman does something, there is usually a CG component to it that adds to the schedule and budget issues. John mentioned to me that he had an in-house team on Project Blue Book, and so we proposed that. The in-house team has shined and been great in helping us to get this on the air.”
15 different vendors worked on Season 2, including Zoic Studios, Refuge VFX, Boxel Studios, Frame Lab Studios, Barnstorm VFX, Tribal Imaging, Od Studios and Lux VFX.
The more shots were broken up over multiple vendors, the more often they would come back to the in-house team as the final 2D. “I see us as the final line of digital defense,” Gajdecki remarks. “Everything comes in and we do the comps, and we’ll put that photographic pass on it to make sure that the lens flares feel right and that the camera move looks like it fits between the shot, before and after, and the contrast, smoke and dust levels are matching. When Superman flies in and comes to a stop, something else has to keep going otherwise it doesn’t feel right. With some artists it is hard to explain why that’s wrong. We came up with the term energy transfer and suddenly people went, ‘Okay, I see it now.’”
“I see us [the in-house team] as the final line of digital defense. Everything comes in and we do the comps, and we’ll put that photographic pass on it to make sure that the lens flares feel right and that the camera move looks like it fits between the shot, before and after, and the contrast, smoke and dust levels are matching. When Superman flies in and comes to a stop, something else has to keep going otherwise it doesn’t feel right. With some artists it is hard to explain why that’s wrong. We came up with the term energy transfer and suddenly people went, ‘Okay, I see it now.’”
—John Gajdecki, Visual Effects Supervisor
There is mutual cooperation between the different departments, which allows for seamless integration of practical and CG elements.
Tyler Hoechlin was put through the deepfake process to apply Bizarro’s face. Deepfake technology was critical in being able to create the Bizarro version of Superman.
Bigger ambitions have meant that the visual effects count increased from the 2,300 to 2,500 shots of Season 1 by an additional 25% for Season 2. Also expanded is the number of vendors, which can be as many as 15 depending on their availability. Among the contributors are Zoic Studios, Refuge VFX, Boxel Studios, Frame Lab Studios, Barnstorm VFX, Tribal Imaging, Od Studios and Lux VFX. “We have an honest relationship with our vendors,” Gore notes. “In my conversations with them, are you available and can you do this work? They’ll tell you flat-out. It won’t be like the shop that wants to take the work and figure it out. They’ll say, ‘We don’t have these artists available for those weeks, but we have these who are available.’ We’re constantly trying to fill what somebody can do.” Even though there are assets and locations that carried over from Season 1, modifications still had to be made. “The [Kent family] farmhouse ended up in Bizarro World, so even though the farmhouse didn’t change between seasons, there was a whole new farmhouse and barn,” reveals Gajdecki. “The barn has burned down and the farmhouse is dilapidated.”
Villainous Ally’s acolytes prepare to travel through the portal to Earth Bizarre in a greenscreen plate of the actors on the dressed studio floor. The portal was created in Houdini and comped in Nuke with smoke elements.
A greenscreen plate of Superman and Tal-Rho on the volcano’s edge. The Maya model of the interior of the volcano was comped with Houdini lava, live-action volcano splashes and lots of smoke and embers to produce the final shot.
“There is stuff that we can’t talk about for the series finale that might come as a text from Todd Helbing with a photo that says, ‘We’re going to do this,’” Gore comments. “We went, ‘Okay. Cool. Let’s figure out how.’ A lot of times Todd will give us a heads up that something big is out there, so we can at least start thinking about it even if we don’t get a full outline or a script yet. At least we know conceptually this is something that we have to try to fit into the schedule.”
“Anything that we need we get. When we shoot greenscreens, the line producer says, ‘It’s Gajdecki Rules.’ They light it for us and get the exposures and interactive light that we need. Our shots look good even though we have little time because we shoot the pieces so well and production is behind us.”
—John Gajdecki, Visual Effects Supervisor
Gajdecki has a particular philosophy towards visual effects. “Every shot that we do has to look like the art department directed it and the camera operators operated the camera. We are sensitive to the inputs from the other departments to make sure that we put in the same chaos that would be in a real shot.” The cooperation is mutual, he says. “Anything that we need we get. When we shoot greenscreens, the line producer says, ‘It’s Gajdecki Rules.’ They light it for us and get the exposures and interactive light that we need. Our shots look good even though we have little time because we shoot the pieces so well and production is behind us.”
The term ‘energy transfer’ was coined for the scenes where Superman flies in and comes to a stop, to explain why the energy all around that action must continue.
Episode 208 features a portal. Comments Gajdecki, “Upon hearing that there was going to be a portal, we got all of the reference together, numbered all of the references, put it out in front of our executives, got on the Zoom and asked, ‘Todd, is any of this close to what you’re thinking?’ He might have 50 images to look at, and we start to talk about the nature of the portal, what’s the portal doing and the behavior. We go from there and start narrowing down the focus.” Superman gets affected in a dramatic way. “There had to be some component of him getting shredded as he enters it,” Gajdecki adds. “Then we had this whole other thing where once he’s inside there, what does it look like? It goes back to who does what. Our in-house team started a look in Episode 201 where Bizarro is flashing to stuff. We need some cool flash that is supposed to sell that it’s him traveling through the portal. The in-house team was tasked with that. They came up with a look. It evolved to a certain point, and we knew that we wanted it to feel like it had depth to it. That was our other challenge. Todd wanted it to feel that he is breaching something, but we also wanted to sell that there is something behind it, and then he had to shred in there at some point. We didn’t want it to be holes underneath and you see through him. We wanted a substance there, but didn’t want it to be bone and skin because it’s CW.”
Only 10% of the shots don’t have a custom 3D element.
Adding to the shot count was the number of deepfakes associated with the character of Bizarro, who is revealed to be an alternate-dimension version of Superman. “We let everyone know that the AI approach was not going to be a one-size-fits-all answer to all the Bizarro shots,” Gore explains. “We had numerous discussions with Todd that if a shot was going to ‘break’ the AI, we were going to have to apply more traditional methods to get the sequences to where he wanted them to be. For example, there were a couple shots where we needed to go all CG on Bizarro in order for him to be able to appear in the same shot as Tyler during some of the fight scenes. We massaged the cuts with Todd. Boxel Studio took on all the work that the AI couldn’t do. The in-house team came up with the look for Bizarro’s eyes and then worked with Tribal Imaging in Calgary, Wild Media Entertainment in Toronto/Vancouver, Kalos Studios, Animism Studios and Refuge VFX to make sure no matter who took Bizarro to final, the look for Bizarro’s eyes and makeup was going to be consistent in all the comps across the various episodes. It was also a great learning experience working with this new AI toolset. And to be clear, not every scene in the Bizarro story arc was achieved using AI.”
By TREVOR HOGG
Images courtesy of Universal Studios.
The chase involving the Velociraptors combined plate photography from Malta and Chris Pratt riding a stationary motorbike on a massive rolling road in the U.K.
Prehistoric beasts have had a constant presence in the life of David Vickery, who served as Visual Effects Supervisor on Jurassic World: Fallen Kingdom and Jurassic World: Dominion. “I’m still working on dinosaurs! Not the film ones, but for promotional media, advertisements and public relations related stuff.” Any concerns of repeating himself were alleviated by working with a different director, crew and script. “There are always new challenges.” About 1,450 visual effects shots were created, with ILM being responsible for 1,000 shots while the rest were handled by Lola VFX and Hybride. “[Director] Colin Trevorrow had a one-on-one relationship with the storyboard artist, and those storyboards were handed over to us to create animatics. I worked closely with the previs and postvis teams [provided by Proof].” Collaborating with Production Designer Kevin Jenkins was easy as he is a former art director at ILM. “Kevin clearly understands visual effects and worked a lot in 3D, so he could hand those designs over to us,” Vickery observes. “Because of our previous working relationship, Kevin trusted me to take incomplete designs to ILM and to continue their evolution.”
“There are probably more animatronics in Jurassic World: Dominion than in Fallen Kingdom and Jurassic World combined. … [T]he digital dinosaurs we had were an exact match for the physical animatronic dinosaurs that we had on set. It didn’t matter where they positioned the rig because the range of motion was exactly the same as the range of motion in the digital dinosaur. The goal being to get a plate that gave us practical animatronics that could move and perform on set and be digitally extended without us having to replace it.”
—David Vickery, Visual Effects Supervisor
‘Digital archeology’ was conducted to return the original T-Rex from Jurassic Park to her former glory.
1,450 visual effects shots were created, with ILM being responsible for 1,000 shots while the rest were handled by Lola VFX and Hybride.
An entirely new feather system was constructed in Houdini by ILM to deal with creatures such as the Pyroraptor.
Chris Pratt reprises his role of Owen Grady in Jurassic World: Dominion.
Jurassic World: Dominion was the first Hollywood production to recommence shooting during the pandemic.
Five weeks into principal photography for Jurassic World: Dominion, the pandemic caused a global lockdown. “Nobody knew what was going to happen when COVID-19 hit and we had to start shooting again during the pandemic,” Vickery states. “It was hard to understand how we were going to be able to communicate with each other, because suddenly we had to stay distant, were all wearing masks and couldn’t all cluster around the director’s monitors. We had tech scouts where the DP, John Schwartzman, was still isolating before he was able to come back onto set. We were deploying all sorts of new techniques such as wearing Bolero headsets rather than the usual walkie system.” Travel restrictions caused a few key scenes to be reimagined, such as Velociraptors chasing a motorcycle driven by Owen Grady (Chris Pratt) through the streets of Malta. “We had array photography and LiDAR data from location, which was then projected and manipulated rather than being a fully CG environment. Once we got back in the U.K., Chris rode a stationary motorbike that was placed on a massive rolling road that was 25 feet wide. The bike was rigged so it could weave left and right.”
“There was a cool bit of what I called ‘digital archeology.’ We had 3D SoftImage files but didn’t even have the software. However, we managed to get those NURBS files into Maya and convert them into polygon meshes. We also referenced all of Stan Winston’s photography. It’s a beautiful piece of recreation of that original T-Rex model.”
—David Vickery, Visual Effects Supervisor
A different approach was adopted for Mosasaurus attacking the crab boat. “Our editors scoured through 16 seasons of The Deadliest Catch program, correlated and created an edit from outtakes,” Vickery reveals. “The goal was to use the natural aesthetic of the footage that we had and integrate the CGI elements into it. The crab pod is something that we added in as well as the Mosasaurus and a bunch of spray.” Shifting weather patterns had to be accommodated. “When we were shooting at a lumber yard and had to put two huge Apatosaurus in the background, the whole sequence was shot in the morning in bright sunlight and no snow,” Vickery says. “Then it started snowing at lunchtime and we had to reshoot the entire scene again because the snow was going to change the look of our plates. We had a huge team of effects artists at ILM whose job was adding digital snow and dust.” An entirely new feather system was constructed in Houdini by ILM to deal with creatures such as the Pyroraptor. “It relied on the geometry of the feathers being described as a curve for the quill and a flat piece of polygonal geometry for the feather itself,” Vickery explains. “The feathers had to be able to interact with environmental elements. On set we had the special effects team with air movers, some snow and atmospheric effects, but we had to recreate the same effect in post to be able to integrate the Pyroraptor.”
When designing dinosaurs the first point of reference is the holotype.
A massive animatronic was constructed for the Giganotosaurus.
There are probably more animatronics in Jurassic World: Dominion than in Fallen Kingdom and Jurassic World combined.
“There was a moment on set where Sam Neill, Laura Dern, Jeff Goldblum, Chris Pratt, Bryce Dallas Howard and DeWanda Wise are with the biggest animatronic that I’ve ever seen in my life. Bringing all of those things together was amazing!”
—David Vickery, Visual Effects Supervisor
There was a close collaboration between the art department, visual effects and creature effects to make sure that there was seamless integration of CG and practical elements.
The goal was to get a plate where the practical animatronics could move and perform on set and be digitally extended without it being replaced.
Production Designer Kevin Jenkins created clay maquettes that were scanned and given to ILM to make sure that the models were anatomically correct before being 3D printed by John Nolan and the creature effects team.
Each installment of the franchise introduces new dinosaurs. “The Giganotosaurus was a real dinosaur,” Vickery remarks. “You look for the holotype, which is often a partial specimen, so scientific experts have to guess the rest. Some of the dinosaurs look so bizarre, like the Therizinosaurus, which is this huge theropod that is covered in feathers and has one-meter-long baseball bat-like claws on the edge of its fingers.” A massive animatronic was built for the Giganotosaurus. “There are probably more animatronics in Jurassic World: Dominion than in Fallen Kingdom and Jurassic World combined.” Jenkins collaborated with Trevorrow and paleontologist consultant Steve Brusatte to develop concepts that were turned into clay maquettes and then scanned. The scans were given to ILM, which made sure the model was anatomically correct before handing them off to John Nolan, the head of the creature effects team, for 3D printing. “This meant that the digital dinosaurs we had were an exact match for the physical animatronic dinosaurs that we had on set,” Vickery notes. The process helped to minimize the amount of CG. “It didn’t matter where they positioned the rig because the range of motion was exactly the same as the range of motion in the digital dinosaur. The goal being to get a plate that gave us practical animatronics that could move and perform on set and be digitally extended without us having to replace it.”
“From a simulation perspective, [with feathers] you’re dealing with a huge amount of geometry that is deforming and moving on a frame-by-frame basis, and is reacting to external forces like the wind, but also to the way that the creature is moving. Creatively, ever since Jurassic Park, fans and paleontologists have been crying out to see feathers on dinosaurs, and we will deliver this time.”
—David Vickery, Visual Effects Supervisor
Restored to her former glory is the T-Rex from Jurassic Park. “There was a cool bit of what I called ‘digital archeology,’” Vickery recalls. “We had 3D SoftImage files but didn’t even have the software. However, we managed to get those NURBS files into Maya and convert them into polygon meshes. We also referenced all of Stan Winston’s photography. It’s a beautiful piece of recreation of that original T-Rex model.”
Feathers proved to be the biggest creative and technical challenge. “From a simulation perspective,” Vickery comments, “you’re dealing with a huge amount of geometry that is deforming and moving on a frame-by-frame basis, and is reacting to external forces like the wind, but also to the way that the creature is moving. Creatively, ever since Jurassic Park, fans and paleontologists have been crying out to see feathers on dinosaurs, and we will deliver this time.”
The on-set special effects team deployed air movers, snow and atmospheric effects so the feathers of the Pyroraptor interacted with environmental elements.
All of Stan Winston’s photography was referenced in the recreation of the original T-Rex model from Jurassic Park.
Seeing the Jurassic Park and Jurassic World franchises come together was a career highlight for Vickery. “There was a moment on set where Sam Neill, Laura Dern, Jeff Goldblum, Chris Pratt, Bryce Dallas Howard and DeWanda Wise are with the biggest animatronic that I’ve ever seen in my life. Bringing all of those things together was amazing!”
By TREVOR HOGG
Images courtesy of Apple, Inc.
Amongst the winged creatures re-created is the Hatzegopteryx.
Going beyond the Hollywood portrayals is the Apple TV+ natural history documentary series Prehistoric Planet, which travels back 66 million years to the Late Cretaceous period when dinosaurs reigned supreme. Serving as executive producers are filmmaker Jon Favreau (Iron Man) and Mike Gunton, Creative Director, Factual at BBC Studios. Directing the five episodes are Adam Valdez and Andy Jones, who worked as visual effects supervisor and animation supervisor, respectively, on The Lion King and The Jungle Book for Favreau. Collaborating closely together were digital artists from MPC and cinematographers from BBC’s Natural History Unit.
Tyrannosaurus rex and juvenile go for a swim in the episode ‘Coasts.’
A Mosasaurus as it would have appeared during the Late Cretaceous period, which occurred 66 million years ago.
Progressing from The Lion King and The Jungle Book was not a huge leap for Jones. “Wildlife, natural history and what the BBC has been doing for years was our goal for a lot of the shots. In Jon Favreau’s mind, he always wanted it to feel as naturalistic and realistic as possible,” Jones notes. Nuances have to be incorporated into the animation to believably convey the emotional state of the creature. Explains Jones, “You want to lean away from anthropomorphism as much as possible because right away people will say, ‘Oh, we’re watching animation.’ Mammals share such a common bond with us, even elephants and giraffes have this look of concern for their kids, and we try to use some of that sparingly. We looked at larger lizards and birds a lot. The way birds care for their young is different. There is not this nuzzling.”
“The whole point is when you look at natural animals, they do things that are so weird and wonderful, so why not just portray that because it’s fascinating on its own? While the BBC Natural History Unit is obsessed with scientific accuracy, they’re also storytellers and know how to make things compelling; that was a real balancing act.”
—Adam Valdez, Director
Success is found in the subtle details. “You could say that the work we do is like a thousand tiny traces on a thousand tiny items, and if it all stacks up correctly, you get a win,” Valdez remarks. “Sometimes you don’t know what those things are until you’re in the midst of it. One of the things that we’ve learned over the last couple of shows was that human audiences will project a lot onto characters for you. You don’t have to lean too hard in any visual storytelling. That’s the magic of the medium. Sometimes it’s a moment of stillness that could convey the idea that the animal might be thinking or feeling the event that just happened.” The events had to fit within the natural order of things. Valdez adds, “The whole point is when you look at natural animals, they do things that are so weird and wonderful, so why not just portray that because it’s fascinating on its own? While the BBC Natural History Unit is obsessed with scientific accuracy, they’re also storytellers and know how to make things compelling; that was a real balancing act.”
A herd of Dreadnoughtus are inserted digitally into live-action plate photography.
Mongolian Titanosaur and Barsboldia gather around a watering hole.
The rules of wildlife documentary filmmaking were applied when conceptualizing and executing scenes.
A male and female Barbaridactylus were created by MPC, which was the sole vendor for Prehistoric Planet.
As interesting as creating realistic dinosaurs was the process of making the show. “The trick was, how do you make something that feels like you went and got the footage, hiding out for eight weeks or hiding the track cameras all over, and brought the footage back,” Valdez states. “It’s a painstaking editorial process. What you learn is it’s not like BBC Natural History Unit [to] just go somewhere and film randomly. They know what’s interesting and what the dynamics are at a certain time and place. The Natural History Unit brought us deeply researched stories and our role was to go, ‘Okay, you have a notion, but what we’re going to do is make an animatic that is so tight that you know exactly where to go to get shot by shot.’” Shots were determined by the reality of documentary filmmaking. Valdez comments, “If you shot a hunt like a movie with eight camera positions, that’s not how they get those once-in-a-lifetime moments. They get them rarely [with one camera]. It was our job to make an animatic that felt 100% like they had shot it, and then give them a shopping list: go get these backgrounds, and precisely match the lens and how the camera is moving.”
“We went through quite a bit making the T-Rex because we definitely wanted to nail our version of what we really think the T-Rex is today. It was the first asset that we built, to show off what the series would be. Him and the baby T-Rex. As much as we know about them in terms of fur, coloration, and the idea of what these babies would have been like, we needed our Baby Yoda!”
—Andy Jones, Director
Edmontosaurus and juvenile appear in Prehistoric Planet, with Andy Jones and Adam Valdez sharing directorial duties.
Biomes determined the creatures, not the other way around, with the episodes titled ‘Coasts,’ ‘Deserts,’ ‘Freshwater,’ ‘Ice Worlds’ and ‘Forests.’ “That gives you one point of view on the nature of life and the planet as a working ecosystem together,” Valdez observes. “Animals are our way in, whether it’s chimps, lions or dinosaurs. That’s why you see it framed the way that you do. Paul Stewart was in charge of ‘Coasts’ as the writer, producer and natural history partner. All of those particular stories have to do with the fact that where the land and sea meet you have a lot of dynamics. You have a lot of biodiversity, food source, territory and raising young. That’s the framing concept for the whole show.” The final sequence in ‘Coasts’ deals with the birth of a baby Tuarangisaurus. “For a Tuarangisaurus to make a baby that’s 12 feet long and 25% of the body mass of the mother is a massive investment, so they’re going to raise one at a time,” Valdez explains. “Then it turns out that the family shows some investment around the young as well. They found fossil evidence that backs all of this up. You find evidence of these sea creatures in the sands and earth where there was previously the Western Interior Seaway, a huge stretch of water that divided North America which had huge coastlands. The show hints at these ideas all the way through.”
“If you shot a hunt like a movie with eight camera positions, that’s not how they get those once-in-a-lifetime moments. They get them rarely [with one camera]. It was our job to make an animatic that felt 100% like they had shot it, and then give them a shopping list: go get these backgrounds, and precisely match the lens and how the camera is moving.”
—Adam Valdez, Director
A pack of Pachyrhinosaurus are hunted by the Nanuqsaurus.
When it comes to proper pronunciations of dinosaurs’ names, Jones laughs. “It’s never set in stone how to pronounce it until Sir David Attenborough says it! The Deinocheirus was one of the fun dinosaurs in the series for me because it’s such a weird-looking animal with a big duck bill and massive claws. This is one where scientists would say, ‘He had these massive claws that probably could be used to defend himself in some sort of battle with males.’ But what else could these claws be used for? Let’s tell a story that’s not about fighting. We know that he probably ate seagrass or some sort of grass or some sort of vegetation. Those claws would be used to rip up and dig up the grasses and roots. Dealing with all of the flies is another thing. His claws could scratch a little bit, but his arms are so small that he can’t reach his whole body. The Deinocheirus spots a tree to use as a scratching post, goes up and starts using that to scratch. For the ending of the episode, we wanted to tell the story of what happens when you eat so much food; his bowels get loose, he fertilizes the entire place and moves on. The Deinocheirus is a great character!”
“Deinocheirus was one of the fun dinosaurs in the series for me because it’s such a weird-looking animal with a big duck bill and massive claws. This is one where scientists would say, ‘He had these massive claws that probably could be used to defend himself in some sort of battle with males.’ But what else could these claws be used for? Let’s tell a story that’s not about fighting. We know that he probably ate seagrass or some sort of grass or some sort of vegetation. Those claws would be used to rip up and dig up the grasses and roots. … For the ending of the episode, we wanted to tell the story of what happens when you eat so much food; his bowels get loose, he fertilizes the entire place and moves on. The Deinocheirus is a great character!”
—Andy Jones, Director
Biomes determined the creatures, not the other way around, with the episodes titled ‘Coasts,’ ‘Deserts,’ ‘Freshwater,’ ‘Ice Worlds’ and ‘Forests.’
A baby Triceratops is portrayed as ‘naturalistically’ as possible, rather than relying on anthropomorphism to convey emotion.
Events involving the Corythoraptor had to fall within the natural order of things.
Prehistoric Planet showcases what scientists currently believe to be the actual behaviors of dinosaurs rather than the Hollywood portrayal of them.
For Director Adam Valdez, animation is a balance of aesthetics and physics.
In a lot of the shots, the feet of the dinosaurs were framed out because in numerous natural history documentaries, you don’t see the feet touching the ground.
‘Ice Worlds’ is a serious episode that explores family dynamics and the relationship between predator and prey. “It’s similar to ‘Deserts’ in the sense that you have these extreme environments, and it requires animals to go to greater lengths to survive,” Valdez observes. “You have this match that creates this endless loop of predation, and [the question of] how does the prey species survive constantly being hunted? You have to figure it out as a family group. The pack of Pachyrhinosaurus are rhino-like creatures that resemble Triceratops. They’re huge and powerful. The Nanuqsaurus don’t stand a chance attacking the group. But they are significant predators that are also big. What happens in the winter is that predators will work together as a team. You have a team of predators and a family group. It becomes a war of attrition, a siege. If we hound this family enough, eventually they’ll make a mistake, and we’ll take advantage of that mistake. It’s heavy. It’s like a standoff. You have to sit there and see who will last longer through the storm and winter that is around them.”
Going through the most iterations was an iconic dinosaur. “We went through quite a bit making the T-Rex because we definitely wanted to nail our version of what we really think the T-Rex is today,” Jones reveals. “It was the first asset that we built, to show off what the series would be. Him and the baby T-Rex. As much as we know about them in terms of fur, coloration, and the idea of what these babies would have been like, we needed our Baby Yoda!”
Precise previs was created so that the cinematographers for the series knew exactly what needed to be shot for the plate photography.
The experience that the BBC Natural History Unit has in shooting hundreds of thousands of hours of real animals was leveraged when choreographing scenes.
The Cretaceous Period had enough similarities to today’s Earth that the production could film a lot of the backgrounds on today’s Earth.
Family dynamics is a prominent theme explored in Prehistoric Planet.
For Jones, figuring out the motion of the creatures was a major task. “When I first saw the design of the giant pterosaur, I thought there was no way that thing could fly,” he explains. “It’s the size of a giraffe. Figuring that out and having people watch it and believe it, is cool. Shooting at Palouse Falls was so much fun. We knew the environment when we prevised it, so we had a good layout. Actually getting the shots was really challenging because we were hanging people on ropes to get the cameras in the positions that were needed. It was a fun sequence all around.”
Cinzia Angelini grew up in the 1970s in Milan, Italy, inspired by Japanese cartoons and the films of Hayao Miyazaki and Disney classics, which she studied frame by frame. Renowned director, animator and Head of Story at Cinesite Studios, Cinzia has worked for major animation studios in Europe and the U.S. for more than 25 years. Her body of work includes Balto, Prince of Egypt, The Road to Eldorado, Spider-Man 2, Minions, Despicable Me 3 and The Grinch. Cinzia wrote and directed the acclaimed CG animated short film Mila, a war story that centers on the plight of civilian children, and is currently directing HITPIG, an Aniventure animated feature produced at Cinesite.
Creating Mila was a life-changing experience, inspired by the stories my mother told me about how she felt as a child during the bombings of Trento in World War II. I wanted to use the medium I love, animation, and shine a light on the terrible realities for millions of children and families around the world who are caught in the crossfire of war. Audiences have embraced Mila’s messages of hope, imagination and perseverance and I’m so encouraged that there is a growing appetite for honest and authentic stories.
I fully embrace the power of animation. Hollywood might applaud socially relevant features, but it still views animation as essentially little more than “entertainment.” Yet animation has enormous potential to effect fundamental change in how we approach each other and how we deal with societal challenges. I believe that stories told through the magic of animation can move people and influence our future generations like nothing else can.
If Mila can change even one decision-maker’s perspective on the consequences of war, then all our efforts were well worth it.
The Mila theme is resonating with people around the world. Our team had 350 artists from 35 countries who gave their time and talent, the largest independent virtual studio collaboration ever created. And a surprising number of those volunteers have their own personal experience with war, or in their family histories, which also moved them to want to be a part of this project. The strong theme of the film ended up being the secret of its success. Mila is more than a film; it’s a story within a story.
Inclusion and diversity were key elements in assembling the Mila team, and I’m proud that a significant percentage of the leadership positions were held by women.
Finding ways to harness the talents of so many artists and filmmakers versed in different styles of work, cultures and languages made the final film that much richer. The entire process has been challenging and incredibly rewarding. It helped me improve as an artist, influenced how I collaborate with colleagues and showed me the strength of our global interconnectivity in new and inspiring ways.
I’ve always embraced risk as an opportunity to innovate, grow and become stronger. Risk takers challenge the norm and push the boundaries of their professions. I keep leaning into the unknown, because even if things don’t work out as planned, you learn so much from the journey.
Join us for our series of interactive webinars with visual effects professionals. Ask your questions, learn about the industry and glean inspiration for your career path.
Register today at VESGlobal.org/AMA
By CHRIS McGOWAN
Images courtesy of DNEG and Sony Pictures Entertainment.
Tom Holland as treasure hunter Nathan Drake negotiates a daisy-chain of crates falling from a C-17 cargo plane in a complex mix of practical and visual effects from DNEG.
Ferdinand Magellan was a Portuguese explorer who led a Spanish expedition of five ships in 1519 to seek a western route to the Moluccas (Spice Islands). Magellan perished along the way and only one ship made it back, in 1522, but it was the first craft to circumnavigate the world. Flash forward five hundred years, and Ruben Fleischer’s Uncharted spins a fictional tale about a present-day search for two lost treasure-laden ships from Magellan’s fleet. The Sony Pictures movie is a prequel of sorts to the tremendously popular Uncharted video game series, developed by Naughty Dog and published by Sony Interactive Entertainment. The film’s treasure hunters included Nathan Drake (Tom Holland) and Victor Sullivan (Mark Wahlberg), along with Chloe Frazer (Sophia Ali) and Santiago Moncada (Antonio Banderas).
The On-Set and Overall VFX Supervisor was Chas Jarrett. DNEG was the primary VFX vendor, completing 739 shots over 23 sequences, with teams led by Visual Effects Supervisor Sebastian von Overheidt (DNEG Vancouver) and Visual Effects Supervisor Benoit de Longlee (DNEG Montreal). Other contributing VFX vendors included The Third Floor, RISE Visual Effects Studios, Soho VFX and Savage Visual Effects.
Crates falling from the C-17 cargo plane was part of a continuous 90-second ‘oner’ sequence that mixed bluescreen, wire rigs, robot arms and digi-doubles.
Holland reaches out while a KUKA robot arm holds a crate and large fans supply the wind for the shoot. Live-action filming for the sequence took place at Studio Babelsberg in Potsdam, Germany, outside Berlin.
DNEG was tasked with handling various jaw-dropping sequences, including a 90-second shot in which Nate and Chloe – along with cargo crates and a Mercedes Gullwing car – fall out of a C-17 cargo plane while flying over the South China Sea. Von Overheidt considered the shot “a fun challenge. We called this sequence ‘the oner’ because it’s constructed as one continuous 90-second shot.”
“[For the falling out of a C-17 cargo plane scene] we had several practical elements with the actors hanging on wires and interacting with a stand-in car prop. We combined the practical elements with long stretches of full-CG moments. Some sections required either close-up digi-doubles to hold up, or even a transition between plate and digi-double right in camera with nowhere to hide. Mix that with the disorienting camera, and you have quite a complex puzzle to solve.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Von Overheidt adds, “We had several practical elements with the actors hanging on wires and interacting with a stand-in car prop. We combined the practical elements with long stretches of full-CG moments. Some sections required either close-up digi-doubles to hold up, or even a transition between plate and digi-double right in camera with nowhere to hide. Mix that with the disorienting camera, and you have quite a complex puzzle to solve.”
To begin creating the sequence, von Overheidt reveals, “We received LiDAR scans and HDR photography of each individual cargo crate and all the other props like the Mercedes Gullwing, as well as a full scan of the C-17 interior, which was built as a set. From there we built the entire daisy-chain of crates and the C-17 interior. At the same time, we also worked on a fully digital version of the Gullwing and the C-17 exterior model with some custom modifications compared to a standard model. Ruben had asked us to create a billionaire’s version of the well-known plane.”
The exterior model of the C-17 cargo plane was built with some custom modifications befitting a super-billionaire’s souped-up version of the plane.
“[Tom Holland] indeed got thrown around quite a bit. All the crates on the exterior were mounted on top of KUKA robot arms so that they could move on a full gimbal in a programmed sequence. They were also modified with extra padding or using softer materials, so that Tom Holland and the stuntmen could jump in between them, holding onto the netting of crates. It gave a great realistic-looking reaction for most of the shots, so we got away with a lot of head replacements on the shots with Holland’s stuntman. In quite a few shots we still went for a full digital-double solution because we wanted the performance even more violent or the camera to be more dynamic than what was shot.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Once camera, object and body tracking were done, Layout Supervisor Kristin Pratt and DFX Supervisor Gregor Lakner and their teams blocked the entire sequence out, “which is also the crucial step where we’d analyze each shot and figure out what CG extensions need to be added,” von Overheidt says. This also involved finding solutions for any discrepancies between the 3D-scanned crates and the ones used on set. “Our job was to piece this all together while finding the best transitions into CG and amp up the action and movement.” There were also some entirely full CG shots. He adds, “The environment was stitched based on multi-camera array footage shot at around 7,500 feet and then augmented to look a bit more desolate in terms of islands. All the clouds and wind FX and debris are CG.”
Lighting in the open sky was a challenge. “The plates were shot on a soundstage with stationary lighting, but our characters fall tumbling through an environment with only one light-source, the sun,” von Overheidt explains. “DFX Supervisor Daniel Elophe and the team broke this mammoth puzzle down into manageable sub-sections which were assembled to one long shot in compositing at the end.” The team around Lighting Supervisors Sonny Sy and Chris Rickard and Compositing Supervisor Francesco Dell’Anna kept track of changing light directions and found creative solutions to make it all work with the plates, while allowing for a free choreography of the camera and the animation, done by Layout Lead Steve Guo and Senior Animator Patrick Kalyn. “The result works really well and we ended up getting the best of both,” von Overheidt says, “seeing the sun rotating on high-action free-fall moments while coming back into a more character-focused lighting when there is dialogue and we’re locked into practical photography.”
Greenscreens assisted with the construction of a 500-year-old Magellan ship. The ships were highly detailed and complex assets built for every form of action called for in the making of Uncharted.
Tom Holland got his share of shaking and stirring thanks to a robot arm. Von Overheidt comments, “He indeed got thrown around quite a bit. All the crates on the exterior were mounted on top of KUKA robot arms so that they could move on a full gimbal in a programmed sequence. They were also modified with extra padding or using softer materials, so that Tom and the stuntmen could jump in between them, holding onto the netting of crates.” They were thrown around randomly by the robot arms to get the sense of snaking of the daisy-chain. Von Overheidt adds, “It gave a great realistic-looking reaction for most of the shots, so we got away with a lot of head replacements on the shots with Holland’s stuntman. In quite a few shots we still went for a full digital-double solution because we wanted the performance even more violent or the camera to be more dynamic than what was shot.”
To build Magellan’s two ships, sets were split into different stages, LiDAR scanned, pieced together and combined with the overall design.
The scenes with Magellan’s ships (the Trinidad and the Concepción) and the huge helicopters carrying them required extensive VFX, but the sequence wasn’t created entirely in CG. Von Overheidt notes, “There was actually a lot of great footage shot on big sets. This sequence really had everything in it. The scenes were shot on several stages resembling different parts of the ships, which we were extending with CG. The helicopters we had designed are based on some classic cargo helicopters, but even beefier.”
In the case of the Concepción, the set was split into four different stages – the stern, the main deck including helm, the bow and the crow’s nest with a partial mast, according to von Overheidt. “Our CG Supervisor Ummi Gudjonsson and Build Supervisors Chris Cook and Rohan Vaz started by assembling the various on-set stages for which we had received LiDAR scans, piecing them together, lining them up to each other and combining them with the overall design of the ship.”
Von Overheidt continues, “The same process went into the Trinidad and any other set, like the helicopters. Throughout the boat battle sequence we picked about a dozen hero shots based on the criteria [of] which ones would reveal the most problems, and we would constantly check whether our model of the ships lined up to those shots. The tricky part is that practical sets aren’t perfect. They may not be symmetrical, or the same section may have different dimensions across the different sets. In addition to that initial step, it then requires careful planning and a lot of work to get to the detail level of a good practical set. The ships were highly detailed and complex assets, built for every form of action, including total destruction. Both ships were fully-rigged sailing ships with ropes, cloth banners, sails, flexing masts and yardarms, flapping doors, all the cannons, etc. [There were] a lot of moving parts which helped to bring across some of the crazy movements and crashes the ships would do in the sequence.”
“There was actually a lot of great footage shot on big sets [for the scenes with Magellan’s ships and helicopters carrying them.] This sequence really had everything in it. The scenes were shot on several stages resembling different parts of the ships, which we were extending with CG. The helicopters we had designed are based on some classic cargo helicopters, but even beefier.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Between the two ships and helicopters, around 20 mercenaries, Braddock (Tati Gabrielle), Hugo (Pingi Moli) and the Scotsman (Steven Waddington) all become part of different fights which were augmented with head replacements or full digi-doubles. Von Overheidt explains, “The journey of the flight was across [some] 330 shots, so we built a massive environment that we used to block out the sequence. Ruben wanted an action-packed sequence. Especially the shots where we see the boats and helicopters moving through the karst landscape had to be dynamic and exciting, and we wanted to feel their weight and impact on the helicopter’s flight dynamics.”
Von Overheidt adds, “Now, real-world physics obviously weren’t a priority on this sequence to begin with, but we still aimed towards that feel for a plausible animation and also staging the camera in a way that it would guide the audience through the disorienting action and make the ships look massive at the same time. We basically had to stick to real-world physics while also constantly breaking it at the same time. The entire sequence was a close collaboration between our layout team and the animation team led by Animation Supervisor Jason Fittipaldi and Animation Lead Konstantin Hubmann, and [On-Set and Overall] VFX Supervisor Chas Jarrett himself, whose roots are in animation.”
The CGI helicopters were based on classic cargo helicopters but made beefier. They had unusually heavy loads to carry – Magellan’s ships – across the South China Sea, with footage shot in Thailand serving as the South China Sea.
“Generally speaking, working with big practical sets is great for visual effects because you have real references to match to – the real material, the real lighting and how the camera captures it. Even if you end up replacing parts of it anyway, it’s a great start. Actors feel more comfortable interacting with a real environment as well. The trade-off is that matching into complex practical sets can be quite the puzzle for visual effects.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Magellan’s ships, carried by helicopters, waged battle in the air.
“For the South China Sea environment, we had received extensive footage from a practical shoot in Thailand. Film Production mounted a multi-camera array under a helicopter and flew through the landscape also shooting at different lighting conditions during the day,” von Overheidt says. The original plan was to use this material as practical backgrounds and only extend plates or create specific shots full CG. “As we were creating a digital version of the environment, we soon realized that our team, led by Environment Supervisor Gianluca Pizzaia and Environment Lead Matt Ivanov, was able to create one big environment which would cover the entire flight path throughout the sequence. And straight out of rendering it looked pretty much photorealistic. We presented our results to Ruben, who got excited about it. Everyone was confident that this would be the way to do it. It gave us and Ruben so much more freedom to find great cameras and shot composition that we decided to go full CG on the environment all the way through.”
Von Overheidt continues, “It allowed us to move the camera anywhere we wanted and fully customize the environment to our needs. It made the whole process a lot more efficient as well. Throughout the third act, there is also a progression in lighting from afternoon to sunset. Compositing Supervisor Kunal Chindarkar and Compositing Lead Ben Outerbridge made sure we transitioned seamlessly into these different lighting conditions and moods as we reached the final shot of the Concepción sinking and Nate and Sully flying into the sunset.”
Asked if the filmmakers let the look of the Uncharted video games influence the visuals of the movie, von Overheidt comments, “Not from a visual effects perspective, no. I can’t speak for the Production Art Department though. I used to game quite a bit but never played Uncharted before, so when I joined the show, it was actually the first time I checked it out, mainly to understand the characters and some of the main levels. My main influence for creating images comes from photography and graphic design. I get most of my inspiration from actually being outdoors. We had some great artwork from the production team and the Thailand footage to look at. We would also often look at references for all kinds of scenes, like crazy skydiving stunts or video footage of heavy-lifting helicopters.”
Looking at the melding of the big-scale practical and digital in Uncharted, von Overheidt concludes, “Generally speaking, working with big practical sets is great for visual effects because you have real references to match to – the real material, the real lighting and how the camera captures it. Even if you end up replacing parts of it anyway, it’s a great start. Actors feel more comfortable interacting with a real environment as well. The trade-off is that matching into complex practical sets can be quite the puzzle for visual effects.”
With the help of bluescreen, Pingi Moli (Hugo), Tati Gabrielle (Braddock) and Steven Waddington (The Scotsman) appear to walk down the ramp of the C-17’s cargo bay onto a busy operations base.
By TREVOR HOGG
Images courtesy of Netflix.
“Bad Travelling” is the animation directorial debut of David Fincher.
There are sinister underpinnings to human nature which are mined narratively to create stories filled with destructive conflict and satirical humor for the Emmy-winning Netflix animated anthology Love, Death + Robots, executive produced by filmmakers David Fincher (The Social Network) and Tim Miller (Terminator: Dark Fate). The nine shorts curated for Love, Death + Robots Vol. 3 are examples of drastically different visual styles from the likes of Patrick Osborne, David Fincher, Emily Dean, Robert Bisi and Andy Lyon, Jennifer Yuh Nelson, Tim Miller, Carlos Stevens, Jerome Chen and Alberto Mielgo, with animation provided by Pinkman.tv, Sony Pictures Imageworks, Axis Studios, Blur Studio, Titmouse, BUCK, Polygon Pictures and Blow Studio.
“In Vaulted Halls Entombed” is a military adventure that descends into Lovecraftian horror.
“When 3D animation came out, it allowed us to do certain things that we couldn’t do in 2D animation. The same with a lot of the game engines. You are able to express an entire world, adjust things in real-time and change the light if you want. It’s not baked into things like it is usually.”
—Jennifer Yuh Nelson, Supervising Director
“Jibaro” is the only episode that is not based on pre-existing material.
Returning as the supervising director from her previous outing on Vol. 2 is Jennifer Yuh Nelson (Kung Fu Panda 2 & 3), who worked with a mixture of new and veteran collaborators as well as making her own contribution with the muscle-flexing action adventure “Kill Team Kill.” Notable first-time participants are David Fincher, making his animation directorial debut with the monstrous seafaring tale “Bad Travelling,” and Patrick Osborne, who helms the macabre-funny, post-apocalyptic sequel “Three Robots: Exit Strategies.” Returnees include visual effects veteran Jerome Chen helming “In Vaulted Halls Entombed,” where a special forces team encounters an ancient evil, and Oscar-winner Alberto Mielgo envisioning a fatal romance between a deaf Renaissance knight and a lethal siren in “Jibaro.” Inventive animation styles are found in “Night of the Mini Dead,” which uses tilt-shift photography to make everything look tiny, in the Moebius- and psychedelic-flavored “The Very Pulse of the Machine,” and in the painterly impressionism of “Jibaro.”
As to whether real-time technology and game engines are impacting the type of stories being told, Nelson does not believe this to be the case. “I don’t know if it’s types of stories that it has affected,” she explains. “It’s the look and how much you can deal with certain levels of complexity. When 3D animation came out, it allowed us to do certain things that we couldn’t do in 2D animation. The same with a lot of the game engines. You are able to express an entire world, adjust things in real-time and change the light if you want. It’s not baked into things like it is usually.” The impact of game engines like Unreal and Unity cannot be ignored. “I’m so old that I was on the cusp of the desktop revolution, and it used to be when I started in the business you had to have a lot of money to be able to do 3D animation,” recalls Miller. “Then desktop technology and software came along and it democratized the process, which allowed us to start Blur by borrowing $20,000. I thought that was amazing, but game engine technology is going to be a paradigm shift again. You don’t need heavy machines to render. Even lots of cheap PCs are still expensive and need some technical infrastructure. Now guys can do minutes-long shorts in their basements at home and you can see it on the web. You see a lot of interesting artists doing great things by themselves or with small teams. Game engine technology is super freaking exciting. I feel like I’ve been waiting for it for a while, but now it’s here.”
“Kill Team Kill” is a kindred spirit of Predator, Commando and Escape from New York.
“[G]ame engine technology is going to be a paradigm shift again. You don’t need heavy machines to render. Even lots of cheap PCs are still expensive and need some technical infrastructure. Now guys can do minutes-long shorts in their basements at home and you can see it on the web. You see a lot of interesting artists doing great things by themselves or with small teams. Game engine technology is super freaking exciting. I feel like I’ve been waiting for it for a while, but now it’s here.”
—Tim Miller, Director
“Mason’s Rats” revolves around a Scottish farmer battling with weapon-wielding rats determined to steal his crops.
“Night of the Mini Dead” was created by using tilt-shift photography which makes everything look tiny.
When it comes to her own short, where a squad of soldiers in Afghanistan encounter a CIA experiment gone horribly wrong, Nelson decided to channel a fondness for a particular cinematic era that made action icons out of Arnold Schwarzenegger, Sylvester Stallone, Bruce Willis and Jean-Claude Van Damme. “For ‘Kill Team Kill,’” she says, “my inspiration was cartoons from the 1990s and action movies from that time, like Predator, Commando, and G.I. Joe cartoons. They were good fun at the time, and the story by Justin Coates had that feel to it, so that’s where that came from.” Handling the animation was the studio responsible for The Boys Presents: Diabolical and The Legend of Vox Machina. “I got to work with Titmouse, and they’re an amazing studio with a wide variety of different styles. I got to work with Antonio Canobbio and Benjy Brooke who helped to find this look. It’s a 2D style, so it has to be animatable. The character designs themselves are covered with veins and packets of ammo which are hard to animate, but we got the benefits of amazing animators from all over the world, and you can see that level of expertise in it.”
“[For ‘Jibaro’] we used real scans of armor that you might see in museums. When you see the armor, it feels almost unbelievable that you can fit a person inside. The cool thing about this is we don’t actually need to fit a person inside because these aren’t real characters. You can just have their neck. We were using real Renaissance armor. We were redesigning it a little bit, but the cool thing is that we’re seeing something that is historically accurate. I feel that is extremely new and fresh.”
—Alberto Mielgo, Director
“Swarm” was adapted by Miller from a short story by Bruce Sterling, and revolves around human factions with conflicting views as to whether advancement should be achieved through genetic manipulation or cybernetic enhancement and technology. Adding further complications is the discovery of an insectoid race that may be of a higher intelligence than humanity. “We have a set of eight-sided dice and roll them!” laughs Miller when describing how he decides upon the animation style, character design and world-building. “It was interesting that we had this short which is almost entirely in zero-G, but we were still going to do some motion capture for that,” notes Miller. “Then the pandemic hit and motion capture was not an option anymore. I didn’t want to get caught in the uncanny valley either, so I decided to stylize the characters to a certain degree, which helps the story not be quite as horrible as it would be otherwise. I loved making the show. It was a challenge to think about the physics of how people move through zero-G, and anything with lots of creatures is a good time. I get a lot of vicarious enjoyment from knowing the animators and creature designers are going to enjoy the process of making this.”
Mocap was combined with CG keyframe animation to produce “Swarm.”
“[For ‘Kill Team Kill’] it’s a 2D style, so it has to be animatable. The character designs themselves are covered with veins and packets of ammo which are hard to animate, but we got the benefits of amazing animators from all over the world, and you can see that level of expertise in it.”
—Jennifer Yuh Nelson, Supervising Director
“The Very Pulse of the Machine” is a love letter to French comics great Jean “Moebius” Giraud.
“Three Robots: Exit Strategies” features the neurotic XBOT 4000, dimwitted and enthusiastic K-RVC, and the brilliant and deadpan 11-45-G examining the demise of humanity.
Self-taught as an artist, Mielgo (The Windshield Wiper) utilizes the principles of painting, in particular lighting, when producing animated shorts such as “Jibaro.” “I create a simple image by removing what is not necessary for the eye to understand,” he says. Themes rather than the premise influence the animation style. “In terms of the girl, I wanted her to be a walking treasure, and in order to do that I was doing research on folklore jewelry from Northern Africa, China, India and Pakistan. In the case of the guys, I prefer the Renaissance rather than the Medieval in terms of design. We did something interesting, which is we used real scans of armor that you might see in museums. When you see the armor, it feels almost unbelievable that you can fit a person inside. The cool thing about this is we don’t actually need to fit a person inside because these aren’t real characters. You can just have their neck. We were using real Renaissance armor. We were redesigning it a little bit, but the cool thing is that we’re seeing something that is historically accurate. I feel that is extremely new and fresh.”
Sheena Duggal is an acclaimed visual effects supervisor and artist whose work has shaped numerous studio tent-pole and Academy Award-nominated productions. Most recently, Duggal was Visual Effects Supervisor on the box-office blockbuster Venom: Let There Be Carnage and was a BAFTA nominee this year for Best Special Visual Effects for her work on the Oscars VFX-shortlisted Ghostbusters: Afterlife. She is the only woman VFX Supervisor to earn that level of recognition from the Academy this awards season, and was the first woman to be honored with the VES Award for Creative Excellence, bestowed in 2020.
The lack of female visual effects supervisors is definitely the result of a lack of opportunity and unconscious bias – and that is fixable. Earlier in my career, I was told that the goal was to promote the male supervisors, and watched as guys who had worked under my VFX supervision were promoted up the ranks and given opportunities on large VFX shows. It never occurred to me that my gender would hold me back, and I was always surprised when it did. I am a strong believer in diversity and inclusion, not just because I am a bi-racial woman, but because I believe that greater diversity leads to freer thinking and greater creativity.
Good girls get nowhere. Be disobedient, be persistent, never take disrespect thrown your way… be smart and graceful and remember you are equal.
Never stop fighting for the right to be the best you can be. Women spend too much time being congenial, and it’s time for us to speak up about our achievements and the opportunities we’ve created for ourselves. We’re talented, we’re here, and we’re ready.
Even if women break through the glass ceiling, they end up on a glass cliff where they can be pushed off, because there is no cadre of women to cheerlead in support that is equivalent to a “boys’ club.” We need to be building an industry culture and a structure that supports women in the field and sets them up for success. I take my opportunity to be a role model and a voice for other women seriously; I want to not just open doors, but bust through them.
Change can happen fast if everyone is motivated. We need to do it now.
In having this inevitable conversation, we can’t exclude men or accuse them if we want to create the change we want to see. We must do it together. Women are almost always expected to solve the systemic problems we did not create or perpetuate in a patriarchal culture. A lot of well-meaning people lack self-awareness or fail to understand their role in enabling sexism or great inequities. If meritocracy fails to work, then uplifting women needs to be a conscious choice. I would ask all men in VFX to go through implicit bias training and be active problem-solvers and advocates for women, because people still give men’s voices more credibility. It takes a lot of people to create success for an outlier.
Join us for our series of interactive webinars with visual effects professionals. Ask your questions, learn about the industry and glean inspiration for your career path.
Register today at VESGlobal.org/AMA
By TREVOR HOGG
Images courtesy of Sky and HBO.
Michelle de Swarte portrays Natasha who has a fateful encounter with a mysterious baby seeking to control her life.
Upon reading the synopsis for the HBO and Sky horror comedy The Baby, one gets a distinct impression that anxiety about motherhood drives the narrative created by Lucy Gaymer and Siân Robins-Grace. The summary states, “Controlling, manipulative and with violent powers, the baby twists Natasha’s life into a horror show. Where does it come from? What does it want? And what lengths will Natasha have to go to in order to get her life back? She doesn’t want a baby. The baby wants her.” When this observation gets mentioned to VFX Producer Anne Akande and Visual Effects Supervisor Owen Braekke-Carroll, both of them laugh in agreement. “It’s certainly a dissection of many angles of motherhood!” states Braekke-Carroll. “There is symbolism and scenes that absolutely tap into practical and real fears of breastfeeding and abandonment. We were tasked with bringing some of the juicier parts of the script into the visual medium. It’s quite literal in many ways.”
Bobbi (Amber Grappy), Natasha (Michelle de Swarte) and Mrs. Eaves (Amira Ghazalla) stand in horror at the violent chaos that ensues in The Baby.
“We weren’t pushing [visual effects] beyond anything because the show was one that we knew early on was grounded in reality. The baby is a baby. There are a lot of misconceptions about what this baby is and what his agenda is. There are a few moments where we have some heightened reality and he is still a baby, but a bit different.”
—Anne Akande, VFX Producer
Gaymer and Robins-Grace had a clear and descriptive vision of the reality and tone of the series. “Siân and Lucy were keen from the outset on getting a realistic and grounded tone throughout the series, and this influenced how we then approached the body of work,” remarks Akande. “We were involved early in the process to ensure that the shoot methodology would be effective and give visual effects enough material to pull off some of the more dramatic scenes. Beyond giving that guidance, they were also collaborative, open and willing to take feedback on the best way forward via visual effects to hit each story point.” The visual effects work for the eight episodes consisted of just under 650 shots by Framestore, Jellyfish Pictures, Freefolk and Goldcrest. “We weren’t pushing it beyond anything because the show was one that we knew early on was grounded in reality,” notes Akande. “The baby is a baby. There are a lot of misconceptions about what this baby is and what his agenda is. There are a few moments where we have some heightened reality and he is still a baby, but a bit different.”
A key location is a seaside cottage at the base of a cliff, directly fronting the shoreline.
This beachside cliff was LiDAR scanned, recreated through DMP/CG, then combined with plate photography.
Nicole Kassell helmed the pilot, Faraz Shariat was responsible for three episodes, and Stacey Gregg and Ella Jones each directed two episodes. “It’s always interesting working with different directors across a series, and in this case they did all have different approaches to handling the visual effects,” states Akande. “Nicole Kassell had a lot of experience in visual effects and had a hands-on approach from storyboards, concept, previs through to execution. Others brought their comedy experience to help drive the storytelling beats, and there was also some experimentation using different shooting techniques and machinery on set. All of this brought an interesting mix that fused with the tone of the show, creating a unique place for The Baby in the comedy/horror genre.” Storyboarding and concept art were produced for all of the key creative beats.
“We definitely knew that we needed a digital asset. By casting twins, we were able to double our shooting hours. The babies absorbed the nature of the set quickly, and we saw them grow up over the course of six months of shooting with them. That left us with a strange, hybridized methodology over time, whether it be face replacements from plates with a CG body, a stand-in prosthetic baby with a head replacement being pushed around in a pram, or one digital arm, plate head and a prosthetic body. There was also an army of stand-in babies.”
—Owen Braekke-Carroll, Visual Effects Supervisor
The cottage and immediate gardens were built on the site of a small quarry which provided the immediate base of the cliff and surroundings.
“To compensate for [the unpredictability of the babies] on set, we ended up treating almost every frame with the baby cast in it as a potential visual effects shot. This included taking large volumes of data and notation for most scenes and essentially treating them as a CG creature in the scene.”
—Anne Akande, VFX Producer
“Concept art for key moments, such as our Demon Baby nightmare scene, was developed by the Framestore art department and was crucial in helping settle the creative vision as much as possible before shot execution,” remarks Braekke-Carroll. “From the storyboards, some key shots were turned into previs.” Scripts for the eight episodes were broken down to determine what shots required visual effects. “We worked closely with the art department throughout the shoot to help find the right combination of set, location and bluescreen,” explains Akande. “A key location in the script that we return to many times is a seaside cottage at the base of a cliff, directly fronting the shoreline. Locations were unable to find a site that hit all the required points, so visual effects were tapped to make this work. The cottage and immediate gardens were built on the site of a small quarry, which gave us the immediate base of the cliff and surrounds. A secondary location along the site of a dramatic coastline in Newhaven [England] was the basis for the extension. This beachside cliff was LiDAR scanned, recreated through DMP/CG, then combined with plate photography to blend the two locations together.” Deaths are plentiful throughout the story, but the focus is on the aftermath rather than the actual act of violence. “There is an implied causal link between the baby and a death,” states Braekke-Carroll. “But he’s not necessarily physically holding the knife.”
Due to the unpredictable nature of the babies on set, every shoot day could wildly deviate from the plan and the visual effects team would be required to help.
“The sheer nature of the amount of time that we were going to have a baby onscreen and on set meant that a lot of things we had planned for would sometimes go flawlessly without any help from us,” notes Braekke-Carroll. “On another occasion, the entire day might need to be completely changed and require our input for all sorts of reasons.” Identical twins were cast in the title role. “We definitely knew that we needed a digital asset,” remarks Akande. “By casting twins, we were able to double our shooting hours. The babies absorbed the nature of the set quickly, and we saw them grow up over the course of six months of shooting with them.” Digital doubles were avoided as much as possible. “That left us with a strange, hybridized methodology over time, whether it be face replacements from plates with a CG body, a stand-in prosthetic baby with a head replacement being pushed around in a pram, or one digital arm, plate head and a prosthetic body,” states Braekke-Carroll. “There was also an army of stand-in babies. When it comes to performance with our hero twins, that became a hybridized process where we used a combination of digital passes, keying tools, reprojections and face tracking from source plates. Then we were also leaning on machine learning for additional 2D layering to change the performance.”
“It’s always interesting working with different directors across a series, and in this case they did all have different approaches to handling the visual effects. Nicole Kassell had a lot of experience in visual effects and had a hands-on approach from storyboards, concept, previs through to execution. Others brought their comedy experience to help drive the storytelling beats, and there was also some experimentation using different shooting techniques and machinery on set. All of this brought an interesting mix that fused with the tone of the show, creating a unique place for The Baby in the comedy/horror genre.”
—Anne Akande, VFX Producer
The eyes were difficult to get right. “The animation of the performance of the baby isn’t quite straightforward,” remarks Braekke-Carroll. “The eyes are quite loose and gaze differently. We took parts of plates for the area around the eyes for the micro-performance and combined that with CG or machine learning layers.” A wealth of material was gathered from reference photography. “We could be working on something in Episode 104 and there’s a performance that nails it in Episode 102,” states Akande. “Everybody on set was invested in getting us the material. It could be the first two seconds before the take, and that was needed for the face replacements. We also learned about which baby is good at being still or restless. The one thing that we tried to educate people on is that the babies are a member of the cast. If you replace a cast member with a stand-in for 30 shots, that becomes visual effects. Every time a baby was in a shot, the first port of call was our hero baby. The real performance will always be better than the alternative. CG was the last resort, and that was what we let the showrunners and executive producers know from the beginning.”
Face-generation camera setups were orchestrated that proved useful both as animation reference and for machine learning. “Anytime we were using a digital baby performance, we would also be running a machine learning output as well,” explains Braekke-Carroll. “Rather than treating that as a facial replacement solution, we had it as an additional layer setup that could be incorporated partially or fully in with the other renders for the other parts. A lot of the shots that you will see won’t necessarily be a machine learning output, but there will be parts of the lips, eyes or cheek that will give it an extra degree of photographic verisimilitude that you get from that output.” One of the most difficult visual effects tasks was to have a baby falling asleep or sleeping. “We had to find a bunch of solutions and ended up shooting a lot of high-frame-rate plates of the baby and played them back at normal speed,” adds Braekke-Carroll. “We looked for a nice section where it felt like they were sleeping. A machine learning dataset was built just of the baby’s eyes. The high-frame-rate photography gave it a gentle effect, rather than trying to animate too much micro eye movement.”
The show had one primary asset, which was the baby digital double. The bulk of the baby work was handled by Framestore.
In some cases, shots were a combination of high-frame-rate plate photography, digital-double parts and a machine learning layer on top.
Point-of-view shots take place within the birth canal and womb. “We had a free remit to take B camera, get all of the jars of Vaseline, PVC tubing, probe lens, lights, blood, sputum and pus,” reveals Braekke-Carroll. “We took all of the bits and pieces and gelled them up. We got some nice closeup photography inspired by the scenes in The Tree of Life. I was pushing against building a CG interior because, tonally, I didn’t think that it fit the episode. From that photography, visual effects added a layer of fine particulate and endoscopic lensing. There is also the diffusion of the water and cloudiness. But the actual content of the walls was practical photography.”
“The one thing that we tried to educate people on is that the babies are a member of the cast. If you replace a cast member with a stand-in for 30 shots, that becomes visual effects. Every time a baby was in a shot, the first port of call was our hero baby. The real performance will always be better than the alternative. CG was the last resort, and that was what we let the showrunners and executive producers know from the beginning.”
—Owen Braekke-Carroll, Visual Effects Supervisor
The unpredictability of the on-set babies posed the biggest challenge across the series. “To compensate for this on set, we ended up treating almost every frame with the baby cast in it as a potential visual effects shot,” explains Akande. “This included taking large volumes of data and notation for most scenes and essentially treating them as a CG creature in the scene.” The best moment was literally saved for last. “We’re looking forward to the final sequence underwater,” states Braekke-Carroll. “It’s a beautiful and unexpected scene that wraps up the story and bookends the series nicely.”
By TREVOR HOGG
South Korea-based Gulliver Studios created the dramatic effects for the surprise Netflix hit series Squid Game, which returns for a second season in 2024. (Image courtesy of Netflix)
With The Mandalorian taking a breather after winning two consecutive Emmy Awards for Outstanding Special Visual Effects in a Season or a Movie, it will be up to The Book of Boba Fett to continue the winning streak, as the iconic bounty-hunter-turned-crime-lord series has been described as The Mandalorian 2.5. Whether Boba Fett receives a nomination, or more, will be revealed when the Primetime Emmy Awards takes center stage on September 12, 2022 at the Microsoft Theater in Los Angeles. The other category is Outstanding Special Visual Effects in a Single Episode, which last year was awarded to Star Trek: Discovery, another contender from a storied science fiction franchise that will be trying to repeat the feat.
An interesting bellwether is the 2022 Visual Effects Society Award nominations, which place Loki and Foundation at the forefront, with both being singled out for their stunning environmental work for Lamentis and Trantor. “We were asked to create meteor effects from scratch,” states Digital Domain Visual Effects Supervisor Jean-Luc Dinsdale when discussing Lamentis and its moon Lamentis-1. “We went through multiple versions of providing the meteors, the impacts, and the dust and debris that flies around them. That was then tweaked and populated throughout the episode because the meteors are a constant threat, but not always the focus of the sequence.”
Trantor is literally 50 different cities stacked on top of each other. “Every level was built hundreds of years before the next one, so there was a lot of concepting and architectural research that went into how Trantor and its multilevel structure was designed,” explains DNEG Visual Effects Supervisor Chris Keller. “We created all of these interstitial elements between buildings, like bridges, platforms, megastructures spanning 1,000 meters, through the sky ceiling of a certain level into the next level. Then you’ll see hyperloop trains and, if you look carefully, flying vehicles. All of that had a certain logic.”
Part of the futuristic appeal of Star Trek: Discovery is the amount of attention and detail put into creating believable UI. (Image courtesy of Paramount+)
When it comes to photorealistic CG characters, Leshy-infected Eskel and Nivellen from The Witcher and Ampersand from Y: The Last Man were also nominated for VES Awards. “There has been real growth on the monster side,” explains Andrew Laws, Production Designer for The Witcher. “We work in ZBrush from the ground up to understand the movement and how the creature is going to take shape in all dimensions. It’s a much more fluid process. Once we have established a ZBrush model that has an organic shape, we’ll do some overpainting to get the mood of the creature. When it is agreed upon how that is going to work, then the 3D model goes out to visual effects and the vendors to bring in the detail and movement.”
Originally, Ampersand was going to be real but was changed to CG because Disney has a ‘no primate’ rule. “Stephen Pugh and Jesse Kawzenuk, our amazing visual effects supervisors, made it so easy for me,” recalls cinematographer Catherine Lutes. “I was constantly laughing at the puppet Amp that we had. It helped with the way that the light was falling, and that’s a good reference as well for visual effects. Stephen said that camera shouldn’t do things that a monkey wouldn’t do. If the camera is a little bit stilted or doesn’t move smoothly, that’s great because that’s what would happen if you were trying to follow an actual monkey running or moving.”
One question is whether The Book of Boba Fett can carry on the Emmy-winning ways of The Mandalorian. (Image courtesy of Lucasfilm)
Nostalgia reigns supreme as Ewan McGregor and Hayden Christensen reprise their roles from the Star Wars prequels for Obi-Wan Kenobi. (Image courtesy of Lucasfilm)
Oscar Isaac becomes a part of the MCU for the first time, along with Ethan Hawke, in the Disney+ series Moon Knight. (Image courtesy of Disney)
Raised by Wolves is seen as a better exploration of an Alien-inspired universe than the prequels directed by Ridley Scott. (Image courtesy of HBO)
There is no shortage of monsters to be found in The Witcher, such as a powerful vampire known as a Bruxa. (Image courtesy of Netflix)
The Wheel of Time features a wide gamut of visual effects from creatures, magic and world-building done in a grounded fashion. “One thing that was important for me from the beginning was that this world feel authentic and real,” explains The Wheel of Time creator, executive producer and showrunner Rafe Judkins, “even for the actors and crew, trying to go to places, as much as we can put stuff in-camera, even if we end up augmenting or enhancing it later with visual effects.”
The fact that the sixth season is the grand finale for The Expanse may see Emmy voters finally honor the body of work with a nomination. “The most challenging thing is wrapping your head around things that may not sound that difficult initially, like deorbiting maneuvers where you slow going forward to be able to drop,” notes Bret Culp, Senior Visual Effects Supervisor of The Expanse. “We’ve done a good job and, as a result, it has been made clear to us that we are favorites with a lot of people at NASA and have an open invitation to visit the JPL [Jet Propulsion Laboratory].”
The usual suspects include Lost in Space, which has been rightly lauded for being able to turn practical locations into alien worlds and making biomechanical robotic beings that are empathetic and menacing. “The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship,” remarks Lost in Space Visual Effects Supervisor Jabbar Raisani. “The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”
“The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship. The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”
—Jabbar Raisani, Visual Effects Supervisor, Lost in Space
For those looking for major robot battles, Season 3 of Lost in Space will not disappoint. (Image courtesy of Netflix)
The Battle of New York scene from 2012’s The Avengers was used as a flashback in Hawkeye, which was released as a limited series in 2021. (Image courtesy of Disney)
A welcome return to the world created by Gene Roddenberry is Patrick Stewart reprising his signature role of Jean-Luc Picard in Star Trek: Picard. (Image courtesy of Paramount+)
Returning for sophomore seasons are Star Trek: Picard and Raised by Wolves, with the former mining the fan adoration for the Starfleet officer portrayed by Patrick Stewart and the latter infusing Alien mythology into the android survival tale produced by legendary filmmaker Ridley Scott. “The hardest sequence to design, create and execute for Raised by Wolves was the outer-space sequence between Mother and the Necro serpent in Episode 208,” reveals Raised by Wolves Visual Effects Supervisor Raymond McIntyre Jr. “The flying Necro serpent is lured away from killing Campion by Mother, who leads the serpent into outer space in order to attempt to kill it. This scene was added deep into postproduction, and visual effects was tasked with designing an entire sequence from scratch as no live-action footage existed. Visual effects designs included the flying serpent, lighting design in outer space, nebulas, the planet Kepler 22B seen from this viewpoint, Mother’s new kill scream and a visualization of the failure of the EMF dome protecting this area of the planet. Execution involved creating realistic camera motion for each shot, and beauty lighting with sun flares, allowing for dirt on the lens to show up during flares, all while rendering fully CG shots.”
Making their debuts are Obi-Wan Kenobi, which has Ewan McGregor reprising his role as the legendary Jedi Master from the Star Wars prequel trilogy, and Star Trek: Strange New Worlds, an exploration of life on the USS Enterprise under the command of Captain Christopher Pike; both of them serve as prologues to the original movie and television series and have the best chances to get nominations for their respective franchises, especially if a proper balance is struck between nostalgia and canon expansion.
Then there is the matter of art imitating life, which will resonate with some viewers while cutting too close to the bone for others: the viral mayhem portrayed onscreen is even more devastating than reality and required extensive invisible effects to paint out modern-day life. In Sweet Tooth, a pandemic causes hybrid babies that are part human and part animal, with the adolescent protagonist being half deer, while Station Eleven focuses on humanity trying to rebuild society after a virus has decimated the population, and See envisions a future where blindness has reached epidemic proportions.
A favorite to win at the Emmys is Foundation, which features stellar environments throughout the AppleTV+ series. (Image courtesy of Apple TV+)
A planet gets destroyed amongst the purple haze in the Disney+ series Loki. (Image courtesy of Marvel Studios)
A surreal situation for the cast and crew of Station Eleven was shooting a story about a pandemic during one. (Image courtesy of HBO)
Animal/human hybrids populate the world of Sweet Tooth because of a deadly virus. (Image courtesy of Netflix)
Serving as dark social commentary on the growing financial divide is Squid Game, which combines elements of The Most Dangerous Game, childhood games and Dickensian debt into a ratings sensation for Netflix, and is a strong contender to upset the voting establishment. “The game spaces in Squid Game were unique and something we had never experienced before,” states Cheong Jai-hoon, Visual Effects Supervisor of Squid Game. “What we wanted to achieve from the settings of Squid Game was a fabricated yet realistic look, and it was quite challenging to balance the two conflicting characteristics. Especially in Episode 107, characters play the game of Glass Stepping Stones from high above the ground, and we had to create an environment that would immerse the viewers in the fear and tension. We put the most effort into deciding the depth from the stepping stones to the ground and the overall scale of the whole setting. We could have easily exaggerated, but we strived to find the right balance between the fake and the realistic, which was more difficult than we thought.”
Also present is the author outdone only by the Bard himself when it comes to the number of film and television adaptations of his works. Lisey’s Story was conceived by prolific horror maestro Stephen King, who has supernatural unrest intersecting with personal trauma. Comic book adaptations are not in short supply. A superhero with sharp wit and archery skills is paired with a like-minded protégé in Hawkeye, which channels Shane Black’s penchant for Christmas, action sequences and odd-ball comedic pairings. For those wanting an irreverent take on the genre, James Gunn helms the small-screen adaptation of Peacemaker, where an extremist murderer embarks on a quest for peace. Moon Knight introduces the Marvel Studios equivalent of Batman, but with an Egyptian god reincarnation twist that raises questions about the sanity of the main character.
Superman & Lois reimagines The Daily Planet colleagues as a married couple trying to balance domestic life and a rogues’ gallery of high-flying adversaries. “If Superman is fighting someone in the air where they would both be horizontal, it was much more time efficient and easier on the actors if they can be vertical,” states cinematographer Stephen Maier, who added a physical camera shake for the sake of realism. “The stunt team will often go away to design or rehearse something, do their previs that they film on their iPhones, cut it together and show it to us. We have a close collaboration with special effects in regards to atmospheric smoke and haze. The gags that they come up with help to exemplify the strength of Superman, such as him lifting a car.”
The growing demand for content, and the acceptance of visual effects as a primary storytelling tool among the potential nominees, reflect how far the production quality of television and streaming shows has come in expanding the scope available to creatives with a theatrical sensibility. It is because of this that the Primetime Emmy Awards have become as fascinating to watch as the Academy Awards, as both showcase the very best of what can be achieved when talented digital artists get to contribute to the storytelling. Undoubtedly, the eventual winner will encapsulate the highest level of creative and technical ingenuity achievable under current circumstances and will serve as a building block for what is to follow.