By TREVOR HOGG
Images courtesy of The CW.
A creative partnership and friendship forged 28 years ago at MGM has Visual Effects Supervisor John Gajdecki (Stargate: Atlantis) and Visual Effects Producer Matthew Gore (Battlestar Galactica) working together again on the second season of The CW production Superman & Lois. Co-creator and showrunner Todd Helbing (The Flash) has produced a unique spin on the superhero genre where at the heart of the story are the family struggles of Clark Kent (Tyler Hoechlin), Lois Lane (Elizabeth Tulloch) and their twin sons (Jordan Elsass, Alex Garfin), who have inherited their father’s superpowers.
“We know that the artists are good and the art is built into the people. This team manages the process so tightly that we can deliver shots without panic two days before they’re on TV.”
—John Gajdecki, Visual Effects Supervisor
A particular expression comes to mind for Gajdecki when describing how the show operates. “You hear people saying, ‘Armchair generals talk strategy, but the pros talk logistics.’ ‘Can we get the weapons to the front? Will there be food for the soldiers when they get there?’ We know that the artists are good and the art is built into the people,” Gajdecki states. “This team manages the process so tightly that we can deliver shots without panic two days before they’re on TV.”
“At the beginning of the season, we said that we wanted to take this up another level and try to get as close to feature quality as we could, knowing our limitations. But every time Superman does something, there is usually a CG component to it that adds to the schedule and budget issues. John mentioned to me that he had an in-house team on Project Blue Book, and so we proposed that. The in-house team has shined and been great in helping us to get this on the air.”
—Matthew Gore, Visual Effects Producer
Necessity led to a creative solution for Season 2 of Superman & Lois. “We have a tight network schedule, so it’s tough to try to do what we’re trying to do,” Gore notes. “At the beginning of the season, we said that we wanted to take this up another level and try to get as close to feature quality as we could, knowing our limitations. But every time Superman does something, there is usually a CG component to it that adds to the schedule and budget issues. John mentioned to me that he had an in-house team on Project Blue Book, and so we proposed that. The in-house team has shined and been great in helping us to get this on the air.”
The more shots were broken up over multiple vendors, the more often they would come back to the in-house team for the final 2D. “I see us as the final line of digital defense,” Gajdecki remarks. “Everything comes in and we do the comps, and we’ll put that photographic pass on it to make sure that the lens flares feel right, that the camera move looks like it fits between the shots before and after, and that the contrast, smoke and dust levels are matching. When Superman flies in and comes to a stop, something else has to keep going, otherwise it doesn’t feel right. With some artists it is hard to explain why that’s wrong. We came up with the term energy transfer and suddenly people went, ‘Okay, I see it now.’”
“I see us [the in-house team] as the final line of digital defense. Everything comes in and we do the comps, and we’ll put that photographic pass on it to make sure that the lens flares feel right, that the camera move looks like it fits between the shots before and after, and that the contrast, smoke and dust levels are matching. When Superman flies in and comes to a stop, something else has to keep going, otherwise it doesn’t feel right. With some artists it is hard to explain why that’s wrong. We came up with the term energy transfer and suddenly people went, ‘Okay, I see it now.’”
—John Gajdecki, Visual Effects Supervisor
Bigger ambitions have meant that the visual effects count increased from the 2,300 to 2,500 shots of Season 1 by an additional 25% for Season 2. Also expanded is the number of vendors, which can be as many as 15 depending on their availability. Among the contributors are Zoic Studios, Refuge VFX, Boxel Studios, Frame Lab Studios, Barnstorm VFX, Tribal Imaging, Od Studios and Lux VFX. “We have an honest relationship with our vendors,” Gore notes. “In my conversations with them, [I ask], ‘Are you available and can you do this work?’ They’ll tell you flat-out. It won’t be like the shop that wants to take the work and figure it out. They’ll say, ‘We don’t have these artists available for those weeks, but we have these who are available.’ We’re constantly trying to fill what somebody can do.” Even though there are assets and locations that carried over from Season 1, modifications still had to be made. “The [Kent family] farmhouse ended up in Bizarro World, so even though the farmhouse didn’t change between seasons, there was a whole new farmhouse and barn,” reveals Gajdecki. “The barn has burned down and the farmhouse is dilapidated.”
“There is stuff that we can’t talk about for the series finale that might come as a text from Todd Helbing with a photo that says, ‘We’re going to do this,’” Gore comments. “We went, ‘Okay. Cool. Let’s figure out how.’ A lot of times Todd will give us a heads up that something big is out there, so we can at least start thinking about it even if we don’t get a full outline or a script yet. At least we know conceptually this is something that we have to try to fit into the schedule.”
“Anything that we need we get. When we shoot greenscreens, the line producer says, ‘It’s Gajdecki Rules.’ They light it for us and get the exposures and interactive light that we need. Our shots look good even though we have little time because we shoot the pieces so well and production is behind us.”
—John Gajdecki, Visual Effects Supervisor
Gajdecki has a particular philosophy towards visual effects. “Every shot that we do has to look like the art department directed it and the camera operators operated the camera. We are sensitive to the inputs from the other departments to make sure that we put in the same chaos that would be in a real shot.” The cooperation is mutual, he says. “Anything that we need we get. When we shoot greenscreens, the line producer says, ‘It’s Gajdecki Rules.’ They light it for us and get the exposures and interactive light that we need. Our shots look good even though we have little time because we shoot the pieces so well and production is behind us.”
Episode 208 features a portal. Comments Gajdecki, “Upon hearing that there was going to be a portal, we got all of the reference together, numbered all of the references, put it out in front of our executives, got on the Zoom and asked, ‘Todd, is any of this close to what you’re thinking?’ He might have 50 images to look at, and we start to talk about the nature of the portal, what’s the portal doing and the behavior. We go from there and start narrowing down the focus.” Superman gets affected in a dramatic way. “There had to be some component of him getting shredded as he enters it,” Gajdecki adds. “Then we had this whole other thing where once he’s inside there, what does it look like? It goes back to who does what. Our in-house team started a look in Episode 201 where Bizarro is flashing to stuff. We need some cool flash that is supposed to sell that it’s him traveling through the portal. The in-house team was tasked with that. They came up with a look. It evolved to a certain point, and we knew that we wanted it to feel like it had depth to it. That was our other challenge. Todd wanted it to feel that he is breaching something, but we also wanted to sell that there is something behind it, and then he had to shred in there at some point. We didn’t want it to be holes underneath and you see through him. We wanted a substance there, but didn’t want it to be bone and skin because it’s CW.”
Adding to the shot count was the number of deepfakes associated with the character of Bizarro, who is revealed to be an alternate-dimension version of Superman. “We let everyone know that the AI approach was not going to be a one-size-fits-all answer to all the Bizarro shots,” Gore explains. “We had numerous discussions with Todd that if a shot was going to ‘break’ the AI, we were going to have to apply more traditional methods to get the sequences to where he wanted them to be. For example, there were a couple of shots where we needed to go all CG on Bizarro in order for him to be able to appear in the same shot as Tyler during some of the fight scenes. We massaged the cuts with Todd. Boxel Studio took on all the work that the AI couldn’t do. The in-house team came up with the look for Bizarro’s eyes and then worked with Tribal Imaging in Calgary, Wild Media Entertainment in Toronto/Vancouver, Kalos Studios, Animism Studios and Refuge VFX to make sure that no matter who took Bizarro to final, the look for Bizarro’s eyes and makeup was going to be consistent in all the comps across the various episodes. It was also a great learning experience working with this new AI toolset. And to be clear, not every scene in the Bizarro story arc was achieved using AI.”
By TREVOR HOGG
Images courtesy of Universal Studios.
Prehistoric beasts have had a constant presence in the life of David Vickery, who served as Visual Effects Supervisor on Jurassic World: Fallen Kingdom and Jurassic World: Dominion. “I’m still working on dinosaurs! Not the film ones, but for promotional media, advertisements and public relations-related stuff.” Any concerns about repeating himself were alleviated by working with a different director, crew and script. “There are always new challenges.” About 1,450 visual effects shots were created, with ILM being responsible for 1,000 shots while the rest were handled by Lola VFX and Hybride. “[Director] Colin Trevorrow had a one-on-one relationship with the storyboard artist, and those storyboards were handed over to us to create animatics. I worked closely with the previs and postvis teams [provided by Proof].” Collaborating with Production Designer Kevin Jenkins was easy, as he is a former art director at ILM. “Kevin clearly understands visual effects and worked a lot in 3D, so he could hand those designs over to us,” Vickery observes. “Because of our previous working relationship, Kevin trusted me to take incomplete designs to ILM and to continue their evolution.”
“There are probably more animatronics in Jurassic World: Dominion than in Fallen Kingdom and Jurassic World combined. … [T]he digital dinosaurs we had were an exact match for the physical animatronic dinosaurs that we had on set. It didn’t matter where they positioned the rig because the range of motion was exactly the same as the range of motion in the digital dinosaur. The goal being to get a plate that gave us practical animatronics that could move and perform on set and be digitally extended without us having to replace it.”
—David Vickery, Visual Effects Supervisor
Five weeks into principal photography for Jurassic World: Dominion, the pandemic caused a global lockdown. “Nobody knew what was going to happen when COVID-19 hit and we had to start shooting again during the pandemic,” Vickery states. “It was hard to understand how we were going to be able to communicate with each other, because suddenly we had to stay distant, were all wearing masks and couldn’t all cluster around the director’s monitors. We had tech scouts where the DP, John Schwartzman, was still isolating before he was able to come back onto set. We were deploying all sorts of new techniques such as wearing Bolero headsets rather than the usual walkie system.” Travel restrictions caused a few key scenes to be reimagined, such as Velociraptors chasing a motorcycle driven by Owen Grady (Chris Pratt) through the streets of Malta. “We had array photography and LiDAR data from location, which was then projected and manipulated rather than being a fully CG environment. Once we got back in the U.K., Chris rode a stationary motorbike that was placed on a massive rolling road that was 25 feet wide. The bike was rigged so it could weave left and right.”
“There was a cool bit of what I called ‘digital archeology.’ We had 3D SoftImage files but didn’t even have the software. However, we managed to get those NURBS files into Maya and convert them into polygon meshes. We also referenced all of Stan Winston’s photography. It’s a beautiful piece of recreation of that original T-Rex model.”
—David Vickery, Visual Effects Supervisor
A different approach was adopted for Mosasaurus attacking the crab boat. “Our editors scoured through 16 seasons of The Deadliest Catch program, correlated and created an edit from outtakes,” Vickery reveals. “The goal was to use the natural aesthetic of the footage that we had and integrate the CGI elements into it. The crab pod is something that we added in as well as the Mosasaurus and a bunch of spray.” Shifting weather patterns had to be accommodated. “When we were shooting at a lumber yard and had to put two huge Apatosaurus in the background, the whole sequence was shot in the morning in bright sunlight and no snow,” Vickery says. “Then it started snowing at lunchtime and we had to reshoot the entire scene again because the snow was going to change the look of our plates. We had a huge team of effects artists at ILM whose job was adding digital snow and dust.” An entirely new feather system was constructed in Houdini by ILM to deal with creatures such as the Pyroraptor. “It relied on the geometry of the feathers being described as a curve for the quill and a flat piece of polygonal geometry for the feather itself,” Vickery explains. “The feathers had to be able to interact with environmental elements. On set we had the special effects team with air movers, some snow and atmospheric effects, but we had to recreate the same effect in post to be able to integrate the Pyroraptor.”
“There was a moment on set where Sam Neill, Laura Dern, Jeff Goldblum, Chris Pratt, Bryce Dallas Howard and DeWanda Wise are with the biggest animatronic that I’ve ever seen in my life. Bringing all of those things together was amazing!”
—David Vickery, Visual Effects Supervisor
Each installment of the franchise introduces new dinosaurs. “The Giganotosaurus was a real dinosaur,” Vickery remarks. “You look for the holotype, which is often a partial specimen, so scientific experts have to guess the rest. Some of the dinosaurs look so bizarre, like the Therizinosaurus, which is this huge theropod that is covered in feathers and has one-meter-long baseball bat-like claws on the edge of its fingers.” A massive animatronic was built for the Giganotosaurus. “There are probably more animatronics in Jurassic World: Dominion than in Fallen Kingdom and Jurassic World combined.” Jenkins collaborated with Trevorrow and paleontologist consultant Steve Brusatte to develop concepts that were turned into clay maquettes and then scanned. The scans were given to ILM, which made sure the model was anatomically correct before handing it off to John Nolan, head of the creature effects team, for 3D printing. “This meant that the digital dinosaurs we had were an exact match for the physical animatronic dinosaurs that we had on set,” Vickery notes. The process helped to minimize the amount of CG. “It didn’t matter where they positioned the rig because the range of motion was exactly the same as the range of motion in the digital dinosaur. The goal being to get a plate that gave us practical animatronics that could move and perform on set and be digitally extended without us having to replace it.”
“From a simulation perspective, [with feathers] you’re dealing with a huge amount of geometry that is deforming and moving on a frame-by-frame basis, and is reacting to external forces like the wind, but also to the way that the creature is moving. Creatively, ever since Jurassic Park, fans and paleontologists have been crying out to see feathers on dinosaurs, and we will deliver this time.”
—David Vickery, Visual Effects Supervisor
Restored to her former glory is the T-Rex from Jurassic Park. “There was a cool bit of what I called ‘digital archeology,’” Vickery recalls. “We had 3D SoftImage files but didn’t even have the software. However, we managed to get those NURBS files into Maya and convert them into polygon meshes. We also referenced all of Stan Winston’s photography. It’s a beautiful piece of recreation of that original T-Rex model.”
Feathers proved to be the biggest creative and technical challenge. “From a simulation perspective,” Vickery comments, “you’re dealing with a huge amount of geometry that is deforming and moving on a frame-by-frame basis, and is reacting to external forces like the wind, but also to the way that the creature is moving. Creatively, ever since Jurassic Park, fans and paleontologists have been crying out to see feathers on dinosaurs, and we will deliver this time.”
Seeing the Jurassic Park and Jurassic World franchises come together was a career highlight for Vickery. “There was a moment on set where Sam Neill, Laura Dern, Jeff Goldblum, Chris Pratt, Bryce Dallas Howard and DeWanda Wise are with the biggest animatronic that I’ve ever seen in my life. Bringing all of those things together was amazing!”
By TREVOR HOGG
Images courtesy of Apple, Inc.
Going beyond the Hollywood portrayals is the Apple TV+ natural history documentary series Prehistoric Planet, which travels back 66 million years to the Late Cretaceous period when dinosaurs reigned supreme. Serving as executive producers are filmmaker Jon Favreau (Iron Man) and Mike Gunton, Creative Director, Factual at BBC Studios. Directing the five episodes are Adam Valdez and Andy Jones, who worked as visual effects supervisor and animation supervisor, respectively, on The Lion King and The Jungle Book for Favreau. Collaborating closely together were digital artists from MPC and cinematographers from BBC’s Natural History Unit.
Progressing from The Lion King and The Jungle Book was not a huge leap for Jones. “Wildlife, natural history and what the BBC has been doing for years was our goal for a lot of the shots. In Jon Favreau’s mind, he always wanted it to feel as naturalistic and realistic as possible,” Jones notes. Nuances have to be incorporated into the animation to believably convey the emotional state of the creature. Explains Jones, “You want to lean away from anthropomorphism as much as possible because right away people will say, ‘Oh, we’re watching animation.’ Mammals share such a common bond with us, even elephants and giraffes have this look of concern for their kids, and we try to use some of that sparingly. We looked at larger lizards and birds a lot. The way birds care for their young is different. There is not this nuzzling.”
“The whole point is when you look at natural animals, they do things that are so weird and wonderful, so why not just portray that because it’s fascinating on its own? While the BBC Natural History Unit is obsessed with scientific accuracy, they’re also storytellers and know how to make things compelling; that was a real balancing act.”
—Adam Valdez, Director
Success is found in the subtle details. “You could say that the work we do is like a thousand tiny traces on a thousand tiny items, and if it all stacks up correctly, you get a win,” Valdez remarks. “Sometimes you don’t know what those things are until you’re in the midst of it. One of the things that we’ve learned over the last couple of shows was that human audiences will project a lot onto characters for you. You don’t have to lean too hard in any visual storytelling. That’s the magic of the medium. Sometimes it’s a moment of stillness that could convey the idea that the animal might be thinking or feeling the event that just happened.” The events had to fit within the natural order of things. Valdez adds, “The whole point is when you look at natural animals, they do things that are so weird and wonderful, so why not just portray that because it’s fascinating on its own? While the BBC Natural History Unit is obsessed with scientific accuracy, they’re also storytellers and know how to make things compelling; that was a real balancing act.”
As interesting as creating realistic dinosaurs was the process of making the show. “The trick was how do you make something that feels like you went and got the footage hiding out for eight weeks or hiding the track cameras all over and bringing the footage back,” Valdez states. “It’s a painstaking editorial process. What you learn is it’s not like BBC Natural History Unit [to] just go somewhere and film randomly. They know what’s interesting and what the dynamics are at a certain time and place. The Natural History Unit brought us deeply researched stories and our role was to go, ‘Okay, you have a notion, but what we’re going to do is make an animatic that is so tight that you know exactly where to go to get shot by shot.’” Shots were determined by the reality of documentary filmmaking. Valdez comments, “If you shot a hunt like a movie with eight camera positions, that’s not how they get those once-in-a-lifetime moments. They get them rarely [with one camera]. It was our job to make an animatic that felt 100% like they had shot it, and then give them a shopping list: go get these backgrounds, and precisely match the lens and how the camera is moving.”
“We went through quite a bit making the T-Rex because we definitely wanted to nail our version of what we really think the T-Rex is today. It was the first asset that we built, to show off what the series would be. Him and the baby T-Rex. As much as we know about them in terms of fur, coloration, and the idea of what these babies would have been like, we needed our Baby Yoda!”
—Andy Jones, Director
Biomes determined the creatures, not the other way around, with the episodes titled ‘Coasts,’ ‘Deserts,’ ‘Freshwater,’ ‘Ice Worlds’ and ‘Forests.’ “That gives you one point of view on the nature of life and the planet as a working ecosystem together,” Valdez observes. “Animals are our way in, whether it’s chimps, lions or dinosaurs. That’s why you see it framed the way that you do. Paul Stewart was in charge of ‘Coasts’ as the writer, producer and natural history partner. All of those particular stories have to do with the fact that where the land and sea meet you have a lot of dynamics. You have a lot of biodiversity, food sources, territory and raising of young. That’s the framing concept for the whole show.” The final sequence in ‘Coasts’ deals with the birth of a baby Tuarangisaurus. “For a Tuarangisaurus to make a baby that’s 12 feet long and 25% of the body mass of the mother is a massive investment, so they’re going to raise one at a time,” Valdez explains. “Then it turns out that the family shows some investment around the young as well. They found fossil evidence that backs all of this up. You find evidence of these sea creatures in the sands and earth where there was previously the Western Interior Seaway, a huge stretch of water that divided North America and had huge coastlands. The show hints at these ideas all the way through.”
“If you shot a hunt like a movie with eight camera positions, that’s not how they get those once-in-a-lifetime moments. They get them rarely [with one camera]. It was our job to make an animatic that felt 100% like they had shot it, and then give them a shopping list: go get these backgrounds, and precisely match the lens and how the camera is moving.”
—Adam Valdez, Director
When it comes to proper pronunciation of dinosaur names, Jones laughs. “It’s never set in stone how to pronounce it until Sir David Attenborough says it! The Deinocheirus was one of the fun dinosaurs in the series for me because it’s such a weird-looking animal with a big duck bill and massive claws. This is one where scientists would say, ‘He had these massive claws that probably could be used to defend himself in some sort of battle with males.’ But what else could these claws be used for? Let’s tell a story that’s not about fighting. We know that he probably ate seagrass or some sort of vegetation. Those claws would be used to rip up and dig up the grasses and roots. Dealing with all of the flies is another thing. His claws could scratch a little bit, but his arms are so small that he can’t reach his whole body. The Deinocheirus spots a tree and starts using it as a scratching post. For the ending of the episode, we wanted to tell the story of what happens when you eat so much food; his bowels get loose, he fertilizes the entire place and moves on. The Deinocheirus is a great character!”
“The Deinocheirus was one of the fun dinosaurs in the series for me because it’s such a weird-looking animal with a big duck bill and massive claws. This is one where scientists would say, ‘He had these massive claws that probably could be used to defend himself in some sort of battle with males.’ But what else could these claws be used for? Let’s tell a story that’s not about fighting. We know that he probably ate seagrass or some sort of vegetation. Those claws would be used to rip up and dig up the grasses and roots. … For the ending of the episode, we wanted to tell the story of what happens when you eat so much food; his bowels get loose, he fertilizes the entire place and moves on. The Deinocheirus is a great character!”
—Andy Jones, Director
‘Ice Worlds’ is a serious episode that explores family dynamics and the relationship between predator and prey. “It’s similar to ‘Deserts’ in the sense that you have these extreme environments, and it requires animals to go to greater lengths to survive,” Valdez observes. “You have this match that creates this endless loop of predation, and [the question of] how the prey species survives constantly being hunted. You have to figure it out as a family group. The Pachyrhinosaurus are rhino-like creatures that resemble Triceratops. They’re huge and powerful. The Nanuqsaurus don’t stand a chance attacking the group. But they are significant predators that are also big. What happens in the winter is that predators will work together as a team. You have a team of predators and a family group. It becomes a war of attrition, a siege. If we hound this family enough, eventually they’ll make a mistake, and we’ll take advantage of that mistake. It’s heavy. It’s like a standoff. You have to sit there and see who will last longer through the storm and winter that is around them.”
Going through the most iterations was an iconic dinosaur. “We went through quite a bit making the T-Rex because we definitely wanted to nail our version of what we really think the T-Rex is today,” Jones reveals. “It was the first asset that we built, to show off what the series would be. Him and the baby T-Rex. As much as we know about them in terms of fur, coloration, and the idea of what these babies would have been like, we needed our Baby Yoda!”
For Jones, figuring out the motion of the creatures was a major task. “When I first saw the design of the giant pterosaur, I thought there was no way that thing could fly,” he explains. “It’s the size of a giraffe. Figuring that out and having people watch it and believe it is cool. Shooting at Palouse Falls was so much fun. We knew the environment when we prevised it, so we had a good layout. Actually getting the shots was really challenging because we were hanging people on ropes to get the cameras in the positions that were needed. It was a fun sequence all around.”
Cinzia Angelini grew up in the 1970s in Milan, Italy, inspired by Japanese cartoons, the films of Hayao Miyazaki and the Disney classics, which she studied frame by frame. A renowned director, animator and Head of Story at Cinesite Studios, Cinzia has worked for major animation studios in Europe and the U.S. for more than 25 years. Her body of work includes Balto, The Prince of Egypt, The Road to El Dorado, Spider-Man 2, Minions, Despicable Me 3 and The Grinch. Cinzia wrote and directed the acclaimed CG animated short film Mila, a war story that centers on the plight of civilian children, and is currently directing HITPIG, an Aniventure animated feature produced at Cinesite.
Creating Mila was a life-changing experience, inspired by the stories my mother told me about how she felt as a child during the bombings of Trento in World War II. I wanted to use the medium I love, animation, and shine a light on the terrible realities for millions of children and families around the world who are caught in the crossfire of war. Audiences have embraced Mila’s messages of hope, imagination and perseverance and I’m so encouraged that there is a growing appetite for honest and authentic stories.
I fully embrace the power of animation. Hollywood might applaud socially relevant features, but it still views animation as essentially little more than “entertainment.” Yet animation has enormous potential to effect fundamental change in how we approach each other and how we deal with societal challenges. I believe that stories told through the magic of animation can move people and influence future generations like nothing else can.
If Mila can change even one decision-maker’s perspective on the consequences of war, then all our efforts were well worth it.
The Mila theme is resonating with people around the world. Our team of 350 artists from 35 countries gave their time and talent, making this the largest independent virtual studio collaboration ever created. A surprising number of those volunteers have their own personal experience with war, or war in their family histories, which also moved them to want to be a part of this project. The strong theme of the film ended up being the secret of its success. Mila is more than a film; it’s a story within a story.
Inclusion and diversity were key elements in assembling the Mila team, and I’m proud that a significant percentage of our leadership positions were held by women.
Finding ways to harness the talents of so many artists and filmmakers versed in different styles of work, cultures and languages made the final film that much richer. The entire process has been challenging and incredibly rewarding. It helped me improve as an artist, influenced how I collaborate with colleagues and showed me the strength of our global interconnectivity in new and inspiring ways.
I’ve always embraced risk as an opportunity to innovate, grow and become stronger. Risk takers challenge the norm and push the boundaries of their professions. I keep leaning into the unknown, because even if things don’t work out as planned, you learn so much from the journey.
By CHRIS McGOWAN
Images courtesy of DNEG and Sony Pictures Entertainment.
Ferdinand Magellan was a Portuguese explorer who led a Spanish expedition of five ships in 1519 to seek a western route to the Moluccas (Spice Islands). Magellan perished along the way and only one ship made it back, in 1522, but it was the first craft to circumnavigate the world. Flash forward five hundred years, and Ruben Fleischer’s Uncharted spins a fictional tale about a present-day search for two lost treasure-laden ships from Magellan’s fleet. The Sony Pictures movie is a prequel of sorts to the tremendously popular Uncharted video game series, developed by Naughty Dog and published by Sony Interactive Entertainment. The film’s treasure hunters included Nathan Drake (Tom Holland) and Victor Sullivan (Mark Wahlberg), along with Chloe Frazer (Sophia Ali) and Santiago Moncada (Antonio Banderas).
The On-Set and Overall VFX Supervisor was Chas Jarrett. DNEG was the primary VFX vendor, completing 739 shots over 23 sequences, with teams led by Visual Effects Supervisor Sebastian von Overheidt (DNEG Vancouver) and Visual Effects Supervisor Benoit de Longlee (DNEG Montreal). Other contributing VFX vendors included The Third Floor, RISE Visual Effects Studios, Soho VFX and Savage Visual Effects.
DNEG was tasked with handling various jaw-dropping sequences, including a 90-second shot in which Nate and Chloe – along with cargo crates and a Mercedes Gullwing car – fall out of a C-17 cargo plane while flying over the South China Sea. Von Overheidt considered the shot “a fun challenge. We called this sequence ‘the oner’ because it’s constructed as one continuous 90-second shot.”
“[For the falling out of a C-17 cargo plane scene] we had several practical elements with the actors hanging on wires and interacting with a stand-in car prop. We combined the practical elements with long stretches of full-CG moments. Some sections required either close-up digi-doubles to hold up, or even a transition between plate and digi-double right in camera with nowhere to hide. Mix that with the disorienting camera, and you have quite a complex puzzle to solve.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Von Overheidt adds, “We had several practical elements with the actors hanging on wires and interacting with a stand-in car prop. We combined the practical elements with long stretches of full-CG moments. Some sections required either close-up digi-doubles to hold up, or even a transition between plate and digi-double right in camera with nowhere to hide. Mix that with the disorienting camera, and you have quite a complex puzzle to solve.”
To begin creating the sequence, von Overheidt reveals, “We received LiDAR scans and HDR photography of each individual cargo crate and all the other props like the Mercedes Gullwing, as well as a full scan of the C-17 interior, which was built as a set. From there we built the entire daisy-chain of crates and the C-17 interior. At the same time, we also worked on a fully digital version of the Gullwing and the C-17 exterior model with some custom modifications compared to a standard model. Ruben had asked us to create a billionaire’s version of the well-known plane.”
“[Tom Holland] indeed got thrown around quite a bit. All the crates on the exterior were mounted on top of KUKA robot arms so that they could move on a full gimbal in a programmed sequence. They were also modified with extra padding or using softer materials, so that Tom Holland and the stuntmen could jump in between them, holding onto the netting of crates. It gave a great realistic-looking reaction for most of the shots, so we got away with a lot of head replacements on the shots with Holland’s stuntman. In quite a few shots we still went for a full digital-double solution because we wanted the performance even more violent or the camera to be more dynamic than what was shot.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
Once camera, object and body tracking were done, Layout Supervisor Kristin Pratt and DFX Supervisor Gregor Lakner and their teams blocked the entire sequence out, “which is also the crucial step where we’d analyze each shot and figure out what CG extensions need to be added,” von Overheidt says. This also involved finding solutions for any discrepancies between the 3D-scanned crates and the ones used on set. “Our job was to piece this all together while finding the best transitions into CG and amp up the action and movement.” There were also some entirely CG shots. He adds, “The environment was stitched based on multi-camera array footage shot at around 7,500 feet and then augmented to look a bit more desolate in terms of islands. All the clouds and wind FX and debris are CG.”
Lighting in the open sky was a challenge. “The plates were shot on a soundstage with stationary lighting, but our characters fall tumbling through an environment with only one light-source, the sun,” von Overheidt explains. “DFX Supervisor Daniel Elophe and the team broke this mammoth puzzle down into manageable sub-sections which were assembled to one long shot in compositing at the end.” The team around Lighting Supervisors Sonny Sy and Chris Rickard and Compositing Supervisor Francesco Dell’Anna kept track of changing light directions and found creative solutions to make it all work with the plates, while allowing for a free choreography of the camera and the animation, done by Layout Lead Steve Guo and Senior Animator Patrick Kalyn. “The result works really well and we ended up getting the best of both,” von Overheidt says, “seeing the sun rotating on high-action free-fall moments while coming back into a more character-focused lighting when there is dialogue and we’re locked into practical photography.”
Tom Holland got his share of shaking and stirring thanks to a robot arm. Von Overheidt comments, “He indeed got thrown around quite a bit. All the crates on the exterior were mounted on top of KUKA robot arms so that they could move on a full gimbal in a programmed sequence. They were also modified with extra padding or using softer materials, so that Tom and the stuntmen could jump in between them, holding onto the netting of crates.” They were thrown around randomly by the robot arms to get the sense of snaking of the daisy-chain. Von Overheidt adds, “It gave a great realistic-looking reaction for most of the shots, so we got away with a lot of head replacements on the shots with Holland’s stuntman. In quite a few shots we still went for a full digital-double solution because we wanted the performance even more violent or the camera to be more dynamic than what was shot.”
The scenes with Magellan’s ships (the Trinidad and the Concepción) and the huge helicopters carrying them required extensive VFX, but the sequence wasn’t created entirely in CG. Von Overheidt notes, “There was actually a lot of great footage shot on big sets. This sequence really had everything in it. The scenes were shot on several stages resembling different parts of the ships, which we were extending with CG. The helicopters we had designed are based on some classic cargo helicopters, but even beefier.”
In the case of the Concepción, the set was split into four different stages – the stern, the main deck including helm, the bow and the crow’s nest with a partial mast, according to von Overheidt. “Our CG Supervisor Ummi Gudjonsson and Build Supervisors Chris Cook and Rohan Vaz started by assembling the various on-set stages for which we had received LiDAR scans, piecing them together, lining them up to each other and combining them with the overall design of the ship.”
Von Overheidt continues, “The same process went into the Trinidad and any other set, like the helicopters. Throughout the boat battle sequence we picked about a dozen hero shots based on the criteria [of] which ones would reveal the most problems, and we would constantly check whether our model of the ships lined up to those shots. The tricky part is that practical sets aren’t perfect. They may not be symmetrical, or the same section may have different dimensions across the different sets. In addition to that initial step, it then requires careful planning and a lot of work to get to the detail level of a good practical set. The ships were highly detailed, and complex assets were built for every form of action, including total destruction. Both ships were fully-rigged sailing ships with ropes, cloth banners, sails, flexing masts and yardarms, flapping doors, all the cannons, etc. [There were] a lot of moving parts which helped to bring across some of the crazy movements and crashes the ships would do in the sequence.”
“There was actually a lot of great footage shot on big sets [for the scenes with Magellan’s ships and helicopters carrying them.] This sequence really had everything in it. The scenes were shot on several stages resembling different parts of the ships, which we were extending with CG. The helicopters we had designed are based on some classic cargo helicopters, but even beefier.”
—Sebastian von Overheidt, Visual Effects Supervisor
Between the two ships and helicopters, around 20 mercenaries, Braddock (Tati Gabrielle), Hugo (Pingi Moli) and the Scotsman (Steven Waddington) all become part of different fights which were augmented with head replacements or full digi-doubles. Von Overheidt explains, “The journey of the flight was across [some] 330 shots, so we built a massive environment that we used to block out the sequence. Ruben wanted an action-packed sequence. Especially, the shots where we see the boats and helicopters moving through the Karst landscape had to be dynamic and exciting, and we wanted to feel their weight and impact on the helicopter’s flight dynamics.”
Von Overheidt adds, “Now, real-world physics obviously weren’t a priority on this sequence to begin with, but we still aimed towards that feel for a plausible animation and also staging the camera in a way that it would guide the audience through the disorienting action and make the ships look massive at the same time. We basically had to stick to real-world physics while also constantly breaking it at the same time. The entire sequence was a close collaboration between our layout team and the animation team led by Animation Supervisor Jason Fittipaldi and Animation Lead Konstantin Hubmann, and [On-Set and Overall] VFX Supervisor Chas Jarrett, whose own roots are in animation.”
“Generally speaking, working with big practical sets is great for visual effects because you have real references to match to – the real material, the real lighting and how the camera captures it. Even if you end up replacing parts of it anyway, it’s a great start. Actors feel more comfortable interacting with a real environment as well. The trade-off is that matching into complex practical sets can be quite the puzzle for visual effects.”
—Sebastian von Overheidt, Visual Effects Supervisor, DNEG
“For the South China Sea environment, we had received extensive footage from a practical shoot in Thailand. Film Production mounted a multi-camera array under a helicopter and flew through the landscape, also shooting at different lighting conditions during the day,” von Overheidt says. The original plan was to use this material as practical backgrounds and only extend plates or create specific shots full CG. “As we were creating a digital version of the environment, we soon realized that our team, led by Environment Supervisor Gianluca Pizzaia and Environment Lead Matt Ivanov, was able to create one big environment which would cover the entire flight path throughout the sequence. And straight out of rendering it looked pretty much photorealistic. We presented our results to Ruben, who got excited about it. Everyone was confident that this would be the way to do it. It gave us and Ruben so much more freedom to find great cameras and shot composition that we decided to go full CG on the environment all the way through.”
Von Overheidt continues, “It allowed us to move the camera anywhere we wanted and fully customize the environment to our needs. It made the whole process a lot more efficient as well. Throughout the third act, there is also a progression in lighting from afternoon to sunset. Compositing Supervisor Kunal Chindarkar and Compositing Lead Ben Outerbridge made sure we transitioned seamlessly into these different lighting conditions and moods as we reached the final shot of the Concepción sinking and Nate and Sully flying into the sunset.”
Asked if the filmmakers let the look of the Uncharted video games influence the visuals of the movie, von Overheidt comments, “Not from a visual effects perspective, no. I can’t speak for the Production Art Department though. I used to game quite a bit but never played Uncharted before, so when I joined the show, it was actually the first time I checked it out, mainly to understand the characters and some of the main levels. My main influence for creating images comes from photography and graphic design. I get most of my inspiration from actually being outdoors. We had some great artwork from the production team and the Thailand footage to look at. We would also often look at references for all kinds of scenes, like crazy skydiving stunts or video footage of heavy-lifting helicopters.”
Looking at the melding of the big-scale practical and digital in Uncharted, von Overheidt concludes, “Generally speaking, working with big practical sets is great for visual effects because you have real references to match to – the real material, the real lighting and how the camera captures it. Even if you end up replacing parts of it anyway, it’s a great start. Actors feel more comfortable interacting with a real environment as well. The trade-off is that matching into complex practical sets can be quite the puzzle for visual effects.”
By TREVOR HOGG
Images courtesy of Netflix.
There are sinister underpinnings to human nature which are mined narratively to create stories filled with destructive conflict and satirical humor for the Emmy-winning Netflix animated anthology Love, Death + Robots, executive produced by filmmakers David Fincher (The Social Network) and Tim Miller (Terminator: Dark Fate). The nine shorts curated for Love, Death + Robots Vol. 3 are examples of drastically different visual styles from the likes of Patrick Osborne, David Fincher, Emily Dean, Robert Bisi and Andy Lyon, Jennifer Yuh Nelson, Tim Miller, Carlos Stevens, Jerome Chen and Alberto Mielgo, with animation provided by Pinkman.tv, Sony Pictures Imageworks, Axis Studios, Blur Studio, Titmouse, BUCK, Polygon Pictures and Blow Studio.
“When 3D animation came out, it allowed us to do certain things that we couldn’t do in 2D animation. The same with a lot of the game engines. You are able to express an entire world, adjust things in real-time and change the light if you want. It’s not baked into things like it is usually.”
—Jennifer Yuh Nelson, Supervising Director
Returning as the supervising director from her previous outing on Vol. 2 is Jennifer Yuh Nelson (Kung Fu Panda 2 & 3), who worked with a mixture of new and veteran collaborators as well as making her own contribution with the muscle-flexing action adventure “Kill Team Kill.” Notable first-time participants are David Fincher, making his animation directorial debut with the monstrous seafaring tale “Bad Travelling,” and Patrick Osborne, helming the macabre-funny, post-apocalyptic sequel “Three Robots: Exit Strategies.” Returnees include visual effects veteran Jerome Chen helming “In Vaulted Halls Entombed,” where a special forces team encounters an ancient evil, and Oscar-winner Alberto Mielgo envisioning a fatal romance between a deaf Renaissance knight and a lethal siren in “Jibaro.” Inventive animation styles are found in “Night of the Mini Dead,” which uses tilt-shift photography to make everything look tiny, in the Moebius- and psychedelic-flavored “The Very Pulse of the Machine,” and in the painterly impressionism of “Jibaro.”
As to whether real-time technology and game engines are impacting the type of stories being told, Nelson does not believe this to be the case. “I don’t know if it’s types of stories that it has affected,” she explains. “It’s the look and how much you can deal with certain levels of complexity. When 3D animation came out, it allowed us to do certain things that we couldn’t do in 2D animation. The same with a lot of the game engines. You are able to express an entire world, adjust things in real-time and change the light if you want. It’s not baked into things like it is usually.” The impact of game engines like Unreal and Unity cannot be ignored. “I’m so old that I was on the cusp of the desktop revolution, and it used to be when I started in the business you had to have a lot of money to be able to do 3D animation,” recalls Miller. “Then desktop technology and software came along and it democratized the process, which allowed us to start Blur by borrowing $20,000. I thought that was amazing, but game engine technology is going to be a paradigm shift again. You don’t need heavy machines to render. Even lots of cheap PCs are still expensive and need some technical infrastructure. Now guys can do minutes-long shorts in their basements at home and you can see it on the web. You see a lot of interesting artists doing great things by themselves or with small teams. Game engine technology is super freaking exciting. I feel like I’ve been waiting for it a while, but now it’s here.”
“[G]ame engine technology is going to be a paradigm shift again. You don’t need heavy machines to render. Even lots of cheap PCs are still expensive and need some technical infrastructure. Now guys can do minutes-long shorts in their basements at home and you can see it on the web. You see a lot of interesting artists doing great things by themselves or with small teams. Game engine technology is super freaking exciting. I feel like I’ve been waiting for it a while, but now it’s here.”
—Tim Miller, Director
When it comes to her own short, where a squad of soldiers in Afghanistan encounter a CIA experiment gone horribly wrong, Nelson decided to channel a fondness for a particular cinematic era that made action icons out of Arnold Schwarzenegger, Sylvester Stallone, Bruce Willis and Jean-Claude Van Damme. “For ‘Kill Team Kill,’” she says, “my inspiration was cartoons from the 1990s and action movies from that time, like Predator, Commando, and G.I. Joe cartoons. They were good fun at the time, and the story by Justin Coates had that feel to it, so that’s where that came from.” Handling the animation was the studio responsible for The Boys Presents: Diabolical and The Legend of Vox Machina. “I got to work with Titmouse, and they’re an amazing studio with a wide variety of different styles. I got to work with Antonio Canobbio and Benjy Brooke who helped to find this look. It’s a 2D style, so it has to be animatable. The character designs themselves are covered with veins and packets of ammo which are hard to animate, but we got the benefits of amazing animators from all over the world, and you can see that level of expertise in it.”
“[For ‘Jibaro’] we used real scans of armor that you might see in museums. When you see the armor, it feels almost unbelievable that you can fit a person inside. The cool thing about this is we don’t actually need to fit a person inside because these aren’t real characters. You can just have their neck. We were using real Renaissance armor. We were redesigning it a little bit, but the cool thing is that we’re seeing something that is historically accurate. I feel that is extremely new and fresh.”
—Alberto Mielgo, Director
“Swarm” was adapted by Miller from a short story by Bruce Sterling, and revolves around human factions with conflicting views as to whether advancement should be achieved through genetic manipulation or cybernetic enhancement and technology. Adding further complications is the discovery of an insectoid race that may be of a higher intelligence than humanity. “We have a set of eight-sided dice and roll them!” laughs Miller when describing how he decides upon the animation style, character design and world-building. “It was interesting that we had this short which is almost entirely in zero-G, but we were still going to do some motion capture for that,” notes Miller. “Then the pandemic hit and motion capture was not an option anymore. I didn’t want to get caught in the uncanny valley either, so I decided to stylize the characters to a certain degree, which helps the story not be quite as horrible as it would be otherwise. I loved making the show. It was a challenge to think about the physics of how people move through zero-G, and anything with lots of creatures is a good time. I get a lot of vicarious enjoyment from knowing the animators and creature designers are going to enjoy the process of making this.”
“[For ‘Kill Team Kill’] it’s a 2D style, so it has to be animatable. The character designs themselves are covered with veins and packets of ammo which are hard to animate, but we got the benefits of amazing animators from all over the world, and you can see that level of expertise in it.”
—Jennifer Yuh Nelson, Supervising Director
Self-taught as an artist, Mielgo (The Windshield Wiper) utilizes the principles of painting, in particular lighting, when producing animated shorts such as “Jibaro.” “I create a simple image by removing what is not necessary for the eye to understand,” he says. Themes rather than the premise influence the animation style. “In terms of the girl, I wanted her to be a walking treasure, and in order to do that I was doing research on folklore jewelry from Northern Africa, China, India and Pakistan. In the case of the guys, I prefer the Renaissance rather than the Medieval in terms of design. We did something interesting, which is we used real scans of armor that you might see in museums. When you see the armor, it feels almost unbelievable that you can fit a person inside. The cool thing about this is we don’t actually need to fit a person inside because these aren’t real characters. You can just have their neck. We were using real Renaissance armor. We were redesigning it a little bit, but the cool thing is that we’re seeing something that is historically accurate. I feel that is extremely new and fresh.”
Sheena Duggal is an acclaimed visual effects supervisor and artist whose work has shaped numerous studio tent-pole and Academy Award-nominated productions. Most recently, Duggal was Visual Effects Supervisor on the box office blockbuster Venom: Let There Be Carnage and was a BAFTA nominee this year for Best Special Effects for her work on the Oscars VFX-shortlisted Ghostbusters: Afterlife. Sheena is the only woman VFX Supervisor to earn that level of recognition from the Academy this awards season. She was the first woman to be honored with the VES Award for Creative Excellence, bestowed in 2020.
The lack of female visual effects supervisors is definitely the result of a lack of opportunity and unconscious bias – and that is fixable. Earlier in my career, I was told that the goal was to promote the male supervisors, and watched as guys who had worked under my VFX supervision were promoted up the ranks and given opportunities on large VFX shows. It never occurred to me that my gender would hold me back, and I was always surprised when it did. I am a strong believer in diversity and inclusion, not just because I am a bi-racial woman, but because I believe that greater diversity leads to freer thinking and greater creativity.
Good girls get nowhere. Be disobedient, be persistent, never take disrespect thrown your way… be smart and graceful and remember you are equal.
Never stop fighting for the right to be the best you can be. Women spend too much time being congenial, and it’s time for us to speak up about our achievements and the opportunities we’ve created for ourselves. We’re talented, we’re here, and we’re ready.
Even if women break through the glass ceiling, they end up on a glass cliff where they can be pushed off, because there is no cadre of women to cheerlead in support that is equivalent to a “boys’ club.” We need to be building an industry culture and a structure that supports women in the field and sets them up for success. I take my opportunity to be a role model and a voice for other women seriously; I want to not just open doors, but bust through them.
Change can happen fast if everyone is motivated. We need to do it now.
In having this inevitable conversation, we can’t exclude men or accuse them if we want to create the change we want to see. We must do it together. Women are almost always expected to solve the systemic problems we did not create or perpetuate in a patriarchal culture. A lot of well-meaning people lack self-awareness or fail to understand their role in enabling sexism or great inequities. If meritocracy fails to work, then uplifting women needs to be a conscious choice. I would ask all men in VFX to go through implicit bias training and be active problem-solvers and advocates for women, because people still give men’s voices more credibility. It takes a lot of people to create success for an outlier.
By TREVOR HOGG
Images courtesy of Sky and HBO.
Upon reading the synopsis for the HBO and Sky horror comedy The Baby, one gets a distinct impression that anxiety about motherhood drives the narrative created by Lucy Gayme and Siân Robins-Grace. The summary states, “Controlling, manipulative and with violent powers, the baby twists Natasha’s life into a horror show. Where does it come from? What does it want? And what lengths will Natasha have to go to in order to get her life back? She doesn’t want a baby. The baby wants her.” When this observation gets mentioned to VFX Producer Anne Akande and Visual Effects Supervisor Owen Braekke-Carroll, both of them laugh in agreement. “It’s certainly a dissection of many angles of motherhood!” states Braekke-Carroll. “There is symbolism, and scenes that absolutely tap into practical and real fears of breastfeeding and abandonment. We were tasked with bringing some of the juicier parts of the script into the visual medium. It’s quite literal in many ways.”
“We weren’t pushing [visual effects] beyond anything because the show was one that we knew early on was grounded in reality. The baby is a baby. There are a lot of misconceptions about what this baby is and what his agenda is. There are a few moments where we have some heightened reality and he is still a baby, but a bit different.”
—Anne Akande, VFX Producer
Gayme and Robins-Grace had a clear and descriptive vision of the reality and tone of the series. “Siân and Lucy were keen from the outset on getting a realistic and grounded tone throughout the series, and this influenced how we then approached the body of work,” remarks Akande. “We were involved early in the process to ensure that the shoot methodology would be effective and give visual effects enough material to pull off some of the more dramatic scenes. Beyond giving that guidance, they were also collaborative, open and willing to take feedback on the best way forward via visual effects to hit each story point.” The visual effects work for the eight episodes consisted of just under 650 shots by Framestore, Jellyfish Pictures, Freefolk and Goldcrest. “We weren’t pushing it beyond anything because the show was one that we knew early on was grounded in reality,” notes Akande. “The baby is a baby. There are a lot of misconceptions about what this baby is and what his agenda is. There are a few moments where we have some heightened reality and he is still a baby, but a bit different.”
Nicole Kassell helmed the pilot, Faraz was responsible for three episodes, and Stacey Gregg and Ella Jones each directed two episodes. “It’s always interesting working with different directors across a series, and in this case they did all have different approaches to handling the visual effects,” states Akande. “Nicole Kassell had a lot of experience in visual effects and had a hands-on approach from storyboards, concept, previs through to execution. Others brought their comedy experience to help drive the storytelling beats, and there was also some experimentation using different shooting techniques and machinery on set. All of this brought an interesting mix that fused with the tone of the show, creating a unique place for The Baby in the comedy/horror genre.” Storyboarding and concept art were produced for all of the key creative beats.
“We definitely knew that we needed a digital asset. By casting twins, we were able to double our shooting hours. The babies absorbed the nature of the set quickly, and we saw them grow up over the course of six months of shooting with them. That left us with a strange, hybridized methodology over time, whether it be face replacements from plates with a CG body, a stand-in prosthetic baby with a head replacement being pushed around in a pram, or one digital arm, plate head and a prosthetic body. There was also an army of stand-in babies.”
—Owen Braekke-Carroll, Visual Effects Supervisor
“To compensate for [the unpredictability of the babies] on set, we ended up treating almost every frame with the baby cast in it as a potential visual effects shot. This included taking large volumes of data and notation for most scenes and essentially treating them as a CG creature in the scene.”
—Anne Akande, VFX Producer
“Concept art for key moments, such as our Demon Baby nightmare scene, was developed by the Framestore art department and was crucial in helping settle the creative vision as much as possible before shot execution,” remarks Braekke-Carroll. “From the storyboards, some key shots were turned into previs.” Scripts for the eight episodes were broken down to determine what shots required visual effects. “We worked closely with the art department throughout the shoot to help find the right combination of set, location and bluescreen,” explains Akande. “A key location in the script that we return to many times is a seaside cottage at the base of a cliff, directly fronting the shore line. Locations were unable to find a site that hit all the required points, so visual effects were tapped to make this work. The cottage and immediate gardens were built on the site of a small quarry, which gave us the immediate base of the cliff and surrounds. A secondary location along the site of a dramatic coastline in Newhaven [England] was the basis for the extension. This beachside cliff was LiDAR scanned, recreated through DMP/CG, then combined with plate photography to combine the two locations together.” Deaths are plentiful throughout the story, but the focus is on the aftermath rather than the actual act of violence. “There is an implied causal link between the baby and a death,” states Braekke-Carroll. “But he’s not necessarily physically holding the knife.”
“The sheer nature of the amount of time that we were going to have a baby onscreen and on set meant that a lot of things we had planned for would sometimes go flawlessly without any help from us,” notes Braekke-Carroll. “On another occasion, the entire day might need to be completely changed and require our input for all sorts of reasons.” Identical twins were cast in the title role. “We definitely knew that we needed a digital asset,” remarks Akande. “By casting twins, we were able to double our shooting hours. The babies absorbed the nature of the set quickly, and we saw them grow up over the course of six months of shooting with them.” Digital doubles were avoided as much as possible. “That left us with a strange, hybridized methodology over time, whether it be face replacements from plates with a CG body, a stand-in prosthetic baby with a head replacement being pushed around in a pram, or one digital arm, plate head and a prosthetic body,” states Braekke-Carroll. “There was also an army of stand-in babies. When it comes to performance with our hero twins, that became a hybridized process where we used a combination of digital passes, keying tools, reprojections and face tracking from source plates, then also leaning on machine learning and additional 2D layering to change the performance.”
“It’s always interesting working with different directors across a series, and in this case they did all have different approaches to handling the visual effects. Nicole Kassell had a lot of experience in visual effects and had a hands-on approach from storyboards, concept, previs through to execution. Others brought their comedy experience to help drive the storytelling beats, and there was also some experimentation using different shooting techniques and machinery on set. All of this brought an interesting mix that fused with the tone of the show, creating a unique place for The Baby in the comedy/horror genre.”
—Anne Akande, VFX Producer
The eyes were difficult to get right. “The animation of the performance of the baby isn’t quite straightforward,” remarks Braekke-Carroll. “The eyes are quite loose and gaze differently. We took parts of plates for the area around the eyes for the micro-performance and combined that with CG or machine learning layers.” A wealth of material was gathered from reference photography. “We could be working on something in Episode 104 and there’s a performance that nails it in Episode 102,” states Akande. “Everybody on set was invested in getting us the material. It could be the first two seconds before the take, and that was needed for the face replacements. We also learned which baby was good at being still and which was restless. The one thing that we tried to educate people on is that the babies are a member of the cast. If you replace a cast member with a stand-in for 30 shots, that becomes visual effects. Every time a baby was in a shot, the first port of call was our hero baby. The real performance will always be better than the alternative. CG was the last resort, and that was what we let the showrunners and executive producers know from the beginning.”
Face-generation camera setups were orchestrated that proved useful both as animation reference and for machine learning. “Anytime we were using a digital baby performance, we would also be running a machine learning output as well,” explains Braekke-Carroll. “Rather than treating that as a facial replacement solution, we had it as an additional layer setup that could be incorporated partially or fully in with the other renders for the other parts. A lot of the shots that you will see won’t necessarily be a machine learning output, but there will be parts of the lips, eyes or cheek that will give it an extra degree of photographic verisimilitude that you get from that output.” One of the most difficult visual effects tasks was to have a baby falling asleep or sleeping. “We had to find a bunch of solutions and ended up shooting a lot of high-frame-rate plates of the baby and played them back at normal speed,” adds Braekke-Carroll. “We looked for a nice section where it felt like they were sleeping. A machine learning dataset was built just of the baby’s eyes. The high-frame-rate photography gave it a gentle effect, rather than trying to animate too much micro eye movement.”
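The retiming trick Braekke-Carroll describes — shooting at a high frame rate and playing the plates back at the project's normal rate — comes down to simple arithmetic. A minimal sketch of that relationship (the frame rates below are illustrative, not from the production):

```python
# Hypothetical illustration of high-frame-rate retiming: footage captured
# at a high frame rate but played back at the normal project rate appears
# slowed by the ratio of the two rates.

def slowdown_factor(capture_fps: float, playback_fps: float = 24.0) -> float:
    """How many times slower the action appears on playback."""
    return capture_fps / playback_fps

def playback_duration(capture_seconds: float, capture_fps: float,
                      playback_fps: float = 24.0) -> float:
    """Screen time (in seconds) a captured take occupies after retiming."""
    return capture_seconds * slowdown_factor(capture_fps, playback_fps)

# e.g. a plate shot at 96 fps and played at 24 fps runs 4x slower,
# so 2 seconds of captured eye movement fills 8 seconds of screen time.
```

That gentle stretching is why tiny, restless eye movements read as the slow micro-motion of sleep without any animation.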
Point-of-view shots take place within the birth canal and womb. “We had a free remit to take B camera, get all of the jars of Vaseline, PVC tubing, probe lens, lights, blood, sputum and pus,” reveals Braekke-Carroll. “We took all of the bits and pieces and gelled them up. We got some nice closeup photography inspired by the scenes in The Tree of Life. I was pushing against building a CG interior because, tonally, I didn’t think that it fit the episode. From that photography, visual effects added a layer of fine particulate and endoscopic lensing. There is also the diffusion of the water and cloudiness. But the actual content of the walls was practical photography.”
“The one thing that we tried to educate people on is that the babies are a member of the cast. If you replace a cast member with a stand-in for 30 shots, that becomes visual effects. Every time a baby was in a shot, the first port of call was our hero baby. The real performance will always be better than the alternative. CG was the last resort, and that was what we let the showrunners and executive producers know from the beginning.”
—Owen Braekke-Carroll, Visual Effects Supervisor
The unpredictability of the on-set babies posed the biggest challenge across the series. “To compensate for this on set, we ended up treating almost every frame with the baby cast in it as a potential visual effects shot,” explains Akande. “This included taking large volumes of data and notation for most scenes and essentially treating them as a CG creature in the scene.” The best moment was literally saved for last. “We’re looking forward to the final sequence underwater,” states Braekke-Carroll. “It’s a beautiful and unexpected scene that wraps up the story and bookends the series nicely.”
By TREVOR HOGG
With The Mandalorian taking a breather after winning two consecutive Emmy Awards for Outstanding Special Visual Effects in a Season or a Movie, it will be up to The Book of Boba Fett to continue the winning streak as the iconic bounty-hunter-turned-crime-lord series has been described as The Mandalorian 2.5. Whether Boba Fett receives a nomination, or more, will be revealed when the Primetime Emmy Awards take center stage on September 12, 2022 at the Microsoft Theater in Los Angeles. The other category is Outstanding Special Visual Effects in a Single Episode, which last year was awarded to Star Trek: Discovery, another contender from a storied science fiction franchise that will be trying to repeat the feat.
An interesting bellwether is the 2022 Visual Effects Society Award nominations that place Loki and Foundation at the forefront, with both being singled out for their stunning environmental work on Lamentis and Trantor. “We were asked to create meteor effects from scratch,” states Digital Domain Visual Effects Supervisor Jean-Luc Dinsdale when discussing Lamentis and its moon Lamentis-1. “We went through multiple versions of providing the meteors, the impacts, and the dust and debris that flies around them. That was then tweaked and populated throughout the episode because the meteors are a constant threat, but not always the focus of the sequence.”
Trantor is literally 50 different cities stacked on top of each other. “Every level was built hundreds of years before the next one, so there was a lot of concepting and architectural research that went into how Trantor and its multilevel structure was designed,” explains DNEG Visual Effects Supervisor Chris Keller. “We created all of these interstitial elements between buildings, like bridges, platforms, megastructures spanning 1,000 meters, through the sky ceiling of a certain level into the next level. Then you’ll see hyperloop trains and, if you look carefully, flying vehicles. All of that had a certain logic.”
When it comes to photorealistic CG characters, Leshy-infected Eskel and Nivellen from The Witcher and Ampersand from Y: The Last Man were also nominated for VES Awards. “There has been real growth on the monster side,” explains Andrew Laws, Production Designer for The Witcher. “We work in ZBrush from the ground up to understand the movement and how the creature is going to take shape in all dimensions. It’s a much more fluid process. Once we have established a ZBrush model that has an organic shape, we’ll do some overpainting to get the mood of the creature. When it is agreed upon how that is going to work, then the 3D model goes out to visual effects and the vendors to bring in the detail and movement.”
Originally, Ampersand was going to be real but was changed to CG because Disney has a ‘no primate’ rule. “Stephen Pugh and Jesse Kawzenuk, our amazing visual effects supervisors, made it so easy for me,” recalls cinematographer Catherine Lutes. “I was constantly laughing at the puppet Amp that we had. It helped with the way that the light was falling, and that’s a good reference as well for visual effects. Stephen said that camera shouldn’t do things that a monkey wouldn’t do. If the camera is a little bit stilted or doesn’t move smoothly, that’s great because that’s what would happen if you were trying to follow an actual monkey running or moving.”
The Wheel of Time features a wide gamut of visual effects from creatures, magic and world-building done in a grounded fashion. “One thing that was important for me from the beginning was that this world feel authentic and real,” explains The Wheel of Time creator, executive producer and showrunner Rafe Judkins, “even for the actors and crew, trying to go to places, as much as we can put stuff in-camera, even if we end up augmenting or enhancing it later with visual effects.”
The fact that the sixth season is the grand finale for The Expanse may see Emmy voters finally honor the body of work with a nomination. “The most challenging thing is wrapping your head around things that may not sound that difficult initially, like deorbiting maneuvers where you slow going forward to be able to drop,” notes Bret Culp, Senior Visual Effects Supervisor of The Expanse. “We’ve done a good job and, as a result, it has been made clear to us that we are favorites with a lot of people at NASA and have an open invitation to visit the JPL [Jet Propulsion Laboratory].”
The usual suspects include Lost in Space, which has been rightly lauded for being able to turn practical locations into alien worlds and making biomechanical robotic beings that are empathetic and menacing. “The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship,” remarks Lost in Space Visual Effects Supervisor Jabbar Raisani. “The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”
“The most challenging visual effects sequence in the finale of Lost in Space was creating the horde of killer alien robots and sprawling wreckage of their crashed ship. The entire episode had to be filmed on stage, and we decided to shoot against black. As both the director of the episode and the VFX Supervisor, I relied heavily on shot planning with our in-house previs team which maintained close collaboration with the production designer to maximize our efforts and bring the series to its epic conclusion.”
—Jabbar Raisani, Visual Effects Supervisor, Lost in Space
Returning for sophomore seasons are Star Trek: Picard and Raised by Wolves, with the former mining the fan adoration for the Starfleet officer portrayed by Patrick Stewart and the latter infusing Alien mythology into the android survival tale produced by legendary filmmaker Ridley Scott. “The hardest sequence to design, create and execute for Raised by Wolves was the outer-space sequence between Mother and the Necro serpent in Episode 208,” reveals Raised by Wolves Visual Effects Supervisor Raymond McIntyre Jr. “The flying Necro serpent is lured away from killing Campion by Mother, who leads the serpent into outer space in order to attempt to kill it. This scene was added deep into postproduction, and visual effects was tasked with designing an entire sequence from scratch as no live-action footage existed. Visual effects designs included the flying serpent, lighting design in outer space, nebulas, the planet Kepler 22B seen from this viewpoint, Mother’s new kill scream and a visualization of the failure of the EMF dome protecting this area of the planet. Execution involved creating realistic camera motion for each shot, and beauty lighting with sun flares, allowing for dirt on the lens to show up during flares, all while rendering fully CG shots.”
Making their debuts are Obi-Wan Kenobi, which has Ewan McGregor reprising his role as the legendary Jedi Master from the Star Wars prequel trilogy, and Star Trek: Strange New Worlds, an exploration of life on the USS Enterprise under the command of Captain Christopher Pike; both of them serve as prologues to the original movie and television series and have the best chances to get nominations for their respective franchises, especially if a proper balance is struck between nostalgia and canon expansion.
Then there is a matter of art imitating life that will resonate with some while being too close to the bone for others, where the viral mayhem portrayed is even more devastating and required extensive invisible effects to paint out modern-day life. In Sweet Tooth, a pandemic causes hybrid babies that are part human and animal, with the adolescent protagonist being half deer, while Station Eleven focuses on humanity trying to rebuild society after a virus has decimated the population, and See envisions a future where blindness has reached epidemic proportions.
Serving as dark social commentary on the growing financial divide is Squid Game, which combines elements of The Most Dangerous Game, childhood games and Dickensian debt into a ratings sensation for Netflix, and is a strong contender to upset the voting establishment. “The game spaces in Squid Game were unique and something we had never experienced before,” states Cheong Jai-hoon, Visual Effects Supervisor of Squid Game. “What we wanted to achieve from the settings of Squid Game was a fabricated yet realistic look, and it was quite challenging to balance the two conflicting characteristics. Especially in Episode 107, characters play the game of Glass Stepping Stones from high above the ground, and we had to create an environment that would immerse the viewers in the fear and tension. We put the most effort into deciding the depth from the stepping stones to the ground and the overall scale of the whole setting. We could have easily exaggerated, but we strived to find the right balance between what seemed fake and realistic, as it was more difficult than we thought.”
Also present is the author outdone only by the Bard himself when it comes to the number of film and television adaptations of his works. Lisey’s Story was conceived by prolific horror maestro Stephen King, who has supernatural unrest intersecting with personal trauma. Comic book adaptations are not in short supply. A superhero who has a sharp wit and archery skills is paired with a like-minded protégé in Hawkeye, which channels Shane Black’s penchant for Christmas, action sequences and odd-ball comedic pairings. For those wanting an irreverent take on the genre, James Gunn helms the small screen adaptation of Peacemaker, where an extremist murderer embarks on a quest for peace. Moon Knight introduces the Marvel Studios equivalent of Batman, but with an Egyptian god reincarnation twist that raises questions about the sanity of the main character.
Superman & Lois reimagines The Daily Planet colleagues as a married couple trying to balance domestic life and a rogues’ gallery of high-flying adversaries. “If Superman is fighting someone in the air where they would both be horizontal, it was much more time efficient and easier on the actors if they can be vertical,” states cinematographer Stephen Maier, who added a physical camera shake for the sake of realism. “The stunt team will often go away to design or rehearse something, do their previs that they film on their iPhones, cut it together and show it to us. We have a close collaboration with special effects in regards to atmospheric smoke and haze. The gags that they come up with help to exemplify the strength of Superman, such as him lifting a car.”
The growing demand for content and the acceptance of visual effects as a primary storytelling tool reflect how far the production quality of television and streaming shows has come in expanding the scope for creatives with a theatrical sensibility. It is because of this that the Primetime Emmy Awards have become as fascinating to watch as the Academy Awards, as both showcase the very best of what can be achieved when talented digital artists get to contribute to the storytelling. Undoubtedly, the eventual winner will encapsulate the highest level of creative and technical ingenuity achievable under current circumstances and will serve as a building block for what is to follow.
By TREVOR HOGG
When it comes to witnessing what is achievable with visual effects, no longer does one have to go to a theater, as high-end episodic has essentially become a long-form cinematic experience that can be enjoyed by turning on a television or mobile device. This is not going to change with streamers spending billions of dollars to create content to stand apart from their like-minded competitors. The result is an impressive array of shows that are not lacking in storytelling ambition, whether it be The Wheel of Time, The Witcher, Foundation or The Book of Boba Fett. Virtual production has become synonymous with The Mandalorian, but this innovative methodology is only one aspect of the visual effects landscape, which continues to evolve technologically. What does the future look like for the visual effects industry and episodic productions in the pandemic and post-pandemic era? This is a question that we try to answer by consulting the players responsible for producing the wealth of content that is available for viewers to watch.
Robin Hackl, Visual Effects Supervisor & Co-founder, Image Engine
“The requirements of television work are identical to feature film work in many ways. But back then it was much less resolution involved with the final output. Interestingly, we became known as a television visual effects house, and that precluded us from actually doing feature film work. It came with a stigma back in those days and was a large barrier that we had to break through. District 9 was a tipping point of recognition of us being able to execute on large-scale work.
“Shawn Walsh [General Manager and Executive Producer, Image Engine] has done a good job of holding the line. Placing the value on what we deliver to the client and making them understand what that value is and why it is of value. The shortened timelines have been the long-term progression ever since I can remember. Coupled on top of that are the demands. Now the expectations are far greater than what they were. Where is that breaking point? It is up to us to hold the line as best as we can and inform our clients what our capabilities and capacities are in order to avoid that.”
“Now the expectations are far greater than what they were. Where is that breaking point? It is up to us to hold the line as best as we can and inform our clients what our capabilities and capacities are in order to avoid that.”
—Robin Hackl, Visual Effects Supervisor & Co-founder, Image Engine
Drew Jones, Chief Business Development Officer, Cinesite
“You’re fine-tuning the teams of people attached to particular projects, ensuring that you have the right personalities dealing with the right style of work so that you can shortcut the processes and still deliver the quality threshold that it needs to be. You haven’t got the luxury of time to develop an idea across many months for the most part.
“Vendors having concept artists and art departments in-house are definitely a use for a quick, more cost-effective process to get closer to an answer within the visual effects post-production environment. We will often use conceptual artists to build imagery quickly to present an option to a production rather than go through a long gestation period of a CG build and compositing to get an idea across.
“There is more exploration into ideas through streamers. The projects, scripts and series are often filled with quite fantastical ideas that may have never seen the light of day on the big screen. The content I don’t think has changed. I don’t feel like we’re doing anything outrageously different. All visual effects have a complexity component to them, and at the end of the day it comes down to how far the directors want to push their thoughts and ideas.”
“The area that’s getting the most attention at the moment … is facial replacement work, with articles and papers going in-depth about how AI and computational analysis are making those kinds of computer-generated content far more photographic than before. It’s definitely an area that could lead to some very different approaches as to how visual effects are fundamentally implemented.”
—Paul Riddle, Executive VFX Supervisor, DNEG
Janet Muswell Hamilton, Senior Vice President, Visual Effects, HBO
“Right now, you have a lot of executives and post executives who have been doing a good job of producing the visual effects, who needed additional help because their slates were busy. The industry has just exploded. Being able to take work off of their plates to help them find heads of departments, facilities, and getting their heads around budgets – that was the first thing I did. But what I needed was processes in place in order to make it easier for me so I wasn’t working in 10 different ways, because HBO has been a bespoke studio. I am a fan of tools and processes that help us with the creative process.
“House of the Dragon is utilizing the LED screens at Warner Bros. Studios Leavesden for a whole bunch of sequences. It’s 360 [degrees] and has a retractable roof. The ability to shoot magic hour for about a week is incredible. Yes, you need to do it upfront. Yes, you need a director who is willing to go that way. Ultimately, when you start to see the results, how beautiful things look and the stories you can tell that you couldn’t tell before because you couldn’t go there or afford it – it’s going to revolutionize how we do things. It’s a technology that is here to stay. It was an unexpected benefit of the lockdown. My biggest desire is to never ever shoot another greenscreen driving shot!”
Alex Hope, Co-CEO, beloFX
“The biggest barrier to growth for many visual effects companies for many years has been finding talented artists. There is a finite global talent pool, but it is one we are all working hard to build. We’re all making huge efforts to train and develop visual effects talent at every stage, whether that’s in college, entry level into the industry or once people have gotten into the industry. Many visual effects companies are getting behind career development for artists. In the U.K. we are helped by organizations like ScreenSkills, who standardize training at various levels and ensure that the industry is working to support the education sector to bring new talent into the industry.
“Visual effects is perhaps the fastest-growing component of the film and television industry. It’s fantastic that we’ve seen an explosion in content creation of all types, and we’ve seen a consequent growth in demand for visual effects, so certainly the money spent on content creation is coming through to all parts of the industry, including visual effects. As we see more localized production for streamers, it’s going to be really interesting to see what opportunities this provides for partnerships between local visual effects companies and those companies in more established centers, like the U.K. and Canada, and that’s very exciting and interesting to us at beloFX.”
Lucy Ainsworth-Taylor, CEO & Co-Founder, BlueBolt
“With the global demand and need to get shows finished, work is being spread everywhere, often disregarding the rebate. We still cannot compete with the Indian prices, but the flip side is that the talent coming out of India now adds to the international remote marketplace. Netflix purchasing Scanline VFX is not a game-changer at all. Studios have purchased facilities before, and as long as they can keep feeding the work into the facility, it will work. With the content Netflix is making at the moment, it makes sense, but I would assume they should probably buy many more facilities for the amount of work they require! Scanline is a well-respected visual effects house; does this mean they will now only work on Netflix shows?”
John Fragomeni, Global President, Digital Domain
“Working on award-winning projects like WandaVision, Lost in Space, Carnival Row and Loki was basically like making six to eight mini-films. We use the same tools on episodics that we use on features, and often the same team of artists. That has helped to accelerate our development on some of the tools we use, giving us the ability to handle the volume of work while still delivering quality.
“Some builds tend to lend themselves to features. For instance, the Free City game world we made for Free Guy or the 2.5 miles of New York City that we recreated for Spider-Man: No Way Home. But that doesn’t mean that you couldn’t do that for an episodic, given enough time and budget.
“One thing we are seeing more and more of on the episodic side is that the productions are coming to us with a detailed vision of what they want for the entire season. This helps us forecast schedules more finitely and identify breaking points when it comes to tight deadlines. From that, we can determine with the production where we can best serve the visuals, then coordinate with any other visual effects vendors the production may bring in. One of the more interesting by-products of the rise of elevated quality effects in episodics is that studios that used to compete for the same projects are now partners. As the demands for effects grow, we’ll probably see more groups involved.”
Michelle Martin, Chief Business Development Officer, Milk VFX
“The speed with which streaming content has grown globally has given VFX houses the opportunity to raise the bar in VFX, to create high-end content for tentpole series and feature-length projects, in turn giving a wider range of artists the opportunity to work on interesting projects. Standards have certainly been raised.
“The networks and studios are engaging us earlier and are keen to discuss capacity with us, as well as share more information regarding their up-and-coming slates. There’s a keenness to share information and artists are being block-booked ahead of productions starting, which is where we should be to help develop and visualize the storytelling. We are seeing a very different landscape to where we were 10 years ago.”
Christopher Gray, Global Managing Director, Episodic, MPC
“In the short term, we’re already seeing the wider application of these techniques [virtual production, real-time, machine learning]. Every show we are working on in episodic employs at least one of these toolsets in some capacity, but I think the greatest opportunity, as silicon begins to catch up, is the ability to iterate more quickly, particularly in animation. In the next five years, we’ll see more widespread adoption of real-time and near real-time GPU rendering for final pixel. The technology is close, but it’s the development of existing workflows and the continued widening of the knowledge base that needs to expand to capitalize on this moment.
“We’re seeing the rise of great new prospects for counter programming, film and episodic projects that would struggle to find an audience five years ago, and we’re doing so more and more now thanks to strategic work in this space by Amazon and Apple leading the charge, and Netflix particularly so, with its commitment to international and local-language film and series and limited theatrical releasing. The exciting aspect is that as studio operations become more integrated and these two mediums converge, film production can benefit greatly from these efficiency gains, and episodic production can benefit from a knowledge base carved at the highest level.”
David Conley, Executive VFX Producer, Weta FX
“The bidding process has changed due to sheer demand, and streamers have a completely different greenlight process than the traditional studio system. Where once we had the luxury of bidding over several weeks against a schedule that was fairly developed, and you could bid down to the crew weeks, we are now being asked to turn around more bids in less time, days even. To drive confidence in our bidding system, we’re relying on a more robust set of analytics to help drive the bidding process. That said, this means we really rely on the perception and skills of our bidding team because no project is similar, and analytics and performance metrics can never replicate the creative process. We rely on our bidding team to have great creative skills when reading and breaking down a script before applying metrics based on analytics.
“I would say there’s greater narrative risks being taken in episodic, but more technical and creative risks in film. That said, the bar is raised across the board for both episodic and theatrical. There’s more being spent in episodic [than previously] but at a lower price point per shot, with the expectation that results are feature-level quality. To date, features are still where, as an industry, we are being asked to produce groundbreaking visual effects. However, streaming services mean that mid-range projects now have direct access to wide audiences, so our challenge becomes leveraging our emerging technologies from the feature side to help produce high-end-caliber visual effects across multiple episodes, within the schedule demands of episodic, at a viable price point that works for our industry.”
“There is more exploration into ideas through streamers. The projects, scripts and series are often filled with quite fantastical ideas that may have never seen the light of day on the big screen.”
—Drew Jones, Chief Business Development Officer, Cinesite
Måns Björklund, Executive Producer, Important Looking Pirates
“Virtual production, real-time and machine learning are becoming more common, or even standards nowadays. With moving more work into prep instead of post, visual effects becomes even more involved before anything has been shot. For sure, there is more work around than ever. However, finding artists is harder than ever, and costs have also risen. We spend a lot of time finding artists and developing them in-house. Instead of just having a few clients doing high-end work, nowadays the demand is almost doing 10 mid-to-high-end features per season of episodic work. Due to shorter turnarounds, there is a more ‘going direct to the goal instead of trying all possible versions.’ The room for experimentation depends mostly on when you get involved with a project and how long a schedule you have. I feel there is more room to rebid and not have ‘fixed’ bids. It’s a constant discussion and collaboration with clients to get the best results within the budget and time. Important Looking Pirates don’t have facilities around the world. We are trying to do our thing and focus on the quality of our work.”
Paul Riddle, Executive VFX Supervisor, DNEG
“One of the main considerations in approaching our episodic work at the moment is the diversity within that work, whether that’s the creative requirements, the timescales involved or the financial requirements of the production. There’s so much scope for projects of varying sizes and complexities within episodic that there’s a real need for a degree of specialist skills and creative approaches in our artists across the globe.
“There has been an interest lately in ‘deep fake’ AI and machine learning, and how those things will eventually come to be utilized within our industry. There’s definitely an interest there and a buzz around it, and we’ve seen clients wanting to understand how it can be utilized sensibly without doing it just for its own sake, using the technology for a real creative impact.
“[T]he ‘feast-or-famine’ nature of the visual effects industry has somewhat dissipated to allow for visual effects facilities to have a more stable financial footing and thus provide more stability for their employees.”
—Stefan Drury, Executive Producer, ILM TV
“The area that’s getting the most attention at the moment, it seems, is facial replacement work, with articles and papers going in-depth about how AI and computational analysis are making those kinds of computer-generated content far more photographic than before. It’s definitely an area that could lead to some very different approaches as to how visual effects are fundamentally implemented.”
Stefan Drury, Executive Producer, ILM TV
“It’s been the busiest I’ve seen the industry in the 24 years I’ve been in it! More importantly, it’s also consistent, with a steady stream of projects in development and post at almost all times. It certainly feels like we’ve been able to better forward-plan for our crew, and the ‘feast-or-famine’ nature of the visual effects industry has somewhat dissipated to allow for visual effects facilities to have a more stable financial footing and thus provide more stability for their employees.
“There is still room for experimentation, especially through close collaboration with writers/director/showrunners in the episodic format. We’ve been involved with several episodic shows in which we’ve been part of the development, pitch and greenlight process, which have involved working closely with the clients to find creative methodologies to make projects possible. That said, this experimentation generally has to happen early in the production, agreed upon by all and adhered to, as the waterfall nature of episodic delivery and the sheer volume of material to be reviewed means post schedules leave little room for misdirection.”
Meredith Meyer-Nichols, Head of Production, Rising Sun Pictures
“In 2017, 100% of RSP’s work was on theatrical releases, and in 2021 we were 50/50 streaming to theatrical. As we move into 2022, this trend continues. Streaming projects are predominantly series in nature, at about 35%. From our perspective, on the projects that we’re working on, they have large-scale budgets and demand the same kind of quality that RSP is known for. They’re essentially gigantic movies with thousands of visual effects shots and hours of content. RSP, in partnership with the University of South Australia, is delivering accredited courses in visual effects. In classrooms set up to mirror real-world production environments and with instructors who are working professionals, RSP rigorously trains students in the technologies and techniques they’ll need to succeed in an expanding global film industry. We have done a remarkable job of turning out job-ready graduates and have found that our graduates are in high demand with most major visual effects studios.”