By TREVOR HOGG
Images courtesy of Netflix.
Brought together by Netflix, visual effects supervisors Jabbar Raisani and Marion Spates are collaborating on their third series together, beginning with Lost in Space and followed by Stranger Things and Avatar: The Last Airbender. Their current partnership is an epic adaptation of the animated series that was previously attempted by M. Night Shyamalan in a much-maligned feature film that fans and critics would rather burn or bury.
As for what the live-action adaptation could achieve that was not possible for the source material, Spates comes up with one word: “Perspective. The cartoon is 2D and hand-drawn perspectives, while this is all done in 3D with proper depth and scale. You are immersed in the show.” Raisani, who also took on the roles of director and executive producer, is in agreement. “Across the board, we tried to represent, if this was real, what would this really look like? Hopefully, that allows for a suspension of disbelief. When you’re watching the animated [series], you know that it’s not real, but we tried to make it feel like a real living, breathing world.”
Spanning eight episodes, the Netflix adaptation concludes where the first season of the animated series ended, with the Fire Nation laying siege to the home of the Northern Water Tribe as part of its plan to gain global supremacy over the other three nations, which have the ability to control water, wind and earth. The main threat is the Avatar, a reincarnated spirit who can master all four elements, who has returned after a freak natural accident imprisoned him in a frozen tomb for a century. “We always knew the scale, scope, quantity and quality were going to be difficult because those things never align with the time,” Raisani states. “Specific challenges were Appa and Momo. We knew that we had to nail those characters because of how beloved they are, and it took a long time to get them just right. On the Netflix side, working with Ted Biaselli, he was a great resource to look to, talk through the characters and look at the animated series, and he helped us all shape that feeling to get the emotion right for those characters. Once we nailed that, the execution was much faster than we might have anticipated. Bending was hard, too. It was hard to figure out how to do it right. There was a lot of trial and error. It took us a while to understand exactly what components went into each form of bending.”
“Jared Higgins, who is a Visual Effects Supervisor, came up with the ‘barbershop pole’ as a way to always move the water. We used it to a degree on the other forms as well, but the water is where you see it the most because what we didn’t want was a ball or orb of water. It always feels like it has a current. That you can see throughout the series.”
—Jabbar Raisani, Executive Producer/Director
Around 3,400 visual effects shots were created for the eight episodes over a period of 18 months by Framestore, Scanline VFX, Important Looking Pirates, Accenture Song VFX, Pixomondo, Image Engine, Rodeo FX, Untold Studios, Outpost VFX, BigHugFX, Cadence Effects, The Resistance VFX, Atomic Pictures, NEXODUS, FABLEfx and DNEG VP. “We looked at the animated series and wanted to mimic everything that we could; however, what do we need to do to ground it in today’s reality?” Spates notes. “We looked at flamethrowers for firebending. For water, we didn’t find a whole lot of reference. We found some water that was in space and a lot of slow-motion buckets of water. You can’t find a water whip online anywhere, but if you do, let us know because we’ll use it for references!” Simulations had to be art directed. “You’re trying to art direct something that is a random event and attempting to use forces to get exactly what you’re after,” Raisani notes. “We understand that you cannot art direct every single drop or drip or element.” A major part of the waterbending recipe was the concept of an underlying force that resembles a twirling barbershop pole. “Jared Higgins, who is a Visual Effects Supervisor, came up with the ‘barbershop pole’ as a way to always move the water,” Raisani explains. “We used it to a degree on the other forms as well, but the water is where you see it the most because what we didn’t want was a ball or orb of water. It always feels like it has a current. That you can see throughout the series.”
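The article does not describe the software behind the “barbershop pole” idea, but the concept of water that “always feels like it has a current” can be pictured as a helical velocity field: particles swirl around an axis while advancing along it, so the fluid never reads as a static orb. The Python sketch below is purely illustrative (the function name and parameters are hypothetical, not the production setup):

```python
import numpy as np

def barbershop_velocity(points, axis_origin, axis_dir, twist_rate=2.0, flow_speed=1.0):
    """Illustrative helical 'barbershop pole' velocity field.

    Each particle gets a tangential (swirl) component around the axis plus a
    constant drift along it, so the bulk of water keeps visibly circulating.

    points: (N, 3) particle positions
    axis_origin, axis_dir: the 'pole' axis; axis_dir is normalized internally
    """
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    rel = points - axis_origin
    # Split each offset into along-axis and perpendicular (radial) parts.
    along = rel @ axis_dir
    radial = rel - np.outer(along, axis_dir)
    # Swirl direction is perpendicular to both the axis and the radial offset.
    tangent = np.cross(axis_dir, radial)
    return twist_rate * tangent + flow_speed * axis_dir
```

Feeding a field like this into a particle or fluid solver as an external force would keep the mass of water turning and translating at once, which is the visual quality Raisani describes.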
Three departments that had to work closely were stunt, visual effects and special effects, all of which had multiple supervisors to manage the workload. “We all start from the moment we are on set,” Spates notes. “We didn’t have an opportunity to come on at the beginning of production, but we were definitely there for the reshoots. The way that Jabbar and I approach things is you get in and work with all of the different departments early on in the process so that everyone is on the same page about what we’re trying to pull off. They can’t do it without us, and we can’t do it without them.” Easing the communication was an established shorthand. “Fortunately, with people like Jeff Aro, who was one of the stunt coordinators, we worked with him on Lost in Space for years, so there’s an existing relationship that helps to streamline the process,” Raisani remarks. “Nevin Swain was also the prop master on Lost in Space. I know a lot of the crew up in Vancouver from many years of working there.” The special effects team led by Chris Flemington and Mark Gibbard provided some clever solutions. “We had this cool contraption where they could blow wind from a hose,” Spates reveals. “For Aang’s landing, they could just blow the dirt out of the way, so we get that effect for free, which was great.” Earthbending was mainly digital with the exception of some practical debris. “For fire, we had the firelight on their hands to simulate the interactive lighting, and it had to be operated by the board operator, so when someone is bending and doing the firebending, all of that has to be timed out and matched exactly,” Spates remarks. “We definitely had some things to figure out for Season 2 because we needed to tighten some areas as far as the interactive lighting and some things that we did on set.”
“[For the location of the Southern Air Temple] I spent a lot of time in China on another project, but I knew where these amazing mountains [Zhangjiajie National Forest Park] are, which were used in the film Avatar. That’s exactly what we wanted because the scale of those mountains is unbelievable. There is a lot of imagery that we can steal from the Internet, which obviously is what helped us to make such awesome Avatar mountains, because that’s what they have been labeled over the course of the multiple shows that have been there. It was awesome.”
—Marion Spates, Visual Effects Supervisor
Classic scenes were recreated, like when Aang is showing off his airbending skills to a group of children and accidentally crashes into a statue. “It was moments like that where you’re trying to emulate the animated series as well as you can,” Raisani observes. “For that particular shot, Gordon Cormier was on a practical driving rig that drove him around, then we do a swap to a full CG version of Aang. Once he comes to camera, he is digital and full CG, crashes and falls to the ground. We definitely used all of the magic of the practical stuff on set as well as full CG stuff, and pulling from this beloved animated series. Moments like that are fun.”
Interestingly, a cinematic franchise that shares the same name but isn’t related provided the location of the Southern Air Temple. “I spent a lot of time in China on another project, but I knew where these amazing mountains [Zhangjiajie National Forest Park] are, which were used in the film Avatar,” Spates recalls. “That’s exactly what we wanted because the scale of those mountains is unbelievable. There is a lot of imagery that we can steal from the Internet, which obviously is what helped us to make such awesome Avatar mountains, because that’s what they have been labeled over the course of the multiple shows that have been there. It was awesome.”
“[W]hat do we need to do to ground it in today’s reality? We looked at flamethrowers for firebending. For water, we didn’t find a whole lot of reference. We found some water that was in space and a lot of slow-motion buckets of water. You can’t find a water whip online anywhere, but if you do, let us know because we’ll use it for references!”
—Marion Spates, Visual Effects Supervisor
Acting alongside the live-action cast were CG characters, with two of the hardest being the air bison Appa and the flying lemur Momo. “There was a huge structure that was covered in fur for Appa that the actors are climbing or riding on top of,” Raisani explains. “There was a lot more of a physical representation of Appa on set and less so with Momo. There is a great scene in Episode 105, which Roseanne Liang directed, where Momo finds a little acorn. This acorn represents the fact that this forest, which has been burned down, is going to be rebuilt. That’s a scene where we worked hard to ensure that Momo brought emotion to his performance and a connection, not only to Katara but to Aang, who is in a position of emotional strife, and to make it feel like Momo makes a character choice to give this acorn that he wants to eat to Aang, because Aang is struggling and he wants to do something for his friend. It’s moments like that where we worked hard to ensure that they are giving a performance as opposed to being cool or cute-looking digital characters.”
A fun creature was the ostrich horse ridden by the Earth Kingdom. “I love the ostrich horses,” Spates remarks. “Accenture Song VFX worked on that creature and did such a good job of bringing the movement of the ostrich into the ostrich horse. It’s unbelievable how [they captured] just the little nuance motions of how they move around and walk. Also, there was the challenge of how do we make it into a horse and how do we make fur blend into the tail of a horse? Also, they had all of the armor, too. That becomes a big challenge because all of that stuff has to be simulated because there’s movement in the armor.”
“Another fun sequence is the Aang-Bumi fight, which is part of the block I directed in Episode 104. I leaned heavily on the animated series and tried to do everything I could to represent that animated series in living, breathing form. For every beat that I could, I would grab the animated series and say, ‘We’re going to do this shot and that shot.’ The crew had a fun time shooting it, and we definitely had a great time in post putting that onscreen.”
—Jabbar Raisani, Executive Producer/Director
On a different plane of existence is the Spirit World, which can be accessed via the Avatar State. “I won’t get into how the original footage was shot because we weren’t there,” Spates states. “That was one area we could get creative and stylized. Normally, Jabbar and I stay away from stylized stuff because we always try to keep it grounded.” It was a tricky balancing act achieving the proper visual aesthetic. “We were trying to come up with something that felt heightened but also photographic, so we were leaning on a lot of photographic elements like chromatic aberrations, treating it as if it was something that was happening with the lens but was also happening with Aang,” Raisani remarks. “Especially in that first scene where he hadn’t been in the Spirit World, and we were trying to make it feel almost out of focus. It’s overwhelming, and he doesn’t know how to process it. We were trying to get that visually into the footage, but also emotionally connected with what Aang is feeling. In terms of the color, Marion worked with our in-house vendor to figure out how we take this forest in Vancouver and make it feel heightened, but don’t break to where it does not feel like a real place at all.”
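Lateral chromatic aberration of the kind Raisani describes, treating the effect “as if it was something that was happening with the lens,” amounts to resampling the red and blue channels at slightly different magnifications about the image center, so color fringes grow toward the frame edges. A toy Python illustration (not the show’s actual compositing pipeline; the function name and parameters are invented for this sketch):

```python
import numpy as np

def chromatic_aberration(image, shift=1.5):
    """Toy lateral chromatic aberration.

    Scales the red and blue channels in opposite directions about the image
    center (nearest-neighbor resampling), so fringing grows toward the edges
    while the frame center stays untouched.

    image: float array of shape (H, W, 3)
    shift: approximate fringe offset in pixels at the border
    """
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = image.copy()
    for channel, direction in ((0, 1.0), (2, -1.0)):  # push red out, pull blue in
        scale = 1.0 + direction * shift * 2.0 / max(h, w)
        sy = np.clip(cy + (ys - cy) / scale, 0, h - 1).astype(int)
        sx = np.clip(cx + (xs - cx) / scale, 0, w - 1).astype(int)
        out[..., channel] = image[sy, sx, channel]
    return out
```

Layered subtly over live-action footage, an effect like this reads as an optical artifact rather than a digital one, which matches the “heightened but also photographic” goal described above.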
A signature effect is the Avatar State, when the Avatar achieves maximum power. “As far as the Avatar State and the arrow and eyes, that was something we put a lot of effort into,” Spates reveals. “We figured out with one of our in-house vendors what that would look like. The tricky thing is you’re putting all of this illumination and light on an actual image of a character. It’s easier when it’s all in CG, but we couldn’t make the CG aspect of Aang be so far different than the physical production footage of Aang. We had to figure out how to illuminate his head. We have a little bit of subdermal that we put around the arrow and his eyes. How much detail between the iris, pupil and sclera? We spent days and hours [figuring that out].”
Nothing would have been possible without the contributions of the other three Production Visual Effects Supervisors, Jared Higgins, Christopher D. Martin and Alex Gitler, as well as the army of vendors. “Scanline VFX did this incredible sequence of Koizilla wreaking havoc on the Fire Nation, and it’s all done through animation combined with simulations,” Raisani states. “Really complex work that is a combination of character, story, performance and technical complexity. It’s a cool sequence.” Spates agrees with his colleague. “Definitely what I want people to see is Koizilla, which is insane,” Spates says. “You talk about simulation — that is a lot of simulations. It gives me chills to my bones every time I see it. The things that we did to it, also in color, to represent what happens in the animated series have turned out fabulous.”
Raisani has a personal bias. “Another fun sequence is the Aang-Bumi fight, which is part of the block I directed in Episode 104. I leaned heavily on the animated series and tried to do everything I could to represent that animated series in living, breathing form. For every beat that I could, I would grab the animated series and say, ‘We’re going to do this shot and that shot.’ The crew had a fun time shooting it, and we definitely had a great time in post putting that onscreen.”
By OLIVER WEBB
Images courtesy of Republic Pictures and Theodor Groeneboom.
Mahalia Belo’s remarkable feature directorial debut The End We Start From follows a woman (Jodie Comer) and her newborn child as she embarks on a treacherous journey to find safe refuge after a devastating flood. Based on Megan Hunter’s 2017 novel, The End We Start From is a hauntingly realistic depiction of a dystopian London submerged underwater.
Theodor Groeneboom served as Visual Effects Supervisor on the film. “A friend of mine in London went to film school with Mahalia,” Groeneboom recounts. “Mahalia reached out to me because of him. I used to live in London and worked in a few of the big studios there doing visual effects. Then, I moved back out to Norway a couple of years ago and started my own little company. We do quite a bit of work for the U.K. visual effects companies and some of the independent U.K. films as well. It felt like an extension of keeping in touch with everything that was happening there. I was involved from the early stages – the screenplay breakdowns, planning on the shoot and throughout the shoot and post-production, so it has been a long journey. I very much enjoyed it.”
The post-apocalyptic elements in the film are a backdrop in the story. “It’s not a visual effects film,” Groeneboom acknowledges. “That’s very much true for how the book portrays it as well. Production Designer Laura Ellis Cricks and Mahalia were very much into making sure the film has some kind of texture to it, not just visually but the way the landscape conveys through the film that it’s not just front and center that everything is happening.”
“We need to treat visual effects like some kind of story-driven element. They are just sprinkled around to provide texture to what is happening to Jodie [Comer] and her character, which is quite different from making a big scene and point out of it. It just happens to be what they are going through. There are quite a few scenes where we just put stuff in the background that could tell some kind of environmental thing that something has happened and not draw any attention to it. I guess that’s part of the whole textural side of things; they just want to paint the world but not make it completely obvious. I suppose subdued and textual were words that were frequently used about how to make the visual effects integrate.”
—Theodor Groeneboom, Visual Effects Supervisor
Explains Groeneboom, “It’s not so much a coherent film that goes from A to B; it sort of drifts in and out of these moments that you take in when you view the film that go away from the whole. We need to treat visual effects like some kind of story-driven element. They are just sprinkled around to provide texture to what is happening to Jodie and her character, which is quite different from making a big scene and point out of it. It just happens to be what they are going through. There are quite a few scenes where we just put stuff in the background that could tell some kind of environmental thing that something has happened and not draw any attention to it. I guess that’s part of the whole textural side of things; they just want to paint the world but not make it completely obvious. I suppose subdued and textual were words that were frequently used about how to make the visual effects integrate. Suzie Lavelle, the DP, was also a big part of that conversation in driving the lighting and the look of everything.”
Groeneboom and his team collected lots of material for visual references and relied heavily on the production design ‘bible’ that Production Designer Laura put together. “We were just looking at real photos of flooded areas such as farmland and cities around Europe and all from recent events,” Groeneboom explains. “It’s all based on real references that are quite current. The concepts from Laura were amazing, and it was sort of a bible – its own locations and references from both the recce and the [actual] places. There’s a bunch of scenes where there are animals trapped in a bit of mud, and you just see those skeletons sticking out. It’s just in the background and you don’t notice it, but it’s everything that can tell some kind of story. Barbed wires being cut for some of the fences, just whatever environmental storytelling they can think of, we tried to put in the background for some of these shots.”
“There’s a bunch of scenes where there are animals trapped in a bit of mud, and you just see those skeletons sticking out. It’s just in the background, and you don’t notice it, but it’s everything that can tell some kind of story. Barbed wires being cut for some of the fences, just whatever environmental storytelling they can think of, we tried to put in the background for some of these shots.”
—Theodor Groeneboom, Visual Effects Supervisor
“There were loads of references we were putting into the film,” he continues. “There were references for the textural quality of the rain and how much we should use. Some of it was more on the practical side of things and how to shoot, etc. and elements we want to use. As an overarching production design bible, there was a lot of stuff that came from Laura. A lot of ideas of not strictly stuff they wanted to see in the film, but Mahalia liked the feeling of it. There’s definitely less of the 28 Days Later vibe with the military and trying to keep it a bit more chaotic and grounded in the people around Jodie.”
“[Fleet Street underwater] was all approached from the same angle as everything else, that it needs to be grounded in reality, and you shouldn’t really pay attention to the effects of it. You should just take in the image as this is something that’s happened. We rebuilt the whole of Fleet Street. … I was taking photographs of every single façade, building, element and item that I could find. We later modeled them up in 3D for textures and lighting ornament and rebuilt the street from scratch. There are some obvious liberties taken to make sure that lighting looks as good as it can, so there are gaps in-between the buildings just to put nice eye lights on every other building.”
—Theodor Groeneboom, Visual Effects Supervisor
Belo and Lavelle had a clear vision in their heads of what they wanted to achieve. “It wasn’t specific, but they were after a certain feeling. I think the production design bible picked all the right pieces,” Groeneboom remarks. “It was Suzie who set the real textural film quality to everything with her cinematography. We did really early development, sketches of trying to make things feel like they are underwater, like a suburb submerged in the city. This was all CG stuff that we were playing with beforehand to see whether or not it was doable with a small team and the amount of resources that we had. These were all based off the same references from the production bible.”
The biggest sequence for Groeneboom involved trying to figure out how to do London underwater, specifically Fleet Street. “Underwater in whatever capacity we could do,” he adds. “I think we got there in the end. It was all approached from the same angle as everything else, that it needs to be grounded in reality, and you shouldn’t really pay attention to the effects of it. You should just take in the image as this is something that’s happened. We rebuilt the whole of Fleet Street. I was dangling off a rental bus, one of the old Routemasters, and I was taking photographs of every single façade and building and element and item that I could find. We later modeled them up in 3D for textures and lighting ornament and rebuilt the street from scratch. There are some obvious liberties taken to make sure that lighting looks as good as it can, so there are gaps in-between the buildings just to put nice eye lights on every other building. Once the light hits a certain angle on Fleet Street, it just becomes completely obscured by the buildings.”
Fleet Street proved to be a challenging task as it’s an iconic street in London that people are very familiar with. “There are so many great photo references of it as well,” Groeneboom details. “It was a matter of bouncing between what Mahalia felt was real, or what she would accept being real, and what Suzie thought of the texturing and lighting of the scene and how she would approach it from a practical point of view. Obviously, she can’t light an entire street, but how as a DP would she approach it from a practical point of view? We had lots of discussions trying to figure out the right angle for the sun, but also blocking out certain elements to create interesting patterns on one side and having the other side a bit more muted. The opening shot of that scene was quite challenging as well, just trying to make it feel like London specifically. It was shot on a little greenscreen.”
There were around 120 visual effects shots in total, with 13 or 14 of those being Fleet Street. “The film is quite slow-paced, especially in terms of the number of cuts in the film. Every single little detail and item on Fleet Street was modeled up and painstakingly created. There are some hints of hope in this scene as well. For example, there are a few people dotted around in the windows. The bus was a challenge as well because the chassis of the bus was not favorable to any particular lighting. If you see Routemaster buses in the city and take pictures of them, they are just uniformly red. It’s very hard to see any shading on them. Getting some prospective lighting on them proved to be a little bit difficult, so we had to exaggerate the amount of light and shadow that the material actually has. It’s a combination of plastics and metal, so that was a bit of a faffle.”
Continues Groeneboom, “[Lighting was also an issue on] quite a few shots where there is flooding on some of the signs, but it’s all CG. We rebuilt a lot of environments and tried to make it all as integrated as possible. For instance, the subtlety of some of the elements, like dead animals floating. Maybe you don’t pick it up when you watch the film for the first time, but there are quite a few of these images where they are driving and there is stuff in the background. I quite like that because a lot of films I’ve worked on previously have been blockbusters and the effects are front and center, but here they are subdued in the background. One of my favorite shots is of a traffic jam. There is a giant boom mic on the windshield of the car, which was a bit of a pain to remove. We just extended the whole background with a bit of the M25 in the background, and there are fires and fire trucks and lots of things happening, but you probably don’t see it. In terms of rain enhancements, it was just putting more in. On the day, you can only get practical rain so close to certain things before you have to do some augmentation with visual effects. Working with real effects elements is always a bit unwieldy. We also worked on the little baby bumps as well, which are partially prosthetics and partially visual effects.”
Groeneboom and his team had ample support from every department in pre-production and on set. The riggers and the gaffers were especially helpful in pulling up greenscreens if needed and accommodating the lighting and tracking markers whenever Groeneboom and his team were able to be on set for supervision. “We were there for the most important days, but they would take our potential work into consideration by just phoning us up and asking if we need tracking markers here, for example,” he adds. “So, in terms of the shoot, we were quite welcome and an integral part of solving some of these shots. In terms of the post-production side, it was me and my company [Rebel Unit in Bergen, Norway] doing it. SunnyMarch were producing the whole film, and they were our client for the job. We are a small team of 10-11 people. I think we had six people on this at the most. The idea of my company is that most of the people working there have worked in larger facilities before, so we are trying to move away from thinking in large pipelines and overcomplicating, or over-engineering, things; we try to do as much as we can in off-the-shelf software and just be a bit more nimble in our approach. I’m quite happy with a small team of six people doing the work. There were two of us, including myself, doing the modeling for the Fleet Street elements.”
“We rebuilt a lot of environments and tried to make it as integrated as possible. For instance, the subtlety of some of the elements, like dead animals floating. Maybe you don’t pick it up when you watch the film for the first time, but there are quite a few of these images where they are driving and there is stuff in the background. I quite like that because a lot of films I’ve worked on previously have been blockbusters and the effects are front and center, but here they are subdued in the background.”
—Theodor Groeneboom, Visual Effects Supervisor
Groeneboom concludes, “Given the fact we are a small team and what we were able to achieve, I’m quite proud of the work we did. I really enjoyed working with Mahalia. It’s her first film, and I think she’s approached it in such an interesting and inspiring manner. I’m very interested in seeing what she does next. It was also great working with Suzie, as she’s lovely to work with, and working with the production team was one of my favorite parts. Reading good reviews, and the fact that the film has an important backdrop about the state of the world, is interesting. From a visual effects point of view, you tend to go with more lackluster ideas of what apocalyptic visions happen to be. I thought this felt more real and important. Also, watching Jodie perform was really cool. She’s absolutely phenomenal.”
By TREVOR HOGG
Images courtesy of Netflix.
Just as Akira Kurosawa was a major influence on George Lucas, The Hidden Fortress in particular, Zack Snyder was inspired by Seven Samurai for what was initially meant to be a Star Wars pitch that has since been retooled for Netflix as a multiplatform original IP. A feature film was shot that has been divided into two parts, the first being Rebel Moon: Part One – A Child of Fire, followed by Rebel Moon: Part Two – The Scargiver. Participating in what Snyder has termed a “giant atmospheric space adventure” is Visual Effects Supervisor Marcus Taormina, who previously collaborated with the filmmaker (known for speed ramping, hyper-real, painterly compositions and lens flares) on another Netflix production, Army of the Dead. “What has been nice about working on both movies simultaneously, both shooting them and doing post-production, is that we’re looking at both movies at the same time,” Taormina states. “A lot of what is set up in Part One we use in Part Two.”
Overall, 1,380 visual effects shots were created by Framestore, Luma Pictures, Mammal Studios, Rodeo FX, Scanline VFX and Wētā FX for Rebel Moon: Part One – A Child of Fire, which revolves around an adopted daughter of a despot standing up against him by assembling a gang of notorious renegades to protect a planet that she now calls home. What makes the production somewhat unusual is that Snyder doubles as his own cinematographer. “It’s nice to have a director/DP because I only have to go to one side of the set versus splitting time,” Taormina notes. “Days of Heaven was a huge inspiration for this movie, which means organic filming, daylight dependent, lots of lens flares, and we also had a custom one-of-a-kind anamorphic package that Zack created for the film, which in itself was a huge challenge. What was nice was that I could go to him about things that I needed. ‘I love the lens flares, but I need you to do a clean pass as I have to erase that flare and put it back over the work later on.’ He was understanding about that and granted me those opportunities on set.”
Skies set the tone for scenes, and Snyder sent a whole library of them to Taormina. “It would always be, ‘I like this and that reference,’” Taormina explains. “I would pull them together and go, ‘What do you like about this one?’ I would smash them all together. We had a lot of discussions about the gas giant Mara at the beginning of the movie, which is not a sky, but it’s [related] enough that it’s an important part. Zack found a colored look that he liked that was a dirtier orange. Obviously, he had to put the lightbox up there, which was a headache in itself because the flare contaminated the lens a lot. There was a lot of compositing needed to put those shots together, but it adds to the believability because when you get this dirty orange wash over Kora [Sofia Boutella] it feels cohesive.” Over 38 worlds had to be conceptualized, with the main ones being Veldt, Neu Wodi, Daggus, Castor, Sharaan, Gondival and Motherworld. “It starts with our production designers, Stephen Swain and Stefan Dechant; they sent a lot of reference packages our way. Obviously, if there were practical pieces, we would try to infuse those back into our digital worlds. But trying to make them unique yet familiar was a challenge. Atmospherics, the mood and lighting, all of those things were important and played a role. When going to Sharaan we meet King Levitica. It’s moody because it feels like we’re not supposed to be there as the viewers,” Taormina remarks.
“It starts with our production designers, Stephen Swain and Stefan Dechant; they sent a lot of reference packages our way. Obviously, if there were practical pieces, we would try to infuse those back into our digital worlds. But trying to make them unique yet familiar was a challenge. Atmospherics, the mood and lighting, all of those things were important and played a role.”
—Marcus Taormina, Visual Effects Supervisor
For arid Neu Wodi, the ranch was a real location; however, when the creature known as a Bennu takes flight, the entire environment becomes digital. “We talk about how we wanted to get a lot more claustrophobic,” Taormina states. “When we fly through the spires, it feels like they’re closing in on us, and then the Bennu [nicknamed Beatrice] smashes Tarak [Staz Nair] off onto the cliff. We have to make sure that when Tarak jumps, he jumps down into a huge ravine, which feels threatening. We let it open back up towards the end where there’s this majestic scale and beauty to it. Designing that was fun.” The Bennu harkens back to Pegasus. “It’s neither a raven nor a crow but both at the same time, and a gryphon, too.” Stunt performer Albert Valladares was placed in the middle of a parallelogram with his colleagues holding on to ropes attached to his backpack to simulate what it would be like trying to restrain a rearing Bennu. “Every take I was like, ‘Zack, hang on. I’ve got to give a note.’ I’m giving notes to special effects and creature stunt performers while Zack is giving notes to Staz Nair. On occasion I would say, ‘Staz, just imagine that at this moment you’re going to get ripped apart, and now you have a subtle moment where you get to interact with him. Live in that moment. And also let Albert do some of the performances and lead you, in a sense,’” Taormina says. Nair was subsequently captured sitting on a gimbal setup in a parking lot for the aerial sequence. “I said to Zack, ‘I know that we have these anamorphic lenses, which are great. However, let’s do one camera anamorphic and shoot the other three spherically. It will be super sharp, but don’t worry, we’ll add our optics later on to make it feel anamorphic.’ We basically reanimated and recomposed all of those shots and scaled them down so that it feels like he’s flying through.”
“We talk about how we wanted to get a lot more claustrophobic. When we fly through the spires, it feels like they’re closing in on us, and then the Bennu [nicknamed Beatrice] smashes Tarak [Staz Nair] off onto the cliff. We have to make sure that when Tarak jumps, he jumps down into a huge ravine, which feels threatening. We let it open back up towards the end where there’s this majestic scale and beauty to it.”
—Marcus Taormina, Visual Effects Supervisor
Influencing the aesthetic of the cobalt-mining planet Daggus was Blade Runner. “I wanted it to feel dark, dingy and moist,” Taormina states. “A lot of narrative and production design determined that environment as we had a lot of built pieces to the set.” Dwelling in the basement level is a native spider/humanoid hybrid called Harmada (Jena Malone) which abducts a child and in doing so comes into conflict with Nemesis (Bae Doona). “The stuntvis or previs was a combination of digital shots and stunts in motion capture suits. Because that space was so small, stunts had to be careful about the movements. Jena Malone is on this huge swivel rig, which is best described as a cart that she is strapped to because we didn’t want her body moving too much, as we needed to see her just below the navel or bust to make it cohesive. We had three to five stunt performers swiveling the rig while having it going up and down to the performances. They also have noodles that are interacting with Doona. It was a harmonious integration of everything together because we didn’t want to replace Jena as her performance was too great.” Nemesis wields two swords that avoid being replicas of lightsabers. “When I had that initial conversation with Zack, all we knew is that we were going to have acrylic rods with LEDs in them that had a warmer amber color. I had this light painting of streaking sparklers. I was like, ‘That’s cool. I think we can do that almost like a synthetic shutter or delayed shutter on the swords, add a heat signature and smoke and then add what we call “sword popcorn” as well. It’s those sparks that came off of it.’”
“Every take I was like, ‘Zack, hang on. I’ve got to give a note.’ I’m giving notes to special effects and creature stunt performers while Zack is giving notes to Staz Nair. On occasion I would say, ‘Staz, just imagine that at this moment you’re going to get ripped apart, and now you have a subtle moment where you get to interact with him. Live in that moment. And also let Albert do some of the performances and lead you, in a sense.’”
—Marcus Taormina, Visual Effects Supervisor
Anthony Hopkins voices a robot called JC-1435, also referred to as ‘Jimmy,’ that decides to participate in the rebellion against the oppressive Imperium led by Kora. “I asked Zack if we could video record and do an ADR scratch session with Anthony Hopkins,” Taormina reveals. “If you don’t have your performer listening to the way that Anthony delivers the lines then there may be a mismatch, and when you put them all together in the end, your brain is going ‘something is wrong or odd here.’ The inflections and body movements are not right. We made the Jimmy suit for Dustin Ceithamer that had chest plates in the front and back, shoulder pads, a face plate and some hands, but the hands had to be replaced digitally. When we got into post, Dustin’s performance and Zack’s direction were so great and minimal that it was less robotic and more human. I actually mandated that when we could, which is about 90% of the shots, to keep the practical chest and face plates, which is a lot more difficult to do because it was so beautiful in the way Zack captured them in the available light.”
“When I had that initial conversation [about the swords of Nemesis and avoiding the lightsabers of Star Wars] with Zack, all we knew is that we were going to have acrylic rods with LEDs in them that had a warmer amber color. I had this light painting of streaking sparklers. I was like, ‘That’s cool. I think that we can do that almost like a synthetic shutter or delayed shutter on the swords, add a heat signature and smoke and then add what we call “sword popcorn” as well. It’s those sparks that came off of it.’”
—Marcus Taormina, Visual Effects Supervisor
Another interesting approach was taken for the restraining devices, which have a mechanical base resembling a crab. Explains Taormina, “Those are called ecto-shackles in the script, but we named them ‘Beetlejuice Chairs.’ We have stunt performers grabbing the cast when they get thrown back, and then we swap out and put the practical prop in there. Like for Gondival or in Providence when the gentleman gets captured, we do the stunt followed by the digital version of it and then swap it out, and we had special effects create this RC-controlled base of the ecto-shackle. We put him on that with only the spine and added all of the digital pieces of it walking. Again, with your brain you’re trying to do the trickery of ‘what’s real and not.’ That’s a great example of stunts, props and special effects doing a fantastic job. We get the plates and go, ‘Let’s make this look cool.’”
By TREVOR HOGG
Images courtesy of Netflix.
Producer and long-time collaborator Peter Mavromates extends his partnership with David Fincher in The Killer, where an assassin seeks revenge after a botched assignment. The Netflix feature consists of 900 digitally-augmented shots that range from shortening the tail of a dog to CG airplanes, tasked to a vendor list that includes Ollin VFX, Artemple-Hollywood, Savage VFX and Wylie Co. as well as an in-house team. “Visual Effects Compositor Christopher Doulgeris and I will go into the color bay with [Colorist] Eric Weidt and talk about some issue that we had,” Mavromates explains. “Even sometimes if it’s an outside vendor, we’ll focus to help problem-solve. It’s this wonderful and fluid atmosphere, and it works for David Fincher because he’s always got ideas flowing. He doesn’t want to be on a clock at a facility where you’ve got from 2 p.m. to 5 p.m. and then it’s overtime. There’s none of that. David will walk the halls and stop in on people to check on stuff.”
“The dog gives a vicious performance but had a tail that is probably about 12 or 14 inches long, which drove David crazy because when it wagged, he looked too cute! David calls me in and says, ‘We’ve got to get this tail down to two inches.’ … Ollin VFX in Mexico doctored the tail. When you look at the movie and see that tail, there is another 10 inches or so that you’re not seeing anymore!”
—Peter Mavromates, Producer
An unusual visual effects situation arose when a guard dog pursues The Killer, played by Michael Fassbender, after its owner has been murdered. “The dog gives a vicious performance but had a tail that is probably about 12 or 14 inches long, which drove David crazy because when it wagged, he looked too cute!” Mavromates laughs. “David calls me in and says, ‘We’ve got to get this tail down to two inches.’ This is a night scene, so in terms of the type of work that you have to do on manipulating an image, it was tough footage. We had 37 shots, and for that we worked with Ollin VFX in Mexico, and they doctored the tail. When you look at the movie and see that tail, there is another 10 inches or so that you’re not seeing anymore!” What has become more common is the reframing of shots in the DI. “That is something David discovered while we were still shooting on film for Panic Room,” Mavromates remarks. “Once you had all of that film scanned and you’re in the DI suite, there is an opportunity to improve on the headroom. You couldn’t move it right or left that much because of the way it was shot on the negative, but you have a lot of north and south. We did about 100 shots then, and that number has continued to go up where it’s more than 50% of the shots in recent movies where the framing is adjusted.”
“[The reframing of shots in DI] is something David discovered while we were still shooting on film for Panic Room. Once you had all of that film scanned and you’re in the DI suite, there is an opportunity to improve on the headroom. You couldn’t move it right or left that much because of the way it was shot on the negative, but you have a lot of north and south. We did about 100 shots then, and that number has continued to go up where it’s more than 50% of the shots in recent movies where the framing is adjusted.”
—Peter Mavromates, Producer
Considering that the entire first reel of the movie has the protagonist surveying the building across the street in Paris, one would have thought that Rear Window would have been an influence. “It’s astonishing how little we talked about that movie,” Mavromates notes. “The movie that David referenced and has nothing to do with the look of it is Le Samourai in terms of the tone and what the character is.” The daytime building in Paris actually exists and has been featured in Emily in Paris. “The initial shooting was in Paris and was in that square,” Mavromates explains. “We had eight cameras rolling so that David could capture the images of the people walking in the square and the façade of the building in the daytime. You could capture them simultaneously with different lens lengths so that the action matches perfectly because it’s literally the same take. Later, in New Orleans, they shot the individual window settings all laid out on a stage for the night scenes. Based on the daytime footage of that building and some nighttime plates, Artemple built an element where they put those windows in that were shot in New Orleans, then added tinted glass on the foreground.”
“Special effects put a wick in the bottle [of the Molotov cocktail that Fassbender throws] that had these LED lights, which were golden, and Ollin VFX went in and put the flame over that. What was astonishing is Fassbender threw the bottle that far and it landed a little bit to the left of the door. I couldn’t do that, for sure!”
—Peter Mavromates, Producer
A digital double was created for the scooter escape of The Killer through the streets of Paris after the botched assignment. “The scooter and Michael Fassbender are all CG,” Mavromates reveals. “They did shoot Michael Fassbender on a scooter. That was in our early edits. Then Wylie Co. came in and slowly replaced everything. The background is a photographic plate behind the digital character and scooter in the foreground. If you were doing that all from scratch, that’s a big ask. At least when you have those plates, a lot of lighting decisions are made inherently.”
“There might be a few Fitbits that are actually photographic. I doubt it, because we created an interface in post for it. The interface that you see is not one that is exactly right for a commercial product as with the little music player that he has. Somebody asked me, ‘What is that MP3 player? Is that the Microsoft one? What is that?’ I answered, ‘No, that is the Fincher pod.’”
—Peter Mavromates, Producer
Even the Fitbit that Fassbender wears got a facelift. “There might be a few Fitbits that are actually photographic,” Mavromates observes. “I doubt it, because we created an interface in post for it. The interface that you see is not one that is exactly right for a commercial product as with the little music player that he has. Somebody asked me, ‘What is that MP3 player? Is that the Microsoft one? What is that?’ I answered, ‘No, that is the Fincher pod.’” The Molotov cocktail that Fassbender throws was also digitally augmented. “Special effects put a wick in the bottle that had these LED lights, which were golden, and Ollin VFX went in and put the flame over that. What was astonishing is Fassbender threw the bottle that far and it landed a little bit to the left of the door. I couldn’t do that, for sure!”
By TREVOR HOGG
Images courtesy of Toho Company.
Not since Neill Blomkamp released District 9 in 2009 has an international production received Academy Award and VES Award nominations for its visual effects work. But Takashi Yamazaki has repeated the feat by stomping through the box office beyond Japan along with an iconic kaiju that has been a cinematic staple since 1954. Godzilla Minus One revolves around a World War II kamikaze pilot suffering from survivor’s guilt who re-encounters the title character, which has gone through further mutation because of American nuclear tests at Bikini Atoll.
“In the current digital era, we tried to use technology that could only be used digitally. We have a lot of close-up shots of Godzilla to instill fear in the audience because it’s quite rare in Godzilla films for Godzilla to appear in the same scenes as people. We made it possible because of the high level of detail we included in the CG.”
—Takashi Yamazaki, Director/Screenwriter/Visual Effects Supervisor
Visual and special effects have dramatically evolved like the creatures in the Godzilla franchise. “In the current digital era, we tried to use technology that could only be used digitally,” notes Yamazaki, who was previously responsible for the live-action adaption of Parasyte. “We have a lot of close-up shots of Godzilla to instill fear in the audience because it’s quite rare in Godzilla films for Godzilla to appear in the same scenes as people. We made it possible because of the high level of detail we included in the CG. On the story side, 1954 Godzilla does a great job balancing the human drama with the Godzilla scenes; therefore, we were mindful in trying to have strong story and character development, and to make sure that it’s woven together with what Godzilla is doing on screen.”
Along with being the director, Yamazaki was the Visual Effects Supervisor on the project. “Let me add screenwriter to that as well! Being all three had its benefits, although having said that, when I went to shoot on location, I wanted to ask the writer why he wrote in that specific scene because it was so difficult to shoot! When we went into post, I wanted to ask the director why he shot the scene in that particular way because it made the visual effects that much more challenging! But of course, I only have myself to blame for all of it!” The different roles had an influence on each other. “Normally, when I write a screenplay, I have to pay some consideration to the visual effects team. Can they achieve this scene? However, in this instance I decided to trust my future self for the sake of efficiency. In post-production, because the director was one and the same as the visual effects supervisor, it allowed for more trial and error in the same amount of time. We were able to avoid any miscommunication or difference in creative direction when it came time for approvals.”
“[I]n this instance I decided to trust my future self for the sake of efficiency. In post-production, because the director was one and the same as the visual effects supervisor, it allowed for more trial and error in the same amount of time. We were able to avoid any miscommunication or difference in creative direction when it came time for approvals.”
—Takashi Yamazaki, Director/Screenwriter/Visual Effects Supervisor
“We watched various Godzilla movies again and learned what makes everyone think, ‘This is Godzilla,’” Yamazaki explains. “We analyzed a great deal of photos and videos for historical background. The assistant director’s team collected a variety of background materials, and I remember being terrified that we could no longer use existing materials. Once we had a clear picture of what was going on at the time, we couldn’t just fudge it.” Storyboards drove the design process. “I drew everything that involved visual effects. It was a huge amount of work. The assistant director [Kôhei Adachi] and Kiyoko Shibuya [Visual Effects Supervisor at Shirogumi, which was the sole vendor] were very impatient with me! Basically, there was no such thing as concept art. Based on the storyboards, I set up the scenes and created previsualization with some staff members using simple CGI. From that point on, I sat next to the CG artists and gave them direct instructions if they were not going in the direction I was aiming for. If something is closely related to the shooting, I made it before the shooting starts. However, if it was necessary to include it in the editing, after the shooting is over [for example, a full CG shot], I made postvis in the same way to set the rhythm of the editing.”
The overall silhouette of the creature is based on the previous Godzilla suits. “The dorsal fin is more acute and the legs are thicker to give a more ferocious impression,” Yamazaki remarks. “We tried to heighten the fear by making Godzilla closer to the characters, so we included many fine details to allow the camera to get closer to the characters.” No motion capture was utilized in the animation of Godzilla. “However, to help define Godzilla’s look, the animator and I spent a lot of time testing Godzilla’s walk. In Shin Godzilla, Godzilla’s posture feels very straight and uptight. In Hollywood, Legendary’s Godzilla feels more aggressive, like an animal ready to pounce. But we wanted something different from both of those. In Japan, Godzilla represents both God and Monster, so we wanted its movement to feel almost divine or God-like. We adjusted the height of its waist, how it moves and its posture many times before arriving at its current design,” Yamazaki states.
“Basically, there was no such thing as concept art. Based on the storyboards, I set up the scenes and created previsualization with some staff members using simple CGI. From that point on, I sat next to the CG artists and gave them direct instructions if they were not going in the direction I was aiming for.”
—Takashi Yamazaki, Director/Screenwriter/Visual Effects Supervisor
As for the world-building, it was important to transport audiences back to 1947. “We collected a large number of photos and videos from that time and were conscious of recreating the atmosphere of that period,” Yamazaki explains. “We could not use any existing open sets, and we did not have the budget to construct new buildings, so we created a composite of a digital building on a simple road set. It was difficult to make the two fit together.” The visual effects shot count is misleading. “Although there were 610 cuts, screentime amounted to two-thirds of the entire film. Production time was roughly eight months after the shoot was over and in full swing, although several people were involved in modeling and scene design before the shoot.”
“In Japan, Godzilla represents both God and Monster, so we wanted its movement to feel almost divine or God-like. We adjusted the height of its waist, how it moves and its posture many times before arriving at its current design.”
—Takashi Yamazaki, Director/Screenwriter/Visual Effects Supervisor
The first ocean battle was the most complex scene to execute. “We wanted to shoot this on location at sea because I felt it gave the picture this cool documentary style,” Yamazaki reveals. “What I didn’t expect is that everyone got seasick, and the weather was quite unstable, which made filming difficult. Once we took the footage back to the office, the natural waves that we captured were both beautiful and complex, which made it difficult making a giant creature swim through it creating its own waves. With that said, filming on one location and overcoming this challenge unified the team, and that natural imagery we were able to capture made the shot more powerful and convincing. Would I write an ocean scene into my next screenplay? That’s debatable!”
Simulating the ocean was hard because of the amount of data and work required to make it appear believable. “A young staff member showed us a simulation of the ocean that he had made as a hobby using his home-made computer,” Yamazaki recalls. “It was so good that we rewrote part of the scenario and increased the number of ocean scenes considerably. However, in the latter half of the work, I came to regret that decision. We didn’t have a server with enough capacity to store it all, so we had to make do by deleting cuts as they were completed. I was astonished when I was told that the total amount of data exceeded one petabyte!” A personal favorite is Godzilla emitting a heat ray and destroying the entire Ginza area. Yamazaki comments, “Including the gimmick of the dorsal fin and the depiction of the area after it is destroyed, I believe I was able to create a heat ray that is more powerful and more horrific than ever before; that is a proper metaphor for the atomic bombing.”
Watch these fascinating videos on the history of Godzilla, the making of Godzilla Minus One and the production of the VFX behind the film. Click here: https://www.youtube.com/watch?v=rIZRvKsnqtU, https://www.youtube.com/watch?v=vvuD5bPYimU, https://www.youtube.com/watch?v=CQZPh-tnH5o/
By TREVOR HOGG
Images courtesy of Apple TV+ and Legendary Entertainment.
The Apple TV+ series Monarch: Legacy of Monsters provides the backstory for the mysterious organization formed upon the discovery of kaiju through a family drama where two half-siblings learn of each other’s existence and their father’s connection to Monarch. The MonsterVerse production created by Chris Black and Matt Fraction on behalf of Legendary Entertainment, Toho Company and Warner Bros. Entertainment consists of 10 episodes that required Visual Effects Supervisor Sean Konrad to craft over 3,000 shots with the help of Rising Sun Pictures, Rodeo FX, Framestore, FuseFX, Outpost VFX, Crafty Apes, Wētā FX, MPC, Storm Studios, Vitality VFX, BOT VFX, Mr. Wolf, Scarab Digital, The Third Floor, Proof, MPC Visualization and an in-house team.
Integrated into the narrative are previous MonsterVerse installments that appeared on the big screen. “What’s interesting about the show is the scene in the beginning of Episode 101 has this Kong: Skull Island [2017] classic adventure movie tone that is a bit off-the-wall crazy and every second has a new amplification of stakes through the action,” Konrad notes. “Then you get back to Cate Randa’s [Anna Sawai] perspective later on in the episode, and you’re going to Godzilla [2014] and you’re having that serious tone of this is a city being destroyed. You need to communicate those ideas visually in a way to people that is meaningful and serious. Then you have the final creature scene in the episode, which is a bunch of new monsters coming out of the ground and attacking our protagonist. Each one of them has a different tone.” Do not expect shots from the kaiju point of view. “A lot of the series is constructed around a subjective point of view of our characters experiencing the action from their perspective and reinforcing that with the visual effects design. A lot of times we knew the kind of creature we wanted and the terrain was immaterial to what the creature was doing, but it did influence how and where we shot it,” Konrad says.
Taking advantage of the physical location rather than rely heavily on bluescreen was the mandate. “We went to a bunch of locations in Hawaii for Episode 101, and one of them was Lānaʻi Lookout [on O’ahu],” Konrad recalls. “It is this beautiful volcanic rock cone that points out into the ocean. If you’re running away from a monster that’s a great place to be heading towards. We looked at the terrain and the action we wanted to plan. There is a big bamboo forest in the corner of this landscape, and we let that take you through Bill Randa [John Goodman] being chased by the Mother Longlegs spider from Kong: Skull Island to a single point on that landmass, and this giant crab, which is made from the same volcanic rock, comes out of the ground. It was difficult because we prevised the scene based on some storyboards before scouting it, which is always a dangerous thing. We didn’t have a crab designed, so we grabbed The Third Floor graphic crab out of their archive and animated that for the previs. Simultaneously, I was doing the concept for the crab itself. All of that happened within a five-week period where we concepted, re-prevised and re-storyboarded the whole thing. It’s so much better than trying to shoot that into a bluescreen. It’s a hard process and definitely time-consuming. People did some brilliant work to get that done. We wanted the characters to feel in the action.”
“There is a big bamboo forest in the corner of this [volcanic Hawaiian] landscape, and we let that take you through Bill Randa [John Goodman] being chased by the Mother Longlegs spider from Kong: Skull Island to a single point on that landmass, and this giant crab, which is made from the same volcanic rock, comes out of the ground. It was difficult because we prevised the scene based on some storyboards before scouting it, which is always a dangerous thing. We didn’t have a crab designed, so we grabbed The Third Floor graphic crab out of their archive and animated that for the previs.”
—Sean Konrad, Visual Effects Supervisor
Television production happens at a quicker pace than movies. “When you’ve got one movie, you have over six months to do an hour and a half episode as opposed to 30 days to shoot a movie,” states Special Effects Supervisor Paul Benjamin. “The setups have to be doable to make it into TV land. We had a bit of prep time for the first two episodes. It helps if the directors are on beforehand to know if you have some bigger builds. However, it’s always hard to get the directors before the episode starts.” Some visual research was done regarding the MonsterVerse. “I took a quick view of what they’ve been up to and been doing. When you watch the movies, sometimes it’s hard to figure out how they did everything, or how you block it out and film it. You get a general sense of what you’re up against or what they’re going to be looking for. But as far as pulling builds from watching the movies, it’s quite difficult,” Benjamin adds. Atmospherics were not a significant part of the visual language. “We did a couple of episodes of heavy smoke when they went to the Lost Lands to give a different look to the environment. We definitely did some snow but didn’t do a lot of exterior snow dressing, except for the base camp.” The weather was not always agreeable. “We did a snow dress and had a heavy rain that night, but luckily it held up and we didn’t plug any drains. We had somebody there watching just in case all of our paper snow came down and plugged one of the drains,” Benjamin remarks.
“We did one shaky gimbal set and that was for the hallway scene in the USSR when the ship was shaking. The monster comes in there and starts bashing around. We built that whole big deck on a floor and made it flip back and forth and tilted it up to whatever angle we wanted. We had big shaker motors on it and rocked it back and forth this way and tilted it that way. We shook that one pretty good.”
—Paul Benjamin, Special Effects Supervisor
Airbag decks were favored over hydraulic gimbals, such as when the school bus is tipping over the severely damaged Golden Gate Bridge. “We did one shaky gimbal set and that was for the hallway scene in the USSR when the ship was shaking,” Benjamin explains. “The monster comes in there and starts bashing around. We built that whole big deck on a floor and made it flip back and forth and tilted it up to whatever angle we wanted. We had big shaker motors on it and rocked it back and forth this way and tilted it that way. We shook that one pretty good.” It was not all about shaking things. “For the last few episodes we did a vortex, so we had a lot of big wind machines and ratcheting things and pulling things into the vortex. I want to see that portion of it. For the vortex in Alaska, we were shaking the trucks and equipment. But when the creature was chasing them, we didn’t do any explosions for that. The only thing that we did was blast air cannons to have some snow flying around. We did some pyro for the seismic charges that were set off at the power plant. Then we had Kurt Russell running through the lightning field. We had a bunch of mortars going off around them at that point, too.” After being in special effects for 23 years, Benjamin learned a particular lesson. “A lot of times, I find that the smaller gags are trickier than the bigger ones, like the dripping goo coming down from the ship. Something like that can be a lot of work, and testing to the desired look that everyone wants for that is sometimes more work than flipping a car over.” The practical elements are critical in making the stunts and visual effects believable. “We’re trying to do anything to help give the set some life so that the actors can get into it a bit more. You don’t have to fully act when you have the set moving around.”
“A lot of times, I find that the smaller gags are trickier than the bigger ones, like the dripping goo coming down from the ship. Something like that can be a lot of work, and testing to the desired look that everyone wants for that is sometimes more work than flipping a car over.”
—Paul Benjamin, Special Effects Supervisor
Collaboration is pivotal to the success of any project. “Ultimately, my attitude is that visual effects are a big part of the show,” states Jess Hall, Cinematographer, Episodes 101 and 102. “The CG has to be integrated into the photography, so I take it as my responsibility that those things have to work together. That means being collaborative and also organized about how you light; for example, on greenscreen matching lighting and doing the work in advance in terms of previs and storyboards. But, ultimately, I treat it as a collaboration for which I bear a lot of responsibility for the end result. It’s not like I’m going to shoot someone on a greenscreen, hand it over to visual effects and let them do their thing; that’s not going to produce a good result.” A different color palette was adopted for the series. “We scaled back a little bit on the gaudier and pulp elements of some of the movies. We tried to bring it more into the dramatic cinematic space. Even if you look at our version of Skull Island, my reference for that was more Apocalypse Now. It was the golden warm light but naturalistic approach. Then we go to Tokyo and you have these cool tones, but the lighting was always naturalistic. The composition was reasonably consistent. The lenses I would shoot the faces on were a similar composition. You’re building this thread of visual language that is bulletproof in a way. You can apply these period looks or more action-sequence elements in there, but it doesn’t feel out of place. That was the challenge of the show, and a lot of thinking around design for me was about bringing them together enough, but having them different enough because you also had to understand these timelines. It was important that Skull Island did look different from the 1950s content, otherwise I don’t think you understand where you were.”
“We scaled back a little bit on the gaudier and pulp elements of some of the movies. We tried to bring it more into the dramatic cinematic space. Even if you look at our version of Skull Island, my reference for that was more Apocalypse Now. It was the golden warm light but naturalistic approach. Then we go to Tokyo and you have these cool tones, but the lighting was always naturalistic.”
—Jess Hall, Cinematographer
Hall partnered with filmmaker Matt Shakman on Episodes 101 and 102. “Matt is a dramatist and a real actor’s director,” Hall describes. “Having to try to pretend that things are moving around you, and you’re on a bluescreen and saying, ‘Okay, the crowd is coming from the right.’ Or, ‘Now turn to look at the crowd 300 feet away.’ ‘Follow the tennis ball.’ We’ve all seen how that can be quite tough for actors and performers. Matt is always looking to put the actors in the position where they’re comfortable and can give the best performance, but also feel the scene. We went to practical photography. There were a lot of visual effects but a lot of photography on location, and a lot of in-camera stunt action and real effects work that went on in all of these scenes that added to the sense of realism, which is what we wanted.” Atmospherics like smoke were never utilized without intention. “You have to be careful about how much you put in because quickly you lose the contrast of the shot. It’s something that you rely on special effects to operate. Ultimately, I’m the one who has to say, ‘Turn the smoke on or off.’ The whole show had this low-level haze. I was going for a softer, more dramatic look with a bit of texture in there and in the shadows.” The legacy of what has come before loomed large over the production. Hall observes, “You’ve got this huge IP and franchise and so many different elements to it. How do you take that, respect that and do that justice, but also do something that is distinct and appropriate for the show that you’re making and is your own work? Threading that needle was hard.”
By OLIVER WEBB
Images courtesy of AMC Networks.
Following the events of the final season of The Walking Dead, Daryl Dixon washes ashore in France and must undertake a perilous journey in order to find a way home in the series The Walking Dead: Daryl Dixon. Jao M’Changama served as Overall Visual Effects Supervisor on the show, with Sébastien Voisin and Justine Paynat-Sautivet working as VFX producers. “Excuse My French is the French supervision company that hired me for the show. The show’s French line producers, Raphael Benoliel and Augustin de Belloy, found the Excuse My French team, and AMC hired them. It was unreal for me at the beginning, having The Walking Dead come to Paris and getting the chance to be the Visual Effects Supervisor – and getting to destroy Paris. I was finishing a day on a cute CGI commercial spot with friends, and we were aiming to go to a bar when I got the call. That was surprising because we weren’t expecting this big of a show to come over to France, and being selected to work on it was like a dream come true,” M’Changama says.
When it came to the initial conversations about the look of the show, M’Changama and his team set up a Zoom call with creator and showrunner David Zabel and the AMC team. “They were asking not how I see the show but questioning me about my strengths and what I like and how I feel about zombies,” M’Changama adds. “I explained that I come from advertising. I’ve been supervising advertisements for 10 years for both big and small campaigns. They were very pleased to know that I worked in various worlds and with different types of narratives, including the Ubisoft live campaigns, exploring sci-fi, gladiator and war-action styles. It meant that I can easily change worlds, which is one of my strengths. It was really cool to be only focused on the Walking Dead world and to transfer the 15 years of Walking Dead that I’ve been watching to France and know that I have a huge part to play in that role. I wanted to create trust with AMC, and I know it’s the first time they’ve worked in France, so that was mainly the first exchange. Then, the week after, there were some tech recces in the south of France. David [Zabel] was there as well as the director, Dan Percival, and Executive Producer Greg Nicotero, father of the Dead. It was great to switch between the U.S. TV show and being part of it [in France] in the best way we could.”
In terms of visual references, M’Changama looked at France’s urbex (urban exploration) culture to define the apocalypse in France. “There are still a lot of abandoned buildings left here in France, which we can’t destroy because they’re national treasures,” he notes. “The government won’t clean them, so they just stay there until something happens to them. One of my goals was to chase a lot of those urban exploration references. There are tons of YouTubers and photographers who go everywhere, so I looked for those dead hospitals, dead schools and dead churches. It gave us visual satisfaction because a lot of those places haven’t been touched and are still there waiting to be seen. We have this huge heritage in France. That was one of my first directions.”
“There are still a lot of abandoned buildings left here in France, which we can’t destroy because they’re national treasures. The government won’t clean them, and so they just stay there until something happens to them. One of my goals was to chase those urban exploration references. … so I looked for those dead hospitals, dead schools and dead churches. It gave us visual satisfaction because a lot of those places haven’t been touched and are still there waiting to be seen.”
—Jao M’Changama, Overall Visual Effects Supervisor
M’Changama paid close attention to weaponry as it’s such a large part of the Walking Dead universe. “France doesn’t have the same variety of weapons as the U.S., so this was an important part of our research,” he details. “For the visual effects, we shot specific plates because we have a heritage with older weapons in France. We wanted to design something that wasn’t too old school and fit into the Walking Dead universe. It was the same thing for Norman’s [Norman Reedus portrays Daryl Dixon] mace. He’d already used a mace in Season 10. In France it’s smaller, so we needed to put the green tape on the mace and ask Norman to do it a bit slower to make it seem heavier. We couldn’t slow down the movement with VFX. We needed to adapt a bit.”
M’Changama worked closely with Series Production Designer Clovis Weil. “We went to the same school and hadn’t seen each other for about 15 years. At the first meeting about the set decoration, we remembered when we were in school and how we built a frame and a picture, which was something we learned at school,” M’Changama explains. “It was great to work with Clovis as a friend and a very talented individual. His team was working on the show around three months before I arrived, so they’d already been thinking about the show. It was a real collaboration between the set design and the VFX department.”
“David has a great knowledge of French culture, but when it came to destroying it, he put a lot of trust in set decoration and VFX and instructed these departments on how to do that best. It was a very smart move,” M’Changama adds. “Expectations were high. We wanted to get out of the Emily in Paris perfect world and show how we, the French, see the apocalypse. It was all about telling little stories in the background that helped us build the world around the main characters.”
“[Creator/showrunner David Zabel] has a great knowledge of French culture, but when it came to destroying it, he put a lot of trust in set decoration and VFX. … We wanted to get out of the Emily in Paris perfect world and show how we, the French, see the apocalypse. It was all about telling little stories in the background that helped us build the world around the main characters.”
—Jao M’Changama, Overall Visual Effects Supervisor
There were 740 VFX shots in total, and over 400 artists worked on the show. “We managed the workload with confidence,” M’Changama says. “It wasn’t an issue, but we knew that we weren’t the usual Walking Dead team. We had to show everyone that we could bring something new to the show. Sébastien, like me, has a solid background in advertising. On the other hand, Justine had one in feature films. We are used to handling a huge workload and doing a lot of work in a small amount of time. We knew that working in that way was achievable because we had the team needed to do the job. What is specific to our way of working is that we prioritize everything. Every single frame is important and every shot is important. ‘Scope’ and ‘details’ were the words from David with all the teams. We just wanted to build a consistent and imaginative show. We wanted the first episode to be as good as the last episode and all the fights done with the same love and care.”
M’Changama worked with five top Paris studios: BUF, MPC, Mathematic, Light and Mac Guff. “Paris is a fairly small city, so we knew each other and each other’s work. I worked at MPC for 10 years before, so I knew that MPC had solid strength in environmental work and digital matte painting. So, for the deconstruction of the city, we just wanted them to work on it very closely. We exchanged lots of concept art. BUF just made Eiffel, so we knew that they had all the assets of the Iron Lady. We knew it would be fun for the team to destroy what they’d built. We also knew that Mathematic had a solid comp and craft vision that is very elaborate, and they are fast at adapting a lot of simulation techniques and lighting. We felt that they were the best guys to work on the zombies. The French zombies needed a lot of love and ingenuity. Greg Nicotero brought some incredible concepts of new species appearing in the Walking Dead universe. I think the VFX team succeeded in bringing a gorgeous update to zombies: Pulsating veins, burning blood, and more details not to be spoiled here. One of the supervisors with Light is also a flame artist – like me – and we knew they could craft technical shots, including 2D and CGI, very quickly. This process allowed us to quickly share the steps with David Zabel and the AMC team. Mac Guff is a studio known for their generalists, who can adapt from reconstructing towns to matte painting. They handled several shots successfully. So, by knowing our studios’ strengths, we managed the shot dispatch with accuracy, and this also helped us to build trust with David and AMC.”
“The French zombies needed a lot of love and ingenuity. [Executive Producer] Greg Nicotero brought some incredible concepts of new species appearing in the Walking Dead universe. I think the VFX team succeeded in bringing a gorgeous update to zombies: Pulsating veins, burning blood and more…”
—Jao M’Changama, Overall Visual Effects Supervisor
When it came to filming the streets of Paris, the Eiffel Tower was, of course, the main event, but for M’Changama and his team, every street was important. “The most challenging locations for us involved the locations with bluescreens because we faced last-minute changes on the script,” M’Changama remarks. “For the bluescreen scenes, there was the rooftop camp, the cargo ship and the Eiffel Tower. The screenplay wasn’t completely finished as we were going to shoot. The good ideas from the showrunner and the director came in very late in the process, and as they were amazing ideas, we needed to adapt our initial plans. For example, the location of the boat was initially only in the sea, and finally a part of it happens in a harbor. What could we do regarding the visual effects in the harbor? It changed the way we would film. It was challenging because we adapted our vision to the good ideas that sometimes came 10 minutes before shooting. The on-set VFX team provided a large panel of solutions, not only on bluescreen but all along the road trip we made in France. Then, when we were in the VFX post-production process, we were honestly excited and impatient to share our vision with David and all the team.”
One of the biggest challenges for M’Changama was matching their work with the rest of the Walking Dead universe. “We had a vision of how the zombies work, but we are not doing another show about zombies, we are bringing the Walking Dead to France. We can bring something new to it, but we need it to match what already exists and we need it to be accurate. The zombie blood and how we will kill the zombies has been one of the most creative aspects for us. Specifically, the exploding head in the arena towards the end of the series. Greg Nicotero brought in several SFX components to make the pumping zombie skin and other makeup FX magic. But one of the biggest challenges was shooting in a national treasure area where we were not allowed to put blood on the walls and the ground. The lights were changing, and it was becoming dark, and the blood was very specific. We needed to make those little details match. That one single shot was very fun to do, and we did a great job. It wasn’t too digital or too gory, but it was 100% CGI.”
“The zombie blood and how we will kill the zombies has been one of the most creative aspects for us. Specifically, the exploding head in the arena towards the end of the series. Greg Nicotero brought in several SFX components to make the pumping zombie skin and other makeup FX magic. But one of the biggest challenges was shooting in a national treasure area where we were not allowed to put blood on the walls and the ground. … It wasn’t too digital or too gory, but it was 100% CGI.”
—Jao M’Changama, Overall Visual Effects Supervisor
M’Changama and his team also faced issues with anachronism. For one of the flashback sequences, Notre Dame’s spire, which was destroyed in the recent fire, had to be recreated. For M’Changama, working on these fine details was an important part of the process. “I’m happy that Hollywood trusted France’s VFX ecosystem. I think we did a great job, and I’m proud of what we’ve achieved. We’re eager for the next challenge!”
By TREVOR HOGG
Images courtesy of Prime Video.
Essential for any successful digital augmentation is having a member of the visual effects team present during the live-action shooting to ensure that the required elements, whether plate photography or LiDAR scans of sets, are acquired and provided to vendors, thereby establishing a solid foundation for the work to be done in post-production. In the case of the second season of The Wheel of Time, the on-set visual effects supervision was equally divided between Roni Rodrigues and Mike Stillwell.
“When living away, you start immersing yourself in the project 24/7,” explains Roni Rodrigues, On-Set VFX Supervisor. “I did block one which was Episodes 201 and 202 and then straightaway did block two which was Episodes 203 and 204. Then Mike Stillwell did blocks three and four. When shooting block one, you will do some scenes from Episodes 201 and 202 together because we’re revisiting a lot of those locations. For block one, we had Thomas Napper as the director and for block two Sanaa Hamri as the director, so the team changed as well, including the DP and 1st AD.”
“We did previs and then postvis [of Heroes of the Horn] with the stunt team trying to work out how these heroes would appear and be involved in a battle. We wanted to shoot in a visceral, handheld, in-amongst-it way, but it didn’t lend itself to shooting one plate with them and shooting it again without. We thought going back to the smoke gave them that ethereal quality without looking like Casper the Friendly Ghost.”
—Mike Stillwell, On-Set VFX Supervisor
Overseeing the fantasy series is creator and showrunner Rafe Judkins. “Rafe’s position was important to give consistency on the visual identity of the show,” Rodrigues notes. “Even though the directors changed, we always kept everything in the same universe. That’s one of the reasons why it was so important for us to spend quite a lot of time in pre-production, because we managed to plan ahead for many of the details in every single scene. It was easy for us to transfer that information from the first block to the second block and achieve the desired results.” A lot of time was spent developing relationships with other departments, “from the collaboration with the DP discussing on-set light interaction to the production designer and how we want to build sets so there is a seamless line between what is practical and a CG extension,” Rodrigues adds.
“The relationships that Roni had started in the first four episodes made it so much easier for me because I was able to come in and build upon what was already there,” remarks Mike Stillwell, On-Set VFX Supervisor. “The stunt guys would be showing me their stuntvis before they’ve shown other people and asking, ‘What do you think? Is this going to work?’ Jan Petrina, the Stunt Supervisor, would do incredible stuntvis with fantastic After Effects work in it. It paved the way for what we wanted to do. It was a constant dialogue. He never promised something that we couldn’t deliver and vice versa. We had each other’s backs wherever possible.”
The smoke reveal of the Heroes of the Horn was not the original idea. Stillwell observes, “We did previs and then postvis with the stunt team trying to work out how these heroes would appear and be involved in a battle. We wanted to shoot in a visceral, handheld, in-amongst-it way, but it didn’t lend itself to shooting one plate with them and shooting it again without. We thought going back to the smoke gave them that ethereal quality without looking like Casper the Friendly Ghost.”
“It was nice to work with Andy Scrase [Visual Effects Supervisor] as he was on the same page about the details, and the more information that we actually give to the post-production team, the more they can do,” Rodrigues states. “The cyberscan booth was there [in the studio] 24/7 for us. Hats off to the visual effects production team, production manager and production coordinators, because not only did we want everyone scanned; every time our lead actors changed their clothes, we wanted that variation as well.” A variety of exterior locations were found in the Czech Republic. “The whole city of Cairhien was built, and it was so vast and rich in details that it was incredibly helpful,” Rodrigues notes. “Also, we went to Italy and Morocco. There is a scene where the guys are riding horses, and those epic mountains in the background are real. Obviously, we as visual effects did enhancements that make it better. However, having a good location and production designer is not just good for the showrunner but for the actors as well, as it’s easier for them to perform and to get into the character.”
“[In addition to the Czech Republic] … we went to Italy and Morocco. There is a scene where the guys are riding horses, and those epic mountains in the background are real. Obviously, we as visual effects did enhancements that make it better. However, having a good location and production designer are not just good for the showrunner but for the actors as well, as it’s easier for them to perform and to get into the character.”
—Roni Rodrigues, On-Set VFX Supervisor
Killing Turak (Daniel Francis) was not as easy for the production team as it was for Rand al’Thor (Josha Stradowski). “When Rand is approaching the tower and Turak [Daniel Francis] has his heron-marked blade and does some fancy moves, we had so many meetings where I was presenting different ideas on, ‘How do we kill Turak?’” Stillwell recalls. “Rand has unbelievable power and is pissed. How would he do this? We were talking about turning someone to stone and then shattering them. Or having Rand fill them with lava and they explode from within. Or Rand whipping through these blades of air and doing the classic thing where they look fine and then slowly slide apart because of being sliced in half. What Rand ends up doing is so nonchalant, but it shows the power that he has. We spoke to Josha about it. Josha even plays it like he didn’t expect it to be that easy.”
Lessons were learned, in particular, when it came to characters channeling the One Power, which involves manipulating intricate illuminated weaves of water, fire, earth, air and spirit. “They used interactive light on Season 1, and when we received the plates, the lighting was baked into the plate, and it was too much,” Rodrigues explains. “We ended up painting out interactive light from all of those scenes. We wanted the showrunner to have a full scope of flexibility to decide in post-production which direction he wants to go. What we did was to do a performance take without interactive light and then we did it again, but with interactive light. The idea was once they decided on the take to use, we would get the plate with the interactive light and paint light in. In this way, we had the flexibility to paint the light as many times as we wanted in any position we wanted.”
“It was a case of myself, Rafe, directors and writers throwing ideas out. No idea is a bad idea. ‘What if every time we see him it’s a different person, but they all have his face?’ And then we would discuss how possible is that? How good is it going to look? We wanted people to be as confused as Mat was. We looked at a lot of different concept art from different shows, and we watched a lot of different films, like those of Gaspar Noé. We were trying to get reference from not-obvious things.”
—Mike Stillwell, On-Set VFX Supervisor
Hallucinations allowed for surreal imagery. “For the psychedelic visions Mat Cauthon [Barney Harris] has after drinking the tea, we did a lot of work on how things would work with the mirrors, how it would look when his hands and veins are becoming distorted and enlarged,” Stillwell states. “There were lots of discussions about how to make that trip look terrifying, and the switching out of him and his mother. It was a case of myself, Rafe, directors and writers throwing ideas out. No idea is a bad idea. ‘What if every time we see him it’s a different person, but they all have his face?’ And then we would discuss how possible is that? How good is it going to look? We wanted people to be as confused as Mat was. We looked at a lot of different concept art from different shows, and we watched a lot of different films, like those of Gaspar Noé. We were trying to get reference from not-obvious things.” A character gets turned into stone and dissipates into the air. “The art department did a couple of busts for us to use as reference. I remember having so many show-and-tells about dust and discussions about how fine a grain we wanted for the ashes. We shot a load of reference of all of this stuff blowing away and being shattered and thrown around. But the main thing was to make sure to get clean plates because we had to have a tight body track with witness cameras to create a good CG model.”
“[For a character that gets turned into stone and breaks up] the art department did a couple of busts for us to use as reference. I remember having so many show-and-tells about dust and discussions about how fine a grain we wanted for the ashes. We shot a load of reference of all of this stuff blowing away and being shattered and thrown around. But the main thing was to make sure to get clean plates because we had to have a tight body track with witness cameras to create a good CG model.”
—Mike Stillwell, On-Set VFX Supervisor
One of the coolest visual effects is the shields created by channeling air. “The ideas for that were still broad when we were actually shooting,” Stillwell reveals. “It was a case of talking to the actors and giving them something to work with, because so often they’re having to do all of this channeling and having to imagine what it is and how it’s affecting them. When we were rehearsing the scene at the end where Marcus Rutherford [Perrin Aybara] is protecting them from Ishamael [Fares Fares], I asked if it was okay for me to go in and show something. I was slamming my body onto his shield and saying, ‘This is the weight of what’s hitting you. You’re not just deflecting bullets like Captain America. These are massive large forces, not little pinpricks.’ I was just trying to give him something to work with so he can imagine it, because if the actor’s performance works, the visual effects work so much better.”
By CHRIS McGOWAN
In just a few years, turbocharged by the pandemic, remote work has become widely established in the VFX industry and is now a preferred option for many visual artists. It has also helped boost the globalization of visual effects work, which in turn has increased the demand for remote workers.
Framestore was one of the VFX studios that succeeded in pivoting quickly in the new environment. Looking back, Framestore CEO Mel Sullivan recalls, “Thanks to our systems team we had an effective, efficient and secure remote working pipeline in place within two weeks of lockdown, and since then it’s mainly been a case of refining and enhancing our work-from-home capabilities.”
During that time, Sullivan says, “We’ve delivered some of the biggest shows on Earth from home – Oscar nominees, record-breaking box-office successes, you name it – so it’s now second nature and certainly isn’t hampering what we do. Like most of the companies operating at our level, we’ve now moved to a hybrid way of working. This has been a question of striking a balance between the improved work-life balance the industry is now enjoying versus the genuine benefits of being surrounded by your peers, being able to connect with each other in real-time and spark those spontaneous insights and conversations that only happen if you’re in the same physical space.”
“Our staff are at the core of Cinesite’s success; everyone is different, and we celebrate that. By allowing individuals to work according to their own working styles, we’ll get the best out of everyone rather than forcing them to conform to another person’s style, to their detriment.”
—Sashka Jankovska, Chief HR Officer, Cinesite
After COVID-19’s arrival, the necessity to maintain social distancing and prioritize employee safety led to a significant shift towards remote work in the VFX industry. During the pandemic, many VFX studios established remote work setups, enabling their artists to continue working on projects from home. Advancements in technology, high-speed internet connections and the availability of powerful remote collaboration tools made this transition possible. Artists could remotely access necessary software, collaborate with colleagues and render their work using cloud-based systems. Among such tools today, Hammerspace delivers a global data environment that spans data centers and AWS, Azure and Google cloud infrastructure. Frame.io (owned by Adobe) is a leading cloud-based storage platform for video assets and files. LucidLink offers a high-performance cloud file system for distributed workloads, while Seagate’s Lyve Data Services provides data management, recovery and secure migration to any cloud service. Moxion (owned by Autodesk) is a dailies and content-review platform and offers an instant dailies service in HDR, with full Dolby Vision, HDR10 and 4K resolution playback support. The Television Academy awarded Teradici an Engineering Emmy in 2020 as a leader in “remoting software for performance, security and deployment flexibility.” Acquired by HP in 2021, the firm’s Teradici CAS (Cloud Access Software) is now HP Anyware. MS Teams and Zoom are among the popular tools for communication and collaboration.
Key benefits of remote work in the VFX business include increased flexibility, access to global talent and cost savings. Some of the challenges include achieving sufficient collaboration and communication, plus hardware and software requirements, data security and protection of intellectual property.
While remote work has grown tremendously, some studios may still prefer a hybrid approach, combining remote work with in-person collaboration for certain stages of production or specific projects that benefit from on-site interactions. Cinesite offers three different options at its London, Montreal and Vancouver offices – home-based, hybrid-based and office-based. It holds all-team meetings to provide transparency and motivation as well as seasonal opportunities to socialize with colleagues and friends, and it has retained its physical offices for in-person collaboration and special projects.
Early on, Cinesite shifted to a flexible way of working, embracing workplace shifts made necessary by the pandemic. “After everyone settled into their new work-from-home environment, we quickly realized we could be even more creative and effective in delivering high-quality series and feature film work when adopting a flexible way of working,” notes Cinesite’s Chief HR Officer, Sashka Jankovska. “Our staff are at the core of Cinesite’s success; everyone is different, and we celebrate that. By allowing individuals to work according to their own working styles, we’ll get the best out of everyone, rather than forcing them to conform to another person’s style, to their detriment.”
“Looking ahead to 2024, we are exploring a hybrid approach that will enable our artists to work both in-studio and remotely. The primary driver behind this adjustment is to alleviate ‘work-from-home fatigue,’ encourage and foster team-building and provide better support and integration for new talent into Digital Domain. By combining the best of both worlds, this approach will create an ideal environment for collaboration and creativity while retaining the advantages of remote work.”
—Lala Gavgavian, President and COO, Digital Domain
President and COO of Digital Domain Lala Gavgavian comments, “Our rapid response to the pandemic in 2020 and the shift to a work-from-home paradigm have been among our significant operational achievements at Digital Domain. The evolving WFH setup has expanded our access to global talent without the need for physical relocations to a brick-and-mortar building and allowed us to scale our capacity in ways that were limited prior to the work-from-home success.” She adds, “Looking ahead to 2024, we are exploring a hybrid approach that will enable our artists to work both in-studio and remotely. The primary driver behind this adjustment is to alleviate ‘work-from-home fatigue,’ encourage and foster team-building and provide better support and integration for new talent into Digital Domain. By combining the best of both worlds, this approach will create an ideal environment for collaboration and creativity while retaining the advantages of remote work.”
Hitesh Shah, BOT VFX CEO and Founder, comments, “The discussion of ‘if remote work is in the mix’ is almost gone – most of the attention is on how organizations balance in-office versus remote. Organizations continue to wade through the conflicting needs posed by talent access, talent lifestyle preferences, collaboration needs, training needs and economics.” He adds, “Like most other industries, the VFX industry has gone from large ‘religious’ debates about all in-office or all-remote to everyone finding the balance that works for their situation.” Shah notes that “collaborative content-review software” helps make remote work viable.
In a broader sense, the world is figuring out which types of businesses and industries can work remotely – or not. According to David Lebensfeld, President and VFX Supervisor at Ingenuity Studios, “The VFX industry has gotten comfortable with the idea that remote work is here to stay. It’s interesting, though – workers are returning to the office in some locations more than others. So, for example, Europe has more workers returning to the office. Yet in New York City and Los Angeles, we’re finding that people want to continue working remotely.”
Lebensfeld observes, “Most artists do want the option. I think hybrid and remote work will endure in the VFX industry, if for no other reason than access to talent being key, and remote/hybrid options definitely expand the pool of qualified candidates.” In terms of key software and hardware that facilitate remote work, “There are common tools that everyone uses, including MS Teams and Zoom. We use Teradici to power remote working around the world,” Lebensfeld adds. “Remote work is supported well by the cloud. You can access workstations or storage from anywhere and move workstation demand closer to team member location, such as through AWS or Azure for improved latency speed, which is more or less attached to distance.”
“Europe has more workers returning to the office. Yet in New York City and Los Angeles, we’re finding that people want to continue working remotely. Most artists do want the option. I think hybrid and remote work will endure in the VFX industry, if for no other reason than access to talent being key, and remote/hybrid options definitely expand the pool of qualified candidates.”
—David Lebensfeld, President/VFX Supervisor, Ingenuity Studios
Remote work is having a positive effect on many visual effects artists. “I think it is impacting work/life balance, quality of life and flexibility to handle family needs and work after the kids are sleeping,” Lebensfeld says. “All of this depends on the team member and the project needs at any given time. Sometimes, having work completed at odd hours really benefits a project timeline while still promoting a healthy balance.” He comments, “The hope is that post-strike demand returns to a level that is sustainable for the industry, and we happen to think that the option of remote work allows us to best meet our hiring goals, especially when demand spikes. I don’t think remote work fully replaces how people work together, however. In-person working and team interactions bring added benefits of additional context and relationship-building. I find that mentoring is often easier with in-person dialogue and rapport.”
Gaurav Gupta, CEO of FutureWorks, notes that remote work can bring filmmakers together more quickly and offer “flexibility, speed, cost optimization and competition for the best talent at global scale.” He comments, “Content creation, by its nature, works best when creators are able to collaborate frequently and effectively, where multiple stakeholders can share their ideas quickly and review changes in a qualitative manner. Starting with Smartjog, Aspera and Cinesync in the 2000s, to modern cloud-based applications like Frame.io, Moxion, etc. now, technology has enabled creators to share high-quality audio, video and images.”
The cloud is already essential for much remote work. Gupta explains, “Whether private or public, cloud technologies are being used everywhere in content production; from set to screen, remote work and cloud go hand in hand. I think the industry is now aligned to make The 2030 Vision published by MovieLabs a reality sooner than its stated time frame. Its first principle ‘All assets are created or ingested straight into the cloud and do not need to be moved’ heralds a complete cloud future.”
Just as remote work has enabled globalization, the latter has fed the former. “Netflix has created the first true global studio without borders. Their focus on pushing both technology and creativity in content production at global scale has accelerated the development and adoption of remote working. The flywheel is now spinning faster and faster,” Gupta says.
Gupta adds, “For me the most exciting developments are in the area of a Global File System, technologies like Hammerspace enabling seamless movement of data to a location where it’s needed with AI intelligence. I think technologies like these were only available to a select few studios, and now anyone will be able to adopt this. This solves a very important problem for a globally distributed workforce.”
Still, with all the benefits brought by remote work, the old-fashioned approach is also important. “Being in the office is also vital for our new and early-career artists since they benefit so much from being able to work and learn alongside more experienced creatives,” Sullivan says. “Linked to this is our broader mission to ensure everyone working for Framestore, whether they’re in the U.K., U.S., Canada, India or Australia, can benefit from our unique creative culture and the wealth of training, development and mentoring opportunities we offer.”
The remote workflow has evolved steadily on several fronts since early 2020. Rob Hifle, CEO and Creative Director at Lux Aeterna, notes the deep globalization of the workforce occurring in VFX. “Connectivity is continually improving, leading to more global collaboration as well as hybrid working, leveraging talent and expertise from different regions of the world. This will lead to wider representation of backgrounds and experiences across the industry, with the unique and valuable insights that brings.”
By OLIVER WEBB
Last year, Avatar: The Way of Water scooped the Academy Award for Best Visual Effects, beating Top Gun: Maverick, Black Panther: Wakanda Forever, All Quiet on the Western Front and The Batman. When the nominees for Best Visual Effects at the 96th Academy Awards are announced on Tuesday, January 23, it will be another competitive year, with outstanding films in contention for the VFX Oscar.
Nominated for Best Visual Effects in 2014, the first Guardians of the Galaxy installment narrowly missed out to Interstellar. In 2017, Guardians of the Galaxy Vol. 2 was nominated for the award, this time losing out to Blade Runner 2049. A third-time’s-the-charm win would be fitting for the third and final installment, though given the extraordinary craftsmanship of the film’s Visual Effects Production Supervisor, Stephane Ceretti, and all of the VFX vendors involved, it would be anything but luck; Vol. 3 remains a serious frontrunner. “We had 3,066 visual effects shots in the film, a huge number,” says Ceretti, who also worked on the first film. “On top of that, we had to do the Christmas special at the same time, which was an additional 560 shots.”
Ceretti credits Guardians series director James Gunn for constantly evolving and challenging his VFX team to break new ground. “James Gunn always wants things to look as real as possible,” Ceretti explains. “He’s got his filming style that’s very specific to him, and that has evolved from the first Guardians film. He was really challenging us in terms of how he is now filming things with smaller cameras that are really portable and moving all the time. We knew that we had potentially a lot of full CG sequences, especially with the flashbacks. We had discussions about the different worlds that we would either revisit from previous films, like Knowhere, or the new places that we would introduce like Counter Earth or The Orgoscope. We worked alongside Production Designer Beth Mickle, who is fantastic and had previously done The Suicide Squad with James.”
Rocket proved to be the most challenging CG character to create for Ceretti and his team. “It was great to come back and finish the story of Rocket,” Ceretti says. “We knew we had to build Rocket across his different ages throughout the film. There was also a sense of an animalistic feeling that we had to get from Rocket that, in my opinion, we had lost a little bit across the different films.” Ceretti adds, “The emphasis on the high level of detail in terms of both modeling, grooming and keyframe animation for all of these animals was really key for the success of the movie and being able to tell that story.”
Indiana Jones and the Dial of Destiny marked the titular character’s return to the big screen. With Andrew Whitehurst serving as Visual Effects Supervisor, the film consisted of 2,350 visual effects shots. Whitehurst describes, “Because the movie is a journey, most scenes take place in different times and locations. It’s a story that has many components with little overlap between them. We knew that the scope of the work would mean that we would need several vendors on the show. By the time we were starting to think about that, Kathy Siegel, the VFX Producer, was onboard, so we were talking about the various folks that we might approach to work on it and who’d we worked with before, and who we knew specialized in certain types of work. It was an exercise in logistics really.”
Along with the prologue, the Siege of Syracuse sequence was one of the main focus points in the pre-production of Dial of Destiny. “We started blocking it out with Previs Supervisor Clint Reagan,” Whitehurst notes. “[Editor] Mike McCusker, who was cutting that sequence, was on the project by then. We were exploring the sequence through previs and storyboards because a screenplay often isn’t the best place to explore an action sequence like that. The other thing we needed to figure out was how to film it and how the narrative would drive the layout of the environment. In my office in L.A., I had a map on my whiteboard that I was forever updating with what ancient Syracuse looked like based on the cut. I had all the various story beats as thumbnail sketches drawn on Post-it notes that I was laying out along the planes’ trajectories as I drew them out on the board. From that we could work out how the coastline and the city had to be laid out to accommodate that action. Previs was also vital in working out how we might stage action in the plane set pieces that we had at Pinewood. We were able to figure out ahead of shooting how we could get our actors, crew and cameras into such a confined space. It saved a lot of time on the shoot.”
“Everyone working on the film was very conscious of the previous films and the way that they were made,” Whitehurst explains. “The thing you will hear more from Jim [director James Mangold] when you are on set or talking about previs shots, is ‘nouns and verbs.’ Every shot needs a noun and every shot needs a verb. That’s a very classical filmmaking way of working, and I think that attention to detail really shines through in the finished film.”
Following Rogue One: A Star Wars Story, Gareth Edwards was at the directing helm again with The Creator. FIN VFX contributed over 100 shots for the film. “By the time we joined the project, The Creator had been in Gareth’s head for years,” says FIN VFX Supervisor Stuart White. “When we received our first shots to begin work on, ILM had already finalized a large number of their shots, having had about a two-year head start. So, the expectations were high! I’ve long been a fan of concept artist [Production Designer] James Clyne. He and his team at ILM had been working on the most amazing body of concept art for this movie that I think I’ve ever seen, all handed to us as a PDF that numbered hundreds of pages.” Discussing the most challenging visual effects shot to create, White notes a shot where two (CG) police transport ships land in an alleyway, and about nine (partially CG) robots jump out of it and run into a building. “In the postvis, they had replaced the whole top half of the frame with a still that basically implied ‘insert Blade Runner awesome future city alleyway and skyscrapers here.’”
Joe DiValerio of Outpost VFX served as VFX Supervisor and interacted with Charley Henley, Production Visual Effects Supervisor, for Ridley Scott’s Napoleon. “Our work was just under 30 shots, which we ultimately hope are invisible. The film has a unique look and style we needed to adapt to,” DiValerio says. “We had a sequence, fleshed out shot design, and a solid plan of shot plates and elements that needed to be assembled. There was some amazing roto and paint work to preserve the essence of the practical element photography. We had some very clever uses of mixing up the photography in ways it wasn’t originally intended – mixing up takes, augmenting stunt work, mixing angles and standard hiding logistics of the practical effects. The biggest challenge we anticipated was the CG crowd work. The production team had motion capture performances with specific actions for our sequence. We were able to build a library of just the right parts and then art-direct them into the film. Our sequence had a little bit of everything, matte painting, CG FX, crowds, element work and stunt enhancement.”
Rodeo FX completed 237 VFX shots for The Little Mermaid with Graeme Marshall and Ashley Bellm both serving as Rodeo FX Visual Effects Producers on the film. “We were only one of multiple vendors,” say Marshall and Bellm jointly. “As on most projects, we interacted mainly with the client-side VFX team, helmed by VFX Supervisor Tim Burke, and VFX Producer Leslie Lerman. Their team was an absolute delight to work with. Although we did receive notes from Rob [director Rob Marshall], our main point of contact was the studio VFX team. Rodeo was initially brought on board to take on some overflow character work from another studio. All of the merfolk shots were pretty challenging; there was a lot of plate reconstruction and integration to be done. Our most difficult shot was in the closing scene of the film when Ariel’s family and community join her on the beach to send her off on her adventures with Prince Eric. The shots with Scuttle, Sebastian and Flounder were also challenging as the character assets – built by our friends at Framestore – were shared across multiple vendors, and we had to ensure continuity not only in their appearance, but in their overall demeanor and performance.”
One of the most highly anticipated films of the year was Christopher Nolan’s Oppenheimer. The combined effects departments created over 200 shots and only half of those needed post work in VFX. Production VFX Supervisor Andrew Jackson was the first person to read the Oppenheimer script after Producer Emma Thomas. “Chris told me before I read the script that he wanted to avoid computer-generated effects,” Jackson says. “He thought if we could shoot it practically, it would fit better with the language and feel of the film. His approach to effects is very similar to mine in that we don’t see a clear divide between VFX and SFX, and believe that if something can be based on filmed elements it will always bring more richness and depth to the work. After my initial discussions with Chris, I spent the first three months of the project in SFX Supervisor Scott Fisher’s workshop, developing and testing dozens of simulations and effects. For the duration of the shoot, we ran a small VFX IMAX film unit, which compiled an extensive library of filmed elements. The final shots ranged from using the raw elements as shot, through to complex composites of multiple filmed elements led by VFX Supervisor Giacomo Mineo and the team at DNEG, the film’s sole VFX partner.
“Each shot presented its own set of challenges as the script describes thoughts and ideas rather than specific visual images,” Jackson continues. “This was both exciting and challenging as we searched for solutions we could build and shoot that were both driven by the story and visually engaging. Combined with the set of creative rules like only using real elements shot on film, this meant that we had to dig deeper to find solutions that were often more interesting than if there had been no limits. The one sequence that was the most challenging was the Trinity test itself. Unlike the other effects scenes, this one needed to replicate the original test but still be made only using filmed elements. It was a huge design and compositing exercise, involving retiming and combining multiple high-speed explosion and ground detail elements. The result is a truly unique cinematic experience.”
MPC and Wētā FX were the lead studios on Transformers: Rise of the Beasts, and the number of visual effects shots ended up in the 1,800-1,830 range. “[Director] Steven Caple Jr. and I met in pre-production when there was only a script. He’d been gathering mood material and came in with a good idea of the world he wanted. It was a more grounded, gritty and distressed look he was after,” says MPC VFX Supervisor Gary Brozenich. “We discussed a heavier patina for the robots in general and wanted to link the look of their aging to the places they inhabited. The Autobots are city dwellers and the Maximals are jungle-dwelling – we should feel the environment in their shell. Both should have a different quality. Unicron and its interior carried the weight of introducing the franchise’s ultimate villain, one that is ingrained in the lore and the minds of a huge fan base. We wanted to make sure we were true to the original structural design in its planetary form.”
Matt Aitken was Wētā FX Visual Effects Supervisor on the film. “Our work on the show was all complex with character animation and FX simulations in almost every shot. The transformation shots were particularly challenging; we set up a dedicated transformation team comprising specialists from animation, rigging and models to handle the specifics of those, and some of the transformation shots were in progress throughout our time working on the movie. The most difficult transformation sequence was probably Mirage transforming into an exosuit around Noah as he slowly stands up. We kept Noah’s face but most of the time his body, clothing and hair has been replaced to allow the articulating suit pieces to neatly form around him. That was the last shot we delivered on the show!” Wētā FX also provided the bulk of effects for Cocaine Bear. “Our work on Cocaine Bear focused on the furry, drug-fueled lead, Cokie, a CG black bear that goes on a rampage after stumbling on abandoned cocaine in the wilderness,” notes VFX Supervisor Robin Hollander.
This year, audiences also saw Brie Larson reprise her role as Captain Marvel in The Marvels, which undoubtedly will be a contender at the Academy Awards, while DC’s Blue Beetle could be an outside pick. This year also marked the return of John Wick with John Wick: Chapter 4, an excellent addition to the franchise. Barbie will certainly be in contention for Best Production Design. The film’s visual effects are also worth special mention. Whichever film wins the award for Best Visual Effects, it has been an incredible year for film VFX.