
Lyndon J. Barrois, Sr. is an acclaimed artist, animation director and filmmaker whose film credits include The Matrix Trilogy, Happy Feet, Sucker Punch and The Thing, for which he directed pivotal character animation sequences. He now earns accolades for his unique gum wrapper sculptures and stop-motion animations of historic figures and events; his portrait and Sportrait films are produced entirely on iPhones. An advocate for underrepresented voices in the entertainment industry, he serves on the Academy Museum’s Inclusion Advisory Committee, fostering programs and supporting exhibitions, as well as on the AMPAS VFX Executive Branch.
My gum wrapper sculptures were directly influenced by my upbringing in New Orleans, where making Mardi Gras floats out of repurposed materials is part of the culture. My mom was a fanatical gum chewer and her discarded Wrigley Chewing Gum wrappers – paper on one side, foil on the other – spawned my artistic path of shaping them into humanistic sculptures. At age 10, these figures had just one job: to drive my Hot Wheels cars! As I progressed to sculpting athletes and icons, it was my desire to animate them and bring to life great moments in sports and history that led me to CalArts and a long career in animation.
When The Academy Museum was preparing to open in Los Angeles, we knew we had an opportunity and a responsibility to ensure that diverse artists and filmmakers were represented on every floor, with no one left out. It was and is imperative that this space is true to filmmakers of color and women, and not “whitewashed.” Since the advent of film, people of color have always done the work, but too often the narratives write us out. The Museum is one forum to right those wrongs, and The Academy’s Aperture 2025 diversity standards for Best Picture nominees are another move to ensure that artists from all backgrounds are not overlooked and are rightfully recognized.
Art is a huge umbrella. So many people who want to be artists, or are qualified to be teachers, don’t get the opportunity. When it comes to nurturing and hiring talent, you have to cast a wide net beyond the people you know and the people who look like you, or it limits the possibilities for a big part of the population who deserve their shot on merit – and who have so much to offer by sharing their unique vision and life experience. The work must continue and intensify to let people from all communities know that opportunities exist, and to help them pursue our viable, exciting profession and succeed.
Since the advent of film, people of color have always done the work, but too often the narratives write us out.
AI and evolving technology are exciting and intimidating all at once – but we’ve been here before. Advances in tech are just the changing nature of the art form, and the key is always adaptation. Way back we thought photography would kill painting, motion pictures would kill live theater, CG would kill 2D animation, mo-cap would kill character animation – and none of that happened. Advances push us to work harder, learn new things and better our craft. The best strategy is to embrace change – because it will keep coming – and co-exist in this richer, dynamic universe.
Join us for our series of interactive webinars with visual effects professionals. Ask your questions, learn about the industry and glean inspiration for your career path.
Register today at VESGlobal.org/AMA
By TREVOR HOGG
Images courtesy of Netflix.
Extra detail was added to Monkey D. Luffy to avoid him looking like plastic rather than rubber.
While Monkey D. Luffy aims to become the king of the pirates, the manga infused with his contagious optimism and enthusiasm reigns supreme as the all-time best seller, with 516.6 million copies sold since Eiichiro Oda created One Piece in 1997. The story revolves around a treasure-hunt frenzy sparked by the execution of Pirate King Gol D. Roger, with participants from around the world engaging in questionable antics to ensure victory over the competition. The franchise has achieved longevity in other mediums, with the ongoing anime television series consisting of over 1,000 episodes, and now Netflix has released a live-action version created by Steven Maeda and Matt Owens that stars Iñaki Godoy, Mackenyu, Emily Rudd, Jacob Romero Gibson, Taz Skylar, Vincent Regan and Morgan Davies. Placed in charge of making the fantasy elements a cinematic reality for the eight episodes of the first season were Visual Effects Supervisor Victor Scalise and Visual Effects Producer Scott Ramsey, who previously collaborated on Cowboy Bebop.
Colton Osorio as Young Luffy and Peter Gadiot as Shanks have a conversation with the Lord of the Coast.
While a guiding principle is that visual effects have to be grounded in order to be believable, there is a cartoony aesthetic to the source material and its animated spinoffs that fans expect to be honored. “It’s truly a fine balance because if you make it too cartoony the fans might love it, but then the general audience might think it’s too much,” Scalise notes. “When we first came on everybody said, ‘We didn’t want him [Monkey D. Luffy] to be like Plastic Man.’ Originally, the punches were going to be stiff, fast and straight, and when we started applying them to shots, they felt lifeless. We started putting little bends and wiggles and an extra weight on the fist to get wrinkles, and it brought the effect to life. It’s tough because when a rubber arm has motion blur there is no detail or texture when it’s punching fast. We went in and put a lot of extra hair on his arm. That was probably one of our biggest challenges, of how do you take an animated character and make them feel real but still honor the manga. It’s funny because the further you push it, the cooler stuff comes out of it too.”
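The "little bends and wiggles" Scalise describes are the classic animation principle of follow-through. As a toy sketch (a generic illustration, not the show's actual rig or curves), layering a damped oscillation on top of a stiff linear extension curve is what gives a punch apparent weight instead of a lifeless snap:

```python
import math

def add_follow_through(t, hit_time=0.5, amp=0.15, freq=9.0, decay=6.0):
    """Toy follow-through: a linear extension curve reads as stiff and
    lifeless, so a damped sine "wiggle" is layered on after the arm
    reaches full reach. All parameters here are illustrative values."""
    base = min(t / hit_time, 1.0)  # linear extension, clamped at full reach
    if t <= hit_time:
        return base
    dt = t - hit_time
    # damped oscillation: overshoot and settle back into the held pose
    return base + amp * math.exp(-decay * dt) * math.sin(2 * math.pi * freq * dt)
```

The curve rises linearly to full extension, wiggles briefly past the hit, then settles, which is the broad shape of the secondary motion being described.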
The actual skies were retained for the ocean scenes with ships.
Paired with ARRI ALEXA LF cameras were Hawk MHX Hybrid Anamorphic lenses, which produce a signature fisheye effect and extreme facial closeups for the show. “That was a challenge and a half!” laughs Ramsey. “When we got down to Cape Town we found out about the lenses. The biggest challenge was, ‘Do we have enough set to fit into this?’ Because they did morph out the sides quite a bit and our sets were only built out to certain areas, so there were quite a lot of set extensions. It increased the budget quite a bit because you’re always on a lens that is roughly a 25mm if not wider. Also, the warping on the sides was challenging.” The wide-angle lensing created an extra layer of visual effects work. “When you’re trying to roto and track, you don’t have straight lines,” Scalise states. “There’s a natural blur inherent in the lenses, so now you’re rotoscoping a blurred edge and tracking sharp objects that go to the edge of the frame and get soft.” Atmospherics were aided by the choice of lenses. “If there is a light anywhere near the edge of frame there is a flare in that shot,” Scalise observes. “Sometimes you don’t see it until you look at the color channels, and hidden in the blue channel is a lens flare. We were able to add a lot of glare halation to get layers, which gives a sense of depth and a cheated atmospheric perspective.”
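A flare hiding in one color channel can be spotted programmatically. A minimal sketch (a hypothetical helper, assuming a frame delivered as a float NumPy RGB array) that compares per-channel peaks in the outer border of the frame, where anamorphic flares tend to land:

```python
import numpy as np

def edge_channel_peaks(img, border=0.1):
    """Hypothetical helper: report the peak value of each color channel
    within the outer border of the frame. A strong blue-channel peak with
    no matching red/green energy hints at a flare visible only in blue.
    img is an HxWx3 float array in R, G, B order."""
    h, w, _ = img.shape
    bh, bw = int(h * border), int(w * border)
    mask = np.zeros((h, w), dtype=bool)
    mask[:bh, :] = mask[-bh:, :] = True   # top and bottom bands
    mask[:, :bw] = mask[:, -bw:] = True   # left and right bands
    return {c: float(img[..., i][mask].max()) for i, c in enumerate("RGB")}
```

Comparing the returned peaks per channel is the same check described in the quote, just automated: a blue value far above red and green near the frame edge flags a hidden flare worth inspecting.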
A digital double had to be made of Buggy the Clown (Jeff Ward), as he has the ability to disassemble, have his parts fly around and reassemble himself.
“When we first came on everybody said, ‘We didn’t want [Monkey D. Luffy] to be like Plastic Man.’ Originally, the punches were going to be stiff, fast and straight, and when we started applying them to shots, they felt lifeless. We started putting little bends and wiggles and an extra weight on the fist to get wrinkles, and it brought the effect to life. … That was probably one of our biggest challenges, of how do you take an animated character and make them feel real but still honor the manga. It’s funny because the further you push it, the cooler stuff comes out of it too.”
—Victor Scalise, Visual Effects Supervisor
The extensive water simulations were created by Goodbye Kansas Studios and Rising Sun Pictures.
Glare halation was added to give a sense of depth to the shots.
All eight episodes were cut before the work was sent off to the vendors. “That allowed us to get the whole series filled,” Ramsey explains. “Episode 101 through to 108, we know exactly how the story is going to progress and the ups and downs, so by the time we get to Episode 108 it is a great finale, and that had to go through the showrunners, Netflix and Tomorrow Studios.” The visual effects work for 2,334 shots was broken down in accordance with the strengths of Goodbye Kansas Studios, Framestore, Ingenuity Studios, Rising Sun Pictures, Barnstorm VFX, Eyeline Studios, Scanline VFX, Mr. Wolf, Refuge, CoSA VFX, Incessant Rain and NetFX. “The good thing is that we have worked with a lot of our vendors for a while now,” Scalise remarks. “Even when it comes to a shared shot, they have already talked to each other before, so the handoffs are good. It creates a lot of logistical complications, as well as sometimes the assets aren’t truly one-to-one, so there is extra work on both sides.”
Ships were shot in the parking lot at Cape Town Film Studios, which meant that the water had to be created digitally.
“The problem is when you have the real ships parked in a parking lot. Rising Sun Pictures did an amazing job with the ships. When we put their model next to the real one, I bet that most people will pick the real one as being CG. The Going Merry was a full ship build placed on a tractor trailer so it could drive around slowly. Then we had half of Miss Love Duck constructed, which was repurposed into Garp’s warship. There was also a salvage ship and Maui’s sloop. We had CG versions of all of them.”
—Victor Scalise, Visual Effects Supervisor
The wide-angle lenses greatly increased the amount of digital set extensions in shots.
Much of the action unfolds on the ocean with the pirate ships battling each other while trying to evade the law-enforcing Marines. “The problem is when you have the real ships parked in a parking lot,” Scalise notes. “Rising Sun Pictures did an amazing job with the ships. When we put their model next to the real one, I bet that most people will pick the real one as being CG. The Going Merry was a full ship build placed on a tractor trailer so it could drive around slowly. Then we had half of Miss Love Duck constructed, which was repurposed into Garp’s warship. There was also a salvage ship and Maui’s sloop. We had CG versions of all of them.” The water simulations were expanded upon in certain cases. “We actually shot the Lord of the Coast in the tank without planning on replacing the water except for the shots where the creature was going to interact with it,” Scalise reveals. “Once we got to the sequence, we went, ‘Screw it. Let’s completely replace the water in all of the shots so we can have much more control over continuity.’ When you look at the creature going into the water and how the water is interacting, it looks real.”
The Going Merry was a complete practical ship build that was placed on a tractor trailer so it could move.
Much of the action unfolds on the ocean with the pirate ships battling each other while trying to evade the law-enforcing Marines.
Real skies were captured rather than relying on huge bluescreens and compositing skies in during post-production. “It’s funny – some of the shots of the real skies feel as if they were composited!” Scalise laughs. “To save cost, we kept a lot of the natural skies for when we were on the boats. With these lenses, the amount of bluescreen needed to cover the boats made it almost impossible to get the sky. We decided early on not to use a lot of bluescreen and to place it on the horizon.” A classic size-and-scale problem is avoiding ships looking like miniatures out in the open water. “Size and scale turned out well,” Ramsey reflects. “The only point where it came into play was when we were doing Garp throwing the cannonball back at Luffy. They had parked the practical Going Merry, and it was shot over Garp’s shoulder, but when we got into post the shot didn’t feel like the distance that was wanted. We roto’d Garp out of that scene and replaced it with a 3D Going Merry.”
Camera tricks assisted in creating the impression that Monkey D. Luffy has the ability to stretch his limbs.
Iñaki Godoy as Monkey D. Luffy battles a group of Marines.
“Size and scale turned out well. The only point where it came into play was when we were doing Garp throwing the cannonball back at Luffy. They had parked the practical Going Merry, and it was shot over Garp’s shoulder, but when we got into post the shot didn’t feel like the distance that was wanted. We roto’d Garp out of that scene and replaced it with a 3D Going Merry.”
—Scott Ramsey, Visual Effects Producer
Monkey D. Luffy (Iñaki Godoy) discovers the Going Merry while visiting a shipbuilding yard.
Italy influenced the world-building, as Loguetown was based on Sorrento and Shells Town on Positano. “We had plans going in to modify real-world locations so we weren’t going into full CG builds of environments for some of the bigger ones,” Scalise remarks. “We were able to shoot other parts of the Amalfi Coast and turn them into different islands.” A signature massive aquatic creature is the Lord of the Coast, created by Goodbye Kansas Studios. “We did it the traditional way in that once we had the look of the concept, we went into modeling and did all of the turntables,” Scalise states. “Overall, we liked the early concept art, which is in the show. There’s an amount of detail in the muscle work and all of the different layers of animation where, if you look at it closely, there are a lot of moving things, down to the slime on the skin. The biggest thing that we went back and forth on was that, originally, a bit of red was put into the fins, and pulling that out was our biggest note.”
Admiral Garp utilizes a telepathic species of snail known as a Den Den Mushi to be able to vocally and visually communicate across the world.
A digital double had to be produced for Buggy the Clown, as he can literally cause every part of his body to disassemble, fly around and reassemble. “I love that scene between Buggy and Luffy because almost every shot is a visual effect and their onscreen relationship was great,” Scalise remarks. “It’s a funny, dark, scary scene. One of my favorite creatures that we built is the News Coo by Framestore. It was originally built for only three shots; however, when everybody saw it, anytime we could possibly figure out a place to sneak it in, we added it to another half dozen shots.” Ramsey favors the naval battle in Episode 105 between Garp and Luffy. “It was entirely shot at Cape Town Film Studios without water. Every shot is probably a visual effects shot. We brought in Rising Sun Pictures because they can do great ship models and CG water. It’s fast-paced, exciting, and kicks off Episode 105 really well.”
All eight episodes of the first season were cut before sending off the work to the vendors, with 2,334 visual effects shots created in total.
“[The naval battle in Episode 105] was entirely shot at Cape Town Film Studios without water. Every shot is probably a visual effects shot. We brought in Rising Sun Pictures because they can do great ship models and CG water. It’s fast-paced, exciting, and kicks off Episode 105 really well.”
—Scott Ramsey, Visual Effects Producer
Italy influenced the world-building, as Loguetown was based on Sorrento and Shells Town on Positano.
Scalise concludes, “Everybody’s hope is that this [live adaptation of One Piece] opens up the world to people because of the positive message of ‘do what you feel that you’re supposed to do versus what people are telling you you should do.’ That relates to a lot of people and gives them an optimism, which this world probably needs right now.”
Watch a featurette on the making of One Piece Season 1 and some of the VFX work involved in the live-action adaptation of the manga gem. Click here: https://www.youtube.com/watch?v=0EIfmn5Gk9A
By TREVOR HOGG
Images courtesy of Crafty Apes VFX and Prime Video.
The set for Bing-Bang Burger was actually a miniature, so careful attention had to be paid to make sure that the shadows were cast correctly.
When it comes to tall orders, I’m a Virgo Visual Effects Supervisor Todd Perry asked Crafty Apes VFX to produce a 13-foot-high teenage protagonist named Cootie, who gets to experience the world beyond his sheltered existence for the first time in the satirical Prime Video series created by director/writer Boots Riley. “Boots didn’t want to use CG to make Cootie look bigger, so several different approaches were utilized for every shot,” states Aleksandra Sienkiewicz, Visual Effects Supervisor at Crafty Apes. “We had shots with puppets that we had to replace the head or augment the body. Or Cootie was shot in a miniature set with him being closer to the camera so he was appearing bigger. Or he was on a platform.” The methodology had to appear indistinguishable to the viewer. “For me, it was interesting to see because I grew up watching a movie like Elf, where they did a lot of that kind of trickery,” remarks Ruth Stewart-Patterson, Production Manager at Crafty Apes. “We were doing our comp work and trying to make things look seamless.”
For the driving shots through the city, Crafty Apes VFX had to add motion to the hands of the puppet and do a face replacement.
“Boots [series creator/director/writer Boots Riley] didn’t want to use CG to make Cootie look bigger, so several different approaches were utilized for every shot. We had shots with puppets that we had to replace the head or augment the body. Or Cootie was shot in a miniature set with him being closer to the camera so he was appearing bigger. Or he was on a platform.”
—Aleksandra Sienkiewicz, Visual Effects Supervisor, Crafty Apes VFX
I’m a Virgo is a superhero satire created by Boots Riley about a 13-foot tall teenager named Cootie.
Over a period of six months, Crafty Apes created 210 shots. “The thing about working on TV is when you break it down per episode or sequence, it’s easier,” Stewart-Patterson states. “You might have five sequences per episode, and sometimes you get episodes in different orders, but at the end of the day it’s like a movie where you go one sequence at a time and assign another team to a different sequence so that they all work together – the organization and the types of shots as well. For example, on some shots we’re integrating Cootie and on another one working with CG. You have to gather the work together and figure out what pairs with what, and once you figure that out, then you try to keep these teams together as you go per episode or sequence.” Consistency had to be maintained throughout the seven episodes. “We try to keep similar shots with the same artist to make sure everything is cohesive, and we’ve shared techniques with all of the artists,” Sienkiewicz remarks. “We would talk to Todd and Boots to make sure that we’re on the same page.”
Elements meant to be significantly larger, like Baby Cootie, were shot closer to the camera.
“For the scene when Cootie is on the car driving through the city, it was shot with a massive puppet on the car. We had to remove the rig, replace the head and add motion to the fingers so he looked realistic. There were a lot of 2D techniques that we used. The face was shot on a greenscreen, so we had several different takes with various lighting conditions to choose from. We used cues [in the plates] to track the face into the puppet and added a little bit of motion into the face.”
—Aleksandra Sienkiewicz, Visual Effects Supervisor, Crafty Apes VFX
The specular quality of the puppet for Cootie was different from natural skin so lighting adjustments had to be made by Crafty Apes VFX.
Every shot was storyboarded to show how the actors would be positioned in front of the camera. “The elements we received from Todd Perry were awesome,” Sienkiewicz states. “The lighting conditions were always matching in the plates, and there were sometimes several plates that had to be merged together.” Since Cootie was closer to the camera and the characters were all on different planes, close attention had to be paid to shadows. Sienkiewicz explains, “In Episode 101, Cootie was super big in Bing-Bang Burger while the set was miniature. We had to make sure that his shadows are interacting with the walls and the other characters, as well as ensuring that the lighting conditions are matching. It was a different way of thinking compared to other shows I’ve worked on before.” Stewart-Patterson joined the project later on. “Our Shotgun mirrored their Shotgun, so it was easy to find what needed to be done, relay and see what’s left to do. It’s a lot easier when you have a road map rather than chaos,” Stewart-Patterson offers. Receiving editorial turnovers can be stressful because sometimes they come in earlier or later than expected, so flexibility is paramount. “We might get a turnover of 50 shots within an episode, so we pick out some key shots,” Stewart-Patterson details. “From there you can focus on those 10 shots and spread them out as you go further along in time. The trust between us and the client was strong, so picking the correct takes and making sure that we’ll get things done on time and scheduled properly was key to completing this project.”
Cootie has an intimate moment with Flora, who has the ability to rapidly flash a multitude of colors in a manner that resembles a hummingbird.
2D rather than 3D effects were the focus. “It was cool to see actual characters and models already in camera [rather than having to construct everything in CG],” Stewart-Patterson notes. “Aleks and Todd had a great relationship, so Aleks already knew where Todd was going to go and probably what his notes were going to be.” Boots Riley has a specific vision when it comes to how he wants to shoot and see the shots. “It was like a fresh breath of air,” Sienkiewicz remarks. “I’ve been in this industry for the past 13 or 14 years, and this was the first time I worked with miniatures – rather than building CG characters, we needed to blend or augment their movements, or do face replacements. For the scene when Cootie is on the car driving through the city, it was shot with a massive puppet on the car. We had to remove the rig, replace the head and add motion to the fingers so he looked realistic. There were a lot of 2D techniques that we used. The face was shot on a greenscreen, so we had several different takes with various lighting conditions to choose from. We used cues [in the plates] to track the face into the puppet and added a little bit of motion into the face.” The puppet had more of a specular quality than normal skin. “There was some augmentation in terms of the brightness,” Sienkiewicz adds.
Series creator Boots Riley did not want to use CG to make Cootie look bigger, so several different approaches were utilized for every shot.
“A shot that stands out to me and was one of the most complex ones was Flora’s flashback in Episode 103. There was an oner shot that was 5,000 frames, and we had 13 plates that had to be stitched together because the motion was not seamless. In addition to that, it was at super speed, so everything around her is still, and we had to stabilize all of the actors to make sure they don’t move or blink. If you wanted to change one thing, all of the 5,000 frames had to be rendered.”
—Aleksandra Sienkiewicz, Visual Effects Supervisor, Crafty Apes VFX
The miniature house was shot against greenscreen and then composited into the plate photography.
Some sleepless nights were spent thinking about the project. “A shot that stands out to me and was one of the most complex ones was Flora’s flashback in Episode 103,” Sienkiewicz reveals. “There was an oner shot that was 5,000 frames, and we had 13 plates that had to be stitched together because the motion was not seamless. In addition to that, it was at super speed, so everything around her is still, and we had to stabilize all of the actors to make sure they don’t move or blink.” Alterations could not be taken lightly. Continues Sienkiewicz, “If you wanted to change one thing, all of the 5,000 frames had to be rendered. We divided shots between different artists. You need to find the perfect spot for making a transition where you can hide things, typically motion blur or foreground elements that make it look natural.” Flora has the ability to flash a multitude of different colors. Sienkiewicz explains, “Flora’s effect was cool. All of the Flora elements were shot at different speeds and colors. We also had plates with flashing lights from which we needed to select frames that Boots liked. Because Boots liked the postvis that editorial did, we ended up asking editorial to export AAF files that we could import into Nuke. Rather than go through minutes or hours of footage, we knew exactly which element he liked; that was our baseline for timing, and then we could enhance and move forward with other effects. Boots never wanted things to look perfect; he likes an old, ragged look but with a modern twist. There was a lot of creative brainstorming about how we wanted this character to look. Boots compared it to the Tasmanian Devil.” Another creature comes to mind for Stewart-Patterson. “When you play it fast, it looks cool because it appears like a hummingbird changing colors.”
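Stabilizing actors across a 5,000-frame oner comes down to measuring and cancelling inter-frame drift. A minimal translation-only sketch using FFT phase correlation (a standard image-registration technique, offered as an illustration rather than a description of Crafty Apes' actual pipeline):

```python
import numpy as np

def estimate_shift(ref, cur):
    """Estimate the (dy, dx) translation to apply to cur (e.g. via np.roll)
    to bring it back into register with ref, using phase correlation.
    ref and cur are 2D grayscale float arrays of the same shape."""
    # cross-power spectrum; normalizing magnitude leaves only phase,
    # whose inverse FFT is a sharp peak at the relative displacement
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(cur))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:
        dy -= h  # unwrap large indices into negative shifts
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

In a stabilization pass, each frame's estimated shift against a reference frame would be inverted and applied before compositing, so a "frozen" subject genuinely holds still across the stitched plates; production tools add rotation, scale and local warps on top of this idea.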
Every shot was storyboarded to show how the actors would be positioned in front of the camera.
“The shots in Bing-Bang Burger were a real Cootie shot close to the camera, but the set was a miniature. He was touching his head, so there was lots of trickery to make it look realistic. The biggest challenge was to make sure that the puppet and the real elements were cohesive and working together. There was lots of comp trickery involved!”
—Aleksandra Sienkiewicz, Visual Effects Supervisor, Crafty Apes VFX
Boots Riley wanted to have an old, ragged look, but with a modern twist.
Lots of compositing trickery was called upon when doing the face replacements on the puppet for Cootie. “The neck was one of the biggest issues, to make sure that he’s tucked in nicely behind the shirt,” Sienkiewicz states. “And even the chest, because the puppet was bigger than Cootie himself, so we needed to slim him down; there was some complex paintwork involved. There were a lot of rigs involved with the puppet, so there was massive cleanup in some of the shots. The shots in Bing-Bang Burger were a real Cootie shot close to the camera, but the set was a miniature. He was touching his head, so there was lots of trickery to make it look realistic. The biggest challenge was to make sure that the puppet and the real elements were cohesive and working together. There was lots of comp trickery involved!” The major task for Stewart-Patterson was being able to adapt to meet the needs of the production. “When working closely with creatives, we try to reel them in but also give them as much space as you can. It’s being able to assist and properly reflect to the client what creative changes might be or how much time they might take. This team really brought it in well, and the show looks great.”
Watch the VFX breakdown reel for I’m a Virgo from Crafty Apes VFX. Click here: https://vimeo.com/834933437.
By OLIVER WEBB
Images courtesy of Milk VFX and Amazon Prime.
Aziraphale’s wings were originally created for Season 1, but had to be recreated for Season 2 to allow for more complex motion requirements.
Created by Neil Gaiman and Terry Pratchett, and adapted for the screen by Gaiman for Amazon Prime, British fantasy comedy series Good Omens follows the antics and adventures of angel Aziraphale (Michael Sheen) and demon Crowley (David Tennant) after they are exiled from Heaven and Hell and team up to form an unlikely duo on Earth. Milk VFX Supervisor/Co-Founder Jean-Claude Deguara served as VFX Supervisor on Good Omens Season 1 before sliding into the client-side VFX Supervisor seat for Season 2. Matias Derkacz served as Visual Effects Supervisor for Milk on the second season.
“Before the ‘let it be light’ moment, we used one of the environments to create white flashes of energy that revolved around Crowley and Aziraphale. This approach provided the right amount of light, as this part of the sequence required it to be darker. But for the creation of the nebulas, we needed the opposite effect. We used a simplified version of the nebulas on the LED screens, which allowed us to achieve complex light changes. Having access to the LED screens allowed us to achieve the correct light interaction on Crowley and Aziraphale, helping us achieve the desired impact.”
—Matias Derkacz, Visual Effects Supervisor, Milk VFX
“I first got involved with Good Omens on Season 1 when I was Compositing Supervisor,” Derkacz says. “Working with Jean-Claude Deguara as a client-side VFX supervisor was brilliant. Working with him so closely on Season 1, I knew what he was after. I was part of all the discussions about how to creatively push forward the show and was involved with those conversations from the very beginning. Jean-Claude had some great ideas.”
The wings were animated to follow Crowley’s and Aziraphale’s body movements in order to enhance their emotions.
Discussing his initial conversations about the look of the second season, Derkacz explains that he was brought on set for the opening sequence. “I was brought in to help with virtual production for the opening sequence. Before the ‘let it be light’ moment, we used one of the environments to create white flashes of energy that revolved around Crowley and Aziraphale. This approach provided the right amount of light, as this part of the sequence required it to be darker. But for the creation of the nebulas, we needed the opposite effect. We used a simplified version of the nebulas on the LED screens, which allowed us to achieve complex light changes. Having access to the LED screens allowed us to achieve the correct light interaction on Crowley and Aziraphale, helping us achieve the desired impact. So, we created material which was not meant to be on camera, but to get the light interaction.”
“It required a lot of conversations back and forth with Douglas [director Douglas Mackinnon] and Neil,” Derkacz continues. “We had a great way of working where we could regularly check in on how things were going. We used real NASA pictures of nebulas as reference. We adapted the color palettes and shapes to make it more appealing to the eye, more interesting and different. That was the way that we approached that sequence. It was 100-plus shots that we had to actually deal with.”
One of the most pressing conversations for Visual Effects Supervisor Matias Derkacz and his team revolved around the look of the Land of Uz in Episode 2.
“[Aziraphale’s and Crowley’s wings] were created for Season 1, but we redid them as they needed more complex behavior. The wings were animated to follow Crowley’s and Aziraphale’s body movements to enhance their emotions. So, there was a bit more animation work involved with this. It was more complex in the way that feathers behaved since we had to have proper feather interactions. That was the biggest sequence for sure in terms of volume and complexity. It was a fun and challenging sequence.”
—Matias Derkacz, Visual Effects Supervisor, Milk VFX
One of the main assets of the show, which played a role in the Before the Beginning sequence, was Aziraphale’s and Crowley’s wings, which had to be recreated for Season 2. “The wings were created for Season 1, but we redid them as they needed more complex behavior,” Derkacz explains. “The wings were animated to follow Crowley’s and Aziraphale’s body movements to enhance their emotions. So, there was a bit more animation work involved with this. It was more complex in the way that feathers behaved since we had to have proper feather interactions. That was the biggest sequence for sure in terms of volume and complexity. It was a fun and challenging sequence.”
Derkacz and his team were involved with only one shot on the Soho street, a shot where the camera starts high on a crane and then goes underneath a car.
Production Designer Michael Ralph was responsible for creating the numerous time periods and their look throughout the show. One of the most pressing conversations for Derkacz and his team revolved around the look of the Land of Uz in Episode 2. “That asset needed to be done in a specific way. There were always conversations about making sure what we had created didn’t look like the Grand Canyon or like any other part of the United States,” he says. “We had an early discussion with Jean-Claude to try and get the look set for that sequence, especially when Aziraphale comes out of the portal. That environment was one we discussed thoroughly, and we ended up doing everything digitally. It was based on a mix of real photos, but some of the references we had were really similar to the Grand Canyon, so we had to create something similar but different enough for people not to comment on it.”
Production Designer Michael Ralph was responsible for creating the numerous time periods throughout the show. The environment for the Land of Uz was completely CG, based on a mix of real photos and references to the Grand Canyon.
Around 100 people from different departments were involved with the production of the second season. “We had a big team,” Derkacz adds. “We had to be quite smart in the way that we scheduled the work because it’s a TV series and there is loads of work. Scenes like the opening sequence and the first six minutes of the second season were really important. It hasn’t really been done before, is really abstract and took a lot of time to achieve. We managed the workload by working to the volume that we had and in the order of the episodes that we were discussing at the time. Some assets might require weeks of work.”
The actors were shot standing on greenscreen and then seamlessly added to a fully CG cemetery.
“The only shot we were involved with on the Soho street was for a shot where the camera starts quite high and then goes underneath the car. That is a really cool shot. It’s a blend between two plates because you have the main plate of the crane, and when you go underneath the car it’s full CGI, and then you go around the car and you have a mix between a CG car and a real car, and then Crowley gets out of the car. All of that has to be done as a single camera move. That’s sort of Douglas’s [director Douglas Mackinnon] signature as he loves doing these transitions, and I think they were great and really helped the flow of the episode.”
—Matias Derkacz, Visual Effects Supervisor, Milk VFX
When it came to creating the set extensions, the Soho street had been completed for Season 1 and was already an established asset. “The only shot we were involved with on the Soho street was for a shot where the camera starts quite high and then goes underneath the car. That is a really cool shot. It’s a blend between two plates because you have the main plate of the crane, and when you go underneath the car it’s full CGI, and then you go around the car and you have a mix between a CG car and a real car, and then Crowley gets out of the car. All of that has to be done as a single camera move. That’s sort of Douglas’s signature as he loves doing these transitions, and I think they were great and really helped the flow of the episode,” Derkacz details.
Fire was shot as an element and done in 2D, and integrated on the set due to safety protocols.
Derkacz and his team also worked on a significant shot for one of the wartime environments. “We did work on one crane shot where we go into London and there are some zombies and fires on the street. That was a quite straightforward set extension, which we blended with the smoke that we had on set. We added fire that was shot as an element, and that was all done in 2D and integrated into the set in post because they couldn’t have that fire next to the people as it wasn’t safe. We also created an environment that was used in virtual production. We built numerous destroyed buildings and added CGI fire, which were used on set as an LED screen when Crowley is driving through,” Derkacz notes.
David Tennant as Crowley and Shelley Conn as Beelzebub experience a front-seat encounter of the devilish kind, courtesy of visual effects.
“The work that we did on the tongue of the demon was really fun. It’s a quick shot, but for that shot we had to create all the CG for the tongue and all of the saliva. It’s lots of work involved for only one shot, but the good thing about that is that you build the asset, you get the shot, and you have a part of the sequence that has been approved and you can move on. … I would say that the most challenging to create wasn’t a character, it was Aziraphale’s wings. All these minimal changes in color had to be translated to white wings, which was a really difficult and complex aspect, but the end result looked great.”
—Matias Derkacz, Visual Effects Supervisor, Milk VFX
Zombies with long-range, tentacle-like CG tongues can be found on the street in London. Milk delivered more than 500 shots for the second season of the show.
Derek Jacobi as Metatron, Jon Hamm as Gabriel and Liz Carr as Saraqael in a flashback scene set in Heaven, before Gabriel is demoted and loses his memories. (Image courtesy of Amazon Prime)
There was lots of work put into the creature work and into key characters as well. “I really enjoyed all of the creature work that we did, especially the geckos,” Derkacz says. “The work that we did on the tongue of the demon was really fun. It’s a quick shot, but for that shot we had to create all the CG for the tongue and all of the saliva. It’s lots of work involved for only one shot, but the good thing about that is that you build the asset, you get the shot, and you have a part of the sequence that has been approved and you can move on. So, it’s pretty much based on deliveries, as well as on the volume of work for the full team. I think I enjoyed the process most, working with a wonderful team from the beginning until the end. I would say that the most challenging to create wasn’t a character, it was Aziraphale’s wings. All these minimal changes in color had to be translated to white wings, which was a really difficult and complex aspect, but the end result looked great.”
Angel Aziraphale (Michael Sheen) and demon Crowley (David Tennant). According to Milk VFX Visual Effects Supervisor Matias Derkacz, the biggest challenge wasn’t to create a character, it was the wings. (Image courtesy of Amazon Prime)
Milk delivered more than 500 shots for Season 2. “What is great about Good Omens in terms of visual effects work is that it isn’t like a typical VFX TV series. Good Omens is really complex in the sense that you build an asset which is only used once. We are going back in time and we build all of this for the shot and then that’s it; we never come back to it. It’s really fun to work on a show like that because you do so much,” Derkacz concludes.
Watch a brief video on “The Making of Good Omens: Season 2, Ep. 0 VFX Breakdown Chapter 1.” Click here: https://www.amazon.com/gp/video/detail/B0BYZ6RH44?ref_=dv_dp_explore_sign_in
By TREVOR HOGG
Images courtesy of Fin Design + Effects and Netflix.
A major environment build for Fin Design + Effects was constructing Vienna with moving traffic.
Thought to be dead, mercenary Tyler Rake (Chris Hemsworth) returns in the Netflix sequel Extraction 2, where the action shifts from Bangladesh to Austria under the direction of Sam Hargrave. Further complications arise when attempting to rescue a crime lord’s family from a Georgian prison. Contributing to the mayhem are Visual Effects Supervisor Björn Mayer and an army of vendors. One of the contributors was Fin Design + Effects, which over a period of six months produced 150 shots that consisted of a 360-degree panoramic view of Vienna and the climactic glass-awning fight scene.
“The thing that prepared me the most was watching the first one, and you get a sense that it’s going to be a gritty action movie [like] John Wick, but much more in your face,” states Will Towle, Visual Effects Supervisor at Fin Design + Effects. “There are different types of visual effects movies. You have Guardians of the Galaxy, which are your bombastic huge-effects shows. Then you’ve got your more down-to-earth invisible effects stuff like Extraction 2, which fits well in my compositing background. The challenge for this kind of movie is that you have to get the audience to buy into it and see the peril that the characters are in. Compositing is all about those finishing touches and adding believability. I’m also a believer that wherever possible we should use live-action effects. You can’t get more real than real. Quite often you can use them in combination with CG to bring the CG up to the next level.”
Despite building a large set piece for the awning fight sequence, major set extensions were required to replace the bluescreen.
Clear and concise instructions were articulated by Mayer. “Björn knows what he wants, and they had actually built quite a big set of the awning, but we ended up replacing all of it because you have reflections in the pieces of glass, such as crew members and bluescreen,” Towle remarks. “It gave us a good starting point and base to ground our CG. Early on, Björn and I talked about other references. Pictures were shared of what this sequence might look like. We started with broad strokes and we slowly zoomed into the tiny details like anisotropic filtering, dirt maps on the glass, and kept adding layers of detail to the CG build until it looked photoreal.” Rotomation of the characters was a critical part of the process. “Because we were replacing the glass in every shot and glass is highly reflective, that required us to do accurate digital doubles of four or five characters, and those had to be rotomated into every single shot,” Towle explains. “This is not work we’ve done too extensively before, so new pipeline tools had to be written to help with that, and new animation tools as well.”
Everything had a basis in plate photography, which helped to provide a ground truth for the CG.
Guiding the shot design process were storyboards and previs. “We can start placing our cameras and see what pieces of the build might need to be up-res’d because there is a level of detail scale where the further from camera it is, the fewer details we put in to save render time and disk space,” Towle explains. “Björn and his team had flown a helicopter over Vienna and taken thousands of photos from all different positions, which was fantastic for getting a sense of the environment and what the sun looks like at the time of day that the sequence is taking place. Vienna’s midday sun is bright with not much atmosphere. We also did a lot of research on Google Maps. We flew cameras over the area on Google Earth and did some photogrammetry to build a proxy city early on to figure out camera angles and what buildings we would see from what camera height.”
“The challenge for this kind of movie is that you have to get the audience to buy into it and see the peril that the characters are in. Compositing is all about those finishing touches and adding believability. I’m also a believer that wherever possible we should use live-action effects. You can’t get more real than real. Quite often you can use them in combination with CG to bring the CG up to the next level.”
—Will Towle, Visual Effects Supervisor, Fin Design + Effects
Half of the 150 shots were entirely digital. “I would love to keep as much plate as I can, but, unfortunately, for a lot of these shots we did have to replace them with CG,” Towle states. “The challenge there was making it look as much like the reference as possible. It helped us in a way to go full CG. In almost every shot the characters are still plate, and for this show, to assist us in keeping the sequence on track in terms of color and lighting, we created a neutral grading pipeline at Fin Design + Effects; this is the first time we’ve done that. What that entailed was neutral grading every single shot in the sequence, and that allowed us to create our Vienna city in a neutral graded environment as well. It enabled us to light, render and drop a shot in that is looking good already and you’re nudging the last five percent in compositing. But we still had live-action elements, and with those come rebuilding edges, roto and bluescreen removal.”
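Towle’s neutral-grading workflow – un-grading every shot so plates and CG can be lit and assembled in one consistent color space, then restoring each shot’s grade at the end – can be sketched with simple slope/offset math. Everything below is a hypothetical illustration (the function names, grade values and the naive merge are invented); a real pipeline would rely on proper color management rather than this toy transform.

```python
import numpy as np

def apply_grade(rgb, slope, offset):
    """A minimal CDL-style grade: out = rgb * slope + offset."""
    return rgb * slope + offset

def to_neutral(rgb, slope, offset):
    """Invert a shot's grade so every shot lives in the same neutral space."""
    return (rgb - offset) / slope

# Hypothetical per-shot grades as delivered.
SHOT_GRADES = {"ext2_0410": (1.2, 0.02), "ext2_0411": (0.9, -0.01)}

def comp_in_neutral(plate, shot, cg_render):
    """Neutralize the plate, merge CG rendered in neutral space,
    then restore the shot grade so the result matches the cut."""
    slope, offset = SHOT_GRADES[shot]
    neutral_plate = to_neutral(plate, slope, offset)
    merged = neutral_plate + cg_render  # stand-in for a real over/merge
    return apply_grade(merged, slope, offset)
```

Because lighting decisions are made once in the shared neutral space, the same city build can drop into any shot and only the last few percent needs nudging per shot in compositing.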
The hardest part of the glass-awning fight sequence was replacing the reflections and refractions of the crew and bluescreen.
Constructing the DC Towers and surrounding Vienna was the biggest photoreal environment build for Fin Design + Effects. “How are we going to build the city and add moving traffic to it?” Towle observes. “We had to have moving reflections on every surface and nearby buildings as well. That was something we were definitely anticipating. Also, the way that light travels through glass and compositing glass has some fundamental difficulties. Glass is all reflections and refractions; that’s classically quite difficult to comp because if you have a CG render of a pane of glass and you’re seeing through it, you don’t have a depth channel for the stuff you’re seeing through it. You have a depth channel of the piece of glass. We had to come up with a way to defocus all of this stuff properly. To achieve that in compositing, we came up with a method to strip out the reflections and refractions from the pieces of glass, defocus them separately using a different depth channel and recombine them all. That was all done in comp.”
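The method Towle describes – stripping a glass render into reflection and refraction passes, defocusing each against its own depth channel, and recombining them – can be illustrated with a toy sketch. All names and the single-value blur below are invented simplifications; a production defocus would compute a per-pixel circle of confusion inside a compositing package rather than this one-number approximation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def defocus_layer(rgb, depth, focus_depth, blur_per_unit=2.0):
    """Blur one pass with a kernel driven by its own depth channel.
    Using the layer's mean depth keeps the sketch short."""
    coc = abs(float(depth.mean()) - focus_depth) * blur_per_unit
    return gaussian_filter(rgb, sigma=(coc, coc, 0))  # no blur across channels

def comp_glass(reflection, refl_depth, refraction, refr_depth, focus_depth):
    """Defocus reflection and refraction separately, each with its own
    depth channel, then recombine additively (pre-multiplied passes)."""
    refl = defocus_layer(reflection, refl_depth, focus_depth)
    refr = defocus_layer(refraction, refr_depth, focus_depth)
    return refl + refr
```

The key point is that the two passes never share a depth channel: the reflection may focus on a building a kilometer away while the refraction focuses on an actor a meter behind the pane.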
Deep compositing was critical in being able to manipulate and produce the correct reflections and refractions in the glass.
“[T]hey had actually built quite a big set of the awning, but we ended up replacing all of it because you have reflections in the pieces of glass, such as crew members and bluescreen. It gave us a good starting point and base to ground our CG. Early on, Björn [Visual Effects Supervisor Björn Mayer] and I talked about other references. Pictures were shared of what this sequence might look like. We started with broad strokes and we slowly zoomed into the tiny details like anisotropic filtering, dirt maps on the glass, and kept adding layers of detail to the CG build until it looked photoreal.”
—Will Towle, Visual Effects Supervisor, Fin Design + Effects
Rotomation was utilized extensively for the characters and their digital doubles.
“For the hero shots that were close to the glass,” Towle continues, “we rendered it in two different layers and comped it back together in deep compositing so we didn’t have to render with holdouts. We could push and pull layers ever so slightly using deep compositing. Then, just as we expected, the build of the city was extremely challenging. We started with a photographic base that was our ground truth, then put it on a sphere to see how it looks. You begin to notice things that are not in the correct perspective, like warped buildings. You can go, ‘That building needs to be reprojected onto geometry, while this one has to be full CG.’ We used it as a guide for how detailed our environment had to be. The furthest-away stuff was on a sphere. As you got a bit closer, it was put on cards, even closer on geometry, and the closest full CG. That allowed us to keep our render times down and still get a believable photoreal result.”
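The tiered city build Towle walks through – sphere for the furthest buildings, cards closer in, projected geometry closer still, full CG nearest camera – is a classic level-of-detail scheme. A minimal sketch with invented distance thresholds (the real cutoffs would depend on lens, resolution and shot):

```python
def lod_tier(distance_m):
    """Pick an environment representation by distance from camera."""
    if distance_m > 2000:
        return "sphere"         # projected onto the environment sphere
    if distance_m > 800:
        return "cards"          # flat cards carrying projected photography
    if distance_m > 200:
        return "projected_geo"  # photos reprojected onto simple geometry
    return "full_cg"            # hero buildings, fully modeled and shaded

# Hypothetical assets and their distances for one camera setup.
buildings = {"dc_tower": 150, "office_block": 450, "skyline_row": 3000}
tiers = {name: lod_tier(d) for name, d in buildings.items()}
```

Keeping only the nearest assets as full CG is what holds render times and disk space down while the frame still reads as photoreal.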
Watch Fin Design + Effects’ amazing behind-the-scenes work on the glass-awning fight sequence in their Extraction 2 VFX Breakdown. Click here: https://www.findesign.com.au/projects/extraction-2/
By TREVOR HOGG
Images courtesy of BlueBolt and Netflix.
A last-minute alteration was giving the concluding Valhalla scene more of an epic, ethereal quality.
After being the sole visual effects vendor on the 10th-century British saga The Last Kingdom, BlueBolt gets to apply five seasons’ worth of expertise to the Netflix feature film The Last Kingdom: Seven Kings Must Die. The story revolves around Uhtred of Bebbanburg attempting to unite England after the death of King Edward. “This is not a documentary,” observes Richard Frazer, VFX Supervisor at BlueBolt. “We have to start from a point of authenticity and then take some creative license because we’re still making a piece of entertainment.” The methodology did not change going from a series to feature format. “Apart from the Valhalla stuff, there wasn’t anything crazy dissimilar to what we’ve done before. We looked at it as if we were making one and a half or two episodes of the show.” Post-production lasted for seven to eight months. “The shot count of 375 doesn’t sound like much, but they were extremely complex,” Frazer explains. “The main difference is that it’s one big linear focus [rather than doing pre-production, shooting and post-production of different episodes at the same time]. There wasn’t any overlap with anything else that was going on.”
Sky replacements were part of the visual effects work, such as turning scenes from day to night.
Outside of the usual numerous blood shots, the main focus was the battle at the end. “You cut to this wide battle shot [there were four or five of them], and for the three seconds that it’s onscreen there are all of these nuances of the advancement in the configuration of the two sides that have to be communicated,” Frazer states. “The Valhalla sequence was something that came in quite late and wasn’t planned for. It was filmed as scripted with Uhtred in the hall at Bebbanburg. He hears this commotion, goes over, and it’s like a mirror of the hall that he has just left. It was shot that way and never meant to be visual effects. The film was locked and sent off for reviews to one of the executives who wanted it to be this much bigger thing at the end. We were nervous because it was going to take quite a significant bunch of money to get it done, and it’s such an important moment. Uhtred is on the cusp between life and death. It couldn’t be cheesy. If you die in battle, you either go to Valhalla, which is an eternal banquet with all of your fallen comrades for the rest of time, or to Fólkvangr, where you can be out in the countryside, if getting drunk for eternity isn’t your thing!”
CG water was avoided for the series but required for the feature version.
Classical paintings of Valhalla were too grand and over the top in their scale, which ran counter to the tone of the show. “We did reference a shot where we go to the Isle of Man and there is a Viking longhouse, which was based on a real one that was discovered,” Frazer remarks. “For the Hall of Valhalla, we went to the production designer who had designed the Hall of Bebbanburg and said, ‘It has to feel otherworldly but grounded in a reality that Uhtred understands because of the implication that it might be a hallucination that he is having.’” A blinding bright light appears in the doorway of the hall. “It was based on the Hall of Bebbanburg, which was practically built and expanded into this infinite space. At the end were these giant doors that opened out onto the meadows of Fólkvangr. Originally, you were supposed to see through the doorway to the meadows beyond. We played with the exposure and levels of haze and reached this level where it was blown out and light streaking through. The executives liked that as a look because it implied something ethereal beyond, but still felt quite grounded,” Frazer says.
“The Valhalla sequence was something that came in quite late and wasn’t planned for. It was filmed as scripted with Uhtred in the hall at Bebbanburg. He hears this commotion, goes over, and it’s like a mirror of the hall that he has just left. It was shot that way and never meant to be visual effects. The film was locked and sent off for reviews to one of the executives who wanted it to be this much bigger thing at the end. We were nervous because it was going to take quite a significant bunch of money to get it done, and it’s such an important moment. Uhtred is on the cusp between life and death. It couldn’t be cheesy.”
—Richard Frazer, VFX Supervisor, BlueBolt
Motion capture – with a shield and baseball bat in hand – was essential in producing the necessary number of soldiers for battle sequences.
Epic warfare is a signature part of series storytelling for The Last Kingdom, with the climax of the feature being based on the Battle of Brunanburh. “There were probably more records or poems written about the aftermath,” Frazer notes. “I don’t know how much the specific military beats within it are accurate to the real historical event. They wanted to have Uhtred and his forces overwhelmed and outnumbered, but by using tactics he manages to save the day. Uhtred concedes ground in order to expose their flank to a pincer maneuver. As much as we were trying to shoot that with drones on the battlefield, in the end it was too difficult to do that as a physical maneuver with the number of people we had. We shot a bunch of empty plates of the battlefield and did it completely CG to show the nuances of that maneuvering, which is counter to what I normally do. You should always start from a place of having real people, but at least what we attempted to shoot with the drone served as reference to rebuild all of that with our CG characters.”
It was a challenge to have the opposing armies crushing against each other enough while not making it visually confusing for the viewer.
“[The Hall of Valhalla] was based on the Hall of Bebbanburg, which was practically built and expanded into this infinite space. At the end were these giant doors that opened out onto the meadows of Fólkvangr. Originally, you were supposed to see through the doorway to the meadows beyond. We played with the exposure and levels of haze and reached this level where it was blown out and light streaking through. The executives liked that as a look because it implied something ethereal beyond, but still felt quite grounded.”
—Richard Frazer, VFX Supervisor, BlueBolt
Uhtred’s side consists of Wessex and mercenary soldiers who wear uniforms, while the opposing Allied side is made up of Danes and Islanders dressed entirely differently from each other. “We had to find a way to create base characters and then randomize those to make them appear as individuals with the various head dressings, accessories and fur,” Frazer explains. “If you’re in a wide battle shot you need to have many variations of the actions that people are doing, otherwise you quickly start seeing repeats. We had motion capture suits in the office with a shield and baseball bat and would churn out whole new actions. We tried to get as many different people in the suits as possible to vary it up.” The battle was shot in a bowl in Hungary. “The Allied side comes down a raised hill, and Uhtred and his side come in between the treelines. We used that because when you’re in amongst all of the fighting, you need to see things above people’s heads to orientate yourself,” Frazer describes.
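The variation scheme Frazer outlines – a few base characters randomized with different head dressings, accessories and mocap actions so repeats don't read in wide shots – can be sketched as below. Every pool and attribute name is invented for illustration; BlueBolt's actual crowd tools aren't described at this level of detail.

```python
import random

# Hypothetical variation pools for the Danes and Islanders.
HEADWEAR = ["bare", "helm", "hood", "head dressing"]
ACCESSORIES = ["fur cloak", "round shield", "axe", "spear"]
ACTIONS = ["shield push", "overhead swing", "stumble", "brace", "thrust"]

def make_agent(base, rng):
    """Dress one base character up as an individual-looking agent."""
    return {
        "base": base,
        "headwear": rng.choice(HEADWEAR),
        "accessory": rng.choice(ACCESSORIES),
        "action": rng.choice(ACTIONS),
        "scale": rng.uniform(0.95, 1.05),  # subtle height variation
    }

def build_crowd(bases, count, seed=7):
    """Seeded so a shot's crowd is reproducible between renders."""
    rng = random.Random(seed)
    return [make_agent(rng.choice(bases), rng) for _ in range(count)]

crowd = build_crowd(["dane_a", "dane_b", "islander_a"], 500)
```

The seed matters in production: notes given on one render must still apply to the same individuals in the next version of the shot.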
Approximately 370 visual effects shots were created by BlueBolt, which was the sole vendor for the feature as well as the five seasons of The Last Kingdom.
“As much as we were trying to shoot [the final battle scene] with drones on the battlefield, in the end it was too difficult to do that as a physical maneuver with the number of people we had. We shot a bunch of empty plates of the battlefield and did it completely CG to show the nuances of that maneuvering, which is counter to what I normally do. You should always start from a place of having real people, but at least what we attempted to shoot with the drone served as reference to rebuild all of that with our CG characters.”
—Richard Frazer, VFX Supervisor, BlueBolt
BlueBolt was responsible for the arrival of the armada. “It was the first shot we started and the last one that was delivered!” Frazer reveals. “It went through a lot of variations. The long shot from Season 2 that I composited, of Uhtred on a slave ship – which came from inside the boat, across the bow of the deck and then up and wide, and pulled back showing the ship sailing across the sea – the director originally wanted to create something like that. One area that we haven’t touched on the show is CG water. We tend to use stock plates and work from there. That worked well for all of the shots of ships in the previous seasons, but if there is a specific camera move that is wanted, you’re limited with what you can do based on the stock plates that you were able to source. We tried doing the big pullback as it came through the ships and rigging. It wasn’t quite working. We scrapped it midway through and came back to it. That was the most challenging shot [of the movie]. The ocean itself is some stock footage that we sourced, but then the wake and all of the interaction of the boats are entirely CG. That was the tricky part.”
Watch BlueBolt’s VFX breakdown video of large, complex CG battle scenes and army formations as well as other elaborate environments for Seven Kings Must Die. Click here: https://www.blue-bolt.com/ourwork/seven-kings-must-die
By TREVOR HOGG
Images courtesy of Disney+.
The methodology for each visual effect shot was determined on a case-by-case basis.
Working on the Disney+ fantasy series American Born Chinese holds a special place in the heart of Visual Effects Supervisor Kaitlyn Yang (Raised by Wolves), because the content on the screen and the creative talent in front of and behind the camera reflected her ethnic heritage in an authentic and meaningful way. The story reimagines the myth of the Monkey King by giving the trickster god domestic life troubles in the present day and mines the humor in trying to gain self-acceptance beyond the borders of China. There is no shortage of fantasy elements as a battle between the forces of good and evil ensues with two high school students thrust right in the middle of the heavenly conflict.
“In my opinion, visual effects are best served when they’re in a supporting role. They’re invisible or we tap into the magic realism whenever the visual effects need to be front and center because story calls for it.”
—Kaitlyn Yang, Visual Effects Supervisor
American Born Chinese wanted to make sure that everything was grounded with a focus placed upon magic realism.
“Just in the script phase, right off the bat you have a falcon changing into a tiger changing into a fish! How do we do that? That ultimately becomes the perfect handoff of what was shot practically and just seeing enough of something to whet your appetite. It was kind of a morph that was more into the sparkle-lighting effects realm aided by the sound design. The camera is moving so fast that you’re like, ‘Oh, now it’s going to happen.’ There were so many of those all in a row that it was mesmerizing to watch.”
—Kaitlyn Yang, Visual Effects Supervisor
The Monkey King is capable of 72 transformations, with one of them being a falcon.
“I was extremely thrilled when I got the email, and with not many women visible I felt like I was chosen by the claw in Toy Story!” states Yang, who is the founder and CEO of Alpha Studios. “Having the braintrust [of Asian creators, heads of departments and cast] with lived experiences allowed us to know when to take liberties and pay homage,” she says. “You see so many nods to Asian traditions throughout the series. We had a lot of good ideas to pick from, and then figuring out which ones fit the tone and vibe that [showrunners] Kelvin Yu and Destin Daniel Cretton had already envisioned in their head and how to amplify that.”
Stunts were pivotal in pulling off the fantasy action sequences, with wirework being utilized by martial arts icon/Oscar winner Michelle Yeoh.
Having been responsible for Shang-Chi and the Legend of the Ten Rings, Cretton was experienced in dealing with a visual effects-heavy production, while other directors, such as Lucy Liu, were not as experienced. “It was never that we were designing specifically for their vision,” Yang remarks. “They were always open to new ideas and improvements, but key words were passed around here and there. Like any good storytelling, it’s about a feeling that you’re trying to capture. How fast was the waterfall moving, and could that contribute to the intense dialogue that the dad and son are having? In my opinion, visual effects are best served when they’re in a supporting role. They’re invisible, or we tap into the magic realism whenever the visual effects need to be front and center because story calls for it.”
The pacing and look of effects such as waterfalls were determined by the desired tone for the scene.
Amongst the creative and technical challenges to produce 1,100 visual effects shots for the eight episodes was the ability of the Monkey King to do as many as 72 transformations, which are dictated by the scenario and where he has to go next. “Just in the script phase, right off the bat you have a falcon changing into a tiger changing into a fish!” Yang notes. “How do we do that? That ultimately becomes the perfect handoff of what was shot practically and just seeing enough of something to whet your appetite. It was kind of a morph that was more into the sparkle-lighting effects realm aided by the sound design. The camera is moving so fast that you’re like, ‘Oh, now it’s going to happen.’ There were so many of those all in a row that it was mesmerizing to watch.”
Among the invisible effects were smartphone screen inserts.
“The first and last episodes were always planned to be visual effects heavy. In Episode 108, we have this intense battle in the sky that we had to somehow make believable. There were so many moving parts, figuring out how to digitally duplicate something that we had practically. I started out as a special effects artist. I will always have a love of getting as much in-camera as possible knowing even if most of it doesn’t appear in the final frame that it’s going to be the coolest reference we’re going to have.”
—Kaitlyn Yang, Visual Effects Supervisor
The glass explosion was achieved practically.
For the opening chase in Episode 101 where a multitude of transformations occur, previs was produced by DigitalFilm Tree that was informed by Stunt Coordinator Peng Zheng. “He and his team shot at their practice ring on iPhones and mobile phones to give us a sense of the camera movement and how one scene stitched with another,” Yang explains. “That helped so many departments to visualize what we needed to do and where the handoff was. I hope that people will love the chase scene as much as we loved putting it together. That sequence, in particular, took the most days, shot count and budget. It was not only a fun sequence but set the tone for the rest of the season. That was one of the first sequences that we shot.” The transformation shots might be described as kung fu magic. “We didn’t do as many as 72 transformations, but perhaps for later seasons! Knock on wood! I grew up with Hong Kong cinema, so I love the types of choreography that you don’t see too much nowadays that have just enough surrealism and magic that ties it together. It’s so fast-moving you can’t look away,” Yang states.
Approximately 1,100 visual effects shots were created for the eight episodes.
“We took all of the tools in the toolbox and deployed them whenever we saw fit. In Episode 101, we LiDAR-scanned our sets knowing that we had to do extensive set extensions. We worked closely with the art department, which had done extensive research, and also relied on Gene Luen Yang, the writer of the book of the same name, because he has done extensive research.”
—Kaitlyn Yang, Visual Effects Supervisor
Most of the visual effects were temped by Pixelloid working alongside DigitalFilm Tree, “not necessarily to the degree of the final look that we ultimately settled on, but providing us with a stepping stone to keep improving upon,” Yang notes. “By cut five or six we were able to see some version of the effect already built in. The effects that went through the most iterations were probably for the Heaven episode. That was mainly shot on bluescreen, and we definitely used all of the time on the clock for that one to fine-tune the lighting and matte painting. The first and last episodes were always planned to be visual effects heavy. In Episode 108, we have this intense battle in the sky that we had to somehow make believable. There were so many moving parts, figuring out how to digitally duplicate something that we had practically. I started out as a special effects artist. I will always have a love of getting as much in-camera as possible knowing even if most of it doesn’t appear in the final frame that it’s going to be the coolest reference we’re going to have.”
Nothing like having a battle between deities in a high school where teenage angst reigns supreme.
The methodology was devised on a case-by-case basis. “We took all of the tools in the toolbox and deployed them whenever we saw fit,” Yang explains. “In Episode 101, we LiDAR-scanned our sets knowing that we had to do extensive set extensions. We worked closely with the art department, which had done extensive research, and also relied on Gene Luen Yang, the writer of the book of the same name, because he has done extensive research.” Human Engine ran a portable scanning station on set to assist with the construction of digital doubles. “We had people flying into the sky or walking on water or cliff jumping. Even before wire removal and background replacements, watching the dailies was so cool. We were able to get some CG doubles, but for most of it we had an incredible stunt team that paid homage to Asian cinema. Hopefully, the audience will be so immersed that everything we put on the screen will be taken as is, because the best compliment for visual effects is not to even know that it was there.”
Watch mesmerizing transformations of the Monkey King in DNEG’s “VFX Breakdown” for American Born Chinese. Click here: https://vimeo.com/844316668
By TREVOR HOGG
Images courtesy of Lucasfilm Ltd. and Disney.
The hardest shot to get right for ILM was encountered when the bag is removed and reveals the de-aged Indiana Jones for the first time.
Receiving the franchise baton from Steven Spielberg (Jaws), James Mangold (Ford v Ferrari) embarks on the fifth and final adventure with a character who made archeology, fedoras and bullwhips cool, and provided Harrison Ford with his third iconic cinematic persona. Joining in the fun and global mayhem that ensues in Indiana Jones and the Dial of Destiny is Oscar-winning VFX Supervisor Andrew Whitehurst (Ex Machina) along with ILM VFX Supervisor Robert Weaver.
An array camera car captured plates for the tuk-tuk chase that occurs on the streets of Morocco, which was handled by Soho VFX.
Much has been made of the opening 25 minutes that resembles lost footage from the making of Raiders of the Lost Ark and features a de-aged Harrison Ford. “We started by looking at archival material and building our CG head and A/B between the two, which was a useful comparison,” states Whitehurst, who was on the project for three years. “Early on we had scanned Harrison Ford, so we had him as he is now, which could also be used for checking proportions. There was never ever a particular shot that we thought, ‘This is the one we’ll use to test.’ We all knew that the bag coming off of the head was going to be the most important shot because it’s the first time you see him, and it was also going to end up in the trailer. That was shot early on, and we got working on it earnestly.”
When de-aging Mads Mikkelsen it was more important that he resembled the character of Jürgen Voller rather than his younger self.
De-aging was accomplished by utilizing ILM FaceSwap. “It was a terrifying challenge that I wanted to step into and take on,” Weaver remarks. “Things are always evolving, and we’re good at keeping up and leading development in all areas that are necessary to perfect the techniques. Andrew had a good summary of it by stating it’s an umbrella suite of tools. By adding the machine learning to the mix, it’s contributing yet another tool to the arsenal to accomplish what is needed. We did rely on archival footage from past films, and it helped tremendously in many different ways, but it didn’t replace our previous techniques, which we still need to build a fully CG asset that is believable standing on its own as well. It was necessary to treat each shot individually.”
The World War II prologue features a chase on top of a train with Toby Jones, the only character not de-aged. The chase includes an all-CG shot of Ford running across the train.
FaceSwap is not treated as a magic bullet. “What we have is a selection of tools and the ability to mix between the various outputs of these processes that we can look at and go, ‘We’ll take the eyes from here and some of the skin tone from this,’” Whitehurst notes. “To be able to create, like a painter would, that final likeness, and to maintain that performance. The brilliance of it is having the ability to use a little bit of this and more of that. For another shot that combination may not work, so you have to use a different combination of approaches.”
The parade sequence, which had Edinburgh stand in for New York City, was the responsibility of Rising Sun Pictures.
Remarkably, Harrison Ford was still able to fit into the original costume. “The long and short of it is, Harrison is super fit and energetic,” Whitehurst states. “He will act like 40-year-old Harrison for those moments.” Ford has a distinct running style described as a cross between Jack Sparrow and Tom Cruise. “We took a lot of time to study his running in various films, and we had an all-CG shot of him running across the train,” Weaver notes. “Essentially, we had to get those mannerisms into that run while working out the cadence of what is necessary to step over the pipes and jump between cars.”
The prologue was intended to resemble recently-discovered lost footage shot during the making of Raiders of the Lost Ark.
“[De-aging Harrison Ford] was a terrifying challenge that I wanted to step into and take on. … By adding the machine learning to the mix [in ILM’s FaceSwap], it’s contributing yet another tool to the arsenal to accomplish what is needed. We did rely on archival footage from past films, and it helped tremendously in many different ways, but it didn’t replace our previous techniques, which we still need to build a fully CG asset that is believable standing on its own as well. It was necessary to treat each shot individually.”
—Robert Weaver, VFX Supervisor, ILM
The de-aging of Ford was accomplished through machine learning and a variety of techniques known as ILM FaceSwap.
When de-aging Mads Mikkelsen, a different creative choice was made. “Mads as a younger man is fuller in the face, which didn’t feel right for the character,” Whitehurst reveals. “When we started looking at how we were going to FaceSwap Mads, we decided not to refer extensively to what Mads actually looked like when he was 30. It was more about figuring out what young Jürgen Voller looked like. That was done with a whole bunch of paint tests. We figured out the look for that, and then were able to roll that through the shots. It’s interesting because young Jürgen Voller kind of looks like young Mads Mikkelsen, but also not – that benefits the character.”
ILM created a fully CG environment for the scenes that take place in Syracuse, Sicily circa 214 B.C.
“It was amazing. We could rotate that plane through multiple axes a great deal, and it was full size. But it could also shake around, so we were inside a lot. The physicality was real. That made it tricky because the camera is shaking in the inside, or the plane shaking against the background that shouldn’t be shaking, but the believability of the motion that you get from the people inside of the plane is worth the pain because it sells it.”
—Andrew Whitehurst, VFX Supervisor
Filmmaker Mangold and frequent cinematographer Phedon Papamichael favor using wide lenses for closeup shots, resulting in more of the surrounding environment appearing in the frame. “There are probably more closeups of a 40-year-old Indiana Jones in this movie than there are in all of Indiana Jones and the Last Crusade,” Whitehurst observes. “The important thing was being able to accommodate how they wanted to shoot it. I don’t want anyone to ever feel compromised. If it means pushing in and getting close because narratively and dramatically that is going to have the most impact, we should do that.” Among the action sequences is Indiana Jones trying to outrun a subway train while riding horseback. “For the closeups, the train is CG. We had a beautiful set build at Pinewood of the whole station and a little bit into the tunnel. Then we had a long black tent that extended out so he could ride a horse down [the tunnel], so we were able to shoot the plates, but the train was always going to be CG. The actual tunnel itself is CG as well,” Whitehurst says.
Phoebe Waller-Bridge appears in the aircraft gimbal, which was shot on a bluescreen stage. (Photo: Jonathan Olley)
“We do a lot of dynamic studies and various simulations to see what our perceptions are. A lot of times in our work, things that are physically correct are not what we’re looking for, so we need to augment them. But in the case of the airplane and all of the atmospherics, James [Director Mangold] had a clear idea in his mind as to what these atmospheric dynamics should be, and he essentially likened them to a waterfall flowing out into the past. That was our cue to build this massive storm and put the dynamics onto the planes that are believable… It was a fine balance to walk.”
—Robert Weaver, VFX Supervisor, ILM
Ford strikes a classic pose while Cinematographer Phedon Papamichael scouts a set. (Photos: Jonathan Olley)
A signature stunt is Indiana Jones leaping from tuk-tuk to tuk-tuk during the street chase in Tangiers. “That was shot in Morocco,” Whitehurst reveals. “We have a guy in a tuk-tuk who will then jump into the other tuk-tuk. We did a CG takeover of the tuk-tuk that he leapt from and have a CG van colliding into it.” Stunts were a crucial partner. “I’ve worked with Ben Cooke, who was Supervising Stunt Coordinator, a couple of times. Dan Bradley is an incredibly experienced second unit director. Once we figured out where we wanted to shoot the tuk-tuk, which was Fez in Morocco, Dan went there. The dialogue needed to happen, so we roughly knew how long the sequence had to be, as well as where we were going in and out of it. But the script didn’t describe specific action beats. Dan walked around Fez and said, ‘We could do this with the tuk-tuks right here.’ On the basis of that, we were able to previs the sequence. We scanned and photographed Fez and the vehicles down the whole route because we knew that there was going to be process back at Pinewood for all of the dialogue between our principals. Because everything was previs’d, we knew what the angles were going to have to be. We decided to put the edit together as we were shooting the process work so we knew that everything was going to cut together okay. Then it’s a case of paring things down in the edit and refining until we get the completed sequence.”
Frequent collaborators filmmaker James Mangold and Cinematographer Phedon Papamichael enter the world of Indiana Jones together for the first time. From left: Production Designer Adam Stockhausen, Mangold and Papamichael on set. (Photo: Jonathan Olley)
ILM was also responsible for the third act. “There are some fantastic escapades that happen with the airplane going through the eye of the storm and coming into 214 B.C.,” Weaver states. “They did an amazing job on set shooting a partial version of the airplane, and we would add an extended version of that through CG at various times. There were other times where things couldn’t be filmed practically, and it would have to be completely CG, partially airborne and jumping out.” The proper weight of the aircraft had to be conveyed to add to the sense of peril. “We do a lot of dynamic studies and various simulations to see what our perceptions are,” Weaver remarks. “A lot of times in our work, things that are physically correct are not what we’re looking for, so we need to augment them. But in the case of the airplane and all of the atmospherics, James had a clear idea in his mind as to what these atmospheric dynamics should be, and he essentially likened them to a waterfall flowing out into the past. That was our cue to build this massive storm and put the dynamics onto the planes that are believable, getting enough wind flex that is believable but doesn’t feel like those wings are going to rip off at any minute. It was a fine balance to walk.”
“I always push to get as much as we possibly can in-camera, particularly if it’s something that is interacting with the characters because you get better performance. The physicality is there because it’s real. If you can’t do that, we try to find reference that is as close as we can get to that and then analyze it even for the small things.”
—Andrew Whitehurst, VFX Supervisor
A rig was built for the interior of the plane by the special effects team led by Alistair Williams. “It was amazing,” Whitehurst explains. “We could rotate that plane through multiple axes a great deal, and it was full size. But it could also shake around, so we were inside a lot. The physicality was real. That made it tricky because the camera is shaking in the inside, or the plane shaking against the background that shouldn’t be shaking, but the believability of the motion that you get from the people inside of the plane is worth the pain because it sells it. Then that helps when we cut to the exteriors and have felt this violence, and now we can see that in the wider shots.” Practical elements are essential even when shots become fully CG. “I always push to get as much as we possibly can in-camera, particularly if it’s something that is interacting with the characters because you get better performance. The physicality is there because it’s real. If you can’t do that, we try to find reference that is as close as we can get to that and then analyze it even for the small things. For example, when the hangar doors open, we see the bomb for the first time and the engines fire up. One of the things with old airplane V engines is when the propellers start to spin, too much fuel is going into the engine because it’s not up to speed yet, so you get this big blast of flame that comes out through the exhaust. We did that, and it looks cool as well. It adds that extra flavor and verisimilitude to it that there is a bit more texture,” Whitehurst says.
Cinematographer Phedon Papamichael prepares a driving scene on a bluescreen set. (Photo: Jonathan Olley)
Combining reality and fantasy is a staple of the Indiana Jones franchise, such as having a portal appear in the sky for Indiana Jones and the Dial of Destiny. “Indy films are slightly earthy, grubby-real, but then have an element of the supernatural on top,” Whitehurst observes. “We stylistically absolutely had a precedent in the previous films. One of the things that James and I talked about a lot was a Douglas Trumbull cloud tank, where it’s something that feels physical and real but has an interesting aesthetic quality to it as well. Certainly, we were looking at that when designing the inside of the portal as we fly towards 214 B.C. and when we come out of the other side. We designed something that had an interesting physicality to it so it felt like something that nature might produce. We had these arms wrapping around the horizon as if it was coming to Earth trying to grab [it], with this portal in the center of it. There was an oddness to it. But we were also referencing actual cloud formations and odd storm footage a lot to get the movement and the shapes within that form. There was an overall sculptural design aspect to it that we wanted it to narratively feel like, which was threatening, but also getting that naturalism into it that is very Indy.”
Interestingly, Ford still fits into the original costume and has remained extremely fit. (Photo: Jonathan Olley)
A complete CG environment build was the Sicilian city of Syracuse circa 214 B.C. “It was down to the level of being able to fly through the streets,” Weaver reveals. “Basically, we wanted to take away any limitations that James may have in telling the story. Wherever he wanted to put the camera, it was going to be completely believable and period appropriate.” In total, 2,350 visual effects shots were created by ILM, Rising Sun Pictures, Important Looking Pirates, Soho VFX, Midas VFX, The Yard VFX and Crafty Apes. “The tricky thing with a movie like this is, there were so many locations and numerous times where every single scene is something new,” Whitehurst states. “Somebody asked me, ‘What was it like to make this?’ It was like doing 12 commercials simultaneously, and a lot of them are very different. That’s the hard part. There is a little action beat towards the end when Helena is on a motorbike chasing Voller. I loved it when we did the previs. I thought it was going to be a fun moment. We shot it with Phoebe Waller-Bridge on a soundstage. She was awesome, and we blasted her with water and wind all day long. Phoebe just did it and completely sold it. Also, there is an awesome shot with Phoebe right under the tail of the plane as it’s bouncing down the runway, and you feel it slam down just next to her. I love that whole beat. It feels exciting, present in the action. The performance is great, and the work is beautiful.”
By TREVOR HOGG
Images courtesy of Universal Pictures.
Coming into contact with the Question Boxes enables characters to transform and acquire different superpowers.
Claiming the crown of the top-grossing video-game adaptation of all time is The Super Mario Bros. Movie, with a worldwide box office totaling $1.3 billion, nearly three times what was earned by second-place Warcraft. The overwhelming success resulted from Nintendo partnering with filmmakers Aaron Horvath, Michael Jelenic and Pierre Leduc, screenwriter Matthew Fogel, Illumination Mac Guff and Universal Pictures to produce a story that at times literally pays homage to the source material while also pushing the boundaries of expectations, as demonstrated by the reptilian villain Bowser channeling his inner Ozzy Osbourne and Elton John to sing heartfelt songs dedicated to the unreceptive Princess Peach.
Lumalee is a volumetric character, so classic shading techniques could not be utilized.
In the middle of the zany craziness is Illumination veteran Milò Riccarand, who was the Head of CG on the project and has seen the evolution of effects utilized by the Paris-based animation studio since Despicable Me became a global sensation. “The way we use effects has changed hugely because technology and software has evolved a lot,” Riccarand states. “We have a huge render farm. Directors want something bigger, more realistic and interesting. That’s what I love about it.” When doing simulations in animation, it’s all about achieving the proper balance between plausibility and stylization. “We develop a lot of proprietary software and jump at the possibility to do what we need in order to achieve the look that the director wants, whether it be physical or cartoony,” Riccarand says. Sequences were mapped out using storyboards and previs. “We did two passes of previs, one animated and the other with characters. This is because the interaction between the characters, the cast and camera can be tricky, as their relationship with each other needs to be like a dance in order to get a good position.”
By combining detail and stylization, the fur and skin of Donkey Kong appears to be realistic and appealing.
There is no shortage of characters with short legs in The Super Mario Bros. Movie, which stems from respecting the original silhouettes and poses of the original character designs. “To be fair, it’s not just the short legs,” Riccarand remarks. “The legs can be short in one and become long in another one. A principle of animation is squash and stretch. That can come across as being elastic, which is hard on CFX because you want to do something that is physically realistic, so you have to find a way to do that.” Mario transforms whenever he makes contact with Question Boxes, which allow him to acquire various powers. “It’s a joint collaboration between effects, compositing and lighting departments. We are doing a lot of work in Nuke in order to have particles flowing on his face or forming a force field. Numerous layers had to be comped and lit,” Riccarand says.
One of the most complex sequences and environments was executing the cosmic car chase that takes place on the Rainbow Road.
A chase sequence takes place on a prism-colored cosmic transitway. “The Rainbow Road was a complex challenge for us,” Riccarand notes. “The camera had to chase all of the characters, which was difficult technically. The road was a volumetric entity with clouds inside. To make everything look lit [was challenging]. It’s a rainbow, so it’s not something solid. Our render department helped us to sort it out and be able to render that.” Supercharged vehicles add to the excitement. “Those were a lot of fun!” Riccarand laughs. “This is not a live-action movie, so we can do what we want with the characters. You get crazy moments with the cars. The Koopa General drives this huge car that takes up all of the road.” The cloud elements assisted in conveying the proper size, scale and speed. “Each shot was set-dressed in order to get the best angle for the camera. We added tiny elements in frame that are going by quickly and cause you to say, ‘They’re going fast.’”
A signature environment is the Mushroom Kingdom that Mario and Princess Peach walk through.
Bowser lives on a volcanic world and resides in Lava Lake Keep. “It was not scary to do because our proprietary software helped us to deal with the smoke,” Riccarand states. “But when it comes to scale and a thousand shots to do, we had to be smart to find a good solution that doesn’t cost too much. You can’t have multiple simulations rendering overnight to have them ready for the day after. The lava had to be stylized to work with the characters in the scene. It was a bit more cartoony, but with realistic lighting.” A staple of video games is to have a world filled with giant mushrooms. Explains Riccarand, “People love that! The Mushroom Kingdom was a nice set to do. There was a lot of modeling and procedural environments in order to do the set extensions. It is something iconic from the game, so the viewer knows what they are looking for; so we needed to pay attention in order to not disappoint them.” Scenes in World Bowser take place at night, while the Mushroom Kingdom action unfolds in the daytime. “Night sequences are easier to hide things with mist and darkness, and you can use high contrast lighting. When you’re outside in sunny daylight, it’s more complex to light the characters in a believable and interesting way.”
While following through with his nefarious plans, Bowser also has a habit of singing love songs dedicated to Princess Peach.
“Those [supercharged vehicles] were a lot of fun! This is not a live-action movie, so we can do what we want with the characters. You get crazy moments with the cars. The Koopa General drives this huge car that takes up all of the road. … Each shot was set-dressed in order to get the best angle for the camera. We added tiny elements in frame that are going by quickly and cause you to say, ‘They’re going fast.’”
—Milò Riccarand, Head of CG/FX Supervisor, Illumination Mac Guff
The simulations for lava had to be stylized to ensure that the environments did not clash with the cartoony character designs.
Some scenes are a direct homage to actual gameplay. “That was fun because the idea was to take something that exists in the game, however, then put it into a more cinematic environment,” remarks Riccarand. “It had to be respectful but on a bigger scale.” There was constant feedback from Nintendo, with signature elements like Question Boxes being incorporated into the storytelling. “The Question Boxes are opaque with a little light source inside. They are the same design as Nintendo, but had to have some magic in order to have them fit into the movie.” Something not part of the gaming experience was having Bowser sing in concert-inspired settings and lighting. “That was one of the craziest ideas! It’s like a jazz concert, but him singing to his girl. We have some good key lights and lots of reference from jazz concerts. The idea was to put the character in the light and simply have him singing. One of our co-directors, Pierre Leduc, previously was our animation director on a lot of movies, so his experience helped us in getting the proper silhouettes and poses.”
A lot of fun was had in getting to design, create and execute the various vehicles that could not exist in a live-action movie.
“[The volcanic world of Bowser] was not scary to do because our proprietary software helped us to deal with the smoke. But when it comes to scale and a thousand shots to do, we had to be smart to find a good solution that doesn’t cost too much. You can’t have multiple simulations rendering overnight to have them ready for the day after. The lava had to be stylized to work with the characters in the scene. It was a bit more cartoony, but with realistic lighting.”
—Milò Riccarand, Head of CG/FX Supervisor, Illumination Mac Guff
A scene from the commercial that begins the movie.
Simulations such as the water from the bursting pipes in the bathroom are more driven by comedic rather than realistic timing. “In two frames you can go from no water to a huge splash of water,” Riccarand observes. “It’s interesting because that’s when effects works with animation. It’s a joint venture between the two. Before doing the effects, the animator will say, ‘I would love to have a splash here and there.’ Afterwards, in effects, we try to make that happen.” Talking in an infantile voice and expressing pessimistic views is the imprisoned star known as Lumalee. “It’s a volumetric character, so it’s not the classic shading. This type of character is difficult to animate because it’s a star shape. In the beginning you’d think, ‘I will not be able to do some extreme movements.’ But you absolutely can. In the end credits, Lumalee is playing the saxophone.”
Actual gameplay was incorporated into the storytelling and upgraded with a cinematic sensibility.
“The Mushroom Kingdom was a nice set to do. There was a lot of modeling and procedural environments in order to do the set extensions. It is something iconic from the game, so the viewer knows what they are looking for; so we needed to pay attention in order to not disappoint them.”
—Milò Riccarand, Head of CG/FX Supervisor, Illumination Mac Guff
Nintendo was closely involved with the production to ensure that the aesthetic of the video game was maintained.
Grooming for the moustaches of the Mario Bros. and Donkey Kong was achieved through proprietary software. “Donkey Kong has numerous layers of fur,” Riccarand explains. “It has to have a lot of detail to feel real, but it doesn’t need to be realistic. There is a closeup shot of the hand of Donkey Kong, and you see one of his fingers at a huge scale. I couldn’t do the skin realistically, as it needs to be pleasant to look at. We did a lot of tests to produce something that looks real and that has good subsurface scattering, but the texture and painting of it is appealing.” Plenty of destruction occurs in the final battle that unfolds in Brooklyn. “FX Supervisor Simon Pate drove this sequence. You have the superpowers of the characters plus various simulations everywhere, like smoke and dust. You want to do things that are not too scary but good to see and impressive. It can be hard to do a battle like that in the daytime, but the lighting helps a lot.” Advancements were made in the area of compositing. Riccarand notes, “We did a lot of new work on lens aberrations, and the in-house renderer has been greatly improved because with each movie we want to render more and more. Technology gets better, and people ask for more! But that’s what makes this type of job interesting. You’re always trying to find the best and new ways to do things.”
By TREVOR HOGG
Images courtesy of Lucasfilm Ltd. and Aardman.
Lead Animator and Key Stop Motion Animator Laurie Sitzia animates the interaction between Anni and Kalina during the tunnel sequence.
If you want to see global talent practicing the art of storytelling through animation, the recent trend toward anthologies on streaming services such as Netflix and Disney+ is providing a platform for viewers to do so. After using a series of nine anime shorts to showcase the influence of Japanese cinema on George Lucas and the Star Wars universe, Lucasfilm has expanded the worldview for Star Wars: Visions Volume 2 to include U.K. stop-motion maestros Aardman for I Am Your Mother. The tongue-in-cheek story, directed by Magdalena Osinska, cleverly shifts the dysfunctional, volatile father-son dynamic to a nurturing mother-daughter class conflict and transforms a beloved supporting character into a self-promoting salesman.
One of the most difficult physical actions to pull off was getting the puppets to hug each other in a believable manner.
“The theme of family is something I love to explore in my work, and a lot of the characters in ‘I Am Your Mother’ are inspired by people I’m familiar with, so this largely informed the visual aesthetic,” Osinska explains. “I intentionally wanted the main characters to be alien, as that was inspired by my arrival in the U.K. as a Polish national. I was an alien, so our characters are literally aliens! In terms of animation, I was keen for a female animator to develop these mother-daughter relationships. Our lead animator, Laurie Sitzia, developed and tested Anni and Kalina. We had in-depth conversations with Laurie about mother-daughter relationships, about Anni and Kalina’s background, motivations and emotions throughout the film. We talked about the closeness and warmth of their relationship, but also that relatable feeling of being embarrassed by your parents, and being able to show these layers of feelings through the puppet’s performance.”
As for the visual language, it was important to retain a warmth between Anni and Kalina despite the rising tension between them. “Such subtle emotional shifts needed to be evident, and I worked hard to put them into the finer nuanced details of their facial expressions,” states Laurie Sitzia, Lead Animator and Key Stop Motion Animator. “This was especially important in the cockpit sequence where we watch Anni’s frustrations escalate until she reaches boiling point with an outburst that shocks and hurts her mother. She immediately regrets it but can’t take it back. Kalina contains her hurt to protect her daughter, but we still needed to see a glimpse of it.”
Concept design of the spacetug, which serves both as a home and spacecraft for Anni and Kalina.
“I was keen for a female animator to develop these mother-daughter relationships. Our lead animator, Laurie Sitzia, developed and tested Anni and Kalina. We had in-depth conversations with Laurie about mother-daughter relationships, about Anni and Kalina’s background, motivations and emotions throughout the film. We talked about the closeness and warmth of their relationship, but also that relatable feeling of being embarrassed by your parents, and being able to show these layers of feelings through the puppet’s performance.”
—Magdalena Osinska, Director
Director Magdalena Osinska prepares the spacetug for the complex tunnel shot that occurs during the race.
Reflecting how the relationship evolves between Anni and Kalina are the book-ending hugs. “Hitting the perfect pose in a hug in stop-motion is always a challenge and much harder than it looks. Puppets don’t usually fit together very easily. You’d never know it, but Anni’s head is actually cheated forward and is floating on a rig about 10mm forward of Kalina’s face for the last few frames of their final pose. It was the only way to get their tilted heads to look like they were making contact with each other. The rig had to be cheated in mid-shot, quite a fiddly operation,” Sitzia says.
Lead Animator and Key Stop Motion Animator Laurie Sitzia found it important to display subtle emotional shifts to reflect the evolution of the mother-daughter relationship between Kalina and Anni.
There cannot be any Star Wars story without droids, and the faithful malfunctioning companion Z1 displays the characteristics of an accordion-expanding sausage dog. “[Animation Director] Steve Cox developed and 3D printed the prototype for Z1 himself. As a huge Star Wars fan, he was keen to ensure that the final puppet would feel totally in keeping with the other astromechs, and Z1 really does,” Sitzia notes. “He was fun to animate, too, with elements like the ears and legs folding in/out and the slinky body allowing for playful dips and dog-like movement while still retaining a very droid-like quality. Z1 could also stretch out long which worked so well for his fall from the table in the opening sequence and later on when he is swinging from the spacetug. His head tilt also gave him that inquisitive dog-like expression. I remember multiple pipes and tubes being ordered in a search for the perfect part to create his concertina mid-section, as making this from scratch would have been really tricky. It was quite a process to find the right thing, and Steve tells me it ended up being a motorbike shock absorber cover that with a few tweaks worked out perfectly!”
Color keys used to map out the various sequences.
“We used ZBrush and Maya for modeling, and Maya for animation,” Osinska remarks, “Nuke for compositing and face tracking, Houdini for rendering. And I’ve been told by Signe Tveitan, our wonderful 3D modeler, that we used Karma and Solaris to calculate how the light interacts with the surface.” Fun was had with the visual effects. “I was keen for the look of the spaceships’ thrusters to be quite stylized and a mixture of 2D and 3D visual effects, a little bit comic, like Scott Pilgrim vs. the World,” Osinska says. “[Production Designer] Aurélien Predal created very cool 2D shapes for each ship that also reflected the character of their drivers. It was an interesting challenge to then translate it into 3D, and we actually spent quite a bit of time on it to find that perfect look. The other visual effects were screens and holograms. We played with the visibility, color, lines and static to make it work for our film and also within the Star Wars universe. Besides that, there were fireworks and lots of other atmospherics that made everything sit nicely together.” The sets of the city and school building were constructed entirely in 3D. “Usually, the foreground is physical, but the rest would be CG. Lighting the 3D parts played a huge role in bringing together these two techniques; this goes for the sets in general but also the mouths, and our Lighting Lead, Tessa Mapp, did an amazing job on it. A big bow to our CG and VFX Supervisors, Ben Toogood and Bram Ttwheam, who married the stop-motion with CG so beautifully,” Osinska says.
The main character lineup as conceptualized by Félicie Haymoz. From left to right: Z1, Kalina Kalfus, Annisoukaline ‘Anni’ Kalfus, Dorota Van Reeple and Julan Van Reeple.
One of the visual effects that had to be produced was a hologram of Wedge Antilles in the opening shot.
Appearing on screens throughout the city promoting the family pilot race and his own merchandise is Rebel Alliance X-wing pilot Wedge Antilles, voiced by the original live-action actor, Denis Lawson. “Anni and her mum come from a planet very, very far away, from an underprivileged place, but Anni is an extremely talented pilot, the first one from her planet to be accepted to the prestigious Hanna City Flight Academy,” Osinska states. “Her mum taught her piloting skills, for example, the Ryloth Roll maneuver, which then helps Anni win the race, but Anni doesn’t value it when we meet her at the beginning of the story. Instead, naturally, she is looking up to well-known heroes like Wedge Antilles. It was an absolute pleasure to work with Denis Lawson, and it didn’t take much convincing at all for him to play a slightly lighthearted and fun version of his character.” There is no shortage of Easter eggs, such as a Jawa causing technological havoc in the opening. “I love the Jawa, too! They actually appear one more time in the film, at the Academy Fair by a stall selling things ‘Touched by Luke Skywalker.’ The Jawa tries to steal Luke’s lightsaber just before we see another hand [Maz Kanata’s] grabbing it first. We wanted to put fans’ worries to bed by solving the big mystery of how Maz Kanata took possession of Luke’s lightsaber in The Force Awakens!”
Figuring out the key poses for the Anni puppet.
“It is really hard to choose one sequence as I loved each one of them, but if I must, I’d choose the tunnel scene after Anni and Kalina crash, both the exterior and interior,” Osinska remarks. “The reason being that it was quite challenging, and there was lots of problem-solving, which is what I love about stop-motion. The tunnel we built was only four meters long, but we needed to make it feel as if it goes on for hundreds of meters. DP Tristan Oliver, along with Motion Control Operator Adam Cook and Art Director Andy Brown, figured out a way of detaching the panels that are off-camera and moving them forward in front of the spaceship to achieve the feeling that it’s a very long tunnel with forks and turns. The precision of collaboration between the camera and set departments was faultless!”
There is no shortage of Easter eggs that can be found in each frame. If you look closely enough, you will see Luke Skywalker’s lightsaber.
Click here to watch a Lucasfilm Ltd. featurette on the making of Star Wars: Visions Volume 2: https://www.youtube.com/watch?v=0lHQA9yvJSI