Images courtesy of Paramount Pictures and Sega of America, Inc.
Jim Carrey reprises his role as the main antagonist, Dr. Ivo Robotnik.
MPC served as the main vendor with 1,200 to 1,400 shots involving character animation.
Successfully adapting video games has been a rare feat in Hollywood – and even more so to warrant a sequel – but filmmaker Jeff Fowler and Visual Effects Supervisor Ged Wright have come together once again to produce Sonic the Hedgehog 2 for Paramount Pictures and Sega Sammy Group. In the sequel, Dr. Ivo Robotnik (Jim Carrey) escapes from the Mushroom Planet and partners with an alien echidna called Knuckles (Idris Elba) to find a mystical emerald that has the power to destroy civilizations and defeat his hedgehog nemesis Sonic (Ben Schwartz) and the two-tailed fox Tails (Colleen O’Shaughnessey).
“After finishing shooting, we did a version of the film through Blender rendered in Eevee with motion blur, depth of field, proper tracks, and integrated the characters into the live-action photography. An in-house team of 10 to 14 people produced well over 12,000 versions of shots [within a period of 12 to 14 weeks] that went into editorial and allowed the creative team to come up with a cut of the movie that didn’t throw MPC off track. This was critical in allowing the film to get completed by the absolute skin of our teeth.”
—Ged Wright, Visual Effects Supervisor
For Tails, Visual Effects Supervisor Ged Wright wanted to make sure that his tails were present and created nice shapes.
“The thing that has drawn me back to this was Jeff Fowler, [producer] Toby Ascher, [executive producer] Nan Morales and all of the other people involved who made turning up to work feel like you’re making a movie with your friends rather than going into battle every day,” states Wright. “The first one made the storytelling side of things more straightforward [this time around] and created a certain number of parameters that you’re working within.”
“We ended up with 1,200 or 1,400 character animation shots, so those little furry dudes are in most of the movie. Certain things were reusable [from the first movie], like the look and quality of the fur. These characters have vast eyes, so a significant amount of work was spent improving the shading, look and feel of them as they communicate a great deal, and that work paid off in this film.”
—Ged Wright, Visual Effects Supervisor
The sequel was unable to outrun the pandemic. “We were making an animated film on a live-action schedule, which means rather than having three or four years we’re doing it in half of the time,” notes Wright. “Then all movie productions stopped. All of the big companies shrunk their staff, but [when all of the productions simultaneously started up again] then everyone wanted to hire back their same staff within the same six- to eight-week period. You had a huge amount of content, however, not enough people trained on how to do it. There is a massive skills shortage in the visual effects industry right now.”
Knuckles rarely opened his hand because it looked like he was wearing oven mitts.
Critical in being able to release Sonic the Hedgehog 2 on time was the emphasis placed on previs and postvis. “After finishing shooting,” comments Wright, “we did a version of the film through Blender rendered in Eevee with motion blur, depth of field, proper tracks, and integrated the characters into the live-action photography. An in-house team of 10 to 14 people produced well over 12,000 versions of shots [within a period of 12 to 14 weeks] that went into editorial and allowed the creative team to come up with a cut of the movie that didn’t throw MPC off track. This was critical in allowing the film to get completed by the absolute skin of our teeth.”
DNEG got the opportunity to explore the Mushroom Planet that was teased at the end of the original movie.
There were over 1,900 visual effects shots by MPC, DNEG, Marza Animation Planet and an in-house team, while Fish Flight Entertainment assisted with the previs and postvis. “We ended up with 1,200 or 1,400 character animation shots, so those little furry dudes are in most of the movie,” laughs Wright. “Certain things were reusable [from the first movie], like the look and quality of the fur. These characters have vast eyes, so a significant amount of work was spent improving the shading, look and feel of them as they communicate a great deal, and that work paid off in this film.”
“The focus was on making sure that the things that were in and around the humans had the most amount of real production budget spent on them. We wanted the production designer involved in designing the whole film because otherwise we would have had big chunks of the film where those decisions were being made by the wrong people. There is no other way to put it!”
—Ged Wright, Visual Effects Supervisor
A massive amount of blue electrical energy is generated by Dr. Ivo Robotnik (Jim Carrey) on the Mushroom Planet.
No matter the lighting conditions and position of the camera, Sonic, Knuckles and Tails always had to be recognizable. “If you photograph one of us at sunset, we look different, which is something we have come to expect,” says Wright. “But with iconic, stylized characters there is often an expectation for them to look consistent throughout the film, which is not a photographic reality. It was easy at first but got harder towards the end.” The original 2D character designs had to be adapted to work in 3D. “Sonic’s mouth had to be off to one side and generally on the camera’s side,” adds Wright. “For Knuckles, you rarely want to open his hand because it looks like he’s wearing oven mitts. You want to keep him on character and make sure that he feels strong and intimidating. For Tails, you want to make sure that his tails are present as part of his character and creating nice shapes. They can easily look as if they were dragged along the floor. Nobody wanted that.”
A ring portal opens with adversaries searching for Dr. Ivo Robotnik (Jim Carrey).
Driving the character animation was the voice cast. “A lot of the time, if the voice performance changes, then it feels like the actual physical performance needs to change, not just the lip sync,” notes Wright. “We were as diligent as possible to make sure that the voice performance was turned over as early as possible so that the animators could be sitting there working with it. We also did a similar thing with filming the actors while they were performing. We didn’t end up going through the process of doing any facial motion capture this time around because the characters are so wildly different and the amount of effort that goes into capturing that data felt like it was a diminishing return doing that.” Less was considered more with the lip sync. “They don’t have lips so it can feel like a latex mask moving around if you’re not careful,” observes Wright. “You want to be hitting the core shapes; however, focusing on properly enunciating each syllable is not the best outcome.” The process of getting the live-action and CG characters to interact did not greatly change. “We had the usual hit list of interactive items, like little sandbags that people can pick up,” says Wright. “We slightly moved things along from the first film, as far as on-set reference, which was more helpful. The most challenging interaction bits are when the characters are hugging them. Picking them up was more successful, because with the little bodies it’s easier to figure out what that interaction is going to be. To nail those interactive shots, you need to be doing a 3D representation of the human characters. We weren’t able to do that this time around because we simply ran out of time.”
Various Rube Goldberg traps were constructed out of mushrooms by Dr. Ivo Robotnik (Jim Carrey).
An emphasis was placed on getting practical elements. “For the snowboard chase we did a week-long shoot up in the Canadian Rockies that gave us a tremendous amount of material to inform that,” states Wright. “When they discover the Big Owl cavern, that was all CG because there are no human characters. The focus was on making sure that the things that were in and around the humans had the most amount of real production budget spent on them. We wanted the production designer involved in designing the whole film because otherwise we would have had big chunks of the film where those decisions were being made by the wrong people. There is no other way to put it!” Minimal greenscreen was utilized. “At the end of the film when they’re in the riverbed, rather than surround everything in greenscreen we chose a location that had a similar texture and feel,” adds Wright. “That’s a better approach rather than having to change absolutely everything.” Virtual production was part of the toolset. “We had the LED volume and used that to get the lighting in a better place on the set pieces and characters,” says Wright. “One example is when Robotnik is in the ‘mech head’ and has all of the electricity around him, he was actually in a LED volume. It was better to do the roto and extract him off something that was giving him all sorts of interesting lighting cues on his face and eyes rather than trying to light the actor and have a clean key to pull.”
“We had the LED volume and used that to get the lighting in a better place on the set pieces and characters. One example is when Robotnik is in the ‘mech head’ and has all of the electricity around him, he was actually in a LED volume. It was better to do the roto and extract him off something that was giving him all sorts of interesting lighting cues on his face and eyes rather than trying to light the actor and have a clean key to pull.”
—Ged Wright, Visual Effects Supervisor
Most of the interior of the giant mech robot was dark until the emerald electricity comes on and lights it up.
Knuckles and Tails can move at high speed like Sonic. “One of the core differences this time around is we had other characters in that heightened reality,” observes Wright. “As soon as you have two characters that are moving at the same speed, you almost don’t know that everything is in a heightened reality. There were a different set of parameters there. We had a couple of instances where we wanted to demonstrate that effect upon the rest of the world. One of them is when they’re fighting in the backyard and Robotnik spills his popcorn as they go into this heightened speed. We shot Jim on a super high-speed camera to get that. Most of the time it was two CG characters that are in that world, so you’ve got a lot more flexibility to alter and add things to be able to heighten those moments.” The speed trails were tricky. “They’re quite a graphic stylized element and self-illuminated, so it’s hard to get a sense of depth,” explains Wright. “You end up having to design them specifically for the shots to get the right look.” Driving everything was the sheer volume of the performance and character animation. “There is no shortcut for that. It takes time both for the animation team and the wonderful animation director that we had, Eric Guaglione, to come on and find that language,” Wright says, adding that the tonal variety of the narrative was an asset. “What I enjoyed about making this film was the possibility to lean into storytelling and the intimacy between characters while also having these big action beats. Often on movies you get to do one or the other. It’s unusual to be able to do both.”
Emmy Award winner Margaret Dean is the Head of Studio for SKYBOUND, the home of Invincible and The Walking Dead, and is responsible for the production of original content and studio operations. Inspired at an early age by dramatic black-and-white films from the ’30s and ’40s, Marge discovered the moving image as an art student and delved into her passion for visual storytelling through animation. Known for building studios and animation pipelines, Marge has been responsible for the design or re-design of several studios, and as President of Women in Animation, she is a recognized global leader in advancing women in the field of animation.
As the head of a studio, you are responsible for creating and nurturing the culture, where everyone feels they belong. What I work to do is instill a space that embraces mentoring – not only to expand and diversify the workforce, but because it lends a strong sense of inspiration and community. The flow of shared experience, knowledge and support is critical to building a collaborative environment. Women in Animation’s mentoring program is our most successful initiative and demand continues to grow. What is truly exciting is that our formal mentorship matches planted the seeds to grow new networks. I don’t think you can make your way in this often-challenging industry without people who share their lessons learned, foster your talents and provide encouragement – and as someone who benefitted from great mentors, I’m proud to be in a position to pay this forward.
I was a single working parent early in my career, and the issue of balancing a career and family is highly personal. I was able to figure out a way where I did not have to sacrifice one for the other – but so many parents, particularly women, feel backed into making that tough choice. Women in Animation is focused on the enormous need to provide job flexibility and more support for working parents and caregivers. The number of women who have had to walk away from their jobs because of the high cost and lack of childcare and too few options for hybrid work schedules is startling – even more so due to COVID. We need to do better and we highly encourage partners to join our advocacy.
There is an enormous need to provide job flexibility and more support for working parents and caregivers.
Women in Animation wants to achieve 50/50 parity for women and underrepresented genders in the animated creative workforce by 2025 – and we believe the industry is already committed to that goal. What we’re focused on now is how to make it easier to do. We’ve created a searchable database of more than 6,000 women/diverse gender professionals to dispel that myth of ‘I can’t find anyone to hire.’ We are also working on breaking down barriers to build the pipeline, including creating pathways that do not require going to an expensive art school or college. I’m very excited about our ongoing work with the California Board of Education and The BRIC Foundation to build out training and apprenticeship programs to prepare people for a multitude of jobs and enrich our talent pool.
A time jet getting ready to jump across decades. Height references included weather balloons and footage of Felix Baumgartner’s record freefall as a reference point for the curvature of the Earth.
Netflix’s The Adam Project is a family drama embedded in time-traveling, world-saving science fiction. To help with the sci-fi aspect, Overall Visual Effects Supervisor Alessandro Ongaro tasked DNEG London with conjuring up unique-looking wormholes, decades-hopping “time jets” and digi-double “time soldiers.”
The wormholes were a key visual effect in The Adam Project. It was decided that the wormholes would have some kind of funnel in the middle through which a jet could disappear. Then each wormhole had to disappear after the jet had gone through.
In the Shawn Levy-directed story, Adam (Ryan Reynolds) is a time pilot from 2050, on an illegal mission to rescue his wife Laura (Zoe Saldana). He crash-lands in 2022, and, as he heals himself and fixes his jet, enlists the help of his 12-year-old self (Walker Scobell). It is the year following the death of their father, Louis Reed (Mark Ruffalo), from which they have never recovered emotionally. Reed was a brilliant quantum physicist who accidentally invented time travel, which has been used by Reed’s former partner, Maya Sorian (Catherine Keener), to enrich herself and create a dystopian future. To undo this terrible timeline, Adam and his younger self travel to 2018 to seek out the help of their younger father. Once there, they must also find a way to make peace with his future absence. Jennifer Garner also stars in the film, portraying Adam’s mother Ellie Reed.
“The wormhole was the main pivot of the movie. DNEG has a great history of doing wormholes and black holes for movies, so we had the tricky task of coming up with something innovative that had not been done before.”
—Mike Duffy, VFX Producer, DNEG
The time jets needed to project velocity and urgency when leaving Earth’s environment. Vibration and camera shake were added to Adam’s jet to give the chase a more frantic feel.
The single most important effect in The Adam Project was, arguably, the wormholes that the time jets create for time jumping. Explains DNEG VFX Producer Mike Duffy, “The wormhole was the main pivot of the movie. DNEG has a great history of doing wormholes and black holes for movies, so we had the tricky task of coming up with something innovative that had not been done before.”
“There was also one instance where we had to digitally replace young Adam’s legs. There is a scene where they are plummeting so quickly towards Earth that they had to rig young Adam on wires to make him appear weightless, but to add to the comedy value we added [a boy’s] legs dangling behind him, which really helped to sell the gag of those couple of shots.”
—Alexander Seaman, Visual Effects Supervisor, DNEG
Time soldiers drop from the jet of Sorian (Catherine Keener) and begin pursuit on hoverboards. Stunt actors in time soldier outfits were digitally scanned on set, with the data used to create digital versions.
Older Adam (Ryan Reynolds), younger Adam (Walker Scobell) and Laura (Zoe Saldana) are in a classic GMC Jimmy truck as they flee the time soldiers through the forest.
DNEG Visual Effects Supervisor Alexander Seaman recalls, “The creation and animation of the wormhole was actually quite a simple 3D task with some fairly rudimentary 3D volumes and shapes which could then be easily animated to be scaled up and down.” The real challenge was working out the unique design of the wormhole, which had to disappear after the jet flew through it. Seaman adds, “Director Shawn Levy guided us towards optical flares and lens distortions as references. We looked at the way that different prisms behaved, and then ultimately decided that the wormhole needed to have some kind of funnel in the middle of it which the ship could disappear through.”
DNEG augmented the time jets’ original designs and designed their cloaking effect. The spectacular dogfights of the swift and agile jets – piloted by older Adam or Sorian’s head of security, Christos (Alex Mallari Jr.) – were “a fairly complex process,” according to Seaman. “We were provided with a few rounds of pre-visualization, as well as some aerial footage and plates of mountains from above the clouds in North America, which we then repurposed to create the same camera angles and speeds. Where this wasn’t possible, we digitally created the parts of the environment, including a digital valley, a digital rock surface and a digital cave. We would then block all of that out and animate the chase. Next, we would assess whether the scene was thrilling or fast enough and augment each shot accordingly. On some occasions the aerial footage wasn’t fast or high enough, so we had to look for ways to re-speed the plates we already had or simply replace it with a CG version of the same thing from a different perspective.”
A time jet in pursuit of the GMC Jimmy in the forest. DNEG had to replace forest, build forest extensions and blend it all with existing plate material.
Continues Seaman, “Once we had established how high and far away from the Earth they wanted to put the chase, we looked at references such as weather balloons and Felix Baumgartner’s world record freefall [in 2012]. This footage proved useful as a reference point for the curvature of the Earth and sense of serenity at that altitude. We also used the Hubble Space Telescope footage as a reference for how the clouds cast shadows onto the oceans and land masses. We then used some of our own proprietary tools to generate some of the atmosphere effects that you see from the Earth.”
The time jets needed to project velocity and urgency once they were leaving the Earth’s environment, at the edge of space where everything is calm and serene. Seaman explains, “We had to use a few film-making tricks, including adding a certain amount of vibration and camera shake to Adam’s jet in particular to give it a more frantic feel. We also used an element of ‘space dust’ through the air, which gave a sense of traveling through something that we could justify as water particles. Anytime that the jets got close to each other, we could justify haze or vapor from the jets washing past and over them. When the Sorian jet starts shooting at the time jet, we’ve got the tracers from the guns, which are able to convey a sense of speed and danger as well.”
The older Adam and younger Adam in the cockpit at a dramatic yet comical moment during a chase sequence with Sorian’s jet pursuing them.
Adds Seaman, “One of the features of that sequence was a huge Earth that the environment team did a really good job of creating. If we had kept Earth in the correct position throughout that sequence, you would have only seen it in a couple of shots, so we had to really cheat where the Earth was in relation to the camera in order to keep some kind of visual anchor point as to where they were going and how fast they were moving. In some cases, we even cheated the scale of the Earth to make it feel like they were traveling faster away from it.”
“Once we had established how high and far away from the Earth they wanted to put the chase, we looked at references such as weather balloons and Felix Baumgartner’s world record freefall [in 2012]. This footage proved useful as a reference point for the curvature of the Earth and sense of serenity at that altitude. We also used the Hubble Space Telescope footage as a reference for how the clouds cast shadows onto the oceans and land masses. We then used some of our own proprietary tools to generate some of the atmosphere effects that you see from the Earth.”
—Alexander Seaman, Visual Effects Supervisor, DNEG
DNEG also worked on the truck chase sequences that involved the Adams and Laura fleeing Sorian in a classic GMC Jimmy. They drive along and through a forest with Sorian’s jet and flying time soldiers in hot pursuit. DNEG had to replace forest and build forest extensions and blend it all with existing plate material. Explains Seaman, “There was a real forest complete with various types of vegetation and a dirt road running through the middle of it. To make the sequence more thrilling, they wanted to replace the dirt road and instead show the [truck] weaving to and fro between various bushes. We used the on-set reference for what the trees and plants looked like and then had a very talented modeling team recreate the same vegetation, as well as a very good environment team effectively fill in the forest for the pieces that were absent.”
Older Adam pilots a time jet through a CGI canyon. Much of that environment was digitally created, including a digital valley, a digital rock surface and a digital cave.
When the time soldiers fly through the forest in pursuit of the truck, cutting between trees while riding hoverboards, it is reminiscent of the speeder chase through the forests of Endor in Star Wars: Return of the Jedi. Responds Seaman, “We also felt that there were some influences by Return of the Jedi in the style of the forest and the speed at which the heroes were being chased through it. But this wasn’t something that we were asked to match or reference.”
The time soldiers often required digi-doubles. Seaman notes, “There were real-life stunt actors in time soldier outfits that they digitally scanned on set. They then sent us the data, and we recreated digital versions of the stuntmen in their costumes. We did a couple of varieties of them holding their weapons in different ways with slightly different imperfections to their armor. We then modeled and rigged the hover platforms to their feet. The team had done a really good job of filming the stunt performers through the sequence, but sometimes they were not going quite fast enough. So, in a lot of cases, we digitally re-produced them, using the footage as a reference to see how they moved and how their costumes reacted to the environment, but ultimately digitally replacing them to make them go faster.”
Digi-doubles were also used as replacements for actors in aircrafts, especially during the flying scenes. Comments Seaman, “There was also one instance where we had to digitally replace young Adam’s legs. There is a scene where they are plummeting so quickly towards Earth that they had to rig young Adam on wires to make him appear weightless, but to add to the comedy value we added [a boy’s] legs dangling behind him, which really helped to sell the gag of those couple of shots.”
“We had to use a few film-making tricks, including adding a certain amount of vibration and camera shake to Adam’s jet in particular to give it a more frantic feel. We also used an element of ‘space dust’ through the air, which gave a sense of traveling through something that we could justify as water particles. Anytime that the jets got close to each other, we could justify haze or vapor from the jets washing past and over them. When the Sorian jet starts shooting at the time jet, we’ve got the tracers from the guns, which are able to convey a sense of speed and danger as well.”
—Alexander Seaman, Visual Effects Supervisor, DNEG
Closeup of a time jet in the canyons. DNEG was tasked with adding extra detailing to the time jet exteriors and their cockpits.
DNEG has a history of creating wormholes and black holes, and was challenged to come up with something innovative for The Adam Project.
DNEG contributed more than 350 shots spread over eight sequences, out of the 1,432 total VFX shots in the movie. The other visual effects studios working on The Adam Project included Scanline VFX, Lola VFX, Supervixen Studios and Clear Angle Studios, and there was an in-house VFX team. Cameron Waldbauer was Special Effects Supervisor.
Images courtesy of Netflix and Alt.vfx, except where noted.
Benedict Cumberbatch portrays malicious rancher Phil Burbank while Kodi Smit-McPhee takes on the role of his brother’s effeminate stepson Peter Gordon in The Power of the Dog. (Image courtesy of Netflix)
With the exception of the Red Mill Inn, the town of Herndon, Montana was CG.
While movie critics praised the performances of Benedict Cumberbatch, Kirsten Dunst, Jesse Plemons and Kodi Smit-McPhee in The Power of the Dog, nothing was ever mentioned about the visual effects work supervised by Jay Hawkins (Wolf Like Me) and produced by Alt.vfx, which amounted to over 200 shots. The lack of awareness and recognition is not something that bothers Hawkins. “People ask me, ‘What did you do on The Power of the Dog? That’s not a visual effects film.’ I show them the breakdown and they’re always quite surprised, which makes me happy.” Digital doubles were made to increase the herds of cattle, set extensions were required for the ranch, a town had to be digitally constructed, CG wounds were placed on animals and actors, and the outline of a dog was etched into the rolling hills.
“People ask me, ‘What did you do on The Power of the Dog? That’s not a visual effects film.’ I show them the breakdown and they’re always quite surprised, which makes me happy.”
—Jay Hawkins, Visual Effects Supervisor
Based on the novel by Thomas Savage, the cinematic adaptation by Jane Campion (Bright Star) is set in 1920s Montana, where Phil (Benedict Cumberbatch) wages brutal psychological warfare against the new bride (Kirsten Dunst) and stepson (Kodi Smit-McPhee) of his brother, George (Jesse Plemons), on their family ranch. New Zealand doubled for Montana during principal photography, with cinematography by Ari Wegner (Lady Macbeth), who received an Oscar nomination for her contributions. Campion had done some extensive scouting in Montana, where Thomas Savage lived. “I thought it was going to be alpine trees and big logging forests,” recalls Hawkins, “but that wasn’t the look or terrain that Jane was going for. She wanted vast and open fields, which we found in New Zealand. In terms of changing New Zealand for Montana, we weren’t doing any of that.”
Part of a ranch house was built on a farm in the Hawkdun Range in Maniototo by Production Designer Grant Major (Mulan). “The house had to service all of these different story beats and lines of sight,” explains Hawkins. “On one of the early recces there was a small-scale 3D-printed model of the house. We walked out to the location, which wound up being used for the film, placed and rotated the model around in the light, and started thinking about where the rest of the buildings should be placed.” Extensive previs was utilized for the interior shots as there was not a budget for big translights, and the preference was to avoid greenscreen or bluescreen. “We came up with this idea of vinyl backdrops [of which we had three],” notes Hawkins. “I did previs for what we would shoot outside of the window, what would be the set’s field of view and what would be the set’s horizon, given that we had a limited size for the backdrop that could be used outside of the window.”
Rocks were digitally constructed to integrate the railroad tracks into the landscape.
A drone captured aerial plate photography of the ranch. “We didn’t do a whole lot of drone footage on the show, and, on that day, it was the arrival of the governor for the dinner scene,” remarks Hawkins. “We had to make sure that the drone stayed at the right altitude so you could see enough of the top of the house, given the fact only half of it had been built. Some practical snow blankets were laid down while the cowboys are running to the front door. In the rough cut before seeing the shot with the full house and snow, we weren’t sure, but when we started adding snow and rendered the house in post, it came alive.” Grant Major produced concept art for the fictional setting of Herndon, Montana. “While scouting, we couldn’t find something that spoke to Jane,” adds Hawkins, “so the only physically constructed building was the exterior of the Red Mill Inn, which was located a couple hundred meters from the ranch house. The rest of the town is CG.” Drone photogrammetry scans were taken of the ranch house and Red Mill Inn. “When we went to rebuild it,” he says, “we were able to take the real-world measurements of the photogrammetry scan, marry those with the original concept and build from there with texture reference from the practical build.”
“[T]he only physically constructed building was the exterior of the Red Mill Inn, which was located a couple hundred meters from the ranch house. The rest of the town is CG. When we went to rebuild it, we were able to take the real-world measurements of the photogrammetry scan, marry those with the original concept and build from there with texture reference from the practical build.”
—Jay Hawkins, Visual Effects Supervisor
Visual Effects Supervisor Jay Hawkins thought the terrain was going to be alpine trees and big logging forests, but director Jane Campion wanted vast and open fields.
In two different scenes, the shape of a dog was incorporated into the rolling hills. “There was a lot of time spent rotting in vans and discussing things,” recalls Hawkins. “One evening, we climbed this big hill which was being considered as a possible location for the picnic scene. We were watching the sunset on the hills behind the house, and there were these really long shadows that were winding around them. One looked like the face of a witch and another resembled a tiger. Ari and I were sitting in wonder watching the nose of the witch go from being perfect to abstract. We thought, ‘What if the dog was a shadow puppet like that on a hill?’ When I got back from that recce, I worked with my concept artist on a bunch of different versions of the dog. Maybe it would be sculptural or embedded into a rock formation in the hills. However, the shadow throw was so strong and powerful that Jane loved it. We kept refining that concept. You’re trying to sculpt a ridge line that is also a shadow receiver of the ideal shape that you want when the sun is at a certain part of the day. In the end it was a fully 2D effect.”
While the car was practical, the train was a CG asset. “The carriages were based upon the passenger carriages we were able to get for the train station platform shot when they’re arriving at the station,” states Hawkins. “That was captured by the drone unit during COVID-19. We had all of these different options of plates and found that one. Extensive relighting and reworking were required on the plate to get it to work.” Rocks were digitally constructed to integrate the railroad tracks into the landscape. “Initially,” notes Hawkins, “the shot of the people next to the tracks was supposed to have nothing around them. But it felt so naked with just the tracks and the cowboys standing there. We wound up putting in the stockyards, a section of town and additional elements until that shot itself felt correct.” Having the proper number of extras was not an issue. “Our bigger crowd scenes like at the railway station were shot pre-pandemic,” he adds, “and when we were on our interiors, New Zealand was in a fortunate situation where there were zero COVID-19 cases.”
“We thought, ‘What if the dog was a shadow puppet like that on a hill?’ When I got back from that recce, I worked with my concept artist on a bunch of different versions of the dog. Maybe it would be sculptural or embedded into a rock formation in the hills. However, the shadow throw was so strong and powerful that [director] Jane [Campion] loved it. We kept refining that concept. You’re trying to sculpt a ridge line that is also a shadow receiver of the ideal shape that you want when the sun is at a certain part of the day. In the end it was a fully 2D effect.”
—Jay Hawkins, Visual Effects Supervisor
Cattle were an important part of the visual storytelling. A cow library was built in Houdini of different groupable bovine behaviors.
Cattle were an important storytelling and visual element. “Before the film had even started, Ari had a cow breakdown for the different seasons and how many would logically be at the ranch,” remarks Hawkins. “For two or three days, we had real cattle with us. I did a massive texture and behavior study with as many witness cameras as I could. Then I worked with my team to construct a bunch of different groupable behaviors so that Chris Gardner, my technical director, could build them into his Houdini cow library. He had some nice anti-collision things, so if one cow stopped another it would walk around them. It took a while but was quite good. I’m looking forward to another cow film just so we can use it again!” Not everything could be procedural, he adds. “If they were clumped together in a mass, there was always heaps of art direction because we had to integrate it with what was happening in the plate.”
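The anti-collision behavior Hawkins describes — a cow walking around a stopped neighbor rather than through it — is a classic crowd-steering rule. A toy sketch of the idea (hypothetical illustration only, not MPC’s or the production’s actual Houdini setup; the radius and sidestep rule are assumptions):

```python
import math

def avoid_step(pos, vel, neighbors, radius=2.0):
    """Toy anti-collision rule: if a stopped neighbor blocks the path,
    sidestep around it; otherwise keep walking along the velocity."""
    px, py = pos
    vx, vy = vel
    for nx, ny in neighbors:
        dx, dy = nx - px, ny - py
        dist = math.hypot(dx, dy)
        # Treat the neighbor as blocking if it is close and roughly ahead.
        if 0 < dist < radius and dx * vx + dy * vy > 0:
            # Sidestep along the perpendicular of the obstacle direction.
            sx, sy = -dy / dist, dx / dist
            return (px + sx, py + sy)
    return (px + vx, py + vy)
```

A per-agent rule like this is what makes a behavior library “groupable”: the herd motion emerges from each cow applying the same local check, with art direction layered on top for clumped masses.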
“Initially, the shot of the people next to the tracks was supposed to have nothing around them. But it felt so naked with just the tracks and the cowboys standing there. We wound up putting in the stockyards, a section of town and additional elements until that shot itself felt correct.”
—Jay Hawkins, Visual Effects Supervisor
Development of the ranch house, which was practically built and extended in CG.
The shape of a dog was etched into the natural landscape.
Only two shots used greenscreen. “The backgrounds were such a high contrast that I wouldn’t be able to get a nice clean roto, and as a result the shots would suffer if I didn’t use greenscreen,” states Hawkins. “Also, the lighting conditions allowed for it.” On set the wounds were done practically. “It was when we were in the edit that we realized more was needed,” Hawkins observes. “That became a fun exercise of Googling things like anthrax and wolf attacks on bears.” The dissection of the rabbit was CG because real animal parts were not allowed on set. As for atmospherics, extensive dust had to be digitally added. “That was fun too,” reveals Hawkins, “because Murray Smallwood, our Compositing Supervisor, was into experimenting with EmberGen as a kit to use inside of Nuke, and he got some wonderful results with that. We put dust into quite a lot of scenes to add life to them.” There were times that the skies had to be altered. “Everything that we did was based on things that were shot,” Hawkins says. “If I wasn’t shooting for visual effects, then I was capturing sky domes, reference out of the windows and time-lapse of clouds to build a library. In that part of New Zealand, we were blessed with so many potentially beautiful skies.”
Images courtesy of Universal Pictures and DreamWorks Animation
A character design plate with contributions from Julien Le Rolland, Taylor Krahenbuhl, Anthony Holden, Pierre Perifel and Jorge Capote.
For Australian author Aaron Blabey, the best way to describe The Bad Guys, a series of illustrated books depicting what are viewed to be despicable creatures trying to redeem themselves, was as “Tarantino for kids.” The cinematic adaptation found a home at DreamWorks Animation, overseen by producer Damon Ross and director Pierre Perifel, who was making his feature directorial debut. The vocal cast features Sam Rockwell as Mr. Wolf, Marc Maron as Mr. Snake, Craig Robinson as Mr. Shark, Anthony Ramos as Mr. Piranha and Awkwafina as Ms. Tarantula. The creative journey began for Pierre Perifel in March 2019, with the lockdown caused by the pandemic occurring halfway through preproduction.
A character expression sheet of Mr. Wolf with the model created by Hyun Huh and designed by Jorge Capote.
“The bad guys are in the warm colors and a cooler palette when they attempt to be good guys. The police moments would be the regular color of a police car, like deep reds, white and black. When it’s more the desperate moments, it would be desaturated, almost black and white. There is strong lighting in Los Angeles, so we have white skies and warm light.”
—Pierre Perifel, Director
“There is no way you can stick for the long run with something that you don’t like or feel drawn to,” admits Perifel. “The universe of the books struck a chord with me as it could be a heist movie by Quentin Tarantino or Steven Soderbergh. I added my own influences as an animator back in France. Underneath all of this is the journey of Wolf. The idea that people can change and figure out more meaning in their personal lives was something I connected with a lot for personal reasons.” The illustrations from the books had to be altered in order to be cinematic. “The art of Aaron Blabey is simple and efficient,” observes Perifel, “yet we had to expand upon it to make a visual experience on the big screen. There are also limitations to his characters that you want to change or rework so you can have them actually moving. A shark without legs in our world would have been difficult to do. The same for Piranha.”
Sam Rockwell voices Mr. Wolf, who attempts to pull off his biggest con job.
Perifel wanted to create a new animation style which combined influences of Hayao Miyazaki and Ernest & Celestine. “The code of anime is that the posing of the characters has a lot to do with economical animation. Over the last few years at the studio, we had tended to be video reference and realistic for our acting in animation. I didn’t want to forget that, but wanted to try something that was more stylized and illustrative.” A simple color theory was developed by production designer Luc Desmarchelier that reflected the mental state of the main characters. “The bad guys are in the warm colors and a cooler palette when they attempt to be good guys,” explains Perifel. “The police moments would be the regular color of a police car, like deep reds, white and black. When it’s more the desperate moments, it would be desaturated, almost black and white.” “The location had an impact on the color palette,” Perifel adds. “There is strong lighting in Los Angeles, so we have white skies and warm light.”
Pierre Perifel wanted to create a new animation style that combined influences of Hayao Miyazaki and Ernest & Celestine.
The storyboard by director Pierre Perifel and the final frame that appeared in the movie.
Storytelling drives the technology at DreamWorks Animation. “The head of layout, Todd Jansen, wanted to give us an anamorphic lens, which is what you usually do in live-action because it has a Los Angeles film vibe to it,” states J.P. Sans, Head of Character Animation for The Bad Guys. “We wrote tools to have this lens distortion whenever we needed to. The other tool that we had was a comic-book style, so there were a lot of drawing effects. We could draw motion blur and multiple legs for when a character was spinning around, instead of using rigs and CG elements. Everything felt handmade but still had that CG aspect, so it feels like a hybrid.” Animation tests involved copying 2D films frame by frame into CG, which were then shown to Perifel. “It was a great way to find our parameters of, ‘Are we close or are we too far off?’” states Sans. “The style that we found was removing some of that motion in CG and letting the mind fill in the blanks like you do in 2D.”
“We wrote tools to have this [anamorphic] lens distortion whenever we needed to. The other tool that we had was a comic-book style, so there were a lot of drawing effects. We could draw motion blur and multiple legs for when a character was spinning around, instead of using rigs and CG elements. Everything felt handmade but still had that CG aspect, so it feels like a hybrid.”
—J.P. Sans, Head of Character Animation
A color script by Luc Desmarchelier and Pierre Perifel for a dramatic car chase.
It was important to make Ms. Tarantula appealing rather than creepy. “The fur on tarantulas looks pointy and like it could stab you,” remarks Sans. “We wanted to bring a cuteness by making the fur feel soft. Because of going anthropomorphic, we added a torso and head that separates from the body so that it gives you a humanistic feel. We wanted her to feel like a spider based on the speed and how she moves around. But we didn’t overdo the legs, because if you have every leg doing something different or you can see every leg, you’re always going to remind people that she is a spider and some people don’t like spiders! It’s about visually simplifying the characters. At times we hid legs. Sometimes when Tarantula is running around, you only see four legs and visually it’s more appealing and easier to swallow than all of these eight limbs coming out of this body.” The vocal delivery of Marc Maron was a perfect fit for Mr. Snake. “Marc brought so much personality to that character and who he was that we wanted to visually maintain that sarcastic dry humor in his expressions. The actual visual recordings give us a lot of ideas on mannerisms that we could incorporate into the character animation,” Sans notes.
Concept art by Floriane Marchix that explores the white skies and warm light of Los Angeles.
“We wanted to bring a cuteness [to Ms. Tarantula] by making the fur feel soft. Because of going anthropomorphic, we added a torso and head that separates from the body so that it gives you a humanistic feel. We wanted her to feel like a spider based on the speed and how she moves around. But we didn’t overdo the legs, because if you have every leg doing something different or you can see every leg, you’re always going to remind people that she is a spider and some people don’t like spiders! It’s about visually simplifying the characters.”
—J.P. Sans, Head of Character Animation
Central to the technical process was figuring out the workflows and tools needed to allow digital artists to solve visual problems like an illustrator. “We wanted to come up with ways that would allow us to hide detail in the rigging so you could procedurally lose some of the detail on a per-shot basis depending on the angle of the light,” states Matt Baer, Visual Effects Supervisor for The Bad Guys. “We also wanted the ability to add additional linework later on to enhance the idea that the image looked handmade. If you look at Wolf, some of his linework is built into the rig. That allows the character animator to move these expression lines around. We also even painted some lines into his fur. That stuff is cooked into those renders.” Textures were strategically chosen. “Where we wanted detail to show up on each of those characters was where the highlight would transition into the mid-tones or where the mid-tones would transition into shadow,” details Baer. “Each of the characters came with their version of a base color and then a texture map. Based on where the light was sitting, we could dial in some of that texture in those transitional areas.”
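Baer’s idea of dialing texture into tonal transitions can be sketched as a weighting function: texture detail peaks where the surface luminance crosses an assumed highlight-to-midtone or midtone-to-shadow boundary, and fades to the flat base color elsewhere. The thresholds (0.7 and 0.3) and hat-shaped falloff below are illustrative guesses, not DreamWorks’ shader code:

```python
def texture_weight(luminance, band_center, band_width):
    """Hat-shaped weight: 1.0 at the tonal transition, 0.0 outside it."""
    d = abs(luminance - band_center) / band_width
    return max(0.0, 1.0 - d)

def shade(base_color, texture_value, luminance):
    """Mix texture detail into the base color only near the assumed
    highlight->midtone (0.7) and midtone->shadow (0.3) transitions."""
    w = max(texture_weight(luminance, 0.7, 0.15),
            texture_weight(luminance, 0.3, 0.15))
    return base_color * (1.0 - w) + texture_value * w
```

With a scheme like this, flat-lit areas render as clean graphic color while brushy texture appears only in the transition bands, which matches the illustrated look described above.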
A lighting key by Floriane Marchix for a scene when Ms. Tarantula hacks into a security camera system.
The Bad Guys was inspired by Australian author Aaron Blabey wanting to invert archetypal evil animals and place them in a story that would be ‘Quentin Tarantino for children.’
Genders were switched when creating Ms. Tarantula, voiced by Awkwafina.
The texture of the characters was influenced by the environmental lighting.
“We wanted to come up with ways that would allow us to hide detail in the rigging so you could procedurally lose some of the detail on a per-shot basis depending on the angle of the light. We also wanted the ability to add additional linework later on to enhance the idea that the image looked handmade. If you look at Wolf, some of his linework is built into the rig. That allows the character animator to move these expression lines around. We also even painted some lines into his fur. That stuff is cooked into those renders.”
—Matt Baer, Visual Effects Supervisor
A new tool called Doodle was created to help sell the illusion that the explosion was a 2D effect.
2D effects had to be created procedurally. “We built a big sprite library and created a bunch of procedural simulation techniques that could be rendered and composited in a way that you can mix and match the sprites with simulations,” remarks Baer. “The goal was to not know where one started and where one ended.” The massive explosion had to avoid appearing as a fluid simulation. “We wanted to represent the cooler and hotter areas of an explosion in a much more graphic way,” Baer explains. “A new tool called Doodle was created that allowed effects artists to essentially add additional 2D animated elements on top of the base explosion, which helped to sell the illusion of the whole thing being done as a 2D effect.” The same approach was adopted for environmental effects. “You’re trying to boil each of those components down to the necessary detail so that the audience can fill in the rest,” adds Baer. “For effects, we didn’t want a lot of detail inside. We needed just enough to sell the motion of what the effect was doing. We didn’t want to see every single leaf, but needed the ability to make it look as if we took a dry brush and brushed it across the whole tree. When you are outside of the silhouette those textures and speckles would appear as physically geometric leaves.”
It was important to have an anamorphic lens, so tools were written to create the corresponding lens distortion.
The characters had to be modified from the books, such as giving legs to Mr. Shark, voiced by Craig Robinson.
The tight pre-production schedule was the biggest challenge. “Preparing all of the assets and characters would have been fine if it was the regular style, but I wanted something that was different from what we’re usually doing and not just relying on PBR rendering, which is physical lighting,” notes Perifel. “It was to be more stylized with brush textures and linework. Figuring all of this out in six months was tricky. But once the team figured it out it went smoothly; that would be the hardest part of it. Then, of course, there was the technical transition to working from home.” Every scene was carefully crafted narratively and emotionally. “There are two action sequences in the second half of the film that are incredibly fun to look at,” touts Perifel. “There is a lot in this movie.”
To better handle the height differences of the characters, Ms. Tarantula was often placed on the shoulder of Mr. Shark.
For the greater portion of the movie Dr. Michael Morbius (Jared Leto) walks around in his human form.
“We designed Morbius’ face to have some resemblance to Jared Leto and wanted to carry through all of his idiosyncrasies into what the monster does. That was achieved through standard techniques and leaning on machine learning techniques that we employ these days. … We marked up the face to have an accurate tracking of where his head was in 3D space, knowing that we would totally own the face underneath. We kept the hair and clothes; there were times that we didn’t. It’s a performance that leans on machine learning techniques that Digital Domain has put into place over the years.”
—Matthew Butler, Visual Effects Supervisor
Highlighting the rogues’ gallery of Spider-Man villains is the Sony Pictures Universe of Marvel Characters, which began with the malevolent symbiote Venom and has expanded to include Dr. Michael Morbius, a brilliant biochemist turned vampire. The origin story gets explored in Morbius, directed by Daniel Espinosa (Child 44), with Jared Leto playing the title character alongside Michael Keaton, Adria Arjona, Jared Harris, and Matt Smith. Hired to handle the visual effects was Matthew Butler (Ready Player One), who collaborated with Digital Domain, One of Us, Lola VFX, Storm Studios, Sony Pictures Imageworks and NVIZ to visualize and create 1,000 shots.
Dr. Michael Morbius (Jared Leto) attempts to cure his rare blood disease with experimental vampire-bat science.
Making impossible things occur in a believable manner for audience members is tricky. “In this case, you have Jared Leto and Matt Smith playing vampires, and we see them walking around as humans for the greater portion of the movie, so we know their physical inertia,” explains Butler. “We tried to masquerade within almost gratuitous visual effects that justified where physical reality needed to be bent. The face was the most impressive work in the movie, but honoring the dynamics was much harder.” The vampire faces had to appear monstrous but still resemble the actors so as to retain their appeal. Comments Butler, “We designed Morbius’ face to have some resemblance to Jared Leto and wanted to carry through all of his idiosyncrasies into what the monster does. That was achieved through standard techniques and leaning on machine learning techniques that we employ these days. We let the director Daniel Espinosa run free with his actors on the day. We marked up the face to have an accurate tracking of where his head was in 3D space, knowing that we would totally own the face underneath. We kept the hair and clothes; there were times that we didn’t. It’s a performance that leans on machine learning techniques that Digital Domain has put into place over the years.”
After injecting himself with his cure, Dr. Michael Morbius (Jared Leto) gains superhuman strength but also transforms into a vampire.
“In this case, you have Jared Leto and Matt Smith playing vampires, and we see them walking around as humans for the greater portion of the movie, so we know their physical inertia. We tried to masquerade within almost gratuitous visual effects that justified where physical reality needed to be bent. The face was the most impressive work in the movie, but honoring the dynamics was much harder.”
—Matthew Butler, Visual Effects Supervisor
Prosthetic makeup was strictly used for the gaunt and sickly versions of Jared Leto and Matt Smith. “There are levels of madness,” notes Butler. “Phase seven ended up being the most dramatic and extreme. Most of the time, Morbius is at phase three. There was also a phase that we referred to as a balloon, where he would elastically go into this mode and come back again. Those shots are so subtle that they’re almost subliminal.” The first transformation happens off screen. “Morbius bursts out of the glass where he is being contained and is full monster. Now that we know what he’s going to become we can balloon into that, not go as far, and see the progression,” notes Butler. It was important to avoid the transformation appearing as a morph or dissolve. “We would have different parts of the face do various behaviors,” Butler adds. “Morbius has this pale skin, so we had to pull red blood cells out of him and accompany that with a fairly grotesque vein work and a quite translucent subsurface to the skin. When he reverts back from being a monster in the container ship, the last frame of that shot is a fully digital face of Jared. It’s quite an achievement by Digital Domain because it’s a lot easier to do a monster than a human who is your lead actor – and full frame.”
The face of Morbius was designed to have some resemblance to Jared Leto.
To receive the PG-13 rating, many shots had blood removed or made to look black.
“There are levels of madness. Phase seven ended up being the most dramatic and extreme. Most of the time, Morbius is at phase three. There was also a phase that we referred to as a balloon, where he would elastically go into this mode and come back again. Those shots are so subtle that they’re almost subliminal.”
—Matthew Butler, Visual Effects Supervisor
A unique ability possessed by Morbius is echolocation, where high-frequency sound pulses are emitted through the nose or mouth of a bat that listens for the resulting echo. “How does one visualize something that is not visual?” asks Butler. “We had to show that he is seeing these surfaces by bouncing soundwaves against them. It didn’t have to be sound, but a wave-like phenomenon that has a particle system response. I also wanted it to be beautiful. We see Morbius sending out these pulses and an energization of the surfaces which have inherent colors.” Not everything happens on the ground. “Flying was one of my biggest fears as it can quickly become hokey. Morbius has a certain mass and telegraphs that by the way he walks around and picks up and puts down a cup. You can’t suddenly [show] that he has shed his mass and is now a helium balloon. I love any natural phenomena because it tends to be beautiful, and if someone has seen a real thing before, they are now clued into a certain believability. The technique that we used to hide some of the sins here was a cavitation of the volumes around them.”
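The pulse-and-response idea Butler describes — surfaces energizing as a wavefront passes over them — reduces to a simple geometric test: at time t a spherical wavefront of radius speed × t sweeps outward, and any surface point lying inside that thin shell lights up. A toy sketch (purely illustrative; the shell thickness and the discrete point set are assumptions, not the film’s effects setup):

```python
import math

def pulse_hits(origin, points, speed, t, shell=0.5):
    """Return the surface points lit by an expanding spherical pulse at
    time t: those whose distance from the emitter falls inside the
    wavefront shell of the given thickness."""
    r = speed * t  # current wavefront radius
    lit = []
    for p in points:
        d = math.dist(origin, p)
        if abs(d - r) <= shell:
            lit.append(p)
    return lit
```

Driving a particle system’s emission or a surface shader from a test like this is what makes the energization appear to ripple outward from the character over time.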
“We would have different parts of the face do various behaviors. Morbius has this pale skin, so we had to pull red blood cells out of him and accompany that with a fairly grotesque vein work and a quite translucent subsurface to the skin. When he reverts back from being a monster in the container ship, the last frame of that shot is a fully digital face of Jared. It’s quite an achievement by Digital Domain because it’s a lot easier to do a monster than a human who is your lead actor – and full frame.”
—Matthew Butler, Visual Effects Supervisor
Greenscreen and bluescreen were favored over LED walls. “There was a greenscreen shoot for the cave exterior for the opening sequence in Costa Rica, which was some of the hardest in the movie because that was shot indoors,” reveals Butler. “It was difficult to get that lighting to look real. The end of the movie is all bluescreen. We built a partial set with it being 90% synthetic. We knew what it was and could pull it up on the video feed there because we had already conceived it.” The third act completely changed from being situated during the day to a nighttime setting. “We shot for months in a park in England doubling for Central Park in New York City,” notes Butler, “and the action we did with Jared Leto, Matt Smith and Adria Arjona was all gone. The entire third act was reconceived digitally. We selectively re-shot little pieces of Jared, Matt and Adria, and the rest was CG.” The theatrical rating of Morbius impacted the blood and gore. “It is hard to do a PG vampire movie,” observes Butler. “There were so many shots where we went back and took blood out or made it black.” The stunt performances were impressive, in particular by Jared Leto’s stunt double, Greg Townley, who literally ran up walls sprayed with Coca-Cola for the subway scene. States Butler, “I came at this from what can we do practically first.” A major accomplishment occurs during the container ship sequence. “It is an elegant shot,” savors Butler, “where you see him go from full monster and become Jared Leto. I love that shot and am so proud of it. It looks like Jared, but that’s all synthetic.”
In the subway scene the walls were sprayed with Coca-Cola so that the stunt double for Jared Leto could actually run up the walls.
After establishing a relationship with director Matt Reeves on the Planet of the Apes prequel trilogy, Weta FX veteran Dan Lemmon was moved into the role of Production Visual Effects Supervisor for the next blockbuster helmed by the filmmaker. The Batman takes place during the early days of the Caped Crusader and stars Robert Pattinson, Zoë Kravitz, Paul Dano, Colin Farrell, Jeffrey Wright, John Turturro and Andy Serkis. Handling the signature car chase between Batman and Penguin, as well as the Batcave and the memorial service at City Hall, was Weta FX, with key members of the team being Visual Effects Supervisor Anders Langlands, Animation Supervisor Dennis Yoo and Compositing Supervisor Beck Veitch, who collaborated on a total of 320 shots.
The workshop, gym equipment and Batmobile areas were practically constructed while the rest of the Batcave was a digital environment.
“[Cinematographer] Greig [Fraser] was putting globs of silicone sealant from a caulking gun onto a plate of glass in front of the lens to create these beautiful abstract lens flares, particularly in the vehicle shots and throughout the chase scenes. Initially, we were mystified as to what they were until Dan explained what Greig was actually doing there.”
—Anders Langlands, Visual Effects Supervisor
“It’s definitely exciting to be able to put your own spin on things,” notes Langlands. “This is a detective story that is a love letter to all of those old 1970s crime thrillers which I’m a huge fan of personally, like Chinatown, The French Connection, Taxi Driver and the paranoia trilogy [Klute, The Parallax View, All the President’s Men]. Greig Fraser [Dune] is a fantastic cinematographer, and the photography is stunning throughout. That combination of things made it an exciting journey to be part of.”
A CG cape was created to get the billowing effect that director Matt Reeves wanted for the shot.
Greig Fraser shot with two sets of ARRI Large Format Anamorphic lenses, with one being optically pristine and the other detuned so it could not focus on anything outside the center of the frame. He also had a unique approach to the lens filtration that refracted streetlights and car headlights into a spiderweb of light. Comments Langlands, “Greig was putting globs of silicone sealant from a caulking gun onto a plate of glass in front of the lens to create these beautiful abstract lens flares, particularly in the vehicle shots and throughout the chase scenes. Initially, we were mystified as to what they were until Dan explained what Greig was actually doing there. We did talk about generating some elements with effects to create procedural 2D stuff, but in the end decided to do the same thing that Greig did and shoot some elements for ourselves. Beck’s team was able to take those elements and construct a tool that emulated what Greig had achieved in the live action. I was definitely not cursing [Greig]. It was a lot of fun.”
Weta FX referred to its element library to get the necessary explosion effects that were subsequently graded and timed to be consistent with the plate photography.
“For the compositing team, our challenge was to deconstruct all of the things that happened to the detuned lenses and be able to replicate that for the set extensions and CG shots because it’s so distinctive,” notes Veitch. “We got the chase sequence turned over quite late and had to develop whole new tools and templates for Nuke to be able to implement rain interaction and wheel spray at speed. It was a mixture of simulated effects for hero cars and Eddy templates for background traffic.”
Extensive digital rain had to be created as director Reeves wanted the car chase to feel wet and dangerous the whole time.
The torrential rain was the major creative and technical task. “Matt wanted the car chase to feel wet and dangerous the whole time,” states Langlands. “We added digital rain to all of those shots, which is fairly simple, but when you’re flying through it, that’s a lot more complex. We were modeling how they oscillate and deform as they fall so you get those motion blur streaks from them. Our effects team was simulating hundreds of millions of raindrops in every shot. Then we were simulating all of the wheel spray coming off of the wheels. In some shots, we had a mix of 3D, but in most shots 2D solutions for all of the raindrops hitting the road surface. Getting the look of that right, making it feel believable, and being efficient enough that we could do it across that huge number of shots was a real challenge.”
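The motion-blur streaks Langlands mentions come down to how far a drop travels relative to the camera while the shutter is open. A toy model (an illustrative sketch only, not Weta FX’s simulation; the shutter-angle exposure formula is the standard film convention, everything else here is assumed):

```python
def streak_length(drop_velocity, camera_velocity, shutter_angle, fps=24):
    """Length of a raindrop's motion-blur streak in world units:
    relative speed times the shutter's open time, where a shutter of
    shutter_angle degrees is open for (angle/360) of each frame."""
    exposure = (shutter_angle / 360.0) / fps
    rel = [d - c for d, c in zip(drop_velocity, camera_velocity)]
    speed = sum(x * x for x in rel) ** 0.5
    return speed * exposure
```

This is why rain read so differently in the flying-camera shots: once the camera moves fast through the volume, the relative velocity (and thus every streak) grows, which is far harder to keep believable than rain falling past a static lens.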
Plates were shot at several different locations, including the Dunsfold Aerodrome in Surrey, England, for the car chase.
Dennis Yoo figured out the timing, composition and action beats for the car chase by creating the postvis. “The great part about that was Matt Reeves shot everything,” remarks Yoo. “You add a CG car beside the practical one, then you have something to play off the motion with. It makes everything easier. What people don’t understand is that it’s a chaotic sequence, but there is also artistry in there with the composition and our motion so you understand the direction that you’re going. That was quite a challenge because it’s a mix of cars crashing into each other, but if it was all chaos no one would know what is going on. It was fun to do, and trying to keep that as realistic as possible was also a challenge.”
“For the compositing team, our challenge was to deconstruct all of the things that happened to the detuned lenses and be able to replicate that for the set extensions and CG shots because it’s so distinctive. We got the chase sequence turned over quite late and had to develop whole new tools and templates for Nuke to be able to implement rain interaction and wheel spray at speed. It was a mixture of simulated effects for hero cars and Eddy templates for background traffic.”
—Beck Veitch, Compositing Supervisor
As many as 11 plates had to be integrated together for the memorial service scene at City Hall.
“[Director] Matt [Reeves] wanted the car chase to feel wet and dangerous the whole time. We added digital rain to all of those shots, which is fairly simple, but when you’re flying through it, that’s a lot more complex. … Our effects team was simulating hundreds of millions of raindrops in every shot. Then we were simulating all of the wheel spray coming off of the wheels. In some shots, we had a mix of 3D, but in most shots 2D solutions for all of the raindrops hitting the road surface. Getting the look of that right, making it feel believable, and being efficient enough that we could do it across that huge number of shots was a real challenge.”
—Anders Langlands, Visual Effects Supervisor
The Batmobile had to be photorealistic. “I worked on a movie that was all vehicles,” continues Yoo, “so we grabbed some of that tech, [and in doing so] the ground contacts and using the actual LiDAR from set to dynamically move the wheels for us allowed for more realism to be built into the rig. We had reference for everything so we could look at the actual vehicle to understand what it was doing and then mimic that even though we’re changing the motion.” Batman had to come across as a skilled driver. “The Batmobile was bouncing off the trucks, and the Batmobile looks like it’s causing all of this mayhem. We didn’t want it to look like that at all. Matt was adamant about Penguin starting that whole pile-up, and the Batmobile was [more] in there riding the wave than causing more havoc,” adds Langlands.
Distinct lens flares created by cinematographer Greig Fraser required customized tools by Weta FX to digitally recreate them.
A dramatic upside-down shot is taken from the perspective of Penguin as Batman walks towards Penguin’s overturned vehicle. “That was a funny one because I saw someone on Twitter saying, ‘It’s the most beautiful shot without any CG,’ not realizing that Batman is mostly CG in that shot,” reveals Langlands. “They had a rain machine going, but when you get a big piece of material like a cape wet, it just wants to bunch up and hang down. Matt wanted to have it billowing out in the wind as he’s walking up, so we had to do a digital cape with a cloth simulation. In the plate there is a huge fireball behind him, and because we’re putting a dark object in front of something that is causing a lens flare we had to take the CG Batman and track it to the live-action Batman in compositing to patch bits of him.” Rain was not the only problematic natural element. “We raided our element library for every explosion that we had historically,” remarks Veitch. “That was a challenge to get their temperatures matching, because it was on a whole lot of different film stocks and digital formats. Then timing all of those so they came through and exploded at the right time, and patching when we needed to – that was a complex composite.”
“What people don’t understand is that [the car chase] is a chaotic sequence, but there is also artistry in there with the composition and our motion so you understand the direction that you’re going. That was quite a challenge because it’s a mix of cars crashing into each other, but if it was all chaos no one would know what is going on. It was fun to do, and trying to keep that as realistic as possible was also a challenge.”
—Anders Langlands, Visual Effects Supervisor
Situated in an abandoned neo-gothic subway underneath the Wayne Tower is the Batcave, a huge environment with the workshop, exercise equipment and Batmobile areas being practically built. “Initially there was supposed to be a bat colony which was supposed to be simulated, but it kept building up more and more,” states Yoo. “It didn’t help that the environment was so big, so we had to cheat because they wanted bats in the foreground, but that foreground didn’t make any sense compared to the environment. We were having scale issues by having the bats quite close to the camera, which didn’t make sense for the bats that were further back.” Darkness prevails in the setting. “We’re adding little kicks and pings off of the superstructure in order to get texture in the background, which is fun because you’re placing little light sources around the place out of focus,” notes Langlands. “That ends up becoming like putting splashes of paint on the canvas.” The shots were simple to composite. “I enjoyed them,” states Veitch, “because we were playing around with a lot of proprietary defocus tools, and being able to compose the focus in those shots and try to make them look authentic. Those lenses are quite incredible and they gave us a lot of reference shots.”
Director Matt Reeves discusses a shot with Robert Pattinson while on the set of The Batman, which was inspired by the crime thrillers of the 1970s.
Crashing the memorial service for the mayor at City Hall is an SUV, at the behest of the Riddler, portrayed by Paul Dano. “We were combining anywhere from four to 11 plates,” remarks Veitch. “We had small crowds because of COVID, and then there was the careening car which was a safety issue. There were about eight plates for the top-down shots. The shots of the car coming towards the camera involved compositing takes of who they wanted – the Riddler at the top in the mezzanine area, the car coming forward, smoke and dust coming off it, and trying to retain all of that. Very tricky shot.” Careful research went into matching the different plates with each other. “We had to figure out what section of the plate we had to use within those shots,” adds Veitch. “There was a massive amount of paint and roto to do before our compositing team even touched it. Then it’s making sure that we can retain as much of the plate as possible, and then adding atmospherics where we needed to help us cover up edges or replace atmospherics that we had to lose. It is quite incredible how much smearing and lensing artifacts you get with the detuned lensing that we had to match up. It was a lot of work on those shots even without the CG.”
Netflix distributed Girls from Ipanema, a Brazil-produced series about four Brazilian female friends in Rio de Janeiro’s bossa nova scene in the late 1950s. Sao Paulo-based Quanta Post contributed to the VFX. (Image courtesy of Netflix)
The growth and globalization of the visual effects industry has resulted in worldwide interconnectivity and a vast workflow spanning the planet. There is more top-notch VFX in films and series than ever before, boosted by the growth in streaming content, episodic fare becoming more cinematic in terms of quality, and a continued evolution in VFX technology. Demand for VFX artists as a whole is also growing due to the surging video game industry, amusement park visual effects and the gradual ascension of VR and AR.
All of those factors have increased the work for VFX studios and the demand for skilled artists from Vancouver to London to Mumbai. Financial incentives in certain locations have helped globalize the VFX business for some time now. And the COVID-19 crisis further accelerated home entertainment demand and remote VFX work. “The pandemic has really kicked the globalization of the VFX industry into high gear, and now even more producers know what can be achieved with VFX,” says David Lebensfeld, Founding Partner and VFX Supervisor of Ingenuity Studios, which has offices in Los Angeles, New York and Vancouver.
Local productions outside North America, such as many series funded by Netflix, are spreading work across the planet in both film production and post-production. Fiona Walkinshaw, Framestore’s Global Managing Director, Film, comments, “The streamers have made no secret about their desire for regionally-focused content and how this feeds into their business strategies.
As of late 2021, the Korea-produced dystopian survival drama Squid Game was Netflix’s most watched series. Seoul-based Gulliver Studios supplied VFX. (Image courtesy of Siren Pictures and Netflix)
There’s a tremendous desire for stories that could only come from a certain city or country – shows like Squid Game or Money Heist, for example, which, like the Scandi noir boom, captivate viewers by dint of their freshness and unique cultural or geographical perspectives. This will inevitably mean our worlds become larger, as we work with storytellers, producers and below-the-line talent from all over the world. It’s an exciting prospect, and it will help us all grow and learn.” Walkinshaw adds, “In time I’m sure we’ll also see new VFX hotspots establishing themselves – you just have to look at the way the Harry Potter franchise helped turbocharge London’s VFX industry, or what the Lord of the Rings films did for New Zealand.” Framestore itself is quite globalized, with offices in London, Mumbai, Montreal, Vancouver, Melbourne, New York, Los Angeles and Chicago.
Visual effects studios have spread widely over the last two years across North America, Europe and Australia/New Zealand, and a growing number can also be found in Asia. Many facilities built initially for wire removal and rotoscoping have evolved into full-service VFX studios. BOT VFX, founded in 2008 in India, has expanded from an outsourcing facility in Chennai for rotoscoping and other detail work into a large and complete VFX business; it now has its headquarters in Atlanta and has worked on high-profile recent projects, including The Book of Boba Fett, Dune and Black Widow.
Just as Korea has grown into a movie/series global powerhouse, so too have its VFX studios expanded over the last 10 years.
Ragnarok, a Norwegian-language fantasy series from Copenhagen-based SAM Productions, is distributed by Netflix. Ghost VFX and Oslo-based Stardust Effects contributed VFX. (Image courtesy of Sam Productions and Netflix)
The Last Forest, Luiz Bolognesi’s movie about the Yanomami Indians of the Amazon rainforest, produced in Brazil, mixes documentary and staged scenes. It was distributed globally by Netflix. (Image courtesy of Netflix)
S.O.Z.: Soldiers or Zombies, an eight-episode horror-action TV series distributed by Prime Video, is a Mexican production created by Nicolas Entel and Miguel Tejada Flores. (Image courtesy of Prime Video)
Netflix globally distributed Invisible City, a Brazil-produced fantasy series about mythological creatures in the rain forest, created by Carlos Saldanha, the Brazilian director of various successful animated films, such as the Ice Age movies and Rio. (Image courtesy of Netflix)
“The pandemic has really kicked the globalization of the VFX industry into high gear, and now even more producers know what can be achieved with VFX.”
—David Lebensfeld, Founding Partner and VFX Supervisor, Ingenuity Studios
Gulliver Studios supplied VFX for the Netflix hit series Squid Game, while Dexter Studios contributed VFX work to Bong Joon-ho’s Parasite. Dexter and five other VFX studios worked on Space Sweepers, arguably Korea’s first high-production science fiction film. Korea’s 4th Creative Party helped with the VFX for Joon-ho’s acclaimed Snowpiercer and Okja films (along with Method Studios). And Digital Idea worked on the hit zombie film Train to Busan.
VHQ Media, founded in 1987 in Singapore, has grown into a large film studio and claims to be Asia’s largest post-production house, working on both national and international productions. It also has studios in Beijing, Kuala Lumpur and Jakarta. Many international VFX firms have opened offices in Asia, including DNEG (four offices in India), The Third Floor (Beijing), Scanline VFX (Seoul), Method Studios (Pune), ILM (Singapore), Digital Domain (Taiwan, Hyderabad and four locations in China) and MPC (Bangalore).
“The global growth of the VFX industry and VFX as a tool of technology is limitless and boundless, to say the least,” says Merzin Tavaria, President, Global Production and Operations at DNEG. The London-based firm is another example of a VFX studio with offices spread across the globe. It was formed in 2014 by a merger between Prime Focus (India-based) and Double Negative (U.K.-based) and has studios in Los Angeles, Vancouver, Montreal, Toronto and London along with Mumbai, Bangalore, Chandigarh, and Chennai in India.
“There are some fantastic companies doing amazing work in all corners of the globe,” says Pixomondo CEO Jonny Slow, “and at the moment, they are all working to keep up with an unprecedented level of demand. Growing demand, driven by episodic content with a higher budget and huge creative ambition, is a big factor in all the trends affecting the market [this year] and beyond.” Pixomondo has offices in Vancouver, Toronto, Montreal, Los Angeles, Frankfurt, Stuttgart and London.
The streamers Netflix, Amazon, Hulu (majority owned by Disney) and Apple TV have added their VFX demand to that coming from traditional movie/TV companies and affiliated streaming services (HBO Max, Disney+, Peacock, Paramount+). Florian Gellinger, RISE Visual Effects Studios Co-Founder and Executive Producer, notes, “Right now, as the market is so saturated, the work is going to be globally distributed to whoever has availability and meets the required profile. So yes, clients will have to look increasingly globally for a good fit.” RISE has offices in Berlin, Cologne, Munich, Stuttgart and London.
Other VFX studios concur that the business has been activated. “We have too much work, which means we need more capacity, more artists and more supervisors. Right now, we’re ensuring that we continue to make our established clients happy while bringing in new clients,” says Tom Kendall, VFX Head of Business Development, Sales & Marketing for Ghost VFX, which has offices in Los Angeles, Copenhagen, London, Manchester, Toronto and Vancouver.
Executive Producer Måns Björklund of Stockholm-based Important Looking Pirates (ILP) notes, “There aren’t enough VFX companies in the world to do all the work. The demand for content has boomed, and the need for clients to seek new vendors around the world has increased.”
DNEG is one of the pioneers in the globalization of VFX workflows. Tavaria comments, “With nine facilities working seamlessly together across three continents, I believe we’ve led by example, creating an ever-expanding global network that can deliver highly creative and compelling visual storytelling while introducing new norms of efficiency and flexibility.”
He adds, “The standardization of workflows, tools and capabilities across sites allows us to move work around our network to cater to the demands of our clients and to balance the load across locations to maximize utilization. We also take full advantage of time zone differences to create efficiencies in our production scheduling.”
Framestore recently opened a studio in Mumbai, which already has 130 on-site creatives. Walkinshaw comments, “Being able to set up in Mumbai and seamlessly integrate with our new colleagues there is an incredible advantage, especially given the tremendous talent pool there. Generally speaking, increased access to amazing talent is the main consequence of this worldwide connectivity.”
Another positive effect of globalization is that “exchanging work between companies has become much easier despite everyone running their own pipeline,” says Gellinger. Slow sees the globalization of VFX as a positive trend that creates stability for those companies who are prepared to evolve continuously and adapt to constant change. “It’s not the only trend in the VFX industry, but it’s a trend in response to demand for capacity and client requirements for speed and efficiency. But there is also a quality threshold. Quality output drives stability for VFX companies, wherever their artists are located.”
Lebensfeld comments, “What certainly helps with this business becoming more globalized is access to talent that isn’t in your zip code, which backfills what already makes us competitive.” To open up to talent in another country, he says, “we already have the technology – the hardware, software and methodology – to work remotely. Anything else past that are just details. We look at it less like it is a global business and more as one that breaks down borders. One of the more exciting things is getting the chance to develop artists and give opportunities to people who would not have had them otherwise. They only need a computer, inherent artistic talent, and we work with them on training in a studio environment. I’m a big believer that a combination of local studio artists and international artists is the best way to go because the industry still relies on specific locations for some projects for a variety of reasons. The business still needs a large base of talent in certain production hubs.”
Teddy Roosevelt (Aidan Quinn) pulling an arrow from the arm of Brazilian explorer Cândido Rondon (Chico Diaz) in the HBO mini-series The American Guest. The 2021 Brazilian production showcased the work of Brazilian artists, including Orbtal Studios in Sao Paulo, which supplied visual effects. (Photo courtesy of HBO Max)
The Korean zombie apocalypse and coming-of-age series All of Us Are Dead is distributed internationally by Netflix. (Image courtesy of Film Monster Co., JTBC Studios/Kimjonghak Production Co. and Netflix)
Korean sci-fi mystery series The Silent Sea, written and directed by Choi Hang-yong, is distributed by Netflix globally. (Image courtesy of Artist Company and Netflix)
“[S]ince the workforce has become so flexible in where it settles, recruiting has become a lot harder and companies have to reach out further than they used to in order to meet their talent requirements. [Globalization] has solved a couple of these problems by having access to top talent across borders, not being limited to one’s own backyard.”
—Florian Gellinger, Co-founder and Executive Producer, RISE Visual Effects Studios
Ghost VFX worked on Shadow and Bone, a fantasy series distributed by Netflix and based on books by Israeli novelist Leigh Bardugo. Shadow and Bone was shot in Budapest and Vancouver. (Image courtesy of 21 Laps Entertainment and Netflix)
How I Fell in Love with a Gangster is a Polish crime drama distributed globally by Netflix. (Image courtesy of Netflix)
Money Heist is a Spanish heist drama that had a successful run of five seasons and is one of Netflix’s biggest international hits. (Image courtesy of Atresmedia/Vancouver Media and Netflix)
Gellinger observes that it has become easier for artists to find a job in their desired ‘adventure destination’ abroad. “And since the workforce has become so flexible in where it settles, recruiting has become a lot harder and companies have to reach out further than they used to in order to meet their talent requirements.” Yet globalization also “has solved a couple of these problems by having access to top talent across borders, not being limited to one’s own backyard.”
Walkinshaw adds, “From a production perspective it means juggling more time zones, currencies and teams, so this part of the business has become more complex, and there is a need for investment in both more people and technology solutions to make it easier for production to function. The role of a producer working for a company like Framestore on a project spread globally is far more complex and demanding than it used to be.”
Producers and supervisors now must be more patient and organized because of the time differences, and they have to schedule their work around that, according to Kendall. “The projects are shot in so many diverse locations, it’s about being able to address clients’ needs in a timely manner and be flexible in terms of how we work.”
Gellinger notes that the way that business is being distributed globally is “definitely creating stability, but only as long as companies keep investing in their talent. Flying in entire teams from abroad is not a business model. Investing in education and training is more important than ever.”
Slow comments, “We have seen a lot of these consequences [of globalization] playing out for the past few years. It has allowed the formation of larger, better funded, better organized companies that are becoming attractive investment propositions. This has been very positive for the industry – for growth to happen, investment is required.”
DARK BAY Virtual Production Studio is an example of how VFX globalization has been boosted by Netflix and by government help. Baran bo Odar and Jantje Friese, the creators of Netflix’s hit series Dark – a German science fiction thriller – built an LED stage in Potsdam-Babelsberg in part to shoot 1899, their next Netflix series. Odar and Friese’s production company Dark Ways holds a majority share in DARK BAY (Studio Babelsberg has a minority share). Funding from the state of Brandenburg in Germany and a long-term booking commitment from Netflix backed the venture.
Incentives continue to play a role in the globalization of VFX. Framestore’s Walkinshaw comments, “National or regional incentives have provided a huge boost for our industry and encouraged international collaboration.
They’ve been key to growth in the U.K. and Canada – to date our biggest sites for film and episodic work – and the willingness of studios to put work in these regions helps create a virtuous circle: it allows companies to invest in their talent, facilities and infrastructure, makes those places a magnet for established talent from elsewhere, and also helps schools and universities attract ambitious students. Take Montreal for example – Framestore was the first major studio to open a studio there [in 2013], and it’s now an established and hugely-respected hub for the global visual effects industry.”
“The pandemic forced studios like us to build a pipeline that works in remote environments,” says Lebensfeld. “We were able to leverage figuring out how to work remotely with talent that have previously been local to our studio locations. We have history and momentum with these artists, and we figured out processes that mirror what we have already been doing – just with remote capabilities.”
Already existing worldwide VFX interconnectivity helped DNEG to address the challenges of the pandemic, according to Tavaria. “The unprecedented speed with which our technology teams enabled global remote working was astounding, based on work that was already underway. It also, somewhat counter-intuitively, brought us closer together and enabled even more collaboration across our global teams,” he comments. “These advances have positioned us well to cater to the growth in demand for visual effects and animation work this year, driven by the increases in content production by TV and OTT companies, in addition to increased demand for our VFX and animation services from our studio clients.”
Walkinshaw comments, “The pandemic has definitely encouraged us to think outside the box, be this seeking workarounds for physical shoots, having colleagues working remotely from different countries or broadening our talent pool by making hires from different territories, because so much of the workforce has spent time outside the office. I imagine this will endure, especially as we continue to seek skills beyond the ‘traditional,’ particularly in areas such as technology and gaming.”
Slow says, “In our industry, technology and innovation are the fundamental drivers of changes like globalization. We are at a very interesting point – with technology driving once-in-a-lifetime changes in content distribution and production technique – and these trends have been accelerated by a major pandemic. The consequences are significant, and the impact will largely play out over the next five years.”
Lebensfeld concludes, “Pre-pandemic [film companies] went on location and brought on as many extras as needed. The scope of requests has expanded well beyond that. There’s a VFX solution for every aspect of a story. That’s a powerful thing.” He adds, “I think the VFX industry has transformed these past two years, with very positive changes overall. There’s no going back now. Our industry is global, and that’s here to stay.”
“We have seen a lot of these consequences [of globalization] playing out for the past few years. It has allowed the formation of larger, better funded, better organized companies that are becoming attractive investment propositions. This has been very positive for the industry – for growth to happen, investment is required.”
—Jonny Slow, CEO, Pixomondo
Amazon Studios produced the epic fantasy series The Wheel of Time, another international VFX effort involving Cinesite, MPC, Automatik VFX, Outpost VFX, Union Visual Effects and RISE Visual Effects Studios, among others. (Image courtesy of Sony Pictures Television and Amazon Studios)
HBO series Beforeigners is a science-fiction crime drama produced in Norway by Oslo-based Rubicon TV AS. (Photo courtesy of HBO Max)
A scene from A Shaun the Sheep Movie: Farmageddon from Aardman Animations, which was originally founded as a stop-frame studio in 1972. (Image courtesy of Aardman Animations)
Successfully managing an animation studio in 2022 and beyond requires foresight and the desire to be an industry leader. The landscape has been transformed by globalization, the dominance of streaming services, growing demand for content, and the development of other media platforms such as virtual reality. Flexibility has enabled Oscar-winning Aardman Animations, founded as a stop-motion studio in 1972 and responsible for Creature Comforts and Wallace & Gromit, to remain relevant, branching into CG features and AR projects. More recently there is Emmy-lauded Baobab Studios, which has specialized in interactive animation since 2015 and is creating The Witchverse anthology series for Disney+ based on Baba Yaga. Veteran animator and lecturer Ken Fountain has found himself in the middle of all of this, having worked on Megamind for DreamWorks Animation, Pearl for Google Spotlight Stories and Baba Yaga for Baobab Studios, as well as doing tutorials on SplatFrog.com.
There is still value in using real and tactile material for characters and world-building, believes Sarah Cox, Executive Creative Director at Aardman Animations. “It’s less about creating assets digitally and more about the way that technology can enable handcrafted processes to be more efficient, beautiful and effective.” Technology is embraced by Lorna Probert, Head of Interactive Production at Aardman Animations, who last year released the Wallace & Gromit AR adventure The Big Fix Up. “There is some exciting technology, like virtual production and the way that we can use real-time engines for previs and creating more flexible and iterative processes.”
Another major trend pointed out by Cox is the push for photorealism. “The distinction between live action and animation, what is real and what’s not real, is continually going to be blurred as we use live-action information in animation.” The filmmaking process is not entirely different. “Stop-frame is effectively like a live-action asset because it’s shot in-camera, and then you’re using those assets to mix with other bits of animation,” observes Cox. “What we did with our last Christmas special was shoot all of the effects in-camera. The snow was made out of puffs of wool, but then composited together in the most technically proficient, up-to-date way.”
Baba Yaga won the Daytime Emmy for Interactive Media and resulted in Disney+ partnering with Baobab Studios to create a Witchverse series for the streaming service. (Image courtesy of Baobab Studios)
Virtual reality has yet to become mainstream, partly hampered by being perceived strictly as a marketing tool. “You are still wearing what is not a comfortable thing on your face,” states Probert. “Until your interface with the content becomes more natural and comfortable, that suspension of reality is always going to be broken.” The communal aspect is presently missing. “A lot of our content design is for family viewing, so that whole being in your own space is quite contrary to what we do,” notes Cox. “It quadruples the production time [because various viewer narrative decisions have to be accounted for], and the other challenge is most of our work is comedy. If there is user interaction, you can’t time the gags in quite the same way.” It is important to recognize that each medium cannot be treated in the same way. “It’s creating content that plays to the strength of the format, and we’re doing lots of exploration,” remarks Probert. “The fact that you can, for the first time, be in one of our sets, be able to build and move things and explore the detail in VR is exciting. It’s exciting to be in the barn with Shaun the Sheep and see him reacting to you – that’s an interesting thing for us to play with.”
Veteran animator and lecturer Ken Fountain finds it to be exciting that the demand for animation content is allowing for experimentation such as Spider-Man: Into the Spider-Verse. (Image courtesy of Sony Pictures Entertainment and Marvel)
Netflix decided to bridge the gap between Season 1 and 2 of the live-action The Witcher by releasing an anime prequel Nightmare of the Wolf. (Image courtesy of Netflix)
The Very Small Creatures is a pre-school series produced by Aardman Animations for Sky Kids. (Image courtesy of Aardman Animations and Sky Kids)
The mandate for Baobab Studios is to make viewers feel that they are an essential part of the storytelling. (Image courtesy of Baobab Studios)
Animation is no longer just for families as illustrated by the adult-oriented anthology Love, Death + Robots. (Image courtesy of Netflix)
When co-founding Baobab Studios, CEO Maureen Fan combined her video game experience with the filmmaking expertise of CCO Eric Darnell and the technical leadership of CTO Larry Cutler. “The reason why something is a success isn’t because it’s stop-motion versus traditional animation. It is how good the story and characters are. The style is in support of that story. For Crow: The Legend and Baba Yaga, we created both for VR and 2D. Namoo was created within the VR tool Oculus Quill, where you are literally painting 360, but the project was meant to be a 2D output. The animation is different because the director is different. We brought in Erick Oh, and it’s more like stop-motion because in Quill there is no frame interpolation and rigging. Every single project that we’ve done has had completely different methods and tools, which is fun.” The mandate driving everything is making sure that the user feels like a protagonist. “Certain parts of Baba Yaga and Bonfire were straight animation, but our animators also built in a bunch of cycles that we fed into the AI engine. We built a character-emotive AI system similar to games, so whatever the audience did, the story would change and reorder itself, and the characters would do different things.”
“There is a real-time revolution that is going to come, but not everybody has embraced it yet,” remarks Fan. “Even when the big studios adopt real-time, unless you’re having the product completely change where it’s interactive, you would still animate the same way. I don’t think that animators need to change any time soon. But if you’re interested in doing interactive animation, your skillset needs to be embracing the real-time aspect.”
The visual language for VR is still being developed. “The fun thing about VR is no one knows what they’re doing! You don’t need a lot of previous experience. It is finding the best animator and rigger, and finding one who has the flexibility to try different things and not always have to do things the way they did previously.” Globalization of the industry has not only had an impact on the workforce. “You will notice that all of our projects have specifically cast minorities and women,” adds Fan. “That’s because I’m a female minority and feel like if I don’t do it, who will? Crow: The Legend was inspired by a Native American legend, and it was one of the first indigenous-themed stories in animation. Instead of a hero’s journey, they’re much more about the community. Baba Yaga is based on a character well-known in Eastern European literature. I’m excited about the different types of stories that we can tell with globalization.”
Previously an animation supervisor at Baobab Studios and currently working as Animation Supervisor for DNEG Animation, Ken Fountain points out that VR is rooted in an artform that predates cinema, which is theatre. “If you’re talking about going into the AR/VR space, the filmmaking language is totally changing because you can’t rely on an editor anymore,” observes Fountain. “The editor is the one standing with the headset and making the choices of where to look. The way that you build a story and performance, and attract the eye and create compositions, is based around theatrical approaches rather than cinematic ones. You have to be procedural and theatrical.” As much as it is exciting for a user to be able to choose their own narrative, there is also respect for boundaries.
Netflix paid over $100 million to purchase the rights to The Mitchells vs. the Machines. (Image courtesy of Netflix)
“Ultimately, it’s up to whoever is creating the experience to decide if they’re giving the user room to do literally whatever they want. Personally, I like the engineered outcomes,” adds Fountain, as parameters have a positive impact on the creative process. “You make your best work when you have limitations, and that translates into writing stories for this open universe. Unless you give people a box, the outcome is not going to be as good.”
With the growing reliance on AI and machine learning, could there come a time where animation is literally created in real-time? “That seems so far away,” remarks Fountain. “The first wall to get over for AI is it being able to create empathy in animation.” Animators are not going to be replaced any time soon by machines, but the required skillset has changed somewhat, according to Fountain. “Because there’s so much demand for content, as an artist you have to be able to do so many more different things stylistically, which wasn’t always the case. Also, because there are so many start-up companies, your technical generalist knowledge is way more valuable now.” Streaming has provided a platform for projects that are not strictly family entertainment. “Streaming has eclipsed everything right now,” states Fountain. “The Netflix release of Arcane is another bit of proof that people are craving adult-based animation, because the production value of that series is amazing. The Disney-Pixar model has had too much of a grip for too long.”
Different animation styles and techniques are being melded together. Comments Fountain, “Even in VR, we used 2D-animated effects in our engine at Baobab Studios. It’s the same thing that Arcane is doing by combining CG animation, 2D effects and rough 2D-composited motion graphics. Something like Love, Death + Robots has so many new techniques and combinations of things that are sometimes hit and miss. I am so glad that people are shooting for those things because it’s making the artform better.”
The Big Fix Up is an augmented and mixed reality adventure starring Wallace and Gromit. (Image courtesy of Aardman Animations and Fictioneers Ltd.)
The animation style of Namoo was dictated by Korean filmmaker Erick Oh and the story he wanted to tell. (Image courtesy of Baobab Studios)
The VR rhythm game Beat Saber involves slicing through cubes to the beat of popular hits from artists like Billie Eilish, available in DLC releases. Beat Saber has sold more than 4 million copies across all VR platforms and earned more than $100 million in total revenue. (Image courtesy of Beat Games and Oculus Studios/Meta)
VR is going mainstream next year. VR is going nowhere. AR will be bigger than VR.
There is no consensus on where virtual reality and augmented reality are headed and how soon they will get there. But although the virtual reality and augmented reality platforms are still far from mass acceptance, certain positive signs indicate that they really will become large, viable businesses, growing steadily over the next several years.
Tuong H. Nguyen, Senior Principal Analyst for Gartner, Inc., comments, “AR and VR have been hyped as being ready for widespread adoption for a long time, but the technology, use cases, content and ecosystem readiness have yet to live up to the hype.” Nguyen believes that VR in particular will go mainstream when it has three Cs – content, convenience and control. He notes, “While we’ve made progress on all these fronts, we’re still far from reaching the point where each of those aspects is sufficiently mature to make VR go mainstream. It will become mainstream, but I don’t expect it to happen until five to 10 years from now.”
Others are pessimistic that VR will ever become mainstream. “The answer is never. Sorry. Here’s why. People don’t like wearing stuff on their face and getting sick doing it, and having to pay a lot of money for the privilege,” says Jon Peddie, President of Jon Peddie Research and author of the book Augmented Reality: Where We Will All Live. “The VR market has bifurcated into industrial and scientific – where it started in the 1990s – and consumer. The consumer portion is a small group of gamers and a few – very few – people who watch 360 videos.”
On the other hand, in the opinion of Maze Theory CEO Ian Hambleton, the point has passed for people to doubt VR’s future. “With over 10 million active headsets on Oculus Quest sold now, it’s an active ecosystem. The 10 million unit number is often cited as a crucial stepping point.” Maze Theory developed the VR title Doctor Who: The Edge of Time.
You may find yourself in a digital living room using a HP Reverb G2 VR headset, which boasts a resolution of 2,160 x 2,160 pixels per eye and a 114-degree field of view. (Image courtesy of Hewlett Packard)
In November, Qualcomm CEO Cristiano Amon announced at the company’s 2021 investor day that Meta had sold 10 million Oculus Quest 2 headsets worldwide (Qualcomm’s Snapdragon XR2 chipset powers the Quest 2). A Qualcomm spokesperson later clarified that the number wasn’t meant to be official and came from market-size estimates from industry analysts. But as Qualcomm obviously knows how many Snapdragon XR2 chips it has sold to Meta, the cat seemed to be out of the bag.
Meta’s Oculus VR app marked another key milestone at the end of 2021, when it was the most popular download on Apple’s App Store on Christmas Day. The Oculus app beat out long-standing leaders like TikTok, YouTube, Snapchat and Instagram for having the most downloads.
And the category is bigger than the Quest 2. There are also the Sony PlayStation VR, HP Reverb G2, Valve Index, HTC Vive Pro 2 and HTC Vive Cosmos, among other headsets. And PlayStation’s Next Generation VR (NGVR) is joining the mix.
Research firm Statista estimates that the total cumulative installed base of VR headsets worldwide reached 16.4 million units in 2021 and that the cumulative installed base will surpass 34 million in 2024. And Statista predicts the global VR market size will grow from $5 billion in 2021 to $12 billion by 2024. Another firm, Reportlinker.com, foresees 62 million units shipped by 2026.
Hambleton thinks the launch of the next-generation Sony PlayStation VR (sometimes called NGVR) will give a major boost to VR. “[It’s] really important. NGVR will be huge. That’s our prediction. It’s got some great new features and sorted out many of the issues of the previous headset, including inside-out tracking and much better controllers. So long as they ensure there’s a strong pipeline of content for NGVR, we think it will do really well.” The latter support is likely – the current PlayStation VR has released over 500 VR games and experiences since the format’s debut in October 2016.
Hewlett Packard’s HP Reverb G2 VR headset, developed in collaboration with Windows and Valve. (Image courtesy of Hewlett Packard)
Peaky Blinders: The King’s Ransom is a narrative-driven VR adventure developed and published by Maze Theory. (Image courtesy of Maze Theory)
Hewlett Packard’s HP Reverb G2 VR headset with handheld controllers. (Image courtesy of Hewlett Packard)
A “home environment” view inside the standalone Oculus (now Meta) Quest 2 headset. More than 10 million Quest 2s had been sold as of November, according to estimates. (Image courtesy of Oculus Studios and Facebook/Meta)
Another indication of a growing market came when Oculus announced in February 2021 that the rhythm game Beat Saber had sold over four million copies across all platforms and over 40 million songs from paid DLCs. In an October Oculus blog, it was revealed that Beat Saber had surpassed $100 million in gross lifetime revenue on the Quest platform alone.
In February 2021, Facebook announced that more than 60 VR games available for Oculus Quest and Quest 2 had garnered over $1 million since the beginning of 2020, with six topping $10 million, including Skydance Interactive’s The Walking Dead: Saints and Sinners, released on Oculus in January 2020. The latter title has grossed more than $50 million across all platforms, it was announced by Skydance late last year.
“VR is definitely at an inflection point. It’s starting to look like the early PC evolution, which expanded from just hardcore enthusiasts and tinkerers and hobbyists to everyday use for a whole lot of people. That’s happening with VR now — people beyond the initial core believers are buying headsets and making it a regular part of their lives,” says Johanna Peace, Manager of Technology Communications at Meta, formerly Facebook.
She continues, “A lot of that is thanks to Quest 2. Before Quest 2, there hadn’t been a high-resolution, all-in-one form factor headset at that price point yet, so when we built it, people really responded. A big part of this is because the headset is so intuitive and approachable. With a small and portable form factor, no wires or external sensors, anyone can pick it up and in seconds they’ll be immersed in a VR experience. That simplicity is incredibly powerful, and it removes big barriers to adopting VR.” She adds, “Quest 2 sales are strong and have surpassed our expectations, and we’re thrilled to see the community’s response to Quest 2.”
Peace notes that other genres growing in popularity in VR include fitness/wellness/meditation such as Supernatural and FitXR, multiplayer/social games like POPULATION: ONE and adventure games like Star Wars: Tales from the Galaxy’s Edge.
Vicki Dobbs Beck, ILMxLAB Vice President of Immersive Content Innovation, believes that VR has already begun its breakout and will continue to be adopted by a more mainstream audience given the headsets’ (such as Meta Quest 2) accessibility and ease of use. She comments, “In addition to a robust array of premium game titles, new content categories are further helping to drive growth.” ILMxLAB has shown its interest in the format with its production of the Star Wars: Tales from the Galaxy’s Edge interactive VR game titles, compatible with the Oculus Quest systems.
Beck also sees social VR sites having a positive effect on the acceptance of VR. She comments, “Against the backdrop of the pandemic and the desire to reconnect across geographies, we’ve seen a rise in engagement through social VR sites like VRChat and AltspaceVR. Whether to experience immersive theater, remote viewing parties, engage in collaboration or just ‘be’ with friends, I expect such use will meaningfully increase in the year ahead.” Other popular social VR sites include Rec Room, Bigscreen VR, and Meta’s Horizon Home and Horizon Worlds.
The Metaverse, a predicted global network of immersive virtual worlds, is expected to boost virtual reality, one of its key components. “VR’s greatest strengths are the power of ‘being there’ and the power of ‘connection.’ While there is neither a single definition of the Metaverse nor a universal strategy for engagement, I believe that VR will be one of the most compelling ways to explore and experience new worlds [and] emerging stories and establish relationships with characters,” says Beck.
Peace adds, “VR will be one of many entry points into the Metaverse, similar to how phones, laptops and other devices are entry points to the Internet today. The Metaverse won’t happen overnight, and it will take years to be fully realized. But VR today shows a glimpse of the immersive, social experiences that can be possible in the Metaverse, and these experiences will continue to develop as VR hardware advances and as the building blocks of the Metaverse are built.”
Augmented Reality has also not yet hit the mainstream, but it has plenty of believers, as it only requires special glasses or goggles, not headsets. Peddie comments, “AR has the biggest potential long-term. AR properly done will change our lives – for the better. When done right, it will be like wearing sunglasses or normal corrective lenses. It won’t be conspicuous or obnoxious, and it won’t take you out of the now – it expands it. AR has nothing in common with VR.”
Participating actively with the Oculus (now Meta) Quest 2 headset. The Quest 2 is credited with taking virtual reality closer to being a mainstream business. (Image courtesy of Oculus Studios and Facebook/Meta)
A Dalek from Doctor Who: The Edge of Time, a VR adventure developed by Maze Theory and published by Playstack. (Image courtesy of the BBC, Maze Theory and Playstack)
The Valve Index is a tethered high-end VR system, which comes with headset, hand controllers and base stations, and connects to your PC. (Image courtesy of Valve)
Skydance Interactive’s The Walking Dead: Saints and Sinners was one of the best-selling VR game titles as of late 2021. (Image courtesy of Skydance Interactive)
“With over 10 million active headsets on Oculus Quest sold now, it’s an active ecosystem. The 10 million unit number is often cited as a crucial stepping point.”
—Ian Hambleton, CEO, Maze Theory
Nguyen comments, “AR continues to mature. Much of the maturation and adoption are driven by the enterprise frontline work of AR. Companies like Snap and Niantic, as well as use cases like virtual try-on, and Google AR dinosaurs, animals and [Google] Maps arrow [Live View], have raised the profile for consumer AR, but we’re still far from mainstream.”
Beck notes that while there is growing interest in AR and some novel applications, the pivotal shift will come with the introduction of compelling AR glasses. She comments, “The kinds of experiences we can create when people do not have to hold a phone or tablet could be truly transformational. A key will be the seamless blend of our digital and physical realities.”
Mobile AR is already widely used on smartphones, tablets and other mobile devices. The most notable example is the Niantic game Pokémon Go, developed and published in collaboration with Nintendo and The Pokémon Company for iOS and Android devices. Pokémon Go has over 150 million worldwide active users (its peak was 232 million users in 2016) and has passed one billion downloads, according to Niantic. Pokémon Go’s in-app purchases account for a large proportion of consumer mobile AR spending, according to Statista, which predicts that the number of mobile AR users will grow from 800 million in 2021 to 1.7 billion by 2024.
Zombies go boom in Skydance Interactive’s The Walking Dead: Saints and Sinners, which has grossed more than $50 million across all VR platforms, according to a Skydance announcement in October. (Image courtesy of Skydance Interactive)
Microsoft HoloLens 2, Magic Leap One, Google Glass Enterprise Edition 2, Ray-Ban Stories and Vuzix Blade are examples of AR glasses on the market. Apple is expected to launch either AR or MR (mixed reality) glasses by early 2023. Statista forecasts AR glasses sales rising from 410,000 units in 2021 to 3.9 million units in 2024. The firm predicts that enterprise spending on AR glasses will rise from $2 billion in 2021 to almost $12 billion in 2024.
Nguyen concludes, “AR and VR will continue to see forward momentum. The pace and trajectory of AR and VR haven’t changed. The difference will be the level of hype and the mismatch between that hype and the reality. Regardless, it’ll take five to 10 years.”
12TH ANNUAL VES AWARDS
Wednesday, February 12, 2014 Beverly Hilton Hotel
Beverly Hills, CA
Comedian Patton Oswalt served as host to the more than 1,000 guests gathered at the Beverly Hilton to celebrate VFX talent in 24 awards categories. The teams from Gravity, Frozen, Game of Thrones and PETA led the wins in their respective categories.
Academy Award winner Sandra Bullock made a crowd‐pleasing previously unannounced presentation to her Gravity director, Academy Award nominee Alfonso Cuarón, recipient of the VES Visionary Award. Academy Award winning visual effects pioneer John Dykstra was presented with the VES Lifetime Achievement Award by previous VES Méliès Award winner Doug Trumbull.
Visionary Award Alfonso Cuarón Awarded for uniquely and consistently employing the art and science of visual effects to foster imagination and ignite future discoveries by way of artistry, invention and groundbreaking work.
Lifetime Achievement Award John Dykstra Awarded for significant and lasting contributions to the art and science of the visual effects industry by way of vision, artistry, invention and innovation.
Below is the complete list of Winners and Nominees for the 12th Annual VES Awards. A sortable list for ALL years of VES Award winners / nominees can be found on the Previous VES Awards page. All archival viewing materials are cleared for viewing by logged-in VES members behind the VES website firewall. For more information, please review the VES Awards Rules & Procedures, Section 14: Ownership & Clearances here.
Please click on the category to reveal the nominees and winners
This award is to honor the overall achievement of the visual effects within a live action motion picture where the visual effects are a visible, essential, and integral part of the story and play a principal and active role in the motion picture. A rule of thumb for defining whether a motion picture would be considered effects-driven would be to ask if the story could be told without the active participation of the VFX (including Special Effects). On the whole, the VFX in an effects-driven film would be easily identifiable by the viewing public and professionals working in the VFX field.
Fully animated films are not eligible in this category.
Iron Man 3
Star Trek Into Darkness
The Hobbit: The Desolation of Smaug
This award is to honor the overall achievement of the visual effects within a live action motion picture where the visual effects play a supporting, minor or background role in the telling of the story. Supporting visual effects, when taken as a whole, may help create the setting, environment, or mood of an entire film, but are generally intended to be subtle or invisible to the lay viewer. They do not consist of a significant number of CG characters, science fiction or fantasy elements, and other highly visible effects that one would expect to see in a visual effects-driven or “tent pole” film.
Effects-driven films may not enter their “invisible” effects in this category, and animated films are not eligible.
The Great Gatsby
The Lone Ranger (Winner)
The Secret Life of Walter Mitty
The Wolf of Wall Street
White House Down
This award is to honor the overall achievement of the visual effects that play a supporting or background role within a single episode of a broadcast series, miniseries, made-for-television movie, or special wherein the visual effects are not necessarily essential to the telling of the story in the way that the effects of an effects-driven broadcast program are. Supporting visual effects, when taken as a whole, may help create the setting, environment, or mood of an entire program, and are generally intended to be invisible to the lay viewer. They do not consist of a significant number of CG characters, science fiction or fantasy elements, and other highly visible effects that one would expect to see in a visual effects driven broadcast program.
Da Vinci’s Demons: The Lovers
Hawaii Five-0: Ho’onani Makuakane
Mob City: A Guy Walks Into a Bar
Moonfleet: Episode 2
The Borgias: Relics
The award is to honor the overall achievement of the visual effects within an entire Special Venue project. Special Venues are defined as installations specifically set up to project large-format films (e.g. IMAX or OMNIMAX theaters), theme park theaters that may include a motion-based ride, museums, World Fairs, and similar venues.
To be eligible, a Special Venue project must have been exhibited publicly:
In a commercial venue for a paid admission, which may include the general admission to a theme park or special venue theater;
For a minimum period of one week on a regular daily schedule; and
Premiered in the current awards year in a Special Venue theater as defined above.
The following are not eligible in this category, regardless of the material’s original capture format:
Special purpose events such as trade shows and conventions;
Video material generally referred to as “pre-show” material;
Repurposed films, i.e. projects initially intended for the theatrical market but which have been blown up for exhibition in large-format Special Venue theaters;
Projects that were created as conventional 2D theatrical presentations but have been repurposed to stereographic 3D;
Any 2D or stereographic 3D feature motion picture that either premiered first, or simultaneously, in any regular movie theater or in any broadcast medium;
Any project that runs for an equal or greater amount of time in any regular movie theater or in any broadcast medium; and
Movies intended for simultaneous distribution in both Special Venue and normal movie theaters. The intent of this category is to honor those projects made specifically for the Special Venue market.
Hayden Planetarium’s Dark Universe
Mysteries of the Unseen World
Space Shuttle Atlantis (Winner)
SpongeBob SquarePants 4D: The Great Jelly Rescue!
Michael “Oz” Smith
This award is to honor the overall achievement of the animation within an entire animated motion picture. The animation may be created by traditional cel animation, computer animation, and/or stop motion, as long as it meets the definitions of Animation and Animated Project as stated in the Appendix of these Rules & Procedures. The vocal performance of characters may be taken into consideration along with the visual qualities in evaluating the overall effectiveness of the animation. Title sequences are not eligible in this category.
Cloudy With a Chance of Meatballs 2
Despicable Me 2
Peter Del Vecho
Lino Di Salvo
This award is to honor the overall achievement in a single animated character in a live action motion picture. The character may have been created by any technique or combination of techniques, including animatronics, as long as it meets the definition of Animation as stated in the Glossary of these Rules & Procedures.
Title sequences are not eligible in this category.
Oz the Great and Powerful: China Girl
Pacific Rim: Kaiju – Leatherback
The Hobbit: The Desolation of Smaug: Smaug (Winner)
This award is to honor the overall achievement in a single animated character in an animated motion picture. The character may have been created by any technique or combination of techniques, including animatronics, as long as it meets the definition of Animation as stated in the Glossary.
Title sequences are not eligible in this category.
Epic: Mary Katherine
Dylan C. Maxwell
Sang Jun Lee
Frozen: Bringing the Snow Queen to Life (Winner)
The Croods: Eep
Won Young Byun
Chris De St. Jeor
This award is to honor the overall achievement in a single animated character in a broadcast program or commercial. The character may have been created by any technique or combination of techniques, including animatronics, as long as it meets the definition of Animation as stated in the Glossary. The character may or may not be photorealistic.
Title sequences are not eligible in this category.
Game of Thrones: Raising the Dragons
PETA: 98% Human (Winner)
Three, The Pony
Tim Van Hussen
Toy Story of Terror
Kiki Mei Kee Poh
This award is to honor the overall achievement of a single created environment in a live action motion picture that best creates an illusion of setting for the story being told. Created environments are defined as either completely artificial environments, or the enhancement of an existing practical set or location through the addition of elements not present during photography. The environment may occur more than once in the project and under different conditions, but must be the same environment, created by the exact same team.
This category judges not only the techniques for creating the environment, but also their integration with any practical plate photography. Before & Afters must show the integration of the multiple elements used to create the environment.
Stereo extractions of environments that do not contain any other significant enhancements or fully animated productions are not eligible in this category. For practical purposes, the environment should be a single setting within the story, and not, for example, all locations within an entire city.
Iron Man 3: Shipyard
Pacific Rim: Virtual Hong Kong
This award is to honor the overall achievement of a single created environment in an animated motion picture that best creates an illusion of setting for the story being told. The environment may occur more than once in the project and under different conditions, but must be the same environment, created by the exact same team.
Before & Afters must show the integration of the multiple elements used to create the environment.
Stereo extractions of environments that do not contain any other significant enhancements are not eligible in this category. For practical purposes, the environment should be a single setting within the story, and not, for example, all locations within an entire city.
Epic: Pod Patch
Frozen: Elsa’s Ice Palace (Winner)
Virgilio John Aquino
Monsters University: Campus
The Croods: The Maze
This award is to honor the overall achievement of a single created environment in a live action broadcast program that best creates an illusion of setting for the story being told. Created environments are defined as either completely artificial environments, or the enhancement of an existing practical set or location through the addition of elements not present during photography. The environment may occur more than once in the project and under different conditions, but must be the same environment, created by the exact same team.
This category judges not only the techniques for creating the environment, but also their integration with any practical plate photography. Before & Afters must show the integration of the multiple elements used to create the environment.
Stereo extractions of environments that do not contain any other significant enhancements, or fully animated productions, are not eligible in this category. For practical purposes, the environment should be a single setting within the story, and not, for example, all locations within an entire city.
Game of Thrones: The Climb (Winner)
Hell On Wheels: Big Bad Wolf
Matt Von Brock
Liberty Group Limited: Answer
This award honors the art of cinematography within the digital realm of a live action feature motion picture. Digital Cinematography is defined as the outstanding use of traditional cinematography techniques to communicate story and mood in a live action feature film, such as light direction, color, camera framing or movement, and depth of field within a primarily CG scene. It recognizes the combined collaborative work of pre-vis and layout artists, the lighting/CG supervisor, shot lighters, animators, and similar artists within this creative and interpretive process. Judges are to consider the use of light and camera in the scene, but are NOT judging the details of the models or environments that are being lit (these should compete in the Created Environment category). In the case of a live action movie, the film’s Director of Photography may be included among the entrants if, and ONLY if, he/she had a significant hands-on role in the final look of the CG elements.
Iron Man 3
Man of Steel
Pacific Rim: Hong Kong Ocean Brawl
The Hobbit: The Desolation of Smaug
Thelvin Tico Cabezas
This award honors the art of virtual cinematography within the digital realm of a live action broadcast program or commercial. Digital Cinematography is defined as the outstanding use of traditional cinematography techniques to communicate story and mood in a broadcast program or commercial, such as light direction, color, camera framing or movement, and depth of field within a primarily CG scene. It recognizes the combined collaborative work of pre-vis and layout artists, the lighting/CG supervisor, shot lighters, animators, and similar artists within this creative and interpretive process. Judges are to consider the use of light and camera in the scene, but are NOT judging the details of the models or environments that are being lit (these should compete in the Created Environment category). In the case of a live action program, the program’s Director of Photography may be included among the entrants if, and only if, he/she had a significant hands-on role in the final look of the CG elements.
Mad Max: Ethos
Murdered: Soul Suspect
Rafael Francisco Colón
Qualcomm Snapdragon: A Dragon is Coming
This award honors the art of cinematography within the digital realm of a live action broadcast program or commercial. Digital Cinematography is defined as the outstanding use of traditional cinematography techniques to communicate story and mood in a broadcast program or commercial (live action or animated), such as light direction, color, camera framing or movement, and depth of field within a primarily CG scene. It recognizes the combined collaborative work of pre-vis and layout artists, the lighting/CG supervisor, shot lighters, animators, and similar artists within this creative and interpretive process. Judges are to consider the use of light and camera in the scene, but are NOT judging the details of the models or environments that are being lit (these should compete in the Created Environment category). In the case of a live action program, the program’s Director of Photography may be included among the entrants if, and ONLY if, he/she had a significant hands-on role in the final look of the CG elements.
Gravity: ISS Exterior (Winner)
Star Trek Into Darkness
The Lone Ranger: Colby Locomotive
This award is to honor outstanding achievement in compositing multiple elements into a final visual effect shot or group of shots in a live action feature motion picture. This category is for a body of work created for a single motion picture by an individual artist or team of artists.
Multiple entries from the same project are eligible provided the compositing teams are 100% different and the shots being submitted are completely different. Title sequences are eligible as long as:
They are submitted in textless form in order not to conflict with any other awards rule; and
They are part of the storytelling and are not a specially designed separate animated title sequence in a live action project.
Animated films are not eligible in this category.
Iron Man 3: Barrel of Monkeys
Justin Van Der Lek
Iron Man 3: House Attack
The Hobbit: The Desolation of Smaug