By OLIVER WEBB

Rodeo FX did about 580 shots in total for The Rings of Power. (Photo: Ben Rothstein. Image courtesy of Prime Video)
At the 74th Primetime Emmys, The Book of Boba Fett scooped Outstanding Special Visual Effects in a Season or a Movie, with The Mandalorian taking home the award the previous year. It remains to be seen whether the much-anticipated third season of The Mandalorian will take home the award after successive wins for the Star Wars universe. In the Outstanding Special Visual Effects in a Single Episode category, Squid Game won for the “VIPS” episode. With several new productions and more prequels and sequels from acclaimed trilogies and series such as The Lord of the Rings, Vikings and Game of Thrones, as well as an Addams Family spin-off, it has been an excellent year for visual effects. It will undoubtedly be a remarkably close race at this year’s awards.
At the 2023 VES Awards, The Rings of Power garnered three awards for the episodes “Adar” and “Udûn.” “Rodeo did 580 shots on the series,” says Visual Effects Supervisor Ara Khanikian. “I associate Lord of the Rings with epic landscapes, so having a hand in that was really interesting for us, in terms of classic matte paintings and CG environments. We had a lot of fun with the Harfoots, with the scale comps and just kind of playing with the scale of the smaller species versus human size. We had all the work surrounding the Stranger as well. That started with him crash-landing in Middle-earth, where he creates a crater of fire, and that involved some very complex work. We had a lot of very complex effects of the Stranger not being in control of his powers and just seeing how his powers interact with the environment for the first time.”
“The most challenging sequence in terms of scope was definitely the one from the final episode with the battle between the mystics because of how much had to happen in it,” adds Rodeo FX Effects Supervisor Nathan Arbuckle. “We had all of the nature effects such as leaves and trees and power from the Stranger. We also had all of the fire that the mystics were going to put in there. We had all the celestial energy, and we had to have the wraith look as well when the Stranger hits them with the celestial energy. The final episode involved a full CG forest build and doing lots of simulations for all of the CG plants as well as the effects between the Stranger and the mystics. Then, of course, having the mystics get blasted away and turned into moths. We also did the Morgul blade effect, the glowing sword that builds out of a dark energy.”
Thing and Jenna Ortega as Wednesday Addams in Wednesday. (Image courtesy of Netflix)
The Netflix series Wednesday features an array of visual effects from fantastical monsters to supernatural abilities. The Rocket Science visual effects team delivered more than 300 shots across the series, with the majority of the work focused on Thing, along with Wednesday’s pet scorpion, digital doubles, dynamic CG fire, explosions and FX simulations, plus additional supporting visual effects. “Tom Turnbull, Rocket Science President/VFX Supervisor, was very clear that the visual effects needed to be grounded in reality while set in this fantastical world,” says Visual Effects Supervisor John Coldrick. “As good as the miming of a disembodied hand was, there were shots where issues such as incorrect center of gravity and appearances of hovering over the ground would crop up. While the motion of a hand with no arm was magical, it still had to obey the laws of physics. This would sometimes involve altering the fingers so they connected with the ground during runs and gallops, removing sliding and altering axes of rotation. The goal was to make it difficult to imagine a ghostly arm on the end of that wrist, and make Thing live in a natural way.”
“The biggest challenge was to seamlessly integrate Thing into each shot, regardless of any obstacles on location, set and time of day,” Coldrick adds. “The disembodied hand needed to interact with the surrounding environment as well as with other actors, imparting its own personality and performance.
Union VFX delivered over 300 shots for The Sandman. (Image courtesy of Netflix)
Vikings: Valhalla could find itself among the nominees once again after a strong start to the second season. (Photo: Bernard Walsh. Image courtesy of Netflix)
Rachel Weisz portrays twin sisters in Dead Ringers, based on David Cronenberg’s 1988 horror film. (Image courtesy of Prime Video)
The Last of Us masterfully brings the video game to life. (Photo: Liane Hentscher. Image courtesy of HBO)
Angus Bickerton and his visual effects team picked up a nomination at the 21st VES Awards for their work on House of the Dragon. (Image courtesy of HBO)
“Production landed on a combination of hand-acting from Victor Dorobantu dressed in a blue suit, and prosthetics to add the stump to his real hand, and the Rocket Science visual effects team seamlessly integrating Thing into frame. The RS VFX team created a partial rig consisting of the wrist and the hand down to the finger base. This was tracked onto the stunt performer in the plate. We often had to tweak the wrist performance to create the illusion of a bodiless hand.”
“Realizing Thing was more challenging than it looks,” explains Kayden Anderson, Rocket Science Visual Effects Producer. “For many of the complex shots, we had to troubleshoot to devise unique solutions. Maintaining the actor’s hand performance and recapturing the environment with fidelity was not a one-size-fits-all scenario.”
MARZ Visual Effects Supervisor Ed Englander also worked on the series. “We did a couple of sequences of shots with Uncle Fester, when he first runs into Wednesday out in the forest and she tries to draw a sword on him and he shocks her,” Englander remarks. “There were a couple of shots when Fester is resuscitating Thing, and we did some of the electricity effects there as well. One of the larger shots we had to do was an environment shot, which was an aerial drone shot of Jericho. It was a wide shot of the whole town and you can see across the river. We had to fill in the backsides of many of the buildings on the main street because outside of this one shot, you never see the whole town. It was all standard fronts and facades, and we had to build out the backs of those and do a bit of aesthetic modification on some of the edges of the town. There were houses in rural residential areas that extended out past the main street which we had to add as well. They wanted it to feel like one of those fall postcards for anywhere out in Vermont. I believe the entire production was filmed in Romania, but we had a lot of reference footage, and you can’t do a Google search of Vermont without autumn trees showing up everywhere. There were plenty of good bits of material to flesh that out.”
Chris Sumpter as Jake in Episode 104 of The Midnight Club. (Photo: Eike Schroter. Images courtesy of Netflix)
The boom age of the 1880s in New York is stylishly captured in The Gilded Age. (Photo: Alison Cohen Rosa. Image courtesy of HBO)
The Rings of Power took home three awards at the 2023 VES Awards. (Photo: Ben Rothstein. Image courtesy of Prime Video)
One of the biggest challenges for Rocket Science VFX on Wednesday was seamlessly integrating Thing into each shot. (Image courtesy of Netflix)
The Sandman is a strong contender for a nomination at this year’s Emmys. (Image courtesy of Netflix)
Another strong contender is the fantasy drama series The Sandman. Based on Neil Gaiman’s original comic book, The Sandman follows Dream as he embarks on a journey after years of imprisonment in order to regain his power. “The team and I at Union VFX worked on the show for over a year and delivered over 300 shots spread across all 11 episodes of Season 1,” explains Visual Effects Supervisor Dillan Nicholls. “Bringing this rich, complex and much-loved series of comics to the screen was always going to be a challenge and open to interpretation. We read the comics and tried to immerse ourselves in the world of The Sandman as much as possible in advance of joining the project. Initial discussions focused on some key sequences and early concepts from the Warner Bros. team, establishing what kind of aesthetic we were looking for and the level of realism of different sequences as the more abstract, surreal world of ‘the dreaming’ crosses over into the real world.”
“Some of the work had a very open brief, particularly for sequences taking place in the world of dreams, or ‘the dreaming,’” Nicholls adds. “We were encouraged to be creative, using reference from the comics as a starting point, but with freedom to experiment with different techniques and aesthetics to achieve surreal and abstract results. On the other hand, some of the work was very much grounded in reality and required fairly traditional ‘invisible’ effects work such as set extensions and greenscreens. Often in the show, the two worlds of the dreaming and the waking meet and we would create fairly traditional, realistic-looking VFX shots, but with something not quite right, a surreal twist.”
Discussing the most challenging effect to realize on the series, Nicholls notes that the views across the Thames by the tavern in Episode 6 were shot against greenscreens and required extensive full CG/DMP environments and a full CG tavern. “These scenes took place in the ‘real world’ in London but across dramatically different time periods spanning 600 years, and required a lot of research as well as an evolving CG build of the tavern through the ages. We needed to maintain a balance between making it recognizably the same location – a crucial story point – while also showing how that area of London evolved over hundreds of years, from green fields to present-day Canary Wharf,” he says.
Five Days at Memorial won Outstanding Supporting Visual Effects in a Photoreal Episode at this year’s VES Awards. “In total, UPP delivered 265 shots for this project,” says Visual Effects Supervisor Viktor Muller. “As on similar projects of this kind, it was desirable that the visual effects weren’t apparent at first sight. If a shot revealed itself as a VFX one, it would, in fact, be incorrectly done. For us, the biggest challenge was creating the New Orleans Superdome, where we were depicting the battle with the hurricane. The most demanding aspect of this shot was figuring out how to approach it so that it looked as real as possible but simultaneously remained visually attractive and interesting for the audience. In other words, since based on real references hardly anything would have been visible, we needed to find the balance between making sure that the viewers were able to see something and keeping the shot realistic.”
Set nearly 200 years before the events of Game of Thrones, House of the Dragon depicts the events leading up to the Dance of the Dragons. Visual Effects Supervisor Angus Bickerton and his visual effects team picked up a nomination at the 21st VES Awards for their work on “The Black Queen” episode. By contrast, Vikings: Valhalla, now in its second season, is set 100 years after the events of Vikings. Valhalla was nominated at last year’s awards for Outstanding Visual Effects in a Single Episode for “Bridge,” with the original series having been awarded Outstanding Special Visual Effects in a Supporting Role at the 72nd Primetime Emmy Awards. Valhalla could find itself among the nominees once again after a strong start to the second season.
New to the mix is the post-apocalyptic drama The Last of Us, which masterfully brings the video game to life. Adapted by the game’s creator, Neil Druckmann, and Chernobyl creator Craig Mazin, The Last of Us follows the hardened, middle-aged Joel, who is tasked with escorting 14-year-old Ellie across a treacherous and barren America in what may be the final hope for the survival of humanity. Another potential newcomer in the running for a nomination is Amazon Prime’s Dead Ringers. Based on David Cronenberg’s 1988 horror classic, Dead Ringers is centered around twin gynecologists (portrayed by Rachel Weisz) in a gender-flipped version of the original film.
When the Primetime Emmy winners are announced on September 18, it will certainly be a close race. Nonetheless, the visual effects work over the course of the last year has been nothing short of remarkable. Visual effects have played a central role in some of the biggest series released this year, and each of the series mentioned can be extremely proud of its groundbreaking and beautifully crafted work.
Chloë Grace Moretz in The Peripheral, based on a William Gibson novel. (Photo: Sophie Mutevelian. Image courtesy of Prime Video)
Paz Vega as Ava Mercer, left, and Giancarlo Esposito as Leo Pap in Episode “White” of Kaleidoscope. (Photo: David Scott Holloway. Image courtesy of Netflix)
By CHRIS McGOWAN
Method Studios and MPC did some heavy lifting on Top Gun: Maverick, assisted by Lola VFX, BLIND LTD, Intelligent Species and Gentle Giant Studios. (Image courtesy of Paramount Pictures and ViacomCBS Inc.)
The VFX boom goes on, and it is a shared surge. There has been continued growth in the visual effects business, with vendors from many different countries working together on movies and series. And often the individual VFX companies themselves have multiple facilities located around the world, including ILM, Wētā FX, DNEG, BOT VFX, Technicolor Creative Services, Streamland Media, Pixomondo, Digital Domain, Outpost VFX and Framestore. The collaborations range across North America, Europe, Australia, New Zealand and South and East Asia, with more participation on the horizon from emergent visual effects houses in Latin America and Africa as well. Consequently, great opportunities and new challenges have emerged with the increasing cooperation between geographically distant studios.
Says Jeanie King, ILM’s Vice President of Production, “The entertainment industry is far more globalized than it has ever been in the past, and it is to everyone’s advantage to spread the work. It gives us all more capacity to get all the work done. Also, we are able to access more talent in different regions, which benefits everyone. Effects vendors and clients alike need to be more flexible and organized now because everyone is spread [over] more time zones.”
King adds, “Due to the high volume of projects requiring VFX throughout the industry, studios have had to spread the work around. Many projects have had 10 to 20 VFX studios involved. Vendors need to have a diversified portfolio as do the studios/clients in order to protect themselves for their deliveries.”
The Indian Hindi-language fantasy-adventure film Brahmāstra: Part One – Shiva is an example of an “in-house two-vendor” solution at scale. In this case, multiple facilities of sister companies ReDefine and DNEG delivered over 4,000 VFX shots for the epic film. (Image courtesy of DNEG and Dharma Productions)
“On Marvel Studios’ Doctor Strange in the Multiverse of Madness, we worked with multiple vendors sharing assets, environments and FX in order to complete the work on shared sequences. This was also true on Marvel Studios’ Black Panther: Wakanda Forever,” remarks King, who adds, “Most recently, on Avatar: The Way of Water, ILM collaborated with Wētā FX, which had created the majority of assets for the show. So, we were ingesting and manipulating all of that data into our pipeline so we could work efficiently on the sequences we were contracted to create.”
On The Lord of the Rings: The Rings of Power, ILM and Wētā also collaborated with a stellar group of VFX studios that included Rodeo FX, Method Studios, DNEG, Outpost VFX and Rising Sun Pictures. King sees the movement of the work around the globe as an advantage. “The sharing of work between companies has become easier due to increased standardization in process and technology,” King comments. “Companies have had to evolve as more of the studios engage multiple vendors on one project. Vendors are partnering with each other more than ever before. We all want to get the work done as efficiently as possible. It’s a competitive market out there, and each vendor wants to make sure they are working as productively as possible.”
Using multiple vendors is about resource availability and identifying the right artists for the right work, according to Patrick Davenport, President of Ghost VFX. “We’re very much a global industry now.” He feels that the increasing spread of VFX work around the planet has been an inevitable process. “The industry has been headed in this direction for some time. The work follows artist resources, tax incentives, etc. And now with work from home and hybrid models, it allows artists to work anywhere, anytime.”
Framestore was the lead VFX vendor on Fantastic Beasts: The Secrets of Dumbledore and was joined by Rodeo FX, Digital Domain, Image Engine, One of Us, Raynault VFX, Clear Angle Studios and RISE Visual Effects Studios. (Image courtesy of Warner Bros. Pictures)
Wētā Digital and ILM led the way with The Lord of the Rings: The Rings of Power, collaborating with Rodeo FX, Cause and FX, Method Studios, DNEG, Outpost VFX, The Third Floor, Rising Sun Pictures, Atomic Arts and Cantina Creative. (Image courtesy of Amazon Studios)
Ghost VFX was the main VFX studio for Troll, a fantasy tale of a giant troll terrorizing contemporary Norway, and was joined by Copenhagen Visual, Swiss International, Shortcut VFX, Gimpville and Static VFX Studio. (Image courtesy of Motion Blur and Netflix)
Doctor Strange’s (Benedict Cumberbatch) magic was conjured up with the help of multiple VFX studios. (Image courtesy of Marvel Studios and Walt Disney Pictures)
The planet Vulcan in Star Trek: Strange New Worlds. Unreal Engine and Arnold Rendering are popular tools for ensuring smooth collaboration between vendors. (Image courtesy of CBS Studios, Inc.)
Davenport continues, “From a client perspective, it’s about getting the work done on time, within budget, and mitigating the risk of having all your work with one sole vendor, which may end up struggling to deliver. From our perspective, by having a global studio it allows us to operate 24/7 and identify talent in different locations that are best suited to the work.”
A scenario of multiple VFX houses on bigger shows “can enhance important creative factors that come from having different supervisors and production teams with different strengths and specialisms overseeing different sequences on a show,” affirms Rohan Desai, Managing Director of ReDefine, part of the DNEG group.
There can also be downsides. Desai explains, “It can add overhead in terms of managing multiple vendors. Asset sharing is also a hurdle with increased costs. This can create additional work for vendors when they are required to use assets made by other vendors as there can be duplication of effort. Finally, consistency in look can be an additional challenge as each show has a certain aesthetic, and this needs to be matched by all vendors. Different teams may approach aesthetics differently and this can result in inconsistencies.”
To achieve a high standard across all sequences and studios, “I have found that communication and collaboration is the best way to make this work,” comments Christian Manz, Framestore’s Creative Director, Film. “I have had as many as five companies working on a shot/sequence in the past, and in that instance I brought all of the key supervisors together to discuss approach and kept them communicating with each other. I also find that showing early WIP as soon as possible to the filmmakers and [having] frequent reviews keeps the work on track to a fantastic, consistent final result.”
Some recent multi-vendor projects led by Framestore include Fantastic Beasts: The Secrets of Dumbledore (for which Manz served as Production VFX Supervisor), His Dark Materials Season 3 and 1899 (these were almost entirely Framestore, but involved multi-site work across the firm’s London, Montreal, Mumbai, Vancouver and New York studios), Top Gun: Maverick (completed as Method Studios, now part of Framestore) and The Wheel of Time Season 2.
Jabbar Raisani was a VFX Supervisor on Season 4 of Stranger Things, which he says utilized over two dozen studios from around the world. Raisani comments, “Stranger Things S4 was challenging due to the high shot count, high complexity and short delivery schedule. We made it work by spreading the work over numerous vendors in order to maximize throughput.” The VFX studios involved included Rodeo FX, Important Looking Pirates (ILP), Digital Domain, DNEG, Lola VFX, Crafty Apes and Scanline VFX, among others.
To help studios communicate, says Raisani, “One piece of software we used extensively was SyncSketch, which allowed us to visually communicate notes to vendors in an interactive setting. SyncSketch uses cloud-based technology, and our VFX team collaborated daily using shared Google documents. We also used Evercast for daily remote VFX reviews and PacPost.live for editorial reviews.”
The sharing of assets between VFX studios is often a challenge. “I think it is getting better, but it is still challenging,” comments Niklas Jacobson, VFX Supervisor and Co-Founder of ILP. “Different companies have their own toolsets and workflows, and many companies use proprietary tools. In some cases, there could also be licensing issues due to the use of third-party texture or model services.”
Jacobson notes, “Even with different toolsets, there are some generally accepted techniques for texture channels and shaders in particular that all the big vendors converge towards. With the development of standards like USD and MaterialX, and as they become widely adopted, hopefully sharing will be easier.”
Sharing work can also be quite challenging from the bidding side. Explains Jacobson, “Clients and vendors will want to be as efficient as they can, but in particular sharing hero assets means that one vendor will generally not be able to anticipate the requirements of an asset across other vendors’ sequences. This tends to leave both parties guessing a bit at the planning stage and definitely requires some out-of-the-box thinking.”
ILP has had positive experiences working with its peers and colleagues. Jacobson observes, “We try our best to be good creative partners with everyone we work with, and it’s satisfying to feel that the vast majority of vendors we work with tend to have this stance as well. Stranger Things S4 is a great example – the season finale was such a massive episode and had to be split between vendors. We did a sequence featuring the Demogorgon, a creature which was created by Rodeo and even featured in an earlier episode in the same environment and lighting conditions. Our collaboration with Rodeo was very smooth, and we had no trouble ingesting their creature into our pipeline.”
The visual effects in Black Panther: Wakanda Forever were handled by ILM, Cinesite, RISE Visual Effects Studios, Digital Domain, Wētā FX, Storm Studios, Whiskytree, Scanline VFX, Barnstorm VFX, Mammal Studios, SDFX Studios, Territory Studio, Clear Angle Studios, PixStone Images, SSVFX and Luma Pictures.
(Image courtesy of Marvel Studios and Walt Disney Pictures)
The dreaded Demogorgon for Season 4 of Stranger Things. Among the many VFX providers were Rodeo FX, ILP, Digital Domain, DNEG, Lola VFX, Crafty Apes, Scanline VFX, BOT VFX, Clear Angle Studios, Rogue One VFX, The Resistance VFX, Cadence Effects, Jellyfish Pictures, Alchemy 24 and FutureWorks Media. (Image courtesy of Netflix)
From the Willow series on Disney+. ILM and Hybride supplied the visual effects along with ILP, Image Engine, Luma Pictures, SSVFX, Creative Outpost, Misc Studios, Midas VFX, Ombrium and The Third Floor. (Image courtesy of Disney+)
“The sharing of work between companies has become easier due to increased standardization in process and technology. Companies have had to evolve as more of the studios engage multiple vendors on one project. Vendors are partnering with each other more than ever before. … It’s a competitive market out there, and each vendor wants to make sure they are working as productively as possible.”
—Jeanie King, Vice President of Production, ILM
Pixomondo paved the way with the VFX for Star Trek: Strange New Worlds and has worked on the show with Crafty Apes, Ghost VFX, FX3X (Cinesite), Vineyard VFX, Boxel Studio, Barnstorm VFX and Storm Studios.
(Image courtesy of Pixomondo and CBS Studios, Inc.)
For Doctor Strange in the Multiverse of Madness, ILM worked with Wētā Digital, The Third Floor, Luma Pictures, Trixter, Crafty Apes, Digital Domain, Framestore, Sony Pictures Imageworks, Clear Angle Studios and Spin VFX. (Image courtesy of Marvel Studios and Walt Disney Pictures)
Pixomondo and various other vendors worked together on the Star Trek: Strange New Worlds series. Pixomondo Virtual Production and Visual Effects Supervisor Nathaniel Larouche notes, “Vendors have been able to coordinate their efforts through the use of a variety of software and hardware solutions. At the top of this list is Unreal Engine, an industry-leading 3D game engine with powerful tools for creating realistic, high-fidelity 3D environments. As well as enabling vendors to create detailed and realistic scenes, Unreal also features tools which allow vendors to quickly and easily collaborate on a project in real-time. Furthermore, Unreal allows for cross-platform compatibility across PC and Mac.”
Continues Larouche, “In addition to Unreal Engine, Arnold Rendering is another popular software option among vendors. By using Arnold’s advanced ray-tracing capabilities, 3D artists can achieve incredibly lifelike images without having to spend much time tweaking lighting and textures. Additionally, Arnold supports common image formats such as OpenEXR and HDR (High Dynamic Range), which allow for greater flexibility when it comes to sharing shots between vendors.”
In addition, hardware also plays an important role in helping vendors coordinate their efforts, according to Larouche. “For example,” he says, “computers that are powerful enough to run both Unreal Engine and Arnold Rendering are essential for ensuring smooth collaboration between different teams working on the same project. High-end graphics cards are also beneficial in providing faster rendering times, which can help reduce wasted time spent waiting for updates from other parties involved in the project.”
The Cloud is playing an increasing role in helping VFX studios interact. “The Cloud has had a significant impact on VFX so far; it enables storage and backup of assets in a secure environment that can be accessed by multiple users at any time,” Larouche explains. “It also allows for real-time sharing of files and data between members of a production team, allowing them to work more quickly and efficiently.”
The Cloud offers other advantages as well. Larouche comments, “Cloud-based solutions provide scalability options tailored to specific teams or businesses’ needs. For example, if a vendor’s workload grows unexpectedly over time, they can easily scale up their storage on the Cloud without needing to purchase additional hardware or software licenses.”
Continues Larouche, “Recently emerging technologies in the field are further enhancing our collaborative capabilities when it comes to virtual production pipelines. In particular, cloud-based platforms such as Shotgun allow studios to track progress across all departments in real-time while providing customizable tools like asset management and review functionality that help streamline processes, while maintaining complete visibility into projects at all times. This allows vendors greater control over their workflow while cutting down costs associated with manual processes that tend to slow down progress significantly when attempting larger productions spanning various remote locations.”
States ILM’s King, “Even though pipelines may be different from company to company – because we have had these conversations over the years and work has passed back and forth – we have developed in-house tools to make it easier. Also, due to the remote work situation that many VFX studios are still working in, more people are able to connect via video conferencing, which makes the entire process more efficient, productive and also personal.”
Clients are more open now to vendors talking between themselves than ever before. Comments King, “Because we have been sharing work over the years, conversations have taken place and processes have developed to make ingesting work easier. Also, relationships have formed. Supervisors and artists have worked together at the same facilities and friendships are made, which makes it easier to discuss workflows and have much more helpful, in-depth conversations.”
By TREVOR HOGG
Gulliver Studios was responsible for extending more than 10 environments which included six unique game settings for Squid Game. (Image courtesy of Netflix)
Breaking out in a big way to showcase the digital artistry of the visual effects industry in South Korea was Netflix series Squid Game, which won the 2022 Primetime Emmy for Outstanding Special Visual Effects in a Single Episode and received a VES Awards nomination for Outstanding Supporting Visual Effects in a Photoreal Episode for Episode 107, titled “VIPS.” “The Korean visual effects industry has evolved drastically in the past 10 years from simple comp tasks to complex shots, which involve various 3D skills and technologies,” states Moon Jung Kang, VFX Supervisor at Gulliver Studios. “Previously, Korean filmmakers tried to avoid too much digital augmentation because they were not sure of Korean visual effects quality. But as Korean content has gained global popularity, filmmakers started trying more diverse genres and the Korean visual effects industry also evolved. These days, it’s easy to find heavy visual effects shots in Korean content.”
Kang was a member of the Primetime Emmy Award-winning team for Squid Game as Gulliver Studios was the sole vendor. “We had about 2,000 shots [for nine episodes], and the post-production period was about nine months.” Squid Game deals with human and social problems through classic Korean children’s games. Remarks Kang, “In the earlier stage, there was a concern that these games were unique to Korea and the gameplay was too simple. In general, the visual effects environment work is mostly focused on realism, but in the case of Squid Game it was necessary to express the feeling of a realistic, yet artificially created set at the same time.” Most of the visual research was centered around environments. “In most cases, we research through images and clips from the real world, but in Squid Game we extended our research to classic paintings and illustrations. For the maze environment with a bunch of stairways, for which the basic concept was surrealism, we actually referenced classic surrealism paintings and illustrations a lot during the asset and layout process,” adds Kang.
No practical location could be found for the Moon, so a set and LED wall were utilized for The Silent Sea. (Image courtesy of Netflix)
“In Squid Game, we had more than 10 environment extension issues, not only six unique game environments, but also a cave and an airport,” Kang notes. “Some game environments such as the marbles game town, the playground and the dormitory were large builds, and we extended mostly walls and ceilings. But the rest of the environments, such as the schoolyard, tug-of-war tower, circus tent and maze environments, were heavily extended and reconstructed in CG. The circus tent environment included three different set locations – the main glass bridge, VIP room and floor area – and we had to combine them all in one and seamlessly connect plates, which were shot under different lighting conditions. Furthermore, the director wanted the circus tent lighting to be very dark overall with a hot spotlight on the bridge, but we had plates with huge fill light above the set. Interestingly, the sequence which gave us the hardest time brought us the honor of winning the Emmy Award.”
Every sequence had its own complex elements. “The piggy bank shots gave us the hardest time to execute,” Kang reveals. “It wasn’t because of the complexity of the shot solution, but because of getting the exact look of the piggy bank that the director wanted. It wasn’t just about making the material look realistic, but we also had to emphasize the falling money inside the piggy bank.” The schoolyard sequence in Episode 101 is a personal favorite of Kang’s. “The schoolyard environment was the first space to show the boundary between real and fake space, and the practical set was mostly filled with blue matte. We didn’t have a final concept image, and whether the schoolyard should be treated as indoor space or outdoor space wasn’t decided at that time. In an earlier process, we looked through various options and suggested the indoor space with a big opening on top and walls with a hand-painting touch. The key concept of Squid Game’s environments is the mix of real and fake space.”
The communication tower that team leader Han Yoon-jae (Gong Yoo) climbs and falls from had to be extended in CG. (Image courtesy of Netflix)
Practical gimbals were combined with CG augmentation provided by Westworld to get the required interaction for The Silent Sea. (Image courtesy of Netflix)
Oni: Thunder God’s Tale was to be stop-motion animation but in the end became CG that emulated the characteristics of the original intent. (Image courtesy of Netflix)
Megalis VFX constructed an asset pipeline that consisted of Solaris, USD and Arnold to handle the work required for Oni: Thunder God’s Tale. (Image courtesy of Netflix)
The characters in Oni: Thunder God’s Tale were meant to feel as if they were made out of felt. (Image courtesy of Netflix)
Another major Korean Netflix series is The Silent Sea, which was originally a short film called The Sea of Tranquility by Choi Hang-yong. He subsequently expanded the sci-fi concept into eight episodes starring Bae Doona, Gong Yoo, Lee Joon, Kim Sun-young and Lee Moo-saeng. Key sequences had storyboards and previs. “They were used for planning the crash landing in Episode 101, Yunjae falling in Episode 103 and the appearance of Luna in Episode 105,” states Kim Shin-chul, VFX Supervisor at Westworld. “We used techviz for actual filming, to make a filming plan and use Ncam [virtual production] in the filming of the elevator fall scene while checking the appearance of the Balhae Lunar Research Station through a monitor.” A creative challenge was to convincingly convey what will happen in the future. Notes Kim, “It was fun to create the propellant that is in charge of fuel and how the lander docks with it. The structure of the docking station was created with the idea that it would be easy for passengers and astronauts to board the Moon in the era of relatively easy travel. For the part where they walk on the Moon, it was difficult to interpret the reference material and the actual appearance in a cinematic way.”
No practical location could be found for the Moon. “A set and an LED wall were used,” Kim remarks. “The powerful directional sunlight had to be implemented with only a limited number of lights without scattering of the atmosphere, but the position of the light could not be changed for each camera setup, so an LED wall was used to cover all angles. And the art terrain was filmed by changing only the position of the actor in one setup, using the characteristic rock. Since LED wall shooting is still unfamiliar, we did R&D and tested it together.” The monochrome starkness of the lunar environment contrasts with the dystopian Earth. Adds Kim, “Due to temperature changes, the topography of the sea level changes, and yellow dust and green algae are frequent. Maritime workers lost their jobs and abandoned fishing tools and boats. In the previous concept, we tried to show a corroded image of landmarks around the world, but it was deleted because it was judged to be an excessive expression of other cultures.”
Graphical elements such as numbers were incorporated into the imagery to show the contestants realizing the type of game they are playing in Alice in Borderland. (Image courtesy of Megalis VFX)
Getting the proper performance for the reveal of Luna was critical. “The director thought a lot about the Luna character between an animal and an innocent child,” Kim explains. “To get the desired movement a digital double was used for her first appearance. After that, it could be more relaxed, and the director let the actor act as much as possible. It seems that the writer paid attention to the conflict between characters and the narrative rather than the visual.” Many concepts were discussed for the Balhae Lunar Research Station. Notes Kim, “We decided on the location and width of the passage in detail based on the movements of the characters.” Yunjae falls while attempting to repair the communication tower. “In pre-production,” Kim adds, “we decided on the movement line with the action team through previs. However, it had to be expressed as descending from a height of more than 35 meters, while the actual tower set was about five meters. In order to set the camera angle, Ncam was used to film while the background, made in advance, was viewed on the monitor in real-time.” A dramatic moment is the flooding and destruction of the Balhae Lunar Research Station. “With the concept of being destroyed by the nature of water increasing indefinitely, the water was expressed with effects simulations and hallway miniatures, and the base was destroyed by freezing all of the outlets,” Kim says.
Not many public images exist of the Japanese Supreme Court, which made it a creative challenge for Megalis VFX to envision the interior of the building for Alice in Borderland. (Images courtesy of Megalis VFX)
For Alice in Borderland, on-set water and smoke were used and subsequently replaced with CG acid, which involved a lot of 2D work to place over the skin. (Image courtesy of Megalis VFX)
To create the impression that Tokyo is deserted, the famous Shibuya scramble intersection was recreated as part of a massive open set at Ashikaga City for Alice in Borderland. (Image courtesy of Megalis VFX and Netflix)
In order to better control the environment, the drive scenes for Babylon Berlin were shot against greenscreen with plates composited in later. (Images courtesy of RISE)
Japan has a small visual effects industry that does the vast majority of the digital augmentation on domestic projects. “You don’t see many companies here doing visual effects,” notes Jeffrey Dillinger, Head of CG at Megalis VFX. “In Japan, they haven’t pushed visual effects studios to get the quality of work like Wētā FX, but they haven’t had a project go overseas. They are fine with a stylized approach.” A lot of the initial work that the company got to do was effects simulations. Remarks Dillinger, “Oni: Thunder God’s Tale was our first full project. It was initially going to be stop-motion animation and we were going to do CG enhancements, but eventually they decided to do full CG because it would have taken years.”
Character designs had to be translated from 2D to 3D for the Netflix limited series, which consists of four episodes. “The most important thing we had to establish was the asset pipeline [Solaris, USD, Arnold] because we didn’t have one. We ended up doing more than 2,000 assets,” Dillinger adds. Most of the challenges were technical. “When we first decided to use Arnold and Solaris, you couldn’t render hair, and the most important thing for our characters is the hair. Our protagonist has an Afro, and the characters are meant to be made out of felt, which you can’t accomplish without having a layer of fuzz. We were able to accomplish quite a tactile feel where you can almost reach into the screen and touch these characters at times,” Dillinger describes.
Getting to contribute to Season 2 of Alice in Borderland was an exciting opportunity for Megalis VFX. “It’s a good project, Season 1 was fun to watch, and you could tell creatively the director and DP were good,” Dillinger remarks. “In each episode there are these games that happen. In the case of Episode 206, if people lose the game, acid falls on them and they melt. On set, instead of acid, they used water and smoke, which makes sense, but when water hits somebody it behaves differently than acid. Usually, acid has more of a yellowish tint to it, so they’re trying to give it a little bit of that without going over the top. In a lot of those cases, it was a static camera and body, so it was a lot of 2D work to put on top of the skin.” Alice in Borderland does not hold back on the blood and gore. “On set, they actually built a maquette of what a human looks like after acid has been dropped on them. We didn’t use it per se, but that ended up being concept art for us,” Dillinger notes. The game takes place at the Japanese Supreme Court. “Apparently, you cannot shoot there, and on Google we only found three photos that show the interior. It’s an artistically interesting building and has these concentric circles that rise up like an upside-down funnel. We spent a lot of time trying to stay true to it for those who might have seen it in person. There are also a lot of graphical elements, like numbers in the air, which is a visual, stylistic way of showing the contestants realizing the type of game that they’re playing.”
Entering its fourth season on Netflix is Babylon Berlin, which has visual effects produced by RISE Visual Effects Studios. “There was no German series before featuring so many visual effects shots,” observes Robert Pinnow, Managing Director and VFX Supervisor at RISE Visual Effects Studios. “In the first season, we delivered over 830 and worked on over 900, which at that point was an insane amount. There were approximately 50 full CG shots. The way that we suggested to use visual effects was new [here], though quite common in the U.S. market, which was, ‘Don’t solve it conventionally by putting something there and still not have a right image. Just shoot it, we’re going to roto and exchange the whole background rather than just that little antenna.’ The freedom they had was new for German visual effects.” Quite a few Berlin landmarks make an appearance. Comments Pinnow, “We rebuilt the Alexanderplatz correctly because they were insisting on shooting there, but it has a tiled ground that didn’t exist back in the 1930s. The residential areas are made up. Some of it got shot in the city, especially things like the railway station, and others got shot in the backlot at Babelsberg Studio.” For a full year, the entire production office was put into a former federal building that was supposed to be redone. “Shooting took place on one level within that building for all of the interior sets and police department. For [a later] season the production office had to move to another one, and now they’re in a former school.”
Every episode is directed by creators Henk Handloegten, Achim von Borries and Tom Tykwer. “One was shooting in one location, no matter what episode it ended up in, and everyone else had to follow it,” Pinnow explains. “Sometimes, someone had 50% on an episode and on another 10%. The three of them were doing it together.” The trio had to approve the visual effects work. “That was interesting, indeed,” Pinnow observes. “One was like, ‘It looks good.’ The other one was like, ‘It doesn’t fit my needs. It could be this or that way.’ And the third one, Tom, was straightforward and had everything in context.” During the early days of Babylon Berlin, the decision was made to support all of the driving sequences with digital backgrounds. Describes Pinnow, “They could drive the whole city day and night. The background plates in the streets wouldn’t have been usable anyway because of modern elements. The buildings we made for that were the key to designing the streets that were backgrounds for normal shots. That then became part of the concept process. We delivered a turntable of every house. In this way, they could grab one frame that was in the right perspective, put it together in Photoshop and send it back to us. On Season 2, we did it ourselves because they trusted us. There was not much concept art unless it was something specific.”
It is important that visual effects are applied constructively. Concludes Pinnow, “There were a few shots where one of the directors said, ‘If we had known how good they looked, we would have used more of them.’ And the other one said, ‘I like that the good and great shots in this sequence are a side effect.’ You need the visual effects to show the sequence, but it’s not on the eye. It’s helping the storytelling.”
Alexanderplatz had to be rebuilt by RISE for Babylon Berlin because the real location’s tiled ground did not exist back in the 1930s. (Image courtesy of RISE)
The piggy bank shots for Squid Game were the hardest to execute for Gulliver Studios as the director had a specific look in mind. (Images courtesy of Gulliver Studios)
By TREVOR HOGG
Images courtesy of Dennis Muren and Lucasfilm Ltd.
At the age of seven, Dennis Muren, VES, was taken by his mother to see The Beast from 20,000 Fathoms and The War of the Worlds and was subsequently driven to create cinematic spectacles that did not exist in the real world of La Cañada, California, where he grew up. In an interesting career twist, the awe-inspired child would go on to remake the H.G. Wells alien invasion classic as Steven Spielberg’s response to the 9/11 attacks and in the process added to his tally of 13 Academy Award nominations, which includes six Oscar wins as well as two Special Achievement Awards and one Technical Achievement Award. Even though essentially retired after a half century in the visual effects industry, Muren is a Consulting Creative Director at Industrial Light & Magic, on the Advisory Board of VFX Voice, and, more importantly, has retained his childhood fascination with creating believable and narrative effects.
What started off with taking still photographs of toy dinosaurs after seeing King Kong graduated into motion pictures captured on 8mm and 16mm film stock where the teenager would attempt to recreate his favorite movie moments. “It could be a copy of The 7th Voyage of Sinbad,” Muren recalls. “I wanted to bring those experiences back home. It was interesting seeing The Fabelmans because I had forgotten about this completely, [but] the only way we could see it was, you had to find a dark place in your house to look at it. Steven had a closet, which he shows in The Fabelmans. I had a hallway with doors, and you could close all of the doors and you were in the dark. It’s amazing that we all have this shared experience. Steven had never talked about that.”
It was inevitable that The Equinox would get made. “I had completed my first year of college and didn’t want to make another short film during the summer vacation. I just wanted to make a feature film,” Muren recounts. “I realized that Ray Harryhausen’s films are like five sequences that are effects and he finds a writer to fill it all in. I had my friend Dave Allen, a stop-motion guy, and Jim Danforth who could do some stuff. Then I did my stuff. We came up with three sequences that I wanted to do – one stop motion, one forced perspective – and Dave Allen had this puppet we could use. Then we find some actors. Simple. I had $3,500 that my grandfather had saved up for me to go to USC; however, my grades weren’t good enough to go there. I totaled it all up. Shooting 16mm on my Bolex, we did it the French way, which is silent and incredibly cheap, because then you put the sound in later and that cuts the price down to 10% of what it would have been. I knew it would work and was actually surprised that we sold the movie.”
Getting work in visual effects was not an easy task as the industry was still in its infancy and opportunities were scarce. “I gave up at one point and was going to be an inhalation therapist, which was something I figured I could do,” Muren reveals. “There were no effects movies being made. I got a little work at a union house called Cascade doing commercials for the Pillsbury Doughboy, Jolly Green Giant and Alka-Seltzer. Phil Tippett, VES, John Berg, Dave Allen, Jim Danforth and I were there at that time. It’s important to find your people who like the same things and you learn from each other. Most of us are still really good friends.” Muren pitched himself as a visual effects cameraman. “I never thought of myself as a cameraman, but I could always look at movies and ask, ‘Why is it shot this way? Why didn’t they fix that? It’s obviously wrong.’ If I was doing this, I would make sure that I was the cameraman because he’s the guy with the button, and if it’s not right I’m not pushing it. Which isn’t really the way it was, but it kind of has been that way.”
Visual Effects Art Director Joe Johnston, Special Visual Effects Supervisor Richard Edlund, VES and Dennis Muren, VES have a discussion with filmmaker Irvin Kershner about The Empire Strikes Back, with concept art for Cloud City in the background.
Muren was responsible for the digital character supervision on Casper (1995).
Phil Tippett, VES and Muren on a go-motion set for Dragonslayer, with the technique winning a Technical Achievement Award at the 54th Academy Awards in 1982.
Muren is unfazed by an AT-AT Walker appearing in front of him.
Everything changed when filmmaker George Lucas established Industrial Light & Magic to produce the visual effects for his space opera Star Wars. “I heard that George was doing some sort of effects film, but I didn’t know any of the people working on it or what it was about,” Muren remarks. “I said, ‘I have to get on this show because I want to understand what this stuff is and how it works.’ I got the number of [ILM Co-Founder/Visual Effects Artist] Robert Blalack, called him up and had an interview. [VFX/SFX Supervisor] John Dykstra thought that my understanding of stop motion would be applicable to this slow camera motion-control stuff that he envisioned for Star Wars, where a motor would take a minute to go through motions, and you would speed it up and slow it down, and when you played it back over four seconds it would look like it was flying accurately. He was right. I could give personality to the ships by having them bank and skid, which I thought added a fun factor to the movie.”
Close Encounters of the Third Kind and the end sequence with the mothership was the next project. “I loved working with George and wanted to see what Steven Spielberg was like, and I admired Douglas Trumbull, VES,” Muren remarks. “I made sure that the shots got finished in those five months, and they all worked, looked good and matched to the others. We were dealing with light, smoke and mist instead of hardware, speed and energy on Star Wars. Both sides of my brain got boosted in a period of a little more than a year to a new way of looking at things.”
When tasked to incorporate a tauntaun into the aerial photography for The Empire Strikes Back, Muren realized that there was never only one solution.
To achieve the dynamic speeder bike chase in Return of the Jedi, Muren insisted that the plate photography be shot by a Steadicam in an actual forest.
Muren mapping out the CG stained-glass knight sequence from Young Sherlock Holmes (1985).
Muren got to remake The War of the Worlds, which was one of the most influential films from his childhood, with Steven Spielberg.
In order to properly track the T-1000 in Terminator 2: Judgment Day, a grid literally had to be drawn onto the body of Robert Patrick.
Muren reenacts one of the scenes from Raiders of the Lost Ark while making the movie.
An important lesson was learned when asked by Lucas to insert a tauntaun into one of the opening helicopter shots in The Empire Strikes Back. “I said, ‘There is no way to track those moves because we don’t have any recording of it. It’s all white down there. It’s going to look fake. We should build this as a big model.’ George said, ‘Just think about it.’ And he walked out. Within 15 minutes I had figured out how to do it. I learned so much from that moment. There are many ways you can do everything, but there are usually a few ways that are the best, and something only gets you 85% there. Then there is another way to tweak that for the 15% to make it look like it’s 100%.”
For Muren, the real breakthrough for digital effects was Terminator 2: Judgment Day. “CG was something I’d been looking into at ILM since 1983 or 1984. It was always a puzzle and a possibility. We did it, but it was always expensive. It wasn’t until Terminator 2: Judgment Day that we figured out how to make that as something which is repeatable and affordable with a department that could do it.”
There are times when it is appropriate to heighten the reality of a shot to get the desired emotional moment, which is something that Muren did for E.T. the Extra-Terrestrial.
Whereas Muren viewed Terminator 2: Judgment Day as the big CG-character breakthrough, it wasn’t until dinosaurs got resurrected in Jurassic Park that Hollywood finally took notice.
A dramatic moment is when the liquid metal T-1000 transforms and gets temporarily stopped going through metal bars by the gun he is carrying. “Not only that, the room is quiet, so when that gun hits the bar it’s really loud. James Cameron is thinking it through so deeply and does that all the way through. The guy is great.” The studio reaction was surprising. “I was expecting Hollywood to go nuts, and they were like, ‘That was interesting.’ They didn’t know what they were seeing until Jurassic Park and dinosaurs. Everybody loves dinosaurs!”
Visual effects have become a standard filmmaking tool from indie productions to Hollywood blockbusters, which raises the question, “Can you have too much of a good thing? I would certainly say so,” answers Muren. “A lot of the stuff looks fake and the audience doesn’t seem to care. TV shows can have 100 shots in them. It looks like it’s shot in old Chicago but is really not shot there at all. But we fixed the buildings up and matted people into the settings. That’s great. It’s invisible. However, when you get into this chaotic action where they throw out any sort of thought of gravity, I get bothered by that.”
As with Close Encounters of the Third Kind, Muren got to shoot a dramatic sequence with an alien spaceship for E.T. the Extra-Terrestrial.
A love for stop-motion animation helped to forge a life-long friendship between Muren and Phil Tippett.
A miniature airplane was used for an aerial shot overseen by Muren for Indiana Jones and the Temple of Doom.
Muren got to do his own version of Fantastic Voyage with Innerspace (1987).
Muren witnesses the CG creature animation that Steve Williams created for Jurassic Park.
Despite being a digital innovator, Muren has not lost his enthusiasm for in-camera effects as was the case with A.I. Artificial Intelligence.
Getting close and personal with the Rancor from Return of the Jedi.
“CG was something I’d been looking into at ILM since 1983 or 1984. It was always a puzzle and a possibility. We did it, but it was always expensive. It wasn’t until Terminator 2: Judgment Day that we figured out how to make that as something which is repeatable and affordable with a department that could do it. … I was expecting Hollywood to go nuts, and they were like, ‘That was interesting.’”
—Dennis Muren, Consulting Creative Director, ILM
That is not to say that there aren’t great current examples of effects utilized wisely. “The thing that they did so well in Top Gun: Maverick is that they started with the correct initial elements, which was actually having actors in the planes flying out there; they’re really doing it, so there’s lots of surprise and lots of things you don’t see when you’re onstage.” A personal favorite is Bardo by Alejandro G. Iñárritu. “That film just knocks me out. All of the stuff that he is trying to tell is in the effects. The effects aren’t like a storyboard that an effects guy brought to life. No, it’s all emotion that has been revealed at the pacing desired by the director. You have to get into the director’s head and figure out what he is trying to do. That guy is great.”
Technology is continually evolving. “I don’t know where AI is going to go, but it sure is fascinating,” Muren states. “Gaming has surely influenced the film industry, and the films have certainly influenced gaming. To think at one time there were no computers and visualization except for pie charts and text. Now they’re all affecting everything else. With AI you can create images in a movie that are synthetic, and the gaming industry can figure out ways to do it in real-time interactively which is phenomenal. Now we have something coming up that is going to be able to decide what image we want to see and don’t want to see, or look at it 300 different ways based on whatever criteria we can give the AI. What I love is it’s almost out of the lab and schools and into the homes. Once you get people in their homes doing stuff, they’re going to come up with ideas that no one else has thought out. However, people are worried about it for rightful reasons. There has to be checks and balances.”
The imagery should be driven by the story, not the other way around. “It’s about the movie, it’s not the shot you’re doing,” Muren observes. “How does it fit in? What is the emotion of it? For the shot of the kids on the bikes in E.T. the Extra-Terrestrial flying off into the sunset, I made the sun a little too orange and the backlight on the bikes a little too bright. The color palette is amped up a tiny bit because it added a magical feeling that I thought those kids would feel when they were experiencing it, especially when those kids were telling what just happened to somebody. That’s the way it should look. In a lot of Steven’s films, I look at them as amped up in places. Anybody nowadays can take a storyboard or animatic and make it look real with the tools that we have, but it needs to have the truth and heart of the moment that the director is going for. Your shot doesn’t want to overpower the story or take away a moment that could have helped. It’s all subtle. Directors and actors go through it all of the time. Editors are the ones picking out those parts and assembling them together. That’s what filmmaking is. When you get there, then it’s not just the shot that is terrific, the movie is terrific. I love movies.”
By TREVOR HOGG
Images courtesy of Disney/Pixar.
Ember had to look as if she was made of fire while Wade required several simulations including drips, splashes and bubbles to make him a believable water character.
Hardly elementary to put together is the original romantic comedy Elemental from Pixar, where a new arrival to a city inhabited by Water, Earth, Air and Fire enters into a relationship that crosses the class divide. The story was inspired in part by a science class joke. “When I looked at the Periodic Table, all I could see was this apartment complex,” chuckles filmmaker Peter Sohn (The Good Dinosaur). “To make the pitch more acceptable to everyone, I boiled it down to the classical elements fire, water, earth and air.” The subject matter is personal in nature, he reveals. “I lost both of my parents during the making of Elemental, and this film has a great deal to do with appreciating our parents and the sacrifices that they make for us. It has been this interesting emotional ride.”
Given the nature of the characters, effects were an essential aspect of their design, with Sohn having to readjust his expectations. “There were lots of articles last year about how tremendously difficult the hours can be in the visual effects industry, and with so many projects going into streaming and features getting so big, that there were some eye-opening ways to produce this material that I was guilty of. I pulled back on a lot of that in the middle of last year. I went in knowing the gameplan of what we were going to get to do, but because my parents had died, I was like, ‘This is to honor them! We’ve got to go further.’ The crew has been a tremendous support in this process, and they have lifted the film in ways that I will forever be grateful for.”
One of the hardest aspects was to make sure that the characters actually look like they are made from their designated element, such as “Ember [Leah Lewis] and Wade [Mamoudou Athie] as our main characters, Fire and Water,” states Sohn. “Ember was the most challenging to get to her look. I remember seeing Ghost Rider as a kid and going, ‘A character with a fiery head.’ The fire was so realistic and meant to be scary. Then there was Jack-Jack in The Incredibles who goes on fire. It’s hard to make a fire character that is gaseous without a solid substructure there. Character Designer Daniel López Muñoz took iPhone footage of a fire in his backyard, pulled out the frames and painted over them this fire character; he made these eyes blink on it that was caricature enough where these eyes could fit. Ember’s fire was more forgiving, where Wade became more difficult as the production went on because of the way his rig worked and simulations on top of his shaders and the caustics inside of him; there weren’t a lot of places to hide.”
The Lighting department integrates the characters, sets and effects to produce the final image.
Reinventing the character pipeline was Bill Reeves, Global Technology Supervisor. “The character gets animated in animation using the standard Pixar pipeline in terms of what you see on the animator’s screens. It’s surfaces, polygons if you will. You don’t see the fire. Then they check in their work saying, ‘This shot is done.’ We convert it over to feed into Katana and RenderMan and we render. In the course of that conversion, we feed it into Houdini to generate volumetric pyro simulations and stylize it. It comes out the back end and eventually gets into Katana and then into RenderMan. [For] the part of the pipeline that goes into Houdini and back out again, we had little bits of it here and there. However, Ember is in 95% of the shots, so we had to run that pipeline over and over again. There is a lot of simulation involved with Wade because his hair is like a fountain that is bubbling. That’s another Houdini set of tasks. Then the Air character is another set of simulations to generate the flowing air wisps. The only simple set of characters are the Earth characters, but they’ve got a lot more geometry than Woody and Buzz Lightyear.”
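Reeves is describing a per-shot conveyor: cached animation geometry is handed to Houdini for pyro simulation and stylization, then fed on to Katana and RenderMan for rendering. Purely as an illustration, the Python sketch below strings those stages together; the helper functions, cache paths and file formats are hypothetical stand-ins, not Pixar's actual pipeline calls.

```python
# Hypothetical sketch of the per-shot "character pyro" conveyor Reeves describes.
# None of these helpers are real Pixar, Houdini or Katana APIs; they stand in
# for whatever cache/export mechanism a studio pipeline would actually use.

def export_animation_cache(shot, character):
    """Write the animated surface geometry (polygons) for one character."""
    return f"/caches/{shot}/{character}_anim.abc"          # placeholder path

def run_pyro_sim(geo_cache, style="ember_fire"):
    """Feed the cached geometry into a Houdini pyro setup and stylize it."""
    return f"{geo_cache.replace('_anim.abc', '')}_{style}.vdb"   # volumetric output

def hand_off_to_lighting(shot, volumes, geo_caches):
    """Publish everything the Katana/RenderMan stage needs for the final frame."""
    print(f"{shot}: publishing {len(volumes)} volumes, {len(geo_caches)} caches")

def process_shot(shot, characters=("ember", "wade")):
    geo, vols = [], []
    for character in characters:
        cache = export_animation_cache(shot, character)
        geo.append(cache)
        if character == "ember":                # fire characters need the pyro pass
            vols.append(run_pyro_sim(cache))
    hand_off_to_lighting(shot, vols, geo)

# Because Ember appears in roughly 95% of shots, this loop runs for nearly
# every shot in the film, which is why the conversion had to be automated.
process_shot("sq100_sh010")
```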
An effort was made to limit the number of elemental characters to 10 in a shot. “But we blew past that,” Reeves laughs. “There is a sequence called Air Stadium in the movie where there are thousands of them.” A new approach was developed for crowd simulations. “It doesn’t actually go into Houdini for every character, but it is a volume-deform kind of thing. When the characters are further away, it’s one simulation that we’re copying around and deforming in Houdini, but it’s a 10- to 20-second Houdini call rather than an Ember simulation, which is four or five hours easy.”
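The economics Reeves describes, a 10- to 20-second deform call for distant crowd members versus a multi-hour simulation for hero characters, can be seen in a toy scheduler. The sketch below is an assumption-heavy illustration rather than Pixar's crowd system; the distance cutoff, timings and names are invented for the example.

```python
# Toy illustration of the crowd strategy Reeves describes: hero characters get a
# full per-character simulation, distant crowd members reuse one cached sim that
# is copied around and deformed with a per-instance time offset.
# All numbers and function names here are invented for illustration.

import random

FULL_SIM_HOURS = 4.5        # "four or five hours easy" per hero-quality sim
DEFORM_SECONDS = 15         # "a 10- to 20-second Houdini call"
HERO_DISTANCE = 20.0        # assumed cutoff in scene units

def schedule_crowd(characters):
    hero, instanced, cost_hours = [], [], 0.0
    for name, distance in characters:
        if distance < HERO_DISTANCE:
            hero.append(name)
            cost_hours += FULL_SIM_HOURS
        else:
            # Reuse the shared cache, offset in time so copies don't move in sync.
            instanced.append((name, random.uniform(0, 48)))  # frame offset
            cost_hours += DEFORM_SECONDS / 3600.0
    return hero, instanced, cost_hours

crowd = [("ember", 3.0)] + [(f"spectator_{i}", 80.0) for i in range(2000)]
hero, instanced, hours = schedule_crowd(crowd)
print(f"{len(hero)} hero sims, {len(instanced)} instanced copies, ~{hours:.1f} machine-hours")
```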
In order to discover the performances, the animation team manipulates the previs models of the characters and then computer simulations are activated to make them appear and feel like their element.
Production Designer Don Shank creates the sets within Presto, which is Pixar’s proprietary animation system.
Major crowd simulations had to be produced for the Air Stadium sequences.
Concept art by Lauren Kawahara exploring the color and design for Element City as well as how the characters interact with the urban environment.
Locking down the look of the characters was difficult for Directing Animator Gwendelyn Ederoglu because they were so effects-dependent.
Compositing was leaned on heavily when it came to lighting. “It’s mainly a way of dealing with the complexities of this world of pushing a lot more lighting layers into Nuke and compositing and tweaking the end result there, rather than having to go back and re-render. You can work faster because it’s a more interactive system when you have the data. That was something we worked hard on and did a lot more on this show than on other ones,” Reeves observes.
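The "relight in comp" workflow Reeves mentions usually relies on rendering each light or light group into its own additive layer so the balance can be adjusted interactively downstream. The snippet below shows that arithmetic in plain NumPy; it is a generic illustration, not Pixar's Nuke setup, and the layer names and gains are invented.

```python
# Generic illustration of relighting in the composite: when each light group is
# rendered to its own additive layer, the beauty pass is just a weighted sum, so
# gains can be tweaked interactively without returning to the renderer.
# Layer names and gain values are invented for the example.

import numpy as np

h, w = 4, 4  # tiny stand-in images
layers = {
    "key":            np.random.rand(h, w, 3),
    "fill":           np.random.rand(h, w, 3),
    "ember_emission": np.random.rand(h, w, 3),
}

def rebalance(layers, gains):
    """Recombine per-light layers with per-layer gains (what a grade/merge tree does)."""
    out = np.zeros_like(next(iter(layers.values())))
    for name, img in layers.items():
        out += gains.get(name, 1.0) * img
    return out

beauty  = rebalance(layers, {})                                      # as rendered
tweaked = rebalance(layers, {"fill": 0.6, "ember_emission": 1.4})    # comp-side relight
```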
To assist animators, a toolkit was developed by Sanjay Bakshi, Visual Effects Supervisor. Comments Bakshi, “We had to animate things like when Ember gets mad, not just the facial expression and the body language, but what does the fire do? Since animators are experts at the timing of that, it had to be synchronized with their acting choices. We had to give them some visual indication. Our simulation and shading artists were changing a bunch of knobs to get it to feel like what sadness is. Then we map that to one number so there is a sadness control. It was like a zero to 10 kind of thing. More effort was put into fire because Ember is the main character and goes through the most emotions.” Transparency and the speed of the fire help to convey emotion. “When Ember becomes vulnerable her flames become a lot more transparent and candle-like in the movement,” Bakshi notes. “For anger, we did use color. Ember goes into more purple; that’s her signature anger look. A lot of the story is about her being angry and not understanding why, then learning through the movie how to control her anger and why she is angry.”
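Bakshi's single emotion dial is, in effect, a remapping layer: one 0-to-10 value keyed by animators that the pipeline expands into the many simulation and shading knobs the effects artists originally tuned by hand. The sketch below shows one plausible way such a mapping could look; the parameter names and values are invented for illustration and are not the production controls.

```python
# Hypothetical version of the single "sadness dial" Bakshi describes: animators
# key one 0-10 value per emotion, and the pipeline interpolates between the
# neutral and full-emotion parameter sets that the effects artists authored.
# Parameter names and numbers are invented for illustration.

NEUTRAL = {"flame_speed": 1.0, "transparency": 0.2, "hue_shift": 0.0}
SADNESS = {"flame_speed": 0.4, "transparency": 0.7, "hue_shift": 0.0}   # candle-like
ANGER   = {"flame_speed": 1.8, "transparency": 0.1, "hue_shift": 0.6}   # toward purple

def emotion_knobs(preset, dial):
    """Blend neutral -> preset by a 0-10 animator dial."""
    t = max(0.0, min(dial, 10.0)) / 10.0
    return {k: NEUTRAL[k] + t * (preset[k] - NEUTRAL[k]) for k in NEUTRAL}

print(emotion_knobs(SADNESS, 7))   # mostly transparent, slow, candle-like flames
print(emotion_knobs(ANGER, 10))    # fast flames pushed toward the anger hue
```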
It was important to not have Ember touch anything that was flammable.
Much of the action takes place in Element City. “The Fire folks live in Firetown, and there are Water, Air and Earth districts,” Bakshi explains. “Pete wanted these districts to have the elements built into the architecture and infrastructure, so we did a bunch of set dressings, like a streetlamp in the Fire district would have some fire on it. Our set dressers could place them, and the fire simulations would come along for the ride. The buildings and architecture have fire and water simulations built into them, so that the set dressers could do their work and get these simulations that would just happen. That was another big component.” Instancing was essential in making rendering manageable. “For fire simulations in Firetown, there were probably 25 to 30 of them that get reused over and over and instanced. The lamps are all instanced. It’s all of the same simulation, just offset in time so they don’t look identical,” Bakshi notes.
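The instancing Bakshi describes, a small library of looping fire caches reused across Firetown with different time offsets so no two lamps read as identical, can be summarized in a few lines. The following is a hedged sketch of that set-dressing idea; the cache names, loop length and lamp positions are placeholders rather than production data.

```python
# Sketch of set-dressing with instanced fire caches: every Firetown lamp points
# at one of a small library of looping pyro caches, offset in time so neighbours
# never play the same frame. Cache names, counts and loop length are placeholders.

import random

FIRE_CACHES = [f"/caches/firetown/lamp_fire_{i:02d}.vdb" for i in range(28)]
LOOP_LENGTH = 96   # frames in each looping simulation (assumed)

def dress_lamps(lamp_positions, seed=7):
    rng = random.Random(seed)       # deterministic so renders are repeatable
    instances = []
    for pos in lamp_positions:
        instances.append({
            "position": pos,
            "cache": rng.choice(FIRE_CACHES),          # reuse a shared simulation
            "frame_offset": rng.randrange(LOOP_LENGTH), # de-synchronize the loops
        })
    return instances

lamps = dress_lamps([(x * 4.0, 0.0, 0.0) for x in range(50)])
```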
Getting the look of the characters locked down was difficult with them being effects dependent. “We partnered with Dan Lund who is a traditional effects animator from Disney,” remarks Gwendelyn Ederoglu, Directing Animator. “He talked about applying animation principles into 2D effects and how the same principles that we use in 3D in our characters, we could still pull and learn from that. One of them being, what are you trying to tell in the shot is key and how can effects support that? What we did learn from our 2D tests early on was that there was going to be a lot of fun to be had with volumetric changes, which is something that our characters don’t typically do. We also learned quickly how tear-offs of fire or water droplets added an immediate believability to their ‘elementalness,’ so we worked with our rigging team to develop prop fire and water, which we could then use in our animation testing early on. That became a critical element in the film. If Ember has a limb detached, rather than breaking off the rig of the limb, we would cheat into prop fire that would match the traits of Ember’s fire and then it could dissipate.”
Essential was a close collaboration established between animation and character effects departments. “We worked closely with simulations typically on a film, but jumping all of the way to character effects, that was a new relationship that was formed,” Ederoglu notes. “A term that one of us might use in animation about stylization could be interpreted so differently by a different department. It took us a while to create a more common language to be able to talk about shots, problems and issues or what looks we were chasing in animation.”
Lauren Kawahara depicts what a street would look like in Element City, which utilizes a canal system similar to Venice and Amsterdam.
Peter Sohn conducts one of many review screenings of Elemental, which is a tribute to the sacrifices his parents made for their family.
In order for the characters to be appealing, the elements could not be completely realistic, but at the same time not so cartoony that viewers forgot what they represented.
A nighttime examination of Ember by Jason Deamer, Character Art Designer.
Concept art by Jonathan Hoffman and Maria Yi experimenting with the facial expressions of Ember.
Most of the animators came from working on Lightyear, which was grounded in real human acting. Ederoglu adds, “It did take a bit to adjust to the looseness and constant sense of motion that we needed to have for these characters to feel believable. Something would be funny in the Presto version because of the snappy timing, but we needed those four frames for the pyro to catch up. We did have to learn the language of that as we went.” Lattice controls were overhauled. “Those were far more robust in terms of regional controls than what we have had in the past. That was critical. A quarter of the shots required lattices to do some organic and major shape changes. Beyond that, a lot of the animation was within the rigs themselves.”
Unlike previous Pixar movies where the top portion of a character’s head or the chin could be cropped out of frame, this was not possible for Elemental. “We knew right away talking early on with Pete that 1.85:1 was going to have to be the way to shoot something like this,” remarks David Juan Bianchi, DP, Camera. “We were aware that the pyro and flames of Ember were going to be so important, not just for her look but how to tell her emotional state. What is it doing? What is the color in her flames? We needed to go wide and vertical. I asked the engineering teams to try to give us a camera that matched large format photography, and this allowed me to shoot wider lenses. We wanted to have that extra dial of being able to have a shallow depth of field at certain moments, and having this large format version of our Pixar camera allowed us to do that.”
Reflecting various emotional tones and worlds throughout Elemental is the camera style. “There was a language for Firetown,” Bianchi explains. “We primarily shot that with wide-angle lenses, and the camera is more at the character’s eye level as if someone was hand-operating the camera. When going to Element City we introduced Water characters like Wade’s family living in an apartment made of water. Then we started to have a different camera language. There were Steadicam and Technocrane moves so it felt like the camera was rotating and floating, and longer lenses. When Ember and Wade from these two worlds intersect and interact with each other, we cherry-picked elements from both to make what I called the romance love element or Ember Wade language. Hopefully, it underscores and supports the story of where our two characters are at and where they ended up.”
Rather than just dealing with practical sources like lamps and natural ones such as sunlight, for Jean-Claude (JC) Kalache, DP, lighting had to take into account the characters themselves being light sources. “Ember is a self-illuminated gas, and the way we solved the exposure issue is we exposed everything except Ember. We made a conscious decision that animation would drive her energy, and lighting we’ll treat almost like a lightbulb on a dimmer switch. When you think of water, everything is a light source. Water is reflective, refractive and shows through. It captures the whole environment. It was a big mind-bender. The brain is good at realizing what water looks like, but lucky for us, we were stylizing water, so you can break down the five or 10 components that make water look like water, but then you can move them around. It took a good part of the full year just to learn how to make our main Water character appealing.” Then there were the Air characters. Adds Kalache, “I remember looking at the overnight render and it was wonderfully beautiful pink light filling the whole train. Offscreen there was an Air character that was blasted by the sun.”
“The premise of this whole movie is that these elements cannot exist together,” Kalache remarks. “Yin and yang. Firetown is dry, smoky and less reflective. What is the opposite of that? It is a city with glass, water, reflections, and everything is bouncing. Lighting a glass building is a pain because you can’t shape them. We literally treated the buildings of Element City as a character, and we were lighting them as if we were studio-lighting a human, putting special rim and kick lights [on them]. One thing that was revealed quickly was that a Water character is dependent on the environment around them, especially what is behind them. When Wade goes to or is in Element City, we quickly noticed if the buildings behind him were busy, it was impossible to look at him because you could see right through him. However, if we took the sunlight and made it slightly ramped down right behind him, conveniently things calmed down and he looked appealing.”
Kalache made an observation that surprised the director. “I remember telling Pete, ‘The world is the light. What do you expect in your character?’ Soon after, animation would come in to talk to him about what they expected from the performance. Then soon after, the effects people would come and talk about what they expected from their character effects. It took all of these conversations to eventually end up with characters that worked for Pete.”
Concept art by Carlos Felipe León that visualizes candlelight silhouettes of Ember and Wade.
Daniel López Muñoz depicts a street in Firetown that has a fire motif.
Peter Sohn attempts to find the visual aesthetic of the Earth district in Element City.
One of 97,760 storyboards produced for Elemental with filmmaker Peter Sohn, who started off as a storyboard artist for Pixar.
By CHRIS McGOWAN
ABBA seen on a massive LED screen. The de-aged avatars were created by ILM in a project that took over five years. (Image courtesy of ILM and ABBA)
In one breakthrough after another, AR, VR and VFX are augmenting live entertainment, from ABBA’s avatars to XR concerts to Madonna dancing live on stage with her digital selves.
ABBA: THEIR ‘70S SELVES
When the Swedish group ABBA returned to the stage last May after a 40-year hiatus, they did so digitally with the help of ILM. The foursome – Björn Ulvaeus, Benny Andersson, Anni-Frid Lyngstad and Agnetha Fältskog – appeared in ABBA Voyage via their de-aged digital avatars, virtual versions of themselves on huge screens in the purpose-built, 3,000-capacity ABBA Arena, which was constructed in Queen Elizabeth Olympic Park in London. (ABBA Voyage won the 21st Annual VES Award for Outstanding Visual Effects in a Special Venue Project.)
ABBA’s 20-song “virtual live show,” over five years in the making, is a hybrid creation: pre-recorded avatars appear on stage with a physically present 10-piece band to make the experience more lifelike and convincing. The avatars meld the band’s current-day movements with their appearances in the 1970s.
ILM supplied the VFX magic, with more than 1,000 total visual effects artists in four studios working on the project, according to the show’s spokespersons. ILM Creative Director and Senior Visual Effects Supervisor Ben Morris oversaw the VFX of the show, which was directed by music-video veteran Baillie Walsh.
First, Morris and his team scanned thousands of original 35mm negatives and hours of old 16mm and 35mm concert footage and TV appearances of the band. The supergroup quartet spent five weeks singing and dancing in motion-capture suits as ILM scanned their bodies and faces with 160 cameras at a movie studio in Stockholm. The same process was undertaken with younger body doubles, who followed the band’s moves under the guidance of choreographer Wayne McGregor; their motion was blended with ABBA’s to give the avatars more youthful movement.
The digital versions of ABBA appear on the stage and to the sides of the arena on towering ROE Black Pearl BP2V2 LED walls, powered by Brompton Tessera SX40 4K LED processors. Each screen is 19 panels high, and there are an additional 4,200 ROE LED strips in and around the arena. Solotech supplied the LED walls. Five hundred moving lights and 291 speakers connect what is on the screens to the arena. The result is spectacular and suggests that many large-scale digital shows may be on the way for music stars who are getting old or simply don’t like touring.
ABBA avatars on stage in lower center. The show’s elaborate lighting and live musicians help bring ABBA’s music to life. (Image courtesy of ILM and ABBA)
VIRTUAL TUPAC, VIRTUAL VINCE
Digital Domain created digital representations of the rap star Tupac Shakur and legendary Green Bay Packers football coach Vince Lombardi in 2012 and 2021, respectively, which raised the visual-quality bar for virtual appearances projected live.
On April 15, 2012, at the Coachella Valley Music & Arts Festival in Indio, California, Tupac Shakur appeared in a CGI incarnation on stage at the Empire Polo Field. The virtual Tupac sang his posthumous hit “Hail Mary” plus a “duet” of “2 of Amerikaz Most Wanted” with Snoop Dogg, who was on stage, in the flesh.
The realistic, computer-generated image of Shakur was shown to some 90,000 fans on each of two nights; YouTube videos of the event reached 15 million views, according to Digital Domain. Some called it the “Tupac Hologram” – it wasn’t a hologram, but it was 3D-like. Unlike ABBA Voyage (2022), which featured the participation of the band in creating the group’s avatars, the Shakur on stage was created long after the singer’s death in 1996. The project took about two months to complete, with 20 artists of different disciplines, according to Digital Domain’s Aruna Inversin, Creative Director and VFX Supervisor. The virtual Tupac was the vision of Andre “Dr. Dre” Young, and Digital Domain created the visual effects content. AV Concepts, an audio-visual services and immersive technology solutions provider, handled the projection technology.
The digital versions of ABBA appear on the stage and to the sides of the arena on towering ROE Black Pearl BP2V2 LED screens powered by Brompton Tessera SX40 4K LED processors. (Image courtesy of ILM and ABBA)
Silhouettes of Björn Ulvaeus, Benny Andersson, Anni-Frid Lyngstad and Agnetha Fältskog – whose virtual versions appear on huge screens in the purpose-built 3,000-capacity ABBA Arena in London. (Image courtesy of ILM and ABBA)
A holographic-like virtual Tupac Shakur was created by Digital Domain and shown at night at the 2012 Coachella festival. (Image courtesy of Digital Domain)
The digital version of the late rapper was the vision of Andre “Dr. Dre” Young. AV Concepts, an audio-visual services and immersive technology solutions provider, handled the projection technology. (Image courtesy of Digital Domain)
The virtual Tupac Shakur with a live musician in the background. A practical holographic effect was created by projecting the performance asset onto a material called Hologauze. (Image courtesy of Digital Domain)
The virtual Shakur was a two-part effect, according to Kevin Lau, Digital Domain Executive Creative Director. “In the first, we recreated Tupac using visual effects, and the other was a practical holographic effect created using Hologauze. For the performance, we started by filming a body double performing the set. The actor had an incredible likeness in both stature and movement to Tupac. Once that performance was in the can, we began digitally recreating a bust of Tupac’s likeness. This digital head was then combined with the body performance and animated to match the song. The two were blended together in compositing to create a seamless recreation of a seemingly live performance.”
Lau continues, “The performance asset was then synched to the live band and projected on Hologauze. This material has the ability to reflect bright light, while remaining fairly transparent in areas that are dark. The result is a figure that appears to exist physically in a three-dimensional space.”
“The Vince Lombardi project was similar to the Tupac hologram, but we enlisted a slew of new techniques – many of which we created ourselves – that were not available at the time,” comments Lau. The virtual Lombardi – evoking the NFL football coach as he was in the 1960s – appeared on the jumbotron at Raymond James Stadium in Tampa on February 7, 2021.
Lau explains, “We began by filming a body double to do the bulk of the performance. But instead of having to create a full digital recreation of Coach Lombardi’s head, we enlisted Charlatan, our machine learning neural rendering software. This allowed us to train a computer off a data set of available Vince Lombardi images. The software can then synthesize those images and recreate a likeness of the subject – in this case, Coach Vince Lombardi – based off our actor’s [the puppeteer] performance.”
For the virtual Lombardi, a slightly different projection technique than before was used, according to Lau. “Because of the nature of the Super Bowl, we didn’t have access to a sequestered location or a secured space to prep screens and projectors. The stage had to be wheeled out into the end zone and set up, then broken down within a matter of minutes. For this, we enlisted more of a ‘Pepper’s Ghost’ effect, in which a translucent mylar screen is put at a 45-degree angle from a large LED screen. The resulting image is reflected back to the viewing audience, allowing you to see through parts of the screen, which gives the holographic illusion.”
The digital Shakur and Lombardi are two “tentpole projects” for Digital Domain in that particular area, and they have also done “a handful of other executions for promotional and concert events,” says Lau. “We are always looking for new opportunities to surprise and delight viewers. AR/VR and holographic immersive entertainment have a huge role to play in the future. Being able to re-contextualize our environment will truly bring the audience in and allow for higher engagement with fans.”
MADONNA AND MARIAH
At the 2019 Billboard Music Awards, five Madonnas and Colombian singer Maluma performed the reggaeton-pop song “Medellín.” Or, more precisely, one Madonna and four avatars sang with Maluma, in an AR-enhanced performance that utilized volumetric capture and Unreal Engine.
“To the surprise and shock of fans around the world, four copies of Madonna – each wearing one of her signature outfits from over the years – appeared and danced alongside the original, creating a show unlike any other and leaving fans to wonder how she pulled it off,” says Piotr Uzarowicz, Head of Partnerships & Marketing for Arcturus, a leader in volumetric video technology.
Arcturus’s HoloSuite software “was instrumental in the post-production of the four pre-recorded Madonna performances that were integrated into the live TV broadcast,” says Uzarowicz. “This was the first time that volumetric video was used in a television broadcast, and it signaled an emerging new era of entertainment.”
The augmented reality content could be seen by TV viewers of the telecast and was presented alongside live components, including Madonna herself, Maluma, and over a dozen dancers.
Sequin AR was tasked with bringing the Madonna clones to life in AR. “We worked with Jamie King [Madonna’s Creative Director], Madonna and several fantastic companies to create the performance, including Dimension Studios [where the volumetric capture of Madonna took place], StYpe [camera tracking], Reflector Entertainment [co-founded by Cirque du Soleil creator Guy Laliberté] and Brainstorm [InfinitySet],” says Sequin AR CEO Robert DeFranco.
“The ‘Madonnas’ were created with volumetric capture, and additional elements were created, including rain, stage extensions and clouds,” explains DeFranco. “The elements were rendered in real-time using Brainstorm and Unreal Engine.”
Legendary Green Bay Packers coach Vince Lombardi was brought to life in digital form by Digital Domain for the 2021 Super Bowl in Tampa, Florida. (Image courtesy of Digital Domain)
Digital Domain filmed a body double and then enlisted Charlatan, its machine learning neural rendering software, to train a computer off a data set of available Vince Lombardi images. The software synthesized the images to recreate a likeness of Lombardi. (Image courtesy of Digital Domain)
The stage for Madonna’s performance with four AR versions of herself at the 2019 Billboard Music Awards show. (Image courtesy of NBC and Dick Clark Productions)
Five Madonnas with singer Maluma in background. Sequin AR, Arcturus, Dimension Studios, Reflector Entertainment, stYpe and Brainstorm collaborated on the project powered by Unreal Engine. (Image courtesy of NBC and Dick Clark Productions)
Five Madonnas with Maluma in background. Volumetric capturing before the show took place at Dimension Studios. (Image courtesy of NBC and Dick Clark Productions)
Examples of volumetric capture choreography on the stage with different stage extensions plus clouds. (Image courtesy of NBC and Dick Clark Productions)
Sequin AR was founded as a virtual production and augmented-reality solutions company, and it has extended its capabilities to include immersive web3, mobile and VR solutions for its clients. “We consider ourselves a 3D immersive company,” says DeFranco. “AR and real-time render technology allow content producers the ability to tell new stories that were previously only possible in post-production. This helps them make a greater emotional impact and connect with audiences in more engaging ways.”
DeFranco explains, “AR viewing can be accomplished with a number of screen types, whether it be AR glasses, mobile, monitors or broadcast. Broadcast AR leverages the TV, [and] we integrate the real-time technology into the standard production pipeline, creating the virtual production pipeline for AR.”
Sequin AR provided virtual production and AR broadcast services for Mariah Carey’s Magical Christmas Special on Apple TV+ on December 4, 2020. The production utilized “virtual sets being rendered in real-time with Zero Density and Unreal Engine for seven cameras. The shoot was done in five days, which would not have been possible if it were not for virtual production,” according to DeFranco.
Ego, which Sequin worked on, “was all AR, motion capture and facial capture. We provided virtual production technical engineering on the show supporting the technical pipeline and playout in real-time,” says DeFranco.
Regarding tech utilized, he notes, “In addition to our technical pipeline and processes, we leverage Unreal Engine and a variety of solutions depending on the production needs, including Mo-sys [virtual production and camera tracking], StYpe [camera tracking technology for AR or virtual studio], Zero Density [real-time visual effects, real-time broadcasting and a markerless talent tracking system] and Silver Draft [supercomputing high-end rendering], to name a few.”
DeFranco adds, “Augmented reality and virtual production continue to push the boundaries with innovative companies creating new content and experiences. We are thrilled to be helping advance the adoption of these technologies and creating new ways to use them.”
XR CONCERTS
Last year, Snap signed a pact with Live Nation Entertainment for audiences to access AR experiences via Snapchat at Live Nation music festivals such as Lollapalooza, Bonnaroo, Governors Ball and Electric Daisy Carnival. Snapchat users will be able to find their friends in the audience, locate landmarks on the festival grounds, try on merchandise and experience other AR content. Snapchat has a large potential audience for its AR experiences – it had 363 million active daily users worldwide as of the third quarter of 2022, according to the firm.
“There is growing potential for attending concerts and live VR theater and performance from the comfort of your home with VR. Just as we are finding a shift in the film industry due to the growth of streaming, I believe we will find audiences who seek XR entertainment. The possibility of being able to connect with your favorite performers and bands while also attending with your friends and doing so from your home environment is compelling.”
—Kiira Benzing, Founder and Creative Director, Double Eye
Drinks in hand, these avatars are part of the VR variety show Skits & Giggles, published by Meta Platforms and available on Horizon Worlds. (Image courtesy of Double Eye Studios and Meta Platforms)
In this virtual living room and elsewhere, the VR show is interactive – scripted in the skits, monologues and musical numbers, but with moments of spontaneity and improv in between. (Image courtesy of Double Eye Studios and Meta Platforms)
U2, Eminem and Maroon 5 are among the artists who have added various types of AR experiences to their shows via smartphones and apps. In 2022, for Flume’s show, Coachella partnered with Unreal Engine to add AR elements to the Coachella YouTube live stream. In addition, the Coachella Valley Music and Arts Festival’s Coachellaverse app offered various immersive AR effects, including geo-specific experiences created in partnership with Niantic. Last December, the virtual band Gorillaz performed AR concerts in both Times Square in New York City and Piccadilly Circus in London.
In October 2020, at a peak of the pandemic, Billie Eilish’s “Where Do We Go?” XR concert from XR Studios in Hollywood served up “a meticulous visual affair, replete with lofty LED screens and extended-reality [XR] effects, that felt determined to recapture weary viewers’ attention,” according to Amy X. Wang of Rolling Stone. She continued, “With the feel of a highly produced music video, the show, which charged $30 a ticket, hit on all the strengths of livestreaming. Enormous animated creatures and chimeric landscapes whirled by around Eilish, her brother Finneas and her drummer Andrew Marshall as they played into multiple roving cameras from a 60-by-24-foot stage; the trio’s sparse physical presence made for a striking silhouette to the rapidly shifting scenery.”
BTS, The Weeknd and Metallica are among the big names who have offered VR songs or concerts. And MelodyVR, NextVR, VeeR, Meta Platforms and Wave are some of the platforms that offer XR concerts and have large libraries of such fare.
Moreover, visual effects are now responsible for an increasing number of virtual performers. South Korea’s Eternity is a virtual K-pop band that uses AI technology, while Aespa has both physical and virtual members.
VR VARIETY SHOW
Kiira Benzing is Creative Director and Founder of Double Eye, which is presenting interactive theater and live entertainment in virtual reality. Her latest effort is the VR variety show Skits & Giggles, which is published by Meta Platforms and available on Horizon Worlds. “We wanted to make a variety show and test how comedy in a scripted format might play out in VR for a live audience,” says Benzing.
The entire show is interactive, according to Benzing. “The show is mostly scripted between the skits, monologues and musical numbers; but there are moments of spontaneity and improv that also arise in every show.” Skits & Giggles was a 2022 nominee for Best Immersive Performance at the Raindance Film Festival.
Benzing comments, “VR is an incredible medium for live performances. I see a wonderful niche deepening in its growth as more and more live performances are being created across different Social VR platforms. Now there are so many new solutions with the hardware becoming more affordable and the Social VR platforms growing in numbers. All of these elements together make an ecosystem more possible for live VR performance to flourish.”
Benzing continues, “VR, AR and XR as a whole are amazing evolutions of experiences we can share with our audiences. The player’s journey is quite different if we use AR to overlay onto the physical world or VR to take them into a brand-new world, but both forms of the tech can be transformative.” She observes, “We are seeing festivals, concerts, bands and theater companies venture into these mediums. Since XR has such a power to transform an experience, I believe the more audiences are introduced to AR and VR, the more they will begin to expect live performances to include an XR extension.”
And, Benzing notes, “There is growing potential for attending concerts and live VR theater and performance from the comfort of your home with VR. Just as we are finding a shift in the film industry due to the growth of streaming, I believe we will find audiences who seek XR entertainment. The possibility of being able to connect with your favorite performers and bands while also attending with your friends and doing so from your home environment is compelling.”
She concludes, “As audiences have a taste for the combination of XR and live performance, they will crave more and more.”
By NAOMI GOLDMAN
Images courtesy of Gale Anne Hurd, except 21st VES Awards photos by Danny Moloshok, Phil McCarten and Josh Lefkowitz.
Gale Anne Hurd strikes a shero pose.
From the legendary Lt. Ellen Ripley battling aliens in deep space to the unwitting target of an unstoppable robotic assassin, to a group of survivors in the aftermath of a zombie apocalypse, Gale Anne Hurd has brought forth iconic characters and cinematic experiences that have transported and transfixed audiences worldwide. One of the most respected and influential film and television producers of our generation, acclaimed producer-writer Hurd has been instrumental in shaping popular culture for nearly four decades. And in the process, she has revolutionized action cinema and delivered transformational depictions of women on screen.
Hurd is one of the entertainment industry’s most prolific producers of film and TV projects that shatter both box office and ratings records. After Hurd rose from Roger Corman’s executive assistant to head of marketing at his company, New World Pictures, her producing career took off when she co-wrote and produced The Terminator. Her unprecedented success was quickly followed by Aliens, which received seven Academy Award nominations and won two, and additional Academy Award-winning films including The Abyss, Terminator 2: Judgment Day, The Ghost and the Darkness and Armageddon. When Hurd entered the television industry, she met similar success in shepherding the juggernaut The Walking Dead and serving in producing roles on Fear the Walking Dead, Talking Dead and Tales of the Walking Dead. Her latest documentary, The YouTube Effect, a cautionary tale on the impact of social media, will be distributed this summer.
In recognition of her enormous contributions to visual arts and filmed entertainment and the advancement of unforgettable storytelling through cutting-edge visual effects, Hurd was honored at the 21st Annual VES Awards with the Lifetime Achievement Award. In presenting the award, filmmaker James Cameron celebrated Hurd as a ‘true gale… a force of nature.’
On that note, Hurd was visibly moved in accepting her award in front of an audience of more than 1000 VFX artists and practitioners at the awards ceremony: “This is one of the proudest moments of my life. That little girl sitting in a movie theater staring wide-eyed at the extraordinary images on the big screen, never dreaming that one day she’d be producing them herself, is still in awe of the magic you create, each and every day. You are my heroes, and to receive your Lifetime Achievement Award is more meaningful than you could possibly imagine. And to be presented the award by Jim – we grew up in the industry together – and to have it be an evening where Avatar got such accolades – was literally perfection.”
VFX Voice sat down with trailblazer Gale Anne Hurd to talk about breaking barriers, her love of craft and celebrating heroic women – on and off screen.
Hurd and Jim Cameron on the set of Academy Award-winner The Abyss.
VFXV: Tell us about your origin story – what were your early sources of inspiration that led you to your career in filmed entertainment?
Hurd: I was an early reader, and I read every science fiction and horror book I could get my hands on. So much so, that the Palm Springs Library asked me to be their consultant and help them acquire books for young people. I even wrote a column for the local paper and wrote sci-fi and fantasy book reviews.
I’ve always been a fan of science fiction, fantasy and horror. Growing up, I was lucky enough to have TV that aired both chiller and thriller movies. My local movie theater was essentially my weekend babysitter. I’d watch double features each and every Saturday and Sunday. I was a huge fan of Ray Harryhausen’s work, in particular Jason and the Argonauts and The 7th Voyage of Sinbad, and I think he was my first visual effects artist/hero/crush. There have been countless since then, but you always remember your first…!
Hurd and Jim Cameron promoting their sci-fi sequel Aliens.
Hurd consults with Charlize Theron on the set of Aeon Flux.
Hurd with feminist icon Gloria Steinem and director Valerie Red-Horse Mohl.
Hurd on the set of The Terminator with Linda Hamilton.
Hurd meeting with director Ang Lee on the set of The Hulk.
Hurd on the set of Aliens with Sigourney Weaver.
Hurd with mentor and friend filmmaker Roger Corman.
VFXV: What most captivates you about the action-adventure genre?
Hurd: I was always an adrenaline junkie and love sharing the theater experience. It’s an art to tell a story that engages many different audiences and finds a way to make them identify with the character in jeopardy and root for them. There is nothing better than being in a dark theater with an audience on the edge of their seats who are cheering, screaming as an integrated part of the experience.
If you boil down the story I strive to tell over and over, it’s of ordinary people who find themselves in extraordinary circumstances and find the strength within themselves that they never knew they had, to succeed and overcome… and in some cases, save the world!
Hurd and the cast of The Walking Dead celebrating its 100th episode.
VFXV: What was your pathway from school to your first job in the film industry?
Hurd: At Stanford, I studied economics, political science and communications. It was my original intent to be a marine biologist, but I realized I would never do well enough in math and science, so I embraced the social sciences. I had a seminal event in my junior year when I was part of the Stanford in Britain program. I fell in love with British film and broadcasting and knew what I wanted my future path to look like.
During my college years at Stanford University, I was lucky enough to have the late producer Julian Blaustein as my mentor. He and I bonded over our mutual love for science fiction. Julian produced the original The Day the Earth Stood Still and was one of the few producers at the time who valued sci-fi as an art form in which to tell powerful stories.
Hurd consults with Jeffrey Dean Morgan on the set of The Walking Dead.
“I want people to see themselves and especially women in a different and ‘enlightened’ light. Not victims cowering in a corner waiting to be saved by an alpha male. There is a rich tapestry of roles that women can and are playing in real life as well as in film and TV. I’m inspired by what we’re seeing here with diverse and older actresses coming to the forefront. They have been doing that for years in British cinema with rich roles for Dame Judi Dench and Maggie Smith. But what’s different now in actresses being lauded in the U.S. [such as Michelle Yeoh and Viola Davis] is that the roles are rather transformative.”
—Gale Anne Hurd
Hurd in Peru for her Amazon anthology series Lore.
Hurd in Prague for her Amazon anthology series Lore.
For a film class my senior year, I chose to write my final paper on Stanley Kubrick’s 2001: A Space Odyssey that touched on its groundbreaking visual effects. As fate would have it, Roger Corman hired me after reading my thesis on 2001: A Space Odyssey, clearly disregarding my less than stellar personal interview with him. So you could say that my fascination with visual effects was instrumental in my Hollywood career from Day 1 and you’d be 100% correct.
VFXV: You have been credited with ushering in the era of strong female protagonists. What did it take to get the industry to support films with “sheroes” in the lead?
Hurd: There were a lot of early challenges in getting the industry to support women as heroes in the lead; Jim [Cameron] and I were lucky that The Terminator was the success it was. At its heart, the movie really is the story of Sarah Connor, and it was wonderful to tell that story through the female gaze. The Kyle Reese moment with Sarah – ‘I came across time for you Sarah, I love you, I always have’ – is one of my favorite lines, but we knew better than to sell it as a love story. We had everything stacked against us and prepared like we were defending our dissertation. We knew it was an easier sell as The Terminator with an unstoppable villain at the center point. Yes, it was sold as the story of the robotic soldier and Arnold Schwarzenegger. But it was her story.
With Alien, we were lucky, because the only character left alive was Ellen Ripley, brought to power by Sigourney Weaver… So if there was going to be a continuation, it would either be her… or the cat! I hope audiences take away from seeing strong women on screen, that women are capable of living their own truths and being the protagonists of their own stories.
Jim Cameron presents Hurd with the VES Lifetime Achievement Award.
VFXV: As we’re celebrating Women’s History Month, why was Wilma Mankiller a subject you wanted to focus on in a documentary? And what drew you to produce True Whispers: The Story of the Navajo Code Talkers and Choctaw Code Talkers?
Hurd: The road to making these documentaries was rich and inspiring. I reached out to a Native American woman director, Valerie Red-Horse, who asked to work with me on a documentary on Navajo Code Talkers and their service during World War II. I had read a terrific script when I was Chair of the Nichols Screenwriting Committee at The Academy. When we went to the Navajo Nation and asked for support from the Navajo Code Talkers Association, they said ‘Please tell our real story.’ So we got financing from ITVS and PBS, and made a well-received documentary. What was most rewarding was to see these code talkers, these men who had to keep their service classified for all these years, finally lauded by young people. That project led to our making of the documentary on the Choctaw Code Talkers.
Then the Cherokee Nation proposed a documentary on Wilma Mankiller, the first woman elected to serve as Principal Chief of the Cherokee Nation. I had never heard of her and that shocked me. As someone interested in women’s studies and leaders, the fact that there was such an amazing woman recognized around the world that I didn’t know about – I was compelled to take this on. We raised most of the funding for MANKILLER on Kickstarter – and wow, the cast of The Walking Dead and the Indigo Girls helped immensely by providing great rewards to donors. I’m proud that our film helped bring national recognition for Wilma, who is now rightly emblazoned on the U.S. quarter coin.
Hurd accepts the Lifetime Achievement Award at the 21st Annual VES Awards.
Hurd backstage at the VES Awards flanked by VES Executive Director Nancy Ward and VES Chair Lisa Cooke.
VFXV: You’re known for embracing daring material. What is the essential “it” in taking on a new project?
Hurd: Whether it’s Wilma Mankiller or The Walking Dead, I return to similar themes when seeking out new projects. Since I am such a hands-on producer, I have to make a visceral decision to leap and dedicate so much of my time to projects so that I don’t regret that choice later. That’s my first litmus test. I really like telling stories of ordinary people thrust into extraordinary circumstances in new ways. I love examining the human condition and posing the question that the audience is thinking – ‘What would I do in that situation?’
I want people to see themselves and especially women in a different and ‘enlightened’ light. Not victims cowering in a corner waiting to be saved by an alpha male. There is a rich tapestry of roles that women can and are playing in real life as well as in film and TV. I’m inspired by what we’re seeing here with diverse and older actresses coming to the forefront. They have been doing that for years in British cinema with rich roles for Dame Judi Dench and Maggie Smith. But what’s different now in actresses being lauded in the U.S. [such as Michelle Yeoh and Viola Davis] is that the roles are rather transformative.
When it comes to diversity, equity and inclusion and the ‘state of women’ in the business, here’s the thing: Socially, culturally, even to this day and from my own experience, women are taught not to stand out or speak up. And we are criticized when we do. Women are more frequently interrupted in meetings and quieted when we are the ‘interrupters,’ and at a certain point, subconsciously, you absorb that and adapt. We often see ourselves differently and sell ourselves short.
I wouldn’t have gotten anywhere without mentors who not only believed in me, but pushed and challenged me, and encouraged me to value myself and continue – even when I wasn’t feeling likely to succeed. From the beginning, I always wanted to work with women in every capacity and recognize and support talent. I’m hoping that with strong mentorship and the success of films and TV series that feature strong women and non-traditional heroes in front and behind the camera, our collective influence will grow and make an impact on everyone our work touches.
VFXV: What excites you about using visual effects technology to advance character-driven and highly visual storytelling?
Hurd: What I love about VFX and where it’s going – it can be used to make sets safer, and that should be a top priority on film and TV shoots. And it can be used to broaden our horizons so that a filmmaker can bring anything they can imagine clearly on the screen and be real enough for audiences to suspend their disbelief and feel they are engaged with a character or immersed in an environment – not an effect. I love that filmmakers like Jim [Cameron] and Guillermo del Toro constantly embrace innovation, and as a result are giving people a reason to go back to movie theaters and see bold visual storytelling done like never before.
By TREVOR HOGG
The disguise workflow is being used for the xR stage at Savannah Film Studios to train students. (Image courtesy of disguise and Savannah College of Art and Design)
Virtual cinematography has long drawn on practical expertise in cameras, lensing and lighting, and its influence keeps widening: real-time game engines have become the cornerstone of virtual production, and previsualization is now a staple of live-action blockbusters. The paradigm that defines cinematic language is shifting, too, as drones make once-impossible shots achievable and several generations of filmmakers have grown up playing video games. The animation, film and television, and video game industries are being drawn closer together as their tools become more universal and integrated. Whether this trend continues, whether controversies such as the one over Life of Pi winning the Oscar for Best Cinematography dissipate, and whether it will become commonplace for animation DPs to be invited into the ranks of organizations like the American Society of Cinematographers all remain to be seen. There is also the question of whether the cinematographer will be consulted during post-production to ensure that camera, lens and lighting choices stay consistent, thereby maintaining the visual language.
To develop a complete understanding of the evolving relationship between virtual and live-action cinematography, professionals from film, television, animation, video games, commercials and virtual production have been consulted.
Greig Fraser, who won an Oscar for his contributions to Dune: Part One and received a nomination for Lion, lensed the pilot episode of The Mandalorian, which is credited with accelerating the adoption of the virtual production methodology, along with the COVID-19 pandemic. “I wish that I was really good at using Unreal Engine because if I was training to be a cinematographer right now, the cheapest tool they can get in their arsenal is Unreal Engine,” Fraser notes. “With Unreal Engine, you have MetaHumans so you can build yourself a face, light it and start to explore how top lights work. You can begin to figure out emotionally how you feel when putting a light on the side, top, diffused or cooler. You can do that quickly. They can then apply that to the real world, but also have a huge positive base of knowledge when getting into the virtual world by knowing what is and isn’t possible.” More than additional tools, a better understanding of and integration with the world of filmmaking is required, according to Fraser. “You can’t put up an 18K, flag and diffuse it the same way you can on set. There is a number to change the softness, width and distance. There are differences between those things. I would like to see correct film lights put into Unreal Engine as it will allow a lot of cinematographers who have trained with an old system to be able to come in and use that in a new world.”
Greig Fraser is able to create unique lens aberrations that stem from his knowledge of live-action cinematography. (Image courtesy of Warner Bros. Pictures)
Observes Cullum Ross, Chief Lighting Technician on Man vs. Bee, “The biggest problem I have with interactive lighting and creating something virtually is, try as you might, you cannot add specular light. That’s so difficult in a virtual environment because you’re dealing with a large LED array and that is soft light. Currently, if you want to create highlights and flares, you need to do it for real.” Three to four separate passes were done for shots involving the bee. Ross notes, “We had lots of different sizes of bees, some more reflective, while others were matte and satin.” When shooting Man vs. Bee, Cinematographer Karl Óskarsson dealt with an antagonist in the form of a CG insect. “You can create a dinosaur or elephant in post, but you need to see something in reality that gives you the shadows and roughly the theme of what you’re doing. Then you can add all of that in post.” Óskarsson adds, “When the bee is between the main actor and camera, the focus of the eyeline has to be right; that’s where puppeteer Sarah Mardel came in with something on a stick. The approach was that the bee would always fall into what we were doing. Occasionally, we could do a strong backlight because we had to see a small backlit bee over in the frame. There were occasional close-ups. It was much more about Rowan Atkinson. The beauty of what Framestore did was to add what was meant to be.”
WALL-E leveraged the live-action expertise of Cinematographer Roger Deakins. “The big comment made by Roger Deakins was, ‘You are making decisions in layout without knowing where the lighting is going to be. I light the set and then film it,’” recalls Jeremy Lasky, DP, Camera at Pixar. “Danielle Feinberg [DP, Lighting at Pixar] and I looked at each other and said, ‘He’s right.’ Previously, we could never manage to get these things working together due to software, complexity and time.” That was the first film where the two DPs could work together visually at the same time. Lasky adds, “Danielle could put some lights in, I could start exploring. We could see shadows and how you could open a hatch in a dark room and the light would spill out. You could time it in editorial and compose to it.”
Director Matt Reeves with Greig Fraser shooting The Batman, which was a combination of shooting real locations and virtual production stagework. (Image courtesy of Warner Bros. Pictures)
ILM’s StageCraft technology was used for the background shots of Gotham featured in The Batman. (Image courtesy of Warner Bros. Pictures)
Bridging the gap between virtual cinematography and virtual production is an important goal for Impossible Objects. (Image courtesy of Impossible Objects)
Cinematography has evolved over the decades, but at its core it’s still moving images. (Image courtesy of Impossible Objects)
Practical understanding and testing are important. “The happy accidents that you have on a live-action set, you have to manufacture in CGI,” notes Ian Megibben, DP, Lighting at Pixar. “We need to borrow from both sides as they can complement each other. When we started on Lightyear, I was dissatisfied with the way our lens flares and aberrations looked because one artist would approach it one way and another artist would approach it a different way. There wasn’t a lot of consistency. Chia-Chi Hu [Compositing Supervisor] and I spent a lot of time studying various affectations on the lens that we rolled into our lens package and that informed the look.” Caution has to be exercised. “The computer tools are so flexible that you have infinite possibilities, but if you use every color in the crayon box, it can start to lose its focus,” Megibben says.
Video game cinematics have become more filmic over the years, as displayed by Ghost of Tsushima, which is in the process of being adapted into a movie. (Image courtesy of Sucker Punch Productions and Sony Interactive Entertainment)
“The paradigm of what happens in live-action does not equal the same components of artistry that happens in animation,” states Mahyar Abousaeedi, DP, Camera at Pixar. “What our department does in layout overlaps multiple disciplines. Storyboards communicate the essence of the story while the execution is more about expanding those ideas and making sure that we’re still building a visual language to escalate until we reach the peak of that sequence. The reason I spent 48 hours looking at boy band references was to find out what makes a dance sequence authentic from 2000 for [the concert in Turning Red]. It’s not just how it was shot. What are those characters doing? I can see boards of the character dancing, but are there imperfections in how they do it and should we see those imperfections? One thing that I enjoy about what I do is seeing certain ideas that live in this rough draft become much more thought out. You need to actually see that idea first because there’s nothing to shoot [except for the set] and you’re depending on a bunch of artists to choreograph it [with the characters].” In an effort to balance fantasy and reality, live-action and animation filmmakers approach the material from opposite directions to accomplish the same thing.
Ira Owens believes that close-up shots are meant to punctuate the storytelling, as displayed by this still taken from Ghost of Tsushima. (Image courtesy of Sucker Punch Productions and Sony Interactive Entertainment)
Meptik is responsible for hybrid xR events such as “Combat Karate” for Karate.com. (Image courtesy of Meptik)
The physical actions of Rowan Atkinson dictated the placement of the insect adversary in the Netflix series Man vs. Bee. (Image courtesy of Netflix)
Framestore was responsible for the CG bee, with practical proxies of various sizes standing in to get the proper framing and lighting. (Image courtesy of Netflix)
“The biggest problem I have with interactive lighting and creating something virtually is that, try as you might, you cannot add specular light. That’s so difficult in a virtual environment because you’re dealing with a large LED array and that is soft light. Currently, if you want to create highlights and flares you need to do it for real.”
—Cullum Ross, Chief Lighting Technician, Man vs. Bee
“In animation, we are trying to make our characters and world feel real to the audience so that the audience can connect with them, so we attempt to do it more like how you do on a live-action set,” remarks Jonathan Pytko, DP, Lighting at Pixar. “Live-action sometimes feels like it’s going in the opposite direction where they’re trying to make it more fantastical and take you out of reality and give it a different vibe. They’re both valid.”
Time spent at Lucasfilm Animation laid the foundation for Ira Owens, who is a cinematographer in the video game industry. “Some key elements that I use are: when you’re showing a wide shot, make it beautiful and epic,” Owens explains. “With medium shots, be clear and concise, and make sure that they’re telling the story. Punctuate with your close-ups. That train of thought I learned from my time on The Clone Wars series has allowed me to thrive with different positions that I’ve held in animation and eventually gaming, which has changed so much over the years to become more cinematic in its storytelling.” Owens works directly in the game engine. “I can break the camera off and scout a location quickly. The controller literally becomes my camera. I have pan, pitch and crane functions. I cruise the camera around and look at a scene. I’m not saying that I don’t utilize storyboards or concept art, or if some live-action footage is available I’ll watch that to see if there are some ideas I want to explore.”
To better handle the proliferation of visual effects, various on-set virtual tools have been produced. In fact, Game of Thrones led to the creation of the iOS app Cyclops AR, which enables CG objects to be composited into the camera view in real-time. “Director Miguel Sapochnik asked me, ‘How big is Drogon?’” remarks Eric Carney, Founder and VFX Supervisor at The Third Floor. “My stock answer was, ‘About the size of a 747,’ which is really not that helpful. I remember thinking that it would be great if I had a way on my iPad, which I always carried with me for notes, to quickly show everyone a composite of Drogon in the actual physical space. Later in the year, when we went to Italy to shoot the Dragon Pit scene for Episode 806, we created an early prototype of Cyclops. In 2022, we produced a new tool called Chimera that uses Microsoft HoloLens AR glasses and works in conjunction with a laptop to render the content, so it is not as portable, but it has higher visual quality, allowing for a lot of flexibility. Multiple users can come together, either in person or remotely, and share an experience of reviewing sets or scenes virtually.”
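The underlying question such an AR composite answers visually, how large a creature of known size reads in frame, comes down to simple camera projection. The Python below is a rough pinhole-camera calculation offered only to illustrate that math; it is not The Third Floor’s code, and the dragon height, lens and sensor figures are assumptions.

# Illustrative pinhole-camera math (not Cyclops itself): how tall, in pixels,
# does a creature of known size appear at a given distance, for a given
# focal length and sensor?
def apparent_height_px(subject_height_m, distance_m,
                       focal_mm=35.0, sensor_height_mm=24.0,
                       image_height_px=2160):
    """Project a vertical extent through a simple pinhole camera model."""
    # Height on the sensor in millimeters, then scaled to image pixels.
    height_on_sensor_mm = focal_mm * (subject_height_m * 1000.0) / (distance_m * 1000.0)
    return height_on_sensor_mm / sensor_height_mm * image_height_px

# A roughly 747-scale dragon (assumed ~19 m tall) seen from 60 m on a 35mm lens:
print(round(apparent_height_px(19.0, 60.0)))  # roughly 997 px of a 2160-px frame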
“Virtual cinematography is much bigger than virtual production,” states Janek Lender, Head of Visualization at NVIZ. “I’m focused on using it in pre-production and storytelling. When I’m working with the director and visual effects supervisor, it’s always about getting a virtual camera in a virtual space, to look around and make a sequence of shots, because my end goal is to have a previsualization. Then they take my previs and break it down for visual effects and work out what’s LED walls and spaces. But you can’t do that until the visualization is there and the director can go, ‘That’s what I want to make.’” The virtual camera system is paired with Unreal Engine to allow the director and cinematographer to take shots of the previs world in real-time. “The good thing about doing your previs there is that those assets can be reused for your postvis. Once they shoot their plates, you can fill the screens with what you did in the previs world or use it as b-reel to help gauge the people who are going to make the volume.”
ILM and its StageCraft technology are among the pioneers of virtual production. “Now we’re going into this different world with virtual production and LED walls where we’re closely tied to the director of photography and camera teams,” remarks Chris Bannister, Executive Producer, Virtual Production at ILM. “One thing that has always defined ILM is the blend of the physical and technology; that’s what makes the beautiful images. There’s always a back and forth. We spend a lot of time making our StageCraft toolsets be in a language that people want to speak, to make sure that the color temperatures match, or when the DP says, ‘Go up or down one stop,’ it’s a real thing. That’s something that is not traditionally in digital toolsets. Working in CG, sometimes people don’t work in those units. What we always try to blend well together is having those two things in dialogue because it’s what gets you the best results.”
Previs remains a central element for working out shots and story beats for directors and cinematographers, as was the case for Matilda the Musical. (Image courtesy of NVIZ and Netflix)
The assets created for previs can be reused in postvis. (Image courtesy of NVIZ and Netflix)
The system for lens flares and aberrations was revised to ensure that they were consistent throughout Lightyear. (Image courtesy of Disney/Pixar)
Cinematographer Roger Deakins consulted on WALL-E, which led to Pixar figuring out how to do lighting during layout. (Image courtesy of Disney/Pixar)
“If you go into the mindset that AI is a tool, it will get you a starting point. AI is something that gets your creative juices flowing. It’s not the one click, you’re done and ready to go. You still have to finesse it and put your secret sauce on it and get things optimized for virtual production. It’s an amazing concepting tool.”
—Kevin De Lucia, VFX Supervisor, Vū
Vhagar is visualized live in rehearsals for Episode 104 of House of the Dragon utilizing The Third Floor’s portable AR Simulcam app Cyclops. (Image courtesy of The Third Floor and HBO)
The Blight sequence in Ant-Man and the Wasp: Quantumania where Scott Lang confronts himself. (Images courtesy of Marvel Studios and The Third Floor)
Vū believes that AI is going to become a starting point for creating environments. (Image courtesy of Vū)
A driving force is the adoption of real-time game engines. “The real benefit of real-time that we’re seeing across all of these shows is that it gives our creatives more bites at the apple,” states Landis Fields, Real-Time Principal Creative at ILM. “With traditional visual effects, visual effects supervisors and all of the folks who are involved will be looking at a monitor to review a shot through screenshare and with a little tool draw around a thing and say, ‘This needs to be bluer.’ You’re trusting a lot of folks to interpret what you’re trying to say. When we do that now in real-time, we don’t guess. We have a screenshare up of a living, breathing world in 3D and ask, ‘What do you want?’ ‘The light should come up from down there.’ ‘Like this?’ ‘A little bit lower. That, right there.’ I cannot stress enough the exponential value of the savings that just happened. Because now we don’t have to drag the whole thing through the process of guessing that we’re right, showing it a week later and learning that we’re not.”
Technology is constantly changing, and virtual production is no exception. “You can start seeing some of those changes now with the recent updates in AI,” remarks Kevin De Lucia, VFX Supervisor at Vū. “AI is going to become a driving factor. Eventually, we’ll be able to start creating environments from certain AI programs. You see how fast that is growing, and it will help change the industry as we move forward. If you go into the mindset that AI is a tool, it will get you a starting point. AI is something that gets your creative juices flowing. It’s not the one click, you’re done and ready to go. You still have to finesse it and put your secret sauce on it and get things optimized for virtual production. It’s an amazing concepting tool. For the immediate future, I see LED screens being the main platform, but those will evolve, too, like the resolution and processing power behind them, everything from your pixel pitch and the different ways of pushing the resolutions that you need to be able to have high fidelity. There are a couple of things we’re working on in R&D where we’re not using an LED panel. There are these LED transparent screens where you can do things that are more interactive with gestures and motions.”
Luc Delamare, Head of Technology at Impossible Objects, has a passion for bridging live-action photography with virtual production. “Cinematography has evolved over the decades, but at its core it’s still moving images, how you approach coverage and the way you create emotion out of frame,” Delamare observes. “All of those things are rooted in the same concepts, and it’s up to the cinematographer, artists and director on how to use those tools. I would like to think when you break that language, your audience will notice even if they don’t understand it.”
Delamare continues, “We’re able to stage previs and imagery upfront in pre-production at a much higher fidelity level. You’re not spending a week or two looking at grey-scale images of your scene, but actually doing something where you’re already making lighting choices well before you normally would in a CG pipeline. In terms of VR scouting, you can scan a room or exterior with your phone and bring it into Unreal Engine in a matter of minutes. It’s freeing, as opposed to a director and cinematographer having to work with a visual effects supervisor and artists to explain the things they want.” AI is a scary proposition. “I used to think that my job was the safest from robots. I’m sure that you’d be able to tell an AI, ‘This is the style I want.’ You can see it already with some of the images that people are doing.”
“We still have those same challenges of traditional cinematography, where we have to overcome the sense of scale and grounding, where the actors are in space, and how the lighting is hitting the actors; with that in mind, I don’t know if virtual cinematography is really changing the game,” notes Addy Ghani, Vice President of Virtual Production at disguise. “When you build a digital world in Unreal Engine, it’s hard to visualize it all. You don’t know how big those trees are, so putting on some headsets and location scouting is helpful for directors and cinematographers to get a sense of scale and space. You can actually do it without the glasses. An LED volume is useful to stand in and take in the environment; you could always look through the camera to see how you are going to frame up a shot.”
An important element is getting individuals educated properly as technology advances old-school techniques such as rear projection, which is the foundation of virtual production. Ghani comments, “You still need training to be able to harness the power of the tools. Making the sunrise and environment look realistic enough for high-end feature films, that’s the difficult part and where training and expertise come in.” There is room for improvement, Ghani adds. “I would love to get even better photorealistic, higher-quality output out of Unreal Engine; that would game-change a lot of shots. Right now, numerous shots still require some post-visual effects enhancement to get it to that final level of completion. Bringing that stuff to principal photography would save time and money and give directors and cinematographers immediate creative decision-making.”
By TREVOR HOGG
Images courtesy of Sony Pictures Animation.
One of the most difficult tasks was to slightly age Miles Morales and Gwen Stacy without them appearing unfamiliar.
Much like The Matrix, which caused a ripple through the film industry because of innovative storytelling, Spider-Man: Into the Spider-Verse had the same impact, with its creative influence going beyond its Oscar win for Best Animated Feature to inspire several imitations. Upping the ante is the first of two proposed sequels, Spider-Man: Across the Spider-Verse, by showcasing not only the world inhabited by a slightly older Miles Morales but those serving as the homes of Spider-Gwen, Spider-Man 2099 and Spider-Man: India. And let’s not forget that the cast of Spider-People has been multiplied by 10, and Miles finds himself clashing with them on how to handle the portal-empowered villain known as the Spot.
To handle the expanding multimedia aesthetic of the franchise, Sony Pictures Imageworks created a new department called Look of Picture. “On this one, there are so many technical hurdles and different looks that I’ve spent 80% of my time developing tools,” states Bret St. Clair, Senior Look Development and Lighting TD at Sony Pictures Imageworks. “We saw a lot of art that depicted paint being thrown around in-frame in ways where strokes aren’t specifically connected to a character or to anything. They’re floating in space. The problem for us is that we try to approach everything by rendering in a traditional way. Then we try to have all of our tools handle the rest in compositing. When it comes to positioning brushes in 3D, we needed to be able to reconstruct where those positions are, even when they don’t lie specifically on the surface that you’re applying the brush to, and the rendering tools, especially in 2D, don’t give you that information for free.”
Watercolor was the medium of choice for Spider-Gwen’s world. “A lot of the look early on was inspired by the Jason Latour art, and everything was running vertically in that world,” St. Clair remarks. “As you start to move cameras around and we started to get through shots, it became clear we can’t have everything running vertically. It’s a balancing act because if you’re painting brushed volumes in a scene and the character starts to move, the brush volumes themselves get your attention as opposed to the thing you’re supposed to be looking at, or they give you the impression that they’re something else other than a volume. There are a lot of things that you can do in 2D that work because it’s a still frame, but as cameras move, it falls apart. Over the time we’ve been working on Gwen’s World, the tools have had to evolve so that the brushes automatically understand the surfaces that they’re painting onto and paint in the right directions, or we have the ability to add hand-drawn strokes to something interactively.”
Concept art of the adversary known as the Spot, who is made of paint and ink and has the ability to dissolve worlds into nothingness.
“There are so many technical hurdles and different looks that I’ve spent 80% of my time developing tools. We saw a lot of art that depicted paint being thrown around in-frame in ways where strokes aren’t specifically connected to a character or to anything. They’re floating in space.”
—Bret St. Clair, Senior Look Development and Lighting, Sony Pictures Imageworks
Linework is a major component to the look of Spider-Man: Across the Spider-Verse, with the tool development responsibility given to Pav Grochola, FX Supervisor on the film and Lead VFX Artist at Sony Pictures Imageworks. “When you’re creating linework, you’re starting off with a 3D model, and we try to make it look as hand-drawn as possible, which means that you’re trying to emulate the artistic process,” Grochola explains. “What an artist would do in the case of Medieval Vulture is emulate Leonardo Da Vinci as much as you can. But all of that stuff is camera dependent, so you can’t be 100% procedural because 100% procedural looks like toon shading. We actually created our linework inside of Houdini, and it was driven by artists. The linework itself was in curves. We had a dedicated team of artists looking after the linework just for Medieval Vulture. The way you make something has a big impact on how it looks, so as much as possible we try to create procedural linework in Houdini, but also have artists draw linework on keyframes. Then that linework would be interpolated by our systems. It was a combination of hand-drawn and procedural. That’s the big secret with this stuff.”
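As a rough illustration of the interpolation idea Grochola describes, the Python sketch below blends a stroke drawn by an artist on two keyframes to produce the in-betweens. It is a toy example, not Imageworks’ Houdini-based system, and all names and data are invented.

# Each stroke is a polyline; resample two keyed strokes to the same point
# count, then linearly blend them for the in-between frames.
import numpy as np

def resample(stroke: np.ndarray, n: int) -> np.ndarray:
    """Resample a polyline (N x 2) to n evenly spaced points along its length."""
    seg = np.linalg.norm(np.diff(stroke, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])
    t /= t[-1]
    targets = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(targets, t, stroke[:, i]) for i in range(2)], axis=1)

def inbetween(key_a: np.ndarray, key_b: np.ndarray, blend: float, n: int = 64) -> np.ndarray:
    """Blend two keyframed strokes: blend=0.0 returns key_a, blend=1.0 returns key_b."""
    a, b = resample(key_a, n), resample(key_b, n)
    return (1.0 - blend) * a + blend * b

# The artist draws the stroke on frame 1 and frame 9; frames 2-8 are interpolated.
key1 = np.array([[0, 0], [1, 2], [3, 3]], dtype=float)
key9 = np.array([[0, 1], [2, 2], [4, 2]], dtype=float)
frame5 = inbetween(key1, key9, blend=0.5)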
Medieval Vulture, inspired by the drawings of Leonardo Da Vinci and made of sepia-toned paper, turned out to be the most complicated character to execute.
Taking part in the battle at the Guggenheim is the motorcycle-driving and pregnant Jessica Drew/Spider-Woman and anti-hero Miguel O’Hara/Spider-Man 2099.
The visual aesthetic for Spider-Gwen was based on the original comic book artwork by Jason Latour.
Sony Pictures Imageworks took advantage of the inbuilt watercolor solver of Rebelle by integrating the hyper-realistic painting software into its pipeline.
The visual aesthetic of Mumbattan was inspired by Indrajal Comics, which was a comic book series in India.
For the Spot, Sony Pictures Imageworks worked with Slovakian creative software company Escape Motions. “In terms of the technique, we did something interesting in this film in that we partnered with Escape Motions, which creates this painterly software called Rebelle that has an inbuilt watercolor solver, and you can do cool stuff like wet the canvas,” Grochola remarks. “You can put ink [on the wet canvas] and the ink spreads into the paper. We always tried to simulate the natural organic detail of painting in real life. Right at the beginning, we were working with that tool, and one of our biggest goals was to try to make the movie look as hand-drawn and handmade as possible; that tool to us was the perfect thing for experimenting with natural media. Not only can Rebelle do watercolor but oil paint and charcoal in such a convincing way that would be hard for us to develop from scratch. We merged that software into our software and created this cool 3D and 2D combination where the Spot is made of paint, ink, and moves around leaving ink droplets; he is himself constantly redrawn with natural media.”
Concept art of Spider-Man: India, Spider-Gwen and Miles Morales battling the Spot in Mumbattan.
Medieval Vulture is made out of sepia-toned paper. “We have him as translucent like paper so he scatters light through,” remarks Mike Lasker, VFX Supervisor at Sony Pictures Imageworks. “His character is probably one of the most complicated characters we have ever made because he is this Leonardo Da Vinci Medieval-style Vulture character with tons of pulleys, feathers, and he’s got all of this stuff going on plus all of his linework. A lot of linework we do hand-drawn also. We mix actual 2D-drawn lines in with a lot of our stuff to give it that last piece of hand-drawn quality. But you need materials. What was a challenge on the first one, as on this one, is: what does a Da Vinci metal look like versus a Syd Mead? We do our traditional look development to a point and then we hand it off to our Look of Picture team where we do a style on top of that. I was just looking at an environment yesterday, and it’s in a world of pistons and gears and you’re in an entire environment of metal. Our look development looked realistic, but I tried to get the team to paint enough texture in there that is in the style so when we apply the Look of Picture tool it all works cohesively.”
The industrial artwork of award-winning visual futurist and conceptual artist Syd Mead, a member of the VES Hall of Fame, was the inspiration for Nueva York, which is located on Earth 928 where Spider-Man 2099 lives.
Videos on YouTube were referenced to better understand how artist/designer Syd Mead went from the blank page to creating fully-realized, retro-futuristic worlds.
Concept art exploring what the watercolor skyline of Manhattan would look like on Earth 65, which happens to be the home world of Spider-Gwen.
Linework is a major component to the look of Spider-Man: Across the Spider-Verse and had to appear hand-drawn despite starting off with a 3D model.
Simulations have to take on the characteristics of the world in which they occurred. “Just in terms of the principles of film work and photography, we have to come up with how we do depth of field in a painterly way versus a more architectural way, versus how we do it in Miles’ World versus in India,” Lasker notes. “You have to experiment and be constantly ready to fail over and over again or at least try a lot of different things. One of the things we learned on the first one is you can’t inch your way to the finish line. You have to overdo it, and then figure out, what do we like from that, what is working and what is not? What does a lens flare look like in Spider-Man 2099’s world versus Spider-Man: India? Every aspect you have to reinvent. The first film was really just one look.” Explosions have to be stylized and believable. “A lot of it is constantly playing with exposure, blowing out the lens, and how the light from the explosion affects the world around it,” comments Lasker. “If the explosion is in Gwen’s World, you want to feel the heat of the fire on the surfaces, and the shadows being cast are brushed like a painting, or the warmth on the surfaces needs to be brushed. How do those brushes react differently to what has already been brushed in the environment? You’re not only creating a painting, you’re having to paint the effects of the effect on the environment.”
To handle the expanding multimedia aesthetic of the Spider-Verse franchise, Sony Pictures Imageworks created a new department called Look of Picture.
The webbing for each of the Spider-People, including Spider-Gwen, had to reflect the visual aesthetic of the world they come from.
“The way you make something has a big impact on how it looks, so as much as possible we try to create procedural linework in Houdini, but also have artists draw linework on keyframes. Then that linework would be interpolated by our systems. It was a combination of hand-drawn and procedural. That’s the big secret with this stuff.”
—Pav Grochola, FX Supervisor/Lead VFX Artist, Sony Pictures Imageworks
The world of Miles Morales is visually defined by nongradient colors and a graphic comic book style.
Spider-Punk evolved in animation. “We had to figure out all of the ways to handle his layers, and when you have a character who is meant to look like a magazine cut-out in a fleshed-out world that already has all kinds of crazy art directions in it, it’s hard to make it all cohesive,” observes Alan Hawkins, Head of Character Animation at Sony Pictures Imageworks. “But we found a formula once we got into the shots, like offsetting the frame rates of certain aspects of his body from other parts of his body. There was speculation on the first one that there was meaning behind the frame rates, but that was not the case. His jacket sometimes is a different frame rate than his body, and it’s to give him a patchwork collage feeling, and there are some slightly different layers to him being pasted together, which we did a lot of tests on; he is like 3s and 4s sometimes whereas everyone else is 2s.” The voice acting of Daniel Kaluuya was surprising. “It had such a swagger to his read that made the character so clear to us. Spider-Punk is probably one of my favorites now because he is consistent in his presentation to everyone else, except for a few key moments where you get to see his real truth, and that’s an amazing thing that we got to work with.”
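For readers unfamiliar with animating “on 2s” or “3s and 4s,” the short Python sketch below shows how holding a pose for a different number of frames per body part produces the offset, collaged feel Hawkins describes. It is purely illustrative and is not the studio’s actual stepping tooling.

# Animating "on Ns": a pose is held for N frames before updating, and
# different elements can hold for different lengths to stay offset.
def stepped(frame: int, step: int) -> int:
    """Return the frame whose pose is shown at `frame` when holding on `step`s."""
    return (frame // step) * step

for frame in range(12):
    body_pose = stepped(frame, 3)    # the body held on 3s
    jacket_pose = stepped(frame, 4)  # the jacket on 4s, deliberately out of step
    background = stepped(frame, 2)   # the rest of the world on 2s
    print(frame, body_pose, jacket_pose, background)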
“In the upgrade of Miles from a young teen to an older teen, his growing up a little was one of those cases where, when the new shapes came into play, we handled it wrong; he no longer looked like himself,” Hawkins reveals. “Miles has a particular cheekbone structure and eye-corner shaping, a particular silhouette from the side, and his mouth in the first movie had a bit of an overbite. As they aged him up a little bit, you want to give someone a more pronounced jawline. That stuff did affect the way Miles looked, and he became a little unfamiliar. We had to make adjustments to how the rig would behave and how we handled the posing to make him look like a grownup version of that same Miles everybody loves.”
A massive focus was placed upon cinematography, composition and camera language. “That’s one of the things that we would say to every animator when they would join the show: you have to learn that language about overs, French overs, two shots, what lenses mean, which lenses to use in certain circumstances,” Hawkins says. “Every animator was basically an honorary DP on this film because that was half of what we were handling.” Hawkins would love to do a reference cut for the whole movie. “I would like to point out how much the animators’ soul goes into these performances. For every scene where you see someone crying up there, that animator probably got to that place and filmed themselves doing it and re-referenced that stuff. It’s so complex and adult in a great way that we don’t often get in animation.”
Masks were incorporated into the texture paint workflow. “For the texture painter, it’s color, bump, roughness and maybe a metallic,” observes texture painter Nicolle Cornute-Sutton. “Those are the traditional PBR types. On Spider-Verse, the laws of physics don’t apply, so physically-based rendering means nothing! Metal doesn’t shine like metal in our world. You have to think of things from a 2D perspective. For example, there is a dinosaur in our movie, and usually we would displace the scales to sell this bumpy, scaly reptilian creature, but you can’t do that in the Spider-Verse, so it’s almost working in a drop shadow and tracing things and giving it a more graphic read.” Templates were created. “The art department wanted it more stylized, and the only way you can do that is by hand. As our team grew, we had to make sure that we laid down foundations in Mari for new painters. We had to develop a lot of templates using node graphs so people could pick up a template and at least have a starting place so no one had to figure out for themselves, what is Miles’ World? Then we had to have a robust library of textures to show people that this is the target. That was new to me having to start from scratch.”
A complex battle takes place in the Guggenheim Museum. “It’s in Gwen’s World, but the Guggenheim is actually filled with photorealistic artwork created by Jeff Koons,” Cornute-Sutton states. “It was one of the only times in the movie where we actually did go into Substance 3D Painter. We had photorealism, an architecturally significant building that we had to recreate, but in Gwen’s World style, and there is a helicopter bursting through the skylight in the ceiling. Vulture is in this sequence, and he’s from the Leonardo Da Vinci Verse, so he’s in a completely different sepia tone style. Then Spider-Man 2099 comes in with this digital suit that comes on in bits and bytes; his right-hand gal Lyla pops in and she’s a hologram. Jessica Drew hops in on her motorcycle. And there’s Gwen and her father. They’re all in their styles.”
Diversity is paramount in the Spider-Verse. Observes Cornute-Sutton, “I felt like this was the first time I got to paint people who looked and sounded like me and my family. That was an incredibly exciting and thrilling thing. The audience is going to see that and feel that too. It’s not just about race or gender. I hope that this is just the beginning of many films where there’s a cognizant realization on the part of the filmmakers that we all want to see ourselves in CG features. I cannot help but say kudos to Sony Pictures Animation for taking a chance on making a movie that speaks to a broader range of audiences than has been done before.”
By OLIVER WEBB
Images courtesy of Roger Guyett.
Roger Guyett joined ILM in 1994 to work on the groundbreaking computer animation for Casper.
Roger Guyett grew up in the Farnborough area in Hampshire, England, where he went to school, then college in Bristol, before eventually finding his way into the world of computer animation in the mid 1980s. “After college, I didn’t know what to do with myself and ended up doing construction jobs, played music in bands around London, lived in France, and generally had a good time, but it wasn’t really taking me anywhere. I did an art foundation course and developed an interest in animation and the idea of making images with computers. Eventually, a friend told me that the British government was funding people to retrain in computer science, so I applied and did an MSc at University College London. I wasn’t quite sure where it was going to take me, but it felt like a positive step. This was all before people had home computers or even before email addresses, which sounds mad as it really wasn’t that long ago!” says Guyett.
After completing the course, Guyett saw an intriguing job advertisement in the Evening Standard for a position at a London post-production company. “As part of the interview process, they asked me if I could do an animation of a flag, an analysis of how it might move,” Guyett explains. “I did the best I could and fortunately was offered the job. At that time, people with all sorts of backgrounds were starting to work in computer animation. There was no specific training, no college courses in Computer Animation – there were probably only 20 of us working in the entire industry in London. I hadn’t known this world existed, but I’d stumbled into something that was a perfect fit for me, and I loved it. Working at a post-production company was incredibly exciting and you were exposed to all sorts of work – TV idents, commercials, pop videos, all sorts of different kinds of work. The people were amazing, and you got to learn every aspect of production from storyboarding to editing. We even acted in some of the work we did when we didn’t have much budget.”
Continues Guyett, “In those days, commercials were often shown in cinemas on film. The commercials were authored on video and then crudely transferred onto film. To do proper digital effects, you had to be able to scan film to digital and then, of course, get the resultant digital images transferred back onto film at a much higher fidelity. This issue was an ongoing development in the industry and crucial to the process. ILM, of course, had solved that problem a few years earlier. One of my proudest moments at that time was working on a Perrier commercial that was one of the first examples of a hi-res digital film-out in the U.K. The technology had been developed by Mike Boudry, who had a company called the Computer Film Company, which was eventually acquired by Framestore. That was the first film work I was involved with,” Guyett notes.
Guyett’s big break came when he was hired by Pacific Data Images (PDI) in 1992 and moved to California to work in the rapidly expanding world of digital film effects. He worked at PDI for two years before moving to ILM in 1994. “When I was a kid, I loved movies, but I never imagined I’d get to work on one! The digital VFX world suddenly exploded after movies like Jurassic Park came out, and there were a lot of great opportunities in America for people who knew how to do that work. When I arrived at ILM, there was actually still a mix of more traditional VFX, like miniatures. A lot of work was still done using optical techniques, but the new horizon was the digital side. My first show was Casper with legend Dennis Muren as Digital Character Supervisor. I was a senior technical director there – you lit the shot, did any effects, and then composited it all together. There was a strong delineation between the disciplines, which became even stronger as each of the disciplines became more and more complex. From there, I was lucky enough to become a CG supervisor on shows like Mars Attacks! and Twister. That show pushed particle systems to a whole new level – all sorts of tools and ideas were developed for that film. There were a lot of very smart people working at ILM. It was amazing seeing ideas like ambient occlusion, for example, get developed there.”
Guyett with daughter Ella at the 2009 Academy Awards for Star Trek.
Guyett at the 2018 BAFTAs for Ready Player One.
Mission: Impossible III VFX team in China, 2005. From right: Guyett, Tom Peitzman, Marty Bosworth, Duncan Blackman and Chris Raimo.
Guyett with Star Wars: The Force Awakens co-writer Laurence Kasdan and J.J. Abrams’ executive assistant, Morgan Dameron.
Guyett at Pinewood with many familiar faces from the Star Wars franchise.
Guyett worked with VFX Supervisor Stefen Fangmeier while at ILM. “He was one of the original digital guys at ILM and had worked on Terminator 2: Judgment Day and Jurassic Park. We’d worked on Twister together. He needed a co-supervisor for Speed 2 and he was kind enough to ask me. That got me on the VFX Supervisor path at ILM, which I’m eternally grateful to him for,” Guyett adds.
In 1997 Guyett worked on Saving Private Ryan, which proved to be a turning point. “Although Stefen was the main supervisor, he got really busy on another show and it meant I was essentially on my own. There’s nothing like feeling the heat and pressure of your first solo show as a VFX Supervisor! It was a great mix of both digital and practical effects. The biggest digital shot was the beach establisher with all the ships, landing craft and troop replication – doing all the interactive water was such a challenge then. We did a lot of effects work – explosions, bullet hits and flying debris and plenty of grotesque wound FX work, standard fare in a war movie these days. And, of course, all the tracer fire! Steven Spielberg was a real force of nature at that time and at the top of his game. I was pretty inexperienced on set and suitably nervous, but had to step up and deal with it all. At that time, you didn’t have the ability to do the number of takes you can do now, the computers were a lot slower and disk space was at a premium. It was actually one of the first shows where we used the internet as a research tool. I remember typing in ‘D-Day’ and we got back three or four images! What an incredible experience that show was. I was dealing with one of the greatest filmmakers in the world and watching him work first hand,” Guyett explains.
Guyett at the premiere of Star Wars: Episode VII – The Force Awakens, 2015.
Guyett on location in China for Star Wars: Episode III – Revenge of the Sith.
Star Wars: Episode VII – The Force Awakens second unit on location in Iceland.
Saving Private Ryan town set.
J.J. Abrams with the camera crew on Star Wars: Episode IX – The Rise of Skywalker.
Guyett worked with some of his closest collaborators on Star Wars: Episode VII – The Force Awakens, including Pat Tubach and Animation Supervisor Paul Kavanagh.
Saving Private Ryan beach set.
Guyett’s credits also boast two Harry Potter films. “Rob Legato was the Production Supervisor on The Philosopher’s [aka The Sorcerer’s] Stone. I came and joined him as one of the main supervisors, working through the shoot and then supervising ILM’s work. One of the first sequences I helped shoot was the snake in the zoo. I learned so much from Rob – he was a big influence on me. He had such a strong perspective as a filmmaker, something that I tried to develop through my career. You’re not just looking at the VFX work artistically and technically, but also trying to understand the context of the work and how it contributes to the film as a whole. Warner Bros. was so anxious about the first movie, which is so funny to think of now. You don’t get to work on bigger or more complicated movies than something like Harry Potter and, again, it was a great learning experience. It was an amazing mixture of miniatures, practical and digital work.”
Guyett also worked on the action-packed Mission: Impossible III.
Working as both the Second Unit Director and VFX Supervisor on Star Trek allowed Guyett to have more control and authorship of the shots.
Guyett served as the main visual effects supervisor on Harry Potter and the Prisoner of Azkaban. “It’s still one of the strongest visual movies I’ve ever worked on. Alfonso Cuarón is such a visual director – the shot compositions, the way his shots develop. He didn’t have a lot of experience doing VFX, but he had such great ideas. I learned a lot about filmmaking from him. It was a big and complex show and we had a great team. Mike Eames was the Animation Director, and Tim Burke was the other main supervisor on the show. The complexity of the work often demanded innovation and a tremendous amount of planning to set the shots up for success. That’s something that seems sadly less important these days. For example, we developed some great motion-base work tied into the animation for the sequence when Harry rides Buckbeak the hippogriff.
Guyett on the set of Star Trek.
“[J.J. Abrams and I] have worked on five movies together. He was somebody that I learned from tremendously and was able to grow with. He’s extremely visual but complements that with his background as a writer. I think we have common sensibilities and similar tastes, which of course really helps. We’ve done so much work together now that we have developed a shorthand. You just don’t have to spend so much time communicating your ideas, you can cut to the chase. He still surprises and challenges me…”
—Roger Guyett, Visual Effects Supervisor/Second Unit Director, ILM
Guyett and J.J. Abrams have collaborated on five movies.
Guyett on the set of Star Trek.
One of Guyett’s most significant contributions to visual effects was his work on J.J. Abrams’ Star Trek.
It was really gratifying seeing the final shots come together, seeing that Harry’s movement was so integrated with the animation because of the mo-base work we’d done. The Time-Turner sequence was another very memorable moment for me. It was probably one of the most complicated and most planned shots I’ve ever worked on. You had so many different elements. Different plates of the cast and background actors moving at different frame rates, but all tied together with moving lighting. Then multiple miniature shots of models at different scales all comp’d together,” Guyett details.
The Knight Bus sequence also proved to be particularly challenging to create. “Alfonso wanted the freedom to move the camera at will inside the bus. We had to figure out a way of shooting a 360 environment plate, something that people do all the time now!” Guyett says. “We built a 14-camera film rig on a vehicle. All the heads were stabilized, and we worked out the visual overlap between all the cameras to create the necessary plate for the interior of the bus. Of course, the bus windows helped because they restricted the view a bit. We’d go to all the locations, which were super expensive as it was all shot in the middle of London, and only needed one or two runs of the vehicle to do all the plates we needed, which, of course, production loved. Then, in post we used a terrifying number of Avids to sync all the footage together, which then allowed us to figure out how to move the interior bus set on stage, essentially syncing the motion of the bus set to the background plates we’d shot. If the bus turned left, it leaned appropriately, and because we had a full background plate, Alfonso could shoot in any direction. Trying to explain all this to Alfonso was near impossible, but fortunately he trusted me and it all worked out. Another trick was using a bunch of projectors, which then projected the same footage back into the bus itself. It gave you this great sense of texture and speed and excitement. Alfonso certainly tested your ingenuity and imagination.”
One of Guyett’s most significant contributions to visual effects was his work on J.J. Abrams’ Star Trek. “At this point, I was more confident and had developed and refined my approach more. Working as both the Second Unit Director and VFX Supervisor really allowed me more control and authorship of the shots. One of those touchstones on Trek was that famous image of the Earth photographed by the Apollo astronauts from the moon. Half the Earth is in shadow and half is lit. When we made Star Trek, that image really inspired my approach to the work. It seemed to encapsulate the idea of exploration, the potential danger, the unknown, traveling into the shadows, but also light and the strong contrast of the potential outcomes and, of course, simply put, the kind of lighting style I was excited about. We also famously developed a whole ‘lens flare’ language for the film, something that I see now in other movies. It created this visual energy, like you were seeing into the future or something. It was certainly vibrant but also very dark. J.J. and DP Dan Mindel did such a great job with that movie. It was so well cast and so much fun.”
Discussing his relationship with Abrams, Guyett notes that he was fortunate enough to work with Abrams at the beginning of his career as a film director. “We’ve worked on five movies together. He is somebody that I learned from tremendously and was able to grow with. He’s extremely visual but complements that with his background as a writer. I think we have common sensibilities and similar tastes, which of course really helps. We’ve done so much work together now that we have developed a shorthand. You just don’t have to spend so much time communicating your ideas, you can cut to the chase. He still surprises and challenges me – he’s incredibly inventive. I’m very proud of the work that we’ve done together. I’ve been very lucky, he’s been a great partner.”
Guyett also collaborated with Abrams on Star Wars: Episode VII – The Force Awakens. “I worked with some of my closest collaborators again, people like Pat Tubach and Animation Supervisor Paul Kavanagh. On that show we finally saw the opportunity to upgrade the famous lightsabers! It always bugged me, especially having done Revenge of the Sith, that you didn’t get the correct interactive light from the sabers themselves; you were always cheating the interactive light as a separate light source. On Force Awakens there’d been some really amazing advances in LED tech, so we could finally build a lightsaber that was literally a tube of LED lights. Finally, the prop was a real light source! It really helped the quality and realism of the work. DP Dan Mindel was also a great partner on that show, and he really took advantage of the new lightsabers in some key moments in the film. It’s great to work on movies that have the budgets to do that kind of work. Regardless, you’ve always got to be cognizant of budget and try to design the shot within whatever parameters you have for the best-looking shot.”
“As a visual effects supervisor, you’re often one of the first to join a production and the last to leave – you get to see the entire process unfold,” Guyett reflects. “I’ve been lucky enough to work on such a variety of projects, from Pirates to Mission: Impossibles, and work with some incredibly talented people. I enjoy every aspect of the process: the planning, the shoot and, of course, working with the team to create the shots and see the movie come together.
On location in the Skellig Islands, Ireland, for Star Wars: Episode VII – The Force Awakens.
Guyett and the crew on location in Petra, Jordan, for Star Wars: Episode IX – The Rise of Skywalker.
In hindsight I was lucky to start supervising at that crossroads between more traditional VFX techniques and the digital world, so I was fortunate to do a lot of miniature work, which I always loved. For example, on Star Wars: Episode III we had a massive practical model crew, despite the huge amount of digital work. It had one of the biggest miniature shoots in ILM’s history: the Mustafar scenes, which I supervised. It’s such a shame that it’s hard to do that work anymore. One of the fundamental ideas I learned from working with directors like J.J. and Steve Spielberg was about trying to make sure that each shot you worked on really advanced the story, even in a small way. Each shot has a purpose – it has a reason to be in the movie.”
Guyett with DP Dan Mindel and Production Designer Darren Gilford on Star Wars: Episode VII – The Force Awakens.
Guyett on location in Alaska for a Star Trek plate shoot.