• Keller Berry

The Mandalorian Effect: Extending Realities in Georgia's XR Landscapes


Lightsabers and crimson laser blasts cut through a landscape of deciduous trees shrouded in eerie fog. Ahsoka Tano (played by Rosario Dawson) establishes herself as a Jedi warrior, slicing through enemies with brute strength and a creative edge to each killing. Nightfall is upon this forest of death, and the bare trees still standing after the fighting take on an endless quality. Where does the forest begin, and where does it end?


In an instant, the sky brightens and the seemingly endless forest is suddenly behind the camera. The colossal doors of a guarded village open slowly, revealing a new set in the grayish-blue light of dawn. The environmental transition is smooth, almost effortless. Seasons one and two of “The Mandalorian” share this magical quality of seamless movement from one otherworldly set to the next. When the audience isn’t immersed in the uncanny forest with Tano, they’re in the air with Cara Dune (played by Gina Carano) flying a ship through the crevices of towering mountains in broad daylight, or walking through the desert before sunset alongside The Mandalorian (played by Pedro Pascal) and Grogu (voiced by David Acord).


“PEOPLE KNOW THEY CAN DO MORE, SO IT OPENS A PANDORA'S BOX FOR PEOPLE TO CONSIDER SHOOTING AT 10 DIFFERENT LOCATIONS. A DESERT SCENE BY MONDAY AND A BEACH SCENE BY FRIDAY. VIRTUAL PRODUCTION DOES NOT KEEP PEOPLE BEHOLDEN. WITH SOME PROJECTS WE CREATE 25 DIFFERENT LOCATIONS.”


-NICK RIVERO, MEPTIK STUDIOS CO-FOUNDER AND CTO


Evolutionary cinema is happening across the globe as television series and films adapt to virtual production on a grander scale, creating true-to-life sets that exceed viewers’ expectations. The Disney+ Star Wars spin-off “The Mandalorian” was shot on an LA soundstage encased by LED walls displaying ever-changing digital sets. Instead of contending with a static green screen, the cast of “The Mandalorian” performed inside a 20-foot-high, 270-degree semicircular LED video wall with a ceiling and a 75-foot-diameter performance space, called the Volume.



Now, virtual sets can be built before principal photography begins, and the results are groundbreaking. VFX artists draft the location backdrops as 3D models, photographic scans are mapped onto the stage, and practical set pieces are combined with the digital backgrounds to create a lush mise-en-scène.


Extended reality, known as XR, refers to the combination of the real and virtual worlds, in which human-machine interaction is generated by computer technology and wearables. The X in XR stands in as a variable for any current or future spatial computing technology. Spatial computing was defined in 2003 by Simon Greenwold as “human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” The umbrella covers augmented reality (AR), mixed reality (MR), and virtual reality (VR).


According to Business Wire, the virtual production market was valued at $1,463.46 million in 2020 and is projected to reach $4,744.04 million by 2028. Virtual production is an umbrella term for the emerging technology that uses software tools to combine computer graphics and live-action footage in real time. This cutting-edge technology cuts costs on location scouting, the art department, and more, making the execution of creative ideas an intuitive rather than arduous process.
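For a rough sense of what that projection implies, compounding the two Business Wire figures across the eight-year span works out to annual growth of just under 16 percent (a back-of-the-envelope check, not a rate stated in the report):

```python
# Back-of-the-envelope check of the Business Wire projection:
# what compound annual growth rate (CAGR) turns the 2020 market
# size into the projected 2028 figure?
start, end = 1463.46, 4744.04   # USD millions, 2020 and 2028
years = 2028 - 2020             # eight-year span
cagr = (end / start) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 15.8% per year
```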


While the virtual production masterminds behind “The Mandalorian” are the pioneers in XR stages, companies all around Georgia are offering competing technologies that are worth delving into.


In an interview with Oz, Nick Rivero of MEPTIK Studios, a Georgia-based company that specializes in immersive environments for virtual and extended reality (XR) production, experiential design, real-time content, and projection, broke down the three realities of virtual production for readers: “VR, virtual reality: you put on a headset and it changes your perception. AR, augmented reality: taking the digital and putting it into the physical, i.e. Instagram, Pokémon GO. XR: using the digital world to create entirely new and immersive environments.”


As a full-service production studio with centrally located offices in Atlanta, MEPTIK creates high-quality content across industries throughout the nation, reaching target audiences with engaging virtual and hybrid experiences while increasing return on investment, reach, and engagement. Their goal is to create otherworldly spaces that you can partake in, but with great power comes great responsibility.


“People know they can do more, so it opens a Pandora's box for people to consider shooting at 10 different locations. A desert scene by Monday and a beach scene by Friday. Virtual production does not keep people beholden. With some projects we create 25 different locations,” Rivero said. “We focus on the virtual pipeline technology, sometimes set dressing is required.”


MEPTIK has an extensive and creative portfolio, including, but not limited to, Virtual Conference with Video Game Engine, Fleurie Augmented Reality Music Video, UT Projection Mapping, BMW LED Art Installation, and Metro Atlanta Chamber Mercedes-Benz Content.


South of MEPTIK, Trilith Studios, the 700-acre full-service film studio in Fayetteville, opened a new LED-stage virtual production service in partnership with MBSi, Fuse, and SGPS/ShowRig. It is similar in concept to Industrial Light & Magic’s StageCraft LED platform used for “The Mandalorian,” and to the LED-stage virtual production service of Weta Digital, based in Wellington, New Zealand.


When sitting down with Oz, Trilith’s COO and Executive of Production, Craig Heyl, warned about the margin for error in using the pioneering technology. “You’re working with the entire team in order to best shoot on a virtual stage. Something to keep in mind is that there’s less room for error on a virtual stage. With a blue [or green] screen, everything is adjusted in post; with an LED screen, it should all be as desired from the start,” Heyl told Oz. “The tech is in and of itself a way to accelerate the storytelling process. It’s not a technology that every filmmaker has access to.”


“One of the challenges we were tasked with was finding and building an approximately 40-by-70-foot, 140-degree volume LED wall, with less than a week to build, shoot, and take it down. It was critical to the film, doing something audacious for a less-than-audacious budget. This is the goal,” Heyl added. “Most VR/XR stages are impermanent, and their teams are brought together for a short period of time. With a permanent team, from a production company’s standpoint, renting a space that is already built is inevitably much cheaper, and that’s the goal.”


Georgia companies are now focused on ways to make this technology more accessible. “Georgia is pioneering this technology in its accessibility. You’ll find this tech everywhere, in varying forms, but the style of total access to a brick and mortar location, with in-house teams to guide filmmakers, is new. And Trilith is pioneering this evolution toward an infrastructure of accessibility,” Heyl said.

“GEORGIA IS PIONEERING THIS TECHNOLOGY IN ITS ACCESSIBILITY. YOU’LL FIND THIS TECH EVERYWHERE, IN VARYING FORMS, BUT THE STYLE OF TOTAL ACCESS TO A BRICK AND MORTAR LOCATION, WITH IN-HOUSE TEAMS TO GUIDE FILMMAKERS, IS NEW.”


-CRAIG HEYL, TRILITH STUDIO’S COO AND EXECUTIVE OF PRODUCTION



Georgia Film Academy shares a campus with Trilith Studios in Fayetteville, giving its students hands-on access to the technology. “We have Georgia Film Academy right here on our lot. All the trades are needed, and our plan is to train people at Trilith Studios to work in the new industry.” Additionally, the Creative Media Industries Institute (CMII) at Georgia State University is teaching students the same skills used to create “The Mandalorian” in its VR cave at 25 Park Place in the heart of Downtown Atlanta. In fact, CMII is in the process of planning its own XR stage and LED wall.


“We are looking to implement our XR stage at the end of this calendar year,” Director of Operations at CMII, James Amann, told Oz.

“What we’ve done is really coordinated with some of our preferred vendors, as well as other partners in the region in both the private and public sectors, to identify a solution that’s right-sized for our studio and for CMII and GSU’s first XR stage. We intend it to be part of our contribution to the MFA program through virtual production, but also for cinematics and special effects in both film and design, VR, AR, and other storytelling technologies.”


The new MFA program will feature virtual production as a component of one of its tracks. “This is the MFA in partnership with the College of the Arts, so one of the tracks is focused on emerging film technologies inclusive of virtual production,” Amann said.


“It’s going to be roughly a 13.5-by-24-foot stage, and it will use the OptiTrack motion tracking system for camera tracking,” Amann explained.


“The XR stage and LED wall is definitely intended to give GSU students access to technology that is emerging in the film industry now. We’re trying to help train students to manage and work with this technology, because we know that industry partners in Georgia and the region are clamoring for people who understand this technology and who can work with these stages,” Amann added.


North of GSU, Music Matters Productions functions as a one-stop event production company with an extensive range of high-end audio, visual, and lighting equipment, and a team of creators. Aaron Soriero, owner of the Woodstock-based company, discussed the engine behind the virtual sets they build for clients: “Unreal Engine is used to set up 3D environments and have all things interact in a real-time way.” Unreal Engine is a game engine developed by Epic Games with a high degree of portability, supporting a wide range of platforms, including desktop, mobile, console, and virtual reality.


“Anything you can think of, you can do in XR, if you are creative and know how to work in a studio space. Creative storylines can make the narrative happen, and you can design whatever you want in that space,” Soriero told Oz. He added that people proficient in coding and 3D software, as well as veterans of the traditional film industry, should consider applying their skills to the XR industry.


“There’s something there in the live aspect of it, something to utilize XR in a live event stage space. Manufacturers are sending updates to software and hardware that cater to XR needs. It’s only a matter of time before we marry the two and bring live events and XR out of the studio and onto the festival stage,” Soriero added.


Music Matters Productions has a dynamic portfolio that includes the Georgia Tech Arts Skyline Series, Shaky Knees Music Festival, a Heineken brand activation, a Rolling Stone party, AFROPUNK Atlanta and New York, and much more. With the help of expansive virtual production technology, each show is unique, making the engagement with crowds all the more significant.


In April, the Savannah College of Art and Design (SCAD) announced new XR stages and a backlot expansion at its Savannah Film Studios and in metro Atlanta. The stages are being created in partnership with MEPTIK, which comes as no surprise, as MEPTIK’s founders are SCAD alumni. The new stages will allow SCAD and Savannah to bring pioneering productions like “The Mandalorian” to Georgia.



“It's an innovative new way to composite actors into totally digital backgrounds. For feature filmmaking it represents an entirely new paradigm for creating scenes with actors and elaborate visual effects or any type of landscape and you can see the results as you go. Television will use this approach for storytelling as well but will also explore putting hosts into totally virtual sets or environments for sports, news, game shows, weather, advertising, fashion, commercial and corporate projects,” Dean of SCAD School of Digital Media, Max Almy, told Oz.


“At SCAD, in the School of Digital Media, we have been working with this idea of virtual production for years now. We are currently developing a virtual production minor. We have worked on several virtual productions with major industry partners including a special SCADpro project that created virtual sets for an international sports event. By anticipating these new trends in technology, we stay ahead of the curve. We've worked with Epic/Unreal game tech for years and we are ready to jump into this new marriage of real-time rendering and filmmaking, working hand-in-hand with the School of Entertainment Arts. We are giving our students the opportunity to become some of the first XR filmmakers and creative technologists that will be working with this new paradigm,” Almy added.


“WE ARE GIVING OUR STUDENTS THE OPPORTUNITY TO BECOME SOME OF THE FIRST XR FILMMAKERS AND CREATIVE TECHNOLOGISTS THAT WILL BE WORKING WITH THIS NEW PARADIGM”


-MAX ALMY, DEAN OF SCAD SCHOOL OF DIGITAL MEDIA


SCAD students are going to have access to a curriculum and a virtual production minor so that they have the opportunity to learn the skills of XR production. “It’s a new territory with new pipelines of production. Directors, cinematographers, lighting designers and visual effects teams will all learn new ways of working with each other,” Almy said. “Filmmaking and visual effects professionals will be diving into XR production. They may already have some of the skills necessary to work with XR production and many may need to learn more about Unreal or the Disguise software that controls the XR system.”

With XR on the rise and the virtual production industry booming, it’s only natural to suppose that this is an elongated funeral for the once-beloved green screen. However, this is not necessarily the case. “As we are learning more about XR production, we are finding that green screens will still play a big part in the planning of visual effects for any film. Certain scenes will be best done on green screen, and the two techniques can be combined in any production. Green screen stages can use the same real-time compositing and camera control software. Green screen is also being used to prep and test for XR scenes. Future filmmakers and creative technologists will be well served to become experts in green screen and XR virtual production,” Almy said.


“I’m personally super excited about SCAD’s new XR stages,” SCAD School of Digital Media student Ryan Harper told Oz. “They will be an incredible asset to our program and a tool for students to have access to, since it’s the future of film production and visual effects. The XR stage goes one step beyond a green screen, allowing filmmakers and post-production artists to collaborate in the same space in real time. The XR stage is also rewriting the visual effects pipeline and curriculum at SCAD with its ability to integrate Unreal Engine into a film set, allowing virtual lighters and set dressers to easily communicate with the director during a shoot and change the entire environment in minutes. It’s seriously cool tech!”


The excitement for XR productions in Georgia is palpable, with eager and skillful students in the wings, ready to take on roles in the next grand virtual challenge on set. And if locations are no longer unreachable, why not have them rendered in our home state, where we can also benefit from the spotlight “The Mandalorian” has shone on extended reality? Perhaps it won’t be long before a galaxy far, far away is right here in Georgia.