Quantum Dream Labs Tour

Zach Johnson
11 min read · Jan 31, 2025


Image generated by the author using Adobe Firefly

Taylor Nguyen is a reporter for Green Pages who visits the award-winning production studio Quantum Dream Labs in 2028, interested in learning first-hand about QDL’s innovation practices. While at QDL, Taylor takes a deep dive into how the production team works, the kinds of tools they use, and how their production pipeline differs from the studios of the early 2020s, when the rapid evolution of technologies started disrupting traditional studio production processes. The following is a (fictional) account of this (fictional) reporter’s experience of the future of entertainment.

I’ve been giddy with anticipation ever since I learned that Green Pages would be granted exclusive access to the notorious (and literal) black box that is Quantum Dream Labs. QDL shook the film production industry to its core a few years back with the premiere of “Ukrainian Phoenix”. The multi-award-winning historical drama-cum-Zelensky biopic charted the rise of Ukraine from a war-torn country to one of the strongest economies in Europe and a major power-broker in both NATO and the United Nations. This surprise Hollywood blockbuster was by no means the first globally successful depiction of war and recovery. It was, however, the first feature-length film to utilize QDL’s now-patented technology to combine artificial intelligence–enhanced historical footage with an entirely virtual production pipeline, including generative augmented reality set design, all on a meager mid-eight-figure budget. Since that initial breakthrough, QDL has gone from strength to strength, racking up a trophy cabinet of awards for films, video games, and immersive live-action experiences. All productions were entirely conceived, developed, and produced across their four “lots” in San Diego; Toronto, Canada; Brisbane, Australia; and Surrey, England — the combined size of which would fit comfortably into the now-shuttered Paramount Studios lot in Los Angeles.

As my UberAir touched down at QDL San Diego, I was struck by how unremarkable the matte, windowless cube of a building was in contrast to the blue sky, the surrounding rolling hills, and the glistening Pacific on the horizon. The featureless exterior, constructed from what seemed like a single pane of black glass, felt at odds with the font of creativity I imagined flowing within its walls. I approached the entrance, identified only by a narrow walkway from the air-taxi set-down point that wound through an impeccably manicured minimalist succulent garden in hues of green and tan to the middle of the north-facing wall, where a small white Quantum Dream Labs logo was flanked by an array of barely visible cameras and scanners. Since I had provided biometric information upon scheduling my appointment, I was digitally recognized and authenticated just as I realized there was no discernible door. A moment later, one seemed to simultaneously materialize and open, granting me access to an area barely larger than a maître d’ desk at one of those retro sit-down “restaurants”.

The layout of the QDL lot is identical in each of their four locations, as I will learn over the course of my unprecedented tour. Upon slipping through the automatic black glass door, I’m greeted by QDL SD Studio Executive Cameron Kim. Cameron started with QDL in the early days and is responsible for production operations across all four locations, as well as for the teams charged with continuous improvement and innovation of all QDL facilities and technical infrastructure. Before we commence the tour, Cameron guides me into a minimalist meeting room with a clear glass boardroom table and white mesh-covered ergonomic chairs. The room’s windows become transparent just as the lights come on low, and a holographic display illuminates, hovering above the table in the middle of the room. I assume this automated sequence of events is calibrated specifically to Cameron’s presence, which is confirmed when they throw me a subtle, sparkling grin while gesturing for me to have a seat.

As we settle in, a floor plan appears on the hovering display in front of us. Cameron explains that each QDL Qube is designed and constructed as a fully self-contained studio, able to independently produce any form of linear, interactive, live-action, or animated content — from feature films to open-world AAA video games to interactive metaverse events and anything in between. Cameron describes the always-on communication link between the four Qubes, effectively a window and a persistent two-way intercom that allows for real-time collaboration across sites. According to the floor plan, each Qube is anchored around an updated version of the old-school “Volume”, initially developed by Industrial Light & Magic. ILM’s Volume was a wall of LED panels onto which any background could be projected directly, so that both the camera and the actors saw the same set in real time. It was a revelation at the time. The QDL version replaces the LED panel array with a single seamless LiFi wraparound layer and removes the need for most physical foreground set pieces by generating high-fidelity AR assets in real time. Of course, props requiring direct handling or manipulation by the actors are still sourced, constructed, and provisioned by the QDL “IRL” team. Surrounding the central LiFi stage on three sides are production studios for visualization, ambiance, and traditional ‘physical’ — or IRL — production.

On either side of Studio 1 (The Chloe Zhao Stage) are two identical meeting rooms, including the one we’re currently sitting in. These spaces are used for virtual project pitching and planning, physical and virtual writers’ rooms, and production meetings. Outside Studios 2 (The George Lucas Stage) and 3 (The Zoya Akhtar Stage) are viewing and staging rooms. Just off the fourth side of the Volume, an airport-hangar-style door opens onto a large loading bay in constant dialog with the fleet of unmanned delivery drones and autonomous shipping vehicles, whose comings and goings are almost entirely controlled by AI. Finally, between The Chloe Zhao Stage and the meeting room we’re occupying is an intimate screening room, complete with vegan leather reclining chairs and 3D VR headsets.

The entirety of this setup occupies a mere 4,500 square meters, about the size of a field used by the International Robot Football League. If I weren’t sitting in this facility staring at the blueprint, it would be difficult to wrap my head around that number, considering that production studios used to take up multiple city blocks, or even the space of a small town. Cameron registers my disbelief and, standing, suggests with a welcoming smile that we start the physical tour.

As with the exterior entrance, there are no visible interior doors in any part of QDL. Instead, as we get up from the table and turn towards the rear wall of the meeting room, a door automatically recedes from its hiding place in the wall and opens into Studio 1 — “The Zhao”.

Zhao is the largest of the three studios, Cameron tells me. It’s a unique setup, incorporating the teams and technologies that support the entire production process from previs through post. The visual flow of the room calls to mind an atelier: an inherently creative realm where technology intertwines with artistic expression. The Zhao includes screens (both physical and digital), computers of various sizes and descriptions, hardware for creation and visualization, and myriad software applications, including the AI Assistant. In that respect, it has all the components of a production house. But despite — or maybe because of — all this advanced technology, I get the impression that this is a much more human space than I had anticipated. Here, writers, artists, animators, designers, and technicians work side by side, each with unique skills and talents. This creative army is sculpting, painting, and animating to push the boundaries of visual storytelling.

The Zhao is a true hub of creativity and innovation, where team members exchange ideas and draw inspiration from one another. Each member of the Zhao team wears lightweight synthetic denim coveralls. An embroidered patch on the left chest pocket displays their name, and a patch on the right shoulder indicates which part of the production team they work with. Not only are the coveralls functional, allowing the team to shift comfortably between digital and analog work, but the material they’re constructed from also contains a series of sensors that capture and transmit individual GPS, accelerometer, gyroscope, magnetometer, and pressure data. These data points allow the team to move freely (and hands-free) through the lot and also let the coveralls function as basic motion-capture suits, so that VR avatars can accurately mirror their wearers’ movements when collaborating in virtual environments. While they may look like vintage engineer-wear, Cameron adds, these are some of the most advanced uniforms outside of those worn by aquanauts and astronauts.
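As a rough illustration of how those coverall streams could drive an avatar, consider the classic sensor-fusion step: blending the gyroscope and accelerometer into a stable orientation estimate. The Python sketch below shows a simple complementary filter; the SensorSample schema, field names, and axis conventions are my own illustrative assumptions, not QDL’s actual data format.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One reading from a coverall sensor node (hypothetical schema)."""
    timestamp: float                   # seconds since stream start
    accel: tuple[float, float, float]  # (ax, ay, az) in m/s^2
    gyro: tuple[float, float, float]   # (gx, gy, gz) in rad/s

def update_pitch(prev_pitch: float, sample: SensorSample, dt: float,
                 alpha: float = 0.98) -> float:
    """Complementary filter: blend gyro integration (smooth but drifting)
    with the accelerometer's gravity vector (noisy but drift-free)."""
    ax, ay, az = sample.accel
    _, gy, _ = sample.gyro
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))  # pitch from gravity
    gyro_pitch = prev_pitch + gy * dt                  # integrate angular rate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Feeding each incoming sample keeps the avatar's torso pitch in sync:
# pitch = update_pitch(pitch, sample, dt=sample.timestamp - last_timestamp)
```

A real suit would track full-body orientation across many such nodes (with the magnetometer correcting yaw drift), but the blending principle is the same.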

Once an approved script arrives from the writers’ room, the previs team here in Zhao begins a series of collaborative storyboarding sessions within a virtual reality environment. This team, often spread across different locations, uses a combination of VR and XR devices and specialized previs software to collectively visualize and refine each scene, offering a more immersive and interactive approach than traditional storyboarding techniques. The first step is the AI Assistant’s digitization of the approved script, within seconds, into a complete first-draft storyboard. This draft incorporates placeholder elements drawn from QDL’s extensive asset libraries along with some generative art, including characters, voices, props, set pieces, and backgrounds. The previs team, including the Story Runner, don headsets that let them literally step into each frame, interacting with life-size assets to test accuracy against the script and making the adjustments needed to ensure fidelity to the writer’s vision. What emerges from this rapid, real-time iteration is a comprehensive pre-production plan that is automatically distributed to every relevant team member.
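As a rough sketch of what that first automated step might look like, here is a minimal Python illustration of turning parsed scenes into draft storyboard frames backed by library or generative placeholder assets. The Scene and StoryboardFrame types, the asset library, and the naming scheme are all hypothetical stand-ins of mine, not QDL’s software.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    number: int
    setting: str           # e.g. "INT. WAR ROOM - NIGHT"
    characters: list[str]

@dataclass
class StoryboardFrame:
    scene: int
    background: str
    character_assets: list[str]
    status: str = "draft"  # every frame awaits previs review in VR

# Hypothetical library: slugline keywords -> pre-built background assets.
ASSET_LIBRARY = {
    "war room": "bg_war_room_v3",
    "street": "bg_city_street_v7",
}

def draft_storyboard(scenes: list[Scene]) -> list[StoryboardFrame]:
    """First-draft storyboard: reuse a library asset where a keyword
    matches, otherwise flag the background for generative placeholder art."""
    frames = []
    for scene in scenes:
        setting = scene.setting.lower()
        background = next(
            (asset for key, asset in ASSET_LIBRARY.items() if key in setting),
            f"genart:placeholder({scene.setting})",
        )
        frames.append(StoryboardFrame(
            scene=scene.number,
            background=background,
            character_assets=[f"char:{name.lower()}" for name in scene.characters],
        ))
    return frames
```

The real system presumably parses the script, casts placeholder voices, and stages props as well; the point is simply that a structured script makes an automated first pass cheap, leaving humans to do the refinement in VR.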

As another example of the work done in Zhao, Cameron highlights the Micro Virtual Production Team. Using advanced augmented reality (AR) technologies, some of which are proprietary to QDL, designers, the director, and the Story Runner collaborate to create virtual sets and environments, replacing the storyboard elements with a combination of production-ready assets and additional placeholder instructions for the IRL elements. As with previs, this technology allows for real-time interaction with virtual elements, enabling seamless integration of live actors and computer-generated imagery within the film’s scenes. The process also identifies additional assets for the larger distributed team to generate, pushing them into a version-control queue that is managed and tracked by the AI Assistant. While many of these assets are handled by the Visualize team, others are routed to Ambient and IRL. Cameron informs me that we’ll walk through those studios now as we continue the tour.
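Conceptually, that routing step is simple dispatch. Here is a minimal Python sketch of how newly identified asset requests might land in per-studio queues; the asset kinds and team names echo the tour, but the mapping itself is my own illustrative guess rather than QDL’s actual logic.

```python
from collections import deque
from enum import Enum

class Team(Enum):
    VISUALIZE = "Visualize"  # Zhao: models, environments, animation
    AMBIENT = "Ambient"      # Akhtar: sound and lighting
    IRL = "IRL"              # Lucas: physical props, costumes, prosthetics

# Hypothetical mapping of asset kinds to the studio that produces them.
ROUTING = {
    "character_model": Team.VISUALIZE,
    "environment": Team.VISUALIZE,
    "score_cue": Team.AMBIENT,
    "lighting_design": Team.AMBIENT,
    "prop": Team.IRL,
    "costume": Team.IRL,
}

# One work queue per studio, tracked centrally (by the AI Assistant, in QDL's case).
queues: dict[Team, deque[str]] = {team: deque() for team in Team}

def route_asset(asset_id: str, kind: str) -> Team:
    """Push a newly identified asset onto the owning studio's queue."""
    team = ROUTING.get(kind, Team.VISUALIZE)  # default to the main studio
    queues[team].append(asset_id)
    return team

# Example: route_asset("hero_jacket_014", "costume") -> Team.IRL
```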

We leave Zhao and make our way to The Zoya Akhtar Stage to see Ambient in action. Like the Zhao team, everyone in Ambient wears matching high-tech coveralls — although theirs are workwear green, like a vintage mechanic’s. The studio similarly houses a multi-disciplinary group — this one of musicians, sound designers and engineers, digital composers, digital lighting designers, and ambient editors. Akhtar produces all components of the pipeline that are neither visual nor IRL, the vast majority of which involve sound and lighting. The sound team handles overall sound design, digital music composition, performance, recording, digital foley, digital sound effects (SFX), and sound and dialog editing. The lighting team handles digital lighting design and editing, including all digitally created ambient and practical lighting elements.

One of the more subtle innovations at Akhtar is the tight coupling of sound and light in a unified studio, treating them as critical elements that jointly and inextricably support the overall narrative. This emphasis on ambiance enhances the audience experience of all QDL productions regardless of format or medium and has turned out to be a significant differentiator for QDL among other studios. Immersive collaboration is key to this differentiation. Unlike the Zhao VR hardware, the Akhtar team uses modified VisionBeats headsets that offer enhanced pulse-code modulation (PCM) spatial audio along with a patented synthesized photosensitivity capability, developed specifically for QDL, that extends perception into near-infrared wavelengths (somewhere between 700 and 1,000 nanometers). The combination of immersive, enhanced audio and massively expanded light perception gives the Akhtar team near-superhuman abilities, resulting in an ambient experience for the audience that is unmatched in the industry. As a musician, Cameron displays a particular passion for the work undertaken in this studio, practically vibrating with excitement while describing it. But we need to move on. This time to Studio 2 — IRL.

If Zhao and Akhtar are the hearts and minds of QDL, Lucas is its soul, Cameron tells me as we step into the last studio of the tour. Lucas feels simultaneously familiar and foreign. This team’s coveralls are a “Hamilton Brown”, as if they rolled off the Carhartt factory line. Unlike the other studios here, the IRL team is significantly more likely to have blisters, calluses, and repetitive stress injuries — which is to say that they perform manual labor with their hands. Cameron insists that this fact makes them no less artistic or creative than anyone else under this roof. Looking around and feeling the energy and buzz present in this studio, I believe it. Anything that needs to be created in real life, be it a set piece, a prop, a voice-over recording from an actor, a piece of analog music, a costume, a prosthetic (yes, they still make these from time to time), or an actor’s hair and makeup, is done here. This team is perhaps a bit more outwardly welcoming and accustomed to visitors; there is a steady flow of people coming and going, contributing to the magic of what happens when humans collaboratively produce tangible assets. I feel both a pang of nostalgia and a sense of wonder at the energy in this space. Cameron lets me know that this is nearly the end of the tour, but we have one more stop in the screening room to wrap up and field any final questions I might have. For some reason, I’m most hesitant to leave this studio, but I pull myself away and follow Cameron out.

We arrive at the screening room and I settle into the most comfortable recliner I’ve ever sat in. The headset next to me is another custom piece of hardware and another innovation developed by QDL. Cameron sits down next to me and explains how QDL uses audience data to inform the production process. Periodically, starting with the first draft of the interactive storyboard before pre-production, they bring in ‘audience’ proxies (a.k.a. “normal people”) to experience the story. Each audience member wears a headset that tracks and monitors a series of data points, including eye-tracking, facial expressions, electrodermal activity (EDA, also known as galvanic skin response), heart rate, electroencephalogram (EEG) readings, temperature, and pupil dilation, to get a complete snapshot of their objective response to the production.
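To give a sense of what aggregation might look like downstream, here is a minimal Python sketch that rolls per-viewer readings up into a per-story-beat snapshot of the kind a Story Runner could review. The BiometricSample schema and the choice of signals are my own illustrative assumptions, not QDL’s analytics pipeline.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class BiometricSample:
    """One per-viewer reading captured at a story beat (hypothetical schema)."""
    viewer_id: str
    beat: int              # index of the storyboard beat being screened
    heart_rate: float      # beats per minute
    eda: float             # electrodermal activity, microsiemens
    pupil_dilation: float  # millimeters

def engagement_by_beat(samples: list[BiometricSample]) -> dict[int, dict]:
    """Average each signal across viewers, per beat, so spikes and lulls
    in audience response line up against specific story moments."""
    by_beat: dict[int, list[BiometricSample]] = {}
    for s in samples:
        by_beat.setdefault(s.beat, []).append(s)
    return {
        beat: {
            "heart_rate": mean(s.heart_rate for s in group),
            "eda": mean(s.eda for s in group),
            "pupil_dilation": mean(s.pupil_dilation for s in group),
            "n_viewers": len(group),
        }
        for beat, group in sorted(by_beat.items())
    }
```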

These sensor outputs are combined with qualitative data from a questionnaire that elicits their subjective response to the content. Individual readings and feedback are aggregated and run through QDL’s advanced data analytics to surface general preferences and engagement. These data are then delivered in an iterative series of reports to the Story Runner, who may choose to adapt certain story elements to the audience-sentiment snapshot. The extent to which the Story Runner uses these data depends on the medium: linear content tends to skew toward the artistic intent of the creator, whereas interactive experiences may alter production based on real-time audience feedback. The bottom line, Cameron emphasizes, is that QDL stories are written and produced by humans for humans. And with that, my tour has concluded. With as warm a smile as I received when I first entered the lot, Cameron gestures to the exit. I emerge from the screening room, walk through the small lobby, and step outside into the surrounding Southern California landscape, where my little aerial taxi awaits to whisk me back to reality.

While I’ve only seen a snapshot of their overall process, I realize that I’ve witnessed real innovation at QDL: a truly agile studio production pipeline in action, characterized by rapid iterations and flexible workflows, driven by human teams, and supported by some of the most advanced AI tools available. And it still feels deeply creative and artistic, with a bit of controlled chaos and a lot of passion for ensuring the writer’s vision is realized in the final product. Quantum Dream Labs promised to define the future of production. From what I’ve seen today, they’ve not only achieved that goal, they’re raising the bar for what comes next. And they’re doing it with people and teams whose energy and passion give me hope that the one remaining trait humans can claim a monopoly on — creativity — is alive and well.

Written by Zach Johnson

AI Strategist. Executive. Writer.