The AI magic behind Sphere’s upcoming 'The Wizard of Oz' experience

“The Wizard of Oz” may not be the first film shot in color, but many people remember it that way because of how director Victor Fleming cleverly contrasted the black-and-white scenes set in Kansas with the Technicolor world of Oz.
Likewise, “The Wizard of Oz” may not be the first film to be reconceptualized with AI, but it may soon be known for that, too.
For months, thousands of researchers, programmers, visual effects artists, archivists and producers at Google DeepMind, Google Cloud, Sphere Studios, Magnopus, Warner Bros. Discovery and others in the film and technology industries have been working to bring the 1939 classic to a very big screen in a very big way.
On August 28, their work will debut at Sphere, the colossal Las Vegas venue that has been pioneering new forms of entertainment since it opened in September 2023. Now, generative AI will take center stage, alongside Dorothy, Toto and more munchkins than could ever fit in a multiplex.
It’s fitting that a work that once broke cinematic boundaries will do so again. “The Wizard of Oz at Sphere” is an equally epic undertaking of creativity and technology, where the story will envelop the venue’s 17,600-seat spherical space to create an immersive sensory experience.
Even a few years ago, such an undertaking would have been nearly impossible with conventional CGI. It’s really only become possible through the latest advances in generative AI media models, specifically Imagen and Veo, with Gemini also playing a major role. Not only does the team need to create an all-encompassing experience, but they must also do so with only the original material. Not a line of new dialogue was added, nor a note of new music sung, in enriching this classic for Sphere.
“We talked about doing it in different ways,” says Jane Rosenthal, the Academy and Emmy Award-nominated producer on “The Wizard of Oz at Sphere.” “We realized that we really needed to do it with AI.”
The wonderful wizards
Not that the team can simply enter a few AI prompts, click their collective heels and call it a day. Buzz Hays, the global lead for entertainment industry solutions at Google Cloud and a producer with 37 years in Hollywood, points out this is about more than using AI to expand an old film for a new format.
“We’re starting with the original four-by-three image on a 35mm piece of celluloid — it’s actually three separate, grainy film negatives; that’s how they shot Technicolor,” Hays says. “That obviously won’t work on a screen that is 160,000 square feet. So we’re working with Sphere Studios, Magnopus and visual effects artists around the world, alongside our AI models, to effectively bring the original characters and environments to life on a whole new canvas — creating an immersive entertainment experience that still respects the original in every way.”
When the project was first getting underway, many on the team, including within Google, openly wondered whether AI technology was far enough along to complete the work or achieve the group’s collective vision. But because traditional CGI wouldn’t do the trick, at least not without massive expense and years of toil — and because everyone was excited to break new ground — they got to work.
“The models, they’re wildly innovative,” Dr. Steven Hickson, a Google DeepMind researcher on the project, says. “We’d find something we can't do, we think it's impossible, and then a month later we're like, actually, maybe we can do that.”
You can see why it seemed impossible, though.
Magnifying the original grainy images for Sphere’s 16K LED screen — the highest-resolution screen in the world — was the first but far from the only challenge. The team also had to account for all the camera cuts in a traditional film, which remove characters from parts of certain scenes and wouldn’t work at the new theatrical scale envisioned. Conventional CGI might have handled the scaling issue, but it would have struggled to fill out the rest of the scenes effectively.
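To get a rough sense of that scale gap, here is a back-of-the-envelope comparison in Python. The numbers are illustrative assumptions, not production figures: it treats the source as a 4K scan of the original 4:3 frame and Sphere’s interior display as a 16,000-by-16,000-pixel canvas.

scan_w, scan_h = 4096, 3072          # assumed 4K scan of a 4:3 Academy frame
sphere_w, sphere_h = 16_000, 16_000  # assumed pixel dimensions of the 16K LED canvas

scan_pixels = scan_w * scan_h        # about 12.6 million pixels
sphere_pixels = sphere_w * sphere_h  # about 256 million pixels

print(f"Source frame:  {scan_pixels / 1e6:.1f} megapixels")
print(f"Sphere canvas: {sphere_pixels / 1e6:.1f} megapixels")
print(f"Pixel count grows roughly {sphere_pixels / scan_pixels:.0f}x")

Even under these generous assumptions, each source pixel would have to stretch across roughly 20 screen pixels, which is why simple interpolation was never going to hold up.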
Take the moment where the Cowardly Lion first pounces on his soon-to-be companions. The camera pans back and forth between the Scarecrow and Tin Man, with cuts to Dorothy hiding behind a tree in the distance. The experience at Sphere called for keeping all these elements together, in hyper-realistic detail.
To achieve this, the team has three major technical hurdles to overcome.
The magic of fine-tuning
Using versions of Veo, Imagen and Gemini specially tuned for the task, the Google teams and their partners developed an AI-based “super resolution” tool to turn those tiny celluloid frames from 1939 into ultra-ultra-high definition imagery that will pop inside Sphere. Then, the teams perform AI outpainting to expand the scope of scenes, both to fill the space and to fill in the gaps created by camera cuts and framing limitations. Finally, through performance generation, they’re incorporating composites of those famed performances into the expanded environments.
Together, these techniques help achieve the natural gestures, staging and fine details that conventional CGI struggles to match.
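At a high level, that pass over each shot might be organized as a simple three-stage pipeline. The sketch below is purely illustrative scaffolding: the stage functions are hypothetical placeholders standing in for the specially tuned Veo, Imagen and Gemini tools, not the production code.

def super_resolve(frame):
    """Stage 1: upscale a grainy 35mm frame toward screen-ready resolution."""
    return frame  # placeholder for the tuned super-resolution model

def outpaint(frame, canvas_size):
    """Stage 2: extend the image beyond its original 4:3 framing to fill the canvas."""
    return frame  # placeholder for the tuned outpainting model

def composite_performance(frame, reference_plates):
    """Stage 3: re-composite the original performances into the expanded scene."""
    return frame  # placeholder for performance generation

def process_shot(frames, canvas_size, reference_plates):
    """Run one restored shot through the three stages, frame by frame."""
    output = []
    for frame in frames:
        frame = super_resolve(frame)
        frame = outpaint(frame, canvas_size)
        frame = composite_performance(frame, reference_plates)
        output.append(frame)
    return output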
“When the request came to us, I was almost jumping up and down,” says Dr. Irfan Essa, a principal research scientist at Google DeepMind and director of its Atlanta lab. “This is the best opportunity to showcase the magic that we develop using AI.”
Yet for all the powerful new technology at play, one of the biggest breakthroughs comes from following the traditions of cinema: having plenty of extra material to work with. In addition to old footage, the team scoured archives to build a vast collection of supplementary material, such as the shooting script, production illustrations, photographs, set plans and scores.
Through a process known as fine-tuning, these materials are uploaded to Veo and Gemini so the models can train on specific details of the original characters, their environments and even elements of the production, like camera focal lengths for specific scenes.
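One common way to stage that kind of archive for tuning is a flat manifest that pairs each digitized asset with a note about what the model should learn from it. The sketch below assumes a hypothetical folder of assets and a generic JSONL layout; it is illustrative only, not the project’s actual tuning setup.

import json
from pathlib import Path

# Hypothetical archive folder and annotations, for illustration only.
ARCHIVE_DIR = Path("oz_archive")

annotations = {
    "shooting_script.pdf": "Scene-by-scene dialogue and staging notes from the shooting script.",
    "set_plan_emerald_city.png": "Production illustration of the Emerald City set layout.",
    "still_kansas_farmyard_012.tif": "On-set photograph of the Kansas farmyard, with lens notes.",
}

# Write one JSON record per asset; flat JSONL files like this are a common
# input format for fine-tuning jobs.
with open("tuning_manifest.jsonl", "w") as manifest:
    for filename, description in annotations.items():
        record = {
            "asset": str(ARCHIVE_DIR / filename),
            "description": description,
        }
        manifest.write(json.dumps(record) + "\n")

The richer the annotations, down to details like camera focal lengths for specific scenes, the more specifically a tuned model can reproduce the look of the original production.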
With far more source material than just the 102-minute film to work with, the quality of the outputs dramatically improved. Now, Dorothy’s freckles snap into focus and Toto can scamper more seamlessly through more scenes. Every change, Hays notes, was made in close collaboration with Warner Bros., to ensure continuity with the spirit of the original.
Follow the yellow brick code
As the team continues their journey with this truly larger-than-life project, many are still in awe of all they’ve achieved — and excited about what is yet to come.
“When you have innovation like this, you don't always know where it's going to go,” says Jim Dolan, Executive Chairman and CEO of Sphere Entertainment. “You have to be able to take a leap of faith. What you're going to see in ‘The Wizard of Oz at Sphere’ is clearly a leap of faith.”
