The Keyword

Google AR & VR

Seeing art in a new way: VR tools let characters jump right in


We all know you’re not supposed to touch the pieces of art in a museum, but what if you could jump inside them? In YouTube creator SoKrispyMedia’s latest VR video, “Do Not Touch,” the characters do just that: they dive into the artwork and become part of the scene.


A scene from SoKrispyMedia’s “Do Not Touch” VR video

Visual effects this advanced usually require physical equipment like green screens, which can be costly and difficult to set up. SoKrispyMedia, however, achieved many of the effects in post-production with Google’s Jump VR Video suite, along with Adobe Creative Suite, Unreal Engine, and Autodesk 3ds Max.

To make it look like the actors were actually inside the pictures, SoKrispyMedia used Jump’s “high-quality stitching” option, a feature that lets creators achieve a green screen effect without needing a physical screen. They also utilized this feature in one of their earlier works, “Video Game Vehicle,” after discovering in post-production that the physical green screen they had used while filming wasn’t big enough. High-quality stitching let them “fix it in post” with a few clicks, rather than spending days reshooting with a wraparound green screen.


Jump’s high-quality depth maps allow creators to get a green screen effect without a physical green screen

So how does this feature work? In addition to producing higher quality stitches, it lets creators generate razor-sharp depth maps that estimate the distance (or depth) of every pixel in the scene. This enables the use of editing software like Adobe Premiere or After Effects to extract elements or even composite a new scene in the background, all without a physical screen or meticulous manual rotoscoping. And in the latest release of After Effects, our partners at Adobe have made it even easier to leverage Jump's depth maps in post-production.
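The depth-matte idea above can be sketched in a few lines: treat pixels nearer than a chosen distance as foreground, feather the edge, and composite over a new plate. This is a minimal NumPy sketch of depth keying in general — not Jump’s or After Effects’ actual implementation — and the threshold and softness values are purely illustrative.

```python
import numpy as np

def depth_matte(depth, threshold, softness=0.1):
    """Build an alpha matte from a per-pixel depth map.

    Pixels nearer than `threshold` (the foreground actor) get alpha 1;
    pixels farther away fall off to 0 over a `softness` band, so the
    edge is feathered rather than a hard cut.
    """
    return np.clip((threshold + softness - depth) / softness, 0.0, 1.0)

def composite(foreground, background, alpha):
    """Standard alpha-over composite: fg where alpha=1, bg where alpha=0."""
    a = alpha[..., None]  # broadcast alpha across the RGB channels
    return a * foreground + (1.0 - a) * background

# Toy example: a 2x2 frame where the left column is "near" (the actor)
# and the right column is "far" (the set behind them).
depth = np.array([[1.0, 9.0],
                  [1.0, 9.0]])
fg = np.ones((2, 2, 3))    # white actor plate
bg = np.zeros((2, 2, 3))   # black replacement background
out = composite(fg, bg, depth_matte(depth, threshold=2.0))
# Near pixels keep the foreground; far pixels take the new background.
```

In a real pipeline the matte would come from Jump’s generated depth map rather than a hand-built array, and the feathered band is what avoids the hard, cut-out edges that make composites look fake.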


Generating depth maps in Jump Manager gives you more flexibility in post-production

In addition to using depth maps, SoKrispyMedia used Jump’s high-quality stitches to realistically light computer generated (CG) objects with Image Based Lighting. Instead of having to capture separate light probes, creators can get the same effect with Jump’s high bit-depth 360° stitches. This saves valuable time on set and gives creators the ability to seamlessly integrate CG into live action footage with accurate lighting and reflections.
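At its core, Image Based Lighting treats the 360° stitch as an environment map and integrates the incoming light over the hemisphere above each surface normal. The sketch below is a generic Monte-Carlo diffuse-shading estimate in NumPy — the equirectangular lookup convention is an assumption on my part, and production engines like Unreal use prefiltered maps rather than per-sample loops.

```python
import numpy as np

def sample_equirect(env, direction):
    """Look up an equirectangular (latitude/longitude) environment map of
    shape (h, w, 3) in a world-space unit direction (x, y, z), y up.
    The exact longitude convention varies between tools; this one is an
    assumption for illustration."""
    h, w, _ = env.shape
    x, y, z = direction
    u = np.arctan2(x, -z) / (2 * np.pi) + 0.5      # longitude -> [0, 1)
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi   # latitude  -> [0, 1]
    return env[min(int(v * h), h - 1), min(int(u * w), w - 1)]

def diffuse_shading(env, normal, n_samples=512, seed=0):
    """Monte-Carlo estimate of cosine-weighted light reaching a surface:
    average environment radiance over the hemisphere above `normal`,
    weighted by cos(theta). Scaled so a uniform environment of brightness
    1 shades to ~1 (absolute radiometric constants omitted for clarity)."""
    rng = np.random.default_rng(seed)
    normal = np.asarray(normal, dtype=float)
    total, accepted = np.zeros(3), 0
    while accepted < n_samples:
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        cos_theta = float(np.dot(d, normal))
        if cos_theta <= 0.0:       # reject directions below the surface
            continue
        total += sample_equirect(env, d) * cos_theta
        accepted += 1
    return total / accepted * 2.0  # E[cos] over a uniform hemisphere is 1/2

# A uniform white environment lights an upward-facing surface to ~1.0.
white_env = np.ones((16, 32, 3))
shade = diffuse_shading(white_env, normal=(0.0, 1.0, 0.0))
```

This is why the high bit-depth of Jump’s stitches matters: an 8-bit panorama clips bright sources like windows and lamps, and the integral above then underestimates exactly the light that makes CG objects sit convincingly in the plate.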


Side-by-side still of CG objects from “Video Game Vehicle” with and without Image Based Lighting

SoKrispyMedia also utilized a third, software-based post-production tool to enhance their Jump footage. Using a technique detailed on the Google AI blog called “style transfer,” they applied artificial intelligence to their footage to transform the look of each character into the style of the painting they’ve jumped into.
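The key statistic behind classic style transfer is the Gram matrix of a feature map: the correlations between feature channels, which capture a painting’s texture and brush-stroke statistics while discarding its spatial layout. Below is a minimal NumPy sketch of that statistic only — in practice the features come from a pretrained convolutional network, and the match is driven by optimization or a trained feed-forward network, not two function calls.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map of shape (channels, height, width):
    the matrix of correlations between channels, normalized by the
    number of spatial positions."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

def style_loss(gen_features, style_features):
    """Mean squared difference between the Gram matrices of the generated
    frame's features and the painting's features; minimizing this (plus a
    content term) is what pushes the frame toward the painting's style."""
    diff = gram_matrix(gen_features) - gram_matrix(style_features)
    return float(np.mean(diff ** 2))

# Identical features have identical style statistics; scaled features don't.
feat = np.arange(24, dtype=float).reshape(2, 3, 4)
loss_same = style_loss(feat, feat)      # zero: same channel correlations
loss_diff = style_loss(feat, feat * 2)  # positive: different "style"
```

Because the Gram matrix throws away *where* things are and keeps only *how* channels co-activate, a character can keep their pose and silhouette (the content) while taking on the painting’s look (the style).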


Another scene from “Do Not Touch”

Taken together, these software-based features—high-quality stitching, Image Based Lighting, and style transfer—give creators new ways to share their vision with the world. As SoKrispyMedia’s director and VFX supervisor Sam Wickert explains, “The most important aspect of these projects is to make VR content that is really worth watching in a headset, and these tools let us do that.” For audiences, this means we can look forward to traveling to all sorts of new destinations in the virtual world … from a video game environment to a museum where you actually jump into the paintings, to wherever creators take us next.
