The Keyword

How we made Pixel’s Night Sight even faster

[Animated GIF: a poorly lit photo of a woman’s face and arm, mostly obscured by darkness; a slider runs over the image, revealing the words “Night Sight” and brightening the photo.]

When we released Pixel 7 and Pixel 7 Pro last year, both phones came with new photography features, including an update to Night Sight, a night photography feature we first introduced with the Pixel 3. The latest phones sped up the low-light shooting mode, cutting the exposure time in half and revealing even crisper, sharper images. And with our latest Feature Drop, faster Night Sight speeds arrived for the Pixel 6 and Pixel 6 Pro, too.

How Night Sight works…

Before explaining how the team sped things up, I asked Alexander Schiffhauer, Group Product Manager, for a quick refresher on how Night Sight works on the Pixel 7 Pro: Once you open the camera app, your Pixel will detect if it's night and automatically enable Night Sight (and if it's not too dark yet but you still want to use the mode, you can also select it manually). Once your Pixel is using Night Sight and you hit the shutter button, the Pixel Camera begins grabbing frames from your viewfinder (these images aren’t stored; they’re just temporarily available for this feature). “It pulls these into what’s called a ‘memory buffer’ in your Pixel,” says Alex. This all runs on Google Tensor, our system on a chip that powers our latest Pixel phones and handles complex AI-based tasks that, among other things, make taking great Night Sight photos easy for everyone.

“Those initial Night Sight frames are pretty short, and usually pretty dark.” While the camera app is open, the buffer constantly replaces old frames with new ones. Once you actually hit the shutter, the camera takes a burst of longer-exposure frames; they’re brighter than the “pre-shutter” frames thanks to the longer exposure time, but for that same reason they’re also blurrier.
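As a rough illustration, the pre-shutter “memory buffer” described above behaves like a fixed-size ring buffer: the oldest frame is discarded whenever a new one arrives, so only the most recent frames are available at shutter time. This hypothetical Python sketch uses made-up names and an assumed capacity; it is not Pixel’s actual implementation.

```python
from collections import deque

BUFFER_SIZE = 8  # assumed capacity; the real buffer size isn't public

# deque(maxlen=...) automatically drops the oldest entry when full,
# mirroring how the buffer "is constantly replacing old frames with new ones."
frame_buffer = deque(maxlen=BUFFER_SIZE)

def on_viewfinder_frame(frame):
    """Called for each new viewfinder frame while the camera app is open."""
    frame_buffer.append(frame)

def on_shutter_pressed():
    """Snapshot the short, dark pre-shutter frames for later merging."""
    return list(frame_buffer)

# Simulate 12 incoming frames; only the last 8 remain in the buffer.
for i in range(12):
    on_viewfinder_frame(f"frame_{i}")

pre_shutter = on_shutter_pressed()
```

At shutter time, these short pre-shutter frames are combined with the burst of longer-exposure frames captured after the button press.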

[Diagram: a “burst of raw frames” (dark images of a woman holding a flower) is combined into a single “merged raw image,” which is then brightened into the “final high-quality result” revealing detail in the scene.]

HDR+ with Bracketing merges frames taken at different exposures to create bright, detailed photos.

Then, using something called HDR+ with Bracketing, your phone combines the quickly captured, darker, crisper frames with the longer-exposure, brighter but blurrier ones to create Night Sight images: bright, incredibly detailed photos that aren’t dark or blurry and that more accurately portray what you’re photographing.
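The post doesn’t spell out the merge math, but the core idea behind combining bracketed exposures can be sketched simply: normalize each frame by its exposure time so all frames sit on the same brightness scale, then average them to cut noise. This is a toy illustration of exposure merging, not Google’s actual HDR+ with Bracketing algorithm.

```python
import numpy as np

def merge_bracketed(frames, exposure_times):
    """frames: list of float image arrays; exposure_times: seconds per frame."""
    # Dividing by exposure time puts short (dark, sharp) and long
    # (bright, blurry) frames on a common radiance scale.
    normalized = [f / t for f, t in zip(frames, exposure_times)]
    # Averaging several noisy observations of the same scene reduces noise
    # roughly by the square root of the number of frames.
    return np.mean(normalized, axis=0)

# The same scene radiance captured at two different exposure times:
scene = np.full((2, 2), 100.0)   # "true" scene radiance (illustrative)
short = scene * 0.01             # 10 ms exposure -> dark frame
long_ = scene * 0.04             # 40 ms exposure -> bright frame
merged = merge_bracketed([short, long_], [0.01, 0.04])
```

In this idealized example both frames recover the same radiance after normalization, so the merge returns the true scene values; the real algorithm also has to handle motion between frames and per-pixel alignment.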

[Animated comparison: the same low-light photo of a woman standing against a wall with scattered light. “Without Tensor G2,” the image brightens only enough to make her details faintly visible; “with Tensor G2,” it becomes far lighter with much more visible detail and no overexposure.]

Google Tensor G2 is our latest custom-built chip that powers the Pixel 7 and Pixel 7 Pro, and helps with all kinds of things — including making Night Sight photos brighter and more detailed, in less time.

…And how AI made Night Sight faster

Night Sight's latest, even speedier update is thanks to Google Tensor. “Our custom-built processors enhance what the Pixel’s cameras are capable of,” says Alex. After that merging step, where the longer-exposure frames are combined with the darker ones, a new machine learning algorithm that runs on Tensor reduces unwanted noise in the final photo. Denoising has traditionally required Night Sight to capture more frames or longer exposures, which meant more blur and holding still for longer. But with Pixel 7 and Pixel 7 Pro, the Pixel engineering and Google AI teams built this new neural network to make Night Sight faster than ever.

“Our latest silicon, Google Tensor G2, is focused on making large ML models more power efficient — and with Night Sight on Pixel 7 and Pixel 7 Pro, it powers a new neural network that dramatically reduces noise at a much faster pace,” says Alex. That algorithm will continue to get better and better as the team fine-tunes it over time.
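The actual denoiser is a learned neural network running on Tensor, whose architecture isn’t described here. As a hypothetical stand-in to show where denoising sits in the pipeline (after the frames are merged, before the final photo), this sketch uses a simple 3×3 box filter, which pulls each pixel toward its local average:

```python
import numpy as np

def box_denoise(image):
    """Average each pixel with its 3x3 neighborhood (edges replicated).

    A crude substitute for the learned denoiser: it suppresses pixel-level
    noise at the cost of some sharpness, which is exactly the trade-off a
    neural denoiser is trained to avoid.
    """
    padded = np.pad(image, 1, mode="edge")
    h, w = image.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + h, dx:dx + w]
    return out / 9.0

# A flat gray patch with simulated sensor noise:
rng = np.random.default_rng(0)
noisy = 100.0 + rng.normal(0.0, 10.0, size=(32, 32))
denoised = box_denoise(noisy)
```

Because denoising happens after the merge, Night Sight no longer needs extra or longer exposures just to suppress noise, which is what lets it finish faster.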

There you have it — the technical explanation for how we made Night Sight even faster. But when it comes to practical use? All of this means you don’t have to hold still for quite so long to snag those extremely cool night-time portraits.
