When Google launched the Pixel 3 earlier this fall, it came with heaps of expectation, most of which centered on its camera. Its predecessor, the Pixel 2, featured what had been considered the best smartphone camera you could buy, even beating out the newer iPhone XS and Galaxy S9. The Pixel 3 promised a significantly better camera system. Well, months later, it’s clear that Google wasn’t lying. The Pixel 3 is by far the best smartphone camera of 2018 – check out our latest camera comparison here – and most of its new features (Top Shot, Super Res Zoom, Photobooth, Motion Auto Focus), which sound like gimmicks, turned out to be anything but.

That said, the Pixel 3’s potentially coolest feature, Night Sight (another gimmicky name), wasn’t available at launch. Designed to revolutionize low-light and nighttime photography, Night Sight was shown off at the smartphone’s launch event with sample photos whose results were almost too good to be believed. Even without Night Sight, we found the Pixel 3 took the best low-light photos of any smartphone. Now that Night Sight has rolled out via a software update to all Pixel 3s – as of mid-November – the phone can take photos that it, like every other smartphone, simply couldn’t before.


What Is Night Sight? Night Sight is a setting built into the Pixel 3’s camera app that captures remarkably detailed photos even in poorly lit locations. Normally in such situations, the camera would call for a long exposure, which in turn would require a flash or for you to keep your hand really still (or use a tripod). The magic of Night Sight is that it eliminates the need for most of that. You still need to hold the camera still for up to six seconds when taking a Night Sight photo, but the camera will automatically detect small movements, like your shaking hand, and, thanks to Google’s superior artificial intelligence, eliminate those flaws to produce a clear, well-lit photo.

How Does It Work? All the latest smartphones use some form of high dynamic range (HDR), a special image-processing technique. It enables the camera to capture a burst of frames at different exposures, then align and merge them into one high-contrast photo. Google’s own version, called HDR+, is unique in that it also cancels out unwanted noise and blur from handshake or motion, and it operates with what Google calls “zero shutter lag” (ZSL). Here’s how that works: the moment you open the camera app, it starts capturing frames; when you hit the shutter button, it takes the most recent frames, combines them and runs them through the Pixel 3’s normal HDR+ algorithm.
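
If you’re curious what that zero-shutter-lag idea looks like in practice, here’s a minimal conceptual sketch in Python – this is not Google’s actual HDR+ code, and the buffer size and the simple averaging used as a stand-in for the real align-and-merge step are purely illustrative assumptions:

```python
from collections import deque
import numpy as np

# Conceptual sketch of zero shutter lag (ZSL): the camera keeps a small
# ring buffer of the most recent frames while the app is open, so the
# moment you press the shutter it can merge frames captured *before*
# the press. Buffer size and the plain averaging merge are assumptions.

BUFFER_SIZE = 9  # hypothetical number of recent frames kept in memory

frame_buffer = deque(maxlen=BUFFER_SIZE)

def on_new_frame(frame: np.ndarray) -> None:
    """Called continuously while the camera app is open."""
    frame_buffer.append(frame.astype(np.float32))

def on_shutter_press() -> np.ndarray:
    """Grab the most recent frames and merge them into one photo."""
    frames = np.stack(list(frame_buffer))
    # The real HDR+ pipeline aligns frames and merges them robustly;
    # a plain mean is used here only to mark where that step happens.
    return frames.mean(axis=0)
```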

With Night Sight, the camera doesn’t use ZSL, which caps the exposure time of each frame, so it doesn’t start capturing frames the moment you open the camera app. Instead, Night Sight relies on “positive shutter lag” (PSL): capture begins only when you hit the shutter button, and each frame can use a longer exposure than the individual frames that go into an HDR+ photo. “When in Night Sight, the Pixel 3 can take up to 15 photos,” said Alexander Schiffhauer, a product manager for computational photography at Google. “The total exposure time can be up to six seconds in Night Sight, versus HDR+, where the total exposure time is much shorter.” This is also why Night Sight prompts you to hold the camera still for up to six seconds – it needs to capture frames with longer exposures and know exactly when to start capturing them.
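
To make those numbers concrete, here’s a rough sketch of how a PSL-style burst might be planned, using the 15-frame and six-second figures quoted above; the motion thresholds and the planning function itself are illustrative assumptions, not Google’s actual logic:

```python
# Hypothetical Night Sight burst planner using the figures from this
# article: up to 15 frames and up to ~6 seconds of total exposure.
# The motion thresholds and per-frame exposure caps are assumptions.

TOTAL_EXPOSURE_BUDGET_S = 6.0   # from the article
MAX_FRAMES = 15                 # from the article

def plan_night_sight_burst(handshake_motion: float) -> tuple[int, float]:
    """Return (frame_count, per_frame_exposure_s) for a burst.

    handshake_motion is a unitless estimate of how much the phone is
    moving (0.0 = on a tripod, higher = shakier hands).
    """
    # Shakier hands -> shorter individual exposures (less motion blur),
    # which means more frames are needed to fill the exposure budget.
    if handshake_motion < 0.1:
        per_frame = 1.0          # very steady: long individual exposures
    elif handshake_motion < 0.5:
        per_frame = 0.5
    else:
        per_frame = 0.25         # shaky: keep each frame short

    frame_count = min(MAX_FRAMES, int(TOTAL_EXPOSURE_BUDGET_S / per_frame))
    return frame_count, per_frame

# Example: a typical handheld shot
frames, exposure = plan_night_sight_burst(handshake_motion=0.3)
print(f"{frames} frames x {exposure:.2f}s = {frames * exposure:.1f}s total")
```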

It should be noted that all previous generations of Pixel smartphones can now use Night Sight; however, the results aren’t equal. The Pixel 3 has a superior processing engine that allows it to use a feature called Super Res Zoom, which helps it capture stabilized, zoomed-in photos. While the Pixel 2 and first-gen Pixel run their Night Sight frames through the HDR+ algorithm, the Pixel 3 runs its frames through the Super Res Zoom algorithm. That algorithm was developed for super-resolution and, according to Google’s blog post, also works to reduce noise, producing better results for nighttime scenes.
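
A toy example helps show why merging many frames pays off at night: averaging N noisy frames cuts random sensor noise by roughly a factor of √N. The sketch below demonstrates only that averaging principle – the real HDR+ and Super Res Zoom merges also align frames and exploit sub-pixel shifts, which this does not attempt:

```python
import numpy as np

# Toy demonstration of multi-frame noise reduction: the basic principle
# behind merging a burst, whether via HDR+ or Super Res Zoom. The scene,
# noise level and 15-frame burst size here are illustrative assumptions.

rng = np.random.default_rng(0)

true_scene = np.full((100, 100), 0.2)   # a dim, uniform scene
NOISE_SIGMA = 0.1                       # assumed per-frame sensor noise

def noisy_frame() -> np.ndarray:
    return true_scene + rng.normal(0.0, NOISE_SIGMA, true_scene.shape)

single = noisy_frame()
merged = np.mean([noisy_frame() for _ in range(15)], axis=0)  # 15-frame burst

print("noise in one frame:  ", np.std(single - true_scene))  # ~0.10
print("noise after merging: ", np.std(merged - true_scene))  # ~0.10 / sqrt(15)
```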

The Good: Night Sight works with both the rear-facing and front-facing cameras, and it genuinely feels like a game changer for low-light and nighttime photography: a simple software update that lets the Pixel 3 capture photos that only a few very expensive dedicated cameras could match. No other smartphone comes close to the same levels of sharpness and exposure. Not the iPhone. Not any other Android. It’s the feather in the Pixel 3’s cap, further cementing it as the best smartphone camera you can buy right now.

Who It’s For: Anybody who wants to take better photos in near-dark and standard low-light situations. Plain and simple.

Watch Out For: Night Sight is currently limited to its own mode; it doesn’t work with ‘portrait’ mode, panoramas or videos, for example. One of the biggest hurdles is that, to work really effectively, Night Sight needs the environment to be pretty darn dark. A second hurdle is keeping your subject in focus. Night Sight uses the Pixel’s Motion Auto Focus feature, but Motion Auto Focus doesn’t work consistently when it’s really dark; it’s not uncommon for Night Sight to experience “focus failure” or to focus on the wrong thing.

Review: It’s important to stress how easy Night Sight is to use. It’s built directly into the camera app, so you can access it quickly, and, other than holding the phone still a little longer, you don’t have to do much differently than you would when taking a normal photo. Moreover, the app will suggest using Night Sight if it’s dark enough. It essentially does all the work for you.

Is Night Sight perfect? No. As I mentioned above, there are two common issues. One, it needs to be really dark – but not pitch black – for Night Sight to look its best (and to make you, the photographer, feel like some kind of nocturnal animal or covert marine). If it’s not dark enough, the photos still look decent but overexposed, and I generally preferred the natural, contrasty look of the normal HDR+ photo. And two, especially when capturing objects or people, Night Sight sometimes had problems focusing, or focusing on the right person. That said, Night Sight can and will work in a bunch of different settings and scenarios. I used it in a bunch of them – you can check out the results below – with varying degrees of success.

Night Sight: Example #1


This photo of a glass of water was taken in a dark room with some soft light coming in from the window. The original HDR+ photo does a good job capturing shadows and contrast, but the glass and water – which I tapped to focus on – aren’t as detailed. With Night Sight, some of the shadows on the table are lost, but in general it does an excellent job of exposing the photo, especially the glass and water as well as the wall and plant behind them. This is one of the scenarios where Night Sight works best.

Night Sight: Example #2


These two photos were taken in a near-dark studio. Considering the conditions, I thought the normal HDR+ photo did an alright job of actually making me visible – I took the same photo with my iPhone XS and it was basically black. That said, it’s still not really a usable photo. It’s blurry and dark, and pretty much all the details of the subject are lost. With Night Sight, however, it’s (figuratively) a night-and-day difference. It looks as if the photo was taken in a well-lit room. You can make out details on my face, hair, body and the room itself. It’s truly impressive.

Night Sight: Example #3


In these photos of the Flatiron Building, you can definitely tell the difference between the HDR+ and Night Sight shots. The sky and buildings are far better exposed in the Night Sight photo, but it also looks a little blown out. Night Sight wasn’t working as well as I would’ve hoped, simply because it wasn’t dark enough. Even though it was after 9:00 pm, the Pixel 3’s camera picked up too much light from the street lights and the surrounding buildings.

Night Sight: Example #4

The three photos here are all selfies: taken without flash, with flash, and with Night Sight. It was almost completely dark in the room, which you can see in the non-flash HDR+ photo. It’s more interesting, though, to compare the Night Sight photo against the photo with flash, which is what other phones without a Night Sight-style mode would likely rely on in such a scenario. Yes, the face is more illuminated with flash, but a lot of its natural color is lost, and so are the objects in the background. Night Sight does an alright job here, but my face doesn’t appear as in-focus as I’d like.

Night Sight: Example #5


These two photos are further evidence that it’s difficult to find a truly dark outdoor spot to photograph in the city. It was almost 10:00 pm and quite dark, but the street lamps brightened up the scene. Both photos do a good job of capturing the lens flare, but you can see how much more of everything else the Night Sight shot captures. You can make out the individual branches in the trees and more detail in the fence and the buildings.

Further Reading: Night Sight is an advanced feature that requires a Pixel smartphone and a crazy amount of software and image processing. Google has published a very detailed blog post that breaks down the intricacies of how Night Sight works and how more professional photographers can use it to its fullest potential in a manual mode. You can read the blog post here.

What Others Are Saying:

• “Whatever wizardry Google has used to make Night Sight, it has our full recommendation. This is a serious addition to the Google Pixel camera – a feature that can seemingly turn the darkest of shots into light-filled masterpieces. Yes, it’s using AI and algorithms, but when the quality of the shots are this good, we aren’t going to argue.” – Marc Chacksfield, Digital Camera World

• “Every aspect of Google’s Night Sight is dynamic and automatic. If the phone detects that a scene is dark enough, it’ll surface a suggestion to try night mode, you tap on that, and then it mostly takes over from there. The only controls offered to the user are tap-to-focus and the usual exposure slider. You can’t tell the camera how many frames you want it to capture or set your own shutter speed.” – Dieter Bohn, The Verge

