Google’s Pixel 2 and Pixel 2 XL might have had teething issues with their displays, but one area where they didn’t fail to impress is their cameras. Following in the footsteps of the original Pixel duo launched a year ago, the 12.2-megapixel sensors in Google’s latest smartphones are a treat to use, but their full potential hasn’t been exploited yet, with several promised features still waiting to be enabled via future software updates.
Gadgets 360 had a Hangouts session with Brian Rakowski, VP of Product Management at Google, and Timothy Knight, who leads camera development for the Pixel 2, to talk specifically about the camera and what makes it tick. The more publicised issues with the new Pixels are well known, such as the audio problems when recording video and over Bluetooth, and odd display flashes, but we’ve had some questions about the camera too, which we hoped to get some clarity on from the Google duo, no pun intended.
The Pixel 2 does a fantastic job stabilising video, but in low light, especially at 4K, the footage tends to get quite noisy. This is mainly because the Pixel 2 tries to brighten the scene as much as possible by boosting the ISO, which certainly gives you a brighter image, but at the cost of noise. This is done intentionally, Knight explains.
“This is a tradeoff we think a lot about. We tried to strike a balance of both,” he says. “If you compare the Pixel 2 camera to other mobile cameras, you will notice that we’re brighter. It’s easy to make the noise go away if you just make the image dim. We decided that we would rather let the user see the scene more clearly, by making it brighter, even if that means there is some more noise.” Knight adds that 1080p video should be a bit less noisy than 4K, since there’s more headroom for heavyweight processing compared to 4K.
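The tradeoff Knight describes can be illustrated with a toy model: boosting ISO is effectively applying gain to the sensor readout, which amplifies the captured signal and its noise by the same factor. The image gets brighter, but the noise gets louder with it. A minimal sketch of this idea, with made-up numbers purely for illustration (this is not Google's actual pipeline):

```python
import random

def capture(scene_luminance, iso_gain, read_noise=2.0):
    """Toy sensor model: a pixel's output is (signal + noise) * gain.

    Raising iso_gain brightens the output, but the noise term is
    amplified by the same factor, so the image gets brighter *and*
    noisier -- the signal-to-noise ratio does not improve.
    """
    noise = random.gauss(0, read_noise)
    return (scene_luminance + noise) * iso_gain

random.seed(0)
dim    = [capture(10, iso_gain=1) for _ in range(5)]  # dark scene, low noise
bright = [capture(10, iso_gain=8) for _ in range(5)]  # 8x brighter, 8x noise
print("low gain: ", [round(v, 1) for v in dim])
print("high gain:", [round(v, 1) for v in bright])
```

This is why simply darkening the image is the "easy" way to hide noise, and why Google's choice to stay brighter necessarily shows more of it.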
Another feature missing from the Pixel 2 is 60fps support at 4K, something the iPhone 8 Plus and iPhone X boast of. “4K at 60[fps], sadly, is not something we’re going to bring to the Pixel 2,” says Knight. “For future products, we will certainly consider it. However, for the Pixel 2, 4K 30 and 1080p 60 is the video we plan to support.” This limitation seems to have more to do with Qualcomm’s Snapdragon 835 chipset than anything else, however.
If you’ve looked in the settings of the Pixel 2’s camera app, you will notice that enabling manual control for HDR+ gives you a second option in the viewfinder, called HDR+ enhanced. When we tested the Pixel 2 and the Pixel 2 XL, we didn’t actually notice any quality difference between the two modes, apart from the fact that HDR+ enhanced photos take longer to process. Turns out, we were right.
“In the large majority of cases, there’s no difference. From a user perspective, HDR+ and HDR+ enhanced will take the exact same photograph,” explains Knight. “In a small number of conditions, HDR+ enhanced can take a photograph that has a little more dynamic range.” The reason the enhanced mode takes longer to process is that in regular HDR+ mode, Zero Shutter Lag (ZSL) is on, whereas in the enhanced mode, it’s off. Shutter lag is the time taken from the moment you press the shutter button to when the picture is actually recorded and saved. ZSL typically gives you near-instantaneous shots, with almost zero delay.
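Zero Shutter Lag is commonly implemented by continuously streaming viewfinder frames into a small ring buffer; when the shutter is pressed, the camera simply grabs a frame that was already captured at that instant instead of starting a fresh exposure. A simplified sketch of the idea (a generic illustration, not Google's actual implementation):

```python
from collections import deque

class ZslRingBuffer:
    """Keep the N most recent viewfinder frames so a shutter press can
    return an already-captured frame with effectively zero delay."""

    def __init__(self, capacity=3):
        self.frames = deque(maxlen=capacity)  # oldest frames fall off the back

    def on_frame(self, frame):
        """Called for every frame the sensor streams while previewing."""
        self.frames.append(frame)

    def capture(self):
        """Shutter pressed: return the newest buffered frame instantly,
        instead of waiting for a new exposure to complete."""
        if not self.frames:
            raise RuntimeError("no frames buffered yet")
        return self.frames[-1]

buf = ZslRingBuffer(capacity=3)
for t in range(5):       # sensor streams frames 0..4 while previewing
    buf.on_frame(f"frame-{t}")
print(buf.capture())     # prints "frame-4" -- no shutter delay
```

With ZSL off, as in HDR+ enhanced, frames would instead be gathered after the shutter press, which would explain the noticeably longer wait for the processed shot.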
We initially assumed that the Pixel 2’s Visual Core imaging chip would help speed this process up, once it’s activated in the Android 8.1 update, but that doesn’t seem to be the case. The Visual Core’s primary purpose is to enable third-party camera apps to use the HDR+ feature. “When third parties use the camera API, they will be able to get the high-quality processed pictures as a result,” says Rakowski.
Finally, the absence of manual controls and RAW file support is another bummer in the new camera app. This is an area that other Android manufacturers like Samsung and HTC have really mastered over time. Not everyone wants manual controls, but it’s nice to have the option, especially when you want to take artistic shots, and it’s very helpful in low light. Having this feature would also help rein in the exposure in video, for those who prefer to capture the scene as it is instead of brightening things up. Knight notes, however, that in doing this, users would not be able to benefit from HDR+, so image quality would suffer.
Google might add some amount of manual control in the future, “but at the present time, do not expect to see a manual slider anytime soon,” says Knight. It seems that Google is relying heavily on its machine learning to enhance photos and make them look as good as they do, which might explain why it isn’t willing to relinquish control to the user. This applies to RAW file support too.
“We do not have any updates today but we’re looking into it,” says Knight.