The past week has been quite exciting for fans of the Google Pixel smartphones. We learned about the Google Pixel 4’s LTE connectivity, Motion Sense gesture regional availability, rear and front design, 90Hz display, zoom capability, RAM capacity, Pixel Themes app, and possible new camera features. If the Pixel smartphones are good at one thing, it’s camera quality, so it’s no surprise that much of the attention has focused on the late 2019 Pixel’s camera hardware and software. After 9to5Google leaked new camera details on the Pixel 4, we learned that the Google Camera app contains a lot of hidden information that could reveal key Google Pixel 4 camera features.
An APK teardown can often predict features that may arrive in a future update of an application, but it is possible that any of the features we mention here may not make it into a future release. This is because these features are currently unimplemented in the live build and may be pulled by the developers at any time in a future build.
XDA Senior Member cstark27 is best known in our community for his work on modifying the Google Camera app to unlock features that aren’t supported on older Pixel smartphones. He was the first to enable Night Sight in the Google Camera app before the feature widely rolled out last year, for example. He first spotted new camera features being added to Google Camera 6.3, the same APK version which hinted at the Pixel 4 having a telephoto camera (a finding that was corroborated not once but twice). We examined the first 6.3 APK from July as well as the latest build from late August and confirmed the following code is still present. We also examined the last 6.2 APK and confirmed that the code for these features is not present, indicating that these are indeed new features. Furthermore, all of these features are marked “experimental2019,” suggesting that only 2019 Pixel smartphones will have them. We know that the Pixel 3a doesn’t include these features, so that leaves the Pixel 4 as the likely candidate.
First up is a feature that has only recently made its way into the smartphone space: Audio Zoom. Audio Zoom is likely similar to the Zoom-in Mic feature on the Samsung Galaxy Note 10, a feature which uses the microphones to adjust the audio focus when you zoom in or out.
Here’s a demonstration of the Zoom-in Mic feature on the Samsung Galaxy Note 10+, courtesy of our friends at PhoneArena. The audio quality isn’t that great at a distance, but it’s certainly better to have this than barely being able to hear someone at all. I hope that Google can implement this feature well on the Pixel 4 since the Pixel 3 at launch had microphone issues during video recordings.
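To illustrate the general idea behind audio zoom, here is a toy sketch in Python. It simply crossfades from an omnidirectional mix toward a crude forward-facing signal as the zoom factor increases. This is an assumption-laden illustration, not Google’s or Samsung’s actual processing — real audio zoom relies on multi-microphone beamforming DSP, and the zoom-to-weight mapping here is made up.

```python
import numpy as np

def audio_zoom_mix(front_mic, rear_mic, zoom):
    """Toy audio zoom: emphasize sound from the subject's direction as
    the camera zooms in.

    front_mic, rear_mic: 1-D arrays of samples from two microphones.
    zoom: 1.0 (wide) up to a hypothetical 8.0 (full telephoto).
    """
    # Map zoom to a 0..1 focus weight (hypothetical mapping).
    w = np.clip((zoom - 1.0) / 7.0, 0.0, 1.0)
    ambient = 0.5 * (front_mic + rear_mic)   # omnidirectional mix
    directional = front_mic - rear_mic       # crude forward-facing signal
    return (1.0 - w) * ambient + w * directional
```

At zoom 1.0 the output is just the ambient mix; at maximum zoom it is fully weighted toward the directional signal, mimicking how the Zoom-in Mic demo sounds more focused the further you zoom in.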
Another new feature that is in development in the Google Camera app is “Live HDR,” which likely uses the “HDRNet” algorithm developed in collaboration between MIT and Google researchers Michaël Gharbi, Jiawen Chen, Jonathan T. Barron, Samuel W. Hasinoff, and Frédo Durand to apply HDR to the camera viewfinder in real-time. (References to HDRNet exist in several classes in Google Camera 6.3.) According to Wired, which covered the algorithm when it was published, HDRNet can also be used to automatically retouch photos—increase contrast, tone down brightness, etc.—in under 20 milliseconds. At the time of publication over 2 years ago, this auto-editing feature was still “in the research phase,” according to Wired, but Google may have perfected the algorithm in the time since and now feels confident enough to ship it in a consumer device.
The following is a real-time demo of HDRNet, taken from this Medium article by Khush Jammu. Being able to preview an HDR photo before taking it would be a really neat feature, and I hope it does ship on the Google Pixel 4.
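The core trick that makes this fast enough for a live viewfinder is computing tone adjustments at low resolution and applying them at full resolution. Here is a heavily simplified numpy sketch of that coarse-to-fine idea. The real HDRNet learns a bilateral grid of affine color transforms with a neural network and uses edge-aware “slicing”; this toy version just computes one brightness gain per tile, so treat every detail here as an illustrative assumption.

```python
import numpy as np

def live_hdr_preview(frame, grid=8, target=0.5):
    """Toy 'Live HDR' pass: per-tile brightness gains computed coarsely,
    then upsampled and applied at full resolution.

    frame: 2-D float array in [0, 1] (grayscale for simplicity).
    """
    h, w = frame.shape
    gains = np.ones((grid, grid))
    ys = np.linspace(0, h, grid + 1, dtype=int)
    xs = np.linspace(0, w, grid + 1, dtype=int)
    for i in range(grid):
        for j in range(grid):
            tile = frame[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            mean = tile.mean() if tile.size else target
            gains[i, j] = target / max(mean, 1e-3)  # push tile toward mid-gray
    # Upsample the coarse gain map to full resolution (nearest-neighbour
    # stands in for HDRNet's learned, edge-aware slicing step).
    full = np.repeat(np.repeat(gains, np.diff(ys), axis=0), np.diff(xs), axis=1)
    return np.clip(frame * full, 0.0, 1.0)
```

Because the expensive analysis happens on a tiny grid rather than on every pixel, this kind of pipeline can keep up with a 30fps viewfinder, which is presumably what makes a live HDR preview feasible on a phone.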
Next up, cstark27 spotted a reference to a “mesh warp” feature.
This appears to be a reference to a new algorithm to correct distortion in wide-angle portraits developed by Google researchers YiChang Shih, Wei-Sheng Lai, and Chia-Kai Liang. The algorithm corrects distortion from wider fields-of-view without warping either faces or the background. The following screenshot comes from the researchers’ demonstration at SIGGRAPH 2019, hosted on YouTube. It shows how their new method corrects distortion in wide-angle selfies while maintaining proper proportions for both faces and the background.
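The key insight of that paper is that no single projection is right for the whole frame: perspective projection keeps straight lines straight but stretches faces near the edges, while stereographic projection keeps faces natural but bends the background. Here is a toy numpy sketch of blending the two per mesh vertex using a face mask. The real method solves an optimization to produce a smooth warp mesh rather than blending directly, and the `face_mask` input here is a hypothetical stand-in for its face-detection stage.

```python
import numpy as np

def blended_warp_mesh(width, height, face_mask, f=1.0):
    """Toy content-aware mesh: perspective coordinates on the background,
    blended toward stereographic coordinates on faces.

    face_mask: 2-D array in [0, 1], 1.0 on face regions (hypothetical).
    Returns per-vertex (x, y) source coordinates.
    """
    xs = np.linspace(-1, 1, width)
    ys = np.linspace(-1, 1, height)
    x, y = np.meshgrid(xs, ys)
    r = np.hypot(x, y)
    # Stereographic projection compresses large radii relative to
    # perspective, which is what un-stretches faces near the edges.
    with np.errstate(divide="ignore", invalid="ignore"):
        scale = np.where(r > 0, (2 * f * np.tan(np.arctan(r / f) / 2)) / r, 1.0)
    x_stereo, y_stereo = x * scale, y * scale
    # Per-vertex blend: faces follow stereographic, background stays put.
    bx = (1 - face_mask) * x + face_mask * x_stereo
    by = (1 - face_mask) * y + face_mask * y_stereo
    return bx, by
```

With an all-zero mask the mesh is the untouched perspective grid; wherever the mask is 1, vertices pull inward toward the stereographic positions, shrinking the stretched regions back toward natural proportions.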
Google has already confirmed that the Pixel 4 will not have a second front-facing camera like the Pixel 3. The second camera on the Pixel 3 is a dedicated wide-angle lens, so on paper the Pixel 4 sounds like a downgrade in selfie flexibility. However, 9to5Google today claimed that the front camera “will be a wide-angle lens.” With this content-aware warping mesh technique, the Pixel 4 should be able to take great, undistorted wide-angle selfies, removing the need for a dedicated second front camera. This algorithm also addresses a common complaint about the quality of wide-angle selfies on the Pixel 3.
Notably, we saw hints at wide-angle distortion correction features over a year ago. That older technique to correct wide-angle distortion on faces is likely not as sophisticated as the newer one, though we don’t have photos for a direct comparison. The code-name for the older feature is also different from the code-name for the mesh warp technique, and the paper on the newer technique was published just this year. The code for the mesh warp feature only appeared with Google Camera 6.3, so this feature is definitely new to the Google Camera app.
Lastly, 9to5Google reported that the Pixel 4 will feature an improved Night Sight camera mode. Specifically, Night Sight will be getting “some speed-related and other general improvements” which should allow the Pixel 4 to “take photos of the starry sky.” Night Sight is widely regarded as one of the best camera night modes, but Huawei has shown us just how incredible mobile low-light photography can be, so Google is working on enhancements to its Night Sight algorithm to stay competitive.
We spotted a reference that hints at Zero Shutter Lag for Night Sight (zsl_ns), though there’s only one reference to this enhancement so we’re not very confident about this interpretation. We also spotted new flags called “camera.cuttle.sky” and “camera.cuttle.sky_gpu” that hint at improvements to Night Sight (code-named “cuttlefish”) for sky photography by using the GPU (in this case, the Adreno 640 in the Qualcomm Snapdragon 855). These code references aren’t convincing by themselves, but they line up with what 9to5Google reported.
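For context on why astrophotography is largely a software problem, the basic principle behind modes like Night Sight is burst stacking: averaging many aligned short exposures preserves the scene while random sensor noise shrinks by roughly the square root of the frame count. The sketch below shows only that averaging step; Night Sight’s real pipeline also aligns frames, rejects motion, and tone-maps, none of which is modeled here.

```python
import numpy as np

def stack_night_frames(frames):
    """Toy burst stack: average N already-aligned frames.

    frames: iterable of 2-D float arrays.
    Averaging keeps the static signal (stars, landscape) while random
    per-frame noise cancels out, roughly by a factor of sqrt(N).
    """
    stack = np.stack(list(frames))
    return stack.mean(axis=0)
```

Stacking 16 frames cuts random noise by about 4x compared with a single frame, which is the kind of headroom that makes faint subjects like a starry sky recoverable at all.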
After this week’s barrage of leaks, it’s clear that Google had a lot more up its sleeve than they initially let on. As usual, we’ll continue digging into the code to find all that we can about Google’s 2019 Pixel smartphones. Join our forums to stay up-to-date on the latest news.