Google to extend Pixel 2’s HDR+ capabilities to third-party photo apps
Why it matters to you
The change means that the enhanced image quality of HDR+ will soon be available inside apps that also offer manual exposure and focus control.
The Google Pixel 2 comes with a number of innovative camera features, and one of them, HDR+, will soon be available to third-party photography app developers via a developer preview of Android 8.1 Oreo, the company announced on October 17. That means non-native apps will be able to take advantage of the enhanced image quality of the Google Pixel 2's HDR+ mode.
Google’s HDR+ mode has been part of Google Camera for a few years, but the Google Pixel 2 refines it with additional processing power. HDR is a photographic technique that blends multiple images together to preserve detail in both the bright and dark areas of a scene. Because HDR involves multiple images, faster hardware is necessary to keep it from slowing down the performance of the smartphone.
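The blending idea behind HDR can be sketched in a few lines. The snippet below is a simplified, hypothetical illustration of exposure fusion, not Google's HDR+ algorithm: each pixel is weighted by how well-exposed it is (closest to mid-gray), so shadows are recovered from the brighter frame and highlights from the darker one. Real HDR+ additionally aligns and merges raw burst frames, which this sketch omits.

```python
import math

def well_exposedness(v, sigma=0.2):
    # Weight peaks at mid-gray (0.5); near-black or near-white pixels count less.
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(frames):
    # frames: list of equal-length lists of pixel values in [0, 1],
    # one list per exposure. Returns the per-pixel weighted blend.
    fused = []
    for pixels in zip(*frames):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights) or 1.0
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

# Example: the dark exposure keeps highlight detail,
# the bright exposure keeps shadow detail.
dark = [0.05, 0.40, 0.55]     # underexposed frame
bright = [0.45, 0.80, 0.98]   # overexposed frame
print(fuse_exposures([dark, bright]))
```

Blending like this for every pixel of a multi-megapixel burst is exactly the kind of repetitive arithmetic that benefits from dedicated hardware rather than the general application processor.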
The current HDR+ is made possible by Pixel Visual Core, a new system on a chip (SoC). It’s the first custom SoC Google has made for a consumer product — a processor designed specifically for handling imaging data. Google says that using the Pixel Visual Core, the smartphone can process HDR+ photos five times faster, with only a tenth of the battery drain, compared to the application processor that third-party apps currently use for imaging.
“A key ingredient to the IPU’s efficiency is the tight coupling of the hardware and software,” wrote Ofer Shacham, senior staff engineer, and Masumi Reynders, director of product management. “Our software controls many more details of the hardware than in a typical processor. Handing more control to the software makes the hardware simpler and more efficient, but it also makes the IPU challenging to program using traditional programming languages.”
To ease that added programming complexity for developers, Google is relying on two existing domain-specific languages, Halide and TensorFlow.
Once the Pixel Visual Core is switched on by the software update, third-party apps will have access to that extra processing power, meaning the smartphone can process HDR+ photos automatically, without slowing down and without using the native camera app.
Third-party camera apps are popular for their extra features, often offering more controls, including manual exposure and manual focusing. Once the Pixel 2 software is updated and third-party developers update their own apps, switching to the features inside a third-party app will no longer mean losing the automatic HDR+ processing. Google shared several sample comparison images; the photos shot with the Pixel Visual Core show both brighter shadows, as in a backlit selfie, and enhanced highlights, like a slightly bluer sky.
Google says that the expansion of the HDR+ mode is just the start — Pixel Visual Core is a programmable chip and the company is already working to prepare more applications to expand the Pixel 2’s capabilities.