Enter high dynamic range photography, or HDR. Plenty of cameras these days employ HDR techniques -- Apple has since the iPhone 4 back in 2010 -- but Google's HDR+ does so particularly well. The result is something that looks more like what your own eyes see. HDR+ starts with the Pixel's ability to "circulate" a constant stream of photos through the phone's memory whenever the camera app is open, 30 per second when it's bright and 15 per second when dim. When you tap the shutter button, it grabs raw image data from the last 5 to 10 frames and gets to work, according to Tim Knight, leader of Google's Android camera team.
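The rolling capture described above can be sketched in a few lines of Python. This is an illustration of the idea, not Google's implementation; the frame rates and the 5-to-10-frame burst are the figures quoted above, and the class and method names here are invented for the example.

```python
from collections import deque

class FrameBuffer:
    """Keeps only the most recent raw frames, like HDR+'s rolling capture."""
    def __init__(self, capacity=10):
        # deque with maxlen silently drops the oldest frame on overflow
        self.frames = deque(maxlen=capacity)

    def capture(self, frame):
        self.frames.append(frame)

    def grab_burst(self, count=5):
        """On shutter press, hand back the last `count` frames for merging."""
        return list(self.frames)[-count:]

buf = FrameBuffer(capacity=10)
for i in range(30):           # one second of bright-light capture at 30 fps
    buf.capture(f"frame_{i}")
burst = buf.grab_burst(5)     # the five most recent frames, 25 through 29
```

The point of the ring buffer is that the phone never waits for the shutter: the freshest frames are always already in memory when you tap.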
The key part of HDR+ is making sure highlights don't blow out into a featureless whitewash, a common problem with clouds in the sky and cheeks in sunlight. "HDR+ wants to maintain highlights," Knight said. "We're capturing all the data underexposed -- sometimes 3 to 4 stops underexposed," meaning that each frame is actually up to 16 times darker than it ought to look in a final photo. By stacking up these shots into a single photo, though, HDR+ can brighten dark areas without destroying the photo with noise speckles. And it can protect those highlights from washing out.
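The underexpose-and-stack idea can be shown with a toy numpy example. This is a sketch, not Google's pipeline (frame alignment is omitted entirely, and the scene and noise values are made up): averaging N noisy frames cuts random noise by roughly the square root of N, which leaves headroom to push shadows back up by the 3 to 4 stops the frames were underexposed.

```python
import numpy as np

rng = np.random.default_rng(0)
true_scene = np.full((64, 64), 0.05)   # a dark patch, 4 stops under (1/16 of target)

def capture(noise_sigma=0.02):
    """One underexposed frame: the scene plus random sensor noise."""
    return true_scene + rng.normal(0, noise_sigma, true_scene.shape)

frames = [capture() for _ in range(8)]   # burst of 8 underexposed frames
merged = np.mean(frames, axis=0)         # stack the burst (alignment skipped here)
brightened = np.clip(merged * 16, 0, 1)  # brighten shadows back up 4 stops

single = np.clip(capture() * 16, 0, 1)   # same gain applied to one lone frame

# The stacked result lands on the target brightness of 0.8 with far less
# speckle than a single brightened frame.
stacked_noise = float(np.std(brightened - 0.8))
single_noise = float(np.std(single - 0.8))
```

Because the individual frames are short exposures, bright areas never hit the sensor's ceiling, which is how the highlights survive.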
HDR+ predates the Pixel, but special-purpose hardware -- Qualcomm's Hexagon chip technology -- lets Google accelerate it on the Pixel. "Our goal was to maintain quality but improve speed," Knight said. "We met that goal."

Specifically, Google uses an open-source image-processing software project called Halide. It took Google two years to adapt Halide so it would run on the Hexagon technology.

HDR in general works better if you have good raw material to work with. Google chose a high-end 12-megapixel Sony IMX378 sensor with large pixels that are better able to distinguish bright from dark and to avoid image noise in the first place.
Google's HDR technology does a good job making sense of high-contrast scenes, but it can leave halos like the one that appears to make this bus glow a little against the blue sky. Another general HDR problem is ghosting: artifacts stemming from differences between frames caused by moving subjects like running children or trembling tree leaves. Blurring from camera shake also can be a problem. Using artificial intelligence techniques, Google's HDR+ quickly analyzes the burst of photos to pick a "lucky shot" that serves as the basis for the final photo.
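Google doesn't spell out how its lucky-shot analysis works, but a minimal stand-in for the idea is to score each frame of the burst with a sharpness metric and keep the best one. The sketch below uses variance of a Laplacian-style edge filter as that score; the metric, the frames and all names here are illustrative assumptions, not Google's method.

```python
import numpy as np

def sharpness(img):
    """Variance of a Laplacian approximation -- higher means crisper edges."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return float(lap.var())

def lucky_shot(frames):
    """Pick the sharpest frame of the burst as the reference for merging."""
    return max(frames, key=sharpness)

# Toy burst: a crisp checkerboard vs. a featureless "shaky" frame.
crisp = (np.indices((32, 32)).sum(axis=0) % 2).astype(float)
shaky = np.full((32, 32), 0.5)
best = lucky_shot([shaky, crisp])
```

Picking the sharpest frame as the reference means camera shake in the other frames gets rejected during alignment rather than baked into the result.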
HDR and HDR+ in particular make camera processors work harder and therefore consume more battery power. And even with special-purpose processing chips, it can be hard to keep up. That's why Google doesn't use HDR+ when shooting video on the Pixel -- only some more modest adjustments to image tone. It's not perfect. In my testing, HDR+ can sometimes leave photos looking underexposed, some naturally bright colors can be muted, and high-contrast areas sometimes suffer halos that can, for example, make a tree look like it's glowing against a darker blue sky background.