One of the big considerations when buying a new smartphone is the camera, right? Although iPhones don't top the spec sheets, it's a given that they do great in smartphone photography.
However, the new iPhone 7 Plus may have met its match: the camera technology in the Google Pixel arguably beats even the iPhone's telephoto lens.
Why? What’s with the Google Pixel camera that makes it so awesome?
What is HDR+?
HDR+ is an example of computational photography, currently a fast-moving field. The image isn't created just by the sensor turning light into digital data. Instead, the processor does extra image processing that can reduce noise, correct optical shortcomings, and even stitch a camera sweep into a single panoramic shot, as in the sketch below.
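To make that "extra processing by the processor" concrete, here is a minimal Python sketch of one such task, panorama stitching, using OpenCV's high-level Stitcher. It illustrates the general idea only, not the Pixel's pipeline, and the file names are hypothetical.

```python
# A small computational-photography example: stitch a camera sweep
# into one panorama. Illustrative only, not the Pixel's pipeline.
import cv2

# hypothetical frames from a left-to-right sweep
frames = [cv2.imread(name) for name in ("sweep_1.jpg", "sweep_2.jpg", "sweep_3.jpg")]

stitcher = cv2.Stitcher_create()            # cv2.createStitcher() on OpenCV 3.x
status, panorama = stitcher.stitch(frames)  # aligns, warps, and blends the frames

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print("stitching failed, status code:", status)
```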
Where HDR+ really improves things is in what's called dynamic range: the ability to capture both dim shadows and bright highlights in one photo. Expensive cameras handle this range without a problem, but for smartphones it can be very challenging. If you don't want a wedding gown to turn into a blaze of white, or to miss a single detail in your picture, then the Google Pixel might be the phone for you.
How does it work?
As soon as the camera app is opened, HDR+ starts circulating a constant stream of photos through the phone's memory: 30 per second when it's bright, 15 per second when it's dim. When you tap the shutter button, it grabs the raw image data from the last 5 to 10 frames and gets to work, said Tim Knight, who leads Google's Android camera team. A rough sketch of that rolling buffer is below.
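Here is a rough Python sketch of that idea, assuming only what the article states (a constantly refilling buffer and a 5-to-10-frame grab on shutter tap); the frame sizes, rates, and function names are invented for illustration.

```python
# A minimal sketch of a rolling frame buffer, not Google's code:
# the viewfinder keeps feeding frames in, the shutter tap reaches
# back in time for the most recent few.
from collections import deque
import numpy as np

BUFFER_SIZE = 10                   # keep roughly the last 10 frames

ring = deque(maxlen=BUFFER_SIZE)   # older frames fall off automatically

def on_new_frame(raw_frame):
    """Called ~30 times/sec in bright light, ~15 in dim light."""
    ring.append(raw_frame)

def on_shutter_tap(n_frames=8):
    """Grab the most recent raw frames (5 to 10 in HDR+) for merging."""
    return list(ring)[-n_frames:]

# simulate the viewfinder running, then a shutter press
for _ in range(30):
    on_new_frame(np.random.randint(0, 1024, (120, 160), dtype=np.uint16))
burst = on_shutter_tap()
print(len(burst), "frames grabbed for merging")
```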
HDR+ technology preserves highlights. "We're capturing all the data underexposed — sometimes 3 to 4 stops underexposed," said Knight. At 4 stops, that means each frame is 2^4 = 16 times darker than it will look in the final photo. By stacking the shots into a single image, HDR+ can brighten the dark areas without drowning them in noise, while protecting the highlights from washing out as well. A toy version of that math is sketched below.
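The arithmetic: averaging N frames cuts random sensor noise by roughly the square root of N, which is what lets the 16x digital gain lift the shadows cleanly while the underexposed capture keeps highlights below the clipping point. Here is a toy numpy sketch of that stacking idea; it is not Google's merge algorithm (which also aligns frames), and every number besides the 16x gain is invented.

```python
# A worked sketch of burst stacking, not Google's merge code.
import numpy as np

STOPS_UNDER = 4
GAIN = 2 ** STOPS_UNDER            # 4 stops = 2**4 = 16x, "16 times darker"

def merge_burst(frames):
    """Average aligned, underexposed frames, then brighten digitally."""
    stack = np.mean(frames, axis=0)        # noise drops ~sqrt(len(frames))
    return np.clip(stack * GAIN, 0.0, 1.0) # highlights were captured below clip

# fake burst: a dim gray scene exposed at 1/16th, plus sensor noise
rng = np.random.default_rng(0)
true_scene = 0.05
burst = [true_scene / GAIN + rng.normal(0.0, 0.002, (120, 160)) for _ in range(8)]

merged = merge_burst(burst)
print("single-frame noise:", (burst[0] * GAIN).std())  # ~0.032
print("merged noise:      ", merged.std())             # ~0.011, ~sqrt(8) lower
```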
All of this leans on Qualcomm's Hexagon digital signal processor, which accelerates the image processing in the Google Pixel. "Our goal was to maintain quality but improve speed," Knight said. "We met that goal."
To get there, Google uses Halide, an open-source programming language built for image processing. It took the team about two years to adapt Halide so its code runs well on the Hexagon processor.
Furthermore, Google chose the 12MP Sony IMX378 sensor, whose larger pixels are better at distinguishing dark from light and avoiding image noise in the first place.
HDR+ isn't perfect: sometimes photos come out underexposed, some naturally bright colors are muted, and high-contrast edges can show halos. However, it handles overcast skies, backlit faces, and other tough scenes well.
The good thing is, since HDR+ lives in software, Google can improve it with a simple camera-app update. So HDR+ could keep getting better at any time.
Check out some comparison images taken with the Google Pixel and the iPhone 7 Plus right here, and decide for yourself which phone takes the better picture. Don't forget to leave a comment below.