Google Improves Pixel 4 Portrait Mode: Here’s How
KEY POINTS
- Google explained how it improved Portrait Mode on the Pixel 4
- The improvements make use of the existing dual-pixel auto-focus system and the Pixel 4's new dual-camera setup
- The result is improved depth estimation and better bokeh
Google has revealed how it improved Portrait Mode on the Pixel 4, and it turns out the company didn't replace the technology behind the feature on earlier Pixel models. Instead, it built on it.
In a lengthy and highly detailed blog post, Google explained how it improved Portrait Mode on the Pixel 4 and Pixel 4 XL. The tech giant said it introduced two big improvements to the feature by combining existing and new technologies.
For background, anyone who has owned a Pixel 2 or Pixel 3 will be familiar with Portrait Mode. The feature lets Pixel users take professional-looking images with a shallow depth of field, like those taken with SLR cameras. Such images keep a single subject in sharp focus and draw the viewer's attention to it by blurring the background behind it.
The feature, which requires precise depth estimation, already existed on the Pixel 2 and Pixel 3. Those two models had only one rear camera, yet they managed to take impressive Portrait Mode photos by using the dual-pixel auto-focus system to estimate depth.
Google explained that dual-pixels work by splitting each pixel in two. Each half "sees" a different half of the main lens' aperture. When the images captured by the two half-pixels are read out separately, the result is two slightly different views of the scene.
Anyone struggling to picture how that works can try looking at a subject with one eye closed, then quickly switching eyes to view the same subject. The result? Two slightly different views of the same subject. Now imagine that being done by a single camera.
The two half-pixel images give the system enough information to estimate the distance between the subject and the background. The apparent shift of a point between the two half-pixel views is called "parallax," and its size depends on how far away that point is. Although the dual-pixel system can measure parallax, its two viewpoints sit only about 1mm apart (the "baseline"), which makes it difficult to estimate the depth of distant scenes: the shift becomes too small to measure reliably.
Google improved on this by adding a second rear camera to the Pixel 4. Both cameras use the dual-pixel system, and because the two sensors sit 13mm apart, they are far better at estimating depth, especially for distant subjects.
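To see why a wider baseline helps, the relationship can be sketched with a toy pinhole-camera model: the parallax (disparity) of a point at depth Z is roughly f × B / Z, where f is the focal length and B is the baseline. This is a minimal illustration, not Google's actual pipeline, and the focal length below is an assumed, typical phone-camera value.

```python
def disparity_mm(focal_length_mm: float, baseline_mm: float, depth_mm: float) -> float:
    """Apparent shift (parallax) between two views of a point at the given depth,
    under a simple pinhole stereo model: d = f * B / Z."""
    return focal_length_mm * baseline_mm / depth_mm

f = 4.4        # assumed focal length in mm (illustrative, not the Pixel 4's spec)
far = 5000.0   # a subject 5 meters away

dp = disparity_mm(f, 1.0, far)    # dual-pixel baseline (~1 mm)
dc = disparity_mm(f, 13.0, far)   # dual-camera baseline (13 mm)
print(f"dual-pixel disparity: {dp:.5f} mm, dual-camera disparity: {dc:.5f} mm")
```

Because disparity scales linearly with baseline, the 13mm camera pair produces a shift thirteen times larger than the 1mm dual-pixel split for the same distant subject, which is why far-away depth becomes much easier to estimate.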
The dual-pixel auto-focus system, combined with the new dual-camera setup, allows the Pixel 4 to produce stunning Portrait Mode images with better depth estimation and more natural bokeh.
© Copyright IBTimes 2024. All rights reserved.