
Google shows how depth detection works on the Pixel 3


If almost every other phone needs two rear cameras to create realistic background blur, then how can the Google Pixel 3 do it with just one? Good depth detection almost always works by detecting the differences between two slightly different views of a scene.
When comparing two images taken side by side, the foreground stays pretty much stationary while the background shifts noticeably, parallel to the direction from one viewpoint to the other. This effect, known as parallax, is how human eyes perceive depth, how many interstellar distances are calculated, and how the iPhone XS creates its background blur.
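To make the geometry concrete, here is a minimal Python sketch of recovering depth from parallax, assuming an idealized pinhole stereo pair; the focal length and baseline values are invented for illustration and are not Pixel hardware specifications.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Triangulate depth from parallax: depth is inversely proportional
    to disparity, so nearby points shift a lot between the two views
    while distant points barely move."""
    disparity_px = np.asarray(disparity_px, dtype=np.float64)
    return focal_length_px * baseline_m / np.maximum(disparity_px, 1e-6)

# Hypothetical numbers: a 1000 px focal length and a 1 cm baseline.
# A foreground point shifting 40 px reads as 0.25 m away; a background
# point shifting only 2 px reads as 5 m away.
print(depth_from_disparity([40.0, 2.0], focal_length_px=1000.0, baseline_m=0.01))
```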
Dual-Pixel Autofocus, a form of Phase Detection Autofocus (PDAF), creates a basic depth map by detecting tiny amounts of parallax between two images captured simultaneously by a single camera. In most cameras this depth information is used only for autofocusing, but on the Pixel 2 and 3 it is the foundation of the depth map used for background blur.
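The article doesn't spell out the matching algorithm, but a classic way to turn two slightly offset views like the PDAF half-images into a coarse disparity map is sum-of-absolute-differences block matching over the few-pixel shifts PDAF produces. The NumPy sketch below is a simplified illustration of that idea, not Google's actual pipeline; the function name and parameters are hypothetical.

```python
import numpy as np

def coarse_disparity_map(left, right, patch=8, max_shift=4):
    """For each patch of the left half-image, find the horizontal shift
    into the right half-image that minimizes the sum of absolute
    differences. PDAF parallax is tiny, so the search range is only a
    few pixels."""
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    h, w = left.shape
    disp = np.zeros((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            ref = left[i:i + patch, j:j + patch]
            best_cost, best_d = np.inf, 0
            for d in range(-max_shift, max_shift + 1):
                if j + d < 0 or j + d + patch > w:
                    continue
                cost = np.abs(ref - right[i:i + patch, j + d:j + d + patch]).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[i // patch, j // patch] = best_d
    return disp

# Toy usage: a bright square shifted 2 px between the half-images shows
# up as disparity 2 in its patch. Flat patches match equally well at
# every shift, the same ambiguity that trips up stereo matching on lines
# parallel to the parallax direction.
left = np.zeros((32, 32)); left[8:16, 8:16] = 1.0
right = np.zeros((32, 32)); right[8:16, 10:18] = 1.0
print(coarse_disparity_map(left, right))
```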
The Pixel 2’s stereo depth detection fails to separate the horizontal lines from the foreground, something the Pixel 3 has no trouble with. Lines running parallel to the direction of parallax barely shift between the two views, so stereo matching alone cannot place them in depth.
Unfortunately, because PDAF was never intended to be used this extensively, it comes with a number of problems that Google has spent several years trying to solve.
