The image segmentation tools help its Pixel 2 smartphones take Portrait Mode shots, and open sourcing them should allow developers to create camera apps that take similar shots, even on phones with a single camera.
Google has announced that it will open source the image segmentation tools behind the Portrait Mode in its AI-powered camera. The move should allow other developers to integrate and utilise the Portrait Mode technology used by the company’s Pixel 2 smartphones, letting them create camera apps that offer better Portrait shots even on devices with a single camera.
The code, called DeepLab-v3+, is an image segmentation tool that categorises each pixel in an image with a label like ‘sky’ or ‘person’. This lets the phone differentiate between foreground and background elements, so it can apply a blurring effect to the background more accurately, which is essentially the basis of a good bokeh shot.
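To make the idea concrete, here is a minimal sketch of that pipeline: segment the ‘person’ pixels, then blur everything else. Google’s released DeepLab-v3+ is TensorFlow research code; this sketch instead uses torchvision’s pretrained DeepLabV3 (a close relative, not Google’s release) purely as an illustration. The file name photo.jpg, the blur radius, and the torchvision >= 0.13 weights API are assumptions.

```python
# Sketch only: approximate a Portrait Mode effect by labelling every pixel,
# keeping the 'person' pixels sharp, and blurring the rest of the frame.
# Assumes torchvision >= 0.13 and a hypothetical input file 'photo.jpg'.
import torch
from torchvision import transforms
from torchvision.models.segmentation import (
    deeplabv3_resnet50,
    DeepLabV3_ResNet50_Weights,
)
from PIL import Image, ImageFilter

weights = DeepLabV3_ResNet50_Weights.DEFAULT
model = deeplabv3_resnet50(weights=weights).eval()

img = Image.open("photo.jpg").convert("RGB")
batch = weights.transforms()(img).unsqueeze(0)   # normalise, add batch dim

with torch.no_grad():
    logits = model(batch)["out"][0]              # (num_classes, H, W)
labels = logits.argmax(0)                        # per-pixel class labels

PERSON = 15                                      # 'person' in the VOC label set
mask = (labels == PERSON).float().unsqueeze(0)   # 1 = keep sharp, 0 = blur

# Resize the mask back to the original image size and composite the
# sharp foreground over a blurred copy of the whole frame.
mask_img = transforms.ToPILImage()(mask).resize(img.size)
blurred = img.filter(ImageFilter.GaussianBlur(radius=12))
result = Image.composite(img, blurred, mask_img.convert("L"))
result.save("portrait.jpg")
```

A production camera app would refine the mask edges and vary the blur with estimated depth, but the core step is the same: a per-pixel label map that separates subject from background.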
Liang-Chieh Chen and Yukun Zhu, software engineers at Google Research, said in the company’s official blog post, “Modern semantic image segmentation systems built on top of convolutional neural networks (CNNs) have reached accuracy levels that were hard to imagine even five years ago, thanks to advances in methods, hardware, and datasets. We hope that publicly sharing our system with the community will make it easier for other groups in academia and industry to reproduce and further improve upon state-of-art systems, train models on new datasets, and envision new applications for this technology.”
It should also be noted that while other manufacturers offer features similar to Portrait Mode, they usually require a secondary camera to judge depth. Google’s Pixel 2 and Pixel 2 XL smartphones achieve the effect with a single camera, which is also regarded as one of the best you can get on a smartphone right now.