Google launches ML Kit for Android and iOS developers

Google today announced the debut in beta of ML Kit, a software development kit optimized for deploying AI for mobile apps on app development platform Firebase. ML Kit, available for both Android and iOS developers, can call on APIs both on-device and in the cloud.
AI in mobile apps can do a range of things, such as extracting nutrition information from a product label or adding style transfers, masks, and other effects to a photo.
The news was announced today at Google’s I/O developer conference, held May 8-10, 2018, at the Shoreline Amphitheatre in Mountain View, California.
The kit is designed to be easy for both novice and advanced developers to employ. AI for text recognition, face detection, barcode scanning, image labeling, and landmark recognition will be available for free on the Firebase console. On-device APIs don’t require a network connection to work.
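For Android developers, calling one of these APIs follows Firebase’s usual task-based pattern. The sketch below shows what on-device text recognition might look like in Kotlin; the class and method names reflect the beta SDK and could change, so treat it as illustrative rather than definitive.

    import android.graphics.Bitmap
    import android.util.Log
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage

    // Runs ML Kit's on-device text recognizer on a bitmap.
    // On-device APIs like this one work without a network connection.
    fun recognizeText(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        val detector = FirebaseVision.getInstance().visionTextDetector

        detector.detectInImage(image)
            .addOnSuccessListener { result ->
                // Recognized text comes back grouped into blocks.
                for (block in result.blocks) {
                    Log.d("MLKit", "Found text: ${block.text}")
                }
            }
            .addOnFailureListener { e ->
                Log.e("MLKit", "Text recognition failed", e)
            }
    }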
APIs for smart reply and high-density face contouring will be made available in the coming months.
Image labeling and optical character recognition (OCR), for recognizing text on an ad or billboard, are also available in the cloud. Landmark recognition is available only in the cloud.
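A cloud-backed detector is invoked much the same way, with the image sent to Google’s servers for processing. Here is a hedged sketch of landmark recognition, again using beta-era identifiers that may differ in later releases:

    import android.graphics.Bitmap
    import android.util.Log
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage

    // Cloud landmark recognition: unlike the on-device APIs,
    // this one requires a network connection.
    fun recognizeLandmarks(bitmap: Bitmap) {
        val image = FirebaseVisionImage.fromBitmap(bitmap)
        val detector = FirebaseVision.getInstance().visionCloudLandmarkDetector

        detector.detectInImage(image)
            .addOnSuccessListener { landmarks ->
                for (landmark in landmarks) {
                    Log.d("MLKit", "${landmark.landmark} (confidence: ${landmark.confidence})")
                }
            }
            .addOnFailureListener { e ->
                Log.e("MLKit", "Landmark recognition failed", e)
            }
    }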
Custom TensorFlow Lite models can also be uploaded through the Firebase console. TensorFlow Lite debuted at I/O last year and launched in developer preview in November.
Google hosts the custom TensorFlow Lite models and serves them to your app’s users, so the models don’t add to the size of an app’s APK.
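In practice, an app registers the hosted model by name and lets Firebase handle the download. The following sketch assumes a model uploaded to the console as "my_custom_model" (a hypothetical name) and uses identifiers from the beta custom-model API, which may change:

    import com.google.firebase.ml.custom.FirebaseModelInterpreter
    import com.google.firebase.ml.custom.FirebaseModelManager
    import com.google.firebase.ml.custom.FirebaseModelOptions
    import com.google.firebase.ml.custom.model.FirebaseCloudModelSource
    import com.google.firebase.ml.custom.model.FirebaseModelDownloadConditions

    // Register a cloud-hosted TensorFlow Lite model so Firebase downloads
    // it on demand instead of it being bundled inside the APK.
    fun loadHostedModel(): FirebaseModelInterpreter? {
        val conditions = FirebaseModelDownloadConditions.Builder()
            .requireWifi() // only fetch the model over Wi-Fi
            .build()

        // "my_custom_model" is a hypothetical name matching a model
        // uploaded through the Firebase console.
        val cloudSource = FirebaseCloudModelSource.Builder("my_custom_model")
            .enableModelUpdates(true)
            .setInitialDownloadConditions(conditions)
            .setUpdatesDownloadConditions(conditions)
            .build()
        FirebaseModelManager.getInstance().registerCloudModelSource(cloudSource)

        val options = FirebaseModelOptions.Builder()
            .setCloudModelName("my_custom_model")
            .build()
        return FirebaseModelInterpreter.getInstance(options)
    }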
“It’s very important,” Google product manager Brahim Elbouchikhi told VentureBeat in an interview. “The ability to dynamically download the model and not have to bundle it into the app, which could literally be 25 MB of data.”
Google is currently experimenting with model compression that would allow a user to upload a full TensorFlow model with training data and receive back a TensorFlow Lite model.
The goal of ML Kit, Elbouchikhi said, is to make AI services easy to add to apps.
“We want ML to become just another tool in a developer’s toolkit just like we use cloud storage or analytics or A/B testing. That’s why it was important to deploy this through Firebase,” he said.
ML Kit gives app developers features similar to those Android users already get through Google Assistant.
Google Lens, the company’s computer vision feature, began rolling out to Pixel users last fall and has since learned to pull text, email addresses, and phone numbers from images. Lens can also identify landmarks, plant species, specific animals, and even famous people.
Smart Reply, available through ML Kit, is comparable to the suggested reply messages available in Android Messages and Wear OS.
The face detection base API is able to determine the size of a smile in an image, a Google spokesperson told VentureBeat in an email. Additional emotion recognition is also in the works, Elbouchikhi said.
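To surface that signal, the face detector has to be configured with classification enabled; each detected face then carries a smiling probability. A minimal sketch, again using beta-era names that may not match the final API:

    import android.graphics.Bitmap
    import android.util.Log
    import com.google.firebase.ml.vision.FirebaseVision
    import com.google.firebase.ml.vision.common.FirebaseVisionImage
    import com.google.firebase.ml.vision.face.FirebaseVisionFaceDetectorOptions

    // Face detection with classification turned on, so each face
    // reports a smiling probability between 0 and 1.
    fun detectSmiles(bitmap: Bitmap) {
        val options = FirebaseVisionFaceDetectorOptions.Builder()
            .setClassificationType(FirebaseVisionFaceDetectorOptions.ALL_CLASSIFICATIONS)
            .build()
        val detector = FirebaseVision.getInstance().getVisionFaceDetector(options)

        detector.detectInImage(FirebaseVisionImage.fromBitmap(bitmap))
            .addOnSuccessListener { faces ->
                for (face in faces) {
                    Log.d("MLKit", "Smiling probability: ${face.smilingProbability}")
                }
            }
            .addOnFailureListener { e ->
                Log.e("MLKit", "Face detection failed", e)
            }
    }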
“We have some stuff down the road that will get into a bit more fine-grained emotion, but that’s not something we’re releasing right now,” he said.
