
With Azure Percept, Microsoft adds new ways for customers to bring AI to the edge

Elevators that respond to voice commands, cameras that notify store managers when to restock shelves and video streams that keep tabs on everything from cash register lines to parking space availability. These are a few of the millions of scenarios becoming possible thanks to the combination of artificial intelligence and edge computing. Standalone edge devices can take advantage of AI tools for tasks like translating text or recognizing images without having to constantly access cloud computing capabilities.

At its Ignite digital conference, Microsoft unveiled the public preview of Azure Percept, a platform of hardware and services that aims to simplify the ways customers can use Azure AI technologies on the edge, including taking advantage of Azure cloud offerings such as device management, AI model development and analytics.

Roanne Sones, corporate vice president of Microsoft's edge and platform group, said the goal of the new offering is to give customers a single, end-to-end system, from the hardware to the AI capabilities, that "just works" without requiring a lot of technical know-how.

The Azure Percept platform includes a development kit with an intelligent camera, Azure Percept Vision. There's also a "getting started" experience called Azure Percept Studio that guides customers, with or without extensive coding experience, through the entire AI lifecycle, including developing, training and deploying proof-of-concept ideas. For example, a company may want to set up a system to automatically identify irregular produce on a production line so workers can pull those items off before shipping.

Azure Percept Vision and Azure Percept Audio, which ships separately from the development kit, connect to Azure services in the cloud and come with embedded hardware-accelerated AI modules that enable speech and vision AI at the edge, including during times when the device isn't connected to the internet. That's useful for scenarios in which the device needs to make lightning-fast calculations without taking the time to connect to the cloud, or in places where there isn't always reliable internet connectivity, such as on a factory floor or in a location with spotty service.
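To make the edge-first pattern described above concrete, here is a minimal sketch of a device that runs vision inference entirely locally and only reports results to the cloud when connectivity happens to be available. The article does not describe Azure Percept's own SDK, so this sketch stands in generic tools: ONNX Runtime for local inference and the Azure IoT device SDK for optional telemetry. The model file, input shape, label and connection string are hypothetical placeholders.

    # Sketch only: local inference first, cloud reporting as best effort.
    import json
    import numpy as np
    import onnxruntime as ort
    from azure.iot.device import IoTHubDeviceClient

    # Hypothetical locally stored model; on Percept-class hardware this would
    # run on the device's hardware-accelerated AI module.
    session = ort.InferenceSession("shelf_model.onnx")
    input_name = session.get_inputs()[0].name

    def classify(frame: np.ndarray) -> int:
        """Run inference on the device itself; no cloud round trip needed."""
        scores = session.run(None, {input_name: frame[np.newaxis].astype(np.float32)})[0]
        return int(np.argmax(scores))

    def report(label: int, connection_string: str) -> None:
        """Best-effort telemetry: skip quietly if the device is offline."""
        try:
            client = IoTHubDeviceClient.create_from_connection_string(connection_string)
            client.send_message(json.dumps({"label": label}))
            client.disconnect()
        except Exception:
            pass  # no connectivity; the local decision has already been made

The point of the split is that classify() never waits on the network, which mirrors the scenarios above (factory floors, spotty service) where the local decision matters more than the cloud record.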
