Microsoft's Project Brainwave Tackles Real-Time AI Workloads With FPGAs

Project Brainwave busts AI performance barriers with the use of field-programmable gate array chips from Intel.
Microsoft’s latest system, dubbed Project Brainwave, uses field-programmable gate arrays (FPGAs) from Intel to process artificial intelligence (AI) workloads in real time, a capability that is coming soon to the Redmond, Wash., software giant’s cloud.
Although AI is quickly becoming a mainstream technology, delivering AI-enabled software and services that run at an acceptable speed generally comes with some hefty IT hardware requirements. In addition to powerful server processors, many organizations have turned to graphics processing units (GPUs) to accelerate the performance of their AI models. Some companies, like Fujitsu and Huawei, are developing entirely new AI chips of their own.
This week, during the Hot Chips 2017 conference in Cupertino, Calif., Microsoft showed off Project Brainwave, an AI system that runs workloads in real time using Intel’s 14-nanometer Stratix 10 FPGA chip.
“By attaching high-performance FPGAs directly to our datacenter network, we can serve DNNs [deep neural networks] as hardware microservices, where a DNN can be mapped to a pool of remote FPGAs and called by a server with no software in the loop,” explained Microsoft distinguished engineer Doug Burger in a blog post. “This system architecture both reduces latency, since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them,” Burger continued.
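To make the “no software in the loop” request path concrete, here is a minimal, purely hypothetical Python sketch of a calling server handing a request straight to a network-attached FPGA pool. The host name, port and length-prefixed wire format are illustrative assumptions, not anything Microsoft has published for Project Brainwave.

```python
import socket
import struct

# Hypothetical endpoint for a pool of network-attached FPGAs serving a DNN.
# The address, port and framing are assumptions for illustration only;
# Microsoft has not published a wire protocol for Project Brainwave.
FPGA_POOL_HOST = "fpga-pool.example.internal"
FPGA_POOL_PORT = 9000


def _recv_exactly(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed early")
        buf += chunk
    return buf


def infer(features):
    """Stream one inference request directly to the accelerator pool.

    The caller serializes the input vector and sends it over the datacenter
    network; the FPGA pool streams the result back without an application
    server processing the request in between.
    """
    payload = struct.pack(f"<I{len(features)}f", len(features), *features)
    with socket.create_connection((FPGA_POOL_HOST, FPGA_POOL_PORT)) as sock:
        sock.sendall(payload)
        (n_out,) = struct.unpack("<I", _recv_exactly(sock, 4))
        data = _recv_exactly(sock, 4 * n_out)
    return struct.unpack(f"<{n_out}f", data)


if __name__ == "__main__":
    print(infer([0.1, 0.2, 0.3]))
```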
Project Brainwave also features a so-called “soft” DNN processing unit (DPU), which exploits the flexibility of commercially available FPGAs to match or surpass the performance of hard-coded DPUs, Burger said. Finally, Project Brainwave supports Microsoft’s own deep learning framework, the Microsoft Cognitive Toolkit, along with TensorFlow from Google, and the company plans to extend support to many other frameworks, he added.
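As a rough illustration of what framework support means in practice, the sketch below defines a small DNN in TensorFlow, one of the frameworks Burger names, and exports it in the standard SavedModel format that downstream accelerator toolchains typically consume. The Brainwave-specific compilation and deployment steps are not publicly documented in the article and are omitted; this is a generic TensorFlow example, not Microsoft's tooling.

```python
# Minimal sketch: build a small DNN in TensorFlow and export it in the
# portable SavedModel format, the usual handoff point for accelerator
# toolchains. The Brainwave-specific steps are intentionally omitted.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(128,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# model.export writes a TensorFlow SavedModel (available in recent TF/Keras
# releases); training is skipped since only the export handoff is shown.
model.export("exported_dnn")
```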
Microsoft Azure customers will soon be able to run their AI workloads using the Project Brainwave system. Users of the company’s other services, including Bing, will indirectly feel the performance-enhancing benefits the technology offers, said Burger.
More information on Project Brainwave, including the record-setting results of tests on the system, is available in Burger’s blog post.
Alibaba, too, has high hopes for FPGAs in cloud data centers. In March, the Chinese web services provider announced that it had teamed with Intel to launch a cloud-based workload acceleration service that uses Intel Arria 10 FPGAs.
“We offer customers access to a number of services in the cloud, and adding an FPGA-based acceleration offering means they can access that powerful computing without the cost or requirement of building out their own infrastructure,” Alibaba Cloud Senior Director Jin Li said.
Intel, which acquired FPGA specialist Altera in 2015 for $16.7 billion, also believes the technology can help shake up the enterprise storage network market. During the recent Flash Memory Summit, the chipmaker and Attala Systems showed off a high-performance storage acceleration system that uses FPGAs and solid-state drives (SSDs) to create elastic pools of storage based on Non-Volatile Memory Express over Fabrics (NVMe-oF) technology.