
Microsoft introduces AI accelerator for US Azure customers

The company has developed Maia 200, an AI accelerator that promises to boost inference workloads
Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.
Microsoft describes Maia 200 as an inference powerhouse, built on TSMC's 3nm process with native FP8/FP4 (floating point) tensor cores and a redesigned memory system using 216GB of the latest high-speed memory architecture (HBM3e), capable of transferring data at 7TB per second. Maia 200 also provides 272MB of on-chip memory plus data-movement engines, which Microsoft said keep massive models fed, fast and highly utilised.
According to the company, these hardware features mean Maia 200 delivers three times the FP4 performance of the third-generation Amazon Trainium, and FP8 performance above Google's seventh-generation tensor processing unit. Microsoft said Maia 200 represents its most efficient inference system yet, offering 30% better cost performance than its existing systems. At the time of writing, however, the company was unable to give a date for when the product would be available outside the US.
