
Google uses deep learning to design faster, smaller AI chips


Silicon engineers, you are now in the PRIME of your life
Googlers and UC Berkeley academics say they have devised a way to use artificial intelligence to design faster, smaller chips that accelerate artificial intelligence.

In a note shared on Thursday, the researchers said they have developed a deep-learning approach called PRIME that generates AI chip architectures by drawing on existing blueprints and their performance figures. They claim the approach can produce designs with lower latency and a smaller footprint than Google’s in-production EdgeTPU accelerator and other designs made using traditional tools.

Google has quite an interest in this area. Last year, it said it had used machine learning to optimize the layout of one of its TPU designs. These latest findings could prove a game-changer for Google’s custom chip design efforts. They are detailed in a paper titled “Data-Driven Offline Optimization for Architecting Hardware Accelerators,” which was accepted to this year’s International Conference on Learning Representations.

Beyond enabling faster and more efficient designs, the PRIME approach matters because traditional simulation-based chip design is time-consuming and computationally expensive, according to the researchers. They said optimizing purely in simulation for targets such as low power usage or low latency can also yield “infeasible” blueprints.
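For a sense of what “data-driven offline optimization” means in practice, here is a minimal Python sketch of the general recipe: fit a surrogate model on logged (configuration, latency) pairs from past simulations, then search for a configuration that minimizes the surrogate’s predicted latency without running any new simulations. The knobs, the synthetic dataset, and the plain least-squares surrogate below are illustrative assumptions for brevity; PRIME itself trains a conservative neural surrogate precisely to avoid the infeasible designs the researchers describe.

```python
# Hypothetical sketch of offline surrogate-based design optimization.
# Not PRIME's actual model or feature set -- an illustration of the idea.
import numpy as np

rng = np.random.default_rng(0)

# Logged data: each row is a candidate accelerator configuration
# (e.g. PE-array width, buffer size, clock divider -- hypothetical knobs
# normalized to [0, 1]), paired with a latency measured in past simulations.
configs = rng.uniform(0.0, 1.0, size=(500, 3))

def simulated_latency(x):
    # Stand-in for an expensive chip simulator; unknown to the optimizer.
    return 3.0 * (x[:, 0] - 0.7) ** 2 + 2.0 * (x[:, 1] - 0.3) ** 2 + 0.5 * x[:, 2]

latencies = simulated_latency(configs) + rng.normal(0.0, 0.05, size=500)

# Fit a simple quadratic surrogate by least squares: phi(x) -> latency.
def features(x):
    return np.hstack([np.ones((len(x), 1)), x, x ** 2])

w, *_ = np.linalg.lstsq(features(configs), latencies, rcond=None)

# Optimize against the surrogate only -- no new simulator calls -- by
# scoring a large batch of random candidate configurations.
candidates = rng.uniform(0.0, 1.0, size=(100_000, 3))
best = candidates[np.argmin(features(candidates) @ w)]
print("surrogate-optimal config:", best.round(3))
```

The payoff of this recipe is that once the logged data exists, evaluating a candidate design costs a model prediction rather than a full simulation run, which is what makes searching a large design space tractable.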
