Intel Chip Chat

The Evolution of Inference with Custom-Built Accelerators - Intel® Chip Chat episode 678

Information:

Synopsis

Academic research and theoretical work have long predicted an exciting future for deep learning, and real-world technology is now catching up with some of AI's most exciting potential. Inference is a particularly fascinating part of AI, as it powers the ability of neural networks to "predict" what certain data looks or sounds like. The Intel Nervana Neural Network Processor for Inference (NNP-I) is purpose-built for intensive inference workloads, accelerating this crucial part of artificial intelligence. Gadi Singer is the VP and General Manager of the Artificial Intelligence Products Group at Intel and a 29-year veteran of the company. In this interview, Gadi discusses both the high-level design philosophy behind the NNP-I's structure and the finer details of its design, such as power efficiency, optimized data movement, and software support. He also talks about which industries and areas could be transformed by better inference, such as image analysis, autom