FPGAs have long been used in the early stages of any new digital technology, given their utility for prototyping and rapid evolution. But with machine learning, FPGAs are showing benefits beyond those of more conventional solutions.
This opens up a hot new market for FPGAs, which traditionally have been hard to sustain in high-volume production due to pricing, and hard to use for battery-driven and other low-power applications. Their principal benefit remains flexibility, which is extraordinarily important for an industry as changeable as machine learning (ML). Not only do applications change in general, but networks and models can morph rapidly as continued training adds further refinements, and different vendors have different ways of addressing that constant change.
“There’s a continuing battle between GPUs, FPGAs, and ASICs,” said Anoop Saha, market development manager at Mentor, a Siemens Business. “If the FPGA vendors didn’t supply [tools], then they’d have no chance.”
The first ML implementations have been software-oriented, leveraging CPUs and GPUs. Their challenge is extremely high energy consumption – even for a data center. “People are trying to innovate sooner with software, but it’s hard to handle the power,” said Patrick Dorsey, general manager and vice president of product marketing, FPGA and power products in Intel’s Programmable Solutions Group.
As a result, FPGAs offer a more attractive power profile than software-programmable alternatives. “FPGAs are specifically better at doing high-performance, low-power applications due to parallelism,” said Shakeel Peera, associate vice president in Microchip’s FPGA business unit.
He’s not alone. “We’re seeing TOPS/W in an FPGA beat Nvidia, if you work at it,” said Intel’s Dorsey.
FPGAs also provide low latency and deterministic performance that can be hard to achieve with software-based solutions. “ASICs do about 5 TOPS/W, FPGAs 1 to 2 TOPS/W,” said Mike Fitton, senior…
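The efficiency figures quoted above translate directly into energy per inference. As a rough sketch, the function below converts a hardware platform's TOPS/W rating into microjoules per inference for a workload of a given size; the workload figure of 8 GOPs (roughly the scale of a ResNet-50 forward pass) is an illustrative assumption, not from the article.

```python
def energy_per_inference_uj(gops_per_inference: float, tops_per_watt: float) -> float:
    """Energy per inference in microjoules.

    TOPS/W is equivalent to tera-operations per joule, so:
        energy (J) = total ops / (TOPS/W * 1e12)
    """
    total_ops = gops_per_inference * 1e9          # GOPs -> ops
    joules = total_ops / (tops_per_watt * 1e12)   # ops / (ops per joule)
    return joules * 1e6                           # J -> microjoules

# Hypothetical 8-GOP workload, using the article's quoted efficiencies:
asic = energy_per_inference_uj(8, 5.0)   # ASIC at ~5 TOPS/W  -> 1600.0 uJ
fpga = energy_per_inference_uj(8, 1.5)   # FPGA at 1-2 TOPS/W
```

At these figures, the FPGA consumes roughly 2.5x to 5x the energy of the ASIC per inference, while keeping the reprogrammability an ASIC gives up.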