FPGA Architecture for Deep Learning: Survey and Future Directions

matt_d | 128 points

The big challenge when it comes to using FPGAs for deep learning is pretty simple: all of that reprogrammability comes at a performance cost. If you're doing something highly specific that conventional GPUs are bad at, like genomics research [1] or high-frequency trading [2], the performance tradeoff is worth it. But GPUs and AI ASICs are already highly optimized for the core deep learning computations, so an FPGA won't offer huge performance gains there.

The main advantage FPGAs offer is the ability to exploit new model optimizations much earlier than any ASIC implementation could. Those proposed ternary LLMs, for example, could potentially run much faster on FPGAs, because the hardware could be specialized for exclusively ternary ops. [3]
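
To make that concrete, here's a rough NumPy sketch (my own illustration, not from the paper) of why weights restricted to {-1, 0, +1} are such a good fit for LUT fabric: the matrix multiply degenerates into adds, subtracts, and skips, with no multipliers needed at all.

    import numpy as np

    def ternary_matvec(W, x):
        """W has entries in {-1, 0, +1}; x is an activation vector.
        No multiplies: every term is an add, a subtract, or a skip,
        which packs densely into LUTs instead of scarce DSP blocks."""
        y = np.zeros(W.shape[0], dtype=x.dtype)
        for i in range(W.shape[0]):
            for j in range(W.shape[1]):
                if W[i, j] == 1:
                    y[i] += x[j]
                elif W[i, j] == -1:
                    y[i] -= x[j]
                # W[i, j] == 0: no hardware activity at all
        return y

    W = np.array([[1, 0, -1], [0, 1, 1]])
    x = np.array([3.0, 5.0, 2.0])
    print(ternary_matvec(W, x))  # [1. 7.], same result as W @ x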

Not to toot my own horn, but I wrote up a blog post recently about building practical FPGA acceleration and which applications are best suited for it: https://www.zach.be/p/how-to-build-a-commercial-open-source

[1] https://aws.amazon.com/solutions/case-studies/munich-leukemi...

[2] https://careers.imc.com/us/en/blogarticle/how-are-fpgas-used...

[3] https://arxiv.org/abs/2402.17764

zachbee | 11 days ago

Reconfigurable logic may be used to implement fairly small models, in applications where FPGAs are already employed and adding a coprocessor specifically for ML is either infeasible or doesn't make sense. As the paper mentions, you need hard logic blocks for arithmetic (if not floating point), and these are always in short supply. In the DSP applications I've worked on, we used the FPGA for timing and I/O, to control jitter and sample from many ADCs in parallel - but apart from some filtering, we ran the numbers on an adjacent non-reconfigurable core. You can get huge chips with a lot of hard logic built in, but they're expensive relative to an FPGA plus a traditional coprocessor with shared memory. The high-end application-specific chips are shockingly expensive - worse than GPUs, because there's comparatively little market for them. We had one evaluation board that was something like $100K, IIRC.
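
For a rough sense of the scale involved (all numbers below are illustrative placeholders, not any specific device), here's a back-of-envelope sketch of why the hard multipliers run out so quickly for ML-sized workloads:

    # Back-of-envelope: hard multiplier (DSP) blocks vs. one conv layer.
    # Numbers are illustrative placeholders, not a specific part.

    dsp_blocks = 2_000        # rough DSP count on a mid-range FPGA
    clock_hz = 300e6          # plausible fabric clock

    # optimistic peak: one MAC per DSP block per cycle, fully pipelined
    peak_macs_per_s = dsp_blocks * clock_hz            # ~6e11 MAC/s

    # a single 3x3 conv layer, 256-in/256-out channels, 56x56 feature map
    macs_per_layer = 3 * 3 * 256 * 256 * 56 * 56       # ~1.8e9 MACs

    print(f"peak: {peak_macs_per_s:.2e} MAC/s")
    print(f"one layer: {macs_per_layer:.2e} MACs -> "
          f"{macs_per_layer / peak_macs_per_s * 1e3:.2f} ms at 100% utilization")

Multiply that by the dozens of layers in a real network and it's clear why the heavy math usually lands on an adjacent coprocessor, with the fabric doing what it's genuinely good at: timing, I/O, and light filtering.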

woopsn | 11 days ago

Meanwhile, others are trying to bring back analog compute for similar kinds of workloads; it's going to be interesting to see which approach wins out in the long run.

For example, https://mythic.ai/products/m1076-analog-matrix-processor/

pjmlp | 11 days ago

My pet project is to take these ideas to their logical end, arriving at a systolic array I call a BitGrid.

It's a Cartesian grid of 4-bit look-up tables, with a bit to and from each neighbor. This keeps each output independent, maximizing utilization.

To solve timing issues, each cell would be clocked, with the grid split into two phases in a checkerboard pattern. This keeps all inputs stable and makes timing deterministic. Unlike an FPGA, latency is not the primary limit on performance, since everything is thus pipelined.

I wrote a simulator, and started learning VHDL in order to program an FPGA board I bought for prototyping the concept. My eventual goal is an ASIC through Tiny Tapeout.

The big software hurdle is compiling expressions into a directed graph of bitwise operations.

Because data only travels between neighbors, all the wires in the chip are short, and an actual chip should need far fewer metallization layers than, say, a CPU.
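
A rough Python sketch of the two-phase checkerboard update (grid size, wrap-around edges, and per-cell truth tables are random placeholders to illustrate the idea, not the actual simulator):

    import random

    GRID_W, GRID_H = 8, 8   # placeholder grid size
    # one 16-entry truth table per cell: 4 input bits -> 4 output bits
    luts = [[[random.randrange(16) for _ in range(16)] for _ in range(GRID_W)]
            for _ in range(GRID_H)]
    # outputs[y][x] packs a cell's four output bits as N<<3 | E<<2 | S<<1 | W
    outputs = [[0] * GRID_W for _ in range(GRID_H)]

    def step(phase):
        """Update only cells whose (x + y) parity matches `phase`.
        Neighbors always have the opposite parity, so their outputs stay
        stable during this half-cycle -- that's what makes timing deterministic."""
        for y in range(GRID_H):
            for x in range(GRID_W):
                if (x + y) % 2 != phase:
                    continue
                # read one bit from each neighbor (torus wrap to keep the sketch short)
                n = (outputs[(y - 1) % GRID_H][x] >> 1) & 1  # north cell's south-facing bit
                e = (outputs[y][(x + 1) % GRID_W] >> 0) & 1  # east cell's west-facing bit
                s = (outputs[(y + 1) % GRID_H][x] >> 3) & 1  # south cell's north-facing bit
                w = (outputs[y][(x - 1) % GRID_W] >> 2) & 1  # west cell's east-facing bit
                outputs[y][x] = luts[y][x][(n << 3) | (e << 2) | (s << 1) | w]

    for _ in range(10):   # ten full clock cycles
        step(0)           # "black" squares latch new outputs
        step(1)           # then the "white" squares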

mikewarot | 11 days ago

Dang's got his work cut out for him tonight

m3kw9 | 11 days ago

I'm glad to see this is being studied. I did a brief half-semester project summarizing the usefulness of accelerators after the end of Dennard scaling and Moore's law (this was when Intel was still pushing out incremental improvements on 14nm). The short summary is that accelerators offered substantial performance-per-watt improvements, at the cost of longer development timelines, manufacturing cost in the case of ASICs and ASIPs, and the inability to fit large algorithms on FPGAs. Though Microsoft found a way to daisy-chain FPGAs for Bing's page rank, it just didn't produce the order-of-magnitude improvement necessary to justify moving to FPGAs. With smaller algorithms, bigger FPGAs, or occasional operation offloading (like what we do now with TPUs and GPUs), FPGAs could be good candidates as accelerators.

Of course, the real killer is how easy it is(n't) to develop for them. AMD offers much better performance per dollar than Nvidia, but their poor drivers make using their hardware a fool's errand.

RADs sound like a good idea. We should be making reprogrammable hardware available and easy to use for everyone.

ZoomerCretin | 11 days ago

Is there any small RISC-V soft core with a big-ass vector unit (RVV, the RISC-V analogue of SVE)? I would like to play around with scalable vectors, but the only options seem to be cloud instances like Graviton and small (128-bit) cores like the LicheeRV Nano (C906), which also seems to support only a pre-ratification draft of the standard.

WanderPanda | 11 days ago

Have them self-modify.

hoseja | 10 days ago
