This project has moved to xlite-dev/ffpa-attn-mma. Please check xlite-dev/ffpa-attn-mma for the latest updates! 🎉🎉
forked from xlite-dev/ffpa-attn-mma
📚FFPA(Split-D): Yet another Faster Flash Prefill Attention with O(1) GPU SRAM complexity for headdim > 256, ~2x↑🎉 vs SDPA EA.
DefTruth/ffpa-attn-mma
Languages
- Cuda 78.2%
- Python 20.5%
- Shell 1.1%
- C++ 0.2%