https://github.com/Dao-AILab/flash-attention
FlashAttention-2 was just released, claiming roughly 2x speedups over v1. Opening this issue as a reminder to look into it, and in case anyone else wants to try implementing it.
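For anyone who wants to experiment, here is a minimal sketch of calling the v2 kernel through its Python wrapper (this assumes `pip install flash-attn>=2.0`, a supported CUDA GPU, and fp16/bf16 tensors; the shapes are illustrative):

```python
import torch
from flash_attn import flash_attn_func

# q/k/v layout expected by flash_attn_func: (batch, seqlen, nheads, headdim)
batch, seqlen, nheads, headdim = 2, 1024, 16, 64
q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# causal=True gives the decoder-style masked attention used in LLM inference
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```

Integrating it into the serving path would of course need more than this (e.g. handling variable-length batches via the varlen variants), but the above should be enough to benchmark the claimed speedup in isolation.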