flash-attention / benchmarks (at commit 81e01efd4b)
Latest commit: ffc8682dd5 by Tri Dao, "Add benchmarking code for Alibi (from Sanghun Cho)", 2024-01-23 19:00:49 -08:00
File                          Last commit message                                    Last commit date
benchmark_alibi.py            Add benchmarking code for Alibi (from Sanghun Cho)    2024-01-23 19:00:49 -08:00
benchmark_causal.py           Fix performance regression with causal                 2023-11-26 19:07:25 -08:00
benchmark_flash_attention.py  add benchmark for xformers fa2 wrapper (#492)          2023-08-25 14:10:05 -07:00
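
These scripts time FlashAttention kernels against baselines and alternative implementations. The snippet below is a minimal sketch of that kind of measurement, not the repository's actual benchmark code: it assumes a CUDA GPU with the flash-attn package installed, uses the public flash_attn_func API (including its alibi_slopes argument, the feature benchmark_alibi.py exercises), and the tensor shapes, repeat counts, and slope formula are illustrative choices.

# A minimal sketch, not the repo's benchmark code. Assumes a CUDA GPU and
# that flash-attn is installed; shapes and slopes are illustrative choices.
import torch
from flash_attn import flash_attn_func

def time_fn(fn, *args, repeats=30, **kwargs):
    # Warm up first so kernel compilation and caching do not skew the timing.
    for _ in range(5):
        fn(*args, **kwargs)
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    for _ in range(repeats):
        fn(*args, **kwargs)
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / repeats  # mean milliseconds per call

batch, seqlen, nheads, headdim = 4, 2048, 16, 64
q, k, v = (torch.randn(batch, seqlen, nheads, headdim,
                       device="cuda", dtype=torch.float16) for _ in range(3))

# Causal FlashAttention forward pass.
ms = time_fn(flash_attn_func, q, k, v, causal=True)

# Same call with ALiBi biases: one fp32 slope per head. Geometric slopes
# are the usual ALiBi choice; the exact formula here is an assumption.
slopes = torch.tensor([2.0 ** (-8.0 * (i + 1) / nheads) for i in range(nheads)],
                      device="cuda", dtype=torch.float32)
ms_alibi = time_fn(flash_attn_func, q, k, v, causal=True, alibi_slopes=slopes)

print(f"causal: {ms:.3f} ms/iter, causal + ALiBi: {ms_alibi:.3f} ms/iter")

CUDA events are used rather than wall-clock time because they measure device execution directly, avoiding the host-side launch overhead that time.time() around asynchronous kernel calls would misreport.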