squall / flash-attention
flash-attention / benchmarks

Latest commit: b4b6e90334 "add benchmark for xformers fa2 wrapper (#492)" by Aman Gupta Karmani, 2023-08-25 14:10:05 -07:00
benchmark_causal.py
    Change causal mask to be aligned to bottom-right instead of top-left (2023-08-24 23:41:07 -07:00)

benchmark_flash_attention.py
    add benchmark for xformers fa2 wrapper (#492) (2023-08-25 14:10:05 -07:00)
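The commit message for benchmark_causal.py refers to how a causal mask is aligned when the query and key sequence lengths differ. A minimal pure-Python sketch of the two conventions (illustrative only, not FlashAttention's actual implementation; the function name and signature are hypothetical):

```python
def causal_mask(seqlen_q, seqlen_k, align="bottom-right"):
    """Build a boolean attention mask; True means the query may attend.

    Top-left alignment pairs query row 0 with key column 0, so extra keys
    at the end are never visible. Bottom-right alignment pairs the LAST
    query row with the LAST key column, which is the natural convention
    when seqlen_q < seqlen_k (e.g. decoding a few new tokens against a
    longer KV cache): each new query sees all earlier cached keys.
    """
    offset = seqlen_k - seqlen_q if align == "bottom-right" else 0
    return [[j <= i + offset for j in range(seqlen_k)]
            for i in range(seqlen_q)]

# 2 queries against 5 keys: the two conventions differ in which
# keys the first query row is allowed to attend to.
for row in causal_mask(2, 5, "top-left"):
    print([int(x) for x in row])
for row in causal_mask(2, 5, "bottom-right"):
    print([int(x) for x in row])
```

With top-left alignment the first query row sees only key 0; with bottom-right alignment it sees keys 0 through 3, and the last query row sees everything.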