squall / flash-attention
flash-attention / training / configs / experiment (at commit 496e4f528c)

History
Latest commit: c2407dec96 by Tri Dao, 2022-12-21 13:42:30 -08:00
"Fix typo in config: train.gpu -> train.gpu_mem"
owt     Update configs, add results                       2022-11-29 04:46:43 -08:00
pile    Fix typo in config: train.gpu -> train.gpu_mem    2022-12-21 13:42:30 -08:00
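The renamed key in the commit message suggests what the fix looked like inside these experiment configs. A hypothetical sketch of the change, assuming Hydra-style YAML configs and that `gpu_mem` selects settings by available GPU memory (the key names besides `train.gpu_mem` itself are illustrative, not taken from the repository):

```yaml
# Before the fix: the key was misspelled as `gpu`.
# train:
#   gpu: 40GB

# After the fix (c2407dec96): the key is `gpu_mem`.
train:
  gpu_mem: 40GB   # hypothetical value; used to pick memory-dependent settings
```

This kind of typo is easy to miss in Hydra configs, since an unrecognized key may simply be ignored or create a new config entry instead of raising an error.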