flash-attention/benchmarks (latest commit 2023-11-26 19:07:25 -08:00)
benchmark_causal.py            Fix performance regression with causal           2023-11-26 19:07:25 -08:00
benchmark_flash_attention.py   add benchmark for xformers fa2 wrapper (#492)    2023-08-25 14:10:05 -07:00