| File | Latest commit | Date |
|------|---------------|------|
| attention | Support FP32 (#141) | 2023-06-07 00:40:21 -07:00 |
| activation_kernels.cu | Support bfloat16 data type (#54) | 2023-05-03 14:09:44 -07:00 |
| activation.cpp | Optimize data movement (#20) | 2023-04-02 00:30:17 -07:00 |
| cache_kernels.cu | Support bfloat16 data type (#54) | 2023-05-03 14:09:44 -07:00 |
| cache.cpp | Memcpy kernel for flash attention (#29) | 2023-04-10 18:22:49 -07:00 |
| layernorm_kernels.cu | Support bfloat16 data type (#54) | 2023-05-03 14:09:44 -07:00 |
| pos_encoding.cpp | Add support for GPT-NeoX (Pythia) (#50) | 2023-04-28 00:32:10 -07:00 |