flash-attention/flash_attn/models
| File | Last commit | Date |
| --- | --- | --- |
| __init__.py | Add __init__.py files to subdirectories for installation | 2022-11-17 16:55:44 -08:00 |
| baichuan.py | Implement norm head for Baichuan2 | 2023-12-22 16:55:40 -08:00 |
| bert.py | Add BigCode converters (#532) | 2023-09-10 17:24:50 -07:00 |
| bigcode.py | Add BigCode converters (#532) | 2023-09-10 17:24:50 -07:00 |
| falcon.py | Run isort and black on python files | 2023-08-18 14:22:11 -07:00 |
| gpt_neox.py | [Gen] Remove minor dead code | 2023-12-19 22:57:39 -08:00 |
| gpt.py | Implement norm head for Baichuan2 | 2023-12-22 16:55:40 -08:00 |
| gptj.py | Run isort and black on python files | 2023-08-18 14:22:11 -07:00 |
| llama.py | Fix E1136 (#563) | 2023-09-21 11:48:23 -07:00 |
| opt.py | Run isort and black on python files | 2023-08-18 14:22:11 -07:00 |
| vit.py | Run isort and black on python files | 2023-08-18 14:22:11 -07:00 |
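
The files above are the model implementations shipped in `flash_attn.models` (GPT, GPT-J, GPT-NeoX, OPT, LLaMA, Falcon, BigCode, Baichuan, BERT, ViT). As a rough orientation, here is a minimal sketch of loading one of them; the config flags (`use_flash_attn`, `fused_bias_fc`, ...) and the exact `from_pretrained` call are assumptions based on the GPT implementation in `gpt.py`, not something this listing guarantees.

```python
# Minimal sketch (assumed API): load GPT-2 weights into the FlashAttention
# GPT implementation from flash_attn/models/gpt.py.
import torch
from transformers import GPT2Config

from flash_attn.models.gpt import GPTLMHeadModel

config = GPT2Config.from_pretrained("gpt2")
# Assumed optional flags that switch on the FlashAttention / fused code paths.
config.use_flash_attn = True
config.fused_bias_fc = True
config.fused_mlp = True
config.fused_dropout_add_ln = True

# from_pretrained is assumed to remap the Hugging Face checkpoint into this
# implementation's state dict (model name, config, then device/dtype kwargs).
model = GPTLMHeadModel.from_pretrained(
    "gpt2", config, device="cuda", dtype=torch.float16
)
model.eval()
```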