flash-attention/flash_attn/models
Latest commit: c9d4a816fa "Support LLaMa2 and CodeLLaMa (#491)" by dan_the_3rd (co-authored-by: danthe3rd), 2023-08-30 10:31:14 -07:00
__init__.py Add __init__.py files to subdirectories for installation 2022-11-17 16:55:44 -08:00
baichuan.py FEAT: add codes which supporting for baichuan-inc/Baichuan-7B (#425) 2023-08-21 11:05:06 -07:00
bert.py Run isort and black on python files 2023-08-18 14:22:11 -07:00
falcon.py Run isort and black on python files 2023-08-18 14:22:11 -07:00
gpt_neox.py Run isort and black on python files 2023-08-18 14:22:11 -07:00
gpt.py Support MQA + MP for decoding (#490) 2023-08-30 10:29:54 -07:00
gptj.py Run isort and black on python files 2023-08-18 14:22:11 -07:00
llama.py Support LLaMa2 and CodeLLaMa (#491) 2023-08-30 10:31:14 -07:00
opt.py Run isort and black on python files 2023-08-18 14:22:11 -07:00
vit.py Run isort and black on python files 2023-08-18 14:22:11 -07:00
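The listing above shows the model-adapter modules: gpt.py provides the core GPT-style model that the other architectures (LLaMa, Falcon, OPT, GPT-J, GPT-NeoX, Baichuan) map onto. As a rough illustration of how llama.py and gpt.py fit together, the sketch below converts a Hugging Face LLaMa-2 config into the GPT-style config and instantiates the model with FlashAttention enabled. The helper names (llama_config_to_gpt2_config, GPTLMHeadModel) and the config flags are assumptions inferred from this directory's layout and the repo's test code, not a verified API reference.

```python
# Hypothetical usage sketch; names and flags below are assumptions, check the
# actual flash_attn.models sources before relying on them.
import torch
from transformers import AutoTokenizer, LlamaConfig

from flash_attn.models.gpt import GPTLMHeadModel
from flash_attn.models.llama import llama_config_to_gpt2_config

# Translate a Hugging Face LLaMa-2 config into the GPT-style config that
# flash_attn.models.gpt consumes, then turn on the fused kernels.
llama_config = LlamaConfig.from_pretrained("meta-llama/Llama-2-7b-hf")
config = llama_config_to_gpt2_config(llama_config)
config.use_flash_attn = True        # attention via FlashAttention kernels
config.fused_bias_fc = True         # fused dense + bias
config.fused_dropout_add_ln = True  # fused dropout + residual + norm
config.residual_in_fp32 = True

model = GPTLMHeadModel(config, device="cuda", dtype=torch.float16)
model.eval()

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
input_ids = tokenizer("FlashAttention makes", return_tensors="pt").input_ids.to("cuda")
with torch.no_grad():
    # Weights are randomly initialized here; load a remapped Hugging Face or
    # Meta state dict (see llama.py) before expecting meaningful outputs.
    logits = model(input_ids).logits
```

In this design, per-architecture files such as llama.py mostly carry config translation and state-dict remapping, while the optimized attention and MLP paths live once in gpt.py, which is why a commit like "Support MQA + MP for decoding (#490)" on gpt.py benefits every architecture in the directory.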