flash-attention/training/configs/experiment/pile/gpt3xl-flash-rotary-8k.yaml

# @package _global_
defaults:
  - /experiment/pile/gpt3xl-flash-8k.yaml
model:
  config:
    max_position_embeddings: 0 # Disable absolute position embedding
    rotary_emb_fraction: 0.5
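    # Note (added, not in the original file): rotary_emb_fraction is understood
    # here as the fraction of each attention head's dimension that rotary
    # position embeddings are applied to, so 0.5 rotates half of the head
    # dimension and leaves the rest unrotated, while absolute position
    # embeddings are disabled above in favor of the rotary ones.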