Fix "InternLM2ForCausalLM does not support Flash Attention 2.0 yet"
#3 opened by kosung
No description provided.
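The PR itself carries no description, but the error quoted in the title is the check `transformers` raises when a model loaded with `trust_remote_code=True` does not declare Flash Attention 2 support; such fixes typically set `_supports_flash_attn_2 = True` (alongside a Flash Attention 2 attention path) in the repo's `modeling_internlm2.py`. Assuming that is what this change does, a minimal sketch of loading the patched model with Flash Attention 2 enabled is shown below; the repo id is a placeholder for this model repository.

```python
import torch
from transformers import AutoModelForCausalLM

# Sketch only: assumes the merged PR declares Flash Attention 2 support
# (`_supports_flash_attn_2 = True`) in the remote modeling code.
model = AutoModelForCausalLM.from_pretrained(
    "path/to/this-model-repo",                # placeholder: use this repo's id
    trust_remote_code=True,                   # pick up the patched modeling code
    torch_dtype=torch.bfloat16,               # FA2 requires fp16/bf16 weights
    attn_implementation="flash_attention_2",  # request Flash Attention 2
)
```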
czczup changed pull request status to merged