Add _support_flash_attn_2 to Llama 2 32k

#37 opened by arshzahed (Together org)

Required for Flash Attention 2 support.
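As a minimal sketch of what this change looks like: in Hugging Face transformers, a model class opts in to Flash Attention 2 by setting a class attribute (named `_supports_flash_attn_2` in the library; the PR title spells it `_support_flash_attn_2`). The stand-in classes below are illustrative only and do not import transformers; the real base class is `transformers.PreTrainedModel`.

```python
# Illustrative stand-in for transformers.PreTrainedModel (assumption for this sketch).
class LlamaPreTrainedModel:
    # Default in the base class: Flash Attention 2 not advertised.
    _supports_flash_attn_2 = False


# Illustrative stand-in for the model class this PR touches.
class LlamaForCausalLM(LlamaPreTrainedModel):
    # The flag this PR adds: advertises Flash Attention 2 support,
    # so loading with attn_implementation="flash_attention_2" is allowed.
    _supports_flash_attn_2 = True


print(LlamaForCausalLM._supports_flash_attn_2)
```

With this flag set, transformers will accept the Flash Attention 2 attention implementation for the model instead of raising an unsupported-model error.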

arshzahed changed pull request status to merged