phi-1 / configuration_mixformer_sequential.py

Commit History

Fixes flash-attn import with a try/except statement
9ed5987

gugarosa committed on

Adds support for MQA/GQA and attention mask during training / fine-tuning.
371fd51

gugarosa committed on

Support for `attention_mask` in forward pass.
d22f35e

gugarosa committed on

Upload MixFormerSequentialForCausalLM
0f4ae0e

suriyagunasekar committed on