Exporting a PyTorch model to a Neuron-compiled model involves specifying the data needed to trace and compile it. Depending on the choice of model and task, we represent this data with configuration classes. Each configuration class is associated with
a specific model architecture and follows the naming convention `ArchitectureNameNeuronConfig`. For instance, the configuration that specifies the Neuron
export of BERT models is `BertNeuronConfig`.
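As an illustration of this naming convention, the mapping from an architecture name to its configuration class name can be sketched as follows. The helper below is hypothetical and for illustration only; the real configuration classes are defined inside 🤗 Optimum:

```python
# Minimal sketch of the ArchitectureNameNeuronConfig naming convention.
# This helper is hypothetical; the actual classes live in 🤗 Optimum.
def neuron_config_class_name(architecture: str) -> str:
    """Turn an architecture name like "bert" into "BertNeuronConfig"."""
    # CamelCase the architecture name, then append the "NeuronConfig" suffix.
    camel = "".join(part.capitalize() for part in architecture.split("-"))
    return f"{camel}NeuronConfig"

print(neuron_config_class_name("bert"))     # BertNeuronConfig
print(neuron_config_class_name("roberta"))  # RobertaNeuronConfig
```

Note that a few architectures stylize their class names differently (e.g. internal capitals), so the sketch is a mnemonic for the convention, not an exhaustive rule.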
Since many architectures share similar properties for their Neuron configuration, 🤗 Optimum adopts a 3-level class hierarchy, whose most concrete level consists of model-specific configuration classes such as the `BertNeuronConfig` mentioned above. These are the ones actually used to export models.

The following architectures and tasks are supported:

| Architecture | Task |
|---|---|
| ALBERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| BERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| CamemBERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| ConvBERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| DeBERTa (INF2 only) | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| DeBERTa-v2 (INF2 only) | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| DistilBERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| ELECTRA | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| FlauBERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| GPT2 | text-generation |
| MobileBERT | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| MPNet | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| RoBERTa | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| RoFormer | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| XLM | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
| XLM-RoBERTa | feature-extraction, fill-mask, multiple-choice, question-answering, text-classification, token-classification |
You can find more details on how to check the supported tasks in the documentation.
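For any architecture/task pair in the table above, the export can be driven from the command line. The following is a hedged sketch, assuming the `optimum-cli export neuron` subcommand from 🤗 Optimum Neuron is installed; the model name, task, and flag spellings are assumptions and may differ across versions:

```shell
# Sketch: export a BERT checkpoint to a Neuron-compiled model with optimum-cli.
# Flag names and the output-directory argument are assumptions based on
# optimum-neuron's CLI; adjust to your installed version.
if command -v optimum-cli >/dev/null 2>&1; then
  optimum-cli export neuron \
    --model bert-base-uncased \
    --task text-classification \
    --batch_size 1 \
    --sequence_length 128 \
    bert_neuron/
fi
```

The batch size and sequence length fix the dummy input shapes used when tracing, which is why they must be provided at export time rather than at inference time.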
More architectures coming soon, stay tuned! 🚀