---
license: apache-2.0
---

Graphcore and Hugging Face are working together to make training of Transformer models on IPUs fast and easy. Learn more about how to take advantage of the power of Graphcore IPUs to train Transformer models at [hf.co/hardware/graphcore](https://huggingface.co/hardware/graphcore).

# T5 Small model IPU config

This model contains just the `IPUConfig` files for running the T5 Small model (e.g. [`t5-small`](https://huggingface.co/t5-small)) on Graphcore IPUs.

**This model contains no model weights, only an `IPUConfig`.**

## Model description

Text-to-Text Transfer Transformer (T5) is a Transformer-based model that uses a text-to-text approach for translation, question answering, and classification. It introduces a unified framework that converts all text-based language problems into a text-to-text format for transfer learning in NLP. This allows the same model, loss function, hyperparameters, etc. to be used across a diverse set of tasks.

Paper link: [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/abs/1910.10683)

## Usage

```python
from optimum.graphcore import IPUConfig

ipu_config = IPUConfig.from_pretrained("Graphcore/t5-small-ipu")
```
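As a sketch of how this config is typically consumed, the snippet below passes it to `IPUTrainer` from `optimum-graphcore`, which mirrors the Hugging Face `Trainer` API. It assumes the `optimum-graphcore` package is installed and IPU hardware is available; the output directory and dataset are illustrative placeholders.

```python
# Sketch: fine-tuning t5-small on Graphcore IPUs with this IPUConfig.
# Assumes optimum-graphcore is installed and IPU hardware is available.
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

ipu_config = IPUConfig.from_pretrained("Graphcore/t5-small-ipu")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
tokenizer = AutoTokenizer.from_pretrained("t5-small")

training_args = IPUTrainingArguments(
    output_dir="./t5-small-ipu-finetuned",  # hypothetical path
    per_device_train_batch_size=1,
    num_train_epochs=1,
)

trainer = IPUTrainer(
    model=model,
    ipu_config=ipu_config,  # pipeline/parallelism settings for the IPU
    args=training_args,
    train_dataset=None,     # supply a tokenized seq2seq dataset here
    tokenizer=tokenizer,
)
# trainer.train()  # compiles the model for the IPU and starts training
```

Note that the `IPUConfig` only describes how the model is placed and executed on the IPU (e.g. pipelining across devices); the model weights themselves come from the standard `t5-small` checkpoint.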