Training on a single device is as simple as in Transformers:
Replace the `Trainer` class with the `GaudiTrainer` class and the `TrainingArguments` class with the `GaudiTrainingArguments` class, then add the following arguments (see the sketch after this list):
- `use_habana` to execute your script on an HPU,
- `use_lazy_mode` to use lazy mode (recommended) or not (i.e. eager mode),
- `gaudi_config_name` to give the name of (Hub) or the path to (local) your Gaudi configuration file.

To go further, we invite you to read our guides about accelerating training and pretraining.
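As a minimal sketch of these changes, assuming a model and tokenized datasets have already been prepared, and using `Habana/bert-base-uncased` as an example Gaudi configuration from the Hub:

```python
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

# GaudiTrainingArguments replaces TrainingArguments and adds the HPU-specific options
training_args = GaudiTrainingArguments(
    output_dir="./results",                        # any output directory
    use_habana=True,                               # execute the script on an HPU
    use_lazy_mode=True,                            # lazy mode (recommended); set to False for eager mode
    gaudi_config_name="Habana/bert-base-uncased",  # example Gaudi configuration from the Hub
)

# GaudiTrainer replaces Trainer and is used the same way
trainer = GaudiTrainer(
    model=model,                  # assumed: a model loaded beforehand
    args=training_args,
    train_dataset=train_dataset,  # assumed: a tokenized training dataset
    eval_dataset=eval_dataset,    # assumed: a tokenized evaluation dataset
)

trainer.train()
```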