Distributed training with 🤗 Accelerate
As models get bigger, parallelism has emerged as a strategy for training larger models on limited hardware and accelerating training by several orders of magnitude.
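
To make this concrete, below is a minimal sketch of the pattern 🤗 Accelerate enables: wrapping a standard PyTorch training loop so the same code runs on a CPU, a single GPU, or multiple distributed processes. The toy linear model, optimizer, and random data are illustrative placeholders, not part of this guide; only `Accelerator`, `prepare()`, and `backward()` come from the Accelerate API.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from accelerate import Accelerator

accelerator = Accelerator()

# Placeholder model and data for illustration only.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataset = TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,)))
dataloader = DataLoader(dataset, batch_size=8)

# prepare() wraps the objects so the loop below runs unchanged
# on whatever hardware the script was launched on.
model, optimizer, dataloader = accelerator.prepare(model, optimizer, dataloader)

loss_fn = torch.nn.CrossEntropyLoss()
for inputs, targets in dataloader:
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    # accelerator.backward() replaces loss.backward() so gradients
    # are synchronized correctly across processes.
    accelerator.backward(loss)
    optimizer.step()
```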