DistilBERT uses knowledge distillation - a compression technique - to create a smaller version of BERT while keeping nearly all of its language understanding capabilities.
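To make the idea concrete, here is a minimal PyTorch sketch of a distillation objective, not the actual DistilBERT training code: the student is trained to match the teacher's temperature-softened output distribution while still fitting the hard labels. The function name `distillation_loss` and the `temperature`/`alpha` values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend soft-target KL divergence (teacher -> student) with
    ordinary cross-entropy on the hard labels. Illustrative only."""
    # Soften both distributions with the temperature, then measure how far
    # the student's distribution is from the teacher's.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescales gradients when using a temperature

    # Standard supervised loss on the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage with random logits: batch of 4 examples, 10 classes.
student = torch.randn(4, 10, requires_grad=True)
teacher = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = distillation_loss(student, teacher, labels)
loss.backward()
```

In this setup only the student's parameters receive gradients; the teacher (here just fixed logits) acts as a frozen source of soft targets, which is how distillation compresses a large model into a smaller one.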