DistilBERT uses knowledge distillation, a compression technique, to create a smaller version of BERT while retaining nearly all of its language understanding capabilities.
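
As a quick illustration, the sketch below loads DistilBERT alongside its BERT teacher and compares their parameter counts. It assumes the `transformers` library and the public `bert-base-uncased` and `distilbert-base-uncased` checkpoints; it is not part of the distillation procedure itself.

```python
from transformers import AutoModel

# Load the original BERT (teacher) and the distilled DistilBERT (student).
teacher = AutoModel.from_pretrained("bert-base-uncased")
student = AutoModel.from_pretrained("distilbert-base-uncased")

# Compare model sizes: DistilBERT keeps most of BERT's capability
# with roughly 40% fewer parameters.
print(f"BERT parameters:       {teacher.num_parameters():,}")
print(f"DistilBERT parameters: {student.num_parameters():,}")
```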