---
license: mit
---
This model has been pretrained on the BEIR corpus without relevance-level supervision, following the approach described in the paper **COCO-DR: Combating Distribution Shifts in Zero-Shot Dense Retrieval with Contrastive and Distributionally Robust Learning**. The associated GitHub repository is available at https://github.com/OpenMatch/COCO-DR.
This model uses BERT-large as its backbone, with 335M parameters.
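Below is a minimal sketch of using this checkpoint as a dense retriever with Hugging Face Transformers. The repository id (`OpenMatch/cocodr-large`) and the use of the [CLS] vector as the sequence embedding are assumptions for illustration, not taken from this card; adjust them to the actual checkpoint and pooling strategy.

```python
# Sketch: encode a query and a passage with this checkpoint and score by dot product.
# The model id and [CLS] pooling below are assumptions, not confirmed by this card.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "OpenMatch/cocodr-large"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

query = "what is dense retrieval?"
passage = "Dense retrieval encodes queries and documents into vectors and ranks them by similarity."

with torch.no_grad():
    q_inputs = tokenizer(query, return_tensors="pt", truncation=True, max_length=64)
    p_inputs = tokenizer(passage, return_tensors="pt", truncation=True, max_length=256)
    # Use the [CLS] token representation as the dense embedding (assumed pooling).
    q_emb = model(**q_inputs).last_hidden_state[:, 0]
    p_emb = model(**p_inputs).last_hidden_state[:, 0]

score = (q_emb @ p_emb.T).item()  # dot-product relevance score
print(score)
```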