pszemraj committed (verified)
Commit bbb79f1 · 1 Parent(s): 8e53191

Update README.md

Files changed (1)
  1. README.md +11 -11

README.md CHANGED
@@ -743,20 +743,20 @@ It achieves the following results on the evaluation set:
 
 Thus far, all completed in fp32 (_using nvidia tf32 dtype behind the scenes when supported_)
 
- | Model | Size | CoLA | SST2 | MRPC | STSB | QQP | MNLI | QNLI | RTE | Avg |
- |------------------------------------|-------|-------|------|------|------|------|------|------|------|----------|
- | bert-plus-L8-4096-v1.0 | 88.1M | 62.72 | 90.6 | 86.59| 92.07| 90.6 | 83.2 | 90.0 | 66.43| 82.78 |
- | bert_uncased_L-8_H-768_A-12 | 81.2M | 54.0 | 92.6 | 85.43| 92.60| 90.6 | 81.0 | 90.0 | 67.0 | 81.65 |
- | bert-base-uncased | 110M | 52.1 | 93.5 | 88.9 | 85.8 | 71.2 | 84.0 | 90.5 | 66.4 | 79.05 |
+ | Model | Size | Avg | CoLA | SST2 | MRPC | STSB | QQP | MNLI | QNLI | RTE |
+ |------------------------------------|-------|----------|-------|------|------|------|------|------|------|-------|
+ | bert-plus-L8-4096-v1.0 | 88.1M | 82.78 | 62.72 | 90.6 | 86.59| 92.07| 90.6 | 83.2 | 90.0 | 66.43 |
+ | bert_uncased_L-8_H-768_A-12 | 81.2M | 81.65 | 54.0 | 92.6 | 85.43| 92.60| 90.6 | 81.0 | 90.0 | 67.0 |
+ | bert-base-uncased | 110M | 79.05 | 52.1 | 93.5 | 88.9 | 85.8 | 71.2 | 84.0 | 90.5 | 66.4 |
 
 and some comparisons to recent BERT models taken from [nomic's blog post](https://blog.nomic.ai/posts/nomic-embed-text-v1):
 
- | Model | Size | CoLA | SST2 | MRPC | STSB | QQP | MNLI | QNLI | RTE | Avg |
- |---------------|-------|-------|------|------|------|------|------|------|------|-------|
- | NomicBERT | 137M | 50.00 | 93.00| 88.00| 90.00| 92.00| 86.00| 92.00| 82.00| 84.00 |
- | RobertaBase | 125M | 64.00 | 95.00| 90.00| 91.00| 92.00| 88.00| 93.00| 79.00| 86.00 |
- | JinaBERTBase | 137M | 51.00 | 95.00| 88.00| 90.00| 81.00| 86.00| 92.00| 79.00| 83.00 |
- | MosaicBERT | 137M (??) | 59.00 | 94.00| 89.00| 90.00| 92.00| 86.00| 91.00| 83.00| 85.00 |
+ | Model | Size | Avg | CoLA | SST2 | MRPC | STSB | QQP | MNLI | QNLI | RTE |
+ |---------------|-------|-------|-------|------|------|------|------|------|------|-------|
+ | NomicBERT | 137M | 84.00 | 50.00 | 93.00| 88.00| 90.00| 92.00| 86.00| 92.00| 82.00 |
+ | RobertaBase | 125M | 86.00 | 64.00 | 95.00| 90.00| 91.00| 92.00| 88.00| 93.00| 79.00 |
+ | JinaBERTBase | 137M | 83.00 | 51.00 | 95.00| 88.00| 90.00| 81.00| 86.00| 92.00| 79.00 |
+ | MosaicBERT | 137M | 85.00 | 59.00 | 94.00| 89.00| 90.00| 92.00| 86.00| 91.00| 83.00 |
 
 ### Observations:
 
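Note on the "fp32 (_using nvidia tf32 dtype behind the scenes when supported_)" remark in the diff above: on Ampere-class and newer NVIDIA GPUs, PyTorch can run fp32 matrix multiplications on TF32 tensor cores while the code itself stays in fp32. The snippet below is a minimal sketch of the usual PyTorch flags for this; it is an assumed illustration, not code taken from this repo.

```python
import torch

# Assumed illustration: keep the eval code in fp32, but allow matmuls and
# cuDNN convolutions to use NVIDIA's TF32 tensor cores when supported.
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

# Equivalent newer-style switch for matmul precision (PyTorch >= 1.12):
# torch.set_float32_matmul_precision("high")
```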