| Column | Type / value statistics |
|---|---|
| sha | null |
| last_modified | null |
| library_name | stringclasses (154 values) |
| text | stringlengths 1–900k |
| metadata | stringlengths 2–348k |
| pipeline_tag | stringclasses (45 values) |
| id | stringlengths 5–122 |
| tags | listlengths 1–1.84k |
| created_at | stringlengths 25–25 |
| arxiv | listlengths 0–201 |
| languages | listlengths 0–1.83k |
| tags_str | stringlengths 17–9.34k |
| text_str | stringlengths 0–389k |
| text_lists | listlengths 0–722 |
| processed_texts | listlengths 1–723 |
| tokens_length | listlengths 1–723 |
| input_texts | listlengths 1–61 |
| embeddings | listlengths 768–768 |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 80k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
[the original BERT model](https://github.com/google-research/bert), but
with different random seeds, which cause variation in the initial weights and the order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #0, captured at step 80k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data, and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, so it is likely that some differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example in
TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0-step_80k')
model = TFBertModel.from_pretrained("google/multiberts-seed_0-step_80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0-step_80k')
model = BertModel.from_pretrained("google/multiberts-seed_0-step_80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
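Because this checkpoint was pre-trained with the MLM objective, you can also inspect its masked-token predictions directly. The snippet below is a minimal PyTorch sketch rather than part of the official card; it assumes the released weights include the masked-language-modelling head (the repository is tagged `pretraining`). If they do not, `transformers` will initialise that head randomly and warn accordingly.
```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0-step_80k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_0-step_80k')
model.eval()

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**encoded_input).logits

# Locate the [MASK] position and list the five most likely fillers.
mask_positions = (encoded_input['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```
At only 80k of the 2M training steps, these predictions may still be noisy compared to the fully trained model.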
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_0", "multiberts-seed_0-step_80k"]}
| null |
google/multiberts-seed_0-step_80k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_0",
"multiberts-seed_0-step_80k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #multiberts-seed_0-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 80k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #0, captured at step 80k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0, captured at step 80k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #multiberts-seed_0-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0, captured at step 80k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
83,
221,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #multiberts-seed_0-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0, captured at step 80k (max: 2000k, i.e., 2M steps)."
] |
[768-dimensional embedding vector: float values omitted] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
[the original BERT model](https://github.com/google-research/bert), but
with different random seeds, which cause variation in the initial weights and the order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #0, captured at step 900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data, and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, so it is likely that some differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example in
TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0-step_900k')
model = TFBertModel.from_pretrained("google/multiberts-seed_0-step_900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0-step_900k')
model = BertModel.from_pretrained("google/multiberts-seed_0-step_900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
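Since all intermediate checkpoints of a seed share the same tokenizer and architecture, they can be compared directly. The following is a small illustrative sketch, not part of the original card: it contrasts this step-900k checkpoint with the step-80k checkpoint of the same seed by comparing the final-layer [CLS] vectors for a single sentence.
```python
import torch
from transformers import BertTokenizer, BertModel

text = "Intermediate checkpoints let you study how representations evolve during pre-training."

def cls_embedding(checkpoint_name):
    # Load the tokenizer and encoder for one intermediate checkpoint.
    tokenizer = BertTokenizer.from_pretrained(checkpoint_name)
    model = BertModel.from_pretrained(checkpoint_name)
    model.eval()
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        output = model(**encoded_input)
    # Final-layer [CLS] vector as a crude sentence representation.
    return output.last_hidden_state[0, 0]

early = cls_embedding('google/multiberts-seed_0-step_80k')
late = cls_embedding('google/multiberts-seed_0-step_900k')
print(torch.cosine_similarity(early, late, dim=0).item())
```
A lower similarity indicates that the representation of this sentence changed substantially between 80k and 900k steps.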
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_0", "multiberts-seed_0-step_900k"]}
| null |
google/multiberts-seed_0-step_900k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_0",
"multiberts-seed_0-step_900k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #multiberts-seed_0-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #0, captured at step 900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0, captured at step 900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #multiberts-seed_0-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0, captured at step 900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
83,
221,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #multiberts-seed_0-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 0, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0, captured at step 900k (max: 2000k, i.e., 2M steps)."
] |
[768-dimensional embedding vector: float values omitted] |
null | null |
transformers
|
# MultiBERTs - Seed 0
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #0.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0')
model = TFBertModel.from_pretrained("google/multiberts-seed_0")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0')
model = BertModel.from_pretrained("google/multiberts-seed_0")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
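Since the checkpoint was pre-trained with the MLM objective, it can also be loaded with a masked-language-modelling head. The snippet below is a minimal, illustrative sketch (the example sentence is arbitrary, and `BertForMaskedLM` simply drops the unused NSP head weights with a warning):
```
from transformers import BertTokenizer, BertForMaskedLM
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_0')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_0')

# Illustrative input; any sentence containing a [MASK] token works.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Report the most likely token at the [MASK] position.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_index].argmax(dim=-1)))
```
This is only a quick smoke test of the pre-trained weights; for the robustness analyses described in the paper, see the statistical library released alongside the checkpoints.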
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_0"]}
| null |
google/multiberts-seed_0
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_0",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 0
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #0.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 0\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 0\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_0 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 0\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #0.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06348147988319397,
0.09134791791439056,
-0.004097555298358202,
0.04404676333069801,
0.07663298398256302,
0.014618179760873318,
0.0546838715672493,
0.0744563415646553,
-0.08894481509923935,
0.022203590720891953,
-0.011107285507023335,
-0.04699849337339401,
0.07697299122810364,
-0.044434722512960434,
0.05911784619092941,
-0.23470477759838104,
0.04991300776600838,
-0.030040016397833824,
-0.025379709899425507,
0.027879837900400162,
0.11212600022554398,
-0.09609099477529526,
0.07507255673408508,
0.05523442104458809,
0.0054939440451562405,
0.01703624799847603,
-0.015339040197432041,
0.004526139236986637,
0.08676299452781677,
0.03203573077917099,
0.0842156782746315,
-0.0022363823372870684,
0.08481436222791672,
-0.14168013632297516,
0.006506013218313456,
0.060008905827999115,
0.0637328028678894,
0.04276448115706444,
0.11685805022716522,
0.006145933642983437,
0.08776255697011948,
0.018247369676828384,
0.05225758999586105,
0.04559563472867012,
-0.07451483607292175,
-0.17086099088191986,
-0.09191519767045975,
0.018934501335024834,
-0.0009043535101227462,
0.006776586640626192,
-0.007092646788805723,
-0.01971660740673542,
-0.019919048994779587,
0.021164925768971443,
0.12013713270425797,
-0.2663504481315613,
-0.016336608678102493,
0.009096262976527214,
0.05840441584587097,
0.05828440934419632,
-0.03796809911727905,
-0.04092151299118996,
0.04300388693809509,
0.052711308002471924,
0.04199015721678734,
-0.024615632370114326,
0.04072779789566994,
-0.01532090362161398,
-0.15352697670459747,
-0.019232897087931633,
0.10764272511005402,
-0.048630718141794205,
-0.1180073693394661,
-0.0474514439702034,
-0.032808613032102585,
0.12229862809181213,
0.008804352954030037,
-0.036022938787937164,
0.046235572546720505,
0.03103279136121273,
0.06299576163291931,
-0.06321673095226288,
-0.11675592511892319,
0.025989660993218422,
-0.05145544931292534,
0.10768239945173264,
0.09365957975387573,
0.048886336386203766,
-0.008096014149487019,
0.05627436563372612,
-0.08466444909572601,
-0.0777100995182991,
-0.05166323482990265,
-0.0891110748052597,
-0.0406685471534729,
-0.038304030895233154,
-0.08460507541894913,
-0.16520652174949646,
-0.0034270461183041334,
0.10932166874408722,
-0.06159782037138939,
0.008433833718299866,
-0.09011918306350708,
-0.0213396605104208,
0.09429401904344559,
0.16182160377502441,
-0.10809563845396042,
0.04756864905357361,
-0.010545586235821247,
0.010479333810508251,
-0.023708142340183258,
0.03157760202884674,
0.011759131215512753,
-0.009460117667913437,
0.05119895190000534,
0.023499032482504845,
-0.019064130261540413,
0.04299354925751686,
-0.020462920889258385,
-0.041928332298994064,
0.053919125348329544,
-0.13419577479362488,
-0.010527880862355232,
0.0025748913176357746,
-0.004058361053466797,
0.06148019805550575,
0.06514947861433029,
-0.027979589998722076,
-0.09082280844449997,
0.02280041202902794,
-0.08169088512659073,
-0.04604680463671684,
-0.06064695119857788,
-0.15678581595420837,
0.026972122490406036,
-0.0759153664112091,
-0.04961085692048073,
-0.09434142708778381,
-0.09842300415039062,
-0.026155361905694008,
0.06145773455500603,
-0.0174382533878088,
0.038339320570230484,
0.030076714232563972,
-0.007923842407763004,
-0.0420914962887764,
0.046087443828582764,
0.006421390455216169,
-0.01578509621322155,
0.008137366734445095,
-0.044225018471479416,
0.05399991571903229,
-0.010007884353399277,
0.044185709208250046,
-0.07101016491651535,
0.020790254697203636,
-0.14303945004940033,
0.0622941330075264,
-0.096797876060009,
-0.08369497954845428,
-0.05078582838177681,
-0.041730206459760666,
-0.07294587790966034,
0.03057989850640297,
0.00981735810637474,
0.06196695193648338,
-0.14919544756412506,
-0.050029028207063675,
0.13819754123687744,
-0.13621671497821808,
0.0355629064142704,
0.09586847573518753,
-0.05102582275867462,
0.04404253140091896,
0.11680541932582855,
0.057976134121418,
0.07099370658397675,
-0.04865429177880287,
-0.01432234887033701,
0.009640148840844631,
0.03468800336122513,
0.14422747492790222,
0.0662224292755127,
-0.06929492205381393,
-0.08295390754938126,
0.035537220537662506,
-0.07415435463190079,
-0.043759603053331375,
-0.05968667194247246,
-0.005233940202742815,
-0.010165195912122726,
-0.055903609842061996,
-0.006103554740548134,
-0.02490069344639778,
-0.012232575565576553,
-0.017299503087997437,
-0.051570989191532135,
0.05185756832361221,
0.06213824450969696,
-0.08815706521272659,
0.057140618562698364,
-0.055001575499773026,
0.017028868198394775,
-0.08032097667455673,
-0.0014069925528019667,
-0.17909151315689087,
0.007952227257192135,
0.1111590638756752,
-0.09949753433465958,
0.05065656453371048,
0.16269360482692719,
0.02194255217909813,
0.06833399087190628,
-0.051949311047792435,
0.07103541493415833,
0.005773904733359814,
-0.024574479088187218,
-0.04556521400809288,
-0.11844345927238464,
-0.06475688517093658,
-0.061129823327064514,
0.01009130384773016,
-0.08441444486379623,
-0.005402550101280212,
-0.037963591516017914,
0.020274141803383827,
0.02328166551887989,
-0.06356357783079147,
0.01967312954366207,
0.025061173364520073,
-0.03819393739104271,
-0.027385789901018143,
-0.02551773004233837,
0.045324284583330154,
0.015214502811431885,
0.11633842438459396,
-0.0934378057718277,
-0.06713037192821503,
0.04638172313570976,
0.05119144544005394,
-0.052743252366781235,
0.09250743687152863,
-0.0545639805495739,
-0.03384062647819519,
-0.09980370849370956,
-0.09809176623821259,
0.1737094521522522,
-0.004645515698939562,
0.09988675266504288,
-0.09637963026762009,
-0.025809271261096,
-0.00025844573974609375,
-0.009388502687215805,
-0.0033792315516620874,
0.05178667977452278,
0.013217817060649395,
-0.09561231732368469,
-0.003034848254173994,
0.015971172600984573,
0.017001939937472343,
0.07741263508796692,
-0.0198826901614666,
-0.11433852463960648,
0.030170919373631477,
-0.00153095624409616,
-0.006794648710638285,
0.06611481308937073,
-0.049531109631061554,
-0.005169583018869162,
0.05511344596743584,
0.05702517554163933,
0.055942218750715256,
-0.06593325734138489,
0.09603728353977203,
0.06585211306810379,
-0.04417655989527702,
-0.04300270229578018,
-0.08495260775089264,
0.01055234670639038,
0.11531086266040802,
0.026053400710225105,
0.058576297014951706,
-0.04665558785200119,
-0.023381656035780907,
-0.10389069467782974,
0.15701401233673096,
-0.08753034472465515,
-0.16010160744190216,
-0.15123872458934784,
0.0072089084424078465,
-0.05611003190279007,
0.06263171881437302,
0.016266928985714912,
-0.04971632733941078,
-0.097598597407341,
-0.0793086513876915,
0.15905779600143433,
-0.04004748910665512,
-0.006592527497559786,
0.01778317801654339,
-0.029079437255859375,
0.037382882088422775,
-0.1813046634197235,
-0.0009838473051786423,
-0.03986157104372978,
-0.126040518283844,
-0.03831968829035759,
0.00023793311265762895,
0.0676201730966568,
0.07154376059770584,
-0.037827327847480774,
-0.07556194067001343,
0.01841055415570736,
0.16410550475120544,
0.034475330263376236,
0.07777849584817886,
0.09362664818763733,
-0.09786275029182434,
0.04275747761130333,
0.0470900796353817,
0.030568035319447517,
-0.01317680161446333,
0.00877966359257698,
0.057129666209220886,
-0.02626064233481884,
-0.28366661071777344,
-0.008871647529304028,
-0.019146515056490898,
-0.016834091395139694,
0.06577168405056,
0.04216932877898216,
-0.08628091216087341,
0.048329874873161316,
-0.05641721189022064,
0.031463317573070526,
0.08704635500907898,
0.0455450601875782,
0.0949530079960823,
-0.03888785094022751,
0.09348887950181961,
-0.053577933460474014,
-0.017186347395181656,
0.1083756685256958,
-0.05042845010757446,
0.19885684549808502,
-0.05675148591399193,
0.05241229012608528,
0.09802643954753876,
-0.012092536315321922,
0.03808341547846794,
0.13688436150550842,
-0.05175427347421646,
0.07067287713289261,
-0.05797388777136803,
-0.045471496880054474,
-0.037949539721012115,
0.023862920701503754,
-0.0010185559513047338,
0.03650864586234093,
-0.03700581192970276,
-0.01610821671783924,
-0.0032842769287526608,
0.2411997765302658,
0.06864243000745773,
-0.1242741271853447,
-0.06861361116170883,
0.007308658212423325,
-0.10816686600446701,
-0.07013854384422302,
0.05083015561103821,
0.09124313294887543,
-0.08272118121385574,
0.046376071870326996,
0.010353096760809422,
0.06798919290304184,
-0.1276969015598297,
0.02086644433438778,
0.040054887533187866,
0.0498478002846241,
-0.025266289710998535,
0.03293958306312561,
-0.15615378320217133,
0.08322665095329285,
0.035959888249635696,
0.053600676357746124,
-0.05139937251806259,
0.06410151720046997,
0.01980145275592804,
-0.013239189051091671,
0.02537410706281662,
0.01104727853089571,
-0.019831771031022072,
-0.027269750833511353,
-0.0668734535574913,
0.084242083132267,
0.07589869201183319,
-0.051698822528123856,
0.11961271613836288,
-0.05008887127041817,
0.011882588267326355,
-0.009820902720093727,
0.07737316936254501,
-0.17306910455226898,
-0.12988702952861786,
0.04511251300573349,
-0.1415262222290039,
-0.024528833106160164,
-0.06759701669216156,
-0.05405314266681671,
-0.0683668702840805,
0.16824474930763245,
-0.12418228387832642,
-0.13327598571777344,
-0.08530411869287491,
-0.012465237639844418,
0.15218688547611237,
-0.030382148921489716,
0.008049091324210167,
-0.017263540998101234,
0.1317349523305893,
-0.03609155863523483,
-0.1517772078514099,
-0.04875411465764046,
-0.07098285108804703,
-0.15057256817817688,
-0.03386784344911575,
0.07002026587724686,
0.11025422811508179,
0.05123182386159897,
0.005516097415238619,
0.025679154321551323,
0.0032997315283864737,
-0.05231136083602905,
-0.01609043776988983,
0.1819131225347519,
0.05344613268971443,
0.07098323106765747,
-0.1605874001979828,
-0.05800432339310646,
-0.048938535153865814,
0.023733001202344894,
-0.046227797865867615,
0.09761179983615875,
-0.030337223783135414,
0.07908507436513901,
0.24234406650066376,
-0.1294488161802292,
-0.20359966158866882,
0.008050324395298958,
0.029643457382917404,
0.0036869575269520283,
0.00603529904037714,
-0.2245161533355713,
0.1220475286245346,
0.0885591134428978,
-0.0005780651117675006,
-0.00709732249379158,
-0.18617171049118042,
-0.08171971142292023,
0.08043845742940903,
0.008450347930192947,
0.14756087958812714,
-0.09198601543903351,
-0.03094332665205002,
0.008547174744307995,
-0.08552410453557968,
0.05281217396259308,
0.04643096402287483,
0.08345118910074234,
0.000009585733096173499,
-0.07579001784324646,
0.049903251230716705,
-0.014409158378839493,
0.08615939319133759,
0.0455470085144043,
0.04663722589612007,
-0.03365982323884964,
0.13189436495304108,
0.001963832415640354,
-0.017745455726981163,
0.1378679722547531,
0.1128566786646843,
0.0569596104323864,
-0.024919550865888596,
-0.06249445304274559,
-0.07347911596298218,
0.011878926306962967,
-0.020898936316370964,
-0.03921639546751976,
-0.06371613591909409,
0.039049673825502396,
0.06400858610868454,
0.0008476189686916769,
-0.042664673179388046,
-0.025988394394516945,
0.057854436337947845,
0.09185202419757843,
0.1933525949716568,
-0.054198257625103,
-0.007608459331095219,
-0.018709974363446236,
-0.023265069350600243,
0.0698218122124672,
-0.018881788477301598,
0.06548580527305603,
0.0893542468547821,
0.009198429994285107,
0.08262079954147339,
0.06310868263244629,
-0.13026216626167297,
-0.02281312458217144,
0.054424554109573364,
-0.10126281529664993,
-0.13599123060703278,
-0.0268794484436512,
-0.10768752545118332,
-0.1346765160560608,
-0.00024051453510764986,
0.17270870506763458,
-0.03695226460695267,
-0.04585722088813782,
-0.01584392972290516,
0.07958730310201645,
0.01849321462213993,
0.13098934292793274,
0.03332389518618584,
-0.01541226077824831,
-0.061728548258543015,
0.1724783033132553,
0.08854382485151291,
-0.0944463387131691,
0.01059932354837656,
0.01612267456948757,
-0.06006260961294174,
-0.005373345222324133,
-0.06570642441511154,
0.07599630951881409,
-0.028803551569581032,
-0.03920183703303337,
0.001280150143429637,
-0.10020638257265091,
0.05013817176222801,
0.15073417127132416,
0.006806783378124237,
0.1586272120475769,
-0.03795607388019562,
0.06268275529146194,
-0.07470282912254333,
0.07361288368701935,
0.05466834455728531,
0.07673703134059906,
-0.01777864433825016,
0.04886222630739212,
-0.04629047214984894,
-0.0009309409651905298,
-0.014728434383869171,
0.0007548850262537599,
-0.09050557017326355,
-0.055166590958833694,
-0.22356565296649933,
0.0275806225836277,
-0.05674520134925842,
-0.035834841430187225,
0.010781531222164631,
-0.013969546183943748,
0.0037227030843496323,
0.03666013851761818,
-0.02626453898847103,
-0.03227364644408226,
-0.02612549252808094,
0.06267169862985611,
-0.12264937162399292,
0.0266574677079916,
0.06674514710903168,
-0.08740906417369843,
0.0774497240781784,
-0.0006051193922758102,
-0.05368615686893463,
-0.001487476285547018,
0.013623517006635666,
-0.04680715501308441,
-0.030414707958698273,
0.008029164746403694,
-0.05317634344100952,
-0.11049108952283859,
0.02794480323791504,
0.01193102914839983,
-0.026292450726032257,
-0.029006043449044228,
0.07765191048383713,
-0.06529965996742249,
0.05263384431600571,
0.03740197792649269,
0.005038574803620577,
-0.043511368334293365,
-0.01569516956806183,
0.11935487389564514,
0.07711611688137054,
0.0563480444252491,
-0.05213445797562599,
-0.019094940274953842,
-0.155198335647583,
-0.0012617702595889568,
-0.0017954129725694656,
-0.0035218449775129557,
-0.040757257491350174,
-0.03778959438204765,
0.031072836369276047,
0.010671422816812992,
0.1764954924583435,
0.009172968566417694,
0.014474422670900822,
0.009661698713898659,
0.0025450969114899635,
0.00930692721158266,
0.03436658903956413,
0.07170426845550537,
-0.01487726904451847,
-0.07883347570896149,
-0.07916991412639618,
0.03387283906340599,
-0.0307026207447052,
-0.019294511526823044,
0.13494962453842163,
0.1358577013015747,
0.10430295765399933,
0.023333704099059105,
-0.001005983678624034,
-0.030312510207295418,
-0.02909376658499241,
0.02251845970749855,
0.05528787150979042,
0.05301081761717796,
-0.014558820985257626,
0.012858081609010696,
0.06688803434371948,
-0.12502390146255493,
0.12010664492845535,
-0.035497039556503296,
-0.028408829122781754,
-0.10730240494012833,
-0.07865788042545319,
-0.01846797950565815,
-0.020565472543239594,
-0.02035011351108551,
-0.1644114851951599,
0.04778710752725601,
0.10538637638092041,
0.02203982323408127,
-0.03123808279633522,
0.03590790182352066,
-0.14862771332263947,
-0.08966660499572754,
0.0653289407491684,
0.0128223467618227,
0.04393582418560982,
0.10915972292423248,
-0.014922783710062504,
0.07920083403587341,
0.14205074310302734,
0.061206597834825516,
0.055713456124067307,
0.08339271694421768,
0.01014938484877348,
-0.02036699838936329,
-0.039970606565475464,
0.0073873684741556644,
-0.0616111122071743,
0.03838982805609703,
0.16187258064746857,
0.0321146622300148,
-0.05145695060491562,
0.03571769595146179,
0.18239866197109222,
-0.03711051866412163,
-0.05300796404480934,
-0.17840605974197388,
0.21083875000476837,
0.023569749668240547,
0.043150827288627625,
0.0494140088558197,
-0.08906720578670502,
-0.03631027787923813,
0.2067723274230957,
0.10802221298217773,
0.023606160655617714,
-0.022963205352425575,
0.018011612817645073,
-0.010717672295868397,
0.009185407310724258,
0.08316407352685928,
0.005059821996837854,
0.22001682221889496,
-0.03792053088545799,
0.01902875490486622,
0.03287014365196228,
0.03511480987071991,
-0.07303985208272934,
0.15272346138954163,
-0.0482608936727047,
-0.0022144897375255823,
-0.05708131566643715,
0.020952502265572548,
0.01999826915562153,
-0.3052327036857605,
-0.11019494384527206,
0.0024808316957205534,
-0.06259050220251083,
-0.01718100905418396,
-0.02109660394489765,
-0.005822126287966967,
0.05224568024277687,
-0.0042803846299648285,
0.0309852734208107,
0.18389497697353363,
-0.0063982331193983555,
-0.037760112434625626,
-0.04133967310190201,
0.12141302973031998,
0.019873155280947685,
0.13792085647583008,
0.0590578131377697,
-0.015378773212432861,
0.044737301766872406,
0.023874687030911446,
-0.1126551702618599,
-0.0329786092042923,
-0.02668117918074131,
-0.010315986350178719,
-0.022804435342550278,
0.13283950090408325,
0.01900148019194603,
0.04925594478845596,
0.03819432109594345,
-0.043049853295087814,
0.050404567271471024,
0.041469380259513855,
-0.06993242353200912,
-0.05214511975646019,
0.041882533580064774,
-0.09623466432094574,
0.1403016448020935,
0.17618544399738312,
0.012668367475271225,
0.02537059597671032,
-0.05633844807744026,
-0.008624939247965813,
0.00105318205896765,
0.11377891898155212,
-0.004163208417594433,
-0.1575295627117157,
-0.013487250544130802,
-0.08681018650531769,
0.03921342268586159,
-0.22235263884067535,
-0.04554538056254387,
0.10268698632717133,
-0.010142290964722633,
-0.009780910797417164,
0.04907146096229553,
0.005008168052881956,
0.06018735095858574,
-0.016144849359989166,
-0.047703858464956284,
0.009429954923689365,
0.06745973229408264,
-0.08324147015810013,
-0.03277761861681938
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 0k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 0k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_0k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_0k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_0k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_0k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
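Because the intermediate checkpoints share the naming scheme used above (`google/multiberts-seed_1-step_<N>k`), it is easy to compare representations across pre-training steps. The sketch below is illustrative only: it loads the 0k and 1000k checkpoints of seed 1 (two of the released steps) and compares the [CLS] vector of a single sentence.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_1-step_0k")
inputs = tokenizer("MultiBERTs supports robustness analysis.", return_tensors="pt")

# Collect the [CLS] vector from two pre-training steps of the same seed.
cls_vectors = {}
for step in ["0k", "1000k"]:
    model = BertModel.from_pretrained(f"google/multiberts-seed_1-step_{step}")
    with torch.no_grad():
        cls_vectors[step] = model(**inputs).last_hidden_state[:, 0]

similarity = torch.nn.functional.cosine_similarity(cls_vectors["0k"], cls_vectors["1000k"])
print(f"CLS cosine similarity, step 0k vs. step 1000k: {similarity.item():.3f}")
```
The same loop works across seeds (`multiberts-seed_0`, `multiberts-seed_1`, ...) to separate seed-level variation from training-step effects.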
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_0k"]}
| null |
google/multiberts-seed_1-step_0k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_0k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 0k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 0k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 0k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 0k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 0k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07929480075836182,
0.08033829927444458,
-0.002206185134127736,
0.041953615844249725,
0.07443854212760925,
-0.021712934598326683,
0.07117440551519394,
0.09508422017097473,
-0.01613440550863743,
0.023343367502093315,
0.08583514392375946,
0.030204009264707565,
0.008198462426662445,
0.09692179411649704,
0.01804194413125515,
-0.2088909149169922,
0.03291923180222511,
-0.025961417704820633,
-0.08435829728841782,
0.07364071905612946,
0.10259772092103958,
-0.08390545845031738,
0.04010336101055145,
0.030915260314941406,
-0.1095350831747055,
0.05146600678563118,
-0.00961433444172144,
-0.02231108956038952,
0.13569329679012299,
0.001937188091687858,
0.052860110998153687,
0.05708945915102959,
0.041294634342193604,
-0.13918693363666534,
0.004174282308667898,
0.05698736757040024,
0.0519113652408123,
0.036506518721580505,
0.019130825996398926,
0.07595158368349075,
-0.02717907354235649,
0.03119843825697899,
0.05039074271917343,
0.016005516052246094,
-0.0642097070813179,
-0.06877520680427551,
-0.09305747598409653,
0.032312169671058655,
0.02644166722893715,
0.023875553160905838,
0.008730054832994938,
0.12188436836004257,
-0.03511381894350052,
0.04543885216116905,
0.1764058917760849,
-0.31023967266082764,
-0.002414099173620343,
0.061107683926820755,
0.021249977871775627,
0.11361218988895416,
-0.0035016005858778954,
-0.03302444890141487,
0.07826784253120422,
0.022102953866124153,
0.09841562807559967,
-0.03944655880331993,
0.01771126687526703,
-0.06116556376218796,
-0.15354160964488983,
-0.038254059851169586,
0.0981745645403862,
0.0020115526858717203,
-0.139810249209404,
-0.023045113310217857,
-0.043852824717760086,
0.04016376659274101,
0.01945023611187935,
-0.03612542524933815,
0.0439254567027092,
0.0045539215207099915,
-0.006395162083208561,
-0.0037953611463308334,
-0.10300125181674957,
-0.04332046955823898,
0.02495194971561432,
0.09887638688087463,
0.10949413478374481,
0.05782106891274452,
-0.008821344934403896,
0.10707875341176987,
-0.18648852407932281,
-0.049744099378585815,
-0.032134268432855606,
-0.03140733018517494,
-0.042865168303251266,
-0.009443006478250027,
-0.10308331251144409,
-0.04852045327425003,
0.0031520312186330557,
0.13721665740013123,
-0.016217222437262535,
0.032223816961050034,
-0.026190413162112236,
0.005686568561941385,
0.06008046120405197,
0.05256660282611847,
-0.015073057264089584,
0.03163217008113861,
0.035823334008455276,
-0.016379378736019135,
-0.01881439983844757,
0.00799405388534069,
-0.003134367987513542,
0.02970704808831215,
0.13751943409442902,
0.013396735303103924,
-0.10299835354089737,
0.0781247466802597,
-0.014394652098417282,
-0.04607298970222473,
-0.0077249715104699135,
-0.0854487195611,
-0.06228449568152428,
-0.036043353378772736,
-0.015039975754916668,
0.0073464298620820045,
0.00447467714548111,
-0.009561765007674694,
-0.031405672430992126,
-0.021443771198391914,
-0.09074024111032486,
-0.05534033849835396,
-0.05321211740374565,
-0.13539031147956848,
0.008829697035253048,
-0.19257524609565735,
-0.027796916663646698,
-0.11720200628042221,
-0.20206661522388458,
-0.03582627326250076,
0.05119725316762924,
0.0014068916207179427,
-0.06497801095247269,
0.056904539465904236,
0.03023698925971985,
-0.030592821538448334,
-0.0024656299501657486,
0.08123280107975006,
-0.008819649927318096,
0.03721710667014122,
-0.03990110009908676,
0.05500257387757301,
0.0014373654266819358,
0.04318086430430412,
-0.0644206702709198,
0.054371993988752365,
-0.1734187752008438,
0.04211902245879173,
-0.07587870955467224,
-0.03325531259179115,
-0.0882946029305458,
-0.02948571741580963,
-0.002291921991854906,
0.010992856696248055,
0.023999348282814026,
0.07472916692495346,
-0.1688390076160431,
-0.031226880848407745,
0.0913291648030281,
-0.15091362595558167,
-0.025201985612511635,
0.0704447403550148,
-0.05420451611280441,
0.11127626895904541,
0.06519722938537598,
0.16167740523815155,
-0.029042722657322884,
-0.06823401898145676,
0.04433315619826317,
-0.011686451733112335,
0.012615913525223732,
-0.011188267730176449,
0.06785613298416138,
-0.020538944751024246,
-0.16453318297863007,
0.0238355603069067,
-0.13311700522899628,
-0.001741745974868536,
-0.07848396897315979,
0.03140635788440704,
-0.004728281404823065,
-0.06651746481657028,
-0.07893610000610352,
-0.03395577892661095,
0.07764412462711334,
-0.06842505186796188,
-0.028948204591870308,
0.039913978427648544,
0.07484877854585648,
-0.07574322819709778,
0.06919749826192856,
-0.015043019317090511,
0.024365540593862534,
-0.0839974507689476,
-0.03721782937645912,
-0.18468253314495087,
0.029415439814329147,
0.09824120253324509,
0.016605207696557045,
-0.01873861812055111,
0.12927241623401642,
-0.008550583384931087,
0.0645291656255722,
-0.04059724137187004,
-0.003060245420783758,
-0.012539598159492016,
0.0011701815528795123,
-0.09503241628408432,
-0.11128653585910797,
-0.07045739144086838,
-0.06994807720184326,
0.09659082442522049,
-0.11805085092782974,
0.020599491894245148,
-0.05887661129236221,
0.04074017331004143,
0.017787013202905655,
-0.07283718883991241,
-0.010282635688781738,
0.01606551557779312,
-0.06441431492567062,
-0.05726798623800278,
0.03864171728491783,
0.06397896260023117,
-0.023949889466166496,
0.09734182059764862,
-0.05278800427913666,
-0.08786333352327347,
0.028804773464798927,
0.0721224769949913,
-0.10338333249092102,
0.02744394913315773,
-0.04819107800722122,
-0.04669759050011635,
-0.06800193339586258,
-0.02055627852678299,
0.10761763900518417,
-0.011118467897176743,
0.1450139731168747,
-0.07663580030202866,
-0.013090429827570915,
0.011438202112913132,
-0.01667526550590992,
-0.026200244203209877,
0.04692339152097702,
0.07037246227264404,
-0.06986567378044128,
0.023250196129083633,
0.02985529415309429,
-0.005531360860913992,
0.06851062178611755,
-0.05226632580161095,
-0.07746793329715729,
0.020015163347125053,
0.03203113004565239,
0.02318011038005352,
0.0639934316277504,
-0.05175282806158066,
-0.009909807704389095,
0.03167274594306946,
0.02441651001572609,
0.01287532877177,
-0.11847794055938721,
0.06273392587900162,
0.060147639364004135,
0.007049471605569124,
0.05681014806032181,
-0.018125390633940697,
-0.03491457551717758,
0.07938959449529648,
0.032903023064136505,
-0.016883639618754387,
-0.007682492956519127,
-0.009688927792012691,
-0.1248861625790596,
0.21576492488384247,
-0.06746307015419006,
-0.14925740659236908,
-0.06802509725093842,
-0.10920172184705734,
-0.0033760478254407644,
0.022419311106204987,
0.04477657750248909,
-0.02178613655269146,
-0.042146120220422745,
-0.12806767225265503,
0.08844753354787827,
-0.043055444955825806,
0.06396191567182541,
0.11111505329608917,
-0.06541382521390915,
0.049401622265577316,
-0.13034473359584808,
-0.012881627306342125,
-0.07536525279283524,
-0.0659559816122055,
0.058924440294504166,
-0.052436649799346924,
0.03218836709856987,
0.11219032108783722,
0.015376899391412735,
-0.027620794251561165,
-0.031244412064552307,
0.2040337175130844,
0.04467860236763954,
0.039735548198223114,
0.12736032903194427,
-0.07272963225841522,
0.05220648646354675,
0.07981029152870178,
0.0034805615432560444,
-0.044533953070640564,
0.05165829136967659,
0.05893667787313461,
-0.06050731614232063,
-0.19097618758678436,
-0.00873566884547472,
0.007304671686142683,
-0.044923461973667145,
0.06746242940425873,
0.03955860808491707,
0.014998848550021648,
0.07478969544172287,
0.020446838811039925,
0.06491641700267792,
0.004136372357606888,
0.10406995564699173,
0.02728956565260887,
-0.033641714602708817,
0.08691427856683731,
-0.008884917944669724,
-0.002601008862257004,
0.07641606777906418,
-0.018971683457493782,
0.2898021936416626,
-0.04543823376297951,
0.016911720857024193,
0.12961378693580627,
0.031263165175914764,
0.05044108256697655,
0.11796807497739792,
-0.07557154446840286,
0.028684690594673157,
-0.07515552639961243,
-0.04909273236989975,
0.011548102833330631,
0.044072475284338,
-0.07074815034866333,
0.01859869621694088,
-0.0878794714808464,
0.02130468748509884,
-0.02401815354824066,
0.30390819907188416,
0.10993188619613647,
-0.11289355158805847,
-0.06404078751802444,
0.0005342348595149815,
-0.10149747133255005,
-0.0697965994477272,
0.05186247453093529,
0.05021204054355621,
-0.1302887499332428,
0.005753864999860525,
-0.021342698484659195,
0.07676441967487335,
-0.028485335409641266,
0.016852352768182755,
0.036000605672597885,
0.0518030971288681,
-0.044027358293533325,
0.007687069941312075,
-0.18767395615577698,
0.19607633352279663,
-0.00232073781080544,
0.025519639253616333,
-0.05659715086221695,
0.028775662183761597,
0.007135642226785421,
-0.01454014703631401,
0.06133841350674629,
0.019677795469760895,
-0.007580894511193037,
-0.057567209005355835,
-0.04462922737002373,
0.017559492960572243,
0.06361152976751328,
-0.04519932344555855,
0.10724703967571259,
0.004858886357396841,
0.05340293422341347,
0.030229205265641212,
0.09383872896432877,
-0.1872725933790207,
-0.07802827656269073,
0.03000972606241703,
-0.04408145323395729,
-0.10178913176059723,
-0.08025120198726654,
-0.09732755273580551,
-0.004486623220145702,
0.22899775207042694,
-0.11020650714635849,
-0.07338988035917282,
-0.09367545694112778,
0.04197775945067406,
0.09320074319839478,
-0.055103953927755356,
0.0237704049795866,
-0.01483254786580801,
0.1128150075674057,
-0.06748936325311661,
-0.12379322946071625,
0.025997310876846313,
-0.09936642646789551,
-0.16091519594192505,
-0.06645707041025162,
0.08757096529006958,
0.06565959751605988,
0.029596572741866112,
-0.03278830274939537,
0.012101751752197742,
0.039131730794906616,
-0.03565104305744171,
-0.0005313385627232492,
0.06837986409664154,
0.093061663210392,
0.03660561144351959,
-0.10605954378843307,
0.018422365188598633,
-0.07120757550001144,
-0.06970901787281036,
0.07237229496240616,
0.27125728130340576,
-0.04915262386202812,
0.11285535246133804,
0.1232559084892273,
-0.08732175827026367,
-0.15923039615154266,
0.04091601073741913,
0.09268631786108017,
-0.016352593898773193,
-0.0020958485547453165,
-0.16219569742679596,
0.09883379191160202,
0.11397691071033478,
-0.01568685658276081,
0.0023225205950438976,
-0.20375283062458038,
-0.13669073581695557,
0.08788301795721054,
0.11484212428331375,
0.28372862935066223,
-0.052814215421676636,
-0.03539624437689781,
0.022848691791296005,
-0.08977717906236649,
0.013306908309459686,
0.12769798934459686,
0.06232481077313423,
-0.020763929933309555,
-0.06912670284509659,
0.01062709093093872,
-0.03666685149073601,
0.09065204858779907,
0.06425253301858902,
0.06792639195919037,
-0.0018297795904800296,
-0.0020935810171067715,
-0.0361672006547451,
-0.044209711253643036,
0.06966425478458405,
0.03297950699925423,
0.0512210987508297,
-0.08798012882471085,
-0.03497450426220894,
-0.07570995390415192,
0.033988095819950104,
-0.030285775661468506,
-0.0769634023308754,
-0.05807682126760483,
0.07503180205821991,
0.059716396033763885,
-0.0324995182454586,
0.03956623747944832,
0.028008710592985153,
0.10072996467351913,
0.15166150033473969,
-0.0028617382049560547,
-0.04516538232564926,
-0.06466176360845566,
-0.030471323058009148,
-0.014949554577469826,
0.07638164609670639,
-0.043009985238313675,
0.019873127341270447,
0.07143711298704147,
0.020633582025766373,
0.10643154382705688,
0.06233702972531319,
-0.11975309252738953,
-0.019679013639688492,
0.02264552004635334,
-0.15481442213058472,
0.0123975221067667,
0.000962108199018985,
0.01855934038758278,
-0.019646774977445602,
0.028187260031700134,
0.1550140082836151,
-0.07151644676923752,
-0.0314970500767231,
-0.0461820513010025,
0.05973469465970993,
0.03132065385580063,
0.1412106305360794,
0.03669239208102226,
0.037044916301965714,
-0.07575333118438721,
0.147517591714859,
0.04394218325614929,
-0.04254762455821037,
0.030674299225211143,
-0.034444451332092285,
-0.10710567981004715,
0.012539333663880825,
0.0652736946940422,
0.0762530118227005,
-0.07196496427059174,
-0.01803283952176571,
-0.040085382759571075,
-0.07822488248348236,
0.07092256098985672,
0.21868611872196198,
0.06143851950764656,
0.06939002126455307,
-0.05396643280982971,
-0.03508540987968445,
-0.0816592127084732,
0.05106568709015846,
0.051124460995197296,
0.07774635404348373,
-0.07557392865419388,
0.10099846124649048,
0.012487177737057209,
0.05014203116297722,
-0.0267103910446167,
-0.050012264400720596,
-0.10761500895023346,
-0.05607966333627701,
-0.10944721847772598,
0.016926076263189316,
-0.06566330045461655,
-0.04330669343471527,
0.010517469607293606,
-0.0017863449174910784,
-0.0023599108681082726,
0.05658382922410965,
-0.06143384799361229,
-0.009910020977258682,
-0.01342513132840395,
0.03762069344520569,
-0.06409958750009537,
-0.05370362102985382,
0.016419880092144012,
-0.09711360186338425,
0.10131308436393738,
0.05191370099782944,
0.008560745045542717,
-0.00020021219097543508,
0.07310374826192856,
-0.009262729436159134,
0.02153804339468479,
0.007536712102591991,
-0.04162903502583504,
-0.1028476431965828,
0.005321566015481949,
-0.02531866729259491,
-0.03333178535103798,
-0.018135864287614822,
0.09382657706737518,
-0.08055086433887482,
0.025764387100934982,
0.0018657995387911797,
-0.00495484285056591,
-0.07798609882593155,
-0.004576532170176506,
0.09396491199731827,
0.08172135800123215,
0.05163785442709923,
-0.0846908688545227,
0.0174272321164608,
-0.12111525982618332,
-0.034971144050359726,
0.014806680381298065,
-0.01357788685709238,
-0.13212278485298157,
-0.012412157841026783,
0.01763179898262024,
-0.015080373734235764,
0.18545979261398315,
-0.056135885417461395,
-0.024617241695523262,
0.019838539883494377,
-0.09996630996465683,
0.1069040521979332,
-0.01908925361931324,
0.1680002361536026,
-0.02832374908030033,
-0.03618448227643967,
-0.013468815945088863,
0.04543447867035866,
0.02789231576025486,
-0.0060316212475299835,
0.187196746468544,
0.13320183753967285,
0.03850540146231651,
0.05776398628950119,
-0.029164046049118042,
-0.010999028570950031,
-0.06389246135950089,
-0.017608314752578735,
0.04151967912912369,
0.04136372730135918,
0.022090401500463486,
0.15072090923786163,
0.06192290037870407,
-0.15949012339115143,
0.035277239978313446,
-0.024889418855309486,
-0.04159138724207878,
-0.11730765551328659,
-0.1094178780913353,
-0.032509442418813705,
-0.062145933508872986,
0.014448408037424088,
-0.13190427422523499,
0.004296118393540382,
0.17285877466201782,
0.06783300638198853,
0.029464518651366234,
0.013687890022993088,
-0.13006198406219482,
-0.0432501919567585,
0.058096133172512054,
0.01205207034945488,
0.01962556131184101,
0.039500147104263306,
-0.00048456466174684465,
0.06619108468294144,
0.031057842075824738,
0.003336680121719837,
-0.0020013165194541216,
0.08588123321533203,
0.025775203481316566,
0.046976398676633835,
-0.058099616318941116,
0.0010375858983024955,
-0.04115457832813263,
0.08293946087360382,
0.11726988852024078,
0.04490744322538376,
-0.05087912082672119,
-0.010437172837555408,
0.15994790196418762,
-0.031332992017269135,
0.004805235657840967,
-0.1192261129617691,
0.3319634199142456,
0.012741461396217346,
0.011053809896111488,
0.05731986090540886,
-0.07953793555498123,
-0.04754301533102989,
0.2048032432794571,
0.06771605461835861,
-0.02107136696577072,
-0.026024244725704193,
0.007247522938996553,
-0.03202606737613678,
-0.01684315875172615,
0.13890570402145386,
0.0395955964922905,
0.12006625533103943,
-0.05743644759058952,
-0.04990624636411667,
-0.033469922840595245,
0.0044105034321546555,
-0.11320125311613083,
0.14524473249912262,
-0.020752863958477974,
-0.02372724935412407,
-0.07918623834848404,
0.00748337060213089,
0.06751354038715363,
-0.3402578830718994,
0.009173402562737465,
-0.0245408583432436,
-0.10440840572118759,
-0.013702291063964367,
-0.02771623246371746,
-0.02821851335465908,
0.04873763397336006,
-0.040598489344120026,
0.06436590105295181,
0.050108402967453,
0.03773433342576027,
-0.018417486920952797,
-0.09664516896009445,
0.1595877856016159,
0.06573527306318283,
0.10262703895568848,
0.017972368746995926,
0.08723568916320801,
0.06322438269853592,
0.039008330553770065,
-0.09303212910890579,
0.056158848106861115,
0.00810017716139555,
-0.07056489586830139,
-0.056268468499183655,
0.1161884218454361,
-0.0010391839314252138,
0.0689530298113823,
0.03424597531557083,
-0.12276646494865417,
0.025575418025255203,
0.0805082619190216,
-0.0908697172999382,
-0.09069477766752243,
-0.005837124772369862,
-0.09865321218967438,
0.1547432243824005,
0.1452273428440094,
-0.011619871482253075,
0.012126090936362743,
-0.06320361793041229,
-0.006272251717746258,
0.05048409104347229,
0.00802042055875063,
-0.012922560796141624,
-0.18326541781425476,
0.05586765334010124,
-0.0854625478386879,
-0.001958632143214345,
-0.21554267406463623,
-0.09924221783876419,
-0.009120681323111057,
-0.05490340664982796,
-0.026237528771162033,
0.05545491725206375,
0.02091926336288452,
0.07492692023515701,
-0.02837047353386879,
-0.025769086554646492,
-0.03974725306034088,
0.09307349473237991,
-0.10186547040939331,
-0.07500753551721573
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1000k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1000k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
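If you only need sentence-level features rather than the raw model outputs, the snippet below is a minimal PyTorch sketch that pulls the final-layer `[CLS]` vector out of this checkpoint. It only extends the example above; the variable names are illustrative and not part of any official MultiBERTs tooling.

```
import torch
from transformers import BertTokenizer, BertModel

# Minimal sketch: extract a 768-dimensional [CLS] representation from this checkpoint.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1000k')
model = BertModel.from_pretrained('google/multiberts-seed_1-step_1000k')
model.eval()

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# last_hidden_state has shape (batch_size, sequence_length, 768); index 0 is [CLS].
cls_embedding = output.last_hidden_state[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768])
```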
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1000k"]}
| null |
google/multiberts-seed_1-step_1000k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1000k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1000k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07994934171438217,
0.07930172979831696,
-0.0018916161498054862,
0.0431777685880661,
0.07764764875173569,
-0.02077607996761799,
0.0674685463309288,
0.09239069372415543,
-0.01134570688009262,
0.025075789541006088,
0.08596278727054596,
0.029543930664658546,
0.0121200792491436,
0.10171320289373398,
0.02224908024072647,
-0.2139291912317276,
0.03423481807112694,
-0.02588782086968422,
-0.0839155912399292,
0.07325194031000137,
0.10272078961133957,
-0.08731973171234131,
0.042451657354831696,
0.029544208198785782,
-0.10900487750768661,
0.05110117048025131,
-0.011778227984905243,
-0.026279250159859657,
0.13743631541728973,
-0.004008909687399864,
0.05328400433063507,
0.05472587049007416,
0.05018423870205879,
-0.13493387401103973,
0.0032721078023314476,
0.05615261197090149,
0.05493612959980965,
0.0365641750395298,
0.01711210608482361,
0.08369498699903488,
-0.031812965869903564,
0.028319330886006355,
0.05180590972304344,
0.013618328608572483,
-0.06414283066987991,
-0.06727135926485062,
-0.09856834262609482,
0.03518230840563774,
0.030246293172240257,
0.019635770469903946,
0.00690924609079957,
0.11806929856538773,
-0.03356058523058891,
0.04259227216243744,
0.17387762665748596,
-0.30569931864738464,
-0.0006417225813493133,
0.061703428626060486,
0.02720843441784382,
0.11410632729530334,
-0.003602491458877921,
-0.034243080765008926,
0.08097166568040848,
0.020847264677286148,
0.09462281316518784,
-0.04052684083580971,
0.010145924985408783,
-0.061239972710609436,
-0.15672728419303894,
-0.037561580538749695,
0.09784284234046936,
0.0032019943464547396,
-0.1399020403623581,
-0.021867895498871803,
-0.042063940316438675,
0.03315158560872078,
0.02103240229189396,
-0.041179001331329346,
0.04353411868214607,
0.0008867864962667227,
-0.004840330220758915,
-0.0014052207116037607,
-0.10108327120542526,
-0.04592709615826607,
0.022679727524518967,
0.0943104699254036,
0.11101216822862625,
0.057841211557388306,
-0.0061571234837174416,
0.10992076247930527,
-0.19275762140750885,
-0.04706732928752899,
-0.028989659622311592,
-0.03274843841791153,
-0.04469425231218338,
-0.011022813618183136,
-0.0999327078461647,
-0.04437466338276863,
-0.0006275902269408107,
0.13601326942443848,
-0.005856538191437721,
0.030311699956655502,
-0.018105093389749527,
0.005295055918395519,
0.06088234856724739,
0.05649661645293236,
-0.022867001593112946,
0.019677886739373207,
0.03697627782821655,
-0.013862273655831814,
-0.02067045122385025,
0.009548035450279713,
-0.004539842717349529,
0.025663334876298904,
0.13675351440906525,
0.014236129820346832,
-0.10635833442211151,
0.08153428882360458,
-0.013594788499176502,
-0.042269352823495865,
-0.011181904934346676,
-0.087134949862957,
-0.06250204145908356,
-0.04103868082165718,
-0.018396053463220596,
0.0036022858694195747,
0.0018066085176542401,
-0.00832504965364933,
-0.028880096971988678,
-0.019225193187594414,
-0.09020121395587921,
-0.060220278799533844,
-0.05638409033417702,
-0.13894422352313995,
0.00905649084597826,
-0.20602983236312866,
-0.027392631396651268,
-0.11562926322221756,
-0.20481489598751068,
-0.04036886245012283,
0.04452886804938316,
0.005698053166270256,
-0.06618504971265793,
0.05424458160996437,
0.030713960528373718,
-0.032086655497550964,
-0.0034115025773644447,
0.0879785418510437,
-0.0060064285062253475,
0.036600418388843536,
-0.04267997294664383,
0.06096504256129265,
0.0032329403329640627,
0.04228949919342995,
-0.06252113729715347,
0.05768759548664093,
-0.16915901005268097,
0.0402802973985672,
-0.07408375293016434,
-0.03001241758465767,
-0.08683637529611588,
-0.030875978991389275,
-0.004590071737766266,
0.010700291022658348,
0.026658205315470695,
0.07219347357749939,
-0.17633222043514252,
-0.02745131216943264,
0.09208410978317261,
-0.14882278442382812,
-0.027864890173077583,
0.07112329453229904,
-0.055548109114170074,
0.12180328369140625,
0.0711008682847023,
0.15738838911056519,
-0.028148498386144638,
-0.06485435366630554,
0.041126105934381485,
-0.014159474521875381,
0.009491370990872383,
-0.011334416456520557,
0.06687875837087631,
-0.02003837190568447,
-0.1597341001033783,
0.023714704439044,
-0.13063335418701172,
-0.0026328631211072206,
-0.0779535248875618,
0.03059440292418003,
-0.001743205706588924,
-0.0698847696185112,
-0.07881711423397064,
-0.03585416078567505,
0.07501807808876038,
-0.06920131295919418,
-0.024385910481214523,
0.03308499976992607,
0.07500319927930832,
-0.0732482373714447,
0.06748244911432266,
-0.015741372480988503,
0.02225348725914955,
-0.08201324939727783,
-0.03523009642958641,
-0.18287163972854614,
0.03528861328959465,
0.10090314596891403,
0.010176701471209526,
-0.018061937764286995,
0.12296457588672638,
-0.009280026890337467,
0.06509748101234436,
-0.0383642353117466,
-0.004955074284225702,
-0.01136216800659895,
0.002291414886713028,
-0.09808550775051117,
-0.10482487082481384,
-0.07238443940877914,
-0.0703209787607193,
0.0889919102191925,
-0.11314021795988083,
0.022148245945572853,
-0.05501563102006912,
0.037439968436956406,
0.016856567934155464,
-0.06987976282835007,
-0.010894101113080978,
0.01666083000600338,
-0.06267454475164413,
-0.05970679223537445,
0.03527161478996277,
0.05960603803396225,
-0.023805974051356316,
0.09541760385036469,
-0.05006731301546097,
-0.09686744213104248,
0.028722381219267845,
0.07761211693286896,
-0.10634062439203262,
0.03028266131877899,
-0.04667174071073532,
-0.047110144048929214,
-0.06916391849517822,
-0.029266489669680595,
0.10824575275182724,
-0.011563079431653023,
0.14014509320259094,
-0.07837550342082977,
-0.014377478510141373,
0.011668996885418892,
-0.013725891709327698,
-0.023717911913990974,
0.04536018520593643,
0.07337069511413574,
-0.07132647186517715,
0.022200847044587135,
0.03095891699194908,
-0.004509005229920149,
0.06475719809532166,
-0.05221065133810043,
-0.07699004560709,
0.020029626786708832,
0.029476024210453033,
0.02161549963057041,
0.06237322837114334,
-0.053745005279779434,
-0.015307837165892124,
0.029073962941765785,
0.02230406366288662,
0.010488673113286495,
-0.12017422914505005,
0.06301682442426682,
0.05938896909356117,
0.009688476100564003,
0.048984967172145844,
-0.018546730279922485,
-0.03289106488227844,
0.08081496506929398,
0.03292115405201912,
-0.013989441096782684,
-0.010499696247279644,
-0.011181474663317204,
-0.12295001745223999,
0.2172536700963974,
-0.06681108474731445,
-0.14449100196361542,
-0.06662865728139877,
-0.10838381946086884,
0.005114980041980743,
0.025063909590244293,
0.04091745615005493,
-0.026171062141656876,
-0.04215613007545471,
-0.12541764974594116,
0.09648061543703079,
-0.03655525669455528,
0.0681639090180397,
0.11031979322433472,
-0.0636206567287445,
0.039491258561611176,
-0.1314861923456192,
-0.012284014374017715,
-0.07611799240112305,
-0.06610780209302902,
0.05575380101799965,
-0.05010636895895004,
0.037906911224126816,
0.11436775326728821,
0.014275194145739079,
-0.028539901599287987,
-0.03292622044682503,
0.20423603057861328,
0.04165461286902428,
0.04492330923676491,
0.12522496283054352,
-0.07386377453804016,
0.053082745522260666,
0.08390937000513077,
0.003054668428376317,
-0.04492098093032837,
0.05345390737056732,
0.054203081876039505,
-0.06062301993370056,
-0.1889544129371643,
-0.005829416215419769,
0.010080491192638874,
-0.0466080978512764,
0.06899119168519974,
0.035962171852588654,
0.005867178086191416,
0.08155614882707596,
0.0190868079662323,
0.06982896476984024,
0.002772743348032236,
0.10023537278175354,
0.02001919411122799,
-0.038190264254808426,
0.08302941918373108,
-0.008655210956931114,
-0.010133815929293633,
0.07514092326164246,
-0.015714962035417557,
0.2984783351421356,
-0.04344261810183525,
0.01892326958477497,
0.1281115859746933,
0.032164763659238815,
0.0496971495449543,
0.12814579904079437,
-0.07551170885562897,
0.029848527163267136,
-0.07587496936321259,
-0.04623262584209442,
0.016525505110621452,
0.04476837441325188,
-0.0717010349035263,
0.02185153402388096,
-0.08678465336561203,
0.022480756044387817,
-0.0256278607994318,
0.2978953719139099,
0.10835929214954376,
-0.11516755074262619,
-0.059056416153907776,
0.0021123280748724937,
-0.1005249172449112,
-0.07324802130460739,
0.0500505231320858,
0.054809775203466415,
-0.13463807106018066,
0.00356406532227993,
-0.019955340772867203,
0.0755903422832489,
-0.031764496117830276,
0.013200937770307064,
0.03623597323894501,
0.05382782220840454,
-0.04592723399400711,
0.008669445291161537,
-0.17884571850299835,
0.19788885116577148,
-0.0030691931024193764,
0.02799520641565323,
-0.05896306782960892,
0.02880115434527397,
0.008035351522266865,
-0.019059201702475548,
0.06421690434217453,
0.018751300871372223,
-0.013905001804232597,
-0.06961244344711304,
-0.042818281799554825,
0.014534213580191135,
0.0689382553100586,
-0.04700525104999542,
0.10961353778839111,
0.007451309822499752,
0.05543496087193489,
0.028778288513422012,
0.08966374397277832,
-0.18446052074432373,
-0.07804374396800995,
0.02738260291516781,
-0.04484616965055466,
-0.0949726551771164,
-0.080438993871212,
-0.09688737243413925,
-0.006548340432345867,
0.21515485644340515,
-0.11775805056095123,
-0.07438836246728897,
-0.08984087407588959,
0.041441451758146286,
0.10153452306985855,
-0.05668865889310837,
0.02685728296637535,
-0.009958614595234394,
0.10991153120994568,
-0.07112742960453033,
-0.11839810758829117,
0.0232832208275795,
-0.09704849123954773,
-0.1629306524991989,
-0.06751476228237152,
0.08897436410188675,
0.0655367448925972,
0.02949514612555504,
-0.029532892629504204,
0.014159953221678734,
0.03904343396425247,
-0.037525516003370285,
-0.0024120090529322624,
0.06596800684928894,
0.09219293296337128,
0.038166821002960205,
-0.11047649383544922,
0.021238179877400398,
-0.07593647390604019,
-0.06979619711637497,
0.07532325387001038,
0.27043578028678894,
-0.051121994853019714,
0.10779839009046555,
0.1275176852941513,
-0.08954308927059174,
-0.16321834921836853,
0.043493300676345825,
0.09402507543563843,
-0.013950078748166561,
-0.005700788460671902,
-0.15739880502223969,
0.10229695588350296,
0.11249302327632904,
-0.013878200203180313,
0.006838746834546328,
-0.20621950924396515,
-0.13928858935832977,
0.09034600108861923,
0.11478929221630096,
0.27590587735176086,
-0.049699872732162476,
-0.035022784024477005,
0.018595468252897263,
-0.08446216583251953,
0.01264899130910635,
0.13074220716953278,
0.06296268105506897,
-0.018347542732954025,
-0.0635799691081047,
0.010860639624297619,
-0.037607744336128235,
0.09009290486574173,
0.0680132806301117,
0.06954066455364227,
-0.0060231308452785015,
-0.012342863716185093,
-0.026806384325027466,
-0.04207512363791466,
0.07288172096014023,
0.04088109731674194,
0.04852782189846039,
-0.0835203304886818,
-0.03610718250274658,
-0.07422682642936707,
0.03403886780142784,
-0.030012458562850952,
-0.07861560583114624,
-0.06340716034173965,
0.07637085020542145,
0.05938124284148216,
-0.03566502034664154,
0.026714982464909554,
0.03641287609934807,
0.09309827536344528,
0.13919691741466522,
-0.0004585260176099837,
-0.044499289244413376,
-0.05947466939687729,
-0.025348689407110214,
-0.015015936456620693,
0.07507345825433731,
-0.035918064415454865,
0.015415958128869534,
0.07270630449056625,
0.020974144339561462,
0.10578010231256485,
0.06221022084355354,
-0.12330491095781326,
-0.01863834820687771,
0.02488824725151062,
-0.15170279145240784,
0.01044789794832468,
0.0028247153386473656,
0.01433202438056469,
-0.023089665919542313,
0.026028186082839966,
0.14507491886615753,
-0.07080427557229996,
-0.033585842698812485,
-0.050085265189409256,
0.06117943301796913,
0.03427654877305031,
0.15118460357189178,
0.04053834453225136,
0.03768429532647133,
-0.0765022337436676,
0.146754652261734,
0.04537156969308853,
-0.045845381915569305,
0.02882928028702736,
-0.03158731758594513,
-0.10885120928287506,
0.016290470957756042,
0.06520578265190125,
0.07341502606868744,
-0.06580391526222229,
-0.017095569521188736,
-0.03681878373026848,
-0.07888063043355942,
0.06775134801864624,
0.2209816873073578,
0.06152819097042084,
0.06769576668739319,
-0.05266694352030754,
-0.033677633851766586,
-0.08107608556747437,
0.052877556532621384,
0.05596397817134857,
0.07830657064914703,
-0.0747450739145279,
0.10999124497175217,
0.011694638058543205,
0.05114021897315979,
-0.025809284299612045,
-0.04678597301244736,
-0.10317191481590271,
-0.05637405812740326,
-0.0979136973619461,
0.01698288507759571,
-0.06358356773853302,
-0.043173130601644516,
0.006355862133204937,
-0.0028074735309928656,
-0.0026028284337371588,
0.053298138082027435,
-0.061031606048345566,
-0.012585791759192944,
-0.014139684848487377,
0.035321932286024094,
-0.06072993576526642,
-0.05576068535447121,
0.020968571305274963,
-0.09471660107374191,
0.09900133311748505,
0.045124053955078125,
0.011480221524834633,
0.00177010556217283,
0.07629283517599106,
-0.008015870116651058,
0.02432272769510746,
0.009957206435501575,
-0.041868727654218674,
-0.10364468395709991,
0.006407257169485092,
-0.02800614759325981,
-0.031062504276633263,
-0.018465127795934677,
0.09035691618919373,
-0.08054418116807938,
0.02478138729929924,
0.0013167866272851825,
-0.0032196633983403444,
-0.07906175404787064,
-0.004297235049307346,
0.09605927765369415,
0.08187727630138397,
0.05482400208711624,
-0.08264409750699997,
0.017119497060775757,
-0.12106244266033173,
-0.034567877650260925,
0.012027017772197723,
-0.01690743677318096,
-0.1297190636396408,
-0.010877729393541813,
0.01990881748497486,
-0.013743733987212181,
0.1905379295349121,
-0.05677858740091324,
-0.029661979526281357,
0.017437227070331573,
-0.08834778517484665,
0.10139630734920502,
-0.017703289166092873,
0.17176954448223114,
-0.026763340458273888,
-0.038719262927770615,
-0.02270304597914219,
0.048069264739751816,
0.026421712711453438,
-0.009560652077198029,
0.18431833386421204,
0.1308080106973648,
0.04469674080610275,
0.058577027171850204,
-0.023493872955441475,
-0.0070583210326731205,
-0.05303439870476723,
-0.021068209782242775,
0.0397777259349823,
0.03406044840812683,
0.020649876445531845,
0.15871861577033997,
0.05734196677803993,
-0.1597428172826767,
0.03598857671022415,
-0.02328755147755146,
-0.04370935633778572,
-0.11583833396434784,
-0.11288691312074661,
-0.032223667949438095,
-0.052898257970809937,
0.015353700146079063,
-0.1320541352033615,
0.0026945234276354313,
0.1712370216846466,
0.06521518528461456,
0.027719365432858467,
0.019558828324079514,
-0.1394764482975006,
-0.044938281178474426,
0.05956299602985382,
0.01514145452529192,
0.020123952999711037,
0.042949870228767395,
-0.004565821960568428,
0.06699488312005997,
0.028746241703629494,
0.003949643578380346,
-0.0037917871959507465,
0.07649330049753189,
0.021694470196962357,
0.046022482216358185,
-0.053899139165878296,
-0.0004077684716321528,
-0.04387835040688515,
0.08368595689535141,
0.11005355417728424,
0.04419054463505745,
-0.051882304251194,
-0.010891001671552658,
0.15470589697360992,
-0.028674034401774406,
0.0016916567692533135,
-0.1171213835477829,
0.31694522500038147,
0.016031799837946892,
0.010889320634305477,
0.05943474546074867,
-0.0818590596318245,
-0.045864880084991455,
0.20791488885879517,
0.06940941512584686,
-0.01733243092894554,
-0.029469670727849007,
0.0036742878146469593,
-0.031802352517843246,
-0.01794276013970375,
0.1419605016708374,
0.04051254317164421,
0.11595378816127777,
-0.056516729295253754,
-0.04398113116621971,
-0.03311024233698845,
0.002906147390604019,
-0.11198555678129196,
0.14877156913280487,
-0.01976913772523403,
-0.02148004062473774,
-0.0818411260843277,
0.011138378642499447,
0.07274844497442245,
-0.3415448069572449,
0.00333911320194602,
-0.02828942984342575,
-0.10366467386484146,
-0.013755558989942074,
-0.031044337898492813,
-0.025121571496129036,
0.048175666481256485,
-0.036778416484594345,
0.06784916669130325,
0.04260323941707611,
0.03959294781088829,
-0.02294325828552246,
-0.100822314620018,
0.15917600691318512,
0.06544353812932968,
0.10497274249792099,
0.018100611865520477,
0.090370312333107,
0.06305942684412003,
0.03634260222315788,
-0.09456948190927505,
0.05715573951601982,
0.009735221974551678,
-0.06172331050038338,
-0.05353758856654167,
0.1170923188328743,
-0.0007780017331242561,
0.056932516396045685,
0.03095983900129795,
-0.1205512285232544,
0.028749875724315643,
0.07643593847751617,
-0.08794917911291122,
-0.0995154082775116,
-0.0007151241879910231,
-0.09929065406322479,
0.15948863327503204,
0.1433824598789215,
-0.012708565220236778,
0.01748031936585903,
-0.06443565338850021,
-0.012766564264893532,
0.050787292420864105,
0.003266578074544668,
-0.012987961992621422,
-0.18239916861057281,
0.04962824285030365,
-0.0919351577758789,
-0.0016186618013307452,
-0.20995739102363586,
-0.09905809164047241,
-0.014156035147607327,
-0.056619662791490555,
-0.02487492561340332,
0.05669129639863968,
0.02375906892120838,
0.07473412156105042,
-0.026276694610714912,
-0.031047526746988297,
-0.03928801044821739,
0.09435994178056717,
-0.10635554790496826,
-0.07608668506145477
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_100k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_100k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
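Because the intermediate checkpoints are meant for studying the pre-training procedure, one natural use is to compare how a sentence representation changes between two snapshots of the same seed. The sketch below is purely illustrative and assumes only the checkpoint names shown in this collection (step 100k and step 1000k of seed 1); the `cls_embedding` helper is hypothetical, not part of the released MultiBERTs statistical library.

```
import torch
from transformers import BertTokenizer, BertModel

# Minimal sketch: compare [CLS] representations of the same sentence at two
# checkpoints of seed 1 (step 100k vs. step 1000k). Illustrative only; the
# official MultiBERTs statistical library is separate from this snippet.
def cls_embedding(checkpoint, text):
    tokenizer = BertTokenizer.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint)
    model.eval()
    inputs = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        return model(**inputs).last_hidden_state[:, 0, :]

text = "Replace me by any text you'd like."
early = cls_embedding('google/multiberts-seed_1-step_100k', text)
late = cls_embedding('google/multiberts-seed_1-step_1000k', text)

# Cosine similarity between the two training snapshots of the same seed.
print(torch.nn.functional.cosine_similarity(early, late).item())
```

Repeating this comparison across seeds and steps is one simple way to probe how stable the learned representations are over the course of pre-training.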
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_100k"]}
| null |
google/multiberts-seed_1-step_100k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_100k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 100k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07999464869499207,
0.07482122629880905,
-0.0020871374290436506,
0.04057157412171364,
0.07549110800027847,
-0.021492676809430122,
0.06914833933115005,
0.09237349778413773,
-0.009028399363160133,
0.025016983970999718,
0.08417929708957672,
0.032941482961177826,
0.01238064095377922,
0.10089506208896637,
0.021149003878235817,
-0.21124513447284698,
0.03363269567489624,
-0.025394095107913017,
-0.08299662172794342,
0.07505745440721512,
0.10272233188152313,
-0.08614790439605713,
0.041680674999952316,
0.02993643470108509,
-0.10961861163377762,
0.05034187063574791,
-0.010718224570155144,
-0.02458799257874489,
0.1378670036792755,
-0.0022050216794013977,
0.053755857050418854,
0.05365462973713875,
0.04696014150977135,
-0.13972391188144684,
0.004394369199872017,
0.05645948275923729,
0.051499318331480026,
0.03759143501520157,
0.01651008427143097,
0.0812465101480484,
-0.028949100524187088,
0.03241472691297531,
0.05257366970181465,
0.01676257699728012,
-0.06588006019592285,
-0.06654437631368637,
-0.09942618012428284,
0.030719347298145294,
0.03095167875289917,
0.02074035257101059,
0.0067087095230817795,
0.12018439918756485,
-0.034201305359601974,
0.04490714892745018,
0.1821655035018921,
-0.31135186553001404,
-0.001738959806971252,
0.06654975563287735,
0.02739696577191353,
0.11311082541942596,
-0.0054483902640640736,
-0.03307347372174263,
0.07951733469963074,
0.023169592022895813,
0.09845534712076187,
-0.04008030518889427,
0.017070923000574112,
-0.058892104774713516,
-0.1567871868610382,
-0.0389377735555172,
0.09559362381696701,
0.005127929151058197,
-0.14058798551559448,
-0.02486046589910984,
-0.04490258917212486,
0.03207550570368767,
0.0181477852165699,
-0.04198591783642769,
0.045203667134046555,
0.003454063320532441,
-0.006756632588803768,
-0.003310884116217494,
-0.10270321369171143,
-0.0455973781645298,
0.023003658279776573,
0.09649047255516052,
0.10877242684364319,
0.0567103773355484,
-0.003499697893857956,
0.10973052680492401,
-0.1854795515537262,
-0.047529857605695724,
-0.030251918360590935,
-0.03355766832828522,
-0.0458245687186718,
-0.01053190790116787,
-0.10367434471845627,
-0.050457846373319626,
0.0007042577490210533,
0.13899189233779907,
-0.011194254271686077,
0.030425095930695534,
-0.021677877753973007,
0.007235974073410034,
0.061218999326229095,
0.05845607817173004,
-0.021442873403429985,
0.021769993007183075,
0.03423980250954628,
-0.014414358884096146,
-0.0170432161539793,
0.00911862775683403,
-0.005405339878052473,
0.027505196630954742,
0.13838160037994385,
0.014967035502195358,
-0.10843034088611603,
0.077507384121418,
-0.01666354201734066,
-0.04444599896669388,
-0.009788893163204193,
-0.0871131643652916,
-0.06397450715303421,
-0.04163670539855957,
-0.015144799835979939,
0.0043003917671740055,
0.0027716471813619137,
-0.008014488033950329,
-0.027889657765626907,
-0.017994001507759094,
-0.08911369740962982,
-0.05655227228999138,
-0.05569913238286972,
-0.13634401559829712,
0.008531262166798115,
-0.20199115574359894,
-0.028387950733304024,
-0.11497648805379868,
-0.20250995457172394,
-0.040130339562892914,
0.04685287922620773,
0.002608149079605937,
-0.06407231837511063,
0.05609157681465149,
0.03155166283249855,
-0.0318596176803112,
-0.0030114869587123394,
0.08321965485811234,
-0.006812209729105234,
0.03799762576818466,
-0.04258228838443756,
0.0605141818523407,
0.0070277247577905655,
0.042089350521564484,
-0.061491768807172775,
0.05684995278716087,
-0.17299595475196838,
0.041265927255153656,
-0.07345321029424667,
-0.028457148000597954,
-0.08608239144086838,
-0.02984531596302986,
-0.007827221415936947,
0.012699377723038197,
0.02440015971660614,
0.07398384064435959,
-0.1746281385421753,
-0.029864195734262466,
0.0970393717288971,
-0.14868852496147156,
-0.030361106619238853,
0.07200710475444794,
-0.05611198768019676,
0.11845755577087402,
0.06890814006328583,
0.15915687382221222,
-0.03770609572529793,
-0.07285354286432266,
0.04298241436481476,
-0.011509567499160767,
0.01131397858262062,
-0.009953620843589306,
0.06595815718173981,
-0.021626519039273262,
-0.15809962153434753,
0.023299885913729668,
-0.13059386610984802,
0.0016425334615632892,
-0.07750922441482544,
0.02978932112455368,
-0.003062731819227338,
-0.07032017409801483,
-0.07927028834819794,
-0.035024382174015045,
0.07675948739051819,
-0.06978658586740494,
-0.0267112385481596,
0.03420289605855942,
0.07511390745639801,
-0.07312549650669098,
0.06804210692644119,
-0.017626116052269936,
0.02312558889389038,
-0.08122773468494415,
-0.035697370767593384,
-0.1845199167728424,
0.035465970635414124,
0.09947449713945389,
0.01632268913090229,
-0.020531298592686653,
0.12584993243217468,
-0.0069383191876113415,
0.06416954845190048,
-0.038959573954343796,
-0.005112074315547943,
-0.012875733897089958,
0.0007192165357992053,
-0.09746643155813217,
-0.10538368672132492,
-0.07345791906118393,
-0.07080210000276566,
0.0911385715007782,
-0.11126506328582764,
0.022460734471678734,
-0.05833005905151367,
0.04165874049067497,
0.018229441717267036,
-0.07147456705570221,
-0.009732214733958244,
0.015220457687973976,
-0.06344235688447952,
-0.058396484702825546,
0.035169970244169235,
0.06224273890256882,
-0.023608746007084846,
0.09549969434738159,
-0.04993044212460518,
-0.08826757967472076,
0.02967716194689274,
0.07344268262386322,
-0.10688133537769318,
0.023948686197400093,
-0.046936746686697006,
-0.04511796683073044,
-0.0681048259139061,
-0.02888984978199005,
0.10173245519399643,
-0.009203724563121796,
0.1439194679260254,
-0.07682628184556961,
-0.015993893146514893,
0.010116779245436192,
-0.013556316494941711,
-0.023898541927337646,
0.043556369841098785,
0.06971956789493561,
-0.069455087184906,
0.02329741045832634,
0.03312661871314049,
-0.0018578466260805726,
0.06571706384420395,
-0.052767470479011536,
-0.07830839604139328,
0.0189653430134058,
0.031153950840234756,
0.02121702767908573,
0.0626300722360611,
-0.05311752110719681,
-0.01476350985467434,
0.028890812769532204,
0.02057141251862049,
0.008286340162158012,
-0.11782713234424591,
0.061234503984451294,
0.06058811768889427,
0.007653108332306147,
0.05887468159198761,
-0.018002288416028023,
-0.03497808426618576,
0.07916882634162903,
0.03218773752450943,
-0.016236787661910057,
-0.010303796268999577,
-0.012130496092140675,
-0.12147695571184158,
0.2165623903274536,
-0.06645971536636353,
-0.14732028543949127,
-0.06774555891752243,
-0.11306865513324738,
0.004937976598739624,
0.025603685528039932,
0.04242826998233795,
-0.026791760697960854,
-0.041256215423345566,
-0.12614843249320984,
0.0955439954996109,
-0.03815325349569321,
0.06515149772167206,
0.11153765767812729,
-0.06383948773145676,
0.04117393121123314,
-0.13233834505081177,
-0.01261878851801157,
-0.07802369445562363,
-0.06548237800598145,
0.05661151558160782,
-0.051909275352954865,
0.03755027428269386,
0.11409101635217667,
0.01818936876952648,
-0.026815447956323624,
-0.03238669037818909,
0.20917268097400665,
0.04020898789167404,
0.04241262376308441,
0.12689076364040375,
-0.07159982621669769,
0.053066615015268326,
0.08018133044242859,
0.005074121057987213,
-0.045762479305267334,
0.05235524848103523,
0.05575263127684593,
-0.05969887226819992,
-0.19071485102176666,
-0.006782416254281998,
0.009517022408545017,
-0.04751642048358917,
0.06788598001003265,
0.03824838623404503,
0.015882788226008415,
0.07987222820520401,
0.016962328925728798,
0.06692496687173843,
0.002120555378496647,
0.10238245129585266,
0.025836700573563576,
-0.03865760564804077,
0.08597564697265625,
-0.0072760502807796,
-0.010886646807193756,
0.07485773414373398,
-0.016309646889567375,
0.2941688001155853,
-0.0415540486574173,
0.022569268941879272,
0.12621723115444183,
0.03227609023451805,
0.04962149262428284,
0.12454992532730103,
-0.07537555694580078,
0.028111325576901436,
-0.07703331857919693,
-0.04694680869579315,
0.013171658851206303,
0.04552421718835831,
-0.07236579805612564,
0.01802382804453373,
-0.08327209949493408,
0.017948519438505173,
-0.02376744896173477,
0.3074468970298767,
0.10566919296979904,
-0.11129967123270035,
-0.05926131457090378,
0.0020297919400036335,
-0.10173990577459335,
-0.07193189114332199,
0.05017473176121712,
0.05147498846054077,
-0.1350865662097931,
0.002515350002795458,
-0.022542046383023262,
0.07600194215774536,
-0.03052544593811035,
0.01458055805414915,
0.03439989686012268,
0.05163431167602539,
-0.04532862827181816,
0.006462974473834038,
-0.18234163522720337,
0.1946057379245758,
-0.0023272503167390823,
0.026451710611581802,
-0.05581315606832504,
0.028958680108189583,
0.009295021183788776,
-0.017288433387875557,
0.06516975909471512,
0.01783628575503826,
-0.008875876665115356,
-0.06736976653337479,
-0.04304135590791702,
0.01257176510989666,
0.06693458557128906,
-0.04681821167469025,
0.1090288907289505,
0.005825947970151901,
0.05359862372279167,
0.02931174635887146,
0.08821828663349152,
-0.18298059701919556,
-0.0792173370718956,
0.026487277820706367,
-0.04806780442595482,
-0.09360497444868088,
-0.08086308091878891,
-0.09725207090377808,
-0.006357778795063496,
0.21944570541381836,
-0.11537425220012665,
-0.074379101395607,
-0.09103575348854065,
0.03976370394229889,
0.09868815541267395,
-0.054605983197689056,
0.026115909218788147,
-0.011027541942894459,
0.11398665606975555,
-0.07126695662736893,
-0.11965128034353256,
0.025058714672923088,
-0.0989777147769928,
-0.16087928414344788,
-0.06760741025209427,
0.09039365500211716,
0.0652918741106987,
0.02986985445022583,
-0.029945414513349533,
0.014106511138379574,
0.03710433095693588,
-0.03698326647281647,
0.0006702145328745246,
0.06533920019865036,
0.09268850088119507,
0.038763731718063354,
-0.11129124462604523,
0.02740306593477726,
-0.07550887763500214,
-0.06909888982772827,
0.07601013034582138,
0.2696473002433777,
-0.051560238003730774,
0.11107077449560165,
0.1257745772600174,
-0.08767452836036682,
-0.15910972654819489,
0.03862280398607254,
0.09327732771635056,
-0.014419293031096458,
0.0019216396613046527,
-0.1626189798116684,
0.10128282755613327,
0.11539764702320099,
-0.015650149434804916,
0.007897459901869297,
-0.20606468617916107,
-0.1364421546459198,
0.09315936267375946,
0.11511485278606415,
0.280168741941452,
-0.05381026491522789,
-0.036873385310173035,
0.018938498571515083,
-0.0885440856218338,
0.009002155624330044,
0.12704889476299286,
0.06343244016170502,
-0.019336862489581108,
-0.06769131869077682,
0.01154678501188755,
-0.03663875535130501,
0.0896066203713417,
0.06417350471019745,
0.07027934491634369,
-0.005449507851153612,
-0.006721458863466978,
-0.030296819284558296,
-0.041951969265937805,
0.07224038988351822,
0.03669266775250435,
0.04733783006668091,
-0.08130887150764465,
-0.037284452468156815,
-0.07366426289081573,
0.033123381435871124,
-0.029756657779216766,
-0.07926660776138306,
-0.06388214975595474,
0.07575038075447083,
0.05926705524325371,
-0.03312613442540169,
0.03277391567826271,
0.03341495990753174,
0.09902584552764893,
0.1446443647146225,
-0.0019995118491351604,
-0.04720791429281235,
-0.06772488355636597,
-0.028099428862333298,
-0.014766128733754158,
0.07470361143350601,
-0.03401874750852585,
0.015443308278918266,
0.06973856687545776,
0.019691988825798035,
0.10784908384084702,
0.061742883175611496,
-0.12386047095060349,
-0.018966885283589363,
0.024257009848952293,
-0.15113846957683563,
0.00818388070911169,
0.0018048594938591123,
0.01562785729765892,
-0.02250586822628975,
0.02689865231513977,
0.1485774666070938,
-0.06788436323404312,
-0.033337291330099106,
-0.04889991134405136,
0.061644669622182846,
0.034885186702013016,
0.14616729319095612,
0.04025520384311676,
0.03717244416475296,
-0.07651893049478531,
0.14275947213172913,
0.04361627250909805,
-0.039788175374269485,
0.028919454663991928,
-0.03360852599143982,
-0.10706919431686401,
0.015326529741287231,
0.06025432422757149,
0.07534128427505493,
-0.07303645461797714,
-0.015340293757617474,
-0.03990674763917923,
-0.07893498986959457,
0.06734849512577057,
0.21986915171146393,
0.06383703649044037,
0.07001595199108124,
-0.0533529631793499,
-0.03359958156943321,
-0.07698741555213928,
0.050009746104478836,
0.05202648043632507,
0.07823873311281204,
-0.07512691617012024,
0.10350124537944794,
0.012759643606841564,
0.05258555710315704,
-0.026744680479168892,
-0.04979507252573967,
-0.10382870584726334,
-0.05513766035437584,
-0.1057286262512207,
0.014802418649196625,
-0.06467409431934357,
-0.043532032519578934,
0.007247942965477705,
-0.0017684268532320857,
-0.0017044685082510114,
0.05391490459442139,
-0.06139705330133438,
-0.011024006642401218,
-0.014444881118834019,
0.033443208783864975,
-0.06299819052219391,
-0.05320756137371063,
0.019593961536884308,
-0.09773912280797958,
0.0984225943684578,
0.045730993151664734,
0.011357945390045643,
0.0023833506274968386,
0.07006324827671051,
-0.00726692657917738,
0.024709032848477364,
0.007548253983259201,
-0.04169616103172302,
-0.10032395273447037,
0.005066613666713238,
-0.02739153616130352,
-0.03244180977344513,
-0.01907249540090561,
0.08936680853366852,
-0.08187875896692276,
0.025059841573238373,
0.0012480542063713074,
-0.004739722702652216,
-0.08029897511005402,
-0.004815550986677408,
0.09562160819768906,
0.08427577465772629,
0.05385345593094826,
-0.08234481513500214,
0.017259009182453156,
-0.1245829313993454,
-0.034144509583711624,
0.012864317744970322,
-0.014010291546583176,
-0.12937401235103607,
-0.013354026712477207,
0.01870388351380825,
-0.011828066781163216,
0.1957150250673294,
-0.055660560727119446,
-0.028563102707266808,
0.01804548129439354,
-0.09558270871639252,
0.10830122232437134,
-0.02041354961693287,
0.16930119693279266,
-0.025677939876914024,
-0.03788880258798599,
-0.02032400295138359,
0.047276098281145096,
0.027390295639634132,
-0.011128325946629047,
0.1816294938325882,
0.13171052932739258,
0.04042891040444374,
0.058976396918296814,
-0.025770900771021843,
-0.00822046585381031,
-0.05266132578253746,
-0.022132188081741333,
0.042088549584150314,
0.032798949629068375,
0.02215653471648693,
0.1640007048845291,
0.06124311313033104,
-0.16117876768112183,
0.0371863879263401,
-0.022457607090473175,
-0.04148015379905701,
-0.11644665896892548,
-0.104913629591465,
-0.032032839953899384,
-0.06011538952589035,
0.015308264642953873,
-0.1324775069952011,
0.0029829891864210367,
0.1743696630001068,
0.06605516374111176,
0.028552671894431114,
0.01768428646028042,
-0.1323552131652832,
-0.044016364961862564,
0.05988955870270729,
0.013605534099042416,
0.019557399675250053,
0.04054734855890274,
-0.00386644434183836,
0.06647375971078873,
0.02905382215976715,
0.004566865507513285,
-0.0034987761173397303,
0.07860298454761505,
0.023148607462644577,
0.04559861496090889,
-0.05588443949818611,
0.0014121552230790257,
-0.04117193818092346,
0.08152734488248825,
0.11515887826681137,
0.045541539788246155,
-0.050994209945201874,
-0.011574100703001022,
0.1571018248796463,
-0.029665358364582062,
0.004590762313455343,
-0.11899891495704651,
0.3149765133857727,
0.017612263560295105,
0.011572644114494324,
0.05630221590399742,
-0.08131883293390274,
-0.04544590413570404,
0.20719552040100098,
0.0684264600276947,
-0.020309505984187126,
-0.02810388244688511,
0.004889986012130976,
-0.031334277242422104,
-0.017193326726555824,
0.13869045674800873,
0.04284016788005829,
0.12290749698877335,
-0.05756419524550438,
-0.04316500201821327,
-0.03365790471434593,
-0.00034820984001271427,
-0.11451632529497147,
0.1453014761209488,
-0.019265299662947655,
-0.020714394748210907,
-0.0804489254951477,
0.01243595965206623,
0.07371590286493301,
-0.3424880802631378,
0.0071692331694066525,
-0.026186680421233177,
-0.10208086669445038,
-0.013451491482555866,
-0.03250187635421753,
-0.025197206065058708,
0.04900490865111351,
-0.03710683807730675,
0.06437966972589493,
0.046984441578388214,
0.0386686846613884,
-0.022657370194792747,
-0.10154537111520767,
0.16180478036403656,
0.0615144819021225,
0.10480348765850067,
0.01762278564274311,
0.08753303438425064,
0.06250498443841934,
0.03612378239631653,
-0.08970992267131805,
0.05895479395985603,
0.009465393610298634,
-0.0657036229968071,
-0.052854519337415695,
0.1161738932132721,
0.0006313083576969802,
0.05875292420387268,
0.030732188373804092,
-0.12103701382875443,
0.029155371710658073,
0.07632409036159515,
-0.08918482810258865,
-0.09863432496786118,
-0.00179688457865268,
-0.09943227469921112,
0.15876126289367676,
0.14612172544002533,
-0.012196791358292103,
0.016621671617031097,
-0.06431776285171509,
-0.0109420670196414,
0.05185588821768761,
0.0009079432929866016,
-0.013549601659178734,
-0.18123561143875122,
0.052618153393268585,
-0.08151604980230331,
-0.0037828085478395224,
-0.21236194670200348,
-0.10099543631076813,
-0.01152025442570448,
-0.05471906065940857,
-0.02414131909608841,
0.05429094657301903,
0.02406381070613861,
0.07575848698616028,
-0.025483177974820137,
-0.02062837779521942,
-0.03802180662751198,
0.09347587078809738,
-0.10401642322540283,
-0.07382262498140335
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
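
As a quick illustration of the MLM objective, the sketch below queries the checkpoint through the `fill-mask` pipeline. This is only a sketch: it assumes the hosted weights include the masked-language-modelling head used during pre-training; if they do not, `transformers` will initialise that head randomly (watch the loading warnings) and the predictions will not be meaningful.

```
from transformers import pipeline

# Illustrative only: assumes this export ships the MLM head from pre-training.
unmasker = pipeline("fill-mask", model="google/multiberts-seed_1-step_1100k")
print(unmasker("The capital of France is [MASK]."))
```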
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1100k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1100k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
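If you only need representations, the returned object exposes the token-level hidden states and the pooled `[CLS]` vector. A minimal, self-contained PyTorch sketch (assuming a recent `transformers` release, where models return output objects by default):

```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1100k')
model = BertModel.from_pretrained('google/multiberts-seed_1-step_1100k')
with torch.no_grad():
    output = model(**tokenizer("Replace me by any text you'd like.", return_tensors='pt'))

last_hidden = output.last_hidden_state   # (batch, seq_len, 768) contextual token embeddings
pooled = output.pooler_output            # (batch, 768) tanh-projected [CLS] representation
print(last_hidden.shape, pooled.shape)
```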
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1100k"]}
| null |
google/multiberts-seed_1-step_1100k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1100k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1100k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08241526782512665,
0.07643144577741623,
-0.001969005912542343,
0.04650863632559776,
0.07971427589654922,
-0.01918955147266388,
0.06744355708360672,
0.09156174212694168,
-0.010447178967297077,
0.024830153211951256,
0.08297200500965118,
0.02822842262685299,
0.010722769424319267,
0.10172192007303238,
0.017804423347115517,
-0.2132255882024765,
0.0319315604865551,
-0.026123171672225,
-0.07773423939943314,
0.07371338456869125,
0.1028822734951973,
-0.08450111746788025,
0.04169381409883499,
0.03180016204714775,
-0.10936283320188522,
0.05120638385415077,
-0.011782119050621986,
-0.02429538406431675,
0.13762596249580383,
-0.0015127405058592558,
0.05592031031847,
0.05327574908733368,
0.047843195497989655,
-0.1371656060218811,
0.004210000857710838,
0.055614057928323746,
0.052384328097105026,
0.03878428414463997,
0.01927974820137024,
0.08286485821008682,
-0.02422831580042839,
0.03438280150294304,
0.052415911108255386,
0.014743911102414131,
-0.0651327446103096,
-0.06590837240219116,
-0.10141825675964355,
0.03524291515350342,
0.03177279978990555,
0.01843150332570076,
0.008446924388408661,
0.12058328092098236,
-0.030342651531100273,
0.04516730457544327,
0.18193143606185913,
-0.31110680103302,
-0.0024116523563861847,
0.05654264986515045,
0.024347273632884026,
0.114125095307827,
-0.0029314691200852394,
-0.03546382486820221,
0.08251082897186279,
0.02346349135041237,
0.09344588965177536,
-0.03926829621195793,
0.006818382069468498,
-0.06044980511069298,
-0.1577477604150772,
-0.03701670840382576,
0.10041895508766174,
0.0033257121685892344,
-0.13938720524311066,
-0.018858972936868668,
-0.045070260763168335,
0.03136511147022247,
0.020417481660842896,
-0.041734471917152405,
0.04394950345158577,
0.0014586931793019176,
0.004772081505507231,
-0.002901082392781973,
-0.10301841795444489,
-0.043628040701150894,
0.02095704898238182,
0.09566299617290497,
0.11018996685743332,
0.05753804370760918,
-0.0059983693063259125,
0.11094116419553757,
-0.1852623075246811,
-0.04764498770236969,
-0.028579527512192726,
-0.03417283296585083,
-0.04268251731991768,
-0.012580649927258492,
-0.0999576523900032,
-0.045059315860271454,
-0.003891220549121499,
0.13273009657859802,
-0.010422981344163418,
0.0307362861931324,
-0.024800632148981094,
0.006834432948380709,
0.05941196531057358,
0.05398094281554222,
-0.020975107327103615,
0.020899178460240364,
0.03689386323094368,
-0.012786086648702621,
-0.017899349331855774,
0.010112997144460678,
-0.0029441595543175936,
0.026079552248120308,
0.1372547447681427,
0.016097331419587135,
-0.10361582040786743,
0.07751216739416122,
-0.017422400414943695,
-0.04370800778269768,
-0.0008245792705565691,
-0.0882304459810257,
-0.06427399814128876,
-0.04164951667189598,
-0.018978970125317574,
0.004320724867284298,
0.0092140082269907,
-0.008717164397239685,
-0.029816171154379845,
-0.01989636942744255,
-0.09191671758890152,
-0.05651777982711792,
-0.05438489466905594,
-0.13723507523536682,
0.008916448801755905,
-0.1957038789987564,
-0.029744429513812065,
-0.11471030861139297,
-0.2064981311559677,
-0.03977036476135254,
0.04802853614091873,
0.005140981636941433,
-0.06369982659816742,
0.056595172733068466,
0.03281364217400551,
-0.030644509941339493,
-0.004305676091462374,
0.08417083323001862,
-0.006137589458376169,
0.038200002163648605,
-0.042876988649368286,
0.05867422744631767,
0.0014625291805714369,
0.04399361088871956,
-0.0614950954914093,
0.05682733282446861,
-0.17634700238704681,
0.03750072047114372,
-0.074839286506176,
-0.02694150246679783,
-0.08590063452720642,
-0.03078867495059967,
-0.002742390614002943,
0.011198141612112522,
0.025142963975667953,
0.0744147002696991,
-0.16734348237514496,
-0.026364339515566826,
0.08796001225709915,
-0.15259060263633728,
-0.029759841039776802,
0.0716155394911766,
-0.05557713657617569,
0.1185198575258255,
0.06935548037290573,
0.15794475376605988,
-0.030322859063744545,
-0.07001949846744537,
0.04224175214767456,
-0.014142034575343132,
0.008496330119669437,
-0.01179652102291584,
0.06646215915679932,
-0.020096387714147568,
-0.16282013058662415,
0.023392003029584885,
-0.1327381730079651,
0.0007679372793063521,
-0.0775434747338295,
0.030151616781949997,
-0.0038225206080824137,
-0.06913040578365326,
-0.08113772422075272,
-0.03222886845469475,
0.07523982226848602,
-0.07016398012638092,
-0.026717761531472206,
0.03635053709149361,
0.07807788252830505,
-0.07194136828184128,
0.06937216967344284,
-0.016466446220874786,
0.0199611634016037,
-0.08304165303707123,
-0.03850054740905762,
-0.1845823973417282,
0.03442412614822388,
0.09693033248186111,
0.01631931960582733,
-0.021383382380008698,
0.12217897176742554,
-0.008305731229484081,
0.06562408804893494,
-0.03935367241501808,
-0.003634128486737609,
-0.009827482514083385,
0.000028597623895620927,
-0.09734387695789337,
-0.10882877558469772,
-0.07489357143640518,
-0.07125560194253922,
0.09071426093578339,
-0.11575999110937119,
0.022297328338027,
-0.059795428067445755,
0.04008181020617485,
0.017673401162028313,
-0.07035564631223679,
-0.009845489636063576,
0.014548568986356258,
-0.06527914106845856,
-0.059286586940288544,
0.03667841851711273,
0.061265476047992706,
-0.018511749804019928,
0.09605621546506882,
-0.052184414118528366,
-0.08907110244035721,
0.028980504721403122,
0.07813015580177307,
-0.10513462126255035,
0.022981960326433182,
-0.04711934179067612,
-0.048549771308898926,
-0.06591739505529404,
-0.028499212116003036,
0.10355188697576523,
-0.012868869118392467,
0.143279030919075,
-0.07749047875404358,
-0.011974718421697617,
0.009664604440331459,
-0.013079671189188957,
-0.025575362145900726,
0.04501764848828316,
0.07054757326841354,
-0.07692218571901321,
0.022370709106326103,
0.027202827855944633,
-0.001178691047243774,
0.06499748677015305,
-0.049964986741542816,
-0.07798267900943756,
0.021532155573368073,
0.03304434195160866,
0.02330445498228073,
0.06281457841396332,
-0.05456012487411499,
-0.013760640285909176,
0.029649818316102028,
0.022124100476503372,
0.012434618547558784,
-0.11954065412282944,
0.06157524138689041,
0.059019070118665695,
0.010633982717990875,
0.051899030804634094,
-0.02017582207918167,
-0.03288278356194496,
0.08189643919467926,
0.02938905917108059,
-0.016378361731767654,
-0.010900500230491161,
-0.011284198611974716,
-0.12335702776908875,
0.216288760304451,
-0.06826835125684738,
-0.14402028918266296,
-0.06891733407974243,
-0.11638186126947403,
0.007383185438811779,
0.024294134229421616,
0.04404290020465851,
-0.025200167670845985,
-0.04330919310450554,
-0.12569348514080048,
0.09666421264410019,
-0.035557642579078674,
0.0661022737622261,
0.1132780984044075,
-0.06482581049203873,
0.040469229221343994,
-0.1336984932422638,
-0.011443600989878178,
-0.0768735483288765,
-0.062376510351896286,
0.054435960948467255,
-0.05377853289246559,
0.03697100281715393,
0.1173156201839447,
0.01544271968305111,
-0.02921692281961441,
-0.031757600605487823,
0.19946150481700897,
0.042744554579257965,
0.043371208012104034,
0.126578226685524,
-0.07458420842885971,
0.05462152510881424,
0.08085522055625916,
0.0028045009821653366,
-0.04538819193840027,
0.05485325679183006,
0.051482852548360825,
-0.05969779193401337,
-0.19671158492565155,
-0.0089933006092906,
0.009395587258040905,
-0.04423200711607933,
0.06927233189344406,
0.03683555871248245,
0.0045042564161121845,
0.07955408841371536,
0.015978030860424042,
0.06859306991100311,
0.0024798396043479443,
0.10082115978002548,
0.023915041238069534,
-0.037903595715761185,
0.08218865096569061,
-0.010384401306509972,
-0.005830483976751566,
0.07711136341094971,
-0.015840405598282814,
0.3002748191356659,
-0.04774294048547745,
0.0195939764380455,
0.12662328779697418,
0.03138392046093941,
0.04732239618897438,
0.12793003022670746,
-0.07752620428800583,
0.02960682287812233,
-0.07605192810297012,
-0.0467023104429245,
0.015029826201498508,
0.046628646552562714,
-0.07420329004526138,
0.022393429651856422,
-0.08616369962692261,
0.025930121541023254,
-0.025741295889019966,
0.29946115612983704,
0.10371977090835571,
-0.11496637016534805,
-0.05876396596431732,
0.002002609660848975,
-0.10164328664541245,
-0.07404661178588867,
0.050721246749162674,
0.054016657173633575,
-0.13196425139904022,
0.003911817912012339,
-0.021853037178516388,
0.07508544623851776,
-0.028674153611063957,
0.012014231644570827,
0.03815709426999092,
0.05363691225647926,
-0.04603397101163864,
0.006670323200523853,
-0.17826411128044128,
0.1993916779756546,
-0.002384637715294957,
0.025999488309025764,
-0.05639130622148514,
0.030449554324150085,
0.01340058445930481,
-0.015430250205099583,
0.06454119086265564,
0.018089085817337036,
-0.0054223970510065556,
-0.06090034916996956,
-0.043532419949769974,
0.013130477629601955,
0.06441733241081238,
-0.04117906466126442,
0.10756117850542068,
0.006559509318321943,
0.054771486669778824,
0.029688900336623192,
0.08401047438383102,
-0.1851966679096222,
-0.07908406108617783,
0.0281843151897192,
-0.0485331229865551,
-0.10098835825920105,
-0.08113839477300644,
-0.10033883154392242,
-0.01060864981263876,
0.21874657273292542,
-0.10723841935396194,
-0.07497343420982361,
-0.09012703597545624,
0.039510369300842285,
0.09872551262378693,
-0.05558978393673897,
0.027214450761675835,
-0.009244884364306927,
0.10932204127311707,
-0.07186296582221985,
-0.11844247579574585,
0.027699073776602745,
-0.09710068255662918,
-0.16033540666103363,
-0.07015221565961838,
0.08736217021942139,
0.06624796986579895,
0.02936721220612526,
-0.03211553767323494,
0.011988119222223759,
0.03849368169903755,
-0.03619933873414993,
-0.0040565114468336105,
0.06377004832029343,
0.09129942953586578,
0.04176131635904312,
-0.11137548834085464,
0.013693943619728088,
-0.07433486729860306,
-0.06965020298957825,
0.07342781871557236,
0.27340003848075867,
-0.05037315934896469,
0.11015434563159943,
0.122169628739357,
-0.08975774049758911,
-0.1598420888185501,
0.044891953468322754,
0.09388391673564911,
-0.014741622842848301,
0.002250619465485215,
-0.15914751589298248,
0.10376803576946259,
0.11600442975759506,
-0.016289738938212395,
0.014107945375144482,
-0.20180149376392365,
-0.13609665632247925,
0.09206435084342957,
0.11717937141656876,
0.2771857678890228,
-0.05371272563934326,
-0.03509071096777916,
0.02155201882123947,
-0.0993143767118454,
0.005918756127357483,
0.12845535576343536,
0.062105897814035416,
-0.020694630220532417,
-0.06856811791658401,
0.010925191454589367,
-0.037409380078315735,
0.09231335669755936,
0.0653907060623169,
0.07024198770523071,
-0.006037148181349039,
-0.013024031184613705,
-0.02732986956834793,
-0.04367060586810112,
0.07446452230215073,
0.03940211609005928,
0.04928262531757355,
-0.08868606388568878,
-0.03659338876605034,
-0.07410722970962524,
0.034947723150253296,
-0.03080003149807453,
-0.07684291899204254,
-0.06092654541134834,
0.07569945603609085,
0.058412276208400726,
-0.033047329634428024,
0.03514654561877251,
0.034677654504776,
0.09663859009742737,
0.1437266618013382,
-0.00045239945757202804,
-0.044750846922397614,
-0.05795181170105934,
-0.029288295656442642,
-0.014902852475643158,
0.07438933104276657,
-0.04437725991010666,
0.016408821567893028,
0.07222617417573929,
0.021236876025795937,
0.1059412956237793,
0.061050016433000565,
-0.12258534878492355,
-0.015947412699460983,
0.026856882497668266,
-0.1510610431432724,
0.015179493464529514,
0.0015656792093068361,
0.020607411861419678,
-0.02317768707871437,
0.02807462401688099,
0.1465430110692978,
-0.06934592872858047,
-0.03414780646562576,
-0.04781392961740494,
0.06162162497639656,
0.034295421093702316,
0.14684906601905823,
0.04054378345608711,
0.03679511696100235,
-0.07716337591409683,
0.1473700851202011,
0.04608744755387306,
-0.0443543903529644,
0.02725137770175934,
-0.028365587815642357,
-0.10704847425222397,
0.01467959489673376,
0.06064018979668617,
0.07544804364442825,
-0.06889773160219193,
-0.01667202264070511,
-0.04016619548201561,
-0.07749933004379272,
0.06861281394958496,
0.2211432009935379,
0.06335419416427612,
0.0678538978099823,
-0.05304952710866928,
-0.0331026129424572,
-0.08000198006629944,
0.05076191946864128,
0.054086241871118546,
0.08011750131845474,
-0.07782124727964401,
0.10403995960950851,
0.012475790455937386,
0.05122241750359535,
-0.025608526542782784,
-0.04776960611343384,
-0.1001501977443695,
-0.05631770193576813,
-0.1047072634100914,
0.017149316146969795,
-0.06628360599279404,
-0.041009433567523956,
0.007247678004205227,
-0.0024458589032292366,
-0.004682093858718872,
0.056408949196338654,
-0.06214031204581261,
-0.01166827604174614,
-0.01327818725258112,
0.03548269718885422,
-0.061124689877033234,
-0.05523868650197983,
0.019907591864466667,
-0.0960073247551918,
0.09976338595151901,
0.04591673985123634,
0.010628213174641132,
0.0016824972117319703,
0.0731966644525528,
-0.009477735497057438,
0.02440713532269001,
0.008028466254472733,
-0.04012240841984749,
-0.10130622237920761,
0.005525261629372835,
-0.02747415006160736,
-0.03063100203871727,
-0.019650431349873543,
0.09452931582927704,
-0.08200032263994217,
0.02573084644973278,
0.0005202337051741779,
-0.003268833737820387,
-0.07852534204721451,
-0.005868913140147924,
0.0959034413099289,
0.08023177832365036,
0.0539238415658474,
-0.0810355618596077,
0.015623949468135834,
-0.12523382902145386,
-0.03492884710431099,
0.011137713678181171,
-0.015134095214307308,
-0.1281076967716217,
-0.010020852088928223,
0.01749182678759098,
-0.012198813259601593,
0.1898687183856964,
-0.05945156142115593,
-0.03216349333524704,
0.01961507648229599,
-0.09218484163284302,
0.09820408374071121,
-0.021223120391368866,
0.16823948919773102,
-0.03175070136785507,
-0.0361165814101696,
-0.016099410131573677,
0.04665194824337959,
0.029220756143331528,
-0.008639647625386715,
0.18735693395137787,
0.13289017975330353,
0.0460517480969429,
0.06152724474668503,
-0.026314688846468925,
-0.010717005468904972,
-0.059956617653369904,
-0.026142319664359093,
0.04436454176902771,
0.034581802785396576,
0.024429121986031532,
0.15463656187057495,
0.062405865639448166,
-0.15937809646129608,
0.033877182751894,
-0.024482326582074165,
-0.04142696037888527,
-0.11581484973430634,
-0.09992444515228271,
-0.030295832082629204,
-0.058384597301483154,
0.0162540003657341,
-0.13269084692001343,
0.004099628888070583,
0.17998386919498444,
0.06525759398937225,
0.028992772102355957,
0.018022362142801285,
-0.1355949342250824,
-0.04534661024808884,
0.058630019426345825,
0.013140279799699783,
0.020553676411509514,
0.04071156308054924,
-0.005258595570921898,
0.06636609882116318,
0.026430383324623108,
0.00328938290476799,
-0.004888076335191727,
0.07904572039842606,
0.02389790490269661,
0.047479432076215744,
-0.05676468461751938,
-0.00013742681767325848,
-0.04081559181213379,
0.08308853209018707,
0.11644157022237778,
0.04500317946076393,
-0.047586590051651,
-0.010784690268337727,
0.15728428959846497,
-0.02937117964029312,
0.006429897155612707,
-0.11869824677705765,
0.3219047784805298,
0.0172746442258358,
0.010523507371544838,
0.05895932391285896,
-0.08107560873031616,
-0.044386930763721466,
0.20190130174160004,
0.06603871285915375,
-0.01917651854455471,
-0.029478447511792183,
0.00456300238147378,
-0.03175707906484604,
-0.020850256085395813,
0.14033475518226624,
0.04514045640826225,
0.1155598983168602,
-0.05653364956378937,
-0.0480673648416996,
-0.03516611084342003,
0.004249077755957842,
-0.11145800352096558,
0.1421765238046646,
-0.0215468630194664,
-0.01970668137073517,
-0.07799561321735382,
0.011283322237432003,
0.07400236278772354,
-0.3450377881526947,
0.005799949634820223,
-0.02851138263940811,
-0.10692188888788223,
-0.013526775874197483,
-0.024675143882632256,
-0.024406014010310173,
0.04914979264140129,
-0.03786269575357437,
0.0656154602766037,
0.04862948879599571,
0.0394674576818943,
-0.02472119964659214,
-0.10407689213752747,
0.16022634506225586,
0.05572888255119324,
0.10907360166311264,
0.017221832647919655,
0.08748379349708557,
0.06141394376754761,
0.03795522823929787,
-0.09535973519086838,
0.053791891783475876,
0.009533461183309555,
-0.06531749665737152,
-0.05327119678258896,
0.11391511559486389,
-0.0016517688054591417,
0.06387335062026978,
0.030897803604602814,
-0.12207228690385818,
0.025455797091126442,
0.08346632868051529,
-0.08765352517366409,
-0.09775923937559128,
-0.0032267642673105,
-0.09874675422906876,
0.15927237272262573,
0.14446738362312317,
-0.011828652583062649,
0.014944836497306824,
-0.063095323741436,
-0.011442676186561584,
0.05225872993469238,
0.01141907088458538,
-0.012234553694725037,
-0.1833423376083374,
0.0525670163333416,
-0.08853824436664581,
-0.00007597511284984648,
-0.21307332813739777,
-0.09890579432249069,
-0.014498316682875156,
-0.05985004082322121,
-0.02728395350277424,
0.05604993551969528,
0.026627788320183754,
0.07683946937322617,
-0.025772079825401306,
-0.02497628703713417,
-0.03914421796798706,
0.09557584673166275,
-0.10596141219139099,
-0.07571263611316681
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1200k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1200k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
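Because MultiBERTs also releases intermediate checkpoints, a natural use is to probe how representations change over pre-training. The sketch below compares the pooled `[CLS]` vectors of the seed-1 checkpoints at steps 1100k and 1200k; the repository names follow the pattern used above, the step list is illustrative rather than exhaustive, and cosine similarity is just one possible probe.

```
import torch
from transformers import BertTokenizer, BertModel

text = "Replace me by any text you'd like."
pooled = []
for step in ["1100k", "1200k"]:  # illustrative subset of the saved intermediate steps
    name = f"google/multiberts-seed_1-step_{step}"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    with torch.no_grad():
        out = model(**tokenizer(text, return_tensors='pt'))
    pooled.append(out.pooler_output[0])

print(torch.nn.functional.cosine_similarity(pooled[0], pooled[1], dim=0))
```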
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1200k"]}
| null |
google/multiberts-seed_1-step_1200k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1200k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1200k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08231304585933685,
0.07940673828125,
-0.0019767240155488253,
0.04784074425697327,
0.07841599732637405,
-0.02021612972021103,
0.06589312851428986,
0.09267817437648773,
-0.02213454246520996,
0.02421426586806774,
0.08215344697237015,
0.023722201585769653,
0.011042237281799316,
0.09476779401302338,
0.020803222432732582,
-0.21585366129875183,
0.03365998715162277,
-0.02709886245429516,
-0.09032157063484192,
0.0749252438545227,
0.10203727334737778,
-0.08084018528461456,
0.04163483902812004,
0.030691325664520264,
-0.11316822469234467,
0.05312712490558624,
-0.009734234772622585,
-0.023137349635362625,
0.1393480896949768,
0.0014171533985063434,
0.0549488365650177,
0.05599965527653694,
0.04770313948392868,
-0.13409647345542908,
0.005291415378451347,
0.054897140711545944,
0.053096283227205276,
0.03822098672389984,
0.018459530547261238,
0.08324740082025528,
-0.026401428505778313,
0.03666128218173981,
0.05164836347103119,
0.015260166488587856,
-0.06515737622976303,
-0.055947527289390564,
-0.0999753475189209,
0.03649657964706421,
0.029420027509331703,
0.019941488280892372,
0.009291619062423706,
0.1195201426744461,
-0.03489483520388603,
0.04269535094499588,
0.18342581391334534,
-0.30797523260116577,
-0.00444119842723012,
0.06433893740177155,
0.020113499835133553,
0.11725221574306488,
-0.002680419012904167,
-0.036772023886442184,
0.08206143975257874,
0.024255435913801193,
0.09105335921049118,
-0.03893858566880226,
0.004220606293529272,
-0.06088418886065483,
-0.15741519629955292,
-0.03681850805878639,
0.09432417899370193,
0.003901770804077387,
-0.13742709159851074,
-0.016062526032328606,
-0.04555957391858101,
0.03303799778223038,
0.021798966452479362,
-0.03938637301325798,
0.04222168028354645,
0.0015181093476712704,
-0.0024417468812316656,
-0.0012062684400007129,
-0.10072054713964462,
-0.04600071161985397,
0.019461043179035187,
0.09613343328237534,
0.11022781580686569,
0.05669606477022171,
-0.0063026887364685535,
0.11214464902877808,
-0.18850088119506836,
-0.04632209241390228,
-0.03122752532362938,
-0.03762540966272354,
-0.04446601867675781,
-0.010775025002658367,
-0.10180066525936127,
-0.04044800251722336,
-0.001750720082782209,
0.1362372636795044,
-0.01785670407116413,
0.03179195895791054,
-0.023732278496026993,
0.005331088788807392,
0.05511317774653435,
0.049881525337696075,
-0.025074096396565437,
0.027887016534805298,
0.03497789055109024,
-0.010873349383473396,
-0.022083643823862076,
0.011785865761339664,
-0.0022895829752087593,
0.022877376526594162,
0.1351936012506485,
0.013996305875480175,
-0.10615159571170807,
0.07634398341178894,
-0.014986434020102024,
-0.044598665088415146,
-0.003425082890316844,
-0.08717052638530731,
-0.06753420829772949,
-0.04254752770066261,
-0.01827762834727764,
0.005219378042966127,
0.007804395165294409,
-0.010922491550445557,
-0.028195131570100784,
-0.01916522905230522,
-0.0911075696349144,
-0.05838186666369438,
-0.05442195385694504,
-0.13368499279022217,
0.00876878947019577,
-0.19828759133815765,
-0.02887570671737194,
-0.11626379191875458,
-0.20275016129016876,
-0.04020732268691063,
0.048028990626335144,
0.002734414767473936,
-0.06817562133073807,
0.0595792792737484,
0.031056329607963562,
-0.032348331063985825,
-0.00518960552290082,
0.08204842358827591,
-0.005144468508660793,
0.03817387670278549,
-0.04273095354437828,
0.05987775698304176,
0.0011951614869758487,
0.04399878904223442,
-0.060944993048906326,
0.05691073089838028,
-0.17782051861286163,
0.040396105498075485,
-0.07485061883926392,
-0.028118183836340904,
-0.08576507866382599,
-0.03165508806705475,
-0.009149442426860332,
0.011578460223972797,
0.026033753529191017,
0.07406837493181229,
-0.1711624413728714,
-0.024872103706002235,
0.09515025466680527,
-0.15207844972610474,
-0.029332216829061508,
0.07045853137969971,
-0.0571368969976902,
0.11869719624519348,
0.06722713261842728,
0.15819163620471954,
-0.025875495746731758,
-0.06577933579683304,
0.043868836015462875,
-0.013342982158064842,
0.010699260979890823,
-0.01350597757846117,
0.06547791510820389,
-0.020615149289369583,
-0.16503946483135223,
0.022973034530878067,
-0.13356897234916687,
0.0020477070938795805,
-0.07706218957901001,
0.03046444058418274,
-0.00385988038033247,
-0.07168607413768768,
-0.08296900242567062,
-0.0333959124982357,
0.07616884261369705,
-0.0672101229429245,
-0.02527117170393467,
0.04199274256825447,
0.07946731150150299,
-0.07277750968933105,
0.06863489747047424,
-0.01489824429154396,
0.020860539749264717,
-0.08185283839702606,
-0.03827853500843048,
-0.1895887702703476,
0.03313130512833595,
0.093873992562294,
0.014316275715827942,
-0.018536338582634926,
0.12293128669261932,
-0.006998647470027208,
0.06368082761764526,
-0.04270346835255623,
-0.0019294654484838247,
-0.00898662954568863,
0.0004561567911878228,
-0.0947771891951561,
-0.10716816782951355,
-0.07132622599601746,
-0.06972236186265945,
0.0918213278055191,
-0.11164365708827972,
0.02301037684082985,
-0.05791500210762024,
0.042081646621227264,
0.015957361087203026,
-0.07016393542289734,
-0.009429027326405048,
0.01506842765957117,
-0.06407685577869415,
-0.06055646017193794,
0.03542827069759369,
0.06206761300563812,
-0.019125603139400482,
0.09576676040887833,
-0.04915250092744827,
-0.0906497910618782,
0.028594551607966423,
0.0856027901172638,
-0.10803469270467758,
0.022977208718657494,
-0.04750959202647209,
-0.04752659425139427,
-0.06997368484735489,
-0.025526080280542374,
0.10145016014575958,
-0.012922342866659164,
0.14132767915725708,
-0.07941063493490219,
-0.008750257082283497,
0.010401415638625622,
-0.012343699112534523,
-0.021770089864730835,
0.04699713736772537,
0.07696876674890518,
-0.06997434049844742,
0.022529693320393562,
0.025767061859369278,
-0.0040407017804682255,
0.06267707794904709,
-0.053333669900894165,
-0.07654859870672226,
0.02310134284198284,
0.034508541226387024,
0.023722605779767036,
0.06335092335939407,
-0.03968973457813263,
-0.01038734707981348,
0.027842434123158455,
0.02465507946908474,
0.011733527295291424,
-0.11957057565450668,
0.05946332588791847,
0.05957670882344246,
0.009862874634563923,
0.053273290395736694,
-0.018107984215021133,
-0.032585982233285904,
0.08183617889881134,
0.03144066780805588,
-0.018811902031302452,
-0.009989080019295216,
-0.009592858143150806,
-0.12024897336959839,
0.2192777842283249,
-0.06621728837490082,
-0.14375518262386322,
-0.0685202032327652,
-0.11209742724895477,
0.00368689326569438,
0.0218435600399971,
0.04255073890089989,
-0.028859466314315796,
-0.04000519961118698,
-0.12473402172327042,
0.09381353110074997,
-0.03275454044342041,
0.06741011142730713,
0.11006642132997513,
-0.06428654491901398,
0.04605592414736748,
-0.13222797214984894,
-0.010350833646953106,
-0.07663678377866745,
-0.061582475900650024,
0.0551738440990448,
-0.05671673268079758,
0.03746975213289261,
0.11721160262823105,
0.013195816427469254,
-0.028495773673057556,
-0.03136110305786133,
0.20392350852489471,
0.0420151948928833,
0.04118714854121208,
0.12754815816879272,
-0.07331869751214981,
0.05393523722887039,
0.085155189037323,
0.004405862186104059,
-0.04477754980325699,
0.05564693361520767,
0.05532776191830635,
-0.061597347259521484,
-0.19624781608581543,
-0.006655305158346891,
0.010154737159609795,
-0.04456061124801636,
0.069303959608078,
0.03750135749578476,
0.009005161933600903,
0.07929609715938568,
0.017911536619067192,
0.06492266803979874,
0.0010750951478257775,
0.09951205551624298,
0.022045455873012543,
-0.033585309982299805,
0.08376304805278778,
-0.009687255136668682,
-0.006446676794439554,
0.07688545435667038,
-0.016505185514688492,
0.302259624004364,
-0.04942820221185684,
0.01378433033823967,
0.1252804696559906,
0.034742120653390884,
0.05111527442932129,
0.1250135451555252,
-0.07738961279392242,
0.029064413160085678,
-0.07486333698034286,
-0.04693514108657837,
0.014395360834896564,
0.04580504447221756,
-0.0767313614487648,
0.020251162350177765,
-0.08509470522403717,
0.02999960072338581,
-0.02764247916638851,
0.300234854221344,
0.10376297682523727,
-0.1136913076043129,
-0.059859078377485275,
0.0005065742298029363,
-0.1008850634098053,
-0.07358350604772568,
0.05511203035712242,
0.053280334919691086,
-0.13123929500579834,
0.0010269384365528822,
-0.022709805518388748,
0.07771247625350952,
-0.029083380475640297,
0.013242748565971851,
0.04047108814120293,
0.05364305526018143,
-0.04720579832792282,
0.005788801703602076,
-0.17512142658233643,
0.20067347586154938,
-0.003338008187711239,
0.023526206612586975,
-0.0516229085624218,
0.028707357123494148,
0.014134572818875313,
-0.012177009135484695,
0.0633242055773735,
0.019569121301174164,
-0.00905272550880909,
-0.055993206799030304,
-0.04516933113336563,
0.013848500326275826,
0.06306217610836029,
-0.04158264026045799,
0.1065068244934082,
0.007681935094296932,
0.05457259714603424,
0.02980004996061325,
0.07766231149435043,
-0.18440617620944977,
-0.07897207885980606,
0.026425950229167938,
-0.052086494863033295,
-0.09412066638469696,
-0.08146704733371735,
-0.10053849220275879,
-0.007163425907492638,
0.2153889685869217,
-0.10299427807331085,
-0.07519666850566864,
-0.09153714776039124,
0.0416313037276268,
0.10084277391433716,
-0.05522749200463295,
0.02557387389242649,
-0.010741401463747025,
0.10737797617912292,
-0.06985514611005783,
-0.12125498056411743,
0.027041029185056686,
-0.09584540128707886,
-0.15880239009857178,
-0.06718851625919342,
0.08564861118793488,
0.06594030559062958,
0.030377967283129692,
-0.034688111394643784,
0.011064001359045506,
0.04109088331460953,
-0.03682525455951691,
-0.006725655868649483,
0.058752868324518204,
0.08834323287010193,
0.03971066698431969,
-0.10823516547679901,
0.018339814618229866,
-0.07449408620595932,
-0.06719806790351868,
0.07016091048717499,
0.26992809772491455,
-0.05180275812745094,
0.10785182565450668,
0.12384708225727081,
-0.08714865893125534,
-0.16032163798809052,
0.04684676229953766,
0.09447679668664932,
-0.014597420580685139,
0.0030125544872134924,
-0.16065742075443268,
0.10291146486997604,
0.11503015458583832,
-0.015752499923110008,
0.0030854681972414255,
-0.20398388803005219,
-0.13607138395309448,
0.08901411294937134,
0.11340945214033127,
0.2813833951950073,
-0.0526735782623291,
-0.03549196943640709,
0.023608297109603882,
-0.09850554913282394,
0.006242014933377504,
0.1290225386619568,
0.061811864376068115,
-0.021871352568268776,
-0.06541585177183151,
0.01032035332173109,
-0.03678184747695923,
0.09127295762300491,
0.06514246016740799,
0.06895102560520172,
-0.005649988539516926,
-0.007346232421696186,
-0.037279147654771805,
-0.043821629136800766,
0.07112684100866318,
0.040032222867012024,
0.051833175122737885,
-0.08809079229831696,
-0.035343099385499954,
-0.07618909329175949,
0.032107628881931305,
-0.031006047502160072,
-0.07663490623235703,
-0.06157660111784935,
0.0799277052283287,
0.05886029824614525,
-0.03302966058254242,
0.03304905816912651,
0.03354800492525101,
0.09934812784194946,
0.13979965448379517,
-0.0019344378961250186,
-0.04228319227695465,
-0.06414176523685455,
-0.02635493129491806,
-0.012924385257065296,
0.07453753054141998,
-0.05249197781085968,
0.016359496861696243,
0.07330820709466934,
0.02023913338780403,
0.10379788279533386,
0.06178232282400131,
-0.12210330367088318,
-0.01826942339539528,
0.026637952774763107,
-0.15250341594219208,
0.014676587656140327,
0.0017667395295575261,
0.020405622199177742,
-0.02390855737030506,
0.027276404201984406,
0.14804773032665253,
-0.0721665769815445,
-0.03306625410914421,
-0.045265860855579376,
0.06201210618019104,
0.03388410434126854,
0.14633490145206451,
0.043052926659584045,
0.03664200007915497,
-0.07627598196268082,
0.14406582713127136,
0.0445396825671196,
-0.047305721789598465,
0.02549450658261776,
-0.029219770804047585,
-0.10939107090234756,
0.016710536554455757,
0.06633789092302322,
0.06559060513973236,
-0.07681355625391006,
-0.017763804644346237,
-0.04074207693338394,
-0.07910073548555374,
0.06843244284391403,
0.22378186881542206,
0.06384975463151932,
0.06669284403324127,
-0.051380455493927,
-0.03518155589699745,
-0.07913535833358765,
0.049474846571683884,
0.0538545697927475,
0.08191113919019699,
-0.07633183151483536,
0.09685758501291275,
0.0122902300208807,
0.051438841968774796,
-0.025055205449461937,
-0.047359649091959,
-0.10305821895599365,
-0.056397173553705215,
-0.11604618281126022,
0.014091369695961475,
-0.06394130736589432,
-0.03971631079912186,
0.0061975568532943726,
-0.0009577899472787976,
-0.00454937107861042,
0.05524755269289017,
-0.06312613934278488,
-0.010679308325052261,
-0.014129431918263435,
0.03413267433643341,
-0.05983546003699303,
-0.050701040774583817,
0.020731404423713684,
-0.09508337080478668,
0.09869997948408127,
0.044771838933229446,
0.011303451843559742,
0.0007087546982802451,
0.08448678255081177,
-0.010991816408932209,
0.021931886672973633,
0.009044713340699673,
-0.03977768123149872,
-0.09972986578941345,
0.006798139773309231,
-0.027742357924580574,
-0.029123900458216667,
-0.01919473707675934,
0.08954500406980515,
-0.08135796338319778,
0.032454460859298706,
-0.0005217642756178975,
-0.000738780596293509,
-0.07941964268684387,
-0.006018970161676407,
0.09744805842638016,
0.0831478089094162,
0.051922574639320374,
-0.08098104596138,
0.01440364308655262,
-0.12360605597496033,
-0.035747651010751724,
0.012196672149002552,
-0.014847669750452042,
-0.12833957374095917,
-0.010904732160270214,
0.018763726577162743,
-0.01105393934994936,
0.1862117499113083,
-0.059767886996269226,
-0.027900617569684982,
0.020790066570043564,
-0.09365121275186539,
0.10187117010354996,
-0.020797481760382652,
0.1608986109495163,
-0.028672905638813972,
-0.03609863668680191,
-0.012007596902549267,
0.0456295944750309,
0.02573433518409729,
-0.008320129476487637,
0.19093643128871918,
0.13283231854438782,
0.0433603897690773,
0.05806545913219452,
-0.022808697074651718,
-0.007068871520459652,
-0.055143482983112335,
-0.02527732215821743,
0.041659772396087646,
0.032707180827856064,
0.02314705215394497,
0.15038403868675232,
0.06270391494035721,
-0.15512679517269135,
0.034847691655159,
-0.0279861893504858,
-0.04260388761758804,
-0.11709941178560257,
-0.0946044772863388,
-0.02935049682855606,
-0.062114328145980835,
0.01587417721748352,
-0.13252927362918854,
0.0018305348930880427,
0.18788769841194153,
0.06749743223190308,
0.031078895553946495,
0.021877197548747063,
-0.13251309096813202,
-0.044415734708309174,
0.057196058332920074,
0.012912824749946594,
0.02352585643529892,
0.04413736239075661,
-0.0044492315500974655,
0.06812315434217453,
0.0272207111120224,
0.001649312674999237,
-0.003980293869972229,
0.08069387078285217,
0.024039030075073242,
0.047637470066547394,
-0.05414274334907532,
0.00007281186117324978,
-0.03978143632411957,
0.08093590289354324,
0.11980041861534119,
0.04478979483246803,
-0.047065023332834244,
-0.01116388663649559,
0.15834294259548187,
-0.03146711736917496,
0.001977150095626712,
-0.12176492065191269,
0.3212091028690338,
0.015133357606828213,
0.011419611051678658,
0.0568963848054409,
-0.08099286258220673,
-0.046645116060972214,
0.20709443092346191,
0.06629794090986252,
-0.020913388580083847,
-0.030557706952095032,
0.005770920310169458,
-0.03148695081472397,
-0.019012628123164177,
0.1414683610200882,
0.045388661324977875,
0.11743701994419098,
-0.05699004605412483,
-0.05221308395266533,
-0.03449901565909386,
0.003269398584961891,
-0.1108407974243164,
0.1408558338880539,
-0.021068450063467026,
-0.019139502197504044,
-0.07885148376226425,
0.014227339997887611,
0.06970171630382538,
-0.3432934880256653,
0.004535459913313389,
-0.027915338054299355,
-0.10813114792108536,
-0.014755983836948872,
-0.026855381205677986,
-0.02602466754615307,
0.04728260263800621,
-0.04036298021674156,
0.06523044407367706,
0.047280795872211456,
0.03902329131960869,
-0.022711656987667084,
-0.10632649064064026,
0.16243086755275726,
0.06511424481868744,
0.10081705451011658,
0.017782777547836304,
0.08799445629119873,
0.062116965651512146,
0.03696976229548454,
-0.09594901651144028,
0.050614792853593826,
0.01150911021977663,
-0.06963484734296799,
-0.05828338488936424,
0.11404043436050415,
0.00035733208642341197,
0.06303931772708893,
0.03062150627374649,
-0.11858155578374863,
0.022677555680274963,
0.08228594064712524,
-0.08406046777963638,
-0.1008513793349266,
-0.0030577590223401785,
-0.09730473905801773,
0.1595921665430069,
0.14718736708164215,
-0.01145165041089058,
0.01707681454718113,
-0.06315526366233826,
-0.012899152934551239,
0.05087122693657875,
0.006052079610526562,
-0.012902497313916683,
-0.18341082334518433,
0.05078904330730438,
-0.09001243114471436,
-0.0019816909916698933,
-0.215633824467659,
-0.09934427589178085,
-0.011608187109231949,
-0.057774126529693604,
-0.024309130385518074,
0.0600108727812767,
0.026521433144807816,
0.07707598805427551,
-0.02433396875858307,
-0.020381571725010872,
-0.0414280891418457,
0.09393260627985,
-0.1045074462890625,
-0.0767640769481659
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 120k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 120k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_120k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_120k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
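Because the checkpoint was pre-trained with the MLM objective, it can also be loaded with a masked-language-modelling head. The snippet below is a minimal sketch rather than part of the official release: it assumes the pretraining head weights stored in this checkpoint load cleanly into `BertForMaskedLM` (Transformers may warn about unused NSP weights).
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_120k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_120k')

# Predict the token hidden behind [MASK].
text = f"The capital of France is {tokenizer.mask_token}."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**encoded_input).logits

# Locate the [MASK] position and decode the top-scoring token.
mask_position = (encoded_input.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_position].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```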
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_120k"]}
| null |
google/multiberts-seed_1-step_120k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_120k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 120k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 120k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 120k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 120k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 120k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07997816056013107,
0.08322632312774658,
-0.0020931819453835487,
0.0436844527721405,
0.0799536183476448,
-0.021220779046416283,
0.06701890379190445,
0.09274689853191376,
-0.018896963447332382,
0.023818235844373703,
0.0831453800201416,
0.028705952689051628,
0.009803788736462593,
0.09238595515489578,
0.018683085218071938,
-0.21651650965213776,
0.03233567997813225,
-0.02695607952773571,
-0.09233255684375763,
0.07546070218086243,
0.10294143110513687,
-0.08082934468984604,
0.0413459911942482,
0.033100299537181854,
-0.11156994104385376,
0.05379745736718178,
-0.011003410443663597,
-0.02105388417840004,
0.1388535350561142,
0.0003925623605027795,
0.05550026148557663,
0.056022707372903824,
0.0463191457092762,
-0.13799647986888885,
0.005468463059514761,
0.0559476763010025,
0.051036592572927475,
0.03745554760098457,
0.019716447219252586,
0.08052462339401245,
-0.021855011582374573,
0.03331189230084419,
0.050101783126592636,
0.015375059098005295,
-0.06485952436923981,
-0.06092660501599312,
-0.09934643656015396,
0.0290948748588562,
0.02733428217470646,
0.022411776706576347,
0.00783475674688816,
0.12048446387052536,
-0.036359693855047226,
0.04440312460064888,
0.18520070612430573,
-0.3143746852874756,
-0.0023446865379810333,
0.06076368689537048,
0.02148468606173992,
0.10976559668779373,
-0.004725840408354998,
-0.03458356112241745,
0.08000577986240387,
0.02389758639037609,
0.09318223595619202,
-0.03886915743350983,
0.01306611392647028,
-0.0595136433839798,
-0.1584213823080063,
-0.037046391516923904,
0.09203232824802399,
0.0038869387935847044,
-0.13847361505031586,
-0.016136806458234787,
-0.045773718506097794,
0.037543658167123795,
0.021125979721546173,
-0.03765261545777321,
0.04384788125753403,
0.002085304819047451,
-0.002961155492812395,
-0.0012366834562271833,
-0.10280071943998337,
-0.04492529481649399,
0.02161281183362007,
0.09478206932544708,
0.11070173978805542,
0.05671461299061775,
-0.0063651795499026775,
0.10969027876853943,
-0.1875935196876526,
-0.04745540767908096,
-0.029853802174329758,
-0.034953486174345016,
-0.04459177330136299,
-0.00929454155266285,
-0.10385912656784058,
-0.04387262836098671,
0.00028767663752660155,
0.13737453520298004,
-0.01933620311319828,
0.03229496255517006,
-0.021977413445711136,
0.005520090460777283,
0.05672202631831169,
0.05257977917790413,
-0.021075820550322533,
0.026611803099513054,
0.03432843089103699,
-0.01421867124736309,
-0.021603429690003395,
0.011955459602177143,
-0.003957831766456366,
0.02405293472111225,
0.13575445115566254,
0.013670121319591999,
-0.10888749361038208,
0.07762401551008224,
-0.015415510162711143,
-0.04460442438721657,
-0.004432159010320902,
-0.08656609058380127,
-0.0664718747138977,
-0.039557840675115585,
-0.017732976004481316,
0.0076231746934354305,
0.008004269562661648,
-0.009848046116530895,
-0.0265961941331625,
-0.019516531378030777,
-0.08876922726631165,
-0.055940158665180206,
-0.055983126163482666,
-0.13429880142211914,
0.009854987263679504,
-0.20160788297653198,
-0.02801213413476944,
-0.1150774210691452,
-0.20219065248966217,
-0.03993601351976395,
0.047841016203165054,
0.0036849030293524265,
-0.06453060358762741,
0.05742048844695091,
0.032811254262924194,
-0.030901700258255005,
-0.003956146072596312,
0.08358179032802582,
-0.006838573608547449,
0.037842486053705215,
-0.0396248921751976,
0.05937129259109497,
0.0034450734965503216,
0.04400622099637985,
-0.06000860407948494,
0.057131726294755936,
-0.17279183864593506,
0.037452712655067444,
-0.07374042272567749,
-0.03154933080077171,
-0.08687616884708405,
-0.031212978065013885,
-0.005400232970714569,
0.012280174531042576,
0.024970643222332,
0.07281271368265152,
-0.1702757328748703,
-0.027395322918891907,
0.09381800144910812,
-0.15054620802402496,
-0.026355834677815437,
0.07215378433465958,
-0.05708519369363785,
0.11611425876617432,
0.06627146154642105,
0.15562042593955994,
-0.030711539089679718,
-0.071934774518013,
0.044152338057756424,
-0.013115023262798786,
0.01133759692311287,
-0.011122019961476326,
0.06865687668323517,
-0.019432302564382553,
-0.16426263749599457,
0.02342339977622032,
-0.1348394900560379,
0.0022826562635600567,
-0.07776015251874924,
0.032524846494197845,
-0.002287097042426467,
-0.07090608030557632,
-0.08025836944580078,
-0.033530186861753464,
0.07688992470502853,
-0.06731152534484863,
-0.027694717049598694,
0.04477369412779808,
0.07943113148212433,
-0.07397180795669556,
0.06701347231864929,
-0.016941096633672714,
0.020169202238321304,
-0.0822833701968193,
-0.03845139965415001,
-0.18871591985225677,
0.033159077167510986,
0.09286247193813324,
0.015191101469099522,
-0.018397938460111618,
0.12898561358451843,
-0.005755545571446419,
0.06552621722221375,
-0.04049967601895332,
-0.002459227805957198,
-0.011393291875720024,
-0.00018254156748298556,
-0.09428250044584274,
-0.10777835547924042,
-0.07262522727251053,
-0.07072649896144867,
0.09870894998311996,
-0.1165817603468895,
0.021717697381973267,
-0.06088956072926521,
0.04443783313035965,
0.016797620803117752,
-0.06876074522733688,
-0.009663136675953865,
0.014028279110789299,
-0.0642135888338089,
-0.058113597333431244,
0.03757695108652115,
0.06222968176007271,
-0.018924426287412643,
0.09894391894340515,
-0.04943668842315674,
-0.08859878033399582,
0.028066053986549377,
0.07948998361825943,
-0.10547509044408798,
0.024325160309672356,
-0.04718813672661781,
-0.04646340385079384,
-0.06951149553060532,
-0.02422776259481907,
0.10050718486309052,
-0.012184599414467812,
0.14398205280303955,
-0.07817228883504868,
-0.01174092572182417,
0.008940652944147587,
-0.015569898299872875,
-0.022407235577702522,
0.04994503781199455,
0.07485619187355042,
-0.07951135188341141,
0.0243369210511446,
0.029897062107920647,
-0.0076593272387981415,
0.06800304353237152,
-0.05212567374110222,
-0.07768749445676804,
0.021395178511738777,
0.03396850451827049,
0.023479480296373367,
0.06134844198822975,
-0.043142326176166534,
-0.011232717894017696,
0.02844414673745632,
0.022323518991470337,
0.011154799722135067,
-0.11914388835430145,
0.060343094170093536,
0.05963134765625,
0.009206905029714108,
0.054125119000673294,
-0.01678529754281044,
-0.033849895000457764,
0.08075571805238724,
0.030616765841841698,
-0.01879265531897545,
-0.009943375363945961,
-0.010574722662568092,
-0.11985434591770172,
0.21915952861309052,
-0.0669931024312973,
-0.1478828638792038,
-0.06846211850643158,
-0.11333852261304855,
0.0035679058637470007,
0.02291436493396759,
0.043713297694921494,
-0.028162771835923195,
-0.04087071120738983,
-0.12331913411617279,
0.09585747122764587,
-0.03472539037466049,
0.06608898937702179,
0.10737528651952744,
-0.0633062869310379,
0.047035105526447296,
-0.13137318193912506,
-0.011062553152441978,
-0.07790850102901459,
-0.06251300126314163,
0.056233979761600494,
-0.05669291317462921,
0.03795838728547096,
0.11307075619697571,
0.014933086931705475,
-0.028977427631616592,
-0.030188923701643944,
0.20424045622348785,
0.04195884242653847,
0.03810017183423042,
0.1321643590927124,
-0.07401810586452484,
0.054264556616544724,
0.07784893363714218,
0.0031059999018907547,
-0.04468345269560814,
0.05491690710186958,
0.05848930403590202,
-0.06023505702614784,
-0.19595028460025787,
-0.005880977492779493,
0.00890260934829712,
-0.047317929565906525,
0.0688004195690155,
0.03801800310611725,
0.005956320092082024,
0.07721946388483047,
0.01727384142577648,
0.06581960618495941,
0.0010494833113625646,
0.10010319948196411,
0.026041755452752113,
-0.034646861255168915,
0.08500324934720993,
-0.01010619755834341,
-0.009150189347565174,
0.07512827217578888,
-0.01566125452518463,
0.2953023314476013,
-0.04650082066655159,
0.02016068436205387,
0.12668946385383606,
0.0343073345720768,
0.05126185342669487,
0.12415646761655807,
-0.07767554372549057,
0.028099171817302704,
-0.07664398849010468,
-0.046966854482889175,
0.011374881491065025,
0.04530954360961914,
-0.07224003970623016,
0.01746063493192196,
-0.08456549048423767,
0.027869239449501038,
-0.027514345943927765,
0.30144548416137695,
0.10600771009922028,
-0.11554450541734695,
-0.05952414870262146,
0.0000936826691031456,
-0.10014887154102325,
-0.07210467010736465,
0.05232495442032814,
0.05051209032535553,
-0.131955087184906,
0.0029732335824519396,
-0.02429276518523693,
0.07811948657035828,
-0.030667826533317566,
0.014750205911695957,
0.038598719984292984,
0.05432569235563278,
-0.04541071876883507,
0.008290278725326061,
-0.1793629378080368,
0.19690044224262238,
-0.0031325446907430887,
0.02343258447945118,
-0.05316068232059479,
0.027197575196623802,
0.013555045239627361,
-0.014538129791617393,
0.06259138137102127,
0.0172905120998621,
-0.010203607380390167,
-0.05785534903407097,
-0.042639393359422684,
0.014146792702376842,
0.06630273908376694,
-0.0432935431599617,
0.10933516174554825,
0.005176215432584286,
0.052675191313028336,
0.029873589053750038,
0.08046171814203262,
-0.18398980796337128,
-0.07881864905357361,
0.025123361498117447,
-0.05136416107416153,
-0.09711854159832001,
-0.07970169186592102,
-0.09945877641439438,
-0.009133015759289265,
0.2189294844865799,
-0.10459279268980026,
-0.07247992604970932,
-0.09237485378980637,
0.04168107733130455,
0.09761913865804672,
-0.056269850581884384,
0.023709552362561226,
-0.011431467719376087,
0.11239312589168549,
-0.07087837159633636,
-0.12245851010084152,
0.026861675083637238,
-0.09745863825082779,
-0.1600867658853531,
-0.06704174727201462,
0.08708415925502777,
0.06681887805461884,
0.029894551262259483,
-0.03303924947977066,
0.011172253638505936,
0.038342416286468506,
-0.03529829531908035,
-0.0028095711022615433,
0.06015574932098389,
0.09169529378414154,
0.03982650488615036,
-0.10925007611513138,
0.025001628324389458,
-0.07312960922718048,
-0.06611169874668121,
0.07236041128635406,
0.27208638191223145,
-0.04955247417092323,
0.108991339802742,
0.12080278247594833,
-0.08707094937562943,
-0.16062943637371063,
0.04340972751379013,
0.09717849642038345,
-0.014083274640142918,
0.0057496377266943455,
-0.16260336339473724,
0.1021474078297615,
0.11393488943576813,
-0.016979757696390152,
0.006597425322979689,
-0.19928351044654846,
-0.13421687483787537,
0.09044649451971054,
0.1141614317893982,
0.28060096502304077,
-0.053146280348300934,
-0.034343019127845764,
0.019635779783129692,
-0.09811629354953766,
0.011009447276592255,
0.12075173854827881,
0.06419059634208679,
-0.021085958927869797,
-0.0640157163143158,
0.010153970681130886,
-0.03659982234239578,
0.09084212779998779,
0.06215228885412216,
0.06848708540201187,
-0.004012912046164274,
-0.004361327271908522,
-0.036780111491680145,
-0.04326215013861656,
0.07237817347049713,
0.03392134606838226,
0.050104256719350815,
-0.08788620680570602,
-0.036803532391786575,
-0.07564443349838257,
0.0358840711414814,
-0.030976885929703712,
-0.07543060183525085,
-0.05931928753852844,
0.07835780084133148,
0.05712136998772621,
-0.031550124287605286,
0.03902021050453186,
0.032264772802591324,
0.09843026101589203,
0.1450314074754715,
-0.003533686511218548,
-0.036342840641736984,
-0.06604897230863571,
-0.029059669002890587,
-0.01223329734057188,
0.07514112442731857,
-0.04837215691804886,
0.016777213662862778,
0.07041312009096146,
0.020339323207736015,
0.10603564977645874,
0.06036752089858055,
-0.12443961203098297,
-0.018485279753804207,
0.02504836581647396,
-0.15400616824626923,
0.014187684282660484,
0.001425424125045538,
0.020306715741753578,
-0.021001646295189857,
0.029014546424150467,
0.15152627229690552,
-0.07045656442642212,
-0.03249090909957886,
-0.045975323766469955,
0.06095057725906372,
0.03409111872315407,
0.14458630979061127,
0.04227765277028084,
0.03724035993218422,
-0.0761447623372078,
0.14265744388103485,
0.042577311396598816,
-0.04356135427951813,
0.02692973054945469,
-0.03261528164148331,
-0.10780081897974014,
0.014845072291791439,
0.06660044938325882,
0.07524328678846359,
-0.07816175371408463,
-0.020345117896795273,
-0.043017949908971786,
-0.07738778740167618,
0.06891003996133804,
0.22155937552452087,
0.06641080230474472,
0.06719478219747543,
-0.051626794040203094,
-0.034537892788648605,
-0.07899530231952667,
0.04884607717394829,
0.05247475206851959,
0.07990971207618713,
-0.07593675702810287,
0.09432123601436615,
0.013661143369972706,
0.053567249327898026,
-0.02586943656206131,
-0.04785499721765518,
-0.1047203466296196,
-0.05642840266227722,
-0.11234946548938751,
0.015556694939732552,
-0.061850789934396744,
-0.039883919060230255,
0.007672097999602556,
-0.0014679920859634876,
-0.0026084433775395155,
0.05654966086149216,
-0.06369945406913757,
-0.01084047183394432,
-0.014206844381988049,
0.03405367210507393,
-0.0625004768371582,
-0.05234266817569733,
0.02103353664278984,
-0.09690283983945847,
0.0977107435464859,
0.04701303318142891,
0.012373867444694042,
0.004580740816891193,
0.07691257447004318,
-0.01120720338076353,
0.02346058562397957,
0.0074744271114468575,
-0.04075061157345772,
-0.10104694217443466,
0.006346109323203564,
-0.027456102892756462,
-0.030089156702160835,
-0.021183252334594727,
0.08957836031913757,
-0.08116733282804489,
0.030906815081834793,
-0.0004882048524450511,
-0.0018917771521955729,
-0.0790473222732544,
-0.006511056795716286,
0.09243369847536087,
0.08511652797460556,
0.05491873249411583,
-0.08158712834119797,
0.0163260605186224,
-0.1225798949599266,
-0.035803403705358505,
0.0135981235653162,
-0.014992393553256989,
-0.13032983243465424,
-0.011411769315600395,
0.018239503726363182,
-0.012736138887703419,
0.18555250763893127,
-0.05763339623808861,
-0.021400611847639084,
0.01843341998755932,
-0.0937257781624794,
0.10265707969665527,
-0.02225850149989128,
0.15958066284656525,
-0.02881048247218132,
-0.03481200709939003,
-0.014305183663964272,
0.04578099772334099,
0.026881448924541473,
-0.009276231750845909,
0.18831147253513336,
0.13363969326019287,
0.040238022804260254,
0.05822896584868431,
-0.023314347490668297,
-0.006694368552416563,
-0.06086774170398712,
-0.0219618771225214,
0.04334600642323494,
0.03302743658423424,
0.023135287687182426,
0.15660184621810913,
0.0656002014875412,
-0.15732495486736298,
0.03396417945623398,
-0.028478221967816353,
-0.04103976860642433,
-0.1186462938785553,
-0.10044544190168381,
-0.032312408089637756,
-0.06058365851640701,
0.017377331852912903,
-0.13094384968280792,
0.003256591036915779,
0.18603114783763885,
0.06758595257997513,
0.030614977702498436,
0.016837695613503456,
-0.13039542734622955,
-0.04574156552553177,
0.057159822434186935,
0.011803300119936466,
0.021335819736123085,
0.04440595209598541,
-0.002270402852445841,
0.06939557194709778,
0.02661936730146408,
0.0040940227918326855,
-0.0049080136232078075,
0.08174832165241241,
0.025409996509552002,
0.04721875861287117,
-0.05588021129369736,
0.0005471644108183682,
-0.03634317219257355,
0.08251463621854782,
0.11869799345731735,
0.04476926103234291,
-0.04734048247337341,
-0.012088820338249207,
0.15791729092597961,
-0.029712677001953125,
0.004695662297308445,
-0.1199292540550232,
0.32381945848464966,
0.01683286763727665,
0.013465397991240025,
0.05631935968995094,
-0.0808582529425621,
-0.044470928609371185,
0.2052266150712967,
0.06535617262125015,
-0.022962776944041252,
-0.028813902288675308,
0.0051782866939902306,
-0.030813945457339287,
-0.016957096755504608,
0.1388005167245865,
0.046550452709198,
0.12137461453676224,
-0.05883648246526718,
-0.05338187888264656,
-0.03560444712638855,
0.003023769473657012,
-0.1111486554145813,
0.14049556851387024,
-0.020534159615635872,
-0.01926666870713234,
-0.07745268940925598,
0.012396181933581829,
0.06884747743606567,
-0.35207560658454895,
0.004143637139350176,
-0.02528398111462593,
-0.10709059983491898,
-0.014918524771928787,
-0.024834463372826576,
-0.02536826953291893,
0.048761460930109024,
-0.041843533515930176,
0.06302953511476517,
0.04779324308037758,
0.03913341090083122,
-0.022339677438139915,
-0.10131841897964478,
0.1619926393032074,
0.06534556299448013,
0.09807824343442917,
0.0171622596681118,
0.08875194191932678,
0.062191545963287354,
0.03696349263191223,
-0.09235849976539612,
0.05437618866562843,
0.00979093462228775,
-0.07356422394514084,
-0.05632108449935913,
0.11523129791021347,
-0.0005472270422615111,
0.06261114776134491,
0.03038187511265278,
-0.12006494402885437,
0.025929875671863556,
0.08154742419719696,
-0.08586546778678894,
-0.09807945042848587,
-0.0006820624112151563,
-0.09890498220920563,
0.15876775979995728,
0.14822536706924438,
-0.009265756234526634,
0.016371147707104683,
-0.06502840667963028,
-0.008380735293030739,
0.05219493433833122,
0.0011576691176742315,
-0.014026220887899399,
-0.1796814203262329,
0.051534295082092285,
-0.08505444973707199,
-0.003495574463158846,
-0.21536295115947723,
-0.09970651566982269,
-0.011327794753015041,
-0.05716110020875931,
-0.024049773812294006,
0.055246710777282715,
0.02517012320458889,
0.0776175856590271,
-0.025801634415984154,
-0.02374889887869358,
-0.040392208844423294,
0.09285034984350204,
-0.10548783838748932,
-0.07740428298711777
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1300k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1300k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
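Because every MultiBERTs checkpoint follows the same naming scheme (`google/multiberts-seed_<seed>-step_<step>k`), it is straightforward to sweep a measurement across training steps for robustness analysis. The loop below is a hedged sketch, not an official utility: the list of steps is illustrative, and you should restrict it to checkpoints actually published on the Hub.
```
import torch
from transformers import BertTokenizer, BertModel

# Illustrative subset of intermediate checkpoints for seed 1.
steps = ["120k", "1300k", "2000k"]
text = "Replace me by any text you'd like."

for step in steps:
    name = f"google/multiberts-seed_1-step_{step}"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        output = model(**encoded_input)
    # Toy per-checkpoint statistic: norm of the [CLS] embedding.
    cls_norm = output.last_hidden_state[0, 0].norm().item()
    print(f"{name}: CLS norm = {cls_norm:.3f}")
```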
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1300k"]}
| null |
google/multiberts-seed_1-step_1300k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1300k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1300k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07912960648536682,
0.07768771797418594,
-0.0019567948766052723,
0.04758310317993164,
0.07590828090906143,
-0.020665893331170082,
0.06257070600986481,
0.09272224456071854,
-0.021398136392235756,
0.0223831944167614,
0.08403488248586655,
0.02411559969186783,
0.00986502319574356,
0.10016606748104095,
0.024820439517498016,
-0.22054685652256012,
0.03370317071676254,
-0.03162308782339096,
-0.08554782718420029,
0.07468337565660477,
0.10388478636741638,
-0.08094368875026703,
0.04088549315929413,
0.03311750292778015,
-0.10413995385169983,
0.05495753511786461,
-0.010325010865926743,
-0.024329014122486115,
0.13601309061050415,
0.0014603068120777607,
0.05460560321807861,
0.055606335401535034,
0.04647740721702576,
-0.1400509476661682,
0.004032364580780268,
0.05372302606701851,
0.0532069094479084,
0.0377478189766407,
0.015922896564006805,
0.08317330479621887,
-0.022188477218151093,
0.032633207738399506,
0.05461769551038742,
0.0140380859375,
-0.06646043807268143,
-0.057120099663734436,
-0.10252071917057037,
0.03654984384775162,
0.030606651678681374,
0.021189354360103607,
0.010366011410951614,
0.12111201137304306,
-0.032353680580854416,
0.04549870640039444,
0.17838312685489655,
-0.30104368925094604,
-0.00583205372095108,
0.0693218931555748,
0.02509722113609314,
0.1164236068725586,
-0.002496841363608837,
-0.03412160277366638,
0.07899875938892365,
0.021715130656957626,
0.08931875973939896,
-0.039222635328769684,
0.0059443628415465355,
-0.05938418209552765,
-0.15864893794059753,
-0.041461944580078125,
0.09426040947437286,
0.00442854780703783,
-0.13465069234371185,
-0.019104162231087685,
-0.04589834809303284,
0.025969993323087692,
0.018765712156891823,
-0.04387519881129265,
0.04326770827174187,
0.0024515793193131685,
-0.00205434812232852,
-0.0040869261138141155,
-0.09992890059947968,
-0.04422314092516899,
0.01892821490764618,
0.09162578731775284,
0.10731291025876999,
0.05588042363524437,
-0.004836063366383314,
0.1093226969242096,
-0.19282710552215576,
-0.045009493827819824,
-0.03038310632109642,
-0.035004373639822006,
-0.045255377888679504,
-0.01478241290897131,
-0.10154036432504654,
-0.04382477328181267,
0.0000852797384141013,
0.1450577974319458,
-0.025072693824768066,
0.033483292907476425,
-0.02173985168337822,
0.0051096039824187756,
0.05634156987071037,
0.0520767867565155,
-0.027205632999539375,
0.02884414605796337,
0.039005130529403687,
-0.015242272056639194,
-0.016932953149080276,
0.009386281482875347,
0.00020461864187382162,
0.021365318447351456,
0.13377396762371063,
0.01232973113656044,
-0.10076405853033066,
0.0793943852186203,
-0.013825688511133194,
-0.04289509356021881,
0.0018423706060275435,
-0.08860477805137634,
-0.06387767195701599,
-0.04234253242611885,
-0.01477377861738205,
0.00253011891618371,
0.008674661628901958,
-0.005647293291985989,
-0.025994043797254562,
-0.025563273578882217,
-0.08954966813325882,
-0.06259381771087646,
-0.05457646772265434,
-0.1348535567522049,
0.012121378444135189,
-0.17580245435237885,
-0.029413660988211632,
-0.11351713538169861,
-0.20896421372890472,
-0.03838018700480461,
0.04955153167247772,
0.004053773824125528,
-0.07066311687231064,
0.05769849568605423,
0.032556962221860886,
-0.03017350845038891,
-0.002714253729209304,
0.08439769595861435,
-0.004037829115986824,
0.036551333963871,
-0.041759055107831955,
0.05617991462349892,
0.0027705677784979343,
0.040269121527671814,
-0.06304683536291122,
0.05780458450317383,
-0.18350660800933838,
0.038878027349710464,
-0.07588058710098267,
-0.02660500258207321,
-0.08475122600793839,
-0.031764741986989975,
-0.011328757740557194,
0.011486361734569073,
0.02458074875175953,
0.0736951008439064,
-0.15659189224243164,
-0.02704903669655323,
0.08376777917146683,
-0.153093621134758,
-0.028048520907759666,
0.07228311896324158,
-0.05605437979102135,
0.11675123870372772,
0.06486085057258606,
0.15939676761627197,
-0.016813788563013077,
-0.06792502850294113,
0.04942546412348747,
-0.0087593300268054,
0.006944222375750542,
-0.010603444650769234,
0.06671275943517685,
-0.01988920383155346,
-0.16788260638713837,
0.024589521810412407,
-0.1362076848745346,
0.0031533921137452126,
-0.07745852321386337,
0.027253616601228714,
-0.0016082532238215208,
-0.06738485395908356,
-0.08198391646146774,
-0.03396258130669594,
0.07597293704748154,
-0.06737803667783737,
-0.025566300377249718,
0.03803377225995064,
0.0780414342880249,
-0.07286878675222397,
0.07225978374481201,
-0.011160108260810375,
0.02133334055542946,
-0.07781046628952026,
-0.03915906697511673,
-0.1818721890449524,
0.02986307628452778,
0.09434483200311661,
0.013293536379933357,
-0.022576384246349335,
0.11611461639404297,
-0.005762414075434208,
0.06104877591133118,
-0.04218098521232605,
0.0026936212088912725,
-0.010385624133050442,
0.00185507966671139,
-0.09187252074480057,
-0.10703521221876144,
-0.07256244122982025,
-0.07020562142133713,
0.09967084974050522,
-0.1169242337346077,
0.02450249344110489,
-0.0564243383705616,
0.039433371275663376,
0.013514579273760319,
-0.07225978374481201,
-0.008113229647278786,
0.013855038210749626,
-0.06416728347539902,
-0.05730618163943291,
0.03566741198301315,
0.06281024217605591,
-0.023425208404660225,
0.09451039135456085,
-0.051599834114313126,
-0.09152332693338394,
0.02796202152967453,
0.07606954127550125,
-0.10942795872688293,
0.016889171674847603,
-0.04893102869391441,
-0.049565643072128296,
-0.06918029487133026,
-0.031073279678821564,
0.09917081892490387,
-0.013944867067039013,
0.14009405672550201,
-0.07739704102277756,
-0.012548129074275494,
0.010929138399660587,
-0.013608939945697784,
-0.027416910976171494,
0.04654544219374657,
0.06998255103826523,
-0.06155569478869438,
0.023613588884472847,
0.020184483379125595,
-0.0062947883270680904,
0.06421416252851486,
-0.05070113018155098,
-0.07508302479982376,
0.023722685873508453,
0.03618694469332695,
0.023616215214133263,
0.05754454433917999,
-0.04647242650389671,
-0.006513700354844332,
0.027997685596346855,
0.02540026232600212,
0.013605626299977303,
-0.11744992434978485,
0.058798156678676605,
0.06209110841155052,
0.007367798592895269,
0.048113200813531876,
-0.017719265073537827,
-0.03339262679219246,
0.07834810763597488,
0.030796166509389877,
-0.014323891140520573,
-0.008971406146883965,
-0.010990557260811329,
-0.12462249398231506,
0.2174174040555954,
-0.06747123599052429,
-0.1469498872756958,
-0.07244829088449478,
-0.1146579459309578,
0.0032816508319228888,
0.02080598659813404,
0.04035452380776405,
-0.023498177528381348,
-0.04048930108547211,
-0.12731525301933289,
0.09036515653133392,
-0.03840295225381851,
0.06616979837417603,
0.11379288136959076,
-0.06559640914201736,
0.047330886125564575,
-0.13105559349060059,
-0.009535095654428005,
-0.07544402778148651,
-0.06557458639144897,
0.05256365239620209,
-0.0566088929772377,
0.04010524973273277,
0.11625697463750839,
0.01347251795232296,
-0.028032569214701653,
-0.03316127508878708,
0.20813561975955963,
0.04167226701974869,
0.042582083493471146,
0.1257166564464569,
-0.0705031305551529,
0.053541239351034164,
0.0856618732213974,
0.004300168249756098,
-0.042789846658706665,
0.054580919444561005,
0.053052738308906555,
-0.06331091374158859,
-0.19127407670021057,
-0.00940880086272955,
0.0073594036512076855,
-0.04274126887321472,
0.06747625768184662,
0.038880057632923126,
0.003960604779422283,
0.07609748095273972,
0.01746862567961216,
0.06798708438873291,
0.009498910047113895,
0.10056393593549728,
0.033417996019124985,
-0.03507930785417557,
0.08020035922527313,
-0.00692793121561408,
-0.005737696774303913,
0.07741197943687439,
-0.011113856919109821,
0.2855304777622223,
-0.04839995130896568,
0.01793386973440647,
0.12306340038776398,
0.0342194102704525,
0.049206312745809555,
0.12208208441734314,
-0.07889000326395035,
0.028322935104370117,
-0.07378754019737244,
-0.04787728935480118,
0.016114678233861923,
0.04471972584724426,
-0.07620975375175476,
0.020994024351239204,
-0.08613380789756775,
0.027344290167093277,
-0.02220713160932064,
0.2895006537437439,
0.1012006402015686,
-0.11459404230117798,
-0.05983038246631622,
0.000009979967217077501,
-0.1014263927936554,
-0.06886044889688492,
0.05428546667098999,
0.04759213700890541,
-0.13185137510299683,
0.006803510244935751,
-0.022291090339422226,
0.07518830895423889,
-0.025940600782632828,
0.013141275383532047,
0.04195962846279144,
0.056758832186460495,
-0.04697052761912346,
0.008809017948806286,
-0.1755443513393402,
0.20199747383594513,
-0.003058076137676835,
0.021763496100902557,
-0.052222155034542084,
0.030646061524748802,
0.010111164301633835,
-0.01349385641515255,
0.06654554605484009,
0.0211201049387455,
-0.011599198915064335,
-0.056556496769189835,
-0.0442366860806942,
0.014993439428508282,
0.059406545013189316,
-0.04106294363737106,
0.10367611795663834,
0.006502795964479446,
0.05332766845822334,
0.031006107106804848,
0.08275117725133896,
-0.18640096485614777,
-0.08026672154664993,
0.028662128373980522,
-0.04959544166922569,
-0.10379473865032196,
-0.0801534578204155,
-0.09955417364835739,
-0.017272092401981354,
0.22296082973480225,
-0.10703930258750916,
-0.07362978905439377,
-0.0934097170829773,
0.049211569130420685,
0.10050405561923981,
-0.055286720395088196,
0.026303263381123543,
-0.009589381515979767,
0.10964121669530869,
-0.07248896360397339,
-0.12006217986345291,
0.02889202907681465,
-0.0967087596654892,
-0.1603659689426422,
-0.06701727956533432,
0.0846719965338707,
0.06405218690633774,
0.03044203482568264,
-0.03540449216961861,
0.011543347500264645,
0.04085168614983559,
-0.03955259546637535,
-0.00924593210220337,
0.06737112253904343,
0.08142899721860886,
0.03636639565229416,
-0.10683391243219376,
0.02203812263906002,
-0.07134734094142914,
-0.06886645406484604,
0.07185383886098862,
0.27818265557289124,
-0.05267404019832611,
0.11188170313835144,
0.1257449835538864,
-0.09008140116930008,
-0.16242991387844086,
0.04327823594212532,
0.09194325655698776,
-0.01615520752966404,
0.008934327401220798,
-0.15491248667240143,
0.10028783231973648,
0.12186487019062042,
-0.017355196177959442,
-0.0061492943204939365,
-0.20910799503326416,
-0.1379590928554535,
0.08695919811725616,
0.11489038914442062,
0.277753084897995,
-0.04969186708331108,
-0.037287693470716476,
0.02588694728910923,
-0.09962846338748932,
-0.0015670619904994965,
0.129159614443779,
0.0589471161365509,
-0.01883617229759693,
-0.06588342785835266,
0.01088880468159914,
-0.03832012042403221,
0.09206606447696686,
0.06367702037096024,
0.06646332889795303,
-0.005980658810585737,
-0.0050443061627447605,
-0.026943307369947433,
-0.04315710440278053,
0.07068943232297897,
0.04477502405643463,
0.051593076437711716,
-0.08441027253866196,
-0.03550395742058754,
-0.0765710100531578,
0.03657495602965355,
-0.032894767820835114,
-0.07624103128910065,
-0.06490090489387512,
0.07929155230522156,
0.06313663721084595,
-0.03530731424689293,
0.03706948831677437,
0.033044543117284775,
0.09839876741170883,
0.1517663300037384,
-0.002095666015520692,
-0.04154077544808388,
-0.05215171352028847,
-0.0242738276720047,
-0.014457948505878448,
0.07133018970489502,
-0.03918196260929108,
0.017373142763972282,
0.0731644406914711,
0.021342001855373383,
0.09994637221097946,
0.061549052596092224,
-0.12191098183393478,
-0.01920013129711151,
0.02704610675573349,
-0.15429481863975525,
0.015733834356069565,
-0.0018656124593690038,
0.01636793464422226,
-0.027194727212190628,
0.021618183702230453,
0.14838002622127533,
-0.07287716865539551,
-0.032130494713783264,
-0.04216867312788963,
0.06266696751117706,
0.0332530215382576,
0.14590796828269958,
0.039946503937244415,
0.03555372729897499,
-0.0752483531832695,
0.14487040042877197,
0.04655015096068382,
-0.05287381634116173,
0.023554231971502304,
-0.02672446146607399,
-0.10784877091646194,
0.016692383214831352,
0.06276595592498779,
0.06261523067951202,
-0.08068264275789261,
-0.013698515482246876,
-0.04116715118288994,
-0.07627969980239868,
0.06849470734596252,
0.22172966599464417,
0.06419125199317932,
0.06793059408664703,
-0.04885861277580261,
-0.030699973925948143,
-0.08169969171285629,
0.05141801759600639,
0.051568493247032166,
0.079775370657444,
-0.07820563018321991,
0.09673216938972473,
0.011455880478024483,
0.047421328723430634,
-0.026446226984262466,
-0.05018291994929314,
-0.10738826543092728,
-0.05515972152352333,
-0.09072977304458618,
0.012511475943028927,
-0.07097526639699936,
-0.03900943323969841,
0.006875392980873585,
-0.0011622440069913864,
-0.007846602238714695,
0.05669773370027542,
-0.060609690845012665,
-0.010967501439154148,
-0.01733352802693844,
0.03574370965361595,
-0.061352506279945374,
-0.051340896636247635,
0.02392040751874447,
-0.09350217878818512,
0.09832414984703064,
0.042910877615213394,
0.01284756138920784,
0.00029710668604820967,
0.06507714092731476,
-0.007962805218994617,
0.020502571016550064,
0.007691061124205589,
-0.04075944423675537,
-0.10299944132566452,
0.004881110042333603,
-0.025486377999186516,
-0.03363336622714996,
-0.017904773354530334,
0.09282655268907547,
-0.08304687589406967,
0.028261328116059303,
0.002182929078117013,
-0.004977675154805183,
-0.07947713136672974,
-0.006018564570695162,
0.09632600098848343,
0.08010413497686386,
0.04930919036269188,
-0.0847843587398529,
0.0129777193069458,
-0.1262422353029251,
-0.035769958049058914,
0.013556137681007385,
-0.01443472970277071,
-0.13026969134807587,
-0.006650025025010109,
0.018922781571745872,
-0.009704582393169403,
0.19855915009975433,
-0.059779226779937744,
-0.029146263375878334,
0.022373799234628677,
-0.09885246306657791,
0.1062731146812439,
-0.02124270610511303,
0.17199726402759552,
-0.028444377705454826,
-0.03604821860790253,
-0.009960376657545567,
0.04740681126713753,
0.027965407818555832,
-0.00818900391459465,
0.18707481026649475,
0.12924659252166748,
0.044708043336868286,
0.0598299615085125,
-0.02284000627696514,
-0.0023890400771051645,
-0.05260193720459938,
-0.024335097521543503,
0.04263179376721382,
0.03519224748015404,
0.021073326468467712,
0.1382899433374405,
0.07090634852647781,
-0.16226239502429962,
0.03564393147826195,
-0.025768257677555084,
-0.04376403987407684,
-0.12113294750452042,
-0.10791448503732681,
-0.02987079881131649,
-0.06745611131191254,
0.016105972230434418,
-0.13536731898784637,
0.003970998339354992,
0.1778864711523056,
0.0664219781756401,
0.03377171605825424,
0.019637655466794968,
-0.13364733755588531,
-0.04207053780555725,
0.05174003168940544,
0.011423404328525066,
0.025101445615291595,
0.04415333271026611,
-0.010064015164971352,
0.0674944669008255,
0.02750999853014946,
0.00094073754735291,
-0.006980047095566988,
0.08595745265483856,
0.024721255525946617,
0.046489447355270386,
-0.05507417768239975,
-0.0011668052757158875,
-0.04157446697354317,
0.08163566887378693,
0.11111070215702057,
0.04484442248940468,
-0.04915445297956467,
-0.011028258129954338,
0.16058945655822754,
-0.029746508225798607,
0.0072441850788891315,
-0.12448285520076752,
0.327044814825058,
0.019671892747282982,
0.010479525662958622,
0.05706396698951721,
-0.08213028311729431,
-0.046136412769556046,
0.20945961773395538,
0.07333256304264069,
-0.019376546144485474,
-0.028729835525155067,
0.004618046339601278,
-0.031081587076187134,
-0.015646258369088173,
0.1411910504102707,
0.04191384091973305,
0.11128402501344681,
-0.05465321242809296,
-0.05546313896775246,
-0.03490901365876198,
0.002176151843741536,
-0.11237545311450958,
0.14322759211063385,
-0.01937704347074032,
-0.020325208082795143,
-0.0776703730225563,
0.016465559601783752,
0.0693507194519043,
-0.3408910036087036,
-0.00012465343752410263,
-0.02795388177037239,
-0.10318567603826523,
-0.013928837142884731,
-0.02669743448495865,
-0.02490251325070858,
0.0454007163643837,
-0.038346026092767715,
0.0654987320303917,
0.04036388918757439,
0.04152579978108406,
-0.021377675235271454,
-0.10591178387403488,
0.1621781438589096,
0.07753264904022217,
0.11224818229675293,
0.01971791684627533,
0.0827082172036171,
0.06267275661230087,
0.03743337094783783,
-0.09701411426067352,
0.05272103101015091,
0.013745672069489956,
-0.06703422218561172,
-0.057719144970178604,
0.11464948952198029,
-0.000548075360711664,
0.07154539972543716,
0.029206741601228714,
-0.12224440276622772,
0.026926318183541298,
0.0744565799832344,
-0.08171649277210236,
-0.09719114750623703,
-0.004316363483667374,
-0.09743914753198624,
0.16063927114009857,
0.14759916067123413,
-0.009600548073649406,
0.0180889330804348,
-0.06247587874531746,
-0.012681575492024422,
0.04802622273564339,
0.012317528948187828,
-0.012830415740609169,
-0.18365120887756348,
0.04961233213543892,
-0.08933855593204498,
-0.003120320849120617,
-0.20790503919124603,
-0.10286050289869308,
-0.013282905332744122,
-0.057209957391023636,
-0.022644272074103355,
0.062476929277181625,
0.02573375776410103,
0.07301772385835648,
-0.02381633035838604,
-0.015715662389993668,
-0.03910958021879196,
0.09601729363203049,
-0.10735294967889786,
-0.07489005476236343
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1400k (max: 2000k, i.e., 2M steps).
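For robustness analyses it is usually the trajectory of checkpoints, not a single one, that matters. As a minimal sketch (not part of the original card), assuming the sibling checkpoints follow the same `google/multiberts-seed_1-step_<N>` naming pattern used in this collection:
```
from transformers import BertModel

# Illustrative subset of the saved steps for seed 1; the full list of saved
# checkpoints is given in the MultiBERTs release.
steps = ["140k", "1400k", "1500k"]
checkpoints = {
    step: BertModel.from_pretrained(f"google/multiberts-seed_1-step_{step}")
    for step in steps
}
# `checkpoints` maps each pre-training step to a loaded model, so the same
# probe or evaluation can be run across training time.
```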
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
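As an aside (not part of the original card), the two pre-training objectives can be inspected directly through the model's heads; this is a sketch and assumes the checkpoint ships the MLM and NSP head weights expected by `BertForPreTraining`:
```
import torch
from transformers import BertTokenizer, BertForPreTraining

name = "google/multiberts-seed_1-step_1400k"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForPreTraining.from_pretrained(name)

# A sentence pair with one masked token, mirroring the MLM and NSP setup.
inputs = tokenizer("The capital of France is [MASK].",
                   "It is a large city.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.prediction_logits.shape)        # MLM scores over the vocabulary
print(outputs.seq_relationship_logits.shape)  # NSP logits (is-next vs. not)
```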
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1400k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1400k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
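A quick way to sanity-check an intermediate checkpoint, beyond extracting hidden states, is masked-token prediction. This is a sketch and assumes the checkpoint's masked-language-modelling head loads into the `fill-mask` pipeline:
```
from transformers import pipeline

unmasker = pipeline(
    "fill-mask",
    model="google/multiberts-seed_1-step_1400k",
    tokenizer="google/multiberts-seed_1-step_1400k",
)
# Prints the top predictions for the masked token.
print(unmasker("The man worked as a [MASK]."))
```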
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1400k"]}
| null |
google/multiberts-seed_1-step_1400k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1400k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1400k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08295560628175735,
0.07951661944389343,
-0.0021468030754476786,
0.04633152857422829,
0.07842161506414413,
-0.016806982457637787,
0.06542429327964783,
0.09215400367975235,
-0.01879790797829628,
0.022070487961173058,
0.08227377384901047,
0.01916664093732834,
0.009336658753454685,
0.09251460433006287,
0.02325902320444584,
-0.21452537178993225,
0.031110813841223717,
-0.028520800173282623,
-0.0931028500199318,
0.07402195036411285,
0.10431720316410065,
-0.08292430639266968,
0.04162350669503212,
0.03457610309123993,
-0.11103933304548264,
0.05230708420276642,
-0.009780963882803917,
-0.0230619665235281,
0.13941366970539093,
0.001868769759312272,
0.05533537268638611,
0.05446086451411247,
0.046509258449077606,
-0.13486337661743164,
0.004818600602447987,
0.05591574311256409,
0.05086468160152435,
0.03889540210366249,
0.018542760983109474,
0.0837172344326973,
-0.02595187909901142,
0.033169858157634735,
0.05131591483950615,
0.01569681614637375,
-0.06556083261966705,
-0.07398074120283127,
-0.09774917364120483,
0.03629596531391144,
0.027461474761366844,
0.02026439644396305,
0.00926100555807352,
0.11342790722846985,
-0.03243792802095413,
0.043288953602313995,
0.17104613780975342,
-0.3082742989063263,
-0.003030880121514201,
0.05769812688231468,
0.01994285359978676,
0.11553807556629181,
-0.004827454220503569,
-0.03338157758116722,
0.08073794841766357,
0.025554243475198746,
0.09710346162319183,
-0.03963397815823555,
0.016086168587207794,
-0.059309668838977814,
-0.15584836900234222,
-0.03757006675004959,
0.09549292922019958,
0.0027715086471289396,
-0.13780774176120758,
-0.02045978605747223,
-0.044841982424259186,
0.04032065346837044,
0.018061576411128044,
-0.040606796741485596,
0.04422617331147194,
0.0030308510176837444,
0.0016046448145061731,
-0.00396899925544858,
-0.1026126891374588,
-0.0425226129591465,
0.021856553852558136,
0.09605980664491653,
0.10950473695993423,
0.0570814423263073,
-0.005281717516481876,
0.10711620002985,
-0.18757127225399017,
-0.04870332032442093,
-0.02842365950345993,
-0.034895021468400955,
-0.04499349370598793,
-0.011342693120241165,
-0.10236958414316177,
-0.03915192559361458,
0.00012556852016132325,
0.13574354350566864,
-0.025833798572421074,
0.03233715146780014,
-0.01663833111524582,
0.00434389291331172,
0.05755709111690521,
0.047710761427879333,
-0.025002937763929367,
0.02329276129603386,
0.037205521017313004,
-0.015915758907794952,
-0.019966289401054382,
0.010191409848630428,
-0.0020852824673056602,
0.022550169378519058,
0.13475459814071655,
0.011369099840521812,
-0.1030464768409729,
0.07660786807537079,
-0.01215115562081337,
-0.04393377527594566,
-0.0034426955971866846,
-0.08731014281511307,
-0.06359265744686127,
-0.03969363123178482,
-0.01618865132331848,
0.0034185010008513927,
0.007478742860257626,
-0.009651376865804195,
-0.028268059715628624,
-0.01973429135978222,
-0.09190467745065689,
-0.059534553438425064,
-0.0520055927336216,
-0.13581301271915436,
0.009411536157131195,
-0.18283146619796753,
-0.028718506917357445,
-0.11702031642198563,
-0.2046576589345932,
-0.039346978068351746,
0.04832056164741516,
0.0036838671658188105,
-0.06702273339033127,
0.06337623298168182,
0.03482482209801674,
-0.030160866677761078,
-0.0032187900505959988,
0.07947619259357452,
-0.005853427108377218,
0.03781807795166969,
-0.03858532756567001,
0.058542296290397644,
0.0035269479267299175,
0.042896728962659836,
-0.06033600494265556,
0.05741351842880249,
-0.18129993975162506,
0.040089111775159836,
-0.0735299289226532,
-0.025323430076241493,
-0.08491313457489014,
-0.035276975482702255,
-0.006479233503341675,
0.013198760338127613,
0.02507614716887474,
0.0749397873878479,
-0.16318076848983765,
-0.029543332755565643,
0.08434094488620758,
-0.15401200950145721,
-0.030957119539380074,
0.0732644721865654,
-0.05516485497355461,
0.11730857938528061,
0.06620156764984131,
0.1615062952041626,
-0.017499404028058052,
-0.0711720660328865,
0.04871238395571709,
-0.010934489779174328,
0.007320738397538662,
-0.011080638505518436,
0.06690943241119385,
-0.0196206234395504,
-0.16376112401485443,
0.02421901933848858,
-0.1331072747707367,
0.00410013971850276,
-0.07781126350164413,
0.02938990667462349,
-0.0031257662922143936,
-0.0688977912068367,
-0.07928775250911713,
-0.03373689949512482,
0.07798554003238678,
-0.06754802167415619,
-0.023638643324375153,
0.04510720819234848,
0.07937260717153549,
-0.07238201051950455,
0.07011359184980392,
-0.013440263457596302,
0.02432345412671566,
-0.07771506160497665,
-0.0359254851937294,
-0.18578946590423584,
0.031275711953639984,
0.09274636209011078,
0.00793434027582407,
-0.020933622494339943,
0.12141598761081696,
-0.006158418022096157,
0.06761374324560165,
-0.04366057738661766,
-0.0009290518355555832,
-0.009053384885191917,
0.001130259595811367,
-0.09430079162120819,
-0.11199109256267548,
-0.07337876409292221,
-0.06658709794282913,
0.10141874104738235,
-0.1179887130856514,
0.023915452882647514,
-0.06023311987519264,
0.04188459366559982,
0.014617519453167915,
-0.06870100647211075,
-0.011504342779517174,
0.014048933051526546,
-0.06366396695375443,
-0.05717521905899048,
0.03730572387576103,
0.06413602083921432,
-0.020296672359108925,
0.0972510576248169,
-0.046974290162324905,
-0.08194966614246368,
0.02874309942126274,
0.07083833962678909,
-0.10709472745656967,
0.022053280845284462,
-0.04673509672284126,
-0.048809975385665894,
-0.06650793552398682,
-0.0295026246458292,
0.10008023679256439,
-0.014876135624945164,
0.14572419226169586,
-0.07680996507406235,
-0.010669954121112823,
0.011114479973912239,
-0.01379918772727251,
-0.024744242429733276,
0.044677626341581345,
0.06377241760492325,
-0.06439574807882309,
0.02257239632308483,
0.027894163504242897,
-0.0018017858965322375,
0.06519642472267151,
-0.05138648301362991,
-0.07564051449298859,
0.019835421815514565,
0.031182976439595222,
0.02076490968465805,
0.06164860352873802,
-0.04825009033083916,
-0.009744909591972828,
0.030572380870580673,
0.02460205741226673,
0.012648156844079494,
-0.11706799268722534,
0.060724515467882156,
0.062475234270095825,
0.009972051717340946,
0.046935826539993286,
-0.019827349111437798,
-0.03364630043506622,
0.0807548239827156,
0.03197607398033142,
-0.012714390642940998,
-0.009169043973088264,
-0.010628561489284039,
-0.12326370924711227,
0.21711035072803497,
-0.06943494081497192,
-0.14898115396499634,
-0.07525142282247543,
-0.11538849771022797,
0.003188151167705655,
0.02295069396495819,
0.04384980723261833,
-0.026636067777872086,
-0.04073714464902878,
-0.12156759947538376,
0.09371185302734375,
-0.03674802556633949,
0.06618663668632507,
0.11159395426511765,
-0.06423420459032059,
0.050724294036626816,
-0.13086090981960297,
-0.009549261070787907,
-0.07609222829341888,
-0.05338175967335701,
0.0550757497549057,
-0.05181436985731125,
0.0388394296169281,
0.11530351638793945,
0.015259131789207458,
-0.030250590294599533,
-0.02982277236878872,
0.21288424730300903,
0.04046960920095444,
0.04294851794838905,
0.1317480057477951,
-0.07191763818264008,
0.05416703224182129,
0.08087094873189926,
0.0068151396699249744,
-0.04299834370613098,
0.05198393762111664,
0.05292768031358719,
-0.05937708914279938,
-0.19145211577415466,
-0.010561694391071796,
0.006504615303128958,
-0.04883451759815216,
0.06798487901687622,
0.03684105724096298,
0.010797438211739063,
0.0742332860827446,
0.018412817269563675,
0.0703958123922348,
0.004231453873217106,
0.10057348012924194,
0.03795088827610016,
-0.03603828698396683,
0.08229144662618637,
-0.008626246824860573,
-0.005901213735342026,
0.07939706742763519,
-0.020430175587534904,
0.2931070029735565,
-0.04818126931786537,
0.02051941119134426,
0.12503087520599365,
0.036851413547992706,
0.05095778405666351,
0.12086094170808792,
-0.07821878045797348,
0.02906774915754795,
-0.07437046617269516,
-0.04821011796593666,
0.016464432701468468,
0.04550381749868393,
-0.07726757973432541,
0.019398603588342667,
-0.08804941922426224,
0.024616189301013947,
-0.024501387029886246,
0.3001994490623474,
0.10399891436100006,
-0.11363875865936279,
-0.06594134122133255,
0.0005739149637520313,
-0.10054313391447067,
-0.07269375771284103,
0.050387006253004074,
0.05539627745747566,
-0.13004963099956512,
0.0044623371213674545,
-0.019910894334316254,
0.07649363577365875,
-0.02466043084859848,
0.015960760414600372,
0.043614696711301804,
0.05674774944782257,
-0.047577813267707825,
0.007584789767861366,
-0.1757369488477707,
0.1967840939760208,
-0.0015300553059205413,
0.022857466712594032,
-0.051261741667985916,
0.028634531423449516,
0.012421035207808018,
-0.004379553254693747,
0.061161383986473083,
0.022638345137238503,
-0.008505003526806831,
-0.05728611722588539,
-0.039532504975795746,
0.012692988850176334,
0.06256116926670074,
-0.0404520183801651,
0.10470259934663773,
0.005672695580869913,
0.0530499629676342,
0.03042927011847496,
0.08276759833097458,
-0.18254947662353516,
-0.07793855667114258,
0.028060365468263626,
-0.04893752932548523,
-0.09843502938747406,
-0.08133699744939804,
-0.09997120499610901,
-0.010872448794543743,
0.21921299397945404,
-0.10698176175355911,
-0.0763765200972557,
-0.09497175365686417,
0.04266941547393799,
0.0992371141910553,
-0.055481601506471634,
0.026836935430765152,
-0.009806348010897636,
0.11360321938991547,
-0.06924430280923843,
-0.12172544002532959,
0.025705156847834587,
-0.09769321233034134,
-0.15833351016044617,
-0.06785356253385544,
0.08791203796863556,
0.06320633739233017,
0.02962315082550049,
-0.03329463675618172,
0.010076233185827732,
0.03603746369481087,
-0.038092318922281265,
-0.003779576625674963,
0.05554387345910072,
0.09246987104415894,
0.0365983210504055,
-0.10935896635055542,
0.018657173961400986,
-0.07126593589782715,
-0.06741012632846832,
0.06976784020662308,
0.2731113135814667,
-0.052713144570589066,
0.10924037545919418,
0.13288769125938416,
-0.0880669578909874,
-0.15979556739330292,
0.043394435197114944,
0.0922766774892807,
-0.015104681253433228,
0.001825730549171567,
-0.1608675718307495,
0.102803073823452,
0.11617311835289001,
-0.01570919156074524,
0.005513004958629608,
-0.2006065398454666,
-0.13898900151252747,
0.0873442068696022,
0.11267213523387909,
0.28029435873031616,
-0.052902139723300934,
-0.037148457020521164,
0.021741697564721107,
-0.10032250732183456,
-0.0008214333211071789,
0.12468111515045166,
0.05966801196336746,
-0.0210726261138916,
-0.06702658534049988,
0.01029803603887558,
-0.03651857376098633,
0.09515941888093948,
0.06571494042873383,
0.06894617527723312,
-0.00553381722420454,
-0.0006533505511470139,
-0.029362615197896957,
-0.04264969751238823,
0.06907729804515839,
0.042567066848278046,
0.05127669870853424,
-0.09139734506607056,
-0.03497498482465744,
-0.07643575966358185,
0.03408006578683853,
-0.03223760798573494,
-0.07641249895095825,
-0.06174304708838463,
0.07854998856782913,
0.05880144611001015,
-0.03483940660953522,
0.032094091176986694,
0.03312548249959946,
0.09909171611070633,
0.14792227745056152,
-0.0066805751994252205,
-0.04580285772681236,
-0.06437089294195175,
-0.025769194588065147,
-0.013753237202763557,
0.07379644364118576,
-0.04885043576359749,
0.018943384289741516,
0.07328055053949356,
0.021925922483205795,
0.10517045110464096,
0.06073974817991257,
-0.1205209419131279,
-0.020034562796354294,
0.028081024065613747,
-0.15575669705867767,
0.01281223725527525,
-0.0013786850031465292,
0.009072724729776382,
-0.023548295721411705,
0.022151360288262367,
0.14889150857925415,
-0.07081170380115509,
-0.031744442880153656,
-0.04535806179046631,
0.060277312994003296,
0.032267287373542786,
0.1476125568151474,
0.0427534356713295,
0.037516456097364426,
-0.07723342627286911,
0.14293289184570312,
0.04577145352959633,
-0.04596833512187004,
0.023372778668999672,
-0.02807549573481083,
-0.11007361859083176,
0.014203431084752083,
0.059603750705718994,
0.06917116791009903,
-0.07489845901727676,
-0.01971174031496048,
-0.04138284921646118,
-0.0807720422744751,
0.06860306113958359,
0.21401304006576538,
0.06341910362243652,
0.06571868062019348,
-0.05087864771485329,
-0.03421201929450035,
-0.08033861219882965,
0.051098160445690155,
0.052680738270282745,
0.07825993746519089,
-0.07781071960926056,
0.10442503541707993,
0.012749985791742802,
0.053261566907167435,
-0.027475791051983833,
-0.04971765726804733,
-0.10476891696453094,
-0.057247936725616455,
-0.1144741103053093,
0.013147080317139626,
-0.0660746619105339,
-0.03907076269388199,
0.00585654191672802,
-0.004135522525757551,
-0.006016931962221861,
0.05677211284637451,
-0.06077950447797775,
-0.010892408899962902,
-0.011666323058307171,
0.03542538732290268,
-0.06164810433983803,
-0.05281816050410271,
0.020375296473503113,
-0.09585162252187729,
0.10124971717596054,
0.04865024611353874,
0.012048427015542984,
0.00043569589615799487,
0.08673404902219772,
-0.00967259518802166,
0.021397938951849937,
0.0073873004876077175,
-0.03876943141222,
-0.10462546348571777,
0.005846301093697548,
-0.027213437482714653,
-0.03507963567972183,
-0.018064508214592934,
0.09077411890029907,
-0.08039353042840958,
0.03224489092826843,
0.00044740570592693985,
-0.003095267340540886,
-0.07970523834228516,
-0.00724645284935832,
0.09808520972728729,
0.08098505437374115,
0.05422127619385719,
-0.08293411135673523,
0.014104669913649559,
-0.12715871632099152,
-0.03675570338964462,
0.012353092432022095,
-0.014883071184158325,
-0.12551362812519073,
-0.011684243567287922,
0.01757463812828064,
-0.011246697045862675,
0.19135233759880066,
-0.06059610843658447,
-0.025049107149243355,
0.021902434527873993,
-0.09138601273298264,
0.09675946831703186,
-0.020573338493704796,
0.16406847536563873,
-0.030689148232340813,
-0.03631305322051048,
-0.012569107115268707,
0.049149755388498306,
0.029371459037065506,
-0.0026049413718283176,
0.18200531601905823,
0.13289155066013336,
0.045106448233127594,
0.05675109475851059,
-0.024310262873768806,
-0.006626366171985865,
-0.0610249824821949,
-0.02029985561966896,
0.04136044159531593,
0.03537008538842201,
0.023151330649852753,
0.1462719589471817,
0.06576164066791534,
-0.16098207235336304,
0.037137702107429504,
-0.029292840510606766,
-0.04278094694018364,
-0.11477570980787277,
-0.0925886332988739,
-0.028227977454662323,
-0.06458628922700882,
0.013330703601241112,
-0.133217915892601,
0.003147541545331478,
0.1770004779100418,
0.0656951367855072,
0.02942218817770481,
0.017957769334316254,
-0.1325850635766983,
-0.04523376375436783,
0.05514523386955261,
0.012724540196359158,
0.023264719173312187,
0.04588370770215988,
-0.0061803292483091354,
0.0672045648097992,
0.027595261111855507,
0.000607222318649292,
-0.005503413733094931,
0.08302997052669525,
0.024921905249357224,
0.045676399022340775,
-0.05629601329565048,
-0.00035104010021314025,
-0.03915180265903473,
0.08104541152715683,
0.11180324852466583,
0.04262356087565422,
-0.04819769412279129,
-0.012322770431637764,
0.16297647356987,
-0.02842382900416851,
0.005118723958730698,
-0.12244527786970139,
0.32432281970977783,
0.017285309731960297,
0.010058192536234856,
0.058099664747714996,
-0.08134179562330246,
-0.04580199718475342,
0.21034447848796844,
0.07054416835308075,
-0.023908885195851326,
-0.029407557100057602,
0.005618114490061998,
-0.031516920775175095,
-0.01464837696403265,
0.14184457063674927,
0.043792322278022766,
0.11689415574073792,
-0.053371574729681015,
-0.04905315488576889,
-0.03818279504776001,
0.003752683522179723,
-0.11316338181495667,
0.1520179808139801,
-0.021101457998156548,
-0.020474566146731377,
-0.0770014226436615,
0.01376816164702177,
0.07358364015817642,
-0.35057634115219116,
0.0061776116490364075,
-0.026842636987566948,
-0.10570574551820755,
-0.014772368595004082,
-0.029171330854296684,
-0.024957384914159775,
0.04713839665055275,
-0.038756195455789566,
0.06405241042375565,
0.0500817634165287,
0.039042383432388306,
-0.021100034937262535,
-0.10650358349084854,
0.16524063050746918,
0.06470771133899689,
0.10550973564386368,
0.017724957317113876,
0.08572502434253693,
0.061145368963479996,
0.038315299898386,
-0.09586312621831894,
0.04984021559357643,
0.00954907014966011,
-0.06976337730884552,
-0.057123176753520966,
0.11376530677080154,
0.00023629293718840927,
0.0576380118727684,
0.031029507517814636,
-0.12376611679792404,
0.022927789017558098,
0.07870668172836304,
-0.08757614344358444,
-0.09494521468877792,
-0.0055502899922430515,
-0.0960066094994545,
0.15732164680957794,
0.14607387781143188,
-0.011092806234955788,
0.016242733225226402,
-0.06256705522537231,
-0.010902306996285915,
0.053270019590854645,
0.009484315291047096,
-0.012746594846248627,
-0.1849062740802765,
0.05296587198972702,
-0.09244776517152786,
-0.0036999466829001904,
-0.21505476534366608,
-0.10247719287872314,
-0.010338635183870792,
-0.05564277619123459,
-0.021739821881055832,
0.06093614920973778,
0.026435717940330505,
0.07570801675319672,
-0.02430289052426815,
-0.02026003785431385,
-0.03993535414338112,
0.09568426758050919,
-0.10260890424251556,
-0.07581952214241028
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 140k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 140k (max: 2000k, i.e., 2M steps).
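To get a feel for how far this early checkpoint still is from a late one, a rough comparison (a sketch, reusing the seed-1 1400k checkpoint named elsewhere in this collection) could be:
```
import torch
from transformers import BertTokenizer, BertModel

early = "google/multiberts-seed_1-step_140k"
late = "google/multiberts-seed_1-step_1400k"

tokenizer = BertTokenizer.from_pretrained(early)
inputs = tokenizer("Replace me by any text you'd like.", return_tensors="pt")

with torch.no_grad():
    h_early = BertModel.from_pretrained(early)(**inputs).last_hidden_state
    h_late = BertModel.from_pretrained(late)(**inputs).last_hidden_state

# Cosine similarity of the [CLS] representations from the two checkpoints.
print(torch.cosine_similarity(h_early[:, 0], h_late[:, 0]))
```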
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_140k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_140k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_140k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_140k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
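For a closer look at what the masked-language-modelling head predicts at step 140k, here is a sketch that assumes the MLM head weights load into `BertForMaskedLM`:
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

name = "google/multiberts-seed_1-step_140k"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForMaskedLM.from_pretrained(name)

inputs = tokenizer("Paris is the [MASK] of France.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Top-5 candidate tokens for the masked position.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
top5 = torch.topk(logits[0, mask_index], k=5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top5))
```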
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_140k"]}
| null |
google/multiberts-seed_1-step_140k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_140k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 140k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 140k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 140k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 140k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 140k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08152281492948532,
0.07924016565084457,
-0.002167982282117009,
0.043680764734745026,
0.07867340743541718,
-0.01682695746421814,
0.06628620624542236,
0.09263741225004196,
-0.018027042970061302,
0.02229171060025692,
0.0835394561290741,
0.024562537670135498,
0.009144207462668419,
0.09508022665977478,
0.021415015682578087,
-0.21500952541828156,
0.03148404508829117,
-0.026989826932549477,
-0.09096403419971466,
0.07411763817071915,
0.10423241555690765,
-0.08308357745409012,
0.040814921259880066,
0.034960951656103134,
-0.1124793142080307,
0.05223572626709938,
-0.009000147692859173,
-0.021982014179229736,
0.13827146589756012,
0.002925769193097949,
0.055781666189432144,
0.054740555584430695,
0.045088183134794235,
-0.13692940771579742,
0.0052107893861830235,
0.056399744004011154,
0.05066375806927681,
0.03942187875509262,
0.018889663740992546,
0.0838414803147316,
-0.025350023061037064,
0.03284401074051857,
0.05026073008775711,
0.01555296778678894,
-0.06504187732934952,
-0.07907681167125702,
-0.09714537113904953,
0.031928762793540955,
0.027323678135871887,
0.020179415121674538,
0.008411522023379803,
0.11495267599821091,
-0.03300478681921959,
0.04359829053282738,
0.1734619289636612,
-0.31423455476760864,
-0.0021766333375126123,
0.05853530392050743,
0.022231033071875572,
0.10998030006885529,
-0.004182934295386076,
-0.03206856548786163,
0.07997016608715057,
0.02472003549337387,
0.09999043494462967,
-0.039185840636491776,
0.01597142405807972,
-0.05920714884996414,
-0.15704183280467987,
-0.03717174753546715,
0.09689605981111526,
0.0030892996583133936,
-0.14009006321430206,
-0.020753122866153717,
-0.04421175643801689,
0.043248437345027924,
0.017514387145638466,
-0.040300894528627396,
0.043339625000953674,
0.005207956302911043,
-0.0005806379485875368,
-0.0036891803611069918,
-0.10463406890630722,
-0.04439273849129677,
0.022283369675278664,
0.09477387368679047,
0.10960129648447037,
0.05699337646365166,
-0.005419198423624039,
0.10798103362321854,
-0.1901610791683197,
-0.04789688065648079,
-0.0279026310890913,
-0.034462206065654755,
-0.045106347650289536,
-0.010879524052143097,
-0.10393977165222168,
-0.043448660522699356,
0.0018558695446699858,
0.13542310893535614,
-0.022976160049438477,
0.03159768506884575,
-0.01918564736843109,
0.0049288771115243435,
0.058471791446208954,
0.05142861604690552,
-0.023821048438549042,
0.026791250333189964,
0.03739606961607933,
-0.017930518835783005,
-0.020263796672225,
0.010011314414441586,
-0.0037480576429516077,
0.024164186790585518,
0.13494181632995605,
0.011941801756620407,
-0.10641376674175262,
0.0786542296409607,
-0.014052812941372395,
-0.044317301362752914,
-0.0021410344634205103,
-0.0866268128156662,
-0.06274443119764328,
-0.03762643039226532,
-0.01508991327136755,
0.006380344741046429,
0.00661213044077158,
-0.00927196815609932,
-0.02702862210571766,
-0.019437002018094063,
-0.08991022408008575,
-0.057980652898550034,
-0.05330085754394531,
-0.1371985822916031,
0.009328640066087246,
-0.18531125783920288,
-0.027622969821095467,
-0.11630400270223618,
-0.20427536964416504,
-0.037377238273620605,
0.04820137843489647,
0.0035540920216590166,
-0.06492992490530014,
0.06208512932062149,
0.03480443358421326,
-0.029190510511398315,
-0.0031401189044117928,
0.08345645666122437,
-0.007241206709295511,
0.03782538324594498,
-0.03755975142121315,
0.05857590213418007,
0.0038675088435411453,
0.04260663315653801,
-0.059182506054639816,
0.057830847799777985,
-0.17326012253761292,
0.03919953107833862,
-0.07380339503288269,
-0.029563015326857567,
-0.08587323129177094,
-0.03364841639995575,
-0.004740053787827492,
0.014106937684118748,
0.02479683980345726,
0.07315339893102646,
-0.1673264503479004,
-0.02976115420460701,
0.09062730520963669,
-0.15236055850982666,
-0.028672771528363228,
0.07448850572109222,
-0.056019578129053116,
0.11745511740446091,
0.06613973528146744,
0.15880417823791504,
-0.020939989015460014,
-0.0737837478518486,
0.049191929399967194,
-0.010012940503656864,
0.00863059051334858,
-0.008466359227895737,
0.06784720718860626,
-0.018011782318353653,
-0.16487328708171844,
0.02464648336172104,
-0.1338024139404297,
0.004032760392874479,
-0.07828909903764725,
0.030606741085648537,
-0.003932911437004805,
-0.06867362558841705,
-0.07732395827770233,
-0.03486928343772888,
0.07829499244689941,
-0.06706520169973373,
-0.02636176533997059,
0.0448029488325119,
0.07873006165027618,
-0.0747198686003685,
0.0679309293627739,
-0.015457362867891788,
0.02342926897108555,
-0.0793851837515831,
-0.03566456958651543,
-0.18513962626457214,
0.03328404203057289,
0.09504320472478867,
0.010933144949376583,
-0.022099830210208893,
0.12590159475803375,
-0.006350491661578417,
0.06762940436601639,
-0.042051948606967926,
-0.0015421854332089424,
-0.01030686404556036,
0.0008608673233538866,
-0.09410630166530609,
-0.11087349057197571,
-0.07460401207208633,
-0.06869274377822876,
0.09964897483587265,
-0.12087398767471313,
0.023013532161712646,
-0.059659119695425034,
0.04135792702436447,
0.015401161275804043,
-0.06806650012731552,
-0.01068869885057211,
0.013804435729980469,
-0.06412239372730255,
-0.05657266825437546,
0.03750982880592346,
0.06381604075431824,
-0.021600615233182907,
0.09884604066610336,
-0.04868626967072487,
-0.08191636949777603,
0.028468087315559387,
0.07033272832632065,
-0.10666772723197937,
0.021590525284409523,
-0.04680836573243141,
-0.047838244587183,
-0.06628801673650742,
-0.02909153141081333,
0.10027675330638885,
-0.014661233872175217,
0.1464209258556366,
-0.07635325938463211,
-0.012376234866678715,
0.010320364497601986,
-0.014770531095564365,
-0.024970512837171555,
0.04428260028362274,
0.06575874239206314,
-0.07016483694314957,
0.02350624091923237,
0.03134477511048317,
-0.004567049443721771,
0.06799134612083435,
-0.05214101821184158,
-0.07725328207015991,
0.019357403740286827,
0.032114531844854355,
0.021392574533820152,
0.061932291835546494,
-0.04982354864478111,
-0.010179325938224792,
0.030718544498085976,
0.022837473079562187,
0.012032226659357548,
-0.11829392611980438,
0.06082111969590187,
0.06231006607413292,
0.00784105435013771,
0.05226093903183937,
-0.01880275458097458,
-0.03474746644496918,
0.08020544052124023,
0.03132504224777222,
-0.012930138036608696,
-0.008830660954117775,
-0.010316729545593262,
-0.12305682897567749,
0.21811716258525848,
-0.06976307183504105,
-0.15182146430015564,
-0.07246848940849304,
-0.11332739889621735,
0.005687633529305458,
0.02373393252491951,
0.04454352334141731,
-0.026645217090845108,
-0.04168950766324997,
-0.12308382242918015,
0.0949828252196312,
-0.038306962698698044,
0.06682440638542175,
0.10988548398017883,
-0.06562404334545135,
0.04994351044297218,
-0.1308678686618805,
-0.00957686547189951,
-0.07582229375839233,
-0.05699646845459938,
0.056460972875356674,
-0.052102357149124146,
0.037504736334085464,
0.11316852271556854,
0.015902118757367134,
-0.029503514990210533,
-0.029912197962403297,
0.21064886450767517,
0.04281027242541313,
0.042001813650131226,
0.1338203251361847,
-0.07314291596412659,
0.05375947803258896,
0.07595356553792953,
0.005415558815002441,
-0.0437200590968132,
0.051923345774412155,
0.05409345030784607,
-0.059485625475645065,
-0.19118668138980865,
-0.008797744289040565,
0.007538401521742344,
-0.04794591665267944,
0.06696905195713043,
0.037134137004613876,
0.010264684446156025,
0.07595521956682205,
0.01794802024960518,
0.06972166895866394,
0.00413166917860508,
0.1008409932255745,
0.03772168606519699,
-0.036624353379011154,
0.0844620019197464,
-0.008487124927341938,
-0.006610690616071224,
0.07804068922996521,
-0.01872352883219719,
0.2888253629207611,
-0.04605893790721893,
0.020414909347891808,
0.1261201798915863,
0.036052364856004715,
0.05177508667111397,
0.12198588997125626,
-0.07905875891447067,
0.027969922870397568,
-0.07567649334669113,
-0.048334818333387375,
0.015204708091914654,
0.04509104788303375,
-0.07568683475255966,
0.01740260422229767,
-0.08735150843858719,
0.02248857542872429,
-0.025854378938674927,
0.2986893057823181,
0.1066032275557518,
-0.11377574503421783,
-0.06595425307750702,
-0.00007711492798989639,
-0.09887906908988953,
-0.07310321182012558,
0.048845045268535614,
0.054931897670030594,
-0.1307239532470703,
0.0034235711209475994,
-0.021975375711917877,
0.07686147838830948,
-0.02574329264461994,
0.01612953096628189,
0.042780064046382904,
0.057197507470846176,
-0.04619146138429642,
0.008370955474674702,
-0.18052512407302856,
0.19324034452438354,
-0.001129130832850933,
0.02379540540277958,
-0.05277230963110924,
0.029533548280596733,
0.010822618380188942,
-0.008591635152697563,
0.059513792395591736,
0.0197906494140625,
-0.008016691543161869,
-0.06012628227472305,
-0.04012538865208626,
0.013655992224812508,
0.06377066671848297,
-0.041905421763658524,
0.10603775084018707,
0.0051284292712807655,
0.05289695784449577,
0.03167472407221794,
0.08716287463903427,
-0.1850639432668686,
-0.07864467799663544,
0.027432069182395935,
-0.046135932207107544,
-0.09749268740415573,
-0.0810292661190033,
-0.09917134791612625,
-0.009841818362474442,
0.22426943480968475,
-0.11008408665657043,
-0.07441608607769012,
-0.09431774169206619,
0.04459789767861366,
0.09814812988042831,
-0.054767366498708725,
0.026408758014440536,
-0.011689840815961361,
0.1151687353849411,
-0.06873118132352829,
-0.12345228344202042,
0.026263095438480377,
-0.09893477708101273,
-0.15991650521755219,
-0.0684109553694725,
0.08881130814552307,
0.0634612962603569,
0.028795704245567322,
-0.03056234121322632,
0.009652688167989254,
0.03647866100072861,
-0.03791790455579758,
-0.0022639131639152765,
0.059105824679136276,
0.09270016103982925,
0.03585213050246239,
-0.11147349327802658,
0.021358121186494827,
-0.06996284425258636,
-0.06764824688434601,
0.07238312810659409,
0.27228593826293945,
-0.052730172872543335,
0.11003366857767105,
0.1261577159166336,
-0.08776073902845383,
-0.158359095454216,
0.04076842591166496,
0.09534405171871185,
-0.014207322150468826,
0.0033119716681540012,
-0.16434568166732788,
0.10226702690124512,
0.11561009287834167,
-0.015909677371382713,
0.00811502430588007,
-0.20008687674999237,
-0.13702654838562012,
0.08902754634618759,
0.11463463306427002,
0.2801952660083771,
-0.054581981152296066,
-0.036088671535253525,
0.020748047158122063,
-0.0962127223610878,
0.008724300190806389,
0.12188392877578735,
0.06146971136331558,
-0.020149843767285347,
-0.06680960953235626,
0.010442852973937988,
-0.03598523139953613,
0.09461262822151184,
0.06260757148265839,
0.06907844543457031,
-0.004139664117246866,
0.0016392553225159645,
-0.02841593697667122,
-0.04092742130160332,
0.07126359641551971,
0.03659931942820549,
0.05072581022977829,
-0.09119686484336853,
-0.03696271404623985,
-0.07432029396295547,
0.03457766771316528,
-0.0317225307226181,
-0.07638580352067947,
-0.06003192812204361,
0.07567037642002106,
0.057373810559511185,
-0.03356381505727768,
0.03188043832778931,
0.033104877918958664,
0.09870335459709167,
0.1468416005373001,
-0.006033651065081358,
-0.03943381458520889,
-0.06387370079755783,
-0.028345242142677307,
-0.01383359357714653,
0.07568470388650894,
-0.04659004881978035,
0.019381439313292503,
0.07082586735486984,
0.02031640335917473,
0.10613451153039932,
0.060052432119846344,
-0.12270819395780563,
-0.01922515779733658,
0.026104915887117386,
-0.1566251814365387,
0.014156661927700043,
0.000033021875424310565,
0.01295486744493246,
-0.02159269154071808,
0.024800943210721016,
0.15019528567790985,
-0.06942084431648254,
-0.03198603540658951,
-0.04587769880890846,
0.059817124158144,
0.03331728279590607,
0.14782710373401642,
0.04118410497903824,
0.03749406710267067,
-0.07610233873128891,
0.1399257332086563,
0.043918177485466,
-0.0424712710082531,
0.0267417561262846,
-0.03326163813471794,
-0.10828462243080139,
0.0135076604783535,
0.06075681373476982,
0.07099059224128723,
-0.07317027449607849,
-0.021458027884364128,
-0.03999795392155647,
-0.07961299270391464,
0.06850321590900421,
0.2151709347963333,
0.06425074487924576,
0.06911297142505646,
-0.052062731236219406,
-0.03422316908836365,
-0.07951013743877411,
0.05009479448199272,
0.05253174528479576,
0.07806523889303207,
-0.07822631299495697,
0.104234479367733,
0.01335942279547453,
0.05260743945837021,
-0.027825264260172844,
-0.04948478564620018,
-0.10566350072622299,
-0.05827752500772476,
-0.1143168956041336,
0.015986252576112747,
-0.06442154198884964,
-0.040476296097040176,
0.00736644072458148,
-0.003311833832412958,
-0.003032524371519685,
0.0570792481303215,
-0.06046891584992409,
-0.010839507915079594,
-0.013223012909293175,
0.03464370593428612,
-0.06338030844926834,
-0.05347536504268646,
0.019250117242336273,
-0.096770279109478,
0.10077552497386932,
0.04947766289114952,
0.012866019271314144,
0.0022329052444547415,
0.07540344446897507,
-0.00807651225477457,
0.02323652245104313,
0.007066584192216396,
-0.04097464680671692,
-0.10293697565793991,
0.006970060057938099,
-0.026827136054635048,
-0.034396182745695114,
-0.01946329139173031,
0.09417042881250381,
-0.08095131814479828,
0.02859754115343094,
0.0005110198981128633,
-0.0020161487627774477,
-0.07805324345827103,
-0.007166601251810789,
0.095649853348732,
0.08360035717487335,
0.05364999175071716,
-0.08373574167490005,
0.01525474339723587,
-0.12632326781749725,
-0.036408115178346634,
0.011661822907626629,
-0.014149551279842854,
-0.12754476070404053,
-0.011730657890439034,
0.017291609197854996,
-0.012287145480513573,
0.19274602830410004,
-0.057780686765909195,
-0.02300386317074299,
0.018969139084219933,
-0.09587220102548599,
0.10575626790523529,
-0.02010681852698326,
0.16671118140220642,
-0.03007262572646141,
-0.03519637882709503,
-0.01565776951611042,
0.04883241653442383,
0.028749460354447365,
-0.005999366287142038,
0.1792578399181366,
0.13147832453250885,
0.036490730941295624,
0.05683763325214386,
-0.025746837258338928,
-0.008086130954325199,
-0.058824777603149414,
-0.021417662501335144,
0.04074340686202049,
0.03526623174548149,
0.021940702572464943,
0.1535043865442276,
0.06389413774013519,
-0.16088362038135529,
0.03487773239612579,
-0.027282875031232834,
-0.04246806353330612,
-0.11595015972852707,
-0.10063382983207703,
-0.030944401398301125,
-0.06279094517230988,
0.013876822777092457,
-0.13314449787139893,
0.002788545796647668,
0.17290166020393372,
0.06543704122304916,
0.028537699952721596,
0.016094110906124115,
-0.1270189732313156,
-0.045555479824543,
0.05729899927973747,
0.012600653804838657,
0.02067870832979679,
0.045937392860651016,
-0.004648645408451557,
0.06917429715394974,
0.028783103451132774,
0.0016106036491692066,
-0.005533291958272457,
0.08406775444746017,
0.02525036595761776,
0.04557451978325844,
-0.05632718279957771,
-0.0007469450938515365,
-0.03875463083386421,
0.08226656168699265,
0.11071370542049408,
0.04348078370094299,
-0.049377430230379105,
-0.01241513155400753,
0.16022808849811554,
-0.027028623968362808,
0.00825570523738861,
-0.12176702171564102,
0.3246857523918152,
0.015684865415096283,
0.011937422677874565,
0.05793937295675278,
-0.07964727282524109,
-0.044648006558418274,
0.20805193483829498,
0.06840748339891434,
-0.024229077622294426,
-0.028234243392944336,
0.0054822140373289585,
-0.03169552609324455,
-0.015069880522787571,
0.1423959732055664,
0.04474537819623947,
0.11824538558721542,
-0.05521333962678909,
-0.05108022689819336,
-0.03798544406890869,
0.0034018259029835463,
-0.11365635693073273,
0.14727260172367096,
-0.020275579765439034,
-0.0209597609937191,
-0.07635167986154556,
0.012310120277106762,
0.0743316113948822,
-0.34930419921875,
0.004225053358823061,
-0.02413933351635933,
-0.10451436042785645,
-0.014796922914683819,
-0.030771218240261078,
-0.02478335052728653,
0.04654119536280632,
-0.04048623889684677,
0.06408079713582993,
0.04984736070036888,
0.038617633283138275,
-0.020987939089536667,
-0.10079493373632431,
0.1627250760793686,
0.06062891706824303,
0.10332048684358597,
0.01744025945663452,
0.08867035806179047,
0.0610504224896431,
0.03796597197651863,
-0.09275015443563461,
0.05340692028403282,
0.009420543909072876,
-0.07210459560155869,
-0.05555378645658493,
0.11329100281000137,
0.00039738972554914653,
0.05898361653089523,
0.030593302100896835,
-0.12292639166116714,
0.025035711005330086,
0.07676775008440018,
-0.08913357555866241,
-0.09379751235246658,
-0.004489033482968807,
-0.09809120744466782,
0.15701228380203247,
0.14676055312156677,
-0.01212840061634779,
0.015221236273646355,
-0.06389444321393967,
-0.0076361531391739845,
0.05316353961825371,
0.006873811595141888,
-0.013274066150188446,
-0.18295183777809143,
0.053096670657396317,
-0.08859356492757797,
-0.003630494000390172,
-0.21360962092876434,
-0.10155709832906723,
-0.011242474429309368,
-0.055037371814250946,
-0.022030191496014595,
0.05778889358043671,
0.02473433129489422,
0.0766611099243164,
-0.025507852435112,
-0.02261386066675186,
-0.039783332496881485,
0.09493344277143478,
-0.10387881100177765,
-0.07592537999153137
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1500k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
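For reference, the objects returned above follow the standard `transformers` output classes rather than anything specific to MultiBERTs; a minimal sketch of reading the final hidden states and the pooled `[CLS]` vector from the Tensorflow output (shapes assume BERT-base, hidden size 768):
```
# Continuing from the Tensorflow example above.
last_hidden_states = output.last_hidden_state  # (batch_size, sequence_length, 768)
pooled_output = output.pooler_output           # (batch_size, 768), tanh-projected [CLS] state
print(last_hidden_states.shape, pooled_output.shape)
```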
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1500k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
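Since the checkpoint was pre-trained with the MLM objective, its masked-token predictions can also be inspected directly. The sketch below is illustrative: it assumes the hosted checkpoint includes the pre-training head weights and uses the standard `BertForMaskedLM` class, and the example sentence is arbitrary:
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1500k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_1500k')
model.eval()

inputs = tokenizer("The capital of France is [MASK].", return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and print the 5 most likely fillers.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_index].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```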
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1500k"]}
| null |
google/multiberts-seed_1-step_1500k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1500k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1500k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08377368748188019,
0.07729171961545944,
-0.002318506594747305,
0.047157932072877884,
0.08186512440443039,
-0.018723156303167343,
0.06419379264116287,
0.0923486053943634,
-0.017970874905586243,
0.0209199208766222,
0.08274631947278976,
0.02243003435432911,
0.012248811312019825,
0.09492700546979904,
0.018851151689887047,
-0.20492130517959595,
0.03191108629107475,
-0.030233614146709442,
-0.10442370921373367,
0.07279999554157257,
0.10336600989103317,
-0.08494722843170166,
0.04205648601055145,
0.03296855837106705,
-0.11399433761835098,
0.053706515580415726,
-0.009434512816369534,
-0.021982714533805847,
0.13597162067890167,
0.00702035054564476,
0.05342225357890129,
0.05664483830332756,
0.04539450258016586,
-0.13902433216571808,
0.0054240901954472065,
0.05454648286104202,
0.05116356909275055,
0.03894159942865372,
0.02203267440199852,
0.08331818133592606,
-0.03751103952527046,
0.03577635437250137,
0.049100156873464584,
0.018356673419475555,
-0.06428103893995285,
-0.05403193458914757,
-0.09694311022758484,
0.0403001494705677,
0.027463387697935104,
0.016323866322636604,
0.011622712016105652,
0.10677818208932877,
-0.03532891720533371,
0.044529400765895844,
0.17582374811172485,
-0.2976350486278534,
-0.004713479895144701,
0.06673771888017654,
0.017106110230088234,
0.11671257764101028,
-0.00016320963914040476,
-0.031108196824789047,
0.08124274760484695,
0.02321072854101658,
0.09711657464504242,
-0.04009252041578293,
0.01599791646003723,
-0.058501947671175,
-0.15540711581707,
-0.039535295218229294,
0.09394951164722443,
0.0023375439923256636,
-0.13773766160011292,
-0.01766124740242958,
-0.04288795590400696,
0.03930819034576416,
0.017945952713489532,
-0.043524932116270065,
0.042567506432533264,
0.002456765389069915,
0.004954717122018337,
-0.006363357417285442,
-0.09903902560472488,
-0.04687067121267319,
0.027052966877818108,
0.09822364151477814,
0.10966084897518158,
0.05901354178786278,
-0.007722980342805386,
0.10343576967716217,
-0.1932685524225235,
-0.04625041037797928,
-0.030759695917367935,
-0.03405075520277023,
-0.045697905123233795,
-0.013434875756502151,
-0.10157148540019989,
-0.05474180728197098,
0.0012432374060153961,
0.13829955458641052,
-0.026591714471578598,
0.031251948326826096,
-0.020190514624118805,
0.0034946980886161327,
0.05816389620304108,
0.04640695080161095,
-0.028589410707354546,
0.02707505412399769,
0.03900076076388359,
-0.0117530208081007,
-0.025960508733987808,
0.009629251435399055,
-0.00288258190266788,
0.02385750599205494,
0.13294877111911774,
0.015784308314323425,
-0.10578236728906631,
0.07545371353626251,
-0.013522959314286709,
-0.045520372688770294,
-0.0013571922900155187,
-0.08523688465356827,
-0.06518493592739105,
-0.04132617637515068,
-0.01609729416668415,
-0.006115455646067858,
0.007136819418519735,
-0.007576976902782917,
-0.030367901548743248,
-0.01330301072448492,
-0.09015093743801117,
-0.05805462226271629,
-0.05371014401316643,
-0.13867907226085663,
0.007778696250170469,
-0.19362537562847137,
-0.027032339945435524,
-0.1180584654211998,
-0.20919616520404816,
-0.03769416734576225,
0.04890793189406395,
0.0009808419272303581,
-0.06963171064853668,
0.0619601309299469,
0.03124428354203701,
-0.030867960304021835,
-0.0015108095249161124,
0.08431670069694519,
-0.004315582104027271,
0.03584885597229004,
-0.0385541208088398,
0.059097666293382645,
0.007091064006090164,
0.04527875408530235,
-0.056590575724840164,
0.0549178384244442,
-0.1770656257867813,
0.045388925820589066,
-0.07703060656785965,
-0.025045843794941902,
-0.0850600004196167,
-0.030647428706288338,
-0.008773749694228172,
0.01300626527518034,
0.02980051562190056,
0.07800386101007462,
-0.1697130650281906,
-0.02880058065056801,
0.08524005860090256,
-0.15321651101112366,
-0.032776787877082825,
0.07201050221920013,
-0.054987940937280655,
0.11507102102041245,
0.06389547884464264,
0.15650005638599396,
-0.017047623172402382,
-0.06871241331100464,
0.052010685205459595,
-0.011831273324787617,
0.006221161689609289,
-0.013845397159457207,
0.06736064702272415,
-0.01818353869020939,
-0.1704016774892807,
0.023881912231445312,
-0.13206955790519714,
0.006344684399664402,
-0.08084916323423386,
0.02821301855146885,
-0.0036633710842579603,
-0.07533805817365646,
-0.08111567050218582,
-0.036381594836711884,
0.07773206382989883,
-0.0667295753955841,
-0.02078915201127529,
0.04450409114360809,
0.07483770698308945,
-0.06820966303348541,
0.0670972466468811,
-0.013750895857810974,
0.023873263970017433,
-0.08419984579086304,
-0.03452486917376518,
-0.18776607513427734,
0.04406190663576126,
0.09431619942188263,
0.011287757195532322,
-0.019611570984125137,
0.12219075858592987,
-0.006597516592592001,
0.06480514258146286,
-0.04306806996464729,
-0.0005290719564072788,
-0.009894967079162598,
-0.00039193473639898,
-0.0937756896018982,
-0.10719110816717148,
-0.0686175599694252,
-0.06628850102424622,
0.07990375906229019,
-0.11039759963750839,
0.025166047737002373,
-0.061066560447216034,
0.03862205147743225,
0.017473818734288216,
-0.0706086978316307,
-0.008819445967674255,
0.016567794606089592,
-0.062076617032289505,
-0.05583130940794945,
0.036301638931035995,
0.06216032803058624,
-0.01493669394403696,
0.09485186636447906,
-0.0387994609773159,
-0.0791957899928093,
0.025897186249494553,
0.07711800932884216,
-0.1072273775935173,
0.026520121842622757,
-0.04537568613886833,
-0.04561102017760277,
-0.07081442326307297,
-0.027875665575265884,
0.09777108579874039,
-0.011464377865195274,
0.14481143653392792,
-0.08015245944261551,
-0.008947129361331463,
0.014162685722112656,
-0.011294761672616005,
-0.025580691173672676,
0.045532625168561935,
0.07656899094581604,
-0.06256405264139175,
0.022623753175139427,
0.020742641761898994,
-0.00020780562772415578,
0.06351392716169357,
-0.04881947115063667,
-0.07383184880018234,
0.02077079750597477,
0.033232640475034714,
0.02363411895930767,
0.061709169298410416,
-0.05374425649642944,
-0.009391110390424728,
0.028504708781838417,
0.02366810478270054,
0.013450829312205315,
-0.1168670803308487,
0.05954639986157417,
0.06540519744157791,
0.008250000886619091,
0.04765183478593826,
-0.018912574276328087,
-0.03242428973317146,
0.0806489810347557,
0.029996249824762344,
-0.01134724821895361,
-0.01097104698419571,
-0.011608446948230267,
-0.12323130667209625,
0.21542461216449738,
-0.06963098049163818,
-0.16225382685661316,
-0.07461186498403549,
-0.10502724349498749,
0.0015511872479692101,
0.02277800254523754,
0.043595556169748306,
-0.02846727892756462,
-0.043512627482414246,
-0.12131726741790771,
0.10034909844398499,
-0.036873914301395416,
0.06685718148946762,
0.11807163059711456,
-0.06361018866300583,
0.04938013479113579,
-0.13208775222301483,
-0.008997819386422634,
-0.0754368007183075,
-0.05563589558005333,
0.054218027740716934,
-0.046583499759435654,
0.04203108698129654,
0.1112351343035698,
0.01103235688060522,
-0.028301630169153214,
-0.030907783657312393,
0.21994999051094055,
0.043465517461299896,
0.04018990695476532,
0.13078369200229645,
-0.07282628118991852,
0.052846163511276245,
0.0869709923863411,
0.008230846375226974,
-0.045404136180877686,
0.053579311817884445,
0.05748014524579048,
-0.0593898631632328,
-0.19127635657787323,
-0.007571148686110973,
0.005705568473786116,
-0.047511328011751175,
0.06561797112226486,
0.03610198199748993,
0.012865724973380566,
0.07492975145578384,
0.018557390198111534,
0.06898847222328186,
0.004300624132156372,
0.09673961997032166,
0.02859998308122158,
-0.03616819530725479,
0.08442748337984085,
-0.005504448898136616,
-0.0042148916982114315,
0.0781477838754654,
-0.01980571821331978,
0.2914786636829376,
-0.04695053771138191,
0.01927335001528263,
0.12643635272979736,
0.03228486701846123,
0.05306284502148628,
0.1275952011346817,
-0.08199672400951385,
0.026903769001364708,
-0.07273393869400024,
-0.04496559873223305,
0.01248671393841505,
0.042232900857925415,
-0.07426343858242035,
0.0222258772701025,
-0.08823031187057495,
0.03532624617218971,
-0.024145040661096573,
0.30186977982521057,
0.09734548628330231,
-0.1021784171462059,
-0.0625932514667511,
-0.0016130050644278526,
-0.10024257004261017,
-0.07319312542676926,
0.05136759579181671,
0.061497509479522705,
-0.1286214292049408,
0.0006160324555821717,
-0.02252657525241375,
0.07978534698486328,
-0.01976870559155941,
0.01601392589509487,
0.040412597358226776,
0.058389339596033096,
-0.04797741025686264,
0.005817925091832876,
-0.1915120780467987,
0.1943008005619049,
-0.0007201681728474796,
0.023914452642202377,
-0.05128413811326027,
0.030359087511897087,
0.009688153862953186,
-0.00911248940974474,
0.06358089298009872,
0.02251591347157955,
-0.004030488431453705,
-0.0646747425198555,
-0.04075895994901657,
0.011956031434237957,
0.062099289149045944,
-0.04169965162873268,
0.10090718418359756,
0.006821869406849146,
0.05303845554590225,
0.03019816242158413,
0.0915139839053154,
-0.1918557733297348,
-0.08055239170789719,
0.02859286405146122,
-0.054304592311382294,
-0.10223072022199631,
-0.08132737129926682,
-0.09831149876117706,
-0.011335748247802258,
0.21404698491096497,
-0.11345502734184265,
-0.07640634477138519,
-0.09567439556121826,
0.04530122131109238,
0.09680692106485367,
-0.056414127349853516,
0.028619419783353806,
-0.012014022096991539,
0.11244378238916397,
-0.07486516982316971,
-0.12090721726417542,
0.022716816514730453,
-0.10118880867958069,
-0.15680859982967377,
-0.06635161489248276,
0.08420634269714355,
0.06430923938751221,
0.029382886365056038,
-0.035593077540397644,
0.008159601129591465,
0.04402609169483185,
-0.036994658410549164,
-0.00402884790673852,
0.05657828971743584,
0.08277939260005951,
0.03972212225198746,
-0.10577322542667389,
0.01599034108221531,
-0.07228223979473114,
-0.07028566300868988,
0.07064463943243027,
0.26881158351898193,
-0.051003776490688324,
0.11148371547460556,
0.1257765293121338,
-0.0885368213057518,
-0.16543373465538025,
0.046755868941545486,
0.09663651883602142,
-0.013796896673738956,
0.0006221731891855597,
-0.16141828894615173,
0.10053083300590515,
0.11226091533899307,
-0.012985500507056713,
0.00045498451800085604,
-0.20326143503189087,
-0.14115232229232788,
0.08559058606624603,
0.11660860478878021,
0.28091001510620117,
-0.05002899467945099,
-0.03723903000354767,
0.024917379021644592,
-0.10093671083450317,
0.007974818348884583,
0.13978441059589386,
0.057199735194444656,
-0.021317480131983757,
-0.07133794575929642,
0.012625319883227348,
-0.03760475665330887,
0.09009145945310593,
0.06293832510709763,
0.07276411354541779,
-0.0031457955483347178,
-0.006787012331187725,
-0.04121612012386322,
-0.042376529425382614,
0.07018935680389404,
0.03734860196709633,
0.053417738527059555,
-0.08128765225410461,
-0.03520014137029648,
-0.07457990199327469,
0.03245389461517334,
-0.03292505815625191,
-0.07588595151901245,
-0.059999190270900726,
0.07934621721506119,
0.05872573330998421,
-0.033633340150117874,
0.036798037588596344,
0.03360463306307793,
0.09748674929141998,
0.15173740684986115,
-0.003209451911970973,
-0.05993260070681572,
-0.0611773245036602,
-0.025858938694000244,
-0.01257205381989479,
0.07706059515476227,
-0.04661666229367256,
0.023032575845718384,
0.07367575913667679,
0.021364726126194,
0.10855237394571304,
0.06009049341082573,
-0.1186833456158638,
-0.022791415452957153,
0.028658494353294373,
-0.15176089107990265,
0.004124700091779232,
-0.004583580885082483,
0.011772224679589272,
-0.023005910217761993,
0.018173258751630783,
0.14820103347301483,
-0.07302755117416382,
-0.03176039457321167,
-0.04789014160633087,
0.06253276765346527,
0.0355495847761631,
0.15270832180976868,
0.040416911244392395,
0.04006095975637436,
-0.0780644342303276,
0.14577506482601166,
0.0439358651638031,
-0.041775211691856384,
0.02382277138531208,
-0.03212668374180794,
-0.107697993516922,
0.016159160062670708,
0.06690330803394318,
0.07392211258411407,
-0.07499119639396667,
-0.016703004017472267,
-0.04052453115582466,
-0.08126808702945709,
0.07149168848991394,
0.22494889795780182,
0.060930632054805756,
0.06201578304171562,
-0.04962283372879028,
-0.03334575891494751,
-0.07679539173841476,
0.055932603776454926,
0.05338287353515625,
0.07664499431848526,
-0.0774758979678154,
0.10348864644765854,
0.01169736310839653,
0.05129930004477501,
-0.02623690851032734,
-0.04599699005484581,
-0.10688695311546326,
-0.05677323788404465,
-0.11789491027593613,
0.016684917733073235,
-0.06519641727209091,
-0.038107141852378845,
0.006490528117865324,
-0.004462540615350008,
-0.004950067028403282,
0.05815250799059868,
-0.06109621748328209,
-0.0111685236915946,
-0.013033175840973854,
0.03389695659279823,
-0.06104860082268715,
-0.05322175472974777,
0.020243000239133835,
-0.09282807260751724,
0.09792453795671463,
0.042551103979349136,
0.009537037461996078,
0.0001309019571635872,
0.07510466873645782,
-0.010463079437613487,
0.02087070234119892,
0.009641322307288647,
-0.03900948911905289,
-0.10337710380554199,
0.011191165074706078,
-0.02648642472922802,
-0.03696369007229805,
-0.018695291131734848,
0.08713261783123016,
-0.08074551820755005,
0.03321107104420662,
-0.00003463479515630752,
0.002173809800297022,
-0.07769939303398132,
-0.004194648936390877,
0.10040922462940216,
0.08166996389627457,
0.05134302005171776,
-0.08121379464864731,
0.015724491328001022,
-0.1259545236825943,
-0.03494809567928314,
0.013549864292144775,
-0.013867265544831753,
-0.1408962905406952,
-0.009921463206410408,
0.01786748133599758,
-0.012619026005268097,
0.1880895346403122,
-0.0523667149245739,
-0.025941716507077217,
0.021046672016382217,
-0.08739545196294785,
0.10010167211294174,
-0.020128846168518066,
0.15746119618415833,
-0.02948005311191082,
-0.03751961514353752,
-0.01508289109915495,
0.05002857372164726,
0.02989710122346878,
-0.011583915911614895,
0.18828612565994263,
0.1315198838710785,
0.03967636451125145,
0.05438957363367081,
-0.022693561390042305,
-0.009609776549041271,
-0.050170328468084335,
-0.020761050283908844,
0.03865818306803703,
0.03287095949053764,
0.02120388299226761,
0.14083506166934967,
0.0631934404373169,
-0.1617097407579422,
0.0415363684296608,
-0.026351293548941612,
-0.042607828974723816,
-0.11718299984931946,
-0.09978090971708298,
-0.028376920148730278,
-0.06362618505954742,
0.015757493674755096,
-0.13683100044727325,
0.0010220218682661653,
0.17351865768432617,
0.06548147648572922,
0.028909936547279358,
0.013268993236124516,
-0.1341855227947235,
-0.04019206762313843,
0.056213367730379105,
0.011730303056538105,
0.023626934736967087,
0.042969632893800735,
-0.004067585337907076,
0.06969702988862991,
0.03136370703577995,
0.001833215355873108,
-0.005051817744970322,
0.0779205709695816,
0.02331753447651863,
0.04738692194223404,
-0.05544542521238327,
0.0010492191649973392,
-0.04506294056773186,
0.07794823497533798,
0.12175780534744263,
0.04318881407380104,
-0.0445210263133049,
-0.011182405054569244,
0.16403308510780334,
-0.025339504703879356,
0.00593888433650136,
-0.12533250451087952,
0.3078666925430298,
0.019669920206069946,
0.019117038697004318,
0.060158971697092056,
-0.08150196820497513,
-0.05130062252283096,
0.2154504805803299,
0.06865603476762772,
-0.022651931270956993,
-0.029090486466884613,
0.00553443469107151,
-0.0314834862947464,
-0.014863168820738792,
0.13957376778125763,
0.043884214013814926,
0.11165967583656311,
-0.05348455533385277,
-0.045186229050159454,
-0.03550291061401367,
0.0007071185973472893,
-0.10473844408988953,
0.144121915102005,
-0.0178916547447443,
-0.019302353262901306,
-0.0767289474606514,
0.010647541843354702,
0.06720269471406937,
-0.3458220362663269,
0.002786678960546851,
-0.02835446409881115,
-0.10252008587121964,
-0.01846265234053135,
-0.03316580504179001,
-0.026166200637817383,
0.04398367181420326,
-0.036462944000959396,
0.06532613933086395,
0.04854676127433777,
0.03798574581742287,
-0.0192909874022007,
-0.1123209223151207,
0.16151633858680725,
0.06821619719266891,
0.10134189575910568,
0.01916913129389286,
0.07944805920124054,
0.060908783227205276,
0.040152404457330704,
-0.09635806828737259,
0.046305667608976364,
0.008333244360983372,
-0.06582414358854294,
-0.059494875371456146,
0.11839340627193451,
0.0013012959389016032,
0.06666029244661331,
0.02927892468869686,
-0.12375552952289581,
0.02093389444053173,
0.07515407353639603,
-0.08646141737699509,
-0.09455615282058716,
-0.000629470800049603,
-0.0963771790266037,
0.15388071537017822,
0.14782489836215973,
-0.009519973769783974,
0.018398350104689598,
-0.0642896294593811,
-0.008000964298844337,
0.04999883845448494,
0.018015529960393906,
-0.01325276494026184,
-0.18787966668605804,
0.054858043789863586,
-0.09490881860256195,
-0.004278672859072685,
-0.21273745596408844,
-0.10167771577835083,
-0.011818715371191502,
-0.055165767669677734,
-0.02090209722518921,
0.0613781213760376,
0.03335585445165634,
0.07820350676774979,
-0.02387593314051628,
-0.02635500580072403,
-0.040028300136327744,
0.09621754288673401,
-0.09622092545032501,
-0.07131104916334152
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1600k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1600k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
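Because the release includes many intermediate checkpoints with a regular naming scheme, it is straightforward to compare representations across training steps. The following is a minimal sketch, not part of the official release: it mean-pools the final hidden states from two of the seed-1 checkpoints shown in this document and reports their cosine similarity; the pooling choice and the step list are illustrative assumptions:
```
import torch
from transformers import BertTokenizer, BertModel

steps = ['1500k', '1600k']  # illustrative subset of the seed-1 intermediate checkpoints
text = "Replace me by any text you'd like."

embeddings = {}
for step in steps:
    name = f'google/multiberts-seed_1-step_{step}'
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    model.eval()
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        output = model(**encoded_input)
    # Mean-pool the final hidden states as a simple sentence representation.
    embeddings[step] = output.last_hidden_state.mean(dim=1).squeeze(0)

similarity = torch.nn.functional.cosine_similarity(
    embeddings['1500k'], embeddings['1600k'], dim=0)
print(f"cosine similarity between checkpoints: {similarity.item():.4f}")
```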
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1600k"]}
| null |
google/multiberts-seed_1-step_1600k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1600k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1600k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08335540443658829,
0.06886807084083557,
-0.0021297524217516184,
0.05025646463036537,
0.08000936359167099,
-0.018433867022395134,
0.059870947152376175,
0.09209633618593216,
-0.018666952848434448,
0.02088240347802639,
0.08147837221622467,
0.02460525371134281,
0.012496563605964184,
0.09552109241485596,
0.021133653819561005,
-0.20402830839157104,
0.02834079973399639,
-0.031698767095804214,
-0.09668505191802979,
0.07353627681732178,
0.10361212491989136,
-0.08541354537010193,
0.04435315355658531,
0.03245311975479126,
-0.11974749714136124,
0.05784880742430687,
-0.007076294627040625,
-0.024690425023436546,
0.13929834961891174,
0.0019168321741744876,
0.05670702084898949,
0.05864338576793671,
0.04927796497941017,
-0.12952187657356262,
0.004231930244714022,
0.05635818839073181,
0.052269645035266876,
0.036956798285245895,
0.02363235503435135,
0.0847952663898468,
-0.02239222452044487,
0.03207278251647949,
0.05129826068878174,
0.01710042543709278,
-0.06588663905858994,
-0.06633316725492477,
-0.09717725962400436,
0.025169463828206062,
0.025150902569293976,
0.019075999036431313,
0.009291772730648518,
0.11070537567138672,
-0.03182092308998108,
0.04583543911576271,
0.17249536514282227,
-0.30050763487815857,
-0.00456386711448431,
0.06189857795834541,
0.018545417115092278,
0.12295281887054443,
-0.0005750699201598763,
-0.03323262929916382,
0.08134189248085022,
0.02352895215153694,
0.09708384424448013,
-0.04195239022374153,
0.010329516604542732,
-0.05823570489883423,
-0.15403582155704498,
-0.037850361317396164,
0.10215017199516296,
-0.0009942506439983845,
-0.13898463547229767,
-0.014335516840219498,
-0.044985320419073105,
0.038868337869644165,
0.020071376115083694,
-0.04229282587766647,
0.04416511952877045,
0.0034605732653290033,
-0.001149762305431068,
-0.003145362948998809,
-0.10346770286560059,
-0.04563366621732712,
0.021716618910431862,
0.08917506784200668,
0.10959526151418686,
0.060917243361473083,
-0.006135601084679365,
0.10287521034479141,
-0.18465575575828552,
-0.04711320996284485,
-0.02647707238793373,
-0.03552703186869621,
-0.048443328589200974,
-0.010210001841187477,
-0.10067053884267807,
-0.04590480402112007,
-0.0006854889797978103,
0.13043726980686188,
-0.032224614173173904,
0.03527496010065079,
-0.02955026552081108,
0.0035974252969026566,
0.05531284213066101,
0.04651663079857826,
-0.0219013299793005,
0.027002234011888504,
0.03777895122766495,
-0.01765771210193634,
-0.022703828290104866,
0.009245714172720909,
0.000051588307542260736,
0.02659684792160988,
0.12694768607616425,
0.012390160001814365,
-0.10558274388313293,
0.07682466506958008,
-0.015672260895371437,
-0.04296882450580597,
-0.0024563688784837723,
-0.08881425857543945,
-0.06405629962682724,
-0.03957168012857437,
-0.015963686630129814,
-0.0024894974194467068,
0.007839336059987545,
-0.011692477390170097,
-0.02792360633611679,
-0.02256252057850361,
-0.09028930962085724,
-0.062403518706560135,
-0.05562655255198479,
-0.1382339596748352,
0.009404327720403671,
-0.18827512860298157,
-0.02776303142309189,
-0.11741466820240021,
-0.2059929519891739,
-0.0366780087351799,
0.045751407742500305,
0.001847922569140792,
-0.0647987425327301,
0.061819545924663544,
0.031024638563394547,
-0.032429855316877365,
-0.0017785248346626759,
0.09332656860351562,
-0.005692099686712027,
0.03700808435678482,
-0.036202091723680496,
0.06151903420686722,
0.009073441848158836,
0.04375550150871277,
-0.05887635052204132,
0.05432068929076195,
-0.1741930991411209,
0.04182189702987671,
-0.07700711488723755,
-0.028378430753946304,
-0.08397877961397171,
-0.0330929197371006,
-0.006266642827540636,
0.014387368224561214,
0.026192311197519302,
0.07575061172246933,
-0.16645447909832,
-0.030754318460822105,
0.08438506722450256,
-0.15598034858703613,
-0.02850387431681156,
0.07396230101585388,
-0.05498196929693222,
0.1174963116645813,
0.06384284794330597,
0.1629931628704071,
-0.028988027945160866,
-0.06707103550434113,
0.05232476443052292,
-0.01298653893172741,
0.005423326510936022,
-0.011229769326746464,
0.06586457788944244,
-0.01653764396905899,
-0.16879916191101074,
0.025333788245916367,
-0.12740786373615265,
0.0003905069315806031,
-0.07917868345975876,
0.029231030493974686,
-0.003814977128058672,
-0.07034541666507721,
-0.07969554513692856,
-0.03163521736860275,
0.07494255155324936,
-0.073182612657547,
-0.024876713752746582,
0.030459441244602203,
0.07540588825941086,
-0.07059701532125473,
0.06925225257873535,
-0.016686297953128815,
0.016264308243989944,
-0.07479351758956909,
-0.035170137882232666,
-0.18629804253578186,
0.03723534569144249,
0.09590891003608704,
0.01880587637424469,
-0.02114642970263958,
0.12610942125320435,
-0.008174177259206772,
0.06662232428789139,
-0.04118151590228081,
-0.00017045853019226342,
-0.010746167041361332,
0.00019637218792922795,
-0.09124396741390228,
-0.11243024468421936,
-0.0705152079463005,
-0.0689731314778328,
0.08916240185499191,
-0.11184775084257126,
0.023989161476492882,
-0.05332910642027855,
0.03642970323562622,
0.016189787536859512,
-0.0711192861199379,
-0.012606039643287659,
0.0155150992795825,
-0.06220967695116997,
-0.05770645663142204,
0.035454098135232925,
0.06268755346536636,
-0.016252007335424423,
0.09635955095291138,
-0.04617985337972641,
-0.08885977417230606,
0.03084760531783104,
0.06991040706634521,
-0.10854765772819519,
0.03500186279416084,
-0.04731043055653572,
-0.0481053851544857,
-0.06762540340423584,
-0.02988654561340809,
0.09871736168861389,
-0.01650373451411724,
0.14303037524223328,
-0.07722127437591553,
-0.008706728927791119,
0.014312822371721268,
-0.010955968871712685,
-0.024997051805257797,
0.04970342293381691,
0.06736554205417633,
-0.06274133920669556,
0.0213130135089159,
0.02625390887260437,
0.0002777576446533203,
0.061989329755306244,
-0.050507593899965286,
-0.07470490783452988,
0.02166876010596752,
0.030067872256040573,
0.025132352486252785,
0.060905199497938156,
-0.05818680301308632,
-0.011019942350685596,
0.03025764413177967,
0.021726341918110847,
0.012775685638189316,
-0.11847303062677383,
0.06133469194173813,
0.06275774538516998,
0.008992635644972324,
0.051575131714344025,
-0.019470324739813805,
-0.03325957432389259,
0.08288665860891342,
0.026170549914240837,
-0.014884989708662033,
-0.010340213775634766,
-0.012478177435696125,
-0.12413743883371353,
0.2157316356897354,
-0.07179619371891022,
-0.15527641773223877,
-0.07233506441116333,
-0.11250832676887512,
0.0071395342238247395,
0.025931350886821747,
0.04413000866770744,
-0.031866904348134995,
-0.04465498775243759,
-0.12307286262512207,
0.09636511653661728,
-0.04333307221531868,
0.0661046952009201,
0.11874279379844666,
-0.06362967938184738,
0.048054199665784836,
-0.13001605868339539,
-0.011078698560595512,
-0.07523886859416962,
-0.06641276925802231,
0.05911843478679657,
-0.05045441538095474,
0.04084892198443413,
0.11173353344202042,
0.011493981815874577,
-0.030257143080234528,
-0.030185947194695473,
0.2027014046907425,
0.042978499084711075,
0.04360823705792427,
0.13443319499492645,
-0.071694016456604,
0.05209170654416084,
0.0813666582107544,
0.005629058927297592,
-0.04488734155893326,
0.05475180596113205,
0.05245788395404816,
-0.06450096517801285,
-0.19826537370681763,
-0.006839603651314974,
0.005501601379364729,
-0.040850088000297546,
0.0646008849143982,
0.035331904888153076,
0.0016182911349460483,
0.07521357387304306,
0.02132037840783596,
0.06809793412685394,
0.00397928711026907,
0.09726309776306152,
0.03745102137327194,
-0.040195804089307785,
0.08528988063335419,
-0.007039447780698538,
-0.0042129140347242355,
0.08077418804168701,
-0.02401849813759327,
0.2929720878601074,
-0.04722004383802414,
0.02334119752049446,
0.12585517764091492,
0.036857981234788895,
0.04877860099077225,
0.12337210029363632,
-0.0749172642827034,
0.030288921669125557,
-0.07556825131177902,
-0.046122241765260696,
0.00978773832321167,
0.04461532458662987,
-0.08246489614248276,
0.017246561124920845,
-0.08842015266418457,
0.03528551012277603,
-0.024446969851851463,
0.2905895709991455,
0.09970758110284805,
-0.11231847107410431,
-0.06313197314739227,
-0.0001813601265894249,
-0.09627982974052429,
-0.07069920748472214,
0.0521710105240345,
0.06275933980941772,
-0.12829743325710297,
0.0021319198422133923,
-0.020178863778710365,
0.07434504479169846,
-0.020430700853466988,
0.016461648046970367,
0.04184914380311966,
0.05875604972243309,
-0.046185389161109924,
0.005330835003405809,
-0.1817428469657898,
0.1974177360534668,
-0.0009845016757026315,
0.02229147031903267,
-0.05402252823114395,
0.032169684767723083,
0.010850701481103897,
-0.012594635598361492,
0.06542857736349106,
0.02096748724579811,
-0.02599417045712471,
-0.061439383774995804,
-0.038122616708278656,
0.01490768976509571,
0.06384063512086868,
-0.037970419973134995,
0.10200365632772446,
0.0074282498098909855,
0.05481401085853577,
0.031696248799562454,
0.09121136367321014,
-0.18552722036838531,
-0.08055413514375687,
0.02702641673386097,
-0.05169738456606865,
-0.112836092710495,
-0.08205696195363998,
-0.09687867015600204,
-0.005989119876176119,
0.22824877500534058,
-0.10933145135641098,
-0.07662110030651093,
-0.09486105293035507,
0.048451490700244904,
0.09713207185268402,
-0.05451808497309685,
0.024461206048727036,
-0.008369197137653828,
0.11198116093873978,
-0.07196343690156937,
-0.12210903316736221,
0.025762656703591347,
-0.09987763315439224,
-0.15917038917541504,
-0.0687919482588768,
0.08741127699613571,
0.06539322435855865,
0.0302708949893713,
-0.03394260257482529,
0.011364791542291641,
0.039029560983181,
-0.036218173801898956,
0.0025706635788083076,
0.05408315360546112,
0.08887464553117752,
0.03764300420880318,
-0.10730332881212234,
0.013249906711280346,
-0.06964229047298431,
-0.07189638167619705,
0.06643892824649811,
0.269940048456192,
-0.051146283745765686,
0.11210785061120987,
0.1255147010087967,
-0.09011334180831909,
-0.16450195014476776,
0.048451971262693405,
0.09256687760353088,
-0.015516881830990314,
0.0053769261576235294,
-0.16131697595119476,
0.102702796459198,
0.11435616761445999,
-0.013452032580971718,
-0.001073140767402947,
-0.20044779777526855,
-0.13837803900241852,
0.08414730429649353,
0.11724139004945755,
0.27988865971565247,
-0.05179905891418457,
-0.03530852496623993,
0.02418634295463562,
-0.09622490406036377,
0.0008921056287363172,
0.1259075403213501,
0.05769777670502663,
-0.021004850044846535,
-0.06350486725568771,
0.011725152842700481,
-0.03685235232114792,
0.0899651050567627,
0.06534769386053085,
0.06968811899423599,
-0.004408808425068855,
-0.002184399636462331,
-0.02928168512880802,
-0.03817247599363327,
0.06614768505096436,
0.041782867163419724,
0.052592769265174866,
-0.08376073092222214,
-0.034785185009241104,
-0.07447583973407745,
0.03547866642475128,
-0.03349607437849045,
-0.07637042552232742,
-0.06199527904391289,
0.0781179815530777,
0.056709226220846176,
-0.036170653998851776,
0.02790738083422184,
0.03439938277006149,
0.09829924255609512,
0.16038979589939117,
-0.004254480823874474,
-0.05157206952571869,
-0.052145637571811676,
-0.023477701470255852,
-0.013490861281752586,
0.07196708768606186,
-0.0492810383439064,
0.019013328477740288,
0.07265256345272064,
0.019032813608646393,
0.10236287117004395,
0.06221728026866913,
-0.12234095484018326,
-0.01985185593366623,
0.029497306793928146,
-0.1554294228553772,
0.012983957305550575,
-0.0005412158207036555,
0.013812996447086334,
-0.021587738767266273,
0.017710067331790924,
0.14844626188278198,
-0.06986916065216064,
-0.03105156682431698,
-0.048561688512563705,
0.06483826786279678,
0.03677006810903549,
0.15282663702964783,
0.03812116011977196,
0.03678242862224579,
-0.07861961424350739,
0.1446453183889389,
0.046195827424526215,
-0.04593181237578392,
0.024152593687176704,
-0.027721542865037918,
-0.1089184507727623,
0.015359706245362759,
0.06494653224945068,
0.0688493400812149,
-0.07382082939147949,
-0.016594795510172844,
-0.03980466350913048,
-0.08371783792972565,
0.06879019737243652,
0.20927134156227112,
0.061392154544591904,
0.06510051339864731,
-0.04852895811200142,
-0.03178822621703148,
-0.07942043244838715,
0.053652986884117126,
0.051748305559158325,
0.07740611582994461,
-0.07135768979787827,
0.10929419100284576,
0.011031782254576683,
0.05288403481245041,
-0.026464881375432014,
-0.043673302978277206,
-0.10338465124368668,
-0.055039603263139725,
-0.10588788986206055,
0.01789925806224346,
-0.0634356215596199,
-0.04247327521443367,
0.0068524666130542755,
-0.0041280584409832954,
-0.00682666664943099,
0.059732530266046524,
-0.0605694018304348,
-0.012155674397945404,
-0.013504751957952976,
0.03591545671224594,
-0.0631481185555458,
-0.05427395552396774,
0.021609414368867874,
-0.09614036232233047,
0.09788382053375244,
0.04647478833794594,
0.010620815679430962,
0.0031658513471484184,
0.07502362132072449,
-0.009192571975290775,
0.020708002150058746,
0.011542034335434437,
-0.03834313526749611,
-0.10090199112892151,
0.00957628432661295,
-0.02566608041524887,
-0.040005575865507126,
-0.02046632207930088,
0.0898301973938942,
-0.08084666728973389,
0.036613740026950836,
-0.0012642944930121303,
-0.0030874060466885567,
-0.07878371328115463,
-0.004209449049085379,
0.10118085145950317,
0.08258023858070374,
0.04835406318306923,
-0.08514479547739029,
0.012974326498806477,
-0.12835533916950226,
-0.036940064281225204,
0.014105197973549366,
-0.016984272748231888,
-0.12840987741947174,
-0.012229328975081444,
0.020963868126273155,
-0.01372409239411354,
0.1894712597131729,
-0.05123768001794815,
-0.025691045448184013,
0.021934904158115387,
-0.0951751098036766,
0.10440786927938461,
-0.02058243378996849,
0.15793097019195557,
-0.03444173187017441,
-0.03664879873394966,
-0.01175057515501976,
0.04501454532146454,
0.03075304441154003,
-0.003882249817252159,
0.19099053740501404,
0.13095538318157196,
0.04045887291431427,
0.05305318906903267,
-0.023088155314326286,
-0.010636475868523121,
-0.05398908630013466,
-0.020392710343003273,
0.03824395686388016,
0.03346061706542969,
0.022524893283843994,
0.14083443582057953,
0.06261405348777771,
-0.16105487942695618,
0.03906156122684479,
-0.030934270471334457,
-0.043252162635326385,
-0.11718473583459854,
-0.0904344990849495,
-0.030733177438378334,
-0.060170188546180725,
0.014966013841331005,
-0.1364644467830658,
0.005862453952431679,
0.1751028299331665,
0.06557437777519226,
0.027926573529839516,
0.01606610044836998,
-0.1307833343744278,
-0.041007671505212784,
0.0542362704873085,
0.013149846345186234,
0.019847430288791656,
0.04873333126306534,
-0.007895560003817081,
0.06882883608341217,
0.03337126225233078,
0.001676415791735053,
-0.005666052456945181,
0.08031800389289856,
0.022163912653923035,
0.04576921463012695,
-0.05332242697477341,
-0.001904244883917272,
-0.044642962515354156,
0.0807730183005333,
0.11200745403766632,
0.043993931263685226,
-0.047354165464639664,
-0.010340893641114235,
0.16044117510318756,
-0.02651233598589897,
0.011804756708443165,
-0.11913546919822693,
0.3134078085422516,
0.01790650188922882,
0.011911032721400261,
0.061372388154268265,
-0.0809764415025711,
-0.047831084579229355,
0.21174141764640808,
0.07155827432870865,
-0.02120613306760788,
-0.030013075098395348,
0.0020200081635266542,
-0.03151223808526993,
-0.01533148717135191,
0.14303064346313477,
0.043853286653757095,
0.1088414266705513,
-0.05380553752183914,
-0.04965129867196083,
-0.037319425493478775,
0.003540676087141037,
-0.11232689768075943,
0.14692898094654083,
-0.01783289574086666,
-0.02090604230761528,
-0.07602067291736603,
0.009274165146052837,
0.07540306448936462,
-0.3578155040740967,
0.014086667448282242,
-0.028619850054383278,
-0.10031446814537048,
-0.012976160272955894,
-0.028077175840735435,
-0.026706332340836525,
0.048309195786714554,
-0.040846120566129684,
0.06735613942146301,
0.04141994193196297,
0.03992542251944542,
-0.022411365061998367,
-0.10245148837566376,
0.16140693426132202,
0.059608422219753265,
0.1060565635561943,
0.01750735566020012,
0.08578640967607498,
0.060173265635967255,
0.03963197022676468,
-0.09626085311174393,
0.052585914731025696,
0.008874171413481236,
-0.06811478734016418,
-0.05687950178980827,
0.1175379678606987,
-0.002238354878500104,
0.061171628534793854,
0.03102537989616394,
-0.11985690146684647,
0.023218948394060135,
0.0770966112613678,
-0.08486548811197281,
-0.09112787991762161,
-0.006348865106701851,
-0.09982738643884659,
0.15490184724330902,
0.14839576184749603,
-0.008326123468577862,
0.01861095242202282,
-0.0651155635714531,
-0.0118334349244833,
0.0519416406750679,
0.01804080419242382,
-0.010934664867818356,
-0.18646670877933502,
0.05184077471494675,
-0.08589285612106323,
-0.0031789992935955524,
-0.213422030210495,
-0.1009088084101677,
-0.01326974667608738,
-0.05581194907426834,
-0.024955423548817635,
0.06180526316165924,
0.02731923572719097,
0.07527799904346466,
-0.023654688149690628,
-0.016413217410445213,
-0.03778069466352463,
0.09786756336688995,
-0.09890609979629517,
-0.07230476289987564
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 160k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 160k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_160k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_160k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_160k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_160k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
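For a quick qualitative look at this intermediate checkpoint, the Masked Language Modelling head can also be queried through the fill-mask pipeline. This is a minimal sketch rather than part of the official evaluation setup, and it assumes the released checkpoint ships its MLM head weights (as pretraining checkpoints typically do):

```
from transformers import pipeline

# Minimal sketch: ask the step-160k checkpoint to fill in a masked token.
# Assumption: the checkpoint includes the masked-LM head weights.
unmasker = pipeline("fill-mask", model="google/multiberts-seed_1-step_160k")
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Predictions from a partially trained checkpoint such as this one may differ noticeably from those of the fully trained model; studying such training-dynamics questions is part of what the intermediate releases are intended to support.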
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_160k"]}
| null |
google/multiberts-seed_1-step_160k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_160k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 160k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL. We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 160k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
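A minimal PyTorch loading sketch for this checkpoint, mirroring the full card above (the TensorFlow variant is the same except that it uses TFBertModel):

```
from transformers import BertTokenizer, BertModel

# Load the seed-1, step-160k intermediate checkpoint.
tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_1-step_160k")
model = BertModel.from_pretrained("google/multiberts-seed_1-step_160k")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
output = model(**encoded_input)
```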
## Citation info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 160k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 160k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 160k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08150126785039902,
0.07451675832271576,
-0.002227737568318844,
0.04620147868990898,
0.08100924640893936,
-0.02100825309753418,
0.06332491338253021,
0.09184251725673676,
-0.016164561733603477,
0.02259945124387741,
0.08302080631256104,
0.025886276736855507,
0.011779015883803368,
0.10148878395557404,
0.0195457860827446,
-0.20969340205192566,
0.02876061014831066,
-0.028080414980649948,
-0.09112134575843811,
0.07399054616689682,
0.1046845018863678,
-0.08364508301019669,
0.042968012392520905,
0.03336872532963753,
-0.11798039078712463,
0.05754872411489487,
-0.009945559315383434,
-0.02013974077999592,
0.13648851215839386,
0.002590102842077613,
0.056423407047986984,
0.05785585567355156,
0.045192867517471313,
-0.1340896636247635,
0.003986238967627287,
0.055819589644670486,
0.051147446036338806,
0.03764856979250908,
0.025634873658418655,
0.08252137899398804,
-0.01882983185350895,
0.031237773597240448,
0.048527806997299194,
0.016622796654701233,
-0.06475844234228134,
-0.06362485140562057,
-0.09714570641517639,
0.02290470339357853,
0.02587750181555748,
0.019366882741451263,
0.007422898896038532,
0.1152806505560875,
-0.032000552862882614,
0.04590021073818207,
0.1820659637451172,
-0.30972591042518616,
-0.0037274102214723825,
0.06014169752597809,
0.021113259717822075,
0.11324120312929153,
0.00022826703207101673,
-0.03341078758239746,
0.08147439360618591,
0.02408471703529358,
0.09513059258460999,
-0.039499152451753616,
0.013519692234694958,
-0.060632411390542984,
-0.15631867945194244,
-0.0376996248960495,
0.09868977963924408,
-0.000791535887401551,
-0.13950252532958984,
-0.016411708667874336,
-0.045436639338731766,
0.04284639656543732,
0.020909272134304047,
-0.04173801466822624,
0.04464157298207283,
0.004420969635248184,
0.0009804068831726909,
-0.002447933191433549,
-0.10406694561243057,
-0.04498767480254173,
0.02118774875998497,
0.08683089911937714,
0.11022690683603287,
0.05802685022354126,
-0.0051960414275527,
0.10645376890897751,
-0.1883980929851532,
-0.04761270433664322,
-0.027341239154338837,
-0.033020198345184326,
-0.043889958411455154,
-0.010429194197058678,
-0.10128364711999893,
-0.05183590576052666,
0.0009251758456230164,
0.12626831233501434,
-0.02762054093182087,
0.03039143979549408,
-0.03077174350619316,
0.004296354483813047,
0.05844981595873833,
0.05073898658156395,
-0.01988569274544716,
0.027372857555747032,
0.03658334165811539,
-0.01700110174715519,
-0.023950643837451935,
0.010540475137531757,
-0.002802602481096983,
0.025266040116548538,
0.13310623168945312,
0.014182867482304573,
-0.10530175268650055,
0.07678476721048355,
-0.018036307767033577,
-0.043795399367809296,
-0.003342618001624942,
-0.08819711208343506,
-0.06386629492044449,
-0.039399079978466034,
-0.01517112459987402,
0.00352288456633687,
0.009235091507434845,
-0.01114929374307394,
-0.028568342328071594,
-0.024207593873143196,
-0.08849214017391205,
-0.0575375035405159,
-0.0566333532333374,
-0.1377236247062683,
0.008690905757248402,
-0.19666409492492676,
-0.028419194743037224,
-0.11813613027334213,
-0.20476526021957397,
-0.037280697375535965,
0.044634830206632614,
0.0026593294460326433,
-0.06297264993190765,
0.057408567517995834,
0.032241303473711014,
-0.032741088420152664,
-0.002558175940066576,
0.09347246587276459,
-0.00640075234696269,
0.03759441524744034,
-0.03616108372807503,
0.05858258157968521,
0.008739941753447056,
0.04417979717254639,
-0.059796854853630066,
0.05535593628883362,
-0.16793759167194366,
0.041520021855831146,
-0.07434632629156113,
-0.030722564086318016,
-0.08623219281435013,
-0.03047422133386135,
-0.0031265795696526766,
0.014769447967410088,
0.024983851239085197,
0.07556536793708801,
-0.1729910671710968,
-0.03009779565036297,
0.08857472985982895,
-0.1534564346075058,
-0.024445772171020508,
0.07435158640146255,
-0.055726490914821625,
0.11732969433069229,
0.06678049266338348,
0.15833266079425812,
-0.029486794024705887,
-0.06891226023435593,
0.047772906720638275,
-0.012277177534997463,
0.01035100407898426,
-0.008302869275212288,
0.06726629287004471,
-0.016996163874864578,
-0.16600050032138824,
0.025193197652697563,
-0.13380984961986542,
0.0019583890680223703,
-0.07982248067855835,
0.032086245715618134,
-0.004320785403251648,
-0.07020482420921326,
-0.07995572686195374,
-0.03201254457235336,
0.07591968029737473,
-0.07048363238573074,
-0.027151327580213547,
0.03968353942036629,
0.07547423988580704,
-0.07199205458164215,
0.06794332712888718,
-0.018613167107105255,
0.017202850431203842,
-0.07990535348653793,
-0.035643745213747025,
-0.18748581409454346,
0.03806573152542114,
0.09609018266201019,
0.022764114663004875,
-0.02064361609518528,
0.1337764859199524,
-0.006812370847910643,
0.06603150814771652,
-0.03923068568110466,
-0.0023151871282607317,
-0.010901490226387978,
-0.0005462612025439739,
-0.09353567659854889,
-0.11092240363359451,
-0.07269114255905151,
-0.07148505747318268,
0.088170126080513,
-0.11542064696550369,
0.02179899625480175,
-0.05454527959227562,
0.041689783334732056,
0.0181626807898283,
-0.07046014815568924,
-0.009532121010124683,
0.014993679709732533,
-0.06489071249961853,
-0.05809326842427254,
0.03516225144267082,
0.06190275028347969,
-0.018230516463518143,
0.09837840497493744,
-0.04773353785276413,
-0.08518069237470627,
0.028502082452178,
0.0744592621922493,
-0.1054462343454361,
0.03220991790294647,
-0.047532882541418076,
-0.046740736812353134,
-0.06834579259157181,
-0.027579614892601967,
0.09991107881069183,
-0.015442563220858574,
0.14390718936920166,
-0.077118419110775,
-0.011725058779120445,
0.00931483507156372,
-0.014738250523805618,
-0.026533186435699463,
0.05147485062479973,
0.06341884285211563,
-0.07293329387903214,
0.024524029344320297,
0.03011207841336727,
-0.0029327315278351307,
0.06902751326560974,
-0.050823889672756195,
-0.07738404721021652,
0.0207875594496727,
0.035175468772649765,
0.026486819609999657,
0.05781769007444382,
-0.05263698101043701,
-0.012180843390524387,
0.030032692477107048,
0.020214159041643143,
0.012199676595628262,
-0.11807268857955933,
0.06236008182168007,
0.062381044030189514,
0.006487127393484116,
0.05910130590200424,
-0.018775302916765213,
-0.03388968110084534,
0.08219902962446213,
0.026470202952623367,
-0.01516095269471407,
-0.010595710016787052,
-0.011411845684051514,
-0.12282228469848633,
0.21670524775981903,
-0.06989522278308868,
-0.15545237064361572,
-0.06712663918733597,
-0.11074317246675491,
0.006202158518135548,
0.02648942917585373,
0.04302292317152023,
-0.02992643229663372,
-0.04371747374534607,
-0.12491421401500702,
0.09939198940992355,
-0.04097448289394379,
0.06724249571561813,
0.1124480813741684,
-0.062165357172489166,
0.04898158833384514,
-0.13034753501415253,
-0.010952549055218697,
-0.07571665942668915,
-0.06966709345579147,
0.05673282966017723,
-0.052314020693302155,
0.03840913251042366,
0.10819961130619049,
0.01383815798908472,
-0.02895987406373024,
-0.02962205559015274,
0.2004528045654297,
0.04616044461727142,
0.03973856940865517,
0.13711808621883392,
-0.07259853184223175,
0.05135941877961159,
0.07735051214694977,
0.002278256230056286,
-0.04485942795872688,
0.05426395684480667,
0.053662121295928955,
-0.06270772963762283,
-0.1963774412870407,
-0.005595794878900051,
0.006594943813979626,
-0.04184345155954361,
0.06757672131061554,
0.037533290684223175,
-0.0010466993553563952,
0.0755428671836853,
0.017789039760828018,
0.07140309363603592,
0.0024275779724121094,
0.09689998626708984,
0.029041225090622902,
-0.03895455226302147,
0.08605660498142242,
-0.008246958255767822,
-0.005898523144423962,
0.07874670624732971,
-0.020100248977541924,
0.29182326793670654,
-0.04689725860953331,
0.021457474678754807,
0.12662838399410248,
0.03423880785703659,
0.049889642745256424,
0.12369254976511002,
-0.07718221843242645,
0.029127221554517746,
-0.0768042728304863,
-0.04737697169184685,
0.008630363270640373,
0.04557441920042038,
-0.07289561629295349,
0.019596807658672333,
-0.08573312312364578,
0.031322937458753586,
-0.025338783860206604,
0.29033729434013367,
0.10260369628667831,
-0.11285515129566193,
-0.06120431795716286,
0.001938820700161159,
-0.09858685731887817,
-0.07294075191020966,
0.049403250217437744,
0.058718159794807434,
-0.12969574332237244,
0.0016925344243645668,
-0.022245896980166435,
0.07613227516412735,
-0.027258822694420815,
0.016038546338677406,
0.03787865862250328,
0.05884513258934021,
-0.044216230511665344,
0.007248170208185911,
-0.1917559951543808,
0.19561989605426788,
-0.0014627828495576978,
0.023857103660702705,
-0.05414346605539322,
0.03229915350675583,
0.00976266898214817,
-0.014710948802530766,
0.06353960186243057,
0.018518704921007156,
-0.01596260443329811,
-0.05875999107956886,
-0.039650071412324905,
0.014523984864354134,
0.06274120509624481,
-0.043316200375556946,
0.10448317229747772,
0.006420768331736326,
0.0550273135304451,
0.030112311244010925,
0.08910301327705383,
-0.18852216005325317,
-0.07978802174329758,
0.026887966319918633,
-0.051968470215797424,
-0.107487253844738,
-0.08230695873498917,
-0.09773413091897964,
-0.004690114874392748,
0.23029400408267975,
-0.11390053480863571,
-0.07598263025283813,
-0.093693807721138,
0.051437947899103165,
0.09443635493516922,
-0.05532762035727501,
0.023272113874554634,
-0.009981292299926281,
0.11436977982521057,
-0.0737709328532219,
-0.12358251214027405,
0.026102157309651375,
-0.10091149061918259,
-0.16092798113822937,
-0.06806862354278564,
0.08797334879636765,
0.06707525253295898,
0.030780838802456856,
-0.03237023204565048,
0.009916108101606369,
0.0384485200047493,
-0.03395763039588928,
-0.00014385805116035044,
0.06271269172430038,
0.08779782056808472,
0.038835830986499786,
-0.10620662569999695,
0.011880614794790745,
-0.06961877644062042,
-0.07011444866657257,
0.069520503282547,
0.27089083194732666,
-0.049457598477602005,
0.1126440167427063,
0.11921393126249313,
-0.08972685784101486,
-0.16116484999656677,
0.04570827633142471,
0.09926284849643707,
-0.013838021084666252,
0.00861742440611124,
-0.16294433176517487,
0.1007968932390213,
0.11203775554895401,
-0.01490040309727192,
0.005130629055202007,
-0.20026607811450958,
-0.13565626740455627,
0.08956868201494217,
0.11632795631885529,
0.27951228618621826,
-0.05332396924495697,
-0.036059822887182236,
0.020383568480610847,
-0.0959349051117897,
0.008402588777244091,
0.12627436220645905,
0.05960376188158989,
-0.019772902131080627,
-0.06294231861829758,
0.011967262253165245,
-0.034846533089876175,
0.09076423943042755,
0.06258793920278549,
0.06937484443187714,
-0.0039678956381976604,
-0.0018669679993763566,
-0.03485555946826935,
-0.03966432064771652,
0.06949854642152786,
0.034333471208810806,
0.04955383762717247,
-0.08769901096820831,
-0.036655765026807785,
-0.07334204763174057,
0.038709092885255814,
-0.03141222149133682,
-0.0766613706946373,
-0.05874253436923027,
0.07622141391038895,
0.05552731454372406,
-0.033111535012722015,
0.029884831979870796,
0.03184889629483223,
0.09756612777709961,
0.1566505879163742,
-0.0023987407330423594,
-0.042558833956718445,
-0.05467027798295021,
-0.027809157967567444,
-0.0119123300537467,
0.0747942253947258,
-0.046567052602767944,
0.01806044951081276,
0.0700337141752243,
0.01823023520410061,
0.10362818837165833,
0.060306236147880554,
-0.12690621614456177,
-0.0188352819532156,
0.02637045830488205,
-0.1562984585762024,
0.014460663311183453,
-0.0001662209688220173,
0.022156111896038055,
-0.020520322024822235,
0.022849367931485176,
0.15252867341041565,
-0.07121481001377106,
-0.03228944167494774,
-0.04746495932340622,
0.06295914202928543,
0.034827861934900284,
0.14997859299182892,
0.03893454372882843,
0.03684281185269356,
-0.0771043598651886,
0.14209352433681488,
0.04356652498245239,
-0.03934311866760254,
0.02729228511452675,
-0.03296615555882454,
-0.10586050152778625,
0.015197659842669964,
0.06612304598093033,
0.07520522177219391,
-0.07291699945926666,
-0.018506210297346115,
-0.03999562934041023,
-0.08121678978204727,
0.06957376003265381,
0.21365360915660858,
0.06434249132871628,
0.06905689835548401,
-0.05076591670513153,
-0.03188946843147278,
-0.07760965824127197,
0.051013533025979996,
0.05405716225504875,
0.07902545481920242,
-0.07379217445850372,
0.10637786239385605,
0.0111093670129776,
0.05243092030286789,
-0.02673722244799137,
-0.04332364723086357,
-0.10446085780858994,
-0.05542399361729622,
-0.101641446352005,
0.01940436102449894,
-0.06249825656414032,
-0.042928971350193024,
0.008273954503238201,
-0.0044577643275260925,
-0.0034764595329761505,
0.06046442687511444,
-0.060512691736221313,
-0.011695628054440022,
-0.01563745178282261,
0.03582947701215744,
-0.06392505764961243,
-0.054600924253463745,
0.020398037508130074,
-0.09701293706893921,
0.09760924428701401,
0.046198081225156784,
0.01090741716325283,
0.00410073297098279,
0.06302139163017273,
-0.010198347270488739,
0.023480815812945366,
0.008850849233567715,
-0.039510328322649,
-0.09992741048336029,
0.009224200621247292,
-0.024252349510788918,
-0.03667740523815155,
-0.02157757617533207,
0.09410634636878967,
-0.08140004426240921,
0.032018546015024185,
-0.0005386843113228679,
-0.0004688321496360004,
-0.07752536982297897,
-0.002769807120785117,
0.09765902906656265,
0.08550038933753967,
0.05067935213446617,
-0.08463908731937408,
0.014735063537955284,
-0.1265806406736374,
-0.03640729561448097,
0.013287050649523735,
-0.016827333718538284,
-0.132772296667099,
-0.011530171148478985,
0.020269356667995453,
-0.013636148534715176,
0.18863940238952637,
-0.05007244646549225,
-0.022196229547262192,
0.019212830811738968,
-0.09535477310419083,
0.10965391248464584,
-0.021452801302075386,
0.1587524116039276,
-0.034285977482795715,
-0.03323960676789284,
-0.01543182972818613,
0.044493645429611206,
0.029483137652277946,
-0.008941545151174068,
0.18725840747356415,
0.130246102809906,
0.03428931534290314,
0.05657867714762688,
-0.025545377284288406,
-0.00791230145841837,
-0.05811309069395065,
-0.020432349294424057,
0.04073885828256607,
0.03386258706450462,
0.022720374166965485,
0.15015088021755219,
0.06406503170728683,
-0.16176439821720123,
0.035260364413261414,
-0.02787569724023342,
-0.04085886478424072,
-0.11746369302272797,
-0.10188043117523193,
-0.032744843512773514,
-0.05925833806395531,
0.01625366322696209,
-0.13435110449790955,
0.0071976035833358765,
0.17739014327526093,
0.06519211083650589,
0.02857232093811035,
0.014923383481800556,
-0.1278514415025711,
-0.04349758103489876,
0.05807299539446831,
0.011434013955295086,
0.019987130537629128,
0.0454566515982151,
-0.005081221926957369,
0.06987861543893814,
0.03341033682227135,
0.0024286676198244095,
-0.00556526193395257,
0.08201541751623154,
0.02351829968392849,
0.047075305134058,
-0.05552271753549576,
-0.0006645292742177844,
-0.04326754808425903,
0.08007687330245972,
0.11596100777387619,
0.04521922022104263,
-0.0491468720138073,
-0.010766837745904922,
0.15617482364177704,
-0.02635619416832924,
0.010435102507472038,
-0.11824683845043182,
0.31566959619522095,
0.01688804291188717,
0.014001383446156979,
0.05914212763309479,
-0.07923174649477005,
-0.04473239928483963,
0.20270220935344696,
0.06640294194221497,
-0.022572260349988937,
-0.028443511575460434,
0.0013300325954332948,
-0.03122699074447155,
-0.01616799458861351,
0.14089305698871613,
0.044288214296102524,
0.11475872248411179,
-0.05679238215088844,
-0.051704008132219315,
-0.03714966028928757,
0.0026239706203341484,
-0.11270271986722946,
0.14130273461341858,
-0.018783295527100563,
-0.018528729677200317,
-0.07527853548526764,
0.009460508823394775,
0.07399272918701172,
-0.3574606478214264,
0.009604239836335182,
-0.026225270703434944,
-0.1019054725766182,
-0.015086125582456589,
-0.022542672231793404,
-0.025509994477033615,
0.04749978333711624,
-0.04269436001777649,
0.0671379417181015,
0.04251837357878685,
0.03946717828512192,
-0.02140004374086857,
-0.09763149917125702,
0.15860222280025482,
0.060489922761917114,
0.1011897474527359,
0.016931895166635513,
0.0846932902932167,
0.059109728783369064,
0.03858285769820213,
-0.09413333237171173,
0.0546681210398674,
0.00855970662087202,
-0.06944717466831207,
-0.05509626120328903,
0.1172843798995018,
-0.0010620367247611284,
0.06344252824783325,
0.031835298985242844,
-0.12052379548549652,
0.025700626894831657,
0.07877827435731888,
-0.08665706217288971,
-0.09373867511749268,
-0.0015581152401864529,
-0.10058414191007614,
0.15441344678401947,
0.14770165085792542,
-0.009078620001673698,
0.015149352140724659,
-0.06728000193834305,
-0.007156391628086567,
0.05081498995423317,
0.017100892961025238,
-0.011114309541881084,
-0.18184131383895874,
0.05134863778948784,
-0.07892166823148727,
-0.0018805032595992088,
-0.2103709727525711,
-0.10014160722494125,
-0.015174317173659801,
-0.05694473162293434,
-0.02689811773598194,
0.058518145233392715,
0.02543565072119236,
0.07660499215126038,
-0.024640727788209915,
-0.023502705618739128,
-0.03800024464726448,
0.09518913179636002,
-0.10145596414804459,
-0.07339028269052505
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1700k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1700k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
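Because the seed-1 checkpoints share the same tokenizer and architecture, a common use of the intermediate releases is to compare representations across training steps. The sketch below contrasts this step-1700k checkpoint with the step-160k checkpoint from the same seed; it is an illustrative example, not an analysis from the paper:

```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_1-step_1700k")
early = BertModel.from_pretrained("google/multiberts-seed_1-step_160k")
late = BertModel.from_pretrained("google/multiberts-seed_1-step_1700k")

inputs = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
with torch.no_grad():
    # [CLS] vectors from the final hidden layer at the two training steps.
    cls_early = early(**inputs).last_hidden_state[:, 0]
    cls_late = late(**inputs).last_hidden_state[:, 0]

print(torch.nn.functional.cosine_similarity(cls_early, cls_late).item())
```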
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1700k"]}
| null |
google/multiberts-seed_1-step_1700k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1700k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL. We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
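A minimal PyTorch loading sketch for this checkpoint, mirroring the full card above (the TensorFlow variant is the same except that it uses TFBertModel):

```
from transformers import BertTokenizer, BertModel

# Load the seed-1, step-1700k intermediate checkpoint.
tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_1-step_1700k")
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1700k")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
output = model(**encoded_input)
```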
## Citation info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1700k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08295120298862457,
0.0727195292711258,
-0.001983141526579857,
0.046159528195858,
0.07487910240888596,
-0.020359937101602554,
0.05861159786581993,
0.09074092656373978,
-0.011661458760499954,
0.02221963368356228,
0.08719329535961151,
0.02700372226536274,
0.012154423631727695,
0.09975945204496384,
0.02092687599360943,
-0.2101517766714096,
0.0312507264316082,
-0.02679487131536007,
-0.08466855436563492,
0.07506625354290009,
0.10391775518655777,
-0.08584100753068924,
0.04287242144346237,
0.0336621068418026,
-0.11539959162473679,
0.054206471890211105,
-0.009546395391225815,
-0.025499312207102776,
0.13680630922317505,
0.003482765518128872,
0.05154937133193016,
0.05895131081342697,
0.045210737735033035,
-0.13153532147407532,
0.004401483107358217,
0.05504944548010826,
0.051284633576869965,
0.03742646798491478,
0.021477200090885162,
0.08393146097660065,
-0.016292091459035873,
0.03468609228730202,
0.05223118141293526,
0.017795687541365623,
-0.0646023079752922,
-0.06713753938674927,
-0.0973174124956131,
0.03547462821006775,
0.0272630974650383,
0.016665717586874962,
0.009007097221910954,
0.11929502338171005,
-0.03067000024020672,
0.04639093205332756,
0.1733935922384262,
-0.3090147078037262,
-0.005913663189858198,
0.06528375297784805,
0.023772843182086945,
0.1227736547589302,
-0.0007626713486388326,
-0.03255825489759445,
0.08174817264080048,
0.02347588539123535,
0.09884308278560638,
-0.039730481803417206,
0.0197915006428957,
-0.06009649857878685,
-0.15716509521007538,
-0.03984564542770386,
0.09492214769124985,
0.002394177718088031,
-0.1372894048690796,
-0.02217039279639721,
-0.04395439848303795,
0.037446945905685425,
0.01971905492246151,
-0.04475657269358635,
0.041821449995040894,
0.00392674608156085,
0.001339947571977973,
-0.005555315408855677,
-0.1028178334236145,
-0.04480290040373802,
0.024160806089639664,
0.0916646346449852,
0.11027934402227402,
0.05950741842389107,
-0.006020275875926018,
0.10576224327087402,
-0.1823444664478302,
-0.04539729282259941,
-0.029909487813711166,
-0.036390237510204315,
-0.04578659310936928,
-0.012171016074717045,
-0.09861517697572708,
-0.04629543796181679,
-0.003545119659975171,
0.1299232393503189,
-0.022065995261073112,
0.03470505401492119,
-0.029816167429089546,
0.0057268934324383736,
0.05902464687824249,
0.050914060324430466,
-0.021183965727686882,
0.023511288687586784,
0.03579138219356537,
-0.013667196966707706,
-0.02211039699614048,
0.009459352120757103,
0.0010061365319415927,
0.024403249844908714,
0.13485682010650635,
0.014666211791336536,
-0.10151103138923645,
0.07318996638059616,
-0.014181245118379593,
-0.04649084061384201,
0.006589989643543959,
-0.0842936560511589,
-0.06352131068706512,
-0.04122685268521309,
-0.012941683642566204,
0.0028771094512194395,
0.007212452590465546,
-0.009972427040338516,
-0.028454191982746124,
-0.026094911620020866,
-0.09270613640546799,
-0.06290671229362488,
-0.05318507179617882,
-0.13402630388736725,
0.007903541438281536,
-0.19092196226119995,
-0.030701894313097,
-0.11676843464374542,
-0.20674718916416168,
-0.038995932787656784,
0.04489301145076752,
0.0026244823820888996,
-0.06569162011146545,
0.06167309731245041,
0.03570925444364548,
-0.0311793964356184,
-0.003274997230619192,
0.08913392573595047,
-0.003925654571503401,
0.037783898413181305,
-0.0427844263613224,
0.060864608734846115,
0.008522436954081059,
0.04304145276546478,
-0.06126531958580017,
0.05571141466498375,
-0.1773623824119568,
0.042007483541965485,
-0.0757724791765213,
-0.02724255435168743,
-0.08605363219976425,
-0.03282453864812851,
-0.005186659749597311,
0.01376977190375328,
0.02191063202917576,
0.07301875203847885,
-0.16482892632484436,
-0.030058106407523155,
0.09303270280361176,
-0.15632584691047668,
-0.029064474627375603,
0.07499551773071289,
-0.053675923496484756,
0.11463537812232971,
0.06851073354482651,
0.15593662858009338,
-0.02564273588359356,
-0.0625663474202156,
0.049399882555007935,
-0.014545643702149391,
0.005900891963392496,
-0.015784520655870438,
0.06557688862085342,
-0.018383916467428207,
-0.1767570823431015,
0.02471219375729561,
-0.1325385868549347,
0.002194215077906847,
-0.07841894775629044,
0.02976316213607788,
-0.0050359186716377735,
-0.0698673278093338,
-0.08636438846588135,
-0.0315592922270298,
0.0750638023018837,
-0.07267400622367859,
-0.024286743253469467,
0.029338151216506958,
0.07716374099254608,
-0.0714503675699234,
0.06930053979158401,
-0.015396516770124435,
0.014825040474534035,
-0.07390161603689194,
-0.03403696045279503,
-0.18459747731685638,
0.041982416063547134,
0.09715054929256439,
0.016258245334029198,
-0.021233446896076202,
0.12205430120229721,
-0.007911618798971176,
0.06666480004787445,
-0.040249716490507126,
-0.002921562409028411,
-0.00675909174606204,
0.00026507768779993057,
-0.09329637140035629,
-0.11370504647493362,
-0.07250417768955231,
-0.07068560272455215,
0.08942302316427231,
-0.11513471603393555,
0.022345678880810738,
-0.060755982995033264,
0.04073047637939453,
0.017020974308252335,
-0.07250825315713882,
-0.010819080285727978,
0.014873214066028595,
-0.062475718557834625,
-0.05698426067829132,
0.034926798194646835,
0.06140242889523506,
-0.021468959748744965,
0.09814105927944183,
-0.05001435801386833,
-0.08982763439416885,
0.028731824830174446,
0.0753253772854805,
-0.10488484799861908,
0.026307180523872375,
-0.04804430902004242,
-0.048478297889232635,
-0.06653224676847458,
-0.02895652875304222,
0.09433025121688843,
-0.014658935368061066,
0.14228986203670502,
-0.08013705909252167,
-0.011068207211792469,
0.013256989419460297,
-0.010206672362983227,
-0.03014148399233818,
0.047625619918107986,
0.06829861551523209,
-0.08249504864215851,
0.021046778187155724,
0.030304867774248123,
0.004103854298591614,
0.06814280897378922,
-0.04772002622485161,
-0.07424019277095795,
0.02117292769253254,
0.033609796315431595,
0.02353602461516857,
0.05753520131111145,
-0.05292023345828056,
-0.011878274381160736,
0.029108092188835144,
0.022537346929311752,
0.013609523884952068,
-0.11740300059318542,
0.060820989310741425,
0.061822276562452316,
0.010282947681844234,
0.05618442967534065,
-0.01784135214984417,
-0.03382721543312073,
0.08229092508554459,
0.02886226214468479,
-0.01804838888347149,
-0.011125726625323296,
-0.01087187509983778,
-0.12517069280147552,
0.21796484291553497,
-0.06895754486322403,
-0.15705232322216034,
-0.0681784451007843,
-0.11084429919719696,
0.0053833965212106705,
0.023362241685390472,
0.0435795933008194,
-0.029608117416501045,
-0.04313276708126068,
-0.12616951763629913,
0.09205205738544464,
-0.04293333366513252,
0.06818222254514694,
0.11629016697406769,
-0.06411870568990707,
0.043948765844106674,
-0.13102176785469055,
-0.01207468006759882,
-0.07896403968334198,
-0.06224525347352028,
0.056970901787281036,
-0.05421477183699608,
0.037171900272369385,
0.11475790292024612,
0.016646763309836388,
-0.028386345133185387,
-0.03114679642021656,
0.2038438618183136,
0.044802889227867126,
0.04188928008079529,
0.12942413985729218,
-0.07068316638469696,
0.05205511674284935,
0.07624254375696182,
0.004122521262615919,
-0.04605520889163017,
0.052797622978687286,
0.05087624862790108,
-0.06359529495239258,
-0.19470882415771484,
-0.009193980135023594,
0.006763771641999483,
-0.04028163477778435,
0.07206770777702332,
0.03749743476510048,
0.007597713731229305,
0.07460154592990875,
0.016257498413324356,
0.06620722264051437,
0.005563624668866396,
0.0993768498301506,
0.02856127917766571,
-0.03956877440214157,
0.08566354215145111,
-0.007691563107073307,
-0.0018521195743232965,
0.07989177852869034,
-0.021128635853528976,
0.2944081723690033,
-0.0425405390560627,
0.01964196376502514,
0.12749981880187988,
0.03289040923118591,
0.04975783824920654,
0.12188590317964554,
-0.07783757150173187,
0.028153344988822937,
-0.07687084376811981,
-0.04708785191178322,
0.011495845392346382,
0.04760938882827759,
-0.07372503727674484,
0.01882210001349449,
-0.08664434403181076,
0.031956784427165985,
-0.024529598653316498,
0.2966059744358063,
0.0978367030620575,
-0.1117151752114296,
-0.0626562163233757,
0.0024451783392578363,
-0.10082720965147018,
-0.0728578045964241,
0.053021419793367386,
0.05505755543708801,
-0.12913158535957336,
0.004876643419265747,
-0.019371261820197105,
0.07634183764457703,
-0.02384016290307045,
0.01402733102440834,
0.034866031259298325,
0.05713202804327011,
-0.04514145851135254,
0.004149666056036949,
-0.17661415040493011,
0.20104673504829407,
-0.0024386304430663586,
0.02215966023504734,
-0.05615251883864403,
0.02956298179924488,
0.009979041293263435,
-0.009463516063988209,
0.06513652205467224,
0.021319404244422913,
-0.019342893734574318,
-0.05832313746213913,
-0.03910965472459793,
0.013154161162674427,
0.059555534273386,
-0.039766836911439896,
0.10100611299276352,
0.007726298179477453,
0.05332336574792862,
0.030515067279338837,
0.08265291154384613,
-0.18708205223083496,
-0.07806511968374252,
0.027261605486273766,
-0.04864390194416046,
-0.1099180281162262,
-0.08280102163553238,
-0.09822080284357071,
-0.007608204148709774,
0.22796010971069336,
-0.11108098179101944,
-0.07423723489046097,
-0.09256743639707565,
0.04833579063415527,
0.09791668504476547,
-0.05594530701637268,
0.025300249457359314,
-0.011395071633160114,
0.11198646575212479,
-0.07342442870140076,
-0.12078463286161423,
0.02790086157619953,
-0.1018553376197815,
-0.1604354828596115,
-0.0675550177693367,
0.08869435638189316,
0.0681898221373558,
0.02927980199456215,
-0.031240956857800484,
0.010027042590081692,
0.03909339755773544,
-0.03788723051548004,
0.002310875104740262,
0.06110290810465813,
0.08375372737646103,
0.038737744092941284,
-0.1015615314245224,
0.007838640362024307,
-0.0706600472331047,
-0.07237479090690613,
0.06940893083810806,
0.2748747169971466,
-0.052239544689655304,
0.11435795575380325,
0.1290237307548523,
-0.08722083270549774,
-0.15715546905994415,
0.04603820666670799,
0.08998852968215942,
-0.016517404466867447,
0.004024889785796404,
-0.158747598528862,
0.10148879140615463,
0.11472755670547485,
-0.015823032706975937,
-0.0017140861600637436,
-0.2053767293691635,
-0.1377154141664505,
0.09275302290916443,
0.11646880209445953,
0.284659206867218,
-0.04973761737346649,
-0.03645242750644684,
0.025008777156472206,
-0.09196349233388901,
0.0068624685518443584,
0.13192857801914215,
0.05751657858490944,
-0.022910093888640404,
-0.07342717796564102,
0.011517353355884552,
-0.037806104868650436,
0.09016922861337662,
0.06312806159257889,
0.0685141384601593,
-0.0017313784919679165,
0.0013483534567058086,
-0.03809214010834694,
-0.04005327820777893,
0.06835678964853287,
0.0379691943526268,
0.05304497852921486,
-0.08427200466394424,
-0.03745671734213829,
-0.07697530835866928,
0.0349312387406826,
-0.03130644932389259,
-0.07581575959920883,
-0.061251912266016006,
0.07950868457555771,
0.058021627366542816,
-0.034587521106004715,
0.030154282227158546,
0.03126062825322151,
0.10023441165685654,
0.15573471784591675,
-0.0022761512082070112,
-0.056123606860637665,
-0.05338611453771591,
-0.028693774715065956,
-0.015452506951987743,
0.075315922498703,
-0.044832613319158554,
0.01666555181145668,
0.06960460543632507,
0.019053447991609573,
0.1042434424161911,
0.06038040295243263,
-0.1245679035782814,
-0.02124519646167755,
0.028508901596069336,
-0.15434415638446808,
0.009914692491292953,
-0.00036602962063625455,
0.02261435240507126,
-0.01925468258559704,
0.020939750596880913,
0.14706993103027344,
-0.07004786282777786,
-0.03300097957253456,
-0.04800422489643097,
0.06384992599487305,
0.03661532700061798,
0.145306795835495,
0.03752347454428673,
0.03692178055644035,
-0.07755818963050842,
0.14183685183525085,
0.04674523323774338,
-0.047631390392780304,
0.02464420534670353,
-0.024690086022019386,
-0.1079234853386879,
0.014706750400364399,
0.05800661817193031,
0.06606854498386383,
-0.07870004326105118,
-0.016400348395109177,
-0.0364815779030323,
-0.08130114525556564,
0.073232501745224,
0.21304748952388763,
0.06243622675538063,
0.06525621563196182,
-0.05135715752840042,
-0.03292183205485344,
-0.07718431204557419,
0.0497286394238472,
0.04984425753355026,
0.07801225036382675,
-0.0728057324886322,
0.10321537405252457,
0.015073481947183609,
0.05197243392467499,
-0.026032468304038048,
-0.04717167839407921,
-0.10352163016796112,
-0.05486777424812317,
-0.10554716736078262,
0.013935745693743229,
-0.07091816514730453,
-0.0407070554792881,
0.005490506999194622,
-0.004509663674980402,
-0.00660120090469718,
0.0575898140668869,
-0.06010107323527336,
-0.01132300402969122,
-0.012394741177558899,
0.03774433955550194,
-0.06400541961193085,
-0.05348135530948639,
0.02134215459227562,
-0.09540507197380066,
0.09936133772134781,
0.04388163238763809,
0.01071867998689413,
0.0008455929346382618,
0.07808089256286621,
-0.00894202385097742,
0.022038128226995468,
0.008579937741160393,
-0.03959376737475395,
-0.10333728790283203,
0.00810431968420744,
-0.02417411096394062,
-0.039686016738414764,
-0.017990954220294952,
0.09042714536190033,
-0.08113237470388412,
0.030605919659137726,
0.0015947382198646665,
-0.004677148535847664,
-0.07825085520744324,
-0.004083591978996992,
0.09882879257202148,
0.07952582091093063,
0.0479263998568058,
-0.08359754085540771,
0.012670488096773624,
-0.12853047251701355,
-0.035652775317430496,
0.011789481155574322,
-0.016495289281010628,
-0.12946437299251556,
-0.008737163618206978,
0.01914985105395317,
-0.013092588633298874,
0.19031769037246704,
-0.05866693705320358,
-0.02821066789329052,
0.02340095303952694,
-0.09519980102777481,
0.10140296071767807,
-0.022685786709189415,
0.16371765732765198,
-0.031644195318222046,
-0.03539305180311203,
-0.0081968167796731,
0.044838935136795044,
0.03041401132941246,
-0.0026247312780469656,
0.19194158911705017,
0.13126589357852936,
0.04560762643814087,
0.05632779374718666,
-0.024462470784783363,
-0.008055884391069412,
-0.0638192892074585,
-0.023468080908060074,
0.04069375991821289,
0.03497212380170822,
0.024470951408147812,
0.14004752039909363,
0.0676354169845581,
-0.16173869371414185,
0.03509257733821869,
-0.025943104177713394,
-0.040352869778871536,
-0.11850842088460922,
-0.09758853167295456,
-0.030522674322128296,
-0.05990112945437431,
0.014690099284052849,
-0.134615957736969,
0.005573224741965532,
0.16351494193077087,
0.06637559831142426,
0.029358582571148872,
0.014409049414098263,
-0.1276084929704666,
-0.04173678904771805,
0.0575435571372509,
0.014428097754716873,
0.021771682426333427,
0.043779950588941574,
-0.0051251663826406,
0.06647808849811554,
0.02978927083313465,
0.0003392574726603925,
-0.0064585404470562935,
0.07781738042831421,
0.02563406713306904,
0.04702780023217201,
-0.056468576192855835,
-0.0010343262692913413,
-0.0438564270734787,
0.08219390362501144,
0.11333255469799042,
0.04656760394573212,
-0.048550814390182495,
-0.010981095023453236,
0.15674664080142975,
-0.025903725996613503,
0.011767769232392311,
-0.11794772744178772,
0.32622799277305603,
0.018507180735468864,
0.010330453515052795,
0.06224256381392479,
-0.07684627920389175,
-0.04720453917980194,
0.2065841406583786,
0.07024599611759186,
-0.01923096552491188,
-0.028165873140096664,
0.0014837022172287107,
-0.032212093472480774,
-0.016828984022140503,
0.1449943482875824,
0.043499767780303955,
0.11746980249881744,
-0.05323810875415802,
-0.04603336378931999,
-0.03815894201397896,
0.0011860339436680079,
-0.10975731909275055,
0.1437269002199173,
-0.015641910955309868,
-0.020128754898905754,
-0.07549824565649033,
0.012987685389816761,
0.0744294673204422,
-0.34232303500175476,
0.009737415239214897,
-0.026370679959654808,
-0.10264777392148972,
-0.013797922991216183,
-0.02656840905547142,
-0.028270086273550987,
0.0471217967569828,
-0.03891155123710632,
0.06527683883905411,
0.039954736828804016,
0.03910822793841362,
-0.022408289834856987,
-0.10782644897699356,
0.1630682796239853,
0.07311858236789703,
0.10753338038921356,
0.01731622777879238,
0.08410663157701492,
0.06139315664768219,
0.03984515368938446,
-0.09915713220834732,
0.049148138612508774,
0.008705005049705505,
-0.06235251948237419,
-0.055954717099666595,
0.11610057204961777,
-0.00019264787260908633,
0.06851695477962494,
0.03318594768643379,
-0.12321349233388901,
0.022693870589137077,
0.07408133894205093,
-0.0843801349401474,
-0.09223072230815887,
-0.00038679427234455943,
-0.09634219110012054,
0.15816551446914673,
0.1454814076423645,
-0.01099343691021204,
0.012330474331974983,
-0.06307471543550491,
-0.008966606110334396,
0.0513225756585598,
0.02025054395198822,
-0.011489727534353733,
-0.18651935458183289,
0.05348384380340576,
-0.0854753851890564,
-0.00022890821855980903,
-0.21168926358222961,
-0.10255134105682373,
-0.014079863205552101,
-0.0566234290599823,
-0.02710048481822014,
0.061138540506362915,
0.023439014330506325,
0.0743570551276207,
-0.023871399462223053,
-0.01804071106016636,
-0.039484016597270966,
0.0943465530872345,
-0.10313498228788376,
-0.07253170013427734
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1800k (max: 2000k, i.e., 2M steps).
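The repository names above follow the pattern `google/multiberts-seed_<seed>-step_<step>k`. As a minimal sketch of enumerating intermediate checkpoints for one seed, see below; the helper function is hypothetical and the step values are only examples taken from cards in this collection, so the set of checkpoints actually published should be confirmed on the Hub.

```
from transformers import BertModel

# Illustrative only: `multiberts_checkpoint` is a hypothetical helper, and the
# step values below are examples -- confirm on the Hub which steps exist.
def multiberts_checkpoint(seed: int, step_k: int) -> str:
    return f"google/multiberts-seed_{seed}-step_{step_k}k"

for step_k in (180, 1800):  # steps taken from cards in this collection
    repo = multiberts_checkpoint(seed=1, step_k=step_k)
    model = BertModel.from_pretrained(repo)
    print(repo, "hidden size:", model.config.hidden_size)
```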
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, so it is probable that some differences from
the original model have gone unnoticed. After full training, the performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1800k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1800k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
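If the checkpoint includes its masked-language-modelling head (the card lists MLM as a pre-training objective), it can also be queried for masked-token predictions. Below is a minimal sketch, assuming the head weights load into `BertForMaskedLM`; if they do not, the head is randomly initialized and the outputs are not meaningful. The example sentence is arbitrary.

```
from transformers import BertTokenizer, BertForMaskedLM
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1800k')
# Assumes the MLM head weights are present in the checkpoint; otherwise
# transformers will warn that the head is newly initialized.
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_1800k')
model.eval()

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and list the top-5 predicted tokens for it.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```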
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1800k"]}
| null |
google/multiberts-seed_1-step_1800k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1800k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1800k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0795673057436943,
0.08213502168655396,
-0.002120091114193201,
0.04372594878077507,
0.07585526257753372,
-0.019582431763410568,
0.06641896069049835,
0.09072704613208771,
-0.014590125530958176,
0.020778654143214226,
0.08260614424943924,
0.02060311660170555,
0.010118209756910801,
0.09759077429771423,
0.021940406411886215,
-0.20628878474235535,
0.030578969046473503,
-0.026510192081332207,
-0.09006340801715851,
0.07469619810581207,
0.10323052853345871,
-0.0844346210360527,
0.04208972677588463,
0.034207768738269806,
-0.11156060546636581,
0.053479090332984924,
-0.009897178038954735,
-0.026100607588887215,
0.136117085814476,
0.000259014661423862,
0.0573553666472435,
0.055922530591487885,
0.044057708233594894,
-0.13598847389221191,
0.003946802578866482,
0.05578606575727463,
0.05241009220480919,
0.03719482570886612,
0.019831955432891846,
0.0772872045636177,
-0.026944197714328766,
0.03529660403728485,
0.05418175831437111,
0.0181635282933712,
-0.06675047427415848,
-0.05800720676779747,
-0.09641361236572266,
0.02812931500375271,
0.027632182464003563,
0.023860186338424683,
0.009159914217889309,
0.11916276067495346,
-0.03326490521430969,
0.04286423698067665,
0.17543701827526093,
-0.3021566569805145,
-0.003657723544165492,
0.06147769093513489,
0.017740806564688683,
0.11916635185480118,
-0.001035878318361938,
-0.035175763070583344,
0.07969857007265091,
0.022928038612008095,
0.09754028916358948,
-0.04015572741627693,
0.01798715628683567,
-0.05753925070166588,
-0.15569300949573517,
-0.03934310004115105,
0.09753512591123581,
0.002764505334198475,
-0.13703706860542297,
-0.019667627289891243,
-0.0419483557343483,
0.02893633022904396,
0.019033193588256836,
-0.04046414792537689,
0.0443110316991806,
0.0025923883076757193,
-0.002522203139960766,
-0.003625165903940797,
-0.10372704267501831,
-0.04214158281683922,
0.023297198116779327,
0.09226609021425247,
0.10820598900318146,
0.05700814351439476,
-0.00725617166608572,
0.10495930910110474,
-0.18560650944709778,
-0.047012317925691605,
-0.028068095445632935,
-0.03460671007633209,
-0.047931499779224396,
-0.01194848120212555,
-0.10637187212705612,
-0.05113030597567558,
-0.003338160226121545,
0.13517825305461884,
-0.02313290536403656,
0.0336124561727047,
-0.026174696162343025,
0.005657084286212921,
0.060422830283641815,
0.05136807635426521,
-0.020562993362545967,
0.02983502857387066,
0.03133247420191765,
-0.01447987835854292,
-0.0225104708224535,
0.009304367005825043,
-0.0014564803568646312,
0.03054545633494854,
0.13496428728103638,
0.01280126255005598,
-0.10384895652532578,
0.0735592171549797,
-0.01725420542061329,
-0.04890499264001846,
-0.008472336456179619,
-0.08769570291042328,
-0.06578656286001205,
-0.04171961918473244,
-0.014416936784982681,
-0.000532399455551058,
0.007221438456326723,
-0.0061322287656366825,
-0.02854439802467823,
-0.019667435437440872,
-0.09088561683893204,
-0.05971480906009674,
-0.05606488138437271,
-0.1304054707288742,
0.007817836478352547,
-0.19547691941261292,
-0.029453415423631668,
-0.11545700579881668,
-0.2030666172504425,
-0.03802871331572533,
0.051334530115127563,
0.00291154976002872,
-0.06642094254493713,
0.06132952496409416,
0.03466910123825073,
-0.03197008743882179,
-0.0033402927219867706,
0.08326050639152527,
-0.005143672693520784,
0.03560185804963112,
-0.044132281094789505,
0.05655871704220772,
0.0055373613722622395,
0.04601743817329407,
-0.062252044677734375,
0.05415873974561691,
-0.1792069971561432,
0.03959789127111435,
-0.0728638544678688,
-0.02519260346889496,
-0.08549065887928009,
-0.031611159443855286,
-0.0070005301386117935,
0.013384957797825336,
0.022465379908680916,
0.07185347378253937,
-0.16300766170024872,
-0.030178245157003403,
0.0817352831363678,
-0.15191495418548584,
-0.02866183966398239,
0.0708024799823761,
-0.05679338797926903,
0.11390461027622223,
0.06515774875879288,
0.16184523701667786,
-0.026892008259892464,
-0.06186066195368767,
0.05017167702317238,
-0.018197569996118546,
0.009074344299733639,
-0.009984564036130905,
0.06719221919775009,
-0.02165226638317108,
-0.16532984375953674,
0.023140424862504005,
-0.13368260860443115,
0.0016502472572028637,
-0.0762609988451004,
0.027165334671735764,
-0.0029279247391968966,
-0.07116412371397018,
-0.08358445018529892,
-0.0324086993932724,
0.07535700500011444,
-0.07117926329374313,
-0.02644229866564274,
0.03443706035614014,
0.07693173736333847,
-0.06990569829940796,
0.06965401768684387,
-0.01770721562206745,
0.012805987149477005,
-0.07447046786546707,
-0.037089861929416656,
-0.18663723766803741,
0.03527570143342018,
0.09453750401735306,
0.016274455934762955,
-0.01573440060019493,
0.12286002933979034,
-0.008556040935218334,
0.06053224951028824,
-0.04015389829874039,
-0.00240400736220181,
-0.013263214379549026,
0.0008562013972550631,
-0.09003128111362457,
-0.1113303005695343,
-0.0703113004565239,
-0.06856206804513931,
0.09788735210895538,
-0.11006205528974533,
0.022444725036621094,
-0.058302219957113266,
0.039816826581954956,
0.017469538375735283,
-0.07273340970277786,
-0.010952310636639595,
0.014807230792939663,
-0.0640958845615387,
-0.05596363544464111,
0.03589745610952377,
0.06397176533937454,
-0.020364101976156235,
0.0984199270606041,
-0.051375612616539,
-0.0907718688249588,
0.030128834769129753,
0.07045833766460419,
-0.10577782988548279,
0.0290741715580225,
-0.046969737857580185,
-0.04761587828397751,
-0.06561271846294403,
-0.02634316124022007,
0.10337255150079727,
-0.012605900876224041,
0.1427338719367981,
-0.07752469182014465,
-0.012294753454625607,
0.011626708321273327,
-0.012646906077861786,
-0.027906090021133423,
0.04500489681959152,
0.07037415355443954,
-0.06637881696224213,
0.02199922502040863,
0.03215162828564644,
-0.00011551731586223468,
0.0665321871638298,
-0.04982450604438782,
-0.07651785016059875,
0.02179383672773838,
0.03072516992688179,
0.024343904107809067,
0.060715340077877045,
-0.049831971526145935,
-0.011714833788573742,
0.029650937765836716,
0.0232011117041111,
0.013228308409452438,
-0.11642223596572876,
0.06113142520189285,
0.06287112087011337,
0.011758416891098022,
0.05897274240851402,
-0.02246473729610443,
-0.03289898857474327,
0.07980605214834213,
0.03164530545473099,
-0.017906291410326958,
-0.01011187955737114,
-0.012079584412276745,
-0.12451678514480591,
0.21587026119232178,
-0.06937529146671295,
-0.15747392177581787,
-0.07095222175121307,
-0.1188209280371666,
0.0037114263977855444,
0.024375006556510925,
0.04192832484841347,
-0.030289525166153908,
-0.0438494011759758,
-0.12551844120025635,
0.08819950371980667,
-0.0419200100004673,
0.06439532339572906,
0.11700384318828583,
-0.061497073620557785,
0.04721158742904663,
-0.12990622222423553,
-0.012213918380439281,
-0.0776624083518982,
-0.06200847774744034,
0.05860666185617447,
-0.05183270946145058,
0.03803997114300728,
0.11997397989034653,
0.01577245444059372,
-0.027180727571249008,
-0.02974771335721016,
0.1996392458677292,
0.040575992316007614,
0.039938341826200485,
0.1282557249069214,
-0.0751579999923706,
0.05102403089404106,
0.08110983669757843,
0.004766068886965513,
-0.0439113974571228,
0.05530162900686264,
0.05598215013742447,
-0.061933111399412155,
-0.19723239541053772,
-0.012198682874441147,
0.00744616100564599,
-0.04507001116871834,
0.07170222699642181,
0.03812629356980324,
0.016481800004839897,
0.07502830028533936,
0.018414415419101715,
0.06280266493558884,
0.004041203297674656,
0.10030998289585114,
0.027817703783512115,
-0.03612789884209633,
0.08478592336177826,
-0.007499683182686567,
-0.007348107639700174,
0.0785241425037384,
-0.022110920399427414,
0.2975107729434967,
-0.04165692254900932,
0.024629389867186546,
0.1277792751789093,
0.02744934894144535,
0.05078708380460739,
0.11782025545835495,
-0.07676079869270325,
0.02847934514284134,
-0.07598984241485596,
-0.04729495570063591,
0.013263952918350697,
0.048323240131139755,
-0.07578977197408676,
0.019772065803408623,
-0.08481402695178986,
0.02816418744623661,
-0.023165235295891762,
0.29999539256095886,
0.10170423984527588,
-0.11134922504425049,
-0.058185089379549026,
0.0008552953950129449,
-0.09907886385917664,
-0.06988543272018433,
0.05309845879673958,
0.04602332040667534,
-0.13019882142543793,
0.004268284887075424,
-0.022435901686549187,
0.0761825367808342,
-0.022275056689977646,
0.015886249020695686,
0.04187864810228348,
0.0524374283850193,
-0.045617640018463135,
0.006115598604083061,
-0.18368032574653625,
0.19885021448135376,
-0.0027956177946180105,
0.020211847499012947,
-0.054002393037080765,
0.029860997572541237,
0.010869065299630165,
-0.01380574144423008,
0.06341741234064102,
0.021011793985962868,
-0.009119711816310883,
-0.0543162040412426,
-0.04059860482811928,
0.01336124911904335,
0.06378787755966187,
-0.04351828992366791,
0.10421401262283325,
0.007613577414304018,
0.053777601569890976,
0.031809598207473755,
0.08444850146770477,
-0.18510480225086212,
-0.08189158141613007,
0.030193304643034935,
-0.047505393624305725,
-0.10370136052370071,
-0.08251536637544632,
-0.0983063280582428,
-0.009413029998540878,
0.22802169620990753,
-0.10332432389259338,
-0.07413371652364731,
-0.09251408278942108,
0.04306140914559364,
0.0972331166267395,
-0.053442638367414474,
0.027781007811427116,
-0.010186215862631798,
0.1148887351155281,
-0.07146333158016205,
-0.12090401351451874,
0.02582082524895668,
-0.09861508011817932,
-0.15686249732971191,
-0.06810462474822998,
0.08898986876010895,
0.06680796295404434,
0.031497858464717865,
-0.03608192130923271,
0.014550828374922276,
0.03666127845644951,
-0.0356234610080719,
-0.0020542596466839314,
0.057113323360681534,
0.08879319578409195,
0.04251255467534065,
-0.1060747355222702,
0.011500628665089607,
-0.07410316169261932,
-0.07231484353542328,
0.0734662115573883,
0.2684471607208252,
-0.05085386708378792,
0.11438692361116409,
0.12640348076820374,
-0.08734124898910522,
-0.15830537676811218,
0.04746931418776512,
0.09072764962911606,
-0.017352323979139328,
0.0061808484606444836,
-0.16399487853050232,
0.1030486673116684,
0.11243985593318939,
-0.015365646220743656,
-0.0017403055680915713,
-0.20071201026439667,
-0.1367742121219635,
0.09298943728208542,
0.11400485783815384,
0.2793034315109253,
-0.051068712025880814,
-0.03643278777599335,
0.0224347785115242,
-0.09935598820447922,
0.008094798773527145,
0.12924396991729736,
0.05790996178984642,
-0.019764700904488564,
-0.06730018556118011,
0.01113678514957428,
-0.0376402921974659,
0.08910085260868073,
0.0658346489071846,
0.06777320057153702,
-0.003970697522163391,
-0.004018363542854786,
-0.03543274104595184,
-0.0426492914557457,
0.07009106874465942,
0.04002266377210617,
0.04968956112861633,
-0.08473781496286392,
-0.03611119091510773,
-0.07450511306524277,
0.03478148207068443,
-0.031233791261911392,
-0.07802649587392807,
-0.06173193082213402,
0.08045384287834167,
0.057999663054943085,
-0.03370127081871033,
0.034309227019548416,
0.031608548015356064,
0.09738259762525558,
0.1499851942062378,
-0.0019844025373458862,
-0.05730832740664482,
-0.061211030930280685,
-0.028803134337067604,
-0.013210718519985676,
0.07194328308105469,
-0.045343536883592606,
0.018693121150135994,
0.07120786607265472,
0.019435497000813484,
0.10639859735965729,
0.0614502876996994,
-0.12450563907623291,
-0.022250505164265633,
0.028037361800670624,
-0.1541156768798828,
0.017102548852562904,
-0.0030849671456962824,
0.02700987085700035,
-0.021463708952069283,
0.02210322581231594,
0.14746782183647156,
-0.0698469877243042,
-0.031700391322374344,
-0.04505058005452156,
0.06238075718283653,
0.03681178390979767,
0.14178112149238586,
0.03993071988224983,
0.03652598336338997,
-0.07741595059633255,
0.14551174640655518,
0.04823882132768631,
-0.046171996742486954,
0.02723185159265995,
-0.026324857026338577,
-0.1081070825457573,
0.014659367501735687,
0.06120170280337334,
0.0716901645064354,
-0.07628233730792999,
-0.016684826463460922,
-0.0412273108959198,
-0.08088640123605728,
0.07212980091571808,
0.2146494835615158,
0.06359001994132996,
0.06537115573883057,
-0.051657356321811676,
-0.03460022807121277,
-0.07736731320619583,
0.05082046613097191,
0.04863326624035835,
0.07934397459030151,
-0.07343180477619171,
0.09985003620386124,
0.01585089974105358,
0.05222272500395775,
-0.025215959176421165,
-0.050593774765729904,
-0.10209313035011292,
-0.05301305279135704,
-0.10140909999608994,
0.011989126913249493,
-0.06580591946840286,
-0.042434222996234894,
0.006158751901239157,
-0.0023239923175424337,
-0.007434837054461241,
0.05753398314118385,
-0.060532040894031525,
-0.011741331778466702,
-0.014030181802809238,
0.035017818212509155,
-0.06162059307098389,
-0.05184485763311386,
0.020614271983504295,
-0.09505132585763931,
0.09712149202823639,
0.04382321238517761,
0.010112397372722626,
0.0008103534928523004,
0.07805855572223663,
-0.011484730988740921,
0.018222542479634285,
0.009012836962938309,
-0.04099906235933304,
-0.10637196898460388,
0.005422227084636688,
-0.02540496736764908,
-0.03638005629181862,
-0.01947455108165741,
0.0889744833111763,
-0.08168309926986694,
0.03016062267124653,
0.001660799141973257,
-0.004199997987598181,
-0.07964322715997696,
-0.006245036143809557,
0.0974731519818306,
0.08069959282875061,
0.05110161006450653,
-0.07958106696605682,
0.01552046649158001,
-0.1265561729669571,
-0.035264406353235245,
0.011722219176590443,
-0.012457511387765408,
-0.1334739327430725,
-0.010233607143163681,
0.0160251185297966,
-0.013843320310115814,
0.190590962767601,
-0.05707710236310959,
-0.03071080520749092,
0.02388877235352993,
-0.09416646510362625,
0.10221832245588303,
-0.019342632964253426,
0.16263490915298462,
-0.028781509026885033,
-0.03707059472799301,
-0.011158663779497147,
0.04331846907734871,
0.029528189450502396,
-0.00613638898357749,
0.18929409980773926,
0.13603651523590088,
0.04944426566362381,
0.054178282618522644,
-0.02348598837852478,
-0.0075820861384272575,
-0.054570093750953674,
-0.014899172820150852,
0.04090135172009468,
0.033555544912815094,
0.027983466163277626,
0.14881877601146698,
0.06623387336730957,
-0.15911422669887543,
0.03832513839006424,
-0.029057245701551437,
-0.04021042585372925,
-0.1136673241853714,
-0.09142223745584488,
-0.030353661626577377,
-0.06199735403060913,
0.013543106615543365,
-0.13452282547950745,
0.0038610240444540977,
0.17845989763736725,
0.06648322939872742,
0.029313014820218086,
0.018026385456323624,
-0.13705797493457794,
-0.043767210096120834,
0.056561179459095,
0.012252932414412498,
0.022156456485390663,
0.04204709082841873,
-0.003319091396406293,
0.06519356369972229,
0.03303409367799759,
0.0019149164436385036,
-0.004347912035882473,
0.08260491490364075,
0.024276599287986755,
0.048246003687381744,
-0.05744180455803871,
0.0013804615009576082,
-0.0457267127931118,
0.08213048428297043,
0.11088690906763077,
0.04242714121937752,
-0.047320656478405,
-0.01033040415495634,
0.1600586622953415,
-0.027142537757754326,
0.005660022608935833,
-0.12110882997512817,
0.3180530369281769,
0.01765117421746254,
0.009381547570228577,
0.059018708765506744,
-0.08041081577539444,
-0.04589967057108879,
0.2075057327747345,
0.07115374505519867,
-0.0233894195407629,
-0.028349097818136215,
0.005280204117298126,
-0.03116115741431713,
-0.017397426068782806,
0.1400262415409088,
0.04313621297478676,
0.11661268025636673,
-0.05560477077960968,
-0.04843365028500557,
-0.03775634989142418,
0.0026315259747207165,
-0.11153484135866165,
0.14511865377426147,
-0.0153418630361557,
-0.019877048209309578,
-0.07580744475126266,
0.013032357208430767,
0.07361101359128952,
-0.33888304233551025,
0.0060731288976967335,
-0.032789912074804306,
-0.10396130383014679,
-0.013614588417112827,
-0.024755539372563362,
-0.030729113146662712,
0.04798612743616104,
-0.0387096144258976,
0.06259629130363464,
0.03962334245443344,
0.04044148325920105,
-0.0212115328758955,
-0.10633653402328491,
0.16413383185863495,
0.06778711825609207,
0.1069958284497261,
0.017359739169478416,
0.08231393992900848,
0.061373282223939896,
0.03917713463306427,
-0.09534646570682526,
0.05124824866652489,
0.008871759288012981,
-0.06351462751626968,
-0.056025102734565735,
0.1142447292804718,
-0.00041231993236579,
0.06677645444869995,
0.03330804407596588,
-0.12189538776874542,
0.023735415190458298,
0.07532928884029388,
-0.08426213264465332,
-0.09231822192668915,
-0.0035894601605832577,
-0.09932161867618561,
0.1569361388683319,
0.1440766602754593,
-0.010379387065768242,
0.013441511429846287,
-0.06303197145462036,
-0.010736200027167797,
0.050200045108795166,
0.016194846481084824,
-0.011803916655480862,
-0.18605192005634308,
0.05437972769141197,
-0.08358042687177658,
-0.001846693572588265,
-0.22267021238803864,
-0.10164277255535126,
-0.010679753497242928,
-0.05554332584142685,
-0.024007663130760193,
0.05858573317527771,
0.028636835515499115,
0.07586347311735153,
-0.02611350640654564,
-0.009637749753892422,
-0.03752555698156357,
0.09475015848875046,
-0.10280437022447586,
-0.07371485978364944
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 180k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 180k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, so it is probable that some differences from
the original model have gone unnoticed. After full training, the performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_180k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_180k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_180k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_180k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
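Intermediate checkpoints make it possible to track how representations change over pre-training. The following is a minimal sketch, not an evaluation recipe from the paper: it compares the final-layer [CLS] vector of this 180k-step checkpoint with the 1800k-step checkpoint of the same seed (both repository names appear in this collection); the example sentence is arbitrary.

```
from transformers import BertTokenizer, BertModel
import torch

ckpt_early = "google/multiberts-seed_1-step_180k"
ckpt_late = "google/multiberts-seed_1-step_1800k"

tokenizer = BertTokenizer.from_pretrained(ckpt_early)
inputs = tokenizer("MultiBERTs checkpoints support robustness analysis.", return_tensors="pt")

def cls_vector(repo: str) -> torch.Tensor:
    """Return the final-layer [CLS] embedding for the fixed example sentence."""
    model = BertModel.from_pretrained(repo)
    model.eval()
    with torch.no_grad():
        return model(**inputs).last_hidden_state[0, 0]

sim = torch.nn.functional.cosine_similarity(cls_vector(ckpt_early), cls_vector(ckpt_late), dim=0)
print(f"[CLS] cosine similarity between 180k and 1800k: {sim.item():.3f}")
```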
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_180k"]}
| null |
google/multiberts-seed_1-step_180k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_180k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 180k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 180k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 180k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 180k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 180k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07904627174139023,
0.07064997404813766,
-0.0021328963339328766,
0.042476434260606766,
0.07349716126918793,
-0.022089211270213127,
0.06279350072145462,
0.09069100022315979,
-0.017466416582465172,
0.023160818964242935,
0.08440309017896652,
0.029725002124905586,
0.009297407232224941,
0.10340531915426254,
0.021152520552277565,
-0.21307232975959778,
0.031412288546562195,
-0.025333603844046593,
-0.09458091855049133,
0.07596217840909958,
0.1028299555182457,
-0.08200491964817047,
0.04210098087787628,
0.03515111282467842,
-0.1100466400384903,
0.051550690084695816,
-0.010963277891278267,
-0.02161494269967079,
0.13910512626171112,
0.004285596776753664,
0.05505730211734772,
0.055546876043081284,
0.04355255141854286,
-0.13609719276428223,
0.004733011592179537,
0.0570036955177784,
0.05109782889485359,
0.0384787879884243,
0.02162034995853901,
0.07912010699510574,
-0.02030274085700512,
0.03169404715299606,
0.050666213035583496,
0.01702996902167797,
-0.06625211983919144,
-0.06255821883678436,
-0.09752114862203598,
0.028661249205470085,
0.028129680082201958,
0.02487412467598915,
0.007049998268485069,
0.11710681766271591,
-0.0336802639067173,
0.04315395653247833,
0.18248498439788818,
-0.3089248538017273,
-0.004562634974718094,
0.054705142974853516,
0.021113025024533272,
0.11246266961097717,
-0.0016435283469036222,
-0.0324702151119709,
0.08205203711986542,
0.024404238909482956,
0.100776307284832,
-0.03889020159840584,
0.02263110876083374,
-0.06132868677377701,
-0.1589585393667221,
-0.037912942469120026,
0.09266149252653122,
0.0044392687268555164,
-0.1372104287147522,
-0.022515466436743736,
-0.04509779065847397,
0.03601079061627388,
0.020362569019198418,
-0.041170377284288406,
0.04161553084850311,
0.002922352869063616,
-0.003977792337536812,
-0.004824675153940916,
-0.10564874112606049,
-0.04549510031938553,
0.019731419160962105,
0.10004527866840363,
0.10958759486675262,
0.055184684693813324,
-0.006184784229844809,
0.10691282153129578,
-0.1816680133342743,
-0.046801600605249405,
-0.02793712355196476,
-0.03626533970236778,
-0.046151719987392426,
-0.010090352967381477,
-0.10134603828191757,
-0.05010485276579857,
-0.0011240942403674126,
0.13503028452396393,
-0.024500178173184395,
0.031035710126161575,
-0.02961568720638752,
0.004921712446957827,
0.05892438441514969,
0.053365275263786316,
-0.02273303084075451,
0.03382590785622597,
0.03148343414068222,
-0.012593725696206093,
-0.021146230399608612,
0.010099486447870731,
0.0003974359424319118,
0.024171344935894012,
0.1344873160123825,
0.015707600861787796,
-0.10610006004571915,
0.07699855417013168,
-0.01607113890349865,
-0.048062764108181,
0.0002214001870015636,
-0.08375649154186249,
-0.06782152503728867,
-0.03945522755384445,
-0.013678993098437786,
0.007732728496193886,
0.00793019775301218,
-0.01013279240578413,
-0.02557312697172165,
-0.01787799410521984,
-0.08831888437271118,
-0.058281946927309036,
-0.05391768738627434,
-0.13147832453250885,
0.009364409372210503,
-0.19515658915042877,
-0.028266210108995438,
-0.11880028992891312,
-0.19803792238235474,
-0.03669523820281029,
0.04697984829545021,
0.003640130627900362,
-0.06218687444925308,
0.060771193355321884,
0.03649056702852249,
-0.031196823343634605,
-0.004551001824438572,
0.0913766622543335,
-0.007269176188856363,
0.03775731846690178,
-0.04341068118810654,
0.05809450149536133,
0.009018423967063427,
0.044284507632255554,
-0.061695341020822525,
0.05564820393919945,
-0.17446526885032654,
0.0374528169631958,
-0.0746983140707016,
-0.02893967181444168,
-0.08839019387960434,
-0.02952052280306816,
-0.010764677077531815,
0.014949304983019829,
0.022869877517223358,
0.07017805427312851,
-0.17279565334320068,
-0.030131569132208824,
0.08661265671253204,
-0.15179593861103058,
-0.028374262154102325,
0.07445955276489258,
-0.055810898542404175,
0.11597853153944016,
0.06460203975439072,
0.15754306316375732,
-0.033927638083696365,
-0.07129217684268951,
0.05117735639214516,
-0.015451030805706978,
0.006622392684221268,
-0.006709510926157236,
0.06749530136585236,
-0.021207844838500023,
-0.16462492942810059,
0.02238316461443901,
-0.13520391285419464,
0.003984692040830851,
-0.07726976275444031,
0.02994675189256668,
-0.002452217973768711,
-0.06991566717624664,
-0.080294668674469,
-0.03379400074481964,
0.0757305845618248,
-0.07012619823217392,
-0.02888372167944908,
0.03225874900817871,
0.0784158706665039,
-0.07249104231595993,
0.06820175796747208,
-0.01468961127102375,
0.016050776466727257,
-0.0743132159113884,
-0.034810394048690796,
-0.18575045466423035,
0.03665667027235031,
0.09336571395397186,
0.013403092510998249,
-0.01890898495912552,
0.12828169763088226,
-0.006542894523590803,
0.06275027990341187,
-0.0404147133231163,
-0.0025046474765986204,
-0.01063699834048748,
0.0010431428672745824,
-0.09393320232629776,
-0.10902736335992813,
-0.07186899334192276,
-0.06882576644420624,
0.09567651897668839,
-0.11241614818572998,
0.019973402842879295,
-0.06082398444414139,
0.04205971211194992,
0.017953097820281982,
-0.06776419281959534,
-0.008943235501646996,
0.01358045544475317,
-0.06346414983272552,
-0.05555424839258194,
0.0365716889500618,
0.06249939277768135,
-0.02102864347398281,
0.10043390095233917,
-0.05167626589536667,
-0.09073150902986526,
0.02836688980460167,
0.06961182504892349,
-0.10647030174732208,
0.025868576020002365,
-0.04789067432284355,
-0.04662557318806648,
-0.07213953882455826,
-0.02782970480620861,
0.0982716977596283,
-0.010170575231313705,
0.14239293336868286,
-0.076039157807827,
-0.012467253021895885,
0.00928936805576086,
-0.014483490958809853,
-0.026366587728261948,
0.05119962617754936,
0.06312043219804764,
-0.07648766040802002,
0.02183113619685173,
0.03865461051464081,
-0.0030221727211028337,
0.06950854510068893,
-0.05100511759519577,
-0.07781856507062912,
0.021547602489590645,
0.033855512738227844,
0.023759985342621803,
0.05785075202584267,
-0.04254661872982979,
-0.011974206194281578,
0.030297275632619858,
0.023945780470967293,
0.011064939200878143,
-0.11732232570648193,
0.059424009174108505,
0.06192072853446007,
0.008086496964097023,
0.06302555650472641,
-0.016801314428448677,
-0.034907691180706024,
0.0802304595708847,
0.032795969396829605,
-0.017974333837628365,
-0.008853544481098652,
-0.010317270644009113,
-0.12179216742515564,
0.21747499704360962,
-0.0652071014046669,
-0.15440091490745544,
-0.06646616011857986,
-0.10762035101652145,
0.0024793660268187523,
0.022955236956477165,
0.04438534751534462,
-0.030463973060250282,
-0.041301656514406204,
-0.12302394956350327,
0.10196789354085922,
-0.03877117484807968,
0.06420421600341797,
0.10956433415412903,
-0.06202198192477226,
0.04554383084177971,
-0.1312568187713623,
-0.012018570676445961,
-0.08015578240156174,
-0.06357184797525406,
0.055913642048835754,
-0.05643110349774361,
0.040761835873126984,
0.1126324012875557,
0.018269408494234085,
-0.028532996773719788,
-0.02989616058766842,
0.1979721337556839,
0.04154475778341293,
0.0386311374604702,
0.13501493632793427,
-0.07224641740322113,
0.0508473739027977,
0.07527199387550354,
0.004018506966531277,
-0.04375177249312401,
0.055689264088869095,
0.0595349557697773,
-0.06235185265541077,
-0.19458603858947754,
-0.009259912185370922,
0.009109522216022015,
-0.043121542781591415,
0.07255042344331741,
0.037465810775756836,
0.012798159383237362,
0.07708185911178589,
0.015348670072853565,
0.06401386111974716,
0.004390915855765343,
0.09895728528499603,
0.026425698772072792,
-0.03499315679073334,
0.08744345605373383,
-0.008918555453419685,
-0.008802218362689018,
0.07707558572292328,
-0.01718970388174057,
0.29703202843666077,
-0.044117413461208344,
0.026069121435284615,
0.12858499586582184,
0.03011893667280674,
0.051154885441064835,
0.12313007563352585,
-0.07476285099983215,
0.02723661996424198,
-0.07807765901088715,
-0.04874593764543533,
0.00744685810059309,
0.04706622660160065,
-0.07517682760953903,
0.017365310341119766,
-0.08650010824203491,
0.026851395145058632,
-0.022397827357053757,
0.3032665252685547,
0.10238271951675415,
-0.10909713059663773,
-0.06077415496110916,
0.0008120372658595443,
-0.09894835948944092,
-0.06954630464315414,
0.05380001291632652,
0.045645490288734436,
-0.13055284321308136,
0.0017137705581262708,
-0.023745346814393997,
0.07808167487382889,
-0.03174575790762901,
0.017228715121746063,
0.03533920273184776,
0.0536416657269001,
-0.04624500870704651,
0.007701721973717213,
-0.1878194659948349,
0.20452725887298584,
-0.0032461092341691256,
0.023557482287287712,
-0.056141868233680725,
0.02703128196299076,
0.012569773010909557,
-0.01449396088719368,
0.06609103083610535,
0.017366541549563408,
-0.014377498999238014,
-0.06002379208803177,
-0.03932154178619385,
0.013350044377148151,
0.061672311276197433,
-0.04464929923415184,
0.10270453244447708,
0.00660719396546483,
0.053250595927238464,
0.030695121735334396,
0.09126816689968109,
-0.18822988867759705,
-0.07660861313343048,
0.027237948030233383,
-0.05284201353788376,
-0.10148738324642181,
-0.08176521956920624,
-0.09928359091281891,
-0.009475817903876305,
0.22011780738830566,
-0.10035990923643112,
-0.07414917647838593,
-0.09219696372747421,
0.04082459211349487,
0.09555614739656448,
-0.057244911789894104,
0.021873608231544495,
-0.011817649006843567,
0.11424275487661362,
-0.07342777401208878,
-0.12400168925523758,
0.026990555226802826,
-0.10043488442897797,
-0.15866293013095856,
-0.0654909685254097,
0.08937622606754303,
0.06863707304000854,
0.030196186155080795,
-0.03255296126008034,
0.010428173467516899,
0.03673188388347626,
-0.03614559769630432,
0.0008359081693924963,
0.05670972168445587,
0.08667012304067612,
0.03558102250099182,
-0.10832592099905014,
0.016574759036302567,
-0.07502713799476624,
-0.07009745389223099,
0.07207020372152328,
0.2664312720298767,
-0.05051537603139877,
0.10769398510456085,
0.12840640544891357,
-0.08467385172843933,
-0.15900105237960815,
0.042142946273088455,
0.09415443986654282,
-0.015615294687449932,
0.006440304219722748,
-0.1634879857301712,
0.09945140779018402,
0.12059202045202255,
-0.017195120453834534,
0.0005136870895512402,
-0.19932909309864044,
-0.13603052496910095,
0.09658984839916229,
0.11241097003221512,
0.2757187783718109,
-0.05270787701010704,
-0.034608546644449234,
0.020407699048519135,
-0.09751228243112564,
0.014055508188903332,
0.12112000584602356,
0.060967735946178436,
-0.01907932013273239,
-0.06643686443567276,
0.009747887961566448,
-0.03585447371006012,
0.08916957676410675,
0.06407249718904495,
0.06893139332532883,
-0.0023541280534118414,
-0.0011878310469910502,
-0.04639841243624687,
-0.04329551383852959,
0.07266876846551895,
0.03754691034555435,
0.048421960324048996,
-0.08375819772481918,
-0.03686542063951492,
-0.07699760794639587,
0.03737108036875725,
-0.03065330721437931,
-0.07879719883203506,
-0.06032926216721535,
0.0801582932472229,
0.05599622800946236,
-0.03320227563381195,
0.03868081048130989,
0.030282123014330864,
0.09981674700975418,
0.14764361083507538,
-0.002462308621034026,
-0.04694698750972748,
-0.06407932192087173,
-0.029772549867630005,
-0.013909697532653809,
0.07512561231851578,
-0.04716784879565239,
0.01855667121708393,
0.07016142457723618,
0.019202858209609985,
0.10588876903057098,
0.060658276081085205,
-0.12589973211288452,
-0.019305648282170296,
0.023814011365175247,
-0.15262165665626526,
0.012996301054954529,
-0.0014630199875682592,
0.019904980435967445,
-0.017590226605534554,
0.027750836685299873,
0.150568887591362,
-0.0703052431344986,
-0.03144926577806473,
-0.04618120938539505,
0.06087737902998924,
0.03366914391517639,
0.14503486454486847,
0.04112108424305916,
0.03664688020944595,
-0.07577133923768997,
0.14014777541160583,
0.042835310101509094,
-0.038480233401060104,
0.02659241482615471,
-0.030467646196484566,
-0.10663919895887375,
0.012854734435677528,
0.059963345527648926,
0.07080434262752533,
-0.080540731549263,
-0.018187738955020905,
-0.03566160053014755,
-0.08153616636991501,
0.06820061802864075,
0.2149333953857422,
0.06591290235519409,
0.06688286364078522,
-0.05047189071774483,
-0.03304104134440422,
-0.0747665986418724,
0.04911821708083153,
0.050232093781232834,
0.07894114404916763,
-0.07149956375360489,
0.10144888609647751,
0.014834405854344368,
0.053777649998664856,
-0.027142968028783798,
-0.04713987186551094,
-0.10464935004711151,
-0.053609222173690796,
-0.1100216954946518,
0.011081825010478497,
-0.06758231669664383,
-0.042379654943943024,
0.005776283796876669,
-0.0019886982627213,
-0.002121292520314455,
0.0566982701420784,
-0.06057281419634819,
-0.010111928917467594,
-0.014324632473289967,
0.035530559718608856,
-0.06388027966022491,
-0.052820950746536255,
0.019395388662815094,
-0.09614992886781693,
0.09640038013458252,
0.04483950138092041,
0.013292503543198109,
0.004534658044576645,
0.07204566895961761,
-0.00946650467813015,
0.019533153623342514,
0.006787443999201059,
-0.03989458084106445,
-0.10684036463499069,
0.00437278812751174,
-0.026621012017130852,
-0.03521192818880081,
-0.021098585799336433,
0.08613986521959305,
-0.08245409280061722,
0.030665094032883644,
0.0022625394631177187,
-0.004354867152869701,
-0.07825921475887299,
-0.003449556417763233,
0.09528696537017822,
0.08372117578983307,
0.05240410193800926,
-0.08044151216745377,
0.016515454277396202,
-0.12442779541015625,
-0.0352497361600399,
0.013431943021714687,
-0.01583865098655224,
-0.13105453550815582,
-0.010573679581284523,
0.017598990350961685,
-0.014846730045974255,
0.1842784881591797,
-0.058102190494537354,
-0.02159661240875721,
0.021157369017601013,
-0.09683790057897568,
0.10398775339126587,
-0.020953059196472168,
0.15961310267448425,
-0.03050733171403408,
-0.03430700674653053,
-0.014919741079211235,
0.04433952271938324,
0.02666584774851799,
-0.004957438446581364,
0.18966251611709595,
0.1318025141954422,
0.04119126871228218,
0.05609229579567909,
-0.02220947854220867,
-0.007053046952933073,
-0.0634412094950676,
-0.02360895462334156,
0.044299934059381485,
0.03236420080065727,
0.025445815175771713,
0.14712068438529968,
0.06690599769353867,
-0.1587139368057251,
0.0355524979531765,
-0.028184011578559875,
-0.03975510597229004,
-0.11561733484268188,
-0.1008279100060463,
-0.03094767965376377,
-0.05903751775622368,
0.014586015604436398,
-0.13127127289772034,
0.004125639796257019,
0.18148884177207947,
0.06779953837394714,
0.02883436717092991,
0.01984134130179882,
-0.12793251872062683,
-0.04489684849977493,
0.059674836695194244,
0.01282030064612627,
0.02168736793100834,
0.0428033173084259,
-0.002066975459456444,
0.06814880669116974,
0.029633741825819016,
0.001291304943151772,
-0.005921269301325083,
0.07878311723470688,
0.025463538244366646,
0.0459221713244915,
-0.05689461529254913,
0.00047917015035636723,
-0.041793208569288254,
0.08384723216295242,
0.11931318789720535,
0.04531259089708328,
-0.048164717853069305,
-0.010465174913406372,
0.15648241341114044,
-0.025856787338852882,
0.00699243787676096,
-0.12050604820251465,
0.3236114978790283,
0.015593173913657665,
0.013513023965060711,
0.057399991899728775,
-0.07771161198616028,
-0.04434783756732941,
0.20801207423210144,
0.06680844724178314,
-0.02284039743244648,
-0.026637768372893333,
0.004754458088427782,
-0.03078170120716095,
-0.016612868756055832,
0.13927245140075684,
0.04542633891105652,
0.11994998902082443,
-0.058599554002285004,
-0.04976215213537216,
-0.03747374191880226,
0.0009614437585696578,
-0.11314771324396133,
0.1418198198080063,
-0.01636338233947754,
-0.01817687787115574,
-0.07574519515037537,
0.013119476847350597,
0.07289651036262512,
-0.3451571762561798,
0.00027396640507504344,
-0.02507561817765236,
-0.1044001579284668,
-0.015568302012979984,
-0.025128239765763283,
-0.02657446078956127,
0.05017511174082756,
-0.04025281220674515,
0.06323594599962234,
0.045018475502729416,
0.03923901915550232,
-0.016522344201803207,
-0.102604940533638,
0.16240939497947693,
0.07660702615976334,
0.09956826269626617,
0.018199697136878967,
0.08253602683544159,
0.0610944889485836,
0.03838216885924339,
-0.09318605810403824,
0.05827908590435982,
0.009495825506746769,
-0.0673472136259079,
-0.05472283437848091,
0.11243149638175964,
-0.00010505758109502494,
0.06205025315284729,
0.033736877143383026,
-0.12640278041362762,
0.027609718963503838,
0.08003253489732742,
-0.0856265053153038,
-0.09253178536891937,
0.002229636535048485,
-0.09860546886920929,
0.1585339456796646,
0.14927521347999573,
-0.008932291530072689,
0.011701108887791634,
-0.06595773249864578,
-0.009611696936190128,
0.05102234333753586,
0.012275236658751965,
-0.013795629143714905,
-0.1810251772403717,
0.05248154327273369,
-0.07898607105016708,
-0.004897747188806534,
-0.21374191343784332,
-0.10045120865106583,
-0.010937513783574104,
-0.05656656250357628,
-0.02130119688808918,
0.0527934655547142,
0.02329956740140915,
0.07636487483978271,
-0.024854879826307297,
-0.016615843400359154,
-0.03776020184159279,
0.09104970842599869,
-0.10332618653774261,
-0.07317803055047989
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 1900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the SQuAD dev set (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1900k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_1900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_1900k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_1900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
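As a complementary illustration (not part of the original card), the same checkpoint can also be loaded with a masked-language-modelling head to probe the MLM objective directly. `BertForMaskedLM` and the `fill-mask` pipeline are standard Transformers APIs; the prompt sentence below is arbitrary, and the NSP head is simply not used for this task.

```
from transformers import BertTokenizer, BertForMaskedLM, pipeline

checkpoint = "google/multiberts-seed_1-step_1900k"
tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertForMaskedLM.from_pretrained(checkpoint)  # NSP head weights are not needed here

# Fill in the [MASK] token and print the top predictions.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("The capital of France is [MASK]."))
```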
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_1900k"]}
| null |
google/multiberts-seed_1-step_1900k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_1900k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 1900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 1900k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0777774527668953,
0.07518602907657623,
-0.00203106296248734,
0.03937503322958946,
0.07135840505361557,
-0.018134959042072296,
0.06892374157905579,
0.0899115800857544,
-0.01661922223865986,
0.02114027552306652,
0.085502490401268,
0.02251932956278324,
0.01154963206499815,
0.09641263633966446,
0.017413543537259102,
-0.21144011616706848,
0.02983657270669937,
-0.02558528073132038,
-0.08711212873458862,
0.07588139176368713,
0.10400404036045074,
-0.08636671304702759,
0.04038354754447937,
0.034054990857839584,
-0.11757147312164307,
0.053277455270290375,
-0.009589052759110928,
-0.027500471100211143,
0.1373014897108078,
0.003067422192543745,
0.05873395502567291,
0.05600621551275253,
0.044704779982566833,
-0.13376347720623016,
0.004918983206152916,
0.05560261383652687,
0.050017524510622025,
0.038112491369247437,
0.015202678740024567,
0.0763053372502327,
-0.014088151976466179,
0.03226584568619728,
0.05398630350828171,
0.0142501937225461,
-0.06627675145864487,
-0.06426919996738434,
-0.09846430271863937,
0.03776930645108223,
0.027017394080758095,
0.01778113655745983,
0.011298266239464283,
0.11868555098772049,
-0.03309784084558487,
0.04558048024773598,
0.17638128995895386,
-0.30163004994392395,
-0.006756538525223732,
0.06049958989024162,
0.023160062730312347,
0.11434304714202881,
-0.004821857903152704,
-0.03207213059067726,
0.07664258033037186,
0.02189335599541664,
0.10296574234962463,
-0.04357759281992912,
0.00895988941192627,
-0.057561591267585754,
-0.15709440410137177,
-0.03824213892221451,
0.09825073182582855,
0.0009986780351027846,
-0.1370673030614853,
-0.0235837921500206,
-0.04324807971715927,
0.03482476994395256,
0.016375422477722168,
-0.0403066985309124,
0.045561619102954865,
-0.0011893382761627436,
-0.0005919616669416428,
-0.007807759568095207,
-0.10484615713357925,
-0.04321542754769325,
0.019138993695378304,
0.10434615612030029,
0.10691804438829422,
0.05867040157318115,
-0.008748101070523262,
0.10479200631380081,
-0.180453822016716,
-0.04787270352244377,
-0.0265671219676733,
-0.03519551083445549,
-0.044256556779146194,
-0.01228091586381197,
-0.10504021495580673,
-0.05187845230102539,
-0.003051718231290579,
0.13300426304340363,
-0.022901810705661774,
0.03299553692340851,
-0.023173479363322258,
0.006528882775455713,
0.06345794349908829,
0.04944981262087822,
-0.01963341236114502,
0.01685911975800991,
0.035538095980882645,
-0.01441197283565998,
-0.019847797229886055,
0.00916164368391037,
-0.0002506052842363715,
0.02950628288090229,
0.13689501583576202,
0.012979492545127869,
-0.1044953241944313,
0.07442120462656021,
-0.011511323042213917,
-0.04456673935055733,
-0.006498007569462061,
-0.0886731669306755,
-0.06431029736995697,
-0.041298214346170425,
-0.014667523093521595,
0.004035634454339743,
0.004364436957985163,
-0.006852602586150169,
-0.023956989869475365,
-0.01953878439962864,
-0.09311382472515106,
-0.05829751864075661,
-0.05279519036412239,
-0.13350732624530792,
0.008986751548945904,
-0.19194652140140533,
-0.02824670821428299,
-0.11399621516466141,
-0.20413696765899658,
-0.03764311224222183,
0.050870150327682495,
0.002248454373329878,
-0.06350087374448776,
0.06408607959747314,
0.035965338349342346,
-0.02988252229988575,
-0.0017455139895901084,
0.08938764780759811,
-0.005965348333120346,
0.03675273060798645,
-0.04465001821517944,
0.058133840560913086,
0.0013213978381827474,
0.04207521677017212,
-0.05800265446305275,
0.05547941103577614,
-0.18224193155765533,
0.03893493860960007,
-0.07225939631462097,
-0.02772718109190464,
-0.08540645986795425,
-0.029345499351620674,
-0.006430487614125013,
0.015550640411674976,
0.019793620333075523,
0.07234196364879608,
-0.16346631944179535,
-0.03219475597143173,
0.09206917881965637,
-0.15169264376163483,
-0.028195299208164215,
0.0655917152762413,
-0.057625818997621536,
0.11755737662315369,
0.06752067059278488,
0.15891821682453156,
-0.03680196776986122,
-0.07105349749326706,
0.051932137459516525,
-0.014666725881397724,
0.007662956602871418,
-0.009789456613361835,
0.066295325756073,
-0.019567333161830902,
-0.1686837524175644,
0.023579085245728493,
-0.1334303468465805,
0.0011811176082119346,
-0.07688912749290466,
0.029416419565677643,
0.0017129600746557117,
-0.07093486189842224,
-0.07802513241767883,
-0.034033216536045074,
0.07667208462953568,
-0.07008284330368042,
-0.024029454216361046,
0.04516170546412468,
0.0758230984210968,
-0.07100104540586472,
0.06722662597894669,
-0.014451091177761555,
0.013512298464775085,
-0.0725344717502594,
-0.03698517382144928,
-0.18586032092571259,
0.033587902784347534,
0.0943526029586792,
0.010830475948750973,
-0.015045750886201859,
0.12198104709386826,
-0.0078434431925416,
0.06541375070810318,
-0.04227897897362709,
0.0018633388681337237,
-0.009995238855481148,
0.0018697208724915981,
-0.09494436532258987,
-0.11490929871797562,
-0.07156416028738022,
-0.06867441534996033,
0.10391896963119507,
-0.11638565361499786,
0.02325822412967682,
-0.05616828426718712,
0.04243540018796921,
0.016686435788869858,
-0.07039767503738403,
-0.01039546076208353,
0.014372684992849827,
-0.061625994741916656,
-0.05418263375759125,
0.03521832451224327,
0.06381231546401978,
-0.02073800377547741,
0.09895829111337662,
-0.05131031945347786,
-0.09455475956201553,
0.0309908390045166,
0.0718279629945755,
-0.10862527042627335,
0.017528926953673363,
-0.046578019857406616,
-0.04753259941935539,
-0.06656305491924286,
-0.030284389853477478,
0.10113096237182617,
-0.012380335479974747,
0.14494474232196808,
-0.07956106215715408,
-0.016725579276680946,
0.01217947993427515,
-0.013212726451456547,
-0.026634151116013527,
0.03956875577569008,
0.06756407022476196,
-0.07985261082649231,
0.02389068715274334,
0.02774590440094471,
-0.001953373895958066,
0.07400192320346832,
-0.04998896270990372,
-0.07539926469326019,
0.01795908994972706,
0.027828747406601906,
0.020265065133571625,
0.06691111624240875,
-0.057514891028404236,
-0.01579914800822735,
0.029634788632392883,
0.01750289648771286,
0.012497855350375175,
-0.11711743474006653,
0.06249270215630531,
0.06061423197388649,
0.012813792563974857,
0.05862172693014145,
-0.02028071880340576,
-0.03285970911383629,
0.08195885270833969,
0.031094243749976158,
-0.014469925314188004,
-0.012398187071084976,
-0.011881076730787754,
-0.12569460272789001,
0.2161565124988556,
-0.06919381767511368,
-0.15426936745643616,
-0.06860419362783432,
-0.11858537048101425,
0.0010868831304833293,
0.02263672649860382,
0.04261445626616478,
-0.020003467798233032,
-0.040462713688611984,
-0.12690691649913788,
0.08845335245132446,
-0.03757615014910698,
0.06595318764448166,
0.11386460065841675,
-0.06400606781244278,
0.042525626718997955,
-0.13193581998348236,
-0.011579814366996288,
-0.07990582287311554,
-0.05560171231627464,
0.057619668543338776,
-0.05053059756755829,
0.03759313374757767,
0.11809323728084564,
0.01748018153011799,
-0.0274179857224226,
-0.0330423004925251,
0.20279665291309357,
0.03876432776451111,
0.04486508667469025,
0.12449869513511658,
-0.07409729063510895,
0.05215093120932579,
0.07597380876541138,
0.006230248603969812,
-0.04212487116456032,
0.052196700125932693,
0.05244485288858414,
-0.06378116458654404,
-0.1920098066329956,
-0.01120501197874546,
0.005509022623300552,
-0.05359896644949913,
0.07080953568220139,
0.035843197256326675,
0.023050053045153618,
0.07417373359203339,
0.018470842391252518,
0.07145722210407257,
0.009049138985574245,
0.10161782056093216,
0.026524387300014496,
-0.036445967853069305,
0.08480777591466904,
-0.008016321808099747,
-0.00619465159252286,
0.07786481082439423,
-0.02680860087275505,
0.29312559962272644,
-0.04210132360458374,
0.012879526242613792,
0.12969143688678741,
0.023308098316192627,
0.05337812006473541,
0.12283297628164291,
-0.07810734957456589,
0.029058700427412987,
-0.07609342038631439,
-0.049779005348682404,
0.011441039852797985,
0.04886721447110176,
-0.07323180139064789,
0.01631747931241989,
-0.08166296035051346,
0.01662425883114338,
-0.02425481006503105,
0.30837950110435486,
0.10053430497646332,
-0.10963103175163269,
-0.059243444353342056,
0.0007439388427883387,
-0.1029530018568039,
-0.06972290575504303,
0.04856102913618088,
0.05086855590343475,
-0.1335231065750122,
0.010725442320108414,
-0.021169083192944527,
0.07362201809883118,
-0.021610530093312263,
0.012738006189465523,
0.03843837231397629,
0.05524783581495285,
-0.04570407047867775,
0.0024620741605758667,
-0.17203383147716522,
0.20251202583312988,
-0.00031906264484860003,
0.02132675237953663,
-0.05800097808241844,
0.028801346197724342,
0.013001340441405773,
-0.015311084687709808,
0.061326999217271805,
0.021346591413021088,
-0.005433971993625164,
-0.053894754499197006,
-0.039213523268699646,
0.009838237427175045,
0.06627733260393143,
-0.04339881241321564,
0.1070885956287384,
0.008049674332141876,
0.05174023285508156,
0.03046676702797413,
0.08647318184375763,
-0.18214522302150726,
-0.08196517080068588,
0.027996059507131577,
-0.051861852407455444,
-0.1057085394859314,
-0.08338278532028198,
-0.09682266414165497,
-0.0034387295600026846,
0.2156788557767868,
-0.10973145812749863,
-0.074313685297966,
-0.09388802945613861,
0.04308438301086426,
0.09783640503883362,
-0.05312119051814079,
0.028394624590873718,
-0.012123228050768375,
0.11672373116016388,
-0.07321009784936905,
-0.12038913369178772,
0.023735374212265015,
-0.0994156152009964,
-0.16065092384815216,
-0.06866903603076935,
0.09284554421901703,
0.06922750920057297,
0.02938610501587391,
-0.03118988685309887,
0.015353976748883724,
0.04045435041189194,
-0.037748150527477264,
-0.0007949390565045178,
0.06387174874544144,
0.09395842999219894,
0.04194675013422966,
-0.10946627706289291,
0.013920674100518227,
-0.07051761448383331,
-0.07015810161828995,
0.07469335198402405,
0.27211689949035645,
-0.05180777981877327,
0.11726664751768112,
0.13102266192436218,
-0.08826339989900589,
-0.16186323761940002,
0.04438577964901924,
0.09257637709379196,
-0.016646733507514,
0.0038397652097046375,
-0.16229303181171417,
0.10193265974521637,
0.11930280923843384,
-0.013923718594014645,
-0.0016133100725710392,
-0.20016081631183624,
-0.13746686279773712,
0.08944622427225113,
0.11642339825630188,
0.27538856863975525,
-0.053930990397930145,
-0.03808464854955673,
0.02178504690527916,
-0.09203173220157623,
0.010071426630020142,
0.11883211135864258,
0.05892582982778549,
-0.020062141120433807,
-0.07094249129295349,
0.010174649767577648,
-0.037206023931503296,
0.09137342125177383,
0.05983443185687065,
0.06919272243976593,
-0.00402449956163764,
-0.011600012890994549,
-0.02306266501545906,
-0.04350723698735237,
0.06711792200803757,
0.03750309720635414,
0.04844829440116882,
-0.08076690137386322,
-0.036053117364645004,
-0.07723013311624527,
0.03768717125058174,
-0.032291002571582794,
-0.07707948237657547,
-0.06377571821212769,
0.07812898606061935,
0.058191556483507156,
-0.03621668741106987,
0.034864190965890884,
0.029379475861787796,
0.10054370760917664,
0.1564783751964569,
-0.00036385373095981777,
-0.05317943915724754,
-0.059274233877658844,
-0.028271155431866646,
-0.01429340522736311,
0.07095813751220703,
-0.04353354871273041,
0.01750429905951023,
0.06866858899593353,
0.02349608764052391,
0.10946743935346603,
0.05994625389575958,
-0.12296387553215027,
-0.021388661116361618,
0.027769222855567932,
-0.15408045053482056,
0.015616576187312603,
-0.0015286485431715846,
0.0221448615193367,
-0.02335224859416485,
0.02016635797917843,
0.14649346470832825,
-0.0678030326962471,
-0.032835736870765686,
-0.047991540282964706,
0.06054134666919708,
0.03452131897211075,
0.1417178362607956,
0.03839614614844322,
0.03720792755484581,
-0.0781734511256218,
0.13971228897571564,
0.045683737844228745,
-0.0377521887421608,
0.02578977309167385,
-0.02476109378039837,
-0.10615719854831696,
0.01536579430103302,
0.05643518269062042,
0.07239292562007904,
-0.07741042226552963,
-0.014588049612939358,
-0.03653049096465111,
-0.08331657946109772,
0.0676439106464386,
0.22653143107891083,
0.0609576478600502,
0.06627186387777328,
-0.0509195439517498,
-0.03188636526465416,
-0.07749568670988083,
0.05042760819196701,
0.0466277152299881,
0.07772084325551987,
-0.0698351263999939,
0.10723548382520676,
0.016408652067184448,
0.0503062978386879,
-0.027459926903247833,
-0.05035781487822533,
-0.10376147925853729,
-0.055750660598278046,
-0.11016202718019485,
0.01273605041205883,
-0.07320927828550339,
-0.04287680983543396,
0.006923046428710222,
-0.002689456334337592,
-0.007490999065339565,
0.05339888855814934,
-0.06103409826755524,
-0.008602536283433437,
-0.012879980728030205,
0.035383958369493484,
-0.061476703733205795,
-0.05322844162583351,
0.02024216763675213,
-0.09858153760433197,
0.09592973440885544,
0.047836508601903915,
0.01486118696630001,
0.0038883404340595007,
0.08484874665737152,
-0.00932686123996973,
0.01887262426316738,
0.009820410050451756,
-0.039956849068403244,
-0.10394567251205444,
0.004205562639981508,
-0.025731530040502548,
-0.03360433131456375,
-0.016572115942835808,
0.0878830999135971,
-0.07974155247211456,
0.030670491978526115,
0.0011459041852504015,
-0.006258045323193073,
-0.07996311783790588,
-0.005748557858169079,
0.1031312420964241,
0.08000402897596359,
0.05103620886802673,
-0.07995101064443588,
0.014659694395959377,
-0.12520727515220642,
-0.036022793501615524,
0.00981185119599104,
-0.014651111327111721,
-0.12682081758975983,
-0.012686410918831825,
0.01671666093170643,
-0.011771846562623978,
0.20220045745372772,
-0.054422229528427124,
-0.03428490459918976,
0.02337791956961155,
-0.09651344269514084,
0.10692045837640762,
-0.020185714587569237,
0.16384448111057281,
-0.026842324063181877,
-0.03685224801301956,
-0.015109557658433914,
0.04660302400588989,
0.028704270720481873,
-0.009837266057729721,
0.1953541338443756,
0.1344095766544342,
0.045671842992305756,
0.055825088173151016,
-0.02307894267141819,
-0.009828168898820877,
-0.05112449824810028,
-0.023707395419478416,
0.04413854330778122,
0.03361371159553528,
0.023829922080039978,
0.15265437960624695,
0.06588754802942276,
-0.16728092730045319,
0.0390593446791172,
-0.024400198832154274,
-0.041637130081653595,
-0.11476300656795502,
-0.0911247581243515,
-0.029628459364175797,
-0.05631408840417862,
0.012317020446062088,
-0.13372063636779785,
0.0031447634100914,
0.16543126106262207,
0.06583654880523682,
0.03213302418589592,
0.013475609011948109,
-0.13626621663570404,
-0.045922327786684036,
0.059994906187057495,
0.017049940302968025,
0.020334603264927864,
0.037527795881032944,
-0.004751140717417002,
0.06316012144088745,
0.02853573113679886,
0.0013875272125005722,
-0.005412842612713575,
0.08077993988990784,
0.02134121209383011,
0.04785393923521042,
-0.05676900967955589,
0.003054550150409341,
-0.04499543458223343,
0.08534013479948044,
0.10201841592788696,
0.04389563947916031,
-0.04867284744977951,
-0.010810150764882565,
0.15892624855041504,
-0.026663590222597122,
0.0012621134519577026,
-0.11903877556324005,
0.3173025846481323,
0.01838388666510582,
0.011012914590537548,
0.05801187828183174,
-0.08025906980037689,
-0.04520247504115105,
0.2108917236328125,
0.07533688098192215,
-0.01988406479358673,
-0.026186751201748848,
0.003239016979932785,
-0.03142481669783592,
-0.015360908582806587,
0.13990908861160278,
0.04220716282725334,
0.1280662715435028,
-0.054051414132118225,
-0.04431796073913574,
-0.03687809780240059,
0.0026110836770385504,
-0.1091628149151802,
0.15209917724132538,
-0.01580015942454338,
-0.025225382298231125,
-0.07600490748882294,
0.011312678456306458,
0.07400450855493546,
-0.3517896831035614,
0.01225823350250721,
-0.03240346536040306,
-0.10587310791015625,
-0.012466412037611008,
-0.03188975527882576,
-0.02765255980193615,
0.048440247774124146,
-0.034738440066576004,
0.061607129871845245,
0.04362986236810684,
0.040972284972667694,
-0.024824973195791245,
-0.10839252173900604,
0.16965274512767792,
0.06342663615942001,
0.10701550543308258,
0.013645130209624767,
0.08889227360486984,
0.06059039384126663,
0.038200899958610535,
-0.09071014821529388,
0.05237947776913643,
0.010478489100933075,
-0.0707494243979454,
-0.05592045933008194,
0.11508991569280624,
-0.0007096217013895512,
0.0628712996840477,
0.03189164027571678,
-0.1209164708852768,
0.026455989107489586,
0.07871738076210022,
-0.09133470803499222,
-0.09302765876054764,
-0.005817526485770941,
-0.09885302931070328,
0.1580309271812439,
0.14603638648986816,
-0.011272993870079517,
0.016682837158441544,
-0.06100091338157654,
-0.011031090281903744,
0.05564131960272789,
0.014256666414439678,
-0.012650083750486374,
-0.18960578739643097,
0.05254879966378212,
-0.08976134657859802,
-0.0014842868549749255,
-0.21262027323246002,
-0.10194901376962662,
-0.011041016317903996,
-0.054792873561382294,
-0.021263515576720238,
0.05721159279346466,
0.0278724767267704,
0.0756833404302597,
-0.02566315419971943,
-0.01627299003303051,
-0.04016989842057228,
0.09672120213508606,
-0.10269740968942642,
-0.0718667134642601
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 2000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 2000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the SQuAD dev set (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_2000k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_2000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_2000k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_2000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
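Because intermediate checkpoints are released for many training steps, a common use is to compare representations across training. The sketch below is an illustrative addition rather than part of the official card: it loads this final checkpoint together with the step_1900k checkpoint of the same seed and compares the [CLS] embedding of one sentence; the sentence and the cosine-similarity measure are arbitrary choices.

```
import torch
from transformers import BertTokenizer, BertModel

checkpoints = [
    "google/multiberts-seed_1-step_1900k",
    "google/multiberts-seed_1-step_2000k",
]
text = "Replace me by any text you'd like."

cls_vectors = []
for name in checkpoints:
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    model.eval()
    with torch.no_grad():
        encoded_input = tokenizer(text, return_tensors='pt')
        output = model(**encoded_input)
    cls_vectors.append(output.last_hidden_state[0, 0])  # [CLS] representation

similarity = torch.nn.functional.cosine_similarity(cls_vectors[0], cls_vectors[1], dim=0)
print(f"[CLS] cosine similarity between the two checkpoints: {similarity.item():.4f}")
```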
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_2000k"]}
| null |
google/multiberts-seed_1-step_2000k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_2000k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 2000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 2000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 2000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 2000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 2000k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0759272351861,
0.08745238184928894,
-0.0020730062387883663,
0.04075891524553299,
0.0742810070514679,
-0.0200545322149992,
0.06948085129261017,
0.09175462275743484,
-0.021587302908301353,
0.021909475326538086,
0.08756805956363678,
0.02242370881140232,
0.00988684967160225,
0.09715329110622406,
0.01893753930926323,
-0.21259444952011108,
0.03296452760696411,
-0.027555719017982483,
-0.08710163086652756,
0.0744369775056839,
0.10480477660894394,
-0.08591245114803314,
0.040342725813388824,
0.03341679275035858,
-0.11571703106164932,
0.053628645837306976,
-0.01226744707673788,
-0.02339569851756096,
0.13619902729988098,
0.0034921965561807156,
0.058008864521980286,
0.057321418076753616,
0.044151801615953445,
-0.1363574117422104,
0.0048699751496315,
0.055064231157302856,
0.05174851790070534,
0.038412466645240784,
0.018471969291567802,
0.07890259474515915,
-0.02582467906177044,
0.030095519497990608,
0.05291886255145073,
0.012476913630962372,
-0.06274105608463287,
-0.06395559012889862,
-0.09863076359033585,
0.03400682285428047,
0.027244379743933678,
0.020335569977760315,
0.010798039846122265,
0.11943958699703217,
-0.033290132880210876,
0.04518585279583931,
0.17666442692279816,
-0.30596938729286194,
-0.005811034701764584,
0.061052896082401276,
0.020753230899572372,
0.110340416431427,
-0.003989735152572393,
-0.03360852599143982,
0.0779443234205246,
0.020297428593039513,
0.0982619896531105,
-0.04110618308186531,
0.007922427728772163,
-0.05682888254523277,
-0.15671183168888092,
-0.038119520992040634,
0.10233648121356964,
0.001318938797339797,
-0.13627417385578156,
-0.022438308224081993,
-0.04238896071910858,
0.04058009386062622,
0.018620431423187256,
-0.03902563452720642,
0.0451117604970932,
0.00015770387835800648,
-0.007114555221050978,
-0.005626351106911898,
-0.10361723601818085,
-0.04346977546811104,
0.020173165947198868,
0.10222150385379791,
0.10825669020414352,
0.05537252500653267,
-0.006018763408064842,
0.10625605285167694,
-0.18654653429985046,
-0.04708011820912361,
-0.03064020536839962,
-0.03231886029243469,
-0.04683561250567436,
-0.010039705783128738,
-0.10757075995206833,
-0.05307045578956604,
0.002503275405615568,
0.1382845938205719,
-0.013682801276445389,
0.03182070702314377,
-0.0244943518191576,
0.0050285495817661285,
0.06133212894201279,
0.056924011558294296,
-0.02233876660466194,
0.025488507002592087,
0.03673898056149483,
-0.017836354672908783,
-0.019618617370724678,
0.009470139630138874,
-0.003932002931833267,
0.03235476464033127,
0.13534975051879883,
0.014034107327461243,
-0.1043020710349083,
0.07847967743873596,
-0.010680528357625008,
-0.04511456564068794,
-0.008690578863024712,
-0.08990990370512009,
-0.06476881355047226,
-0.04058290272951126,
-0.015352698974311352,
0.001037835725583136,
0.0037224371917545795,
-0.004692477639764547,
-0.024266434833407402,
-0.017569588497281075,
-0.08963806182146072,
-0.05907141789793968,
-0.05455819517374039,
-0.13432179391384125,
0.009471968747675419,
-0.1895298957824707,
-0.026624245569109917,
-0.1131577268242836,
-0.20329199731349945,
-0.03794567659497261,
0.04999952018260956,
0.004227075260132551,
-0.06643424183130264,
0.059470824897289276,
0.032045863568782806,
-0.0304942075163126,
-0.0025105068925768137,
0.09124373644590378,
-0.00417373888194561,
0.03608876094222069,
-0.04231494665145874,
0.052618615329265594,
-0.0009191599092446268,
0.04374149069190025,
-0.05887151136994362,
0.05532284826040268,
-0.16359169781208038,
0.03949424624443054,
-0.07397734373807907,
-0.03223126009106636,
-0.08527298271656036,
-0.028939247131347656,
-0.00568340253084898,
0.014022868126630783,
0.02350493334233761,
0.06849434971809387,
-0.16927532851696014,
-0.03033513016998768,
0.08726873993873596,
-0.14946423470973969,
-0.0283904280513525,
0.06431865692138672,
-0.05900075286626816,
0.1157904714345932,
0.06507419049739838,
0.15606090426445007,
-0.03526042774319649,
-0.06481044739484787,
0.048741020262241364,
-0.015260118059813976,
0.010984217748045921,
-0.0076505932956933975,
0.06974774599075317,
-0.018605785444378853,
-0.16185426712036133,
0.025553226470947266,
-0.14076004922389984,
0.0003845318278763443,
-0.076014444231987,
0.02990161068737507,
-0.0022076822351664305,
-0.06993623822927475,
-0.08339769393205643,
-0.035061709582805634,
0.07863232493400574,
-0.06652770191431046,
-0.025826020166277885,
0.04531652107834816,
0.07613316178321838,
-0.07224224507808685,
0.06741037219762802,
-0.013674312271177769,
0.015835687518119812,
-0.07835816591978073,
-0.03720332682132721,
-0.18748198449611664,
0.028841756284236908,
0.09709203243255615,
0.013645395636558533,
-0.016771072521805763,
0.12234807014465332,
-0.00932051707059145,
0.06180381774902344,
-0.041261862963438034,
-0.0003388483019080013,
-0.010397731326520443,
0.0016692673088982701,
-0.09174870699644089,
-0.11015531420707703,
-0.070790134370327,
-0.07078418135643005,
0.10004567354917526,
-0.1112075075507164,
0.02097044512629509,
-0.05481605604290962,
0.04381311312317848,
0.014675257727503777,
-0.07078874111175537,
-0.008593318052589893,
0.013942709192633629,
-0.06507385522127151,
-0.05764847993850708,
0.03475896641612053,
0.06421259790658951,
-0.022831862792372704,
0.09412307292222977,
-0.045878712087869644,
-0.08924548327922821,
0.02925068326294422,
0.08252260088920593,
-0.10458347946405411,
0.024283846840262413,
-0.04832268878817558,
-0.045975759625434875,
-0.06504038721323013,
-0.02473200112581253,
0.1078806146979332,
-0.012410151772201061,
0.14509987831115723,
-0.08033057302236557,
-0.01599591039121151,
0.008823543787002563,
-0.01226708572357893,
-0.027170835062861443,
0.04329180344939232,
0.0684361681342125,
-0.07756312936544418,
0.025146638974547386,
0.025990059599280357,
-0.003985000774264336,
0.06702843308448792,
-0.05206269398331642,
-0.07557640969753265,
0.019133158028125763,
0.033713724464178085,
0.02218402735888958,
0.0632113441824913,
-0.048181600868701935,
-0.01383926346898079,
0.03112964704632759,
0.018743909895420074,
0.013224906288087368,
-0.11848343163728714,
0.062073953449726105,
0.06006016954779625,
0.0107091860845685,
0.05715435370802879,
-0.018113447353243828,
-0.03326806798577309,
0.08047144114971161,
0.03382444009184837,
-0.014148027636110783,
-0.008421235717833042,
-0.012709015049040318,
-0.12235311418771744,
0.21699893474578857,
-0.07055457681417465,
-0.15681985020637512,
-0.0672396570444107,
-0.11755192279815674,
-0.0029086691793054342,
0.021677516400814056,
0.043060462921857834,
-0.02223183400928974,
-0.04318193718791008,
-0.124470055103302,
0.08972848206758499,
-0.04086671769618988,
0.06568656116724014,
0.11392711848020554,
-0.0639585554599762,
0.04432394728064537,
-0.1316540390253067,
-0.00973705854266882,
-0.07847853749990463,
-0.05974803864955902,
0.05442209541797638,
-0.049266230314970016,
0.03829476237297058,
0.11275769770145416,
0.01712292991578579,
-0.025189707055687904,
-0.03093191608786583,
0.20216265320777893,
0.04051986336708069,
0.040506187826395035,
0.12301158159971237,
-0.0780511349439621,
0.05116666853427887,
0.08064953982830048,
0.004977473057806492,
-0.04168885573744774,
0.05209770053625107,
0.05460437759757042,
-0.06153426691889763,
-0.1910744309425354,
-0.0101914182305336,
0.005917019676417112,
-0.05725926160812378,
0.06805593520402908,
0.040462426841259,
0.015201304107904434,
0.07389568537473679,
0.01760699227452278,
0.06419016420841217,
0.006284232251346111,
0.10251414775848389,
0.024859247729182243,
-0.03697656840085983,
0.08533384650945663,
-0.00785672664642334,
-0.008038689382374287,
0.07712294906377792,
-0.022872615605592728,
0.29299652576446533,
-0.04222927242517471,
0.014175048097968102,
0.12962831556797028,
0.028414960950613022,
0.05213715881109238,
0.12228640168905258,
-0.07990352809429169,
0.029571665450930595,
-0.07645442336797714,
-0.04862912744283676,
0.016055384650826454,
0.04879395663738251,
-0.07430053502321243,
0.018755242228507996,
-0.08122151345014572,
0.016294151544570923,
-0.02584053762257099,
0.306012898683548,
0.09989207237958908,
-0.11042527109384537,
-0.05906432867050171,
0.0015461721923202276,
-0.0998871698975563,
-0.07251109927892685,
0.048194654285907745,
0.0482945442199707,
-0.1331421136856079,
0.00620287237688899,
-0.02277059480547905,
0.07531102746725082,
-0.025513047352433205,
0.014043215662240982,
0.040434930473566055,
0.05646932125091553,
-0.04338476061820984,
0.0077314674854278564,
-0.1801067739725113,
0.20138303935527802,
-0.0018601682968437672,
0.020149098709225655,
-0.057214319705963135,
0.032089512795209885,
0.009389598853886127,
-0.01638631522655487,
0.06236441433429718,
0.01938210427761078,
-0.010232796892523766,
-0.05394931882619858,
-0.03858054056763649,
0.01447561476379633,
0.06838325411081314,
-0.04395989701151848,
0.10696238279342651,
0.006740421522408724,
0.05244031175971031,
0.033339157700538635,
0.09068369120359421,
-0.18334846198558807,
-0.08632339537143707,
0.03249024972319603,
-0.04670064523816109,
-0.10162904113531113,
-0.08387361466884613,
-0.09810616821050644,
-0.00741045456379652,
0.22966685891151428,
-0.11567451804876328,
-0.07404924184083939,
-0.09517119079828262,
0.04719676077365875,
0.09987693279981613,
-0.054080069065093994,
0.02758726477622986,
-0.01592279225587845,
0.11767414212226868,
-0.07106386125087738,
-0.12309412658214569,
0.02282978594303131,
-0.09790702909231186,
-0.159514382481575,
-0.06757859885692596,
0.08948710560798645,
0.06234491616487503,
0.02775598131120205,
-0.0327078141272068,
0.016485655680298805,
0.03800316900014877,
-0.035755693912506104,
-0.006073417607694864,
0.06736506521701813,
0.08865661919116974,
0.036905527114868164,
-0.11389166116714478,
0.021692678332328796,
-0.07057565450668335,
-0.0697874203324318,
0.07761259377002716,
0.26747626066207886,
-0.052280377596616745,
0.11613994836807251,
0.12123607099056244,
-0.08871679753065109,
-0.1603020876646042,
0.04014241695404053,
0.09484773874282837,
-0.014025977812707424,
0.0004110985028091818,
-0.16483889520168304,
0.10014410316944122,
0.11500222980976105,
-0.015734998509287834,
-0.006019276101142168,
-0.19705148041248322,
-0.1350274682044983,
0.08938311040401459,
0.1133233979344368,
0.2716505229473114,
-0.05361614003777504,
-0.03844626247882843,
0.021448789164423943,
-0.08314277976751328,
0.010282129980623722,
0.12786783277988434,
0.061555568128824234,
-0.019301751628518105,
-0.06995844841003418,
0.010081765241920948,
-0.037210553884506226,
0.0902261734008789,
0.06091325357556343,
0.06839779019355774,
-0.002956454176455736,
-0.01107953954488039,
-0.03127143904566765,
-0.043057989329099655,
0.06881583482027054,
0.03747529536485672,
0.05293339118361473,
-0.08063258230686188,
-0.03592592105269432,
-0.07470809668302536,
0.03544086217880249,
-0.030418196693062782,
-0.0761360451579094,
-0.06241398677229881,
0.07824985682964325,
0.06015065312385559,
-0.0336880199611187,
0.039247579872608185,
0.03228580579161644,
0.09507922828197479,
0.14859192073345184,
-0.0040460554882884026,
-0.0460851825773716,
-0.05802464857697487,
-0.03042467124760151,
-0.013144446536898613,
0.07059820741415024,
-0.04397706687450409,
0.018261943012475967,
0.07035443931818008,
0.0233188234269619,
0.10787169635295868,
0.060627538710832596,
-0.12313621491193771,
-0.022649118676781654,
0.024989919736981392,
-0.15466712415218353,
0.015312708914279938,
-0.0010769773507490754,
0.024719787761569023,
-0.02267168089747429,
0.021235644817352295,
0.14905579388141632,
-0.06760316342115402,
-0.032285016030073166,
-0.046446267515420914,
0.05995871126651764,
0.03684041276574135,
0.141062393784523,
0.036852553486824036,
0.0359184667468071,
-0.07798551023006439,
0.1421799212694168,
0.04757065325975418,
-0.03382359445095062,
0.02771061100065708,
-0.032644059509038925,
-0.10677081346511841,
0.016241924837231636,
0.0625700056552887,
0.07384940981864929,
-0.0722804144024849,
-0.017373748123645782,
-0.03972330689430237,
-0.0809260830283165,
0.06764715909957886,
0.2208174616098404,
0.062359318137168884,
0.06851077824831009,
-0.051975104957818985,
-0.03556430712342262,
-0.07680828124284744,
0.052291661500930786,
0.049274176359176636,
0.07904180884361267,
-0.07272104173898697,
0.09825467318296432,
0.014900527894496918,
0.04854040965437889,
-0.026513349264860153,
-0.04987514764070511,
-0.1037726178765297,
-0.05713464692234993,
-0.10313325375318527,
0.013861356303095818,
-0.06402279436588287,
-0.04325971007347107,
0.008641693741083145,
-0.004698456265032291,
-0.006436511874198914,
0.05667706951498985,
-0.06140654534101486,
-0.011530809104442596,
-0.015857787802815437,
0.03601347655057907,
-0.0618034191429615,
-0.0546732172369957,
0.01855894923210144,
-0.09685800969600677,
0.0968191921710968,
0.04732617363333702,
0.013049210421741009,
0.003151709446683526,
0.0697847232222557,
-0.012039626948535442,
0.02103552408516407,
0.011385257355868816,
-0.039944857358932495,
-0.10473950207233429,
0.00505148246884346,
-0.022793924435973167,
-0.034423697739839554,
-0.019115423783659935,
0.09115157276391983,
-0.08018861711025238,
0.02461943030357361,
-0.0007096432964317501,
-0.002715981099754572,
-0.07699388265609741,
-0.0061570038087666035,
0.09841720014810562,
0.0846896842122078,
0.05088493227958679,
-0.0801834836602211,
0.01662641577422619,
-0.12211890518665314,
-0.03416219353675842,
0.010224813595414162,
-0.01636883057653904,
-0.13634103536605835,
-0.01383478008210659,
0.01649615727365017,
-0.012556541711091995,
0.20168370008468628,
-0.05692991614341736,
-0.028476299718022346,
0.02076888270676136,
-0.0911521315574646,
0.11280083656311035,
-0.019324328750371933,
0.16554774343967438,
-0.024941295385360718,
-0.03496359661221504,
-0.014445332810282707,
0.04593687132000923,
0.0247108843177557,
-0.01307580154389143,
0.18578116595745087,
0.136874720454216,
0.04553510993719101,
0.05529007315635681,
-0.02132144756615162,
-0.009928479790687561,
-0.044137876480817795,
-0.016984865069389343,
0.04313438758254051,
0.03397522121667862,
0.022405901923775673,
0.15502622723579407,
0.05989597365260124,
-0.1635962873697281,
0.03654175251722336,
-0.022785967215895653,
-0.03870615363121033,
-0.11397527158260345,
-0.09856763482093811,
-0.034500911831855774,
-0.056582655757665634,
0.01458948478102684,
-0.13431961834430695,
0.002374984323978424,
0.16847366094589233,
0.06567158550024033,
0.03235074505209923,
0.011828988790512085,
-0.1334492713212967,
-0.04539017751812935,
0.056057315319776535,
0.0142073268070817,
0.021161388605833054,
0.03805636242032051,
-0.00486218323931098,
0.06723909080028534,
0.031155992299318314,
0.0028403059113770723,
-0.0036411602050065994,
0.08731817454099655,
0.02201787382364273,
0.046653106808662415,
-0.055003345012664795,
0.0020294308196753263,
-0.044135160744190216,
0.08404776453971863,
0.10759276151657104,
0.042784351855516434,
-0.050171803683042526,
-0.011781527660787106,
0.15596222877502441,
-0.02714402787387371,
0.0050126020796597,
-0.11975596845149994,
0.31917327642440796,
0.014394333586096764,
0.011519328691065311,
0.0584518164396286,
-0.08023583889007568,
-0.048385318368673325,
0.21183717250823975,
0.07037309557199478,
-0.020294852554798126,
-0.027699608355760574,
0.005800178274512291,
-0.030896039679646492,
-0.017111478373408318,
0.1357334554195404,
0.04353870078921318,
0.11952369660139084,
-0.052255935966968536,
-0.0517914816737175,
-0.03574320673942566,
0.0038244512397795916,
-0.10872488468885422,
0.15017955005168915,
-0.016814641654491425,
-0.02554580569267273,
-0.07857304066419601,
0.009208520874381065,
0.0739147812128067,
-0.3485173285007477,
0.005025314167141914,
-0.03199099376797676,
-0.10822659730911255,
-0.012436171993613243,
-0.027399394661188126,
-0.02802063338458538,
0.04690069332718849,
-0.03788065165281296,
0.0637839213013649,
0.04317993298172951,
0.041419871151447296,
-0.022734008729457855,
-0.1017991304397583,
0.16575229167938232,
0.059441324323415756,
0.1043374091386795,
0.016521349549293518,
0.0896979495882988,
0.06177283078432083,
0.0370013527572155,
-0.09247824549674988,
0.05310657247900963,
0.010633053258061409,
-0.07237021625041962,
-0.0546845979988575,
0.11796776205301285,
-0.0011120917042717338,
0.06599392741918564,
0.0316966250538826,
-0.11724431812763214,
0.028503043577075005,
0.07937776297330856,
-0.09096669405698776,
-0.09544683247804642,
-0.0025019687600433826,
-0.10060502588748932,
0.15835197269916534,
0.14791472256183624,
-0.010490893386304379,
0.01671588607132435,
-0.06325292587280273,
-0.00648621516302228,
0.0505768246948719,
0.014249525032937527,
-0.011086305603384972,
-0.18562154471874237,
0.05419798567891121,
-0.09257945418357849,
-0.0038337574806064367,
-0.21574972569942474,
-0.10196901112794876,
-0.01339139323681593,
-0.05575963109731674,
-0.01987600512802601,
0.05688522383570671,
0.027152471244335175,
0.07364269345998764,
-0.026431186124682426,
-0.020627377554774284,
-0.038526710122823715,
0.09050973504781723,
-0.10056271404027939,
-0.07116875797510147
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_200k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_200k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
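Since all checkpoints of a given seed share the same tokenizer and architecture, intermediate checkpoints can be compared directly. Below is a minimal sketch (PyTorch) that contrasts the [CLS] representation of this checkpoint with an earlier one from the same seed (here, `google/multiberts-seed_1-step_20k`); the choice of steps and the cosine-similarity probe are illustrative, not part of the original release.

```
import torch
from transformers import BertTokenizer, BertModel

# Checkpoints from the same seed share a tokenizer.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_200k')

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')

cls_vectors = {}
for step in ['20k', '200k']:  # two intermediate checkpoints of seed 1
    model = BertModel.from_pretrained(f"google/multiberts-seed_1-step_{step}")
    model.eval()
    with torch.no_grad():
        output = model(**encoded_input)
    # [CLS] token representation from the final hidden layer
    cls_vectors[step] = output.last_hidden_state[0, 0]

similarity = torch.nn.functional.cosine_similarity(
    cls_vectors['20k'], cls_vectors['200k'], dim=0)
print(f"[CLS] cosine similarity, step 20k vs. 200k: {similarity.item():.3f}")
```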
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_200k"]}
| null |
google/multiberts-seed_1-step_200k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_200k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 200k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08007696270942688,
0.0829300582408905,
-0.0021354760974645615,
0.04128473624587059,
0.07696840167045593,
-0.02022855170071125,
0.06842458248138428,
0.09249455481767654,
-0.009600129909813404,
0.024524206295609474,
0.08335940539836884,
0.025319833308458328,
0.011811154894530773,
0.10354422777891159,
0.022548794746398926,
-0.2133457064628601,
0.03224146366119385,
-0.026996515691280365,
-0.09219449013471603,
0.07459406554698944,
0.10280916094779968,
-0.08344502002000809,
0.040825456380844116,
0.030513660982251167,
-0.11730959266424179,
0.05260508880019188,
-0.009285308420658112,
-0.02216200903058052,
0.1375117003917694,
0.002002197317779064,
0.053173672407865524,
0.055361900478601456,
0.046621523797512054,
-0.1360863596200943,
0.003998742904514074,
0.05555228516459465,
0.051802754402160645,
0.03727095201611519,
0.021541021764278412,
0.08200757950544357,
-0.02110671065747738,
0.03404684364795685,
0.049695856869220734,
0.016483090817928314,
-0.06291031837463379,
-0.0651950091123581,
-0.0967945083975792,
0.031119590625166893,
0.030286084860563278,
0.021410375833511353,
0.006978991907089949,
0.11526734381914139,
-0.031555455178022385,
0.041791584342718124,
0.17452219128608704,
-0.31364625692367554,
-0.004242136608809233,
0.060597192496061325,
0.021872567012906075,
0.11330360174179077,
-0.0014127043541520834,
-0.0350588858127594,
0.08040173351764679,
0.025268247351050377,
0.09489811956882477,
-0.039395999163389206,
0.016963554546236992,
-0.06102415546774864,
-0.1571832299232483,
-0.038776468485593796,
0.10192938148975372,
0.0032197709660977125,
-0.1402469128370285,
-0.020103733986616135,
-0.04530014842748642,
0.04303724318742752,
0.020405355840921402,
-0.04229004681110382,
0.04407495632767677,
0.005457618739455938,
-0.00466169323772192,
-0.003221021732315421,
-0.10419106483459473,
-0.04430413618683815,
0.02178373374044895,
0.09665868431329727,
0.10919298231601715,
0.056965842843055725,
-0.0068491012789309025,
0.10842082649469376,
-0.18151280283927917,
-0.04652436450123787,
-0.02839595638215542,
-0.03502790257334709,
-0.04333548620343208,
-0.010011083446443081,
-0.10180478543043137,
-0.053834401071071625,
-0.0010155865456908941,
0.1304873377084732,
-0.010309196077287197,
0.030737364664673805,
-0.0251810010522604,
0.006635941099375486,
0.05887632071971893,
0.05292157083749771,
-0.022441713139414787,
0.02889033406972885,
0.03359117731451988,
-0.015189995057880878,
-0.0201132670044899,
0.009505771100521088,
-0.0042677512392401695,
0.028694264590740204,
0.13693293929100037,
0.014090477488934994,
-0.10539048165082932,
0.07302060723304749,
-0.01671666093170643,
-0.04575449973344803,
-0.013800863176584244,
-0.08717100322246552,
-0.06300877779722214,
-0.042781226336956024,
-0.014954334124922752,
0.0038104166742414236,
0.004502848256379366,
-0.010365101508796215,
-0.027238426730036736,
-0.018174685537815094,
-0.09087025374174118,
-0.056280843913555145,
-0.0535677969455719,
-0.13330495357513428,
0.009396152570843697,
-0.19097565114498138,
-0.029267190024256706,
-0.11680291593074799,
-0.19723166525363922,
-0.037500206381082535,
0.047695763409137726,
0.0019142271485179663,
-0.0652550458908081,
0.05848922207951546,
0.03299136459827423,
-0.030614973977208138,
-0.0031759010162204504,
0.08143793046474457,
-0.006458203308284283,
0.03807589039206505,
-0.04362164065241814,
0.05883364379405975,
0.008499638177454472,
0.04268578067421913,
-0.06057904288172722,
0.057044945657253265,
-0.17001992464065552,
0.039486274123191833,
-0.07294537872076035,
-0.03114093653857708,
-0.0858960673213005,
-0.030334319919347763,
-0.006753095891326666,
0.014363023452460766,
0.023612650111317635,
0.07504536211490631,
-0.18117870390415192,
-0.030495578423142433,
0.0982760563492775,
-0.15105441212654114,
-0.026590274646878242,
0.07122206687927246,
-0.05437687784433365,
0.11804422736167908,
0.06700669229030609,
0.15758229792118073,
-0.03362470492720604,
-0.06772596389055252,
0.045086558908224106,
-0.013274283148348331,
0.010259400121867657,
-0.010671618394553661,
0.06686022132635117,
-0.02061857283115387,
-0.1653953343629837,
0.023253392428159714,
-0.13094185292720795,
0.001456290832720697,
-0.07764700055122375,
0.03000589646399021,
-0.004733702167868614,
-0.06798983365297318,
-0.07837730646133423,
-0.034520942717790604,
0.07465508580207825,
-0.07049781829118729,
-0.026135694235563278,
0.03920455649495125,
0.07697303593158722,
-0.0714823454618454,
0.06626550108194351,
-0.016849562525749207,
0.017739340662956238,
-0.07984832674264908,
-0.033756498247385025,
-0.18502648174762726,
0.03479355201125145,
0.0968342199921608,
0.014490286819636822,
-0.01947908289730549,
0.13070379197597504,
-0.008877464570105076,
0.06593956798315048,
-0.04032880440354347,
-0.004643672611564398,
-0.011548045091331005,
-0.00021630349510814995,
-0.09589613229036331,
-0.10696537792682648,
-0.07292737811803818,
-0.06865677982568741,
0.09461089223623276,
-0.11612183600664139,
0.021197684109210968,
-0.05566708743572235,
0.04392200708389282,
0.018443480134010315,
-0.07036362588405609,
-0.009326398372650146,
0.014319635927677155,
-0.06313463300466537,
-0.059604402631521225,
0.033983733505010605,
0.06322299689054489,
-0.02382502891123295,
0.09558921307325363,
-0.04819804057478905,
-0.08291148394346237,
0.029228297993540764,
0.08170013129711151,
-0.10403825342655182,
0.03133097663521767,
-0.04824088513851166,
-0.0458686500787735,
-0.07067849487066269,
-0.026636505499482155,
0.09720587730407715,
-0.010573038831353188,
0.1416124850511551,
-0.07629000395536423,
-0.011267932131886482,
0.009643697179853916,
-0.015562118031084538,
-0.027305353432893753,
0.04552915319800377,
0.0681830495595932,
-0.07570163160562515,
0.02328803576529026,
0.03298187628388405,
0.00008870929741533473,
0.06643639504909515,
-0.050225578248500824,
-0.07661884278059006,
0.01846451684832573,
0.03440665081143379,
0.023834658786654472,
0.062007542699575424,
-0.04242812842130661,
-0.011664441786706448,
0.028750969097018242,
0.022308917716145515,
0.009858878329396248,
-0.11932452768087387,
0.05958349630236626,
0.06145283579826355,
0.0079805264249444,
0.059086840599775314,
-0.017558550462126732,
-0.03455066308379173,
0.08234294503927231,
0.031691450625658035,
-0.015099734999239445,
-0.00853430200368166,
-0.011331815272569656,
-0.1234523355960846,
0.2169715017080307,
-0.07012137770652771,
-0.15661215782165527,
-0.06895187497138977,
-0.10607123374938965,
0.00273882900364697,
0.02561030350625515,
0.04269256815314293,
-0.029943052679300308,
-0.04180104658007622,
-0.12490051984786987,
0.09298838675022125,
-0.03830760717391968,
0.06505310535430908,
0.10888805985450745,
-0.06232781335711479,
0.04817475005984306,
-0.13052202761173248,
-0.012509427033364773,
-0.07665767520666122,
-0.0643266886472702,
0.056657835841178894,
-0.051460880786180496,
0.03819873556494713,
0.11400705575942993,
0.017993833869695663,
-0.028093697503209114,
-0.028428461402654648,
0.20421259105205536,
0.04359528049826622,
0.03998582065105438,
0.12970682978630066,
-0.07185368239879608,
0.05286554992198944,
0.0812469944357872,
0.004504894372075796,
-0.04652051255106926,
0.05477862060070038,
0.05421970412135124,
-0.060571230947971344,
-0.19424737989902496,
-0.010143643245100975,
0.008367008529603481,
-0.046179428696632385,
0.07091794908046722,
0.04002002254128456,
0.009969145059585571,
0.07547806948423386,
0.016814598813652992,
0.0676884576678276,
0.0019862374756485224,
0.10026523470878601,
0.022512182593345642,
-0.03683783859014511,
0.08615759015083313,
-0.007634368259459734,
-0.007899471558630466,
0.07798852026462555,
-0.019220959395170212,
0.2982245683670044,
-0.041515789926052094,
0.014330482110381126,
0.1270090490579605,
0.03288917988538742,
0.05160297453403473,
0.1217227652668953,
-0.07208980619907379,
0.02889084815979004,
-0.07663899660110474,
-0.047667186707258224,
0.011224615387618542,
0.04920363426208496,
-0.07107927650213242,
0.020829536020755768,
-0.08690781146287918,
0.02725868485867977,
-0.024836400523781776,
0.3028492033481598,
0.10481344908475876,
-0.10950814187526703,
-0.06105968356132507,
0.004997067153453827,
-0.10164108127355576,
-0.07413351535797119,
0.051513832062482834,
0.05914171040058136,
-0.13357818126678467,
0.0007802553009241819,
-0.023052288219332695,
0.07572487741708755,
-0.029051780700683594,
0.014676476828753948,
0.036883264780044556,
0.053495317697525024,
-0.04376601055264473,
0.007089183200150728,
-0.18140363693237305,
0.19875822961330414,
-0.002384369960054755,
0.024113954976201057,
-0.054074496030807495,
0.029650315642356873,
0.010780930519104004,
-0.015515504404902458,
0.0651111826300621,
0.019686955958604813,
-0.019099680706858635,
-0.06086050719022751,
-0.04070606827735901,
0.014605669304728508,
0.0646742507815361,
-0.04630221426486969,
0.10386492311954498,
0.005956341978162527,
0.052867889404296875,
0.029629399999976158,
0.08864746242761612,
-0.1855911761522293,
-0.07926882058382034,
0.028676336631178856,
-0.04958545044064522,
-0.09525299072265625,
-0.08405566960573196,
-0.09842441976070404,
0.00632496876642108,
0.22492735087871552,
-0.11328864097595215,
-0.0746520534157753,
-0.09206711500883102,
0.04443385824561119,
0.09787413477897644,
-0.05570811405777931,
0.023906778544187546,
-0.008730954490602016,
0.1179610937833786,
-0.07048877328634262,
-0.12349142879247665,
0.024905741214752197,
-0.09937619417905807,
-0.15797288715839386,
-0.06868245452642441,
0.09063151478767395,
0.06693482398986816,
0.03049335815012455,
-0.03204737976193428,
0.011988453567028046,
0.04090525954961777,
-0.03601072356104851,
0.0009271946037188172,
0.06504391878843307,
0.09071692824363708,
0.03548937290906906,
-0.10927370190620422,
0.022757913917303085,
-0.07396043837070465,
-0.07015324383974075,
0.07335282117128372,
0.2665354907512665,
-0.052372515201568604,
0.11048246175050735,
0.12126857787370682,
-0.08849441260099411,
-0.15972961485385895,
0.04323140159249306,
0.09350653737783432,
-0.013778808526694775,
0.0014628898352384567,
-0.16458247601985931,
0.10264861583709717,
0.11704850941896439,
-0.015675770118832588,
0.00718716811388731,
-0.20665423572063446,
-0.1367749273777008,
0.09111080318689346,
0.11337672919034958,
0.2806345522403717,
-0.051911141723394394,
-0.03734689578413963,
0.018696198239922523,
-0.09030351042747498,
0.009245427325367928,
0.12853088974952698,
0.060931429266929626,
-0.019948331639170647,
-0.06599541008472443,
0.011213618330657482,
-0.03472261130809784,
0.09139810502529144,
0.06408852338790894,
0.06829197704792023,
-0.003735866630449891,
-0.0043405406177043915,
-0.04356173053383827,
-0.04171208664774895,
0.07066217064857483,
0.03516668081283569,
0.04863462969660759,
-0.08440646529197693,
-0.03557201847434044,
-0.07356800138950348,
0.034259870648384094,
-0.02990107610821724,
-0.07755757123231888,
-0.059476807713508606,
0.07800178974866867,
0.05809852108359337,
-0.03315974771976471,
0.027938824146986008,
0.03282161056995392,
0.1010567843914032,
0.14284752309322357,
-0.002773536369204521,
-0.050790175795555115,
-0.06644877791404724,
-0.028197836130857468,
-0.012896613217890263,
0.07566853612661362,
-0.04229620099067688,
0.01647183485329151,
0.0715351551771164,
0.018968474119901657,
0.10682608187198639,
0.061336416751146317,
-0.12240969389677048,
-0.02002786286175251,
0.025874800980091095,
-0.15397021174430847,
0.004037018399685621,
-0.0009534455020911992,
0.02798953466117382,
-0.02215142734348774,
0.023195484653115273,
0.14942654967308044,
-0.07111009955406189,
-0.03348921984434128,
-0.047361042350530624,
0.06159425899386406,
0.03373969718813896,
0.14477871358394623,
0.039367180317640305,
0.03769907355308533,
-0.07523885369300842,
0.14219093322753906,
0.043934788554906845,
-0.041160814464092255,
0.029209356755018234,
-0.03302404284477234,
-0.10668215900659561,
0.015640078112483025,
0.06157206371426582,
0.0703718364238739,
-0.07100304216146469,
-0.01684938557446003,
-0.036552924662828445,
-0.08256904035806656,
0.06746619939804077,
0.2137737274169922,
0.06248332932591438,
0.06933658570051193,
-0.05465221032500267,
-0.03618054836988449,
-0.07628870010375977,
0.04885159432888031,
0.055831652134656906,
0.07824266701936722,
-0.0752849206328392,
0.10271763056516647,
0.012683807872235775,
0.052747201174497604,
-0.02581646665930748,
-0.04859326407313347,
-0.10441844910383224,
-0.05619063600897789,
-0.11015032231807709,
0.013770115561783314,
-0.06470435112714767,
-0.044027816504240036,
0.00685288617387414,
-0.0027873425278812647,
-0.0025724100414663553,
0.05565369874238968,
-0.062109000980854034,
-0.011680880561470985,
-0.014844081364572048,
0.03544670343399048,
-0.06171552836894989,
-0.05319105088710785,
0.020195048302412033,
-0.09847871214151382,
0.09637011587619781,
0.048079729080200195,
0.012703354470431805,
0.0020228337962180376,
0.07931520789861679,
-0.012042788788676262,
0.02311437949538231,
0.008475708775222301,
-0.03902613744139671,
-0.10352130234241486,
0.006388576701283455,
-0.025230728089809418,
-0.03518775478005409,
-0.02028217539191246,
0.08813197910785675,
-0.08046171069145203,
0.027570299804210663,
-0.0015679528005421162,
-0.003699354361742735,
-0.08033262938261032,
-0.00445101223886013,
0.10087618976831436,
0.08421898633241653,
0.05372343212366104,
-0.08469826728105545,
0.016414927318692207,
-0.12516698241233826,
-0.035260770469903946,
0.011878475546836853,
-0.014917928725481033,
-0.12973392009735107,
-0.011437671259045601,
0.017623741179704666,
-0.013767028227448463,
0.1843080073595047,
-0.05800670385360718,
-0.02453579753637314,
0.02048034593462944,
-0.0961771011352539,
0.1044437438249588,
-0.01955985836684704,
0.16010810434818268,
-0.027160966768860817,
-0.035849522799253464,
-0.016370905563235283,
0.04481039196252823,
0.026979483664035797,
-0.009308605454862118,
0.18815919756889343,
0.12993422150611877,
0.04229540750384331,
0.05810508131980896,
-0.02566705085337162,
-0.007797125726938248,
-0.063652403652668,
-0.0225361417979002,
0.041536781936883926,
0.034893978387117386,
0.024325275793671608,
0.1560819298028946,
0.06191213056445122,
-0.16212718188762665,
0.034889042377471924,
-0.027883505448698997,
-0.04120335355401039,
-0.11621911823749542,
-0.1014980599284172,
-0.03188686817884445,
-0.06119190901517868,
0.014264892786741257,
-0.13231398165225983,
0.0034833713434636593,
0.17334912717342377,
0.0649220421910286,
0.028135167434811592,
0.014387011528015137,
-0.1299775391817093,
-0.043637920171022415,
0.058889757841825485,
0.013685712590813637,
0.01993740350008011,
0.04207450523972511,
-0.002250805962830782,
0.06465550512075424,
0.0320710651576519,
0.0032179683912545443,
-0.003263194812461734,
0.08106452226638794,
0.02388666942715645,
0.044847361743450165,
-0.05462278798222542,
0.0008808550192043185,
-0.041475262492895126,
0.08107084035873413,
0.11213621497154236,
0.0461096465587616,
-0.05224693939089775,
-0.01035461388528347,
0.1578136384487152,
-0.02957601100206375,
0.006148326210677624,
-0.11936231702566147,
0.32495248317718506,
0.015279758721590042,
0.012569458223879337,
0.057814761996269226,
-0.08009788393974304,
-0.04826696962118149,
0.2020806521177292,
0.06658493727445602,
-0.0203887727111578,
-0.028994282707571983,
0.003272631671279669,
-0.03170573338866234,
-0.01906953938305378,
0.14143748581409454,
0.044456612318754196,
0.1251434087753296,
-0.05781732127070427,
-0.04474876448512077,
-0.03646654635667801,
0.0016130133299157023,
-0.11296737939119339,
0.1480531245470047,
-0.020655974745750427,
-0.01860511302947998,
-0.07759351283311844,
0.012330894358456135,
0.07302916795015335,
-0.34075281023979187,
0.005551521200686693,
-0.026160828769207,
-0.10473078489303589,
-0.011730113998055458,
-0.029568519443273544,
-0.028558630496263504,
0.05017363280057907,
-0.04142244532704353,
0.06355085223913193,
0.04755181819200516,
0.03934488818049431,
-0.02105819061398506,
-0.10312068462371826,
0.16247227787971497,
0.0640978068113327,
0.10431717336177826,
0.017615964636206627,
0.08415725082159042,
0.060105133801698685,
0.03946221247315407,
-0.09315959364175797,
0.05390836298465729,
0.009493440389633179,
-0.0660034716129303,
-0.05439553037285805,
0.11571904271841049,
-0.0013898415490984917,
0.06299574673175812,
0.033926066011190414,
-0.12191765010356903,
0.02321036532521248,
0.07323084771633148,
-0.084736168384552,
-0.09626823663711548,
0.00022232180344872177,
-0.09989845752716064,
0.1574678122997284,
0.14737020432949066,
-0.012819881550967693,
0.013821753673255444,
-0.06316711008548737,
-0.008214845322072506,
0.05145597457885742,
0.01232146192342043,
-0.010023018345236778,
-0.18124902248382568,
0.05206117033958435,
-0.08696430176496506,
-0.0035278289578855038,
-0.21410849690437317,
-0.1039154902100563,
-0.011673243716359138,
-0.05639779195189476,
-0.023477714508771896,
0.057950302958488464,
0.0250975601375103,
0.07488569617271423,
-0.027177046984434128,
-0.02028096839785576,
-0.03888215869665146,
0.09332537651062012,
-0.10032610595226288,
-0.07315491884946823
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 20k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 20k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_20k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_20k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_20k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_20k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
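Because the checkpoint was trained with the MLM objective, the masked-token head can also be queried. This is a minimal sketch that assumes the uploaded pre-training weights include the MLM head (loading them with `BertForMaskedLM`); at 20k steps the predictions may still be noisy relative to the fully trained model, which is exactly the kind of behaviour the intermediate checkpoints are meant to expose.

```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_20k')
# Assumes the MLM head weights are present in this pre-training checkpoint.
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_20k')
model.eval()

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**encoded_input).logits

# Locate the [MASK] position and list the top-5 predicted tokens.
mask_index = (encoded_input['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_index].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```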
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_20k"]}
| null |
google/multiberts-seed_1-step_20k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_20k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 20k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 20k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 20k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 20k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 20k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07922178506851196,
0.08330949395895004,
-0.0021246299147605896,
0.03687569499015808,
0.0748293399810791,
-0.022699549794197083,
0.07528535276651382,
0.0928327813744545,
-0.014600249007344246,
0.029379360377788544,
0.08572583645582199,
0.027192041277885437,
0.009914930909872055,
0.10597904026508331,
0.019421275705099106,
-0.21437183022499084,
0.032172881066799164,
-0.027328316122293472,
-0.08361421525478363,
0.07552577555179596,
0.10465692728757858,
-0.08530385047197342,
0.040293335914611816,
0.03335893526673317,
-0.1191604882478714,
0.05408352240920067,
-0.009475883096456528,
-0.02401091158390045,
0.13699233531951904,
0.0014648850774392486,
0.05344296991825104,
0.056110139936208725,
0.04260256886482239,
-0.13512268662452698,
0.0038877627812325954,
0.05710414797067642,
0.04726110398769379,
0.03843984380364418,
0.02417118288576603,
0.07813926786184311,
-0.013432923704385757,
0.023868102580308914,
0.05225631222128868,
0.015233035199344158,
-0.06354419887065887,
-0.07057386636734009,
-0.09719553589820862,
0.03329073637723923,
0.029403671622276306,
0.02203754335641861,
0.006604397669434547,
0.12098308652639389,
-0.03459053486585617,
0.04767961800098419,
0.1830676943063736,
-0.31375381350517273,
-0.00491914851590991,
0.060209956020116806,
0.024289855733513832,
0.11584275960922241,
-0.0013853843556717038,
-0.03471328690648079,
0.08026000112295151,
0.02217605896294117,
0.09318225085735321,
-0.04062709957361221,
0.022124486044049263,
-0.05905769765377045,
-0.1585923731327057,
-0.04019840434193611,
0.10713508725166321,
0.0025239873211830854,
-0.13683485984802246,
-0.026115452870726585,
-0.04719424247741699,
0.04378923028707504,
0.017569564282894135,
-0.04177270457148552,
0.04696148633956909,
0.0032420859206467867,
-0.0048277052119374275,
0.0003104088827967644,
-0.10268086194992065,
-0.04273419827222824,
0.023560987785458565,
0.09430938214063644,
0.10998065769672394,
0.054067179560661316,
-0.010924749076366425,
0.10463030636310577,
-0.19307628273963928,
-0.047161784023046494,
-0.02656129002571106,
-0.03065977804362774,
-0.04379165545105934,
-0.012006290256977081,
-0.10597899556159973,
-0.06024583429098129,
-0.00011616182746365666,
0.13895268738269806,
-0.014394586905837059,
0.033156704157590866,
-0.030891355127096176,
0.006750635337084532,
0.06116557493805885,
0.05352262407541275,
-0.020290352404117584,
0.024679720401763916,
0.03482815995812416,
-0.020735949277877808,
-0.015136831440031528,
0.00858670100569725,
-0.0034322801511734724,
0.030317993834614754,
0.13720397651195526,
0.012174694798886776,
-0.10493762791156769,
0.07600332051515579,
-0.01409856416285038,
-0.04524054750800133,
-0.012520743533968925,
-0.08847663551568985,
-0.06287195533514023,
-0.040146660059690475,
-0.015246743336319923,
0.004003158770501614,
0.004847007337957621,
-0.009809935465455055,
-0.0262066051363945,
-0.02160654217004776,
-0.09274783730506897,
-0.05722248554229736,
-0.052722103893756866,
-0.13466079533100128,
0.010421102866530418,
-0.19164495170116425,
-0.02768992818892002,
-0.11083619296550751,
-0.2049020528793335,
-0.03925297036767006,
0.04616904631257057,
0.0038719233125448227,
-0.06625208258628845,
0.05682039260864258,
0.03330238536000252,
-0.03025641478598118,
-0.0022524434607475996,
0.08415894955396652,
-0.006077951285988092,
0.03696572780609131,
-0.04045320674777031,
0.05692581459879875,
0.0075178733095526695,
0.043255116790533066,
-0.05856464058160782,
0.058599989861249924,
-0.17424851655960083,
0.039733730256557465,
-0.07497033476829529,
-0.030547009781003,
-0.08478429913520813,
-0.03040684014558792,
0.0010355741251260042,
0.014892255887389183,
0.022481389343738556,
0.07026681303977966,
-0.17413154244422913,
-0.03514813259243965,
0.10295166075229645,
-0.15173673629760742,
-0.023930389434099197,
0.06857167929410934,
-0.05594814568758011,
0.11661050468683243,
0.06670031696557999,
0.1635609269142151,
-0.03644147142767906,
-0.06460176408290863,
0.04579100385308266,
-0.014514133334159851,
0.010527216829359531,
-0.010119314305484295,
0.06882062554359436,
-0.01815272495150566,
-0.15742316842079163,
0.023940574377775192,
-0.1333654820919037,
-0.0008093834621831775,
-0.07691816985607147,
0.03137665241956711,
-0.005789450369775295,
-0.0642872154712677,
-0.0849471315741539,
-0.03235196694731712,
0.07661966234445572,
-0.07337629795074463,
-0.025860445573925972,
0.04225178435444832,
0.07788179814815521,
-0.07256145775318146,
0.06461550295352936,
-0.01741337776184082,
0.01846550777554512,
-0.07751887291669846,
-0.034518737345933914,
-0.18630363047122955,
0.034370262175798416,
0.09681975096464157,
0.015809349715709686,
-0.017292222008109093,
0.13306540250778198,
-0.007232376839965582,
0.06609753519296646,
-0.03669676557183266,
-0.0032423457596451044,
-0.009077765047550201,
-0.0009447414195165038,
-0.09591630846261978,
-0.11382681131362915,
-0.06938057392835617,
-0.07106963545084,
0.1009179875254631,
-0.12145275622606277,
0.02228846400976181,
-0.05236385762691498,
0.04674210771918297,
0.016129840165376663,
-0.07141541689634323,
-0.008085350506007671,
0.01147005520761013,
-0.06412282586097717,
-0.05986844003200531,
0.03398802876472473,
0.0629640519618988,
-0.023378094658255577,
0.09392011910676956,
-0.0554090216755867,
-0.08501451462507248,
0.029604483395814896,
0.07756678760051727,
-0.10368143022060394,
0.025399547070264816,
-0.05095130577683449,
-0.04658673331141472,
-0.06418208032846451,
-0.02321399189531803,
0.10019805282354355,
-0.012445589527487755,
0.1456805318593979,
-0.076960988342762,
-0.014340661466121674,
0.010743134655058384,
-0.014728748239576817,
-0.02805478125810623,
0.049028851091861725,
0.0645250678062439,
-0.08143668621778488,
0.025898879393935204,
0.03188328444957733,
-0.0007270699716173112,
0.06800854951143265,
-0.05367177724838257,
-0.07901701331138611,
0.01848858967423439,
0.03694505989551544,
0.023213518783450127,
0.05891795456409454,
-0.051979776471853256,
-0.017252609133720398,
0.03177163004875183,
0.019304996356368065,
0.012034446932375431,
-0.11952853947877884,
0.06101473793387413,
0.061383649706840515,
0.010477414354681969,
0.05492474138736725,
-0.017311180010437965,
-0.03507637232542038,
0.08087381720542908,
0.031405314803123474,
-0.01820010133087635,
-0.007575511932373047,
-0.012180624529719353,
-0.12321474403142929,
0.21639582514762878,
-0.07021676748991013,
-0.15459279716014862,
-0.06726949661970139,
-0.11494152247905731,
-0.0012631217250600457,
0.02382640913128853,
0.044605106115341187,
-0.027638398110866547,
-0.04449841380119324,
-0.12277811765670776,
0.08977343142032623,
-0.04445279389619827,
0.06396735459566116,
0.11114035546779633,
-0.06342776864767075,
0.046433135867118835,
-0.1316286027431488,
-0.011699887923896313,
-0.07686471939086914,
-0.06259798258543015,
0.05807247385382652,
-0.05230608209967613,
0.034931499511003494,
0.11074434965848923,
0.019909000024199486,
-0.02513301372528076,
-0.02896309271454811,
0.20187050104141235,
0.040901340544223785,
0.035711709409952164,
0.13074062764644623,
-0.07100346684455872,
0.05218831077218056,
0.07628286629915237,
0.0023728120140731335,
-0.0456077978014946,
0.05120021849870682,
0.05513838306069374,
-0.059110160917043686,
-0.1964554488658905,
-0.010118894279003143,
0.008096717298030853,
-0.04749792069196701,
0.07125003635883331,
0.039594776928424835,
0.01807313784956932,
0.07326560467481613,
0.015819767490029335,
0.0730518326163292,
0.003262772224843502,
0.10323051363229752,
0.02908039093017578,
-0.04014349728822708,
0.0867968201637268,
-0.007090417668223381,
-0.006198389455676079,
0.07787492871284485,
-0.023314883932471275,
0.29432937502861023,
-0.03775330260396004,
0.026828721165657043,
0.12693719565868378,
0.028839971870183945,
0.05010499805212021,
0.12050488591194153,
-0.07082024961709976,
0.026747331023216248,
-0.07819753885269165,
-0.048804741352796555,
0.012362889014184475,
0.05211617052555084,
-0.07385613769292831,
0.01689457893371582,
-0.08344141393899918,
0.023973330855369568,
-0.02343466319143772,
0.30224645137786865,
0.09846310317516327,
-0.11459249258041382,
-0.05972594767808914,
0.0031562643125653267,
-0.10462038964033127,
-0.07353778928518295,
0.05121617391705513,
0.05296861007809639,
-0.13276253640651703,
0.006600653752684593,
-0.022818585857748985,
0.07606581598520279,
-0.02776613086462021,
0.015847306698560715,
0.03005705028772354,
0.05745460093021393,
-0.042758554220199585,
0.00944553967565298,
-0.18719424307346344,
0.19738401472568512,
-0.0007677616667933762,
0.019754113629460335,
-0.057918306440114975,
0.030614469200372696,
0.009306591935455799,
-0.013805221766233444,
0.06592931598424911,
0.018077922984957695,
-0.01962931826710701,
-0.05316334217786789,
-0.04199279844760895,
0.01421266421675682,
0.06835242360830307,
-0.04489634558558464,
0.10433889180421829,
0.006234884262084961,
0.053078122437000275,
0.03156070411205292,
0.08864448964595795,
-0.18042823672294617,
-0.07933836430311203,
0.03188228979706764,
-0.046767719089984894,
-0.10119898617267609,
-0.0853746235370636,
-0.09788577258586884,
-0.007028872147202492,
0.2352001816034317,
-0.11303459852933884,
-0.07262183725833893,
-0.09503301233053207,
0.05087150260806084,
0.0951310321688652,
-0.05596185103058815,
0.024611206725239754,
-0.010488376021385193,
0.12080667167901993,
-0.07250948250293732,
-0.12307997792959213,
0.02795935422182083,
-0.09963077306747437,
-0.15863971412181854,
-0.06807290017604828,
0.09210263192653656,
0.06398891657590866,
0.030649112537503242,
-0.03455556556582451,
0.014531479217112064,
0.03577188029885292,
-0.03413405269384384,
0.0001427903480362147,
0.0698264092206955,
0.0908287763595581,
0.03349566087126732,
-0.1029127761721611,
0.027196234092116356,
-0.07018498331308365,
-0.06887705624103546,
0.07624127715826035,
0.26874956488609314,
-0.05016900226473808,
0.11217489093542099,
0.11602029949426651,
-0.08521205186843872,
-0.15569916367530823,
0.044928766787052155,
0.09427235275506973,
-0.014252596534788609,
0.00014635304978583008,
-0.16496972739696503,
0.10197598487138748,
0.1171698272228241,
-0.015511402860283852,
0.0033267061226069927,
-0.20202884078025818,
-0.1337469071149826,
0.09327752143144608,
0.11455165594816208,
0.2783893048763275,
-0.05323341116309166,
-0.03923933207988739,
0.02094716578722,
-0.08148393034934998,
0.010457174852490425,
0.11520715802907944,
0.06037938594818115,
-0.018669679760932922,
-0.06671825051307678,
0.010671275667846203,
-0.03719962015748024,
0.0886434018611908,
0.06438877433538437,
0.07058320939540863,
-0.0031565416138619184,
-0.0020346285309642553,
-0.03166067227721214,
-0.041171275079250336,
0.07084011286497116,
0.034247830510139465,
0.04872944951057434,
-0.08374574035406113,
-0.03473891317844391,
-0.0736648440361023,
0.035247910767793655,
-0.03087610937654972,
-0.07597777992486954,
-0.059257689863443375,
0.07435930520296097,
0.05764200910925865,
-0.03204430267214775,
0.040051937103271484,
0.03311973437666893,
0.09864743053913116,
0.14706964790821075,
-0.0037849328946322203,
-0.04842467978596687,
-0.06503365933895111,
-0.032620932906866074,
-0.010891464538872242,
0.07377219945192337,
-0.04767554998397827,
0.01361897587776184,
0.06915584206581116,
0.023447934538125992,
0.10986897349357605,
0.0592324435710907,
-0.12555177509784698,
-0.021650947630405426,
0.0244908407330513,
-0.15865585207939148,
0.005105224903672934,
-0.002284558955579996,
0.02742326818406582,
-0.018309416249394417,
0.025853406637907028,
0.14997133612632751,
-0.06798563152551651,
-0.03268890455365181,
-0.04701200872659683,
0.059805724769830704,
0.03498506918549538,
0.1413356065750122,
0.037266019731760025,
0.037769582122564316,
-0.07976625114679337,
0.13854257762432098,
0.047065138816833496,
-0.04251354932785034,
0.03086545690894127,
-0.028182536363601685,
-0.10672412067651749,
0.017207171767950058,
0.06261833757162094,
0.07819528132677078,
-0.07109586894512177,
-0.020701510831713676,
-0.04156099259853363,
-0.0786045715212822,
0.0687011107802391,
0.21826587617397308,
0.0628737136721611,
0.06831949949264526,
-0.05382503941655159,
-0.03489019349217415,
-0.07743291556835175,
0.0482993945479393,
0.0478820838034153,
0.0803539976477623,
-0.07489640265703201,
0.10871942341327667,
0.013868233188986778,
0.04969870671629906,
-0.025853175669908524,
-0.04991055652499199,
-0.10411069542169571,
-0.05694206804037094,
-0.10304266959428787,
0.012542600743472576,
-0.06724250316619873,
-0.04384012520313263,
0.007749942131340504,
-0.0005034516798332334,
-0.002723430283367634,
0.05603008717298508,
-0.06184656172990799,
-0.011580269783735275,
-0.013352912850677967,
0.03778620809316635,
-0.06512376666069031,
-0.05114499107003212,
0.017682434991002083,
-0.09810047596693039,
0.09738729149103165,
0.0523049533367157,
0.014019954018294811,
0.0026442043017596006,
0.07465948909521103,
-0.012043044902384281,
0.024576442316174507,
0.006843011826276779,
-0.03583155944943428,
-0.10189046710729599,
0.0044843945652246475,
-0.024462252855300903,
-0.03791859373450279,
-0.020026296377182007,
0.08029960095882416,
-0.07976767420768738,
0.02702619880437851,
-0.000551497214473784,
-0.007820505648851395,
-0.08145160228013992,
-0.004633388016372919,
0.0943593829870224,
0.08176400512456894,
0.049006152898073196,
-0.08271408081054688,
0.016633816063404083,
-0.12444589287042618,
-0.03597153350710869,
0.011379124596714973,
-0.016878003254532814,
-0.12882563471794128,
-0.011993862688541412,
0.016912788152694702,
-0.016255006194114685,
0.1931203305721283,
-0.05651051178574562,
-0.026548491790890694,
0.01974918134510517,
-0.09308058023452759,
0.1075996682047844,
-0.02295810356736183,
0.16247951984405518,
-0.025407399982213974,
-0.033098407089710236,
-0.01430121436715126,
0.04646819829940796,
0.029871175065636635,
-0.015302042476832867,
0.18338215351104736,
0.1306239813566208,
0.03679860755801201,
0.05671883374452591,
-0.02428405173122883,
-0.01188353355973959,
-0.06484578549861908,
-0.011452721431851387,
0.043552130460739136,
0.03719451650977135,
0.024579649791121483,
0.15503595769405365,
0.06711395829916,
-0.1640845239162445,
0.035284653306007385,
-0.02378792129456997,
-0.03973620757460594,
-0.11450976878404617,
-0.10326884686946869,
-0.03418922796845436,
-0.06336066126823425,
0.01441560685634613,
-0.13393554091453552,
0.006112141069024801,
0.16579149663448334,
0.06576339155435562,
0.028527744114398956,
0.015156167559325695,
-0.12737159430980682,
-0.045162707567214966,
0.05630182847380638,
0.01365449745208025,
0.018317217007279396,
0.03531142324209213,
-0.0031350140925496817,
0.06628205627202988,
0.028389552608132362,
0.0034497526939958334,
-0.006237275432795286,
0.08866913616657257,
0.024654991924762726,
0.04432402551174164,
-0.055521175265312195,
0.0024290282744914293,
-0.041491881012916565,
0.08015462011098862,
0.10460586845874786,
0.04352588579058647,
-0.053089555352926254,
-0.012038794346153736,
0.15162703394889832,
-0.02901928313076496,
0.010405211709439754,
-0.11751145124435425,
0.33036041259765625,
0.013216567225754261,
0.010715287178754807,
0.05778973549604416,
-0.07967748492956161,
-0.04567179083824158,
0.20187926292419434,
0.06450102478265762,
-0.015603620558977127,
-0.02690715715289116,
0.004139565397053957,
-0.03216627240180969,
-0.02111414074897766,
0.13379168510437012,
0.04433834180235863,
0.12499234825372696,
-0.05548856779932976,
-0.048456598073244095,
-0.03661312535405159,
0.0009416998364031315,
-0.11266286671161652,
0.14609558880329132,
-0.014232655987143517,
-0.022324489429593086,
-0.0731281191110611,
0.01083901897072792,
0.0756295844912529,
-0.3400270640850067,
0.004585964605212212,
-0.027259744703769684,
-0.10497099906206131,
-0.01293046772480011,
-0.027646567672491074,
-0.026052996516227722,
0.04862368106842041,
-0.03931288421154022,
0.060660216957330704,
0.04170569032430649,
0.03931751474738121,
-0.02379697933793068,
-0.09963909536600113,
0.16356013715267181,
0.05807717889547348,
0.10197115689516068,
0.01573912240564823,
0.08701132982969284,
0.06217861548066139,
0.03813666105270386,
-0.08911041170358658,
0.05339079722762108,
0.01089736633002758,
-0.06849534809589386,
-0.05763635039329529,
0.11996325105428696,
-0.0003956421569455415,
0.06403379142284393,
0.03496294841170311,
-0.12371595948934555,
0.024964889511466026,
0.07666737586259842,
-0.08730586618185043,
-0.0968637764453888,
-0.0008202249882742763,
-0.099469855427742,
0.15762801468372345,
0.14867810904979706,
-0.011675885878503323,
0.014298001304268837,
-0.06621047854423523,
-0.006408302579075098,
0.052448343485593796,
0.013755676336586475,
-0.009566131979227066,
-0.18202051520347595,
0.05438634008169174,
-0.08253144472837448,
-0.001829251879826188,
-0.21311905980110168,
-0.10281528532505035,
-0.012376930564641953,
-0.056876905262470245,
-0.024594509974122047,
0.0526406392455101,
0.02658161148428917,
0.07273194193840027,
-0.027503110468387604,
-0.019017571583390236,
-0.03833832964301109,
0.09296955913305283,
-0.10060405731201172,
-0.07147789746522903
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_300k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_300k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
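Since the repository is tagged `pretraining`, the checkpoint should also carry the masked-language-modelling head. The snippet below is a minimal sketch (not part of the original release notes) that scores a `[MASK]` token with `BertForMaskedLM`; the example sentence and single-mask decoding are illustrative assumptions, and `transformers` may warn about unused NSP-head weights when loading.
```
# Minimal sketch: masked-token prediction with the MLM head.
# Assumes the checkpoint provides MLM weights; otherwise transformers will
# warn that the head is newly initialized and the prediction is meaningless.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_300k')
mlm_model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_300k')
mlm_model.eval()

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = mlm_model(**encoded_input).logits

# Index of the [MASK] token, then the highest-scoring vocabulary entry there.
mask_positions = (encoded_input['input_ids'] == tokenizer.mask_token_id)[0].nonzero(as_tuple=True)[0]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```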
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_300k"]}
| null |
google/multiberts-seed_1-step_300k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_300k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 300k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07800978422164917,
0.0684974193572998,
-0.0019045925000682473,
0.04427911713719368,
0.07829887419939041,
-0.019433697685599327,
0.06960002332925797,
0.09091714024543762,
-0.014154204167425632,
0.02063559927046299,
0.086982361972332,
0.027996769174933434,
0.010803098790347576,
0.0942235216498375,
0.021282093599438667,
-0.21135176718235016,
0.03320201858878136,
-0.028000593185424805,
-0.08567596226930618,
0.07738897949457169,
0.10494966059923172,
-0.0821838527917862,
0.04294147714972496,
0.029479147866368294,
-0.10947272181510925,
0.05655102804303169,
-0.01186593808233738,
-0.02161988988518715,
0.13523009419441223,
0.0002919191319961101,
0.05339216813445091,
0.05403731390833855,
0.05064746364951134,
-0.13792161643505096,
0.0033642170019447803,
0.05291316658258438,
0.05215246602892876,
0.035328879952430725,
0.017538221552968025,
0.0796818882226944,
-0.026336979120969772,
0.03125854954123497,
0.053668148815631866,
0.01721222698688507,
-0.06625770032405853,
-0.05472205579280853,
-0.10040386021137238,
0.02899899147450924,
0.032839953899383545,
0.024192748591303825,
0.007805605418980122,
0.12399052083492279,
-0.037691380828619,
0.04417694732546806,
0.18811917304992676,
-0.31070125102996826,
-0.002915486693382263,
0.06888248771429062,
0.024529002606868744,
0.11394200474023819,
-0.0018470162758603692,
-0.03500844165682793,
0.07907605916261673,
0.025074396282434464,
0.09946828335523605,
-0.03995874151587486,
0.02170807495713234,
-0.061267152428627014,
-0.15887081623077393,
-0.03820284083485603,
0.10187460482120514,
0.0053935362957417965,
-0.1399299055337906,
-0.029269525781273842,
-0.04429500922560692,
0.03516651690006256,
0.021525369957089424,
-0.04259306937456131,
0.04387085139751434,
0.002376627642661333,
-0.006809627171605825,
-0.003675996558740735,
-0.10215165466070175,
-0.04751807823777199,
0.021894263103604317,
0.10424839705228806,
0.10728625953197479,
0.05758159980177879,
-0.003994496539235115,
0.11112628132104874,
-0.17435845732688904,
-0.04941698908805847,
-0.02966996468603611,
-0.034390270709991455,
-0.0421697199344635,
-0.011947308667004108,
-0.10159650444984436,
-0.055726323276758194,
-0.0014604282332584262,
0.13725893199443817,
-0.010247715748846531,
0.03136301413178444,
-0.019308054819703102,
0.005534037481993437,
0.06293050944805145,
0.057626936584711075,
-0.022180628031492233,
0.01828521117568016,
0.034183669835329056,
-0.010262001305818558,
-0.015488940291106701,
0.00982582475990057,
-0.002223293064162135,
0.028201693668961525,
0.1350892037153244,
0.015970269218087196,
-0.1057194173336029,
0.0750759094953537,
-0.02049659937620163,
-0.04672061279416084,
-0.0021010085474699736,
-0.0886559709906578,
-0.06529944390058517,
-0.04498283937573433,
-0.014552892185747623,
0.008413809351623058,
0.005446011200547218,
-0.008813293650746346,
-0.028143495321273804,
-0.017729537561535835,
-0.08784626424312592,
-0.058905746787786484,
-0.055376727133989334,
-0.13347645103931427,
0.004965663887560368,
-0.18611757457256317,
-0.028696797788143158,
-0.11664218455553055,
-0.2040531039237976,
-0.03774610534310341,
0.0480584017932415,
0.0029473977629095316,
-0.06240777298808098,
0.05761117860674858,
0.035650238394737244,
-0.030332449823617935,
-0.0041234856471419334,
0.08130823075771332,
-0.006409631110727787,
0.03776165097951889,
-0.038270726799964905,
0.057324089109897614,
0.0043709720484912395,
0.04173567518591881,
-0.06134139373898506,
0.0569809153676033,
-0.16803458333015442,
0.03997977077960968,
-0.0715515986084938,
-0.029819533228874207,
-0.0843169316649437,
-0.033865656703710556,
-0.0056179906241595745,
0.015395578928291798,
0.026055587455630302,
0.07508953660726547,
-0.1667490005493164,
-0.02758955955505371,
0.08869154006242752,
-0.14949700236320496,
-0.02886177785694599,
0.071148581802845,
-0.055603016167879105,
0.11476939171552658,
0.06646004319190979,
0.1518794149160385,
-0.03803437948226929,
-0.07006717473268509,
0.047321513295173645,
-0.01412711851298809,
0.01094930712133646,
-0.01245640218257904,
0.06951864063739777,
-0.020746570080518723,
-0.15905602276325226,
0.022527256980538368,
-0.13447318971157074,
0.0005959456320852041,
-0.0780733972787857,
0.029312215745449066,
-0.002460359362885356,
-0.07130177319049835,
-0.08338817209005356,
-0.03578272834420204,
0.0745975449681282,
-0.07236101478338242,
-0.025426967069506645,
0.03820459544658661,
0.07850271463394165,
-0.07301444560289383,
0.06865625083446503,
-0.014146986417472363,
0.017461000010371208,
-0.08103978633880615,
-0.037778787314891815,
-0.1847006380558014,
0.03176946938037872,
0.09671881049871445,
0.019592702388763428,
-0.01806175522506237,
0.12384329736232758,
-0.008295781910419464,
0.06614414602518082,
-0.04254366084933281,
-0.005549867637455463,
-0.015225300565361977,
-0.00009176144521916285,
-0.0930110290646553,
-0.10343238711357117,
-0.07742693275213242,
-0.06988196820020676,
0.08611921966075897,
-0.10478641837835312,
0.02063302882015705,
-0.057651471346616745,
0.041621509939432144,
0.018640736117959023,
-0.07277187705039978,
-0.009209919720888138,
0.015998216345906258,
-0.06470371037721634,
-0.05982835963368416,
0.03377145901322365,
0.06265139579772949,
-0.0234986562281847,
0.09012710303068161,
-0.05048675462603569,
-0.08249387890100479,
0.029505901038646698,
0.07756216078996658,
-0.10706271976232529,
0.024033324792981148,
-0.04667631536722183,
-0.04441408812999725,
-0.07289262861013412,
-0.031445663422346115,
0.10035410523414612,
-0.00980325136333704,
0.14343608915805817,
-0.07767056673765182,
-0.014289302751421928,
0.005399585235863924,
-0.015836969017982483,
-0.028359610587358475,
0.0420566089451313,
0.06779995560646057,
-0.08016804605722427,
0.024978572502732277,
0.033119671046733856,
-0.0013901859056204557,
0.06895381957292557,
-0.04880921542644501,
-0.07621589303016663,
0.018981199711561203,
0.030763700604438782,
0.022941583767533302,
0.06204674020409584,
-0.03934214636683464,
-0.012813322246074677,
0.027480341494083405,
0.024980364367365837,
0.01065780594944954,
-0.11516093462705612,
0.06005723774433136,
0.06251833587884903,
0.008487646467983723,
0.06458255648612976,
-0.018564356490969658,
-0.03683538734912872,
0.08107536286115646,
0.03339821845293045,
-0.015267093665897846,
-0.009320917539298534,
-0.013960850425064564,
-0.12277678400278091,
0.21702143549919128,
-0.06523602455854416,
-0.15112903714179993,
-0.07085642963647842,
-0.11321470141410828,
-0.002828243188560009,
0.02417885698378086,
0.041001930832862854,
-0.0272501390427351,
-0.04288310930132866,
-0.12657448649406433,
0.09763547778129578,
-0.036388665437698364,
0.0638786181807518,
0.1137477457523346,
-0.06345286220312119,
0.04458146542310715,
-0.13305699825286865,
-0.015331180766224861,
-0.07873930782079697,
-0.06628091633319855,
0.05412362515926361,
-0.05444340407848358,
0.04033034294843674,
0.10812503099441528,
0.019933287054300308,
-0.02585625648498535,
-0.030630310997366905,
0.20720414817333221,
0.043696869164705276,
0.03788460046052933,
0.13287697732448578,
-0.07019773125648499,
0.050862301141023636,
0.0759401023387909,
0.006432796828448772,
-0.049720533192157745,
0.056751564145088196,
0.053335320204496384,
-0.06353940814733505,
-0.18908657133579254,
-0.01091664470732212,
0.006637618411332369,
-0.052121859043836594,
0.07058901339769363,
0.03781678527593613,
0.0062601459212601185,
0.07603634893894196,
0.01657094806432724,
0.06220747530460358,
0.0032624404411762953,
0.102674700319767,
0.020190667361021042,
-0.03451456129550934,
0.0880250558257103,
-0.008160249330103397,
-0.0109324986115098,
0.07712506502866745,
-0.01232161559164524,
0.29022958874702454,
-0.042932283133268356,
0.015962669625878334,
0.1262359619140625,
0.03044992685317993,
0.04874426871538162,
0.12598755955696106,
-0.07322808355093002,
0.027284471318125725,
-0.07718048989772797,
-0.04742901399731636,
0.008762165904045105,
0.04559936374425888,
-0.06800704449415207,
0.026028266176581383,
-0.08562713116407394,
0.024348845705389977,
-0.022047987207770348,
0.3118036389350891,
0.10338719189167023,
-0.1142263188958168,
-0.05816182494163513,
0.00483955442905426,
-0.10174903273582458,
-0.06873214244842529,
0.05339448153972626,
0.04591766744852066,
-0.1357821226119995,
0.007316153030842543,
-0.020347587764263153,
0.0760825052857399,
-0.03261219710111618,
0.014748684130609035,
0.04085002467036247,
0.052957404404878616,
-0.04482967033982277,
0.0039003388956189156,
-0.1850273460149765,
0.2008453607559204,
-0.0023683393374085426,
0.02347097545862198,
-0.055792760103940964,
0.030010610818862915,
0.010700062848627567,
-0.020851295441389084,
0.06655280292034149,
0.017836881801486015,
-0.00033335998887196183,
-0.06217727065086365,
-0.042359355837106705,
0.012737485580146313,
0.06518704444169998,
-0.04593655839562416,
0.10657001286745071,
0.004359208047389984,
0.0542299747467041,
0.029218880459666252,
0.09259537607431412,
-0.18474580347537994,
-0.08168128877878189,
0.02485606074333191,
-0.050184961408376694,
-0.10338389128446579,
-0.08259856700897217,
-0.09702671319246292,
-0.0008180367876775563,
0.22913403809070587,
-0.10946528613567352,
-0.07574009895324707,
-0.09366718679666519,
0.042678993195295334,
0.09485049545764923,
-0.05385563150048256,
0.02155696041882038,
-0.009205051697790623,
0.11266060173511505,
-0.07117431610822678,
-0.12228050827980042,
0.025910943746566772,
-0.1014024168252945,
-0.15879248082637787,
-0.06717348843812943,
0.08981479704380035,
0.06724540144205093,
0.030435064807534218,
-0.030992820858955383,
0.011904317885637283,
0.04225153103470802,
-0.03830474615097046,
-0.003938938491046429,
0.06149282678961754,
0.088131383061409,
0.046117883175611496,
-0.10736354440450668,
0.018428919836878777,
-0.07311215996742249,
-0.07089195400476456,
0.07548940926790237,
0.2710193991661072,
-0.04823923483490944,
0.1095530167222023,
0.12561863660812378,
-0.08528021723031998,
-0.15913575887680054,
0.03854813426733017,
0.09236373007297516,
-0.012126429937779903,
0.008707121014595032,
-0.15782444179058075,
0.10061466693878174,
0.11621937900781631,
-0.01644943840801716,
-0.00932216551154852,
-0.20848311483860016,
-0.1372196078300476,
0.09158621728420258,
0.11968732625246048,
0.27978184819221497,
-0.05368625372648239,
-0.03437596932053566,
0.015609495341777802,
-0.09572850912809372,
0.008655056357383728,
0.13374288380146027,
0.06372398138046265,
-0.022042792290449142,
-0.06944860517978668,
0.010278165340423584,
-0.0372966043651104,
0.0919622853398323,
0.06218850985169411,
0.07071438431739807,
-0.005675049498677254,
-0.0022359301801770926,
-0.03382319211959839,
-0.041502855718135834,
0.07434679567813873,
0.03228961303830147,
0.049377210438251495,
-0.08217818289995193,
-0.03842385113239288,
-0.07288633286952972,
0.030754486098885536,
-0.0298465546220541,
-0.08040910959243774,
-0.06254731118679047,
0.07808411121368408,
0.059860654175281525,
-0.03453577309846878,
0.032687392085790634,
0.031153112649917603,
0.09807532280683517,
0.14200399816036224,
0.0031118434853851795,
-0.048864684998989105,
-0.06736171245574951,
-0.03067653812468052,
-0.01413735281676054,
0.0739438459277153,
-0.03196629509329796,
0.014917917549610138,
0.06977291405200958,
0.01827690750360489,
0.10667872428894043,
0.062014926224946976,
-0.12107919156551361,
-0.0188092403113842,
0.026979675516486168,
-0.15018907189369202,
0.01716149039566517,
-0.00019522053480613977,
0.023895256221294403,
-0.02369951270520687,
0.022916879504919052,
0.14999814331531525,
-0.06662868708372116,
-0.033014558255672455,
-0.046744994819164276,
0.061328474432229996,
0.02906116470694542,
0.14543677866458893,
0.037765394896268845,
0.037355322390794754,
-0.07558435946702957,
0.14252161979675293,
0.041400063782930374,
-0.03075271286070347,
0.027673058211803436,
-0.027068916708230972,
-0.10492373257875443,
0.017187299206852913,
0.059975896030664444,
0.06922724097967148,
-0.08103423565626144,
-0.01295788399875164,
-0.042342908680438995,
-0.08142601698637009,
0.0667349174618721,
0.20761796832084656,
0.06815803050994873,
0.06665845960378647,
-0.055412016808986664,
-0.03497079759836197,
-0.07489222288131714,
0.044605959206819534,
0.05234059691429138,
0.07868350297212601,
-0.07497622072696686,
0.08995745331048965,
0.013587170280516148,
0.04777917265892029,
-0.02711113542318344,
-0.04805822670459747,
-0.10672909766435623,
-0.05241072550415993,
-0.10270652920007706,
0.009591788053512573,
-0.06645536422729492,
-0.041505683213472366,
0.005874423310160637,
-0.0021072516683489084,
-0.004507460165768862,
0.05704781040549278,
-0.05834547057747841,
-0.013074327260255814,
-0.013650009408593178,
0.03341430053114891,
-0.06167422607541084,
-0.053110815584659576,
0.02039569616317749,
-0.0994584783911705,
0.09612163156270981,
0.04214552417397499,
0.011129887774586678,
0.0005499898688867688,
0.07452717423439026,
-0.013139449991285801,
0.02417958341538906,
0.010121016763150692,
-0.04292057082056999,
-0.09975171089172363,
0.005681152921169996,
-0.022040579468011856,
-0.032611820846796036,
-0.017639603465795517,
0.0920567587018013,
-0.08113688975572586,
0.02838923968374729,
-0.003855128074064851,
-0.00365662295371294,
-0.07978271692991257,
-0.0037610437721014023,
0.10584381967782974,
0.0842890590429306,
0.05437624454498291,
-0.08425699919462204,
0.015846995636820793,
-0.12614764273166656,
-0.03422372415661812,
0.01390553917735815,
-0.01510322093963623,
-0.14017920196056366,
-0.011211860924959183,
0.01849067397415638,
-0.010792131535708904,
0.19457486271858215,
-0.05839189514517784,
-0.02871074341237545,
0.022897588089108467,
-0.08633764088153839,
0.11277234554290771,
-0.02204558625817299,
0.16102749109268188,
-0.029323183000087738,
-0.036501239985227585,
-0.014491099864244461,
0.045111529529094696,
0.02808445133268833,
-0.012013847939670086,
0.19107623398303986,
0.1316835731267929,
0.041039906442165375,
0.05924393609166145,
-0.02587937004864216,
-0.0059629157185554504,
-0.055507972836494446,
-0.0330367274582386,
0.04622945934534073,
0.02918364107608795,
0.026832152158021927,
0.14521580934524536,
0.06264561414718628,
-0.16268165409564972,
0.035995177924633026,
-0.024534255266189575,
-0.04027673229575157,
-0.11622608453035355,
-0.09914977848529816,
-0.030596857890486717,
-0.057925183326005936,
0.015647675842046738,
-0.134763702750206,
0.0015040148282423615,
0.17497318983078003,
0.06584685295820236,
0.030225347727537155,
0.020162569358944893,
-0.1336967796087265,
-0.043292783200740814,
0.056265052407979965,
0.011257997713983059,
0.02129555307328701,
0.04275931045413017,
-0.003909526392817497,
0.06355778127908707,
0.030444329604506493,
0.00454995920881629,
-0.004172049928456545,
0.07368303835391998,
0.022064486518502235,
0.046983085572719574,
-0.05629418045282364,
0.000042723437218228355,
-0.043929632753133774,
0.08225228637456894,
0.12385207414627075,
0.04479958489537239,
-0.04991650581359863,
-0.011118555441498756,
0.15333229303359985,
-0.02969658002257347,
0.002532693324610591,
-0.11986973136663437,
0.32370686531066895,
0.01849687099456787,
0.012990344315767288,
0.056859686970710754,
-0.07861428707838058,
-0.0494878776371479,
0.2020338922739029,
0.07380674034357071,
-0.026743657886981964,
-0.02770201861858368,
0.0031454579439014196,
-0.030428968369960785,
-0.01767517812550068,
0.14125096797943115,
0.042527444660663605,
0.12470140308141708,
-0.056850772351026535,
-0.044306911528110504,
-0.03553364798426628,
-0.0010482297511771321,
-0.10804948955774307,
0.14374391734600067,
-0.015693124383687973,
-0.018452294170856476,
-0.07947427779436111,
0.01233334094285965,
0.07485365867614746,
-0.331907719373703,
0.005505143664777279,
-0.03180358186364174,
-0.10515990853309631,
-0.012707428075373173,
-0.035504646599292755,
-0.027409475296735764,
0.0508887954056263,
-0.03743257373571396,
0.06552678346633911,
0.04032786563038826,
0.0393240861594677,
-0.023887120187282562,
-0.10344269871711731,
0.16458827257156372,
0.06851697713136673,
0.10531879216432571,
0.019261501729488373,
0.08656760305166245,
0.06150878593325615,
0.03558218106627464,
-0.09110968559980392,
0.05905381962656975,
0.008761405013501644,
-0.06407129764556885,
-0.054370734840631485,
0.1144854947924614,
-0.0013138462090864778,
0.06553135812282562,
0.02811826579272747,
-0.12499231845140457,
0.02649800293147564,
0.07982740551233292,
-0.08524202555418015,
-0.09685154259204865,
-0.0010627061128616333,
-0.0988348126411438,
0.1609766185283661,
0.14457134902477264,
-0.011137210763990879,
0.013602438382804394,
-0.06091851368546486,
-0.0047059799544513226,
0.05276469886302948,
0.009856345131993294,
-0.012044874019920826,
-0.18503481149673462,
0.05036952719092369,
-0.09471142292022705,
-0.006964763160794973,
-0.21536609530448914,
-0.10754960030317307,
-0.011879436671733856,
-0.056784600019454956,
-0.027556372806429863,
0.056513331830501556,
0.02282833680510521,
0.07588326185941696,
-0.0246095210313797,
-0.018237663432955742,
-0.039610620588064194,
0.09179817885160446,
-0.10564097762107849,
-0.07501425594091415
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_400k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_400k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
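Because MultiBERTs publishes many intermediate checkpoints, a common use is tracking how representations change over the course of pre-training. The sketch below is not part of the original card: it compares the `[CLS]` encoding of the same sentence at step 300k and step 400k of seed 1, and the cosine-similarity probe is only an illustrative choice.
```
# Minimal sketch: compare sentence encodings from two intermediate checkpoints.
# The checkpoint pair and the cosine-similarity probe are illustrative.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_400k')
encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')

cls_vectors = []
for name in ('google/multiberts-seed_1-step_300k', 'google/multiberts-seed_1-step_400k'):
    model = BertModel.from_pretrained(name)
    model.eval()
    with torch.no_grad():
        outputs = model(**encoded_input)
    cls_vectors.append(outputs.last_hidden_state[0, 0])  # [CLS] token embedding

similarity = torch.nn.functional.cosine_similarity(cls_vectors[0], cls_vectors[1], dim=0)
print(f"Cosine similarity between step 300k and 400k [CLS] vectors: {similarity.item():.3f}")
```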
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_400k"]}
| null |
google/multiberts-seed_1-step_400k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_400k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 400k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08040446788072586,
0.07124511897563934,
-0.0020071861799806356,
0.04592010751366615,
0.07676216959953308,
-0.017615972086787224,
0.06547027826309204,
0.09278835356235504,
-0.015585197135806084,
0.022383814677596092,
0.08108244091272354,
0.029652755707502365,
0.013216348364949226,
0.09719504415988922,
0.019895130768418312,
-0.21327832341194153,
0.02897421456873417,
-0.027900876477360725,
-0.09140484035015106,
0.07471714913845062,
0.1039048582315445,
-0.08186286687850952,
0.04494289308786392,
0.030823839828372,
-0.11103653162717819,
0.05375685915350914,
-0.008778157643973827,
-0.023803919553756714,
0.13925936818122864,
0.0023821075446903706,
0.053802888840436935,
0.054558899253606796,
0.049984049052000046,
-0.13374356925487518,
0.004289541393518448,
0.053474534302949905,
0.05132288858294487,
0.039024803787469864,
0.0210491344332695,
0.0830860361456871,
-0.02890818566083908,
0.03084990382194519,
0.050820741802453995,
0.016446249559521675,
-0.06548219174146652,
-0.06585895270109177,
-0.10030193626880646,
0.02848074585199356,
0.030831390991806984,
0.019741127267479897,
0.0078858882188797,
0.11519160121679306,
-0.03408191725611687,
0.044338345527648926,
0.1737767904996872,
-0.32078230381011963,
-0.0035094507038593292,
0.06347189843654633,
0.02048981748521328,
0.11875345557928085,
-0.00428587244823575,
-0.03261658549308777,
0.08195548504590988,
0.025887558236718178,
0.09446980804204941,
-0.040755946189165115,
0.023356985300779343,
-0.06017454341053963,
-0.15511082112789154,
-0.035255126655101776,
0.10677514225244522,
0.00421900674700737,
-0.1401660293340683,
-0.0228080153465271,
-0.043266672641038895,
0.03775525465607643,
0.019713398069143295,
-0.0401327908039093,
0.04415908083319664,
0.0033420738764107227,
-0.006181693635880947,
0.0006582877831533551,
-0.10227341204881668,
-0.044243328273296356,
0.022129656746983528,
0.09510277956724167,
0.10887453705072403,
0.058245278894901276,
-0.0040228720754384995,
0.11046881973743439,
-0.18150418996810913,
-0.04706183448433876,
-0.028821825981140137,
-0.03532391041517258,
-0.04222790151834488,
-0.01100268866866827,
-0.10126443952322006,
-0.047148846089839935,
-0.001550800632685423,
0.13587908446788788,
-0.008237017318606377,
0.033728353679180145,
-0.021675512194633484,
0.005454101134091616,
0.05851880833506584,
0.05596936494112015,
-0.02008630894124508,
0.01630079373717308,
0.03766120970249176,
-0.015093056485056877,
-0.018021350726485252,
0.011412655003368855,
-0.0028679717797785997,
0.025239722803235054,
0.13406485319137573,
0.015457759611308575,
-0.106617771089077,
0.0773911103606224,
-0.01585208624601364,
-0.04421865567564964,
-0.0032777332235127687,
-0.08743935823440552,
-0.06416947394609451,
-0.042533159255981445,
-0.01663597673177719,
0.005130333360284567,
0.005000891629606485,
-0.009454934857785702,
-0.02537473477423191,
-0.021887004375457764,
-0.08884265273809433,
-0.05871906131505966,
-0.053433727473020554,
-0.13533447682857513,
0.005142200738191605,
-0.18251259624958038,
-0.029050743207335472,
-0.11830181628465652,
-0.2011827826499939,
-0.03989022225141525,
0.046520013362169266,
0.0037801405414938927,
-0.0640706717967987,
0.06093747541308403,
0.035822995007038116,
-0.032059505581855774,
-0.002730810083448887,
0.08359003067016602,
-0.0069375913590192795,
0.03852353245019913,
-0.041494399309158325,
0.05950682610273361,
0.005352257285267115,
0.0421159453690052,
-0.059444867074489594,
0.05595538020133972,
-0.18061570823192596,
0.04006398469209671,
-0.072039395570755,
-0.02856055088341236,
-0.08350176364183426,
-0.03488299250602722,
-0.004735898692160845,
0.011899875476956367,
0.022751091048121452,
0.07404773682355881,
-0.17377084493637085,
-0.02791457809507847,
0.08435695618391037,
-0.15103888511657715,
-0.027115780860185623,
0.07244845479726791,
-0.054126013070344925,
0.11984492093324661,
0.06832214444875717,
0.16078726947307587,
-0.0332580991089344,
-0.06810370087623596,
0.04716287925839424,
-0.013043152168393135,
0.007814005948603153,
-0.007495996542274952,
0.06545998901128769,
-0.020812956616282463,
-0.1681148111820221,
0.025336183607578278,
-0.12587220966815948,
-0.0005588593776337802,
-0.07739481329917908,
0.02986406721174717,
-0.003388338489457965,
-0.06988729536533356,
-0.08128316700458527,
-0.03640078753232956,
0.07748553156852722,
-0.07270503789186478,
-0.024691788479685783,
0.02749350480735302,
0.07794679701328278,
-0.07046829909086227,
0.0668298676609993,
-0.015071931295096874,
0.019271153956651688,
-0.08037535846233368,
-0.03608778119087219,
-0.18294024467468262,
0.03318918123841286,
0.09686598926782608,
0.014564878307282925,
-0.019588403403759003,
0.12669771909713745,
-0.00597153278067708,
0.06786477565765381,
-0.04282244294881821,
-0.00486799655482173,
-0.007216025143861771,
-0.00021650752751156688,
-0.0990445464849472,
-0.10535658895969391,
-0.0765652284026146,
-0.06681810319423676,
0.09532728791236877,
-0.11306941509246826,
0.021893590688705444,
-0.06029577553272247,
0.04213744401931763,
0.01791447587311268,
-0.07048091292381287,
-0.01093156449496746,
0.016036683693528175,
-0.06187883019447327,
-0.058599986135959625,
0.034251388162374496,
0.06095509231090546,
-0.02298184111714363,
0.08828027546405792,
-0.048758186399936676,
-0.08050847053527832,
0.028727950528264046,
0.07618018239736557,
-0.10822322219610214,
0.02775859273970127,
-0.04633869230747223,
-0.046913642436265945,
-0.06644619256258011,
-0.02965458668768406,
0.10628295689821243,
-0.014229578897356987,
0.14308086037635803,
-0.07388798147439957,
-0.01086280308663845,
0.007585740648210049,
-0.01255322527140379,
-0.02316286601126194,
0.0447247289121151,
0.058705803006887436,
-0.0697316899895668,
0.019898464903235435,
0.03333339840173721,
0.0020107838790863752,
0.06201493367552757,
-0.05013132467865944,
-0.07405609637498856,
0.01988997496664524,
0.029990753158926964,
0.02192489430308342,
0.06336163729429245,
-0.0401984341442585,
-0.012015142478048801,
0.03011704981327057,
0.023283258080482483,
0.009679625742137432,
-0.11856143921613693,
0.060858353972435,
0.05961953476071358,
0.009228982962667942,
0.05249820277094841,
-0.019340500235557556,
-0.03322814404964447,
0.08161839842796326,
0.03320523351430893,
-0.016114963218569756,
-0.00800491962581873,
-0.012805169448256493,
-0.12154904752969742,
0.21485105156898499,
-0.06992354243993759,
-0.15044113993644714,
-0.07057951390743256,
-0.10761008411645889,
0.0025163854006677866,
0.02436392940580845,
0.043887458741664886,
-0.030191240832209587,
-0.04349761828780174,
-0.1229732483625412,
0.09868501871824265,
-0.036437418311834335,
0.06445933878421783,
0.11097768694162369,
-0.06528200954198837,
0.04692964628338814,
-0.13177154958248138,
-0.012237511575222015,
-0.07765059918165207,
-0.06546100974082947,
0.0575251467525959,
-0.053259920328855515,
0.0380558967590332,
0.11084491014480591,
0.018235377967357635,
-0.028604861348867416,
-0.03012998029589653,
0.2091308981180191,
0.04083513468503952,
0.04450085759162903,
0.12943871319293976,
-0.06986867636442184,
0.05171314626932144,
0.08082856237888336,
0.007019133772701025,
-0.04631028324365616,
0.05354484170675278,
0.05213546007871628,
-0.05951318517327309,
-0.19211722910404205,
-0.008521653711795807,
0.009131280705332756,
-0.046580106019973755,
0.06721275299787521,
0.03872667998075485,
0.007041488774120808,
0.07728590816259384,
0.01954902522265911,
0.06897647678852081,
0.006564907263964415,
0.10019765049219131,
0.027283433824777603,
-0.03731397166848183,
0.08689653873443604,
-0.008345313370227814,
-0.008922964334487915,
0.07772206515073776,
-0.016114218160510063,
0.2902333438396454,
-0.04316690191626549,
0.017652077600359917,
0.12230680137872696,
0.040041495114564896,
0.05099828913807869,
0.12505488097667694,
-0.07264232635498047,
0.02935514971613884,
-0.07549618184566498,
-0.04713308438658714,
0.011334924027323723,
0.0478026382625103,
-0.07094337791204453,
0.021774422377347946,
-0.0870191901922226,
0.021686198189854622,
-0.02447987161576748,
0.3048684895038605,
0.10328598320484161,
-0.11220027506351471,
-0.06165432557463646,
0.004485979210585356,
-0.10057708621025085,
-0.07256258279085159,
0.05517249181866646,
0.04975825920701027,
-0.13678719103336334,
0.003288842737674713,
-0.02132182940840721,
0.0762363001704216,
-0.033430829644203186,
0.015832284465432167,
0.040121108293533325,
0.054587360471487045,
-0.04636923596262932,
0.00553778326138854,
-0.1799122393131256,
0.19097547233104706,
-0.0008102058200165629,
0.024732939898967743,
-0.054280396550893784,
0.0293361097574234,
0.012853649444878101,
-0.011952396482229233,
0.062400199472904205,
0.018014416098594666,
-0.011547212488949299,
-0.06894885748624802,
-0.042412035167217255,
0.016420023515820503,
0.0658755972981453,
-0.04554668441414833,
0.10517560690641403,
0.006274395622313023,
0.05272432416677475,
0.027796059846878052,
0.08398525416851044,
-0.18710055947303772,
-0.07773468643426895,
0.027163680642843246,
-0.04905049502849579,
-0.08792655915021896,
-0.08324698358774185,
-0.09773819148540497,
-0.008611801080405712,
0.22649940848350525,
-0.10782001912593842,
-0.07690369337797165,
-0.0928516760468483,
0.035346511751413345,
0.0979057028889656,
-0.05325164273381233,
0.02248992957174778,
-0.007543965708464384,
0.11067408323287964,
-0.06754013150930405,
-0.12392126768827438,
0.02678213268518448,
-0.0977303758263588,
-0.1583261638879776,
-0.06784462183713913,
0.09114163368940353,
0.06269501894712448,
0.0307873897254467,
-0.030921345576643944,
0.01159289013594389,
0.037191685289144516,
-0.035983551293611526,
-0.0008029061136767268,
0.057275209575891495,
0.090130515396595,
0.04285937547683716,
-0.11252763122320175,
0.023207036778330803,
-0.07412169128656387,
-0.06764762848615646,
0.07031040638685226,
0.26837998628616333,
-0.050639618188142776,
0.10912870615720749,
0.1196623221039772,
-0.08611362427473068,
-0.1580239087343216,
0.04212417080998421,
0.09083304554224014,
-0.01375064067542553,
0.0035245362669229507,
-0.16222162544727325,
0.10337208956480026,
0.11748751997947693,
-0.01539821270853281,
0.012422066181898117,
-0.19945518672466278,
-0.13704000413417816,
0.08818068355321884,
0.11274988204240799,
0.2804930508136749,
-0.05340886488556862,
-0.03621166571974754,
0.020490892231464386,
-0.09774418920278549,
-0.0038614491932094097,
0.12706734240055084,
0.0628996342420578,
-0.021485555917024612,
-0.061995115131139755,
0.010868823155760765,
-0.03470830246806145,
0.09361803531646729,
0.06626109033823013,
0.07138971984386444,
-0.004072749987244606,
-0.003431558609008789,
-0.042502179741859436,
-0.04145700857043266,
0.06962572038173676,
0.038370005786418915,
0.04969732463359833,
-0.08692915737628937,
-0.03420174494385719,
-0.07374809682369232,
0.032524123787879944,
-0.030531615018844604,
-0.07735861837863922,
-0.06130547448992729,
0.07570988684892654,
0.05556921288371086,
-0.03429778665304184,
0.036079175770282745,
0.03374670818448067,
0.10159160196781158,
0.1427200883626938,
-0.004262780770659447,
-0.04689071327447891,
-0.06786354631185532,
-0.02773108333349228,
-0.015515580773353577,
0.07507995516061783,
-0.0376768596470356,
0.01761668734252453,
0.07307326048612595,
0.017310716211795807,
0.10795635730028152,
0.06231595203280449,
-0.11865954846143723,
-0.0173700712621212,
0.02603556215763092,
-0.15165244042873383,
0.014994533732533455,
0.0003702027315739542,
0.011092989705502987,
-0.01994304172694683,
0.024751190096139908,
0.15065526962280273,
-0.06764539331197739,
-0.033629484474658966,
-0.04763341322541237,
0.061589229851961136,
0.031131338328123093,
0.14650513231754303,
0.03931521996855736,
0.03633013367652893,
-0.0776866003870964,
0.1428922861814499,
0.04355667158961296,
-0.036037787795066833,
0.024384843185544014,
-0.030654754489660263,
-0.10841454565525055,
0.016043970361351967,
0.060544900596141815,
0.07539408653974533,
-0.07316762208938599,
-0.015210434794425964,
-0.04127853736281395,
-0.08509086072444916,
0.0642298012971878,
0.20664365589618683,
0.0635598674416542,
0.06627701967954636,
-0.05234633386135101,
-0.033793576061725616,
-0.07675782591104507,
0.04827319458127022,
0.052882030606269836,
0.0787850022315979,
-0.07466680556535721,
0.1017056331038475,
0.012158472090959549,
0.05192233622074127,
-0.026675783097743988,
-0.048972513526678085,
-0.1041666716337204,
-0.05661455914378166,
-0.11214271932840347,
0.012089376337826252,
-0.06434673070907593,
-0.040101710706949234,
0.0062895058654248714,
-0.0018039282876998186,
-0.003106815740466118,
0.055341120809316635,
-0.061313048005104065,
-0.012838110327720642,
-0.01233862154185772,
0.03554767742753029,
-0.05891866236925125,
-0.05450179800391197,
0.018477123230695724,
-0.09778208285570145,
0.09771795570850372,
0.0483458936214447,
0.013313109055161476,
0.0013929081615060568,
0.08461494743824005,
-0.009741426445543766,
0.02205316163599491,
0.009063800796866417,
-0.040096648037433624,
-0.10077474266290665,
0.0037135854363441467,
-0.028250349685549736,
-0.035849012434482574,
-0.018660642206668854,
0.09173653274774551,
-0.08225850760936737,
0.029758084565401077,
-0.0026012954767793417,
-0.0012332816841080785,
-0.07987308502197266,
-0.0062101418152451515,
0.09741485118865967,
0.08148100972175598,
0.05094873160123825,
-0.08557231724262238,
0.016367342323064804,
-0.12609174847602844,
-0.03628874942660332,
0.012818505056202412,
-0.016864905133843422,
-0.13069336116313934,
-0.013495977967977524,
0.019914522767066956,
-0.012368356809020042,
0.18975931406021118,
-0.05774620547890663,
-0.0285238865762949,
0.020106486976146698,
-0.09024181216955185,
0.1044856607913971,
-0.02259291335940361,
0.16199113428592682,
-0.028141522780060768,
-0.03655495122075081,
-0.014724478125572205,
0.04857947304844856,
0.02731209434568882,
-0.009988503530621529,
0.18258991837501526,
0.12879419326782227,
0.04309491813182831,
0.05721951648592949,
-0.029049556702375412,
-0.011720625683665276,
-0.0580904483795166,
-0.025965651497244835,
0.04457642510533333,
0.03390664607286453,
0.023515716195106506,
0.14549124240875244,
0.05770820751786232,
-0.1611039787530899,
0.034449502825737,
-0.02808954194188118,
-0.041254568845033646,
-0.11409313976764679,
-0.0919753760099411,
-0.03076316975057125,
-0.059892021119594574,
0.014561938121914864,
-0.13298387825489044,
0.0035321523901075125,
0.1780317723751068,
0.06445383280515671,
0.029852908104658127,
0.01946296915411949,
-0.13774922490119934,
-0.043019767850637436,
0.057569585740566254,
0.013282264582812786,
0.02106410823762417,
0.044879376888275146,
-0.005362730473279953,
0.06414314359426498,
0.024999113753437996,
0.0014269533567130566,
-0.005276021081954241,
0.07734072208404541,
0.022143922746181488,
0.04821143299341202,
-0.056350357830524445,
-0.0010125980479642749,
-0.03919658809900284,
0.08299107849597931,
0.11812803894281387,
0.044717833399772644,
-0.049409303814172745,
-0.012256737798452377,
0.1587357372045517,
-0.030011892318725586,
0.010757782496511936,
-0.1191413477063179,
0.32770147919654846,
0.013317552395164967,
0.010346172377467155,
0.05804770439863205,
-0.08093095570802689,
-0.04559900239109993,
0.20996388792991638,
0.07209007441997528,
-0.023660888895392418,
-0.03057456947863102,
0.0029442303348332644,
-0.031790781766176224,
-0.015781762078404427,
0.14087562263011932,
0.04408356174826622,
0.11716064810752869,
-0.057850245386362076,
-0.0421542227268219,
-0.03584548458456993,
0.0011998743284493685,
-0.11597254872322083,
0.14733242988586426,
-0.022058235481381416,
-0.0198738481849432,
-0.07791801542043686,
0.012075540609657764,
0.07624908536672592,
-0.34626615047454834,
0.011153122410178185,
-0.02740752138197422,
-0.1050514206290245,
-0.01239648088812828,
-0.02878274768590927,
-0.026497619226574898,
0.05041183903813362,
-0.040235601365566254,
0.06446123868227005,
0.05333665758371353,
0.03765178099274635,
-0.018970860168337822,
-0.1052483320236206,
0.1632046103477478,
0.0585363544523716,
0.1011919379234314,
0.018688274547457695,
0.08950073271989822,
0.06033908203244209,
0.0384393148124218,
-0.09209459275007248,
0.056311868131160736,
0.00841438490897417,
-0.07250633835792542,
-0.05614084377884865,
0.11600302904844284,
-0.0002081863203784451,
0.06621785461902618,
0.030299153178930283,
-0.12051244080066681,
0.024511169642210007,
0.08199214935302734,
-0.08786775916814804,
-0.0941276028752327,
-0.004662595223635435,
-0.09735868126153946,
0.15846902132034302,
0.14779697358608246,
-0.010431425645947456,
0.016398919746279716,
-0.06202729791402817,
-0.010362260043621063,
0.05572940781712532,
0.0053637162782251835,
-0.013949400745332241,
-0.18299944698810577,
0.051431480795145035,
-0.09081665426492691,
-0.005544415675103664,
-0.21368888020515442,
-0.10453591495752335,
-0.0150926997885108,
-0.05736678093671799,
-0.024584084749221802,
0.057777393609285355,
0.024943765252828598,
0.07492009550333023,
-0.023826848715543747,
-0.017352741211652756,
-0.038836751133203506,
0.09379445761442184,
-0.10277167707681656,
-0.07585204392671585
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 40k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 40k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
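Because the checkpoint was trained with the MLM objective, it can also be probed directly for masked-token predictions. The snippet below is a minimal sketch, assuming the released weights include the masked-language-modelling head (loading it this way may warn about unused NSP weights, which is expected for a pretraining checkpoint):
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_40k')
mlm_model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_40k')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = mlm_model(**inputs).logits

# Report the top prediction at the [MASK] position.
mask_pos = (inputs['input_ids'] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```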
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_40k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_40k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_40k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_40k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
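In either framework, the returned object exposes the usual BERT features. Here is a minimal follow-up sketch for the PyTorch snippet above; the field names assume a recent version of `transformers`, and the intermediate checkpoint is otherwise used exactly like fully trained BERT:
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_40k')
model = BertModel.from_pretrained('google/multiberts-seed_1-step_40k')

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# Per-token contextual embeddings and the [CLS]-based pooled summary.
print(output.last_hidden_state.shape)  # (1, sequence_length, 768)
print(output.pooler_output.shape)      # (1, 768)
```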
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_40k"]}
| null |
google/multiberts-seed_1-step_40k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_40k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 40k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 40k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 40k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 40k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 40k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08131120353937149,
0.07795988023281097,
-0.002139314543455839,
0.040555767714977264,
0.07387924194335938,
-0.018413927406072617,
0.06975360959768295,
0.09385967999696732,
-0.011178512126207352,
0.02598661556839943,
0.08140867948532104,
0.02834244631230831,
0.0121011883020401,
0.09522883594036102,
0.02005062624812126,
-0.2148253619670868,
0.029385950416326523,
-0.025929126888513565,
-0.0866885706782341,
0.07499561458826065,
0.10435522347688675,
-0.08375281095504761,
0.04269959032535553,
0.03268277272582054,
-0.11096407473087311,
0.05138520523905754,
-0.010316584259271622,
-0.021470753476023674,
0.13869412243366241,
0.0018929755315184593,
0.05359463393688202,
0.055789995938539505,
0.04671221971511841,
-0.13717003166675568,
0.0042768181301653385,
0.05521629378199577,
0.04782190918922424,
0.039402686059474945,
0.022722991183400154,
0.08304064720869064,
-0.022884363308548927,
0.02790949121117592,
0.04948587343096733,
0.016131335869431496,
-0.06452468782663345,
-0.06806513667106628,
-0.098513163626194,
0.027722349390387535,
0.02920564077794552,
0.019613033160567284,
0.006919977255165577,
0.12373025715351105,
-0.03439304977655411,
0.04634043201804161,
0.17559762299060822,
-0.3215065002441406,
-0.0008592166705057025,
0.06668306887149811,
0.02574422024190426,
0.11620932072401047,
-0.0048591201193630695,
-0.03370075300335884,
0.08164261281490326,
0.024308836087584496,
0.09235388040542603,
-0.040827348828315735,
0.03355972096323967,
-0.05902980640530586,
-0.15739911794662476,
-0.0368008017539978,
0.09875181317329407,
0.003049338236451149,
-0.1382375806570053,
-0.025659896433353424,
-0.047348201274871826,
0.040951456874608994,
0.017841625958681107,
-0.03896035626530647,
0.04619449004530907,
0.004679454490542412,
-0.001700394437648356,
0.0009331199107691646,
-0.10215840488672256,
-0.04333484545350075,
0.023566534742712975,
0.09085993468761444,
0.11087895184755325,
0.05593760311603546,
-0.006501049734652042,
0.10800961405038834,
-0.19284801185131073,
-0.04817946255207062,
-0.028791368007659912,
-0.03157024085521698,
-0.04121188074350357,
-0.011263346299529076,
-0.10466411709785461,
-0.049475159496068954,
0.0012153219431638718,
0.13760587573051453,
-0.015584176406264305,
0.0328487753868103,
-0.025014160200953484,
0.007289467379450798,
0.06127765774726868,
0.05577652156352997,
-0.019328832626342773,
0.015205697156488895,
0.035421572625637054,
-0.01727898046374321,
-0.017385145649313927,
0.010846177116036415,
-0.0043508755043148994,
0.023689566180109978,
0.13562633097171783,
0.013270385563373566,
-0.1054556667804718,
0.07824869453907013,
-0.015099503099918365,
-0.04472299665212631,
-0.002553560771048069,
-0.08639220893383026,
-0.0639323890209198,
-0.04006092622876167,
-0.01693117432296276,
0.006877322215586901,
0.00455065444111824,
-0.007812721654772758,
-0.027791602537035942,
-0.026027996093034744,
-0.08904983848333359,
-0.05748222768306732,
-0.053229089826345444,
-0.13533538579940796,
0.007514305412769318,
-0.19047671556472778,
-0.027800222858786583,
-0.11528719216585159,
-0.20504634082317352,
-0.04108811542391777,
0.043421097099781036,
0.0033670018892735243,
-0.06445232033729553,
0.059094905853271484,
0.037821799516677856,
-0.03116253949701786,
-0.00184707622975111,
0.07876833528280258,
-0.007411651778966188,
0.03758293762803078,
-0.039082106202840805,
0.05924293026328087,
0.007051315624266863,
0.042255450040102005,
-0.05961195379495621,
0.05633177235722542,
-0.17807453870773315,
0.040532100945711136,
-0.07101444154977798,
-0.028729936107993126,
-0.08590281754732132,
-0.03383921459317207,
0.001207189285196364,
0.013959393836557865,
0.019997959956526756,
0.07294902205467224,
-0.17225918173789978,
-0.030736684799194336,
0.09268129616975784,
-0.14987938106060028,
-0.025873012840747833,
0.0729900449514389,
-0.055012837052345276,
0.11540122330188751,
0.06792671233415604,
0.16224083304405212,
-0.03270396590232849,
-0.06901368498802185,
0.04785265401005745,
-0.012137901037931442,
0.01079716719686985,
-0.009203664027154446,
0.06859706342220306,
-0.021073253825306892,
-0.16457900404930115,
0.022459741681814194,
-0.1347840428352356,
-0.00046930956887081265,
-0.07690782099962234,
0.03128033131361008,
-0.00315108522772789,
-0.06786647439002991,
-0.08036316931247711,
-0.03431439772248268,
0.0782134160399437,
-0.07082574814558029,
-0.02648220956325531,
0.031168628484010696,
0.07961537688970566,
-0.07190589606761932,
0.0665435791015625,
-0.017775919288396835,
0.020680131390690804,
-0.08083508908748627,
-0.036380305886268616,
-0.185930535197258,
0.03449065983295441,
0.09728094935417175,
0.014345117844641209,
-0.02005775086581707,
0.13263282179832458,
-0.004735513590276241,
0.06813009083271027,
-0.03958626464009285,
-0.005668030120432377,
-0.0075310408137738705,
-0.0020382076036185026,
-0.09810642153024673,
-0.10955172777175903,
-0.0731934979557991,
-0.06981268525123596,
0.09283488243818283,
-0.11859307438135147,
0.022303784266114235,
-0.058159951120615005,
0.04473884403705597,
0.017010636627674103,
-0.06907394528388977,
-0.009491339325904846,
0.014046931639313698,
-0.06463130563497543,
-0.05755920335650444,
0.03608384728431702,
0.06238565221428871,
-0.024778246879577637,
0.0942632257938385,
-0.052182964980602264,
-0.08370226621627808,
0.027803640812635422,
0.07229658961296082,
-0.1042461097240448,
0.023909954354166985,
-0.04798920452594757,
-0.04714810848236084,
-0.06361575424671173,
-0.026323212310671806,
0.10486520081758499,
-0.014187241904437542,
0.1472487598657608,
-0.07480735331773758,
-0.014467034488916397,
0.008387932553887367,
-0.014451856724917889,
-0.02317359484732151,
0.046152256429195404,
0.058841604739427567,
-0.06867767870426178,
0.022737130522727966,
0.033004336059093475,
-0.000250612705713138,
0.06627199798822403,
-0.052760958671569824,
-0.0767560675740242,
0.018680905923247337,
0.03268604725599289,
0.021626241505146027,
0.06006845459342003,
-0.047247957438230515,
-0.014646981842815876,
0.03134212642908096,
0.020979788154363632,
0.010979493148624897,
-0.11818328499794006,
0.0618499256670475,
0.06051790341734886,
0.010102244094014168,
0.051271483302116394,
-0.018920468166470528,
-0.034546174108982086,
0.08103999495506287,
0.03223160281777382,
-0.017489517107605934,
-0.010146556422114372,
-0.013474734500050545,
-0.1218528002500534,
0.2166544497013092,
-0.0672074407339096,
-0.14915543794631958,
-0.06993693113327026,
-0.11501168459653854,
0.0023807822726666927,
0.024744659662246704,
0.04272588714957237,
-0.030412301421165466,
-0.04283636808395386,
-0.12103085219860077,
0.0992550179362297,
-0.03895231708884239,
0.06569249927997589,
0.11062297970056534,
-0.06435666978359222,
0.047641851007938385,
-0.13039100170135498,
-0.011484443210065365,
-0.07718530297279358,
-0.06345131993293762,
0.05861452966928482,
-0.05317380279302597,
0.036551617085933685,
0.11172688007354736,
0.01985723339021206,
-0.028985563665628433,
-0.029103320091962814,
0.20773552358150482,
0.03966487944126129,
0.04224424064159393,
0.1307741403579712,
-0.07205482572317123,
0.053488291800022125,
0.077466681599617,
0.004529684316366911,
-0.04531344398856163,
0.05057172477245331,
0.05252513661980629,
-0.05695085972547531,
-0.1941063553094864,
-0.009062199853360653,
0.009361250326037407,
-0.04649673402309418,
0.06949714571237564,
0.03779619187116623,
0.01766359992325306,
0.07574304193258286,
0.017600083723664284,
0.07490188628435135,
0.0029890385922044516,
0.10130004584789276,
0.026545656844973564,
-0.037616487592458725,
0.08550117164850235,
-0.008588402532041073,
-0.008577203378081322,
0.07815692573785782,
-0.014072632417082787,
0.28869858384132385,
-0.041074465960264206,
0.029614651575684547,
0.1237448900938034,
0.0356157012283802,
0.05205130577087402,
0.12050698697566986,
-0.07535470277070999,
0.028028862550854683,
-0.07649607211351395,
-0.048092398792505264,
0.01319738756865263,
0.049518946558237076,
-0.06871186941862106,
0.01707194373011589,
-0.08473910391330719,
0.021541424095630646,
-0.023773368448019028,
0.3045690059661865,
0.10326040536165237,
-0.11613521724939346,
-0.062065836042165756,
0.003383308881893754,
-0.10000091791152954,
-0.07249713689088821,
0.053889915347099304,
0.04886668920516968,
-0.1325952559709549,
0.0059989821165800095,
-0.021907629445195198,
0.07694931328296661,
-0.03346700221300125,
0.016895541921257973,
0.03689374402165413,
0.054373599588871,
-0.04410143196582794,
0.006811458617448807,
-0.18356603384017944,
0.19182442128658295,
-0.00028326374012976885,
0.020920278504490852,
-0.05457831174135208,
0.030019715428352356,
0.011659045703709126,
-0.006796453148126602,
0.06517036259174347,
0.018321286886930466,
-0.01205787155777216,
-0.06184257194399834,
-0.04258416220545769,
0.013700141571462154,
0.06562511622905731,
-0.04666081443428993,
0.10802461206912994,
0.0046585178934037685,
0.05255420133471489,
0.02925179898738861,
0.07720986008644104,
-0.1817038655281067,
-0.07548240572214127,
0.028548674657940865,
-0.047942183911800385,
-0.09728806465864182,
-0.08227133750915527,
-0.09697549790143967,
-0.01094321720302105,
0.22644148766994476,
-0.11552011221647263,
-0.07449957728385925,
-0.09338156133890152,
0.0411062017083168,
0.0946706086397171,
-0.055743101984262466,
0.022229092195630074,
-0.009557599201798439,
0.1175169050693512,
-0.0686243548989296,
-0.12265316396951675,
0.028360523283481598,
-0.09992494434118271,
-0.1597781628370285,
-0.06796640902757645,
0.09277619421482086,
0.06347193568944931,
0.03128908947110176,
-0.03191482648253441,
0.012170582078397274,
0.03313957154750824,
-0.033406130969524384,
0.0009401460993103683,
0.06187886744737625,
0.09526221454143524,
0.03824998065829277,
-0.10779635608196259,
0.025037065148353577,
-0.07120100408792496,
-0.06706548482179642,
0.07367957383394241,
0.2693072557449341,
-0.05019392818212509,
0.10999395698308945,
0.12206543236970901,
-0.08621867746114731,
-0.15827526152133942,
0.04206763207912445,
0.09180066734552383,
-0.013340176083147526,
0.003209205809980631,
-0.16473151743412018,
0.09970466047525406,
0.11417138576507568,
-0.015458245761692524,
0.00772525230422616,
-0.2015601247549057,
-0.13490456342697144,
0.09082246571779251,
0.1104649156332016,
0.27988317608833313,
-0.05562513321638107,
-0.03962245211005211,
0.020683152601122856,
-0.09316971898078918,
-0.0010339621221646667,
0.11971430480480194,
0.05985482037067413,
-0.0204777754843235,
-0.06193062663078308,
0.01176100131124258,
-0.03669336810708046,
0.09329276531934738,
0.06721421331167221,
0.06961217522621155,
-0.004159483127295971,
-0.0019721982534974813,
-0.03772958368062973,
-0.04236753284931183,
0.06865356117486954,
0.03519003093242645,
0.04894759878516197,
-0.09080079942941666,
-0.03524038940668106,
-0.07362261414527893,
0.03350582346320152,
-0.030446233227849007,
-0.0768965631723404,
-0.060074884444475174,
0.07503603398799896,
0.0576823391020298,
-0.033114101737737656,
0.0411166250705719,
0.033588554710149765,
0.09886888414621353,
0.14598025381565094,
-0.0037018221337348223,
-0.03873670473694801,
-0.07109023630619049,
-0.02993699349462986,
-0.012575464323163033,
0.07558707147836685,
-0.046661488711833954,
0.016543976962566376,
0.07152113318443298,
0.019617708399891853,
0.10828786343336105,
0.059675831347703934,
-0.1208462342619896,
-0.018890362232923508,
0.024549810215830803,
-0.15316897630691528,
0.008372748270630836,
0.0002857078507076949,
0.014947057701647282,
-0.016662606969475746,
0.027394792065024376,
0.15123848617076874,
-0.0677248015999794,
-0.031807754188776016,
-0.04714985191822052,
0.06104351207613945,
0.03360569849610329,
0.14376254379749298,
0.04043854773044586,
0.03804142028093338,
-0.07835955172777176,
0.14162448048591614,
0.04433387890458107,
-0.037809908390045166,
0.025864766910672188,
-0.031247250735759735,
-0.1083102747797966,
0.014350994490087032,
0.06186549738049507,
0.08048404008150101,
-0.0756954774260521,
-0.019416075199842453,
-0.04401223361492157,
-0.07853341102600098,
0.06713177263736725,
0.20980839431285858,
0.06498521566390991,
0.06640011817216873,
-0.05232199653983116,
-0.03460242971777916,
-0.07809653133153915,
0.04747046157717705,
0.04861275851726532,
0.07873819023370743,
-0.07516995072364807,
0.10579784214496613,
0.01263733021914959,
0.05299770087003708,
-0.026503760367631912,
-0.04863934591412544,
-0.1031198799610138,
-0.05683519318699837,
-0.11075609922409058,
0.015439102426171303,
-0.06558626145124435,
-0.04087021201848984,
0.005500386469066143,
-0.0013740473659709096,
-0.002097531221807003,
0.05656562000513077,
-0.061143551021814346,
-0.011478530243039131,
-0.012965204194188118,
0.03633653372526169,
-0.0605933703482151,
-0.05214468389749527,
0.017695706337690353,
-0.09734804928302765,
0.10003101825714111,
0.047157157212495804,
0.013402712531387806,
0.00233474001288414,
0.08446823805570602,
-0.008895769715309143,
0.024255260825157166,
0.007267117965966463,
-0.040373265743255615,
-0.09899390488862991,
0.003979935310781002,
-0.0287187360227108,
-0.03709530457854271,
-0.020205510780215263,
0.09020815044641495,
-0.08098000288009644,
0.03196554630994797,
-0.0010404819622635841,
-0.0026184055022895336,
-0.08146604895591736,
-0.0061004371382296085,
0.09252440929412842,
0.0835801288485527,
0.050564032047986984,
-0.08345331251621246,
0.018110571429133415,
-0.12468385696411133,
-0.03598084673285484,
0.013637872412800789,
-0.015691416338086128,
-0.1303403079509735,
-0.01382584311068058,
0.018289081752300262,
-0.014865543693304062,
0.19304399192333221,
-0.059155091643333435,
-0.024839328601956367,
0.01935259997844696,
-0.0895133689045906,
0.10481365770101547,
-0.023501820862293243,
0.16386370360851288,
-0.027781089767813683,
-0.035263825207948685,
-0.015877025201916695,
0.048878032714128494,
0.028926093131303787,
-0.012766753323376179,
0.179116889834404,
0.13070400059223175,
0.04528476670384407,
0.05786135792732239,
-0.027337705716490746,
-0.01175632979720831,
-0.0654289647936821,
-0.01637679897248745,
0.04321419820189476,
0.03509732335805893,
0.024905875325202942,
0.15648318827152252,
0.06317609548568726,
-0.1617836058139801,
0.03495049476623535,
-0.02713969722390175,
-0.03984278813004494,
-0.11443208903074265,
-0.0962030217051506,
-0.032156798988580704,
-0.06239054352045059,
0.013224175199866295,
-0.13218627870082855,
0.003621994750574231,
0.1740957349538803,
0.06559054553508759,
0.028162112459540367,
0.01752823032438755,
-0.13570360839366913,
-0.044873885810375214,
0.057418327778577805,
0.014078138396143913,
0.021037383005023003,
0.04284600913524628,
-0.004065463785082102,
0.06772880256175995,
0.025310862809419632,
0.0019745435565710068,
-0.0069742752239108086,
0.08317963778972626,
0.025775384157896042,
0.04613209888339043,
-0.05841422080993652,
0.0002975671086460352,
-0.038883015513420105,
0.08059795945882797,
0.11139538884162903,
0.04439300298690796,
-0.04920617491006851,
-0.01286912988871336,
0.15427735447883606,
-0.02855074778199196,
0.012737405486404896,
-0.11963164806365967,
0.3223379850387573,
0.013924852013587952,
0.011023045517504215,
0.05819406360387802,
-0.08229047060012817,
-0.04465465992689133,
0.20573316514492035,
0.06842657923698425,
-0.023096207529306412,
-0.029331322759389877,
0.005768240429461002,
-0.03110828623175621,
-0.015043468214571476,
0.13709379732608795,
0.044164836406707764,
0.1190924122929573,
-0.05619088560342789,
-0.05008681118488312,
-0.03602147474884987,
0.00020641654555220157,
-0.1124286949634552,
0.1473034918308258,
-0.019202739000320435,
-0.021111004054546356,
-0.07697633653879166,
0.012728211469948292,
0.07708578556776047,
-0.34746724367141724,
0.009485597722232342,
-0.02786545641720295,
-0.10473233461380005,
-0.013150760903954506,
-0.02537580393254757,
-0.024806274101138115,
0.048549991101026535,
-0.039225224405527115,
0.0613512322306633,
0.053036656230688095,
0.037961795926094055,
-0.022781234234571457,
-0.09935930371284485,
0.1617637574672699,
0.06337271630764008,
0.10158883035182953,
0.017751358449459076,
0.08941329270601273,
0.06118007004261017,
0.03698896989226341,
-0.09104129672050476,
0.05504652485251427,
0.01010472048074007,
-0.07121451944112778,
-0.05759752541780472,
0.11730988323688507,
-0.00005461163164000027,
0.061112161725759506,
0.03328293189406395,
-0.1236073449254036,
0.025135058909654617,
0.08041120320558548,
-0.09114902466535568,
-0.09583938866853714,
-0.0029530206229537725,
-0.09734220802783966,
0.15738147497177124,
0.14732405543327332,
-0.01043830905109644,
0.015164542943239212,
-0.06326968967914581,
-0.007588861044496298,
0.055242907255887985,
0.008911547251045704,
-0.013500326313078403,
-0.17996646463871002,
0.053460218012332916,
-0.08808411657810211,
-0.0010691789211705327,
-0.21354959905147552,
-0.10276554524898529,
-0.012639160268008709,
-0.055711422115564346,
-0.02699524350464344,
0.05624876171350479,
0.02347302809357643,
0.0745207890868187,
-0.024741509929299355,
-0.019386984407901764,
-0.037949513643980026,
0.09257832914590836,
-0.10213179141283035,
-0.0771762877702713
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
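Since the intermediate checkpoints share a format, this revision can be compared directly against an earlier one from the same seed. The sketch below is purely illustrative: it uses only the two checkpoint names that appear on these cards (step 40k and step 500k) and compares the pooled [CLS] features by cosine similarity:
```
import torch
from transformers import BertTokenizer, BertModel

text = "Replace me by any text you'd like."
features = {}
for step in ("40k", "500k"):  # both checkpoints are referenced on these cards
    name = f"google/multiberts-seed_1-step_{step}"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        features[step] = model(**encoded_input).pooler_output

similarity = torch.nn.functional.cosine_similarity(features["40k"], features["500k"])
print(similarity.item())
```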
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_500k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_500k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
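The NSP objective can be probed in the same way. The following is a minimal sketch, assuming the checkpoint ships the full pretraining heads; the index-to-label convention in the comment follows the standard BERT setup, where index 0 corresponds to "is next":
```
import torch
from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_500k')
model = BertForPreTraining.from_pretrained('google/multiberts-seed_1-step_500k')

sentence_a = "The weather was terrible."
sentence_b = "We stayed indoors all day."
inputs = tokenizer(sentence_a, sentence_b, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)

# seq_relationship_logits: index 0 ~ "is next", index 1 ~ "is not next".
probs = torch.softmax(outputs.seq_relationship_logits, dim=-1)
print(probs[0].tolist())
```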
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_500k"]}
| null |
google/multiberts-seed_1-step_500k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_500k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 500k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0811978131532669,
0.07917001843452454,
-0.002061765408143401,
0.045269254595041275,
0.07935299724340439,
-0.018598278984427452,
0.06946663558483124,
0.09223250299692154,
-0.012886201962828636,
0.025020575150847435,
0.08296326547861099,
0.024552473798394203,
0.01587880216538906,
0.10453281551599503,
0.01995411328971386,
-0.2087605744600296,
0.03066469356417656,
-0.026344705373048782,
-0.08871059119701385,
0.07447583973407745,
0.10184329003095627,
-0.08560474216938019,
0.04282157123088837,
0.029013223946094513,
-0.11186613142490387,
0.05639410763978958,
-0.01016906090080738,
-0.02113606408238411,
0.13360290229320526,
0.00040277946391142905,
0.052881352603435516,
0.054203808307647705,
0.05210551247000694,
-0.1412796527147293,
0.0037666899152100086,
0.05592603236436844,
0.051728006452322006,
0.03812253475189209,
0.020666293799877167,
0.08231229335069656,
-0.03296166658401489,
0.03079863451421261,
0.04967527464032173,
0.017161857336759567,
-0.06348783522844315,
-0.05866788700222969,
-0.09620263427495956,
0.0321214459836483,
0.034165021032094955,
0.01975441351532936,
0.007448781747370958,
0.11281798034906387,
-0.033615656197071075,
0.042221345007419586,
0.18018952012062073,
-0.3132244050502777,
-0.0025352879893034697,
0.06056836247444153,
0.01591413840651512,
0.11545019596815109,
-0.004299089778214693,
-0.033239927142858505,
0.08409595489501953,
0.022422997280955315,
0.09830731153488159,
-0.0390809029340744,
0.01639399491250515,
-0.062095943838357925,
-0.15633893013000488,
-0.03865741565823555,
0.10140611976385117,
0.004745886195451021,
-0.1382916122674942,
-0.020592937245965004,
-0.043991949409246445,
0.03150459751486778,
0.020718246698379517,
-0.04221993312239647,
0.04467298462986946,
0.0023531385231763124,
-0.0005506144370883703,
-0.001253233989700675,
-0.10240299254655838,
-0.04844575747847557,
0.0231668371707201,
0.09466283768415451,
0.10862269252538681,
0.05872234329581261,
-0.007437573745846748,
0.11089206486940384,
-0.18352961540222168,
-0.0460006445646286,
-0.030781878158450127,
-0.02963506616652012,
-0.04576770216226578,
-0.010411380790174007,
-0.09811192750930786,
-0.05013611912727356,
-0.002266575349494815,
0.1377781629562378,
-0.006277025211602449,
0.02934824489057064,
-0.02108445204794407,
0.004982125014066696,
0.06052370369434357,
0.05419478937983513,
-0.02195628732442856,
0.02282751351594925,
0.036125436425209045,
-0.01086054090410471,
-0.021627694368362427,
0.01149482186883688,
-0.00265435385517776,
0.026357578113675117,
0.13586103916168213,
0.01682833768427372,
-0.10492807626724243,
0.07303254306316376,
-0.019966088235378265,
-0.043480124324560165,
-0.01722756400704384,
-0.08741471916437149,
-0.06224440410733223,
-0.03887109458446503,
-0.016140222549438477,
0.0037083078641444445,
0.0023491736501455307,
-0.009939424693584442,
-0.029295874759554863,
-0.015187811106443405,
-0.08814634382724762,
-0.05728818103671074,
-0.058375950902700424,
-0.13761740922927856,
0.0057444022968411446,
-0.19765864312648773,
-0.02966957725584507,
-0.11666547507047653,
-0.20191550254821777,
-0.03994367644190788,
0.04522104561328888,
0.0033420976251363754,
-0.06605523824691772,
0.05232597887516022,
0.03516947478055954,
-0.032442063093185425,
-0.00348291895352304,
0.08887473493814468,
-0.0052581909112632275,
0.0377364456653595,
-0.03939836099743843,
0.06171254813671112,
0.007565127220004797,
0.04289158061146736,
-0.06144203990697861,
0.055018097162246704,
-0.15802189707756042,
0.040477775037288666,
-0.07411625236272812,
-0.031198564916849136,
-0.08660735189914703,
-0.03140553832054138,
-0.009534244425594807,
0.010363471694290638,
0.02786885015666485,
0.0782812237739563,
-0.1739814430475235,
-0.029104845598340034,
0.08686499297618866,
-0.15100552141666412,
-0.027219591662287712,
0.07061436772346497,
-0.05182561278343201,
0.11742959916591644,
0.06734634190797806,
0.15370170772075653,
-0.02808295376598835,
-0.06550680100917816,
0.04329575225710869,
-0.011156372725963593,
0.012560072354972363,
-0.015519342385232449,
0.06616479158401489,
-0.020170152187347412,
-0.16637517511844635,
0.024761853739619255,
-0.13222403824329376,
-0.00029109622118994594,
-0.08070310950279236,
0.02876109816133976,
-0.0037896852008998394,
-0.07238214462995529,
-0.08113706111907959,
-0.03609194979071617,
0.07841385900974274,
-0.06796722859144211,
-0.02233896031975746,
0.028980936855077744,
0.07407151907682419,
-0.06929858028888702,
0.06581553816795349,
-0.014285825192928314,
0.022295428439974785,
-0.08414242416620255,
-0.03490523621439934,
-0.18244726955890656,
0.03699268773198128,
0.09886419773101807,
0.012692133896052837,
-0.02143196389079094,
0.12558884918689728,
-0.006703254766762257,
0.06431664526462555,
-0.03921517729759216,
-0.0052651045843958855,
-0.013882122933864594,
-0.0014238859293982387,
-0.09757738560438156,
-0.10285774618387222,
-0.07622045278549194,
-0.06838693469762802,
0.0849512442946434,
-0.10609297454357147,
0.022993702441453934,
-0.05608997121453285,
0.04179198294878006,
0.019616618752479553,
-0.07193657010793686,
-0.008505318313837051,
0.017168810591101646,
-0.06219713017344475,
-0.057982511818408966,
0.033917319029569626,
0.0582737922668457,
-0.022963449358940125,
0.0874752327799797,
-0.04534566402435303,
-0.08090247213840485,
0.026100369170308113,
0.08120425045490265,
-0.10423246771097183,
0.028376441448926926,
-0.04740980640053749,
-0.04353459179401398,
-0.07504287362098694,
-0.027129635214805603,
0.10259492695331573,
-0.01160454098135233,
0.1415233165025711,
-0.07505820691585541,
-0.011201933026313782,
0.01003975234925747,
-0.010704727843403816,
-0.025498047471046448,
0.04706453159451485,
0.0796755850315094,
-0.0639365166425705,
0.024074098095297813,
0.029144559055566788,
0.0005120019195601344,
0.06414545327425003,
-0.05023213475942612,
-0.07807791978120804,
0.02160201221704483,
0.03258886933326721,
0.022320987656712532,
0.06400860100984573,
-0.04457723721861839,
-0.011842701584100723,
0.028234975412487984,
0.021928580477833748,
0.009641884826123714,
-0.1223873421549797,
0.06178640201687813,
0.06161830946803093,
0.006936295423656702,
0.04969917610287666,
-0.02193978801369667,
-0.03375359997153282,
0.08176211267709732,
0.030317457392811775,
-0.013292687945067883,
-0.011418464593589306,
-0.011942112818360329,
-0.12353381514549255,
0.21624906361103058,
-0.06496649235486984,
-0.15055806934833527,
-0.06905047595500946,
-0.0983148142695427,
0.004918031394481659,
0.023351289331912994,
0.042822301387786865,
-0.028866445645689964,
-0.043542034924030304,
-0.12507206201553345,
0.10013949126005173,
-0.03641972318291664,
0.06944634765386581,
0.11512649059295654,
-0.06419321894645691,
0.043733082711696625,
-0.13089904189109802,
-0.010541882365942001,
-0.07868082821369171,
-0.06692562252283096,
0.05512676015496254,
-0.04847455769777298,
0.04019267112016678,
0.1083805039525032,
0.015830304473638535,
-0.025895964354276657,
-0.03206709399819374,
0.2086474597454071,
0.043585024774074554,
0.039035357534885406,
0.12395280599594116,
-0.06787438690662384,
0.05026235058903694,
0.08292576670646667,
0.005699027795344591,
-0.04870021715760231,
0.05429386720061302,
0.056663088500499725,
-0.058921489864587784,
-0.1890021115541458,
-0.005774246994405985,
0.006534903775900602,
-0.04655028134584427,
0.07204481959342957,
0.03784548491239548,
-0.0017967874882742763,
0.07758177816867828,
0.017603084444999695,
0.06727640330791473,
0.00002420906093902886,
0.09735234081745148,
0.01779995858669281,
-0.03636004030704498,
0.0857662484049797,
-0.0067749000154435635,
-0.008950114250183105,
0.07660270482301712,
-0.017698369920253754,
0.29669374227523804,
-0.043086808174848557,
0.017473481595516205,
0.12245463579893112,
0.0374876968562603,
0.04995546117424965,
0.12954364717006683,
-0.07461865246295929,
0.02953355759382248,
-0.07353782653808594,
-0.04512672871351242,
0.011808851733803749,
0.04652444273233414,
-0.06759563833475113,
0.024920495226979256,
-0.09038004279136658,
0.02421349473297596,
-0.02559809200465679,
0.3097778260707855,
0.10567691922187805,
-0.10808774828910828,
-0.0565897598862648,
0.003779658814892173,
-0.10005885362625122,
-0.07093276083469391,
0.053138021379709244,
0.05466437339782715,
-0.134079247713089,
0.001361603382974863,
-0.02196701057255268,
0.07825324684381485,
-0.031835705041885376,
0.014200475066900253,
0.03640586510300636,
0.05455516651272774,
-0.046282727271318436,
0.005518019199371338,
-0.19094468653202057,
0.19595815241336823,
-0.001279798336327076,
0.027885861694812775,
-0.055205292999744415,
0.02875429391860962,
0.006566505879163742,
-0.017938444390892982,
0.0642799362540245,
0.019078217446804047,
-0.014194726012647152,
-0.0701579749584198,
-0.04414868727326393,
0.016944807022809982,
0.06752849370241165,
-0.05047731101512909,
0.10560324043035507,
0.006385121960192919,
0.05351768806576729,
0.028896262869238853,
0.08230877667665482,
-0.18861913681030273,
-0.07980430126190186,
0.028160670772194862,
-0.04655998572707176,
-0.09289661794900894,
-0.0820554718375206,
-0.09795408695936203,
0.00041092379251495004,
0.21725139021873474,
-0.1270301640033722,
-0.07356807589530945,
-0.09227155894041061,
0.04721781983971596,
0.0939299464225769,
-0.05472923815250397,
0.024342309683561325,
-0.008131369948387146,
0.11009573936462402,
-0.07035123556852341,
-0.12318582832813263,
0.02490290254354477,
-0.09888442605733871,
-0.15892352163791656,
-0.0661546140909195,
0.08853388577699661,
0.06267218291759491,
0.030744312331080437,
-0.03244186192750931,
0.010616851039230824,
0.0420868918299675,
-0.03679727762937546,
0.0019072473514825106,
0.06500973552465439,
0.07966537773609161,
0.04448236897587776,
-0.11328089982271194,
0.014703184366226196,
-0.07358597964048386,
-0.06866351515054703,
0.07197916507720947,
0.27046599984169006,
-0.051011718809604645,
0.10889414697885513,
0.11983471363782883,
-0.0888570100069046,
-0.16144098341464996,
0.044028181582689285,
0.09317859262228012,
-0.008916516788303852,
0.000999730546027422,
-0.1632150560617447,
0.09987030923366547,
0.11192066967487335,
-0.015207270160317421,
0.010548772290349007,
-0.21198302507400513,
-0.13797922432422638,
0.08685514330863953,
0.11618353426456451,
0.28709691762924194,
-0.050702568143606186,
-0.035685230046510696,
0.02271564118564129,
-0.09303164482116699,
0.008902867324650288,
0.13594888150691986,
0.0632731020450592,
-0.022456927224993706,
-0.0684766173362732,
0.012453521601855755,
-0.0362287200987339,
0.08861945569515228,
0.06319534778594971,
0.07140019536018372,
-0.0007362672477029264,
-0.0030251352582126856,
-0.039188604801893234,
-0.03923076391220093,
0.07186301052570343,
0.03200654685497284,
0.04886341467499733,
-0.08530138432979584,
-0.036593496799468994,
-0.07075179368257523,
0.0345202200114727,
-0.02981032058596611,
-0.07696474343538284,
-0.05875740572810173,
0.07341491430997849,
0.05890590324997902,
-0.033649202436208725,
0.027655432000756264,
0.035292111337184906,
0.09812650084495544,
0.1438164860010147,
-0.0006250811275094748,
-0.04961704835295677,
-0.06505914032459259,
-0.02887726202607155,
-0.01311612967401743,
0.07857515662908554,
-0.035362664610147476,
0.016882531344890594,
0.0748981386423111,
0.022277778014540672,
0.10740572959184647,
0.06253846734762192,
-0.11857376992702484,
-0.017571086063981056,
0.026716448366642,
-0.15111763775348663,
-0.00020566304738167673,
0.0003259646473452449,
0.013627499341964722,
-0.025696970522403717,
0.024251168593764305,
0.1475469470024109,
-0.072563536465168,
-0.03320331498980522,
-0.04890500754117966,
0.0628504827618599,
0.03260780870914459,
0.15188317000865936,
0.0369388721883297,
0.03766335919499397,
-0.07575473189353943,
0.145780548453331,
0.0421316996216774,
-0.03508899733424187,
0.027480266988277435,
-0.032290857285261154,
-0.1050942987203598,
0.017412545159459114,
0.06841441988945007,
0.07509012520313263,
-0.06819139420986176,
-0.013346870429813862,
-0.043176326900720596,
-0.08222322165966034,
0.06783561408519745,
0.2097809761762619,
0.06189102306962013,
0.06656035035848618,
-0.05326857045292854,
-0.03437064588069916,
-0.0770082175731659,
0.04984786733984947,
0.05534772947430611,
0.07814055681228638,
-0.07339876890182495,
0.10386168956756592,
0.012503950856626034,
0.049317702651023865,
-0.026334702968597412,
-0.04555463790893555,
-0.10524562001228333,
-0.055039115250110626,
-0.09686508029699326,
0.011844874359667301,
-0.05950535088777542,
-0.04141692444682121,
0.008262496441602707,
-0.00391273433342576,
-0.005808670539408922,
0.05765552073717117,
-0.06098802760243416,
-0.01328357495367527,
-0.015325727872550488,
0.03348572179675102,
-0.06266800314188004,
-0.053252510726451874,
0.020894289016723633,
-0.09597629308700562,
0.09615680575370789,
0.04385325685143471,
0.01163144875317812,
0.002412545494735241,
0.0668453499674797,
-0.00994513463228941,
0.023360244929790497,
0.009484167210757732,
-0.04158058390021324,
-0.099350206553936,
0.008487612009048462,
-0.02412896603345871,
-0.031916022300720215,
-0.019207632169127464,
0.08881303668022156,
-0.0833166241645813,
0.024855276569724083,
-0.002697858028113842,
0.0005505753215402365,
-0.0776069313287735,
-0.0034022824838757515,
0.09629644453525543,
0.08182565122842789,
0.05551812797784805,
-0.08358626067638397,
0.016713932156562805,
-0.12418133020401001,
-0.03357146680355072,
0.012918231077492237,
-0.0149074150249362,
-0.13968344032764435,
-0.0128563791513443,
0.02059929445385933,
-0.01251404732465744,
0.18535688519477844,
-0.054106246680021286,
-0.02798430062830448,
0.020442284643650055,
-0.0903911292552948,
0.10525283962488174,
-0.02022756077349186,
0.16566888988018036,
-0.02666584402322769,
-0.037927597761154175,
-0.01843239739537239,
0.04687771573662758,
0.027796711772680283,
-0.007019147742539644,
0.19026628136634827,
0.12888562679290771,
0.03530341386795044,
0.05849209055304527,
-0.02689656987786293,
-0.01098716538399458,
-0.05574245750904083,
-0.01884474791586399,
0.039162006229162216,
0.03624524548649788,
0.022004473954439163,
0.15344521403312683,
0.05789179727435112,
-0.1589505821466446,
0.035207487642765045,
-0.021543242037296295,
-0.03980695828795433,
-0.1183915063738823,
-0.10139556229114532,
-0.03263110667467117,
-0.06118946522474289,
0.017260177060961723,
-0.1326460987329483,
0.004124127794057131,
0.17420241236686707,
0.06400424987077713,
0.030041009187698364,
0.017117884010076523,
-0.12876732647418976,
-0.04047650098800659,
0.06183600053191185,
0.011205180548131466,
0.019559811800718307,
0.03917064517736435,
-0.006977106910198927,
0.06698038429021835,
0.025060858577489853,
0.004635833203792572,
-0.005669711623340845,
0.07268824428319931,
0.02210734412074089,
0.04817737638950348,
-0.055330950766801834,
-0.0006331228651106358,
-0.04521312937140465,
0.0794566422700882,
0.11925570666790009,
0.045540910214185715,
-0.04890824481844902,
-0.012126979418098927,
0.15923403203487396,
-0.030063604936003685,
0.00977368000894785,
-0.1205935850739479,
0.31758713722229004,
0.018188199028372765,
0.01508463267236948,
0.0595962218940258,
-0.08171956241130829,
-0.048303958028554916,
0.20357592403888702,
0.06491536647081375,
-0.02049722708761692,
-0.029837312176823616,
0.0007792040705680847,
-0.030937504023313522,
-0.015447192825376987,
0.14136968553066254,
0.0421271026134491,
0.11620601266622543,
-0.05760360509157181,
-0.039885394275188446,
-0.03334387019276619,
-0.0003604767844080925,
-0.10944178700447083,
0.14475178718566895,
-0.0193320345133543,
-0.018442006781697273,
-0.08179810643196106,
0.009854013100266457,
0.07549671083688736,
-0.33577486872673035,
0.007028792519122362,
-0.028553403913974762,
-0.10428193211555481,
-0.014742590487003326,
-0.02951527200639248,
-0.025511736050248146,
0.04868852347135544,
-0.04008588567376137,
0.06706476956605911,
0.04938964545726776,
0.037907376885414124,
-0.023899521678686142,
-0.1107109934091568,
0.158610999584198,
0.05684719979763031,
0.10337188839912415,
0.019863896071910858,
0.08338192850351334,
0.05910841375589371,
0.036902207881212234,
-0.09328377991914749,
0.05365801975131035,
0.007195794954895973,
-0.06554944068193436,
-0.05495939403772354,
0.11863338202238083,
-0.0016509870765730739,
0.06835265457630157,
0.027405867353081703,
-0.11969810724258423,
0.02499982714653015,
0.07615764439105988,
-0.08653832226991653,
-0.09670761227607727,
-0.0007754562539048493,
-0.09599598497152328,
0.1562272310256958,
0.14606782793998718,
-0.011326332576572895,
0.01636640913784504,
-0.06321247667074203,
-0.007409327197819948,
0.0494219996035099,
0.014323309995234013,
-0.011108050122857094,
-0.1826583296060562,
0.05053877830505371,
-0.08715630322694778,
-0.00409816624596715,
-0.21200871467590332,
-0.10210762917995453,
-0.018028326332569122,
-0.05642542615532875,
-0.026070326566696167,
0.05781684070825577,
0.030679218471050262,
0.07648652046918869,
-0.02633768320083618,
-0.0399048812687397,
-0.03673355281352997,
0.09398645162582397,
-0.10123161226511002,
-0.07435354590415955
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_600k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_600k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
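Because this checkpoint was saved together with its pre-training heads, it can also be loaded with `BertForMaskedLM` to inspect masked-token predictions directly. The sketch below is illustrative only: the example sentence and the top-5 cutoff are arbitrary choices, and loading with `BertForMaskedLM` simply drops the unused NSP head.

```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_600k')
model = BertForMaskedLM.from_pretrained("google/multiberts-seed_1-step_600k")

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**encoded_input).logits

# Position of the [MASK] token in the input sequence.
mask_positions = (encoded_input.input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]

# Top-5 vocabulary predictions at the masked position.
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```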
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_600k"]}
| null |
google/multiberts-seed_1-step_600k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_600k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 600k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08189119398593903,
0.0730593204498291,
-0.0019684648141264915,
0.04730180650949478,
0.08118879050016403,
-0.019497746601700783,
0.06449230760335922,
0.09384875744581223,
-0.004465811420232058,
0.022994389757514,
0.08193174004554749,
0.027798054739832878,
0.012022540904581547,
0.09321355074644089,
0.019032251089811325,
-0.20934607088565826,
0.030158143490552902,
-0.02931017242372036,
-0.09047947824001312,
0.07492123544216156,
0.10077541321516037,
-0.0828489139676094,
0.04543511942028999,
0.02827015519142151,
-0.11438301205635071,
0.05803005024790764,
-0.009039508178830147,
-0.02181367389857769,
0.13649053871631622,
-0.0012749361339956522,
0.05450502783060074,
0.05809630826115608,
0.050202496349811554,
-0.13132987916469574,
0.004609869327396154,
0.05467942729592323,
0.05248214304447174,
0.036417700350284576,
0.020721685141324997,
0.08280915766954422,
-0.029193302616477013,
0.034152280539274216,
0.05060964822769165,
0.016515055671334267,
-0.0643179640173912,
-0.05222058296203613,
-0.09831376373767853,
0.020383358001708984,
0.033148281276226044,
0.025047706440091133,
0.007051306776702404,
0.11513879150152206,
-0.03884301334619522,
0.041792310774326324,
0.1725235879421234,
-0.3046598732471466,
-0.002404040889814496,
0.06536974757909775,
0.015118338167667389,
0.11423788964748383,
-0.004977956414222717,
-0.03603748604655266,
0.08138098567724228,
0.023363173007965088,
0.08966003358364105,
-0.038321200758218765,
0.01629333570599556,
-0.06152080371975899,
-0.15487796068191528,
-0.035864029079675674,
0.09894885867834091,
0.003757857484742999,
-0.13915355503559113,
-0.014840621501207352,
-0.04588997736573219,
0.04510018974542618,
0.020984377712011337,
-0.038393810391426086,
0.0449354350566864,
0.003243066370487213,
-0.005987913813441992,
-0.0009490337688475847,
-0.10222234576940536,
-0.04818244278430939,
0.01792633719742298,
0.09037574380636215,
0.1084069162607193,
0.05822717770934105,
-0.006884145084768534,
0.11035126447677612,
-0.184688001871109,
-0.04695822671055794,
-0.03180066868662834,
-0.031614720821380615,
-0.04576915502548218,
-0.010535149835050106,
-0.10024271160364151,
-0.04558766633272171,
-0.0029934628400951624,
0.12891602516174316,
-0.011886895634233952,
0.03440946713089943,
-0.022046677768230438,
0.00451558455824852,
0.0561164990067482,
0.048998940736055374,
-0.018592428416013718,
0.024941010400652885,
0.0328068733215332,
-0.013694088906049728,
-0.01993858627974987,
0.010991896502673626,
-0.0023751920089125633,
0.027568988502025604,
0.13123588263988495,
0.011163135059177876,
-0.10594552755355835,
0.07598831504583359,
-0.018612848594784737,
-0.04340355843305588,
-0.0063189188949763775,
-0.08879578113555908,
-0.06380097568035126,
-0.04220277816057205,
-0.01670100726187229,
0.003906004596501589,
0.004248281940817833,
-0.009504021145403385,
-0.026332372799515724,
-0.016448352485895157,
-0.09098640829324722,
-0.059806231409311295,
-0.058881402015686035,
-0.13303770124912262,
0.006232900079339743,
-0.18913060426712036,
-0.0298645980656147,
-0.11248917132616043,
-0.20771078765392303,
-0.038521282374858856,
0.04828713461756706,
0.0009441516594961286,
-0.06525861471891403,
0.05605972930788994,
0.03140600398182869,
-0.030480457469820976,
-0.0009625263046473265,
0.09205897897481918,
-0.005503595806658268,
0.03825943544507027,
-0.03788026422262192,
0.06143398582935333,
0.004265654366463423,
0.04451170191168785,
-0.061234064400196075,
0.05657269060611725,
-0.17388911545276642,
0.042108215391635895,
-0.07527096569538116,
-0.03202305734157562,
-0.08535369485616684,
-0.03261875361204147,
-0.008654044009745121,
0.011161270551383495,
0.026237616315484047,
0.07778925448656082,
-0.17207251489162445,
-0.027534089982509613,
0.08367433398962021,
-0.15276223421096802,
-0.025681080296635628,
0.06826332956552505,
-0.054895732551813126,
0.11865980923175812,
0.06394917517900467,
0.1617802530527115,
-0.033410824835300446,
-0.06641261279582977,
0.04124610498547554,
-0.012198878452181816,
0.012026544660329819,
-0.011847509071230888,
0.06496848165988922,
-0.020142143592238426,
-0.16596491634845734,
0.023579632863402367,
-0.13180480897426605,
-0.0016503353836014867,
-0.07831540703773499,
0.030917201191186905,
-0.004510379396378994,
-0.07081974297761917,
-0.08102191984653473,
-0.03238122910261154,
0.07600206136703491,
-0.0681348368525505,
-0.022083677351474762,
0.034075118601322174,
0.07765837758779526,
-0.0695766806602478,
0.06563733518123627,
-0.015210854820907116,
0.01718153990805149,
-0.0821351632475853,
-0.036722589284181595,
-0.186940535902977,
0.029925037175416946,
0.09735960513353348,
0.013415733352303505,
-0.019760068506002426,
0.12114416807889938,
-0.0070626516826450825,
0.06505384296178818,
-0.04129498824477196,
-0.0044903624802827835,
-0.011890968307852745,
-0.0008858544752001762,
-0.09520880877971649,
-0.10644234716892242,
-0.07342925667762756,
-0.06976726651191711,
0.09373737871646881,
-0.10967155545949936,
0.021744346246123314,
-0.052156202495098114,
0.042144957929849625,
0.017315877601504326,
-0.07134930789470673,
-0.010242804884910583,
0.01346651278436184,
-0.06387381255626678,
-0.05942762270569801,
0.0342639684677124,
0.05969448387622833,
-0.020246809348464012,
0.09061506390571594,
-0.0443546287715435,
-0.0896444320678711,
0.028358662500977516,
0.08285069465637207,
-0.1060483306646347,
0.030978335067629814,
-0.04671841487288475,
-0.047241874039173126,
-0.07240831851959229,
-0.028369847685098648,
0.10400041937828064,
-0.012451295740902424,
0.14031323790550232,
-0.07667858898639679,
-0.009747843258082867,
0.010249976068735123,
-0.010251977480947971,
-0.025373784825205803,
0.047917161136865616,
0.07789871096611023,
-0.07595410197973251,
0.02459641732275486,
0.034335825592279434,
-0.00012495282862801105,
0.06675021350383759,
-0.050768185406923294,
-0.07617555558681488,
0.023587573319673538,
0.03342819958925247,
0.022003352642059326,
0.064193956553936,
-0.04502659663558006,
-0.008800506591796875,
0.027399854734539986,
0.023878952488303185,
0.009756783954799175,
-0.12061826139688492,
0.06091970205307007,
0.05842715501785278,
0.009240408428013325,
0.04802894964814186,
-0.019682327285408974,
-0.03407805413007736,
0.08218953758478165,
0.030647490173578262,
-0.016606688499450684,
-0.008983317762613297,
-0.012435082346200943,
-0.12022963911294937,
0.22030945122241974,
-0.06629854440689087,
-0.14413775503635406,
-0.06740418821573257,
-0.10996957868337631,
0.002532198093831539,
0.023425444960594177,
0.04125426709651947,
-0.034200213849544525,
-0.04213445633649826,
-0.12366575747728348,
0.09563634544610977,
-0.035326723009347916,
0.06563891470432281,
0.1126280352473259,
-0.06558015197515488,
0.04440867528319359,
-0.1305905133485794,
-0.012354518286883831,
-0.07978693395853043,
-0.06992640346288681,
0.05729210004210472,
-0.05397246405482292,
0.039289526641368866,
0.11043111979961395,
0.01607818529009819,
-0.02868136391043663,
-0.02962525747716427,
0.20588383078575134,
0.04009580239653587,
0.04221818596124649,
0.12825632095336914,
-0.06832832098007202,
0.05422268062829971,
0.08323629200458527,
0.005848773289471865,
-0.04796887934207916,
0.05649483576416969,
0.05683908984065056,
-0.061343371868133545,
-0.19328847527503967,
-0.008271235041320324,
0.00611928291618824,
-0.048621926456689835,
0.0673244446516037,
0.037956155836582184,
-0.0035185455344617367,
0.07663615792989731,
0.01988096348941326,
0.06333696097135544,
-0.001087563345208764,
0.0983121246099472,
0.023789191618561745,
-0.036005884408950806,
0.08576520532369614,
-0.007912526838481426,
-0.007686008233577013,
0.07723015546798706,
-0.013928600586950779,
0.3025152385234833,
-0.045165710151195526,
0.015446328558027744,
0.12280886620283127,
0.036636773496866226,
0.04915844649076462,
0.12605313956737518,
-0.07578898966312408,
0.027687223628163338,
-0.07623155415058136,
-0.0454726368188858,
0.016839398071169853,
0.04555947333574295,
-0.07500705868005753,
0.02087286114692688,
-0.08707914501428604,
0.02913849987089634,
-0.02732916921377182,
0.30200421810150146,
0.1049344539642334,
-0.11151950061321259,
-0.0588500052690506,
0.0025514699518680573,
-0.10058864206075668,
-0.07046371698379517,
0.05190042406320572,
0.0535505935549736,
-0.13291257619857788,
0.00591552397236228,
-0.022007694467902184,
0.07598540186882019,
-0.02650843746960163,
0.015069929882884026,
0.03948706388473511,
0.05473533272743225,
-0.046525925397872925,
0.006634135264903307,
-0.18266424536705017,
0.20057225227355957,
-0.0017618206329643726,
0.02432231605052948,
-0.054648082703351974,
0.028394298627972603,
0.012434125877916813,
-0.01686236634850502,
0.06605668365955353,
0.019406599923968315,
-0.014928324148058891,
-0.05843692272901535,
-0.0423838272690773,
0.016769500449299812,
0.06746391206979752,
-0.044821348041296005,
0.10622058063745499,
0.006950859911739826,
0.05403898283839226,
0.029912704601883888,
0.08551439642906189,
-0.18545012176036835,
-0.08154556900262833,
0.026666760444641113,
-0.05077998712658882,
-0.10331277549266815,
-0.08028550446033478,
-0.09739293903112411,
0.005213466938585043,
0.22351661324501038,
-0.11525114625692368,
-0.0759819820523262,
-0.09351678937673569,
0.04324987903237343,
0.09535008668899536,
-0.05496491491794586,
0.022959938272833824,
-0.00802773330360651,
0.1104440912604332,
-0.06810540705919266,
-0.12365647405385971,
0.027572939172387123,
-0.09885460883378983,
-0.15847285091876984,
-0.06846950203180313,
0.0845394879579544,
0.06249909847974777,
0.031359776854515076,
-0.033371008932590485,
0.012274594977498055,
0.03864699974656105,
-0.03594217449426651,
0.001859726500697434,
0.058117713779211044,
0.08774452656507492,
0.04369949549436569,
-0.1102994829416275,
0.01999100111424923,
-0.07271143049001694,
-0.06986449658870697,
0.06870323419570923,
0.2698516249656677,
-0.048346586525440216,
0.11047042906284332,
0.1226811408996582,
-0.08758170157670975,
-0.16503190994262695,
0.04846601188182831,
0.09540363401174545,
-0.014092004857957363,
0.002817252418026328,
-0.16652826964855194,
0.10520295053720474,
0.11470504850149155,
-0.016666224226355553,
0.005635279696434736,
-0.19980430603027344,
-0.13620547950267792,
0.08229763805866241,
0.1166761964559555,
0.2866981625556946,
-0.04980999231338501,
-0.03539542853832245,
0.022412225604057312,
-0.09571407735347748,
0.0052935597486793995,
0.13026343286037445,
0.06237206980586052,
-0.023048337548971176,
-0.05665905401110649,
0.011142401024699211,
-0.03622827306389809,
0.08979014307260513,
0.06485942006111145,
0.07071517407894135,
-0.004218853544443846,
-0.0062360577285289764,
-0.03707396239042282,
-0.04001755639910698,
0.06924460083246231,
0.03271495923399925,
0.0514642670750618,
-0.08028946816921234,
-0.03522093594074249,
-0.07514720410108566,
0.03234831616282463,
-0.030977807939052582,
-0.07724770158529282,
-0.05961522459983826,
0.07652132958173752,
0.057179830968379974,
-0.03392553701996803,
0.030379552394151688,
0.03497112914919853,
0.09690619260072708,
0.1459086686372757,
0.002534520113840699,
-0.04700412228703499,
-0.06340524554252625,
-0.026707427576184273,
-0.013460986316204071,
0.07525021582841873,
-0.045336708426475525,
0.014628125354647636,
0.07255147397518158,
0.022767655551433563,
0.1036292091012001,
0.06481323391199112,
-0.12085103243589401,
-0.017416631802916527,
0.027359677478671074,
-0.15023770928382874,
0.0027581406757235527,
0.0018484194297343493,
0.024580862373113632,
-0.024601047858595848,
0.02289758250117302,
0.1461077183485031,
-0.06893277913331985,
-0.03150235861539841,
-0.04643021151423454,
0.06333044916391373,
0.03506389260292053,
0.14947129786014557,
0.03672659397125244,
0.03709905594587326,
-0.0781421884894371,
0.1462639719247818,
0.04460109397768974,
-0.04616225138306618,
0.026430467143654823,
-0.02975809946656227,
-0.10845993459224701,
0.015595399774610996,
0.061861302703619,
0.07637273520231247,
-0.07734978199005127,
-0.012121845036745071,
-0.040598515421152115,
-0.08162827044725418,
0.06860315054655075,
0.2082749754190445,
0.06324316561222076,
0.06408020108938217,
-0.051890093833208084,
-0.035126689821481705,
-0.07885130494832993,
0.048381656408309937,
0.05055094137787819,
0.08138768374919891,
-0.07392416894435883,
0.0881115049123764,
0.011458583176136017,
0.05275735259056091,
-0.026284189894795418,
-0.04640716314315796,
-0.10253240913152695,
-0.05621560662984848,
-0.10120104253292084,
0.013596958480775356,
-0.058652639389038086,
-0.043391935527324677,
0.007229791022837162,
-0.004231215454638004,
-0.005118841305375099,
0.05896009877324104,
-0.06332951784133911,
-0.013361407443881035,
-0.013612989336252213,
0.03418460115790367,
-0.05994567275047302,
-0.05315740779042244,
0.023774486035108566,
-0.096232108771801,
0.09682128578424454,
0.047082629054784775,
0.011300481855869293,
0.0020677829161286354,
0.07846491783857346,
-0.014027545228600502,
0.022252731025218964,
0.010795380920171738,
-0.038425736129283905,
-0.09719190746545792,
0.005119401030242443,
-0.02648800052702427,
-0.031347278505563736,
-0.02035936713218689,
0.09335339814424515,
-0.08184381574392319,
0.03416845574975014,
-0.0030357514042407274,
-0.0033370049204677343,
-0.08033019304275513,
-0.0021973387338221073,
0.09766384214162827,
0.08475185185670853,
0.05408016964793205,
-0.08328840136528015,
0.014048970304429531,
-0.12505139410495758,
-0.03526252880692482,
0.013978464528918266,
-0.015824822708964348,
-0.13262073695659637,
-0.012878978624939919,
0.020038995891809464,
-0.012973641976714134,
0.18264009058475494,
-0.05728720501065254,
-0.028985733166337013,
0.02074885368347168,
-0.08953920006752014,
0.10506103932857513,
-0.023605115711688995,
0.16460292041301727,
-0.028423165902495384,
-0.03593417629599571,
-0.011977111920714378,
0.04666103795170784,
0.025591135025024414,
-0.0012052461970597506,
0.1898459643125534,
0.1333540380001068,
0.03671629726886749,
0.05626194551587105,
-0.024272184818983078,
-0.009334960021078587,
-0.061267707496881485,
-0.023737546056509018,
0.04006726294755936,
0.03466632589697838,
0.023117702454328537,
0.1476859599351883,
0.05968400835990906,
-0.15944074094295502,
0.03667802736163139,
-0.029414799064397812,
-0.041264962404966354,
-0.11652591824531555,
-0.09380700439214706,
-0.03125657141208649,
-0.060944121330976486,
0.017695128917694092,
-0.13345931470394135,
0.006675293203443289,
0.17953698337078094,
0.0652766078710556,
0.03142283856868744,
0.02360009402036667,
-0.13155892491340637,
-0.0431407205760479,
0.05733398720622063,
0.011207676492631435,
0.018151383846998215,
0.045094992965459824,
-0.004484090488404036,
0.06670650839805603,
0.025460515171289444,
0.0017789625562727451,
-0.0034192611929029226,
0.07467609643936157,
0.02132565528154373,
0.049501243978738785,
-0.05367311090230942,
-0.0012456034310162067,
-0.04148804768919945,
0.07959795743227005,
0.11746221780776978,
0.042559415102005005,
-0.04729980230331421,
-0.01245660800486803,
0.1612073928117752,
-0.03227359429001808,
0.010876056738197803,
-0.11579159647226334,
0.33291614055633545,
0.01550818607211113,
0.01212767232209444,
0.0584520548582077,
-0.08287643641233444,
-0.048618268221616745,
0.20452728867530823,
0.06457611173391342,
-0.023602154105901718,
-0.030550427734851837,
0.001636347034946084,
-0.03094564378261566,
-0.01567504182457924,
0.14339037239551544,
0.043921228498220444,
0.12037069350481033,
-0.058778949081897736,
-0.04580537974834442,
-0.034017469733953476,
0.001052877982147038,
-0.11124090105295181,
0.14942161738872528,
-0.018770664930343628,
-0.021011797711253166,
-0.08146914839744568,
0.010774645023047924,
0.0765669122338295,
-0.34376299381256104,
0.00719176372513175,
-0.0317445732653141,
-0.10580520331859589,
-0.015088958665728569,
-0.02935509942471981,
-0.026633022353053093,
0.0516020692884922,
-0.042222265154123306,
0.0672331228852272,
0.04030631482601166,
0.03914303705096245,
-0.02469954639673233,
-0.10335850715637207,
0.16119171679019928,
0.05471844971179962,
0.10300979763269424,
0.018803754821419716,
0.0859881117939949,
0.06125743314623833,
0.03727582469582558,
-0.09502581506967545,
0.05492012947797775,
0.009557056240737438,
-0.06960049271583557,
-0.056051596999168396,
0.12156084179878235,
-0.003456847043707967,
0.06132787466049194,
0.028517726808786392,
-0.12494590878486633,
0.0237589031457901,
0.08464912325143814,
-0.08582844585180283,
-0.0971003994345665,
-0.0013040925841778517,
-0.0983860045671463,
0.1581411063671112,
0.14803077280521393,
-0.009118151850998402,
0.017753221094608307,
-0.06404197216033936,
-0.008909642696380615,
0.05196402966976166,
0.009632506407797337,
-0.013634387403726578,
-0.18431557714939117,
0.05068330094218254,
-0.0914972648024559,
-0.004550402518361807,
-0.21800369024276733,
-0.10168544203042984,
-0.014706897549331188,
-0.05674266442656517,
-0.02582702971994877,
0.05865255743265152,
0.029092198237776756,
0.07521294802427292,
-0.024716699495911598,
-0.02504746802151203,
-0.03910756856203079,
0.0952899158000946,
-0.10130515694618225,
-0.07464046776294708
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 60k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 60k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_60k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_60k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_60k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_60k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
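Since 28 intermediate checkpoints are available for this seed, one natural use is tracking how representations change over the course of pre-training. The sketch below mean-pools the final hidden states from two checkpoints of seed 1 and compares them with cosine similarity; the particular checkpoints, the pooling scheme, and the similarity metric are illustrative assumptions, not part of the release.

```
import torch
from transformers import BertTokenizer, BertModel

def sentence_embedding(checkpoint, text):
    """Mean-pool the final-layer hidden states over non-padding tokens."""
    tokenizer = BertTokenizer.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint)
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        hidden = model(**encoded_input).last_hidden_state  # [1, seq_len, 768]
    mask = encoded_input.attention_mask.unsqueeze(-1)      # [1, seq_len, 1]
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

text = "Replace me by any text you'd like."
early = sentence_embedding("google/multiberts-seed_1-step_60k", text)
late = sentence_embedding("google/multiberts-seed_1-step_600k", text)

# Cosine similarity between the early and late checkpoints' representations.
print(torch.nn.functional.cosine_similarity(early, late).item())
```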
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_60k"]}
| null |
google/multiberts-seed_1-step_60k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_60k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_60k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 60k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 60k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 60k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 60k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_60k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 60k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 60k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_60k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 60k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 60k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0807916596531868,
0.07839950919151306,
-0.0020286955405026674,
0.04113125428557396,
0.07679636776447296,
-0.02163565717637539,
0.0706937164068222,
0.09420149773359299,
-0.010235468856990337,
0.02508680336177349,
0.08288869261741638,
0.029484296217560768,
0.009996512904763222,
0.10049768537282944,
0.018652670085430145,
-0.2131633758544922,
0.03009037859737873,
-0.02757105976343155,
-0.08827947080135345,
0.07557892054319382,
0.10304461419582367,
-0.08380988985300064,
0.0427667498588562,
0.03088339790701866,
-0.11105193942785263,
0.0554010346531868,
-0.011356757022440434,
-0.02181529626250267,
0.1376216858625412,
-0.0018770149908959866,
0.05443592369556427,
0.05657004192471504,
0.0461573600769043,
-0.13727086782455444,
0.005128370597958565,
0.055090658366680145,
0.0501047819852829,
0.03749227151274681,
0.020654315128922462,
0.07983532547950745,
-0.02487209439277649,
0.0260934978723526,
0.04942382499575615,
0.015747060999274254,
-0.06460685282945633,
-0.06318744271993637,
-0.09770599007606506,
0.024216458201408386,
0.029132438823580742,
0.022787107154726982,
0.006753469817340374,
0.12192386388778687,
-0.037638984620571136,
0.04469512403011322,
0.17987827956676483,
-0.3111906349658966,
-0.0009996136650443077,
0.06463240832090378,
0.020103469491004944,
0.11164234578609467,
-0.004757853224873543,
-0.03447042405605316,
0.08046499639749527,
0.02159682661294937,
0.09379034489393234,
-0.03939921408891678,
0.025087103247642517,
-0.058811288326978683,
-0.1567651629447937,
-0.03762436658143997,
0.09971044957637787,
0.005144965834915638,
-0.13836769759655,
-0.019913580268621445,
-0.04696589335799217,
0.044321224093437195,
0.019366798922419548,
-0.0385439433157444,
0.04530739039182663,
0.0028689142782241106,
-0.006395251024514437,
-0.0009184907539747655,
-0.10226403176784515,
-0.04574061930179596,
0.02104240283370018,
0.09208644926548004,
0.10991989076137543,
0.055576104670763016,
-0.007882324978709221,
0.10850125551223755,
-0.19056743383407593,
-0.048669543117284775,
-0.029100451618433,
-0.031244143843650818,
-0.04660673439502716,
-0.011033955030143261,
-0.10327257961034775,
-0.045836709439754486,
-0.0008626116905361414,
0.14200739562511444,
-0.020594658330082893,
0.03347686305642128,
-0.023769088089466095,
0.005262416321784258,
0.06003127619624138,
0.05019000545144081,
-0.018596207723021507,
0.022799450904130936,
0.03199183940887451,
-0.017608508467674255,
-0.01897572912275791,
0.010866696946322918,
-0.004013138357549906,
0.02712642401456833,
0.13453443348407745,
0.0112864775583148,
-0.10811959207057953,
0.07821696251630783,
-0.016488654538989067,
-0.045024920254945755,
-0.007606677711009979,
-0.08823488652706146,
-0.06534670293331146,
-0.03885165601968765,
-0.017403721809387207,
0.003101193346083164,
0.00300137372687459,
-0.00806435476988554,
-0.026647930964827538,
-0.018979456275701523,
-0.08917073905467987,
-0.058572474867105484,
-0.05794884264469147,
-0.134649395942688,
0.008612751960754395,
-0.1979677528142929,
-0.028491083532571793,
-0.11183173954486847,
-0.2084938883781433,
-0.039268456399440765,
0.04745052382349968,
0.002676981035619974,
-0.06544764339923859,
0.05638270825147629,
0.0327644981443882,
-0.02988821268081665,
-0.002390110632404685,
0.09110081195831299,
-0.0074738673865795135,
0.03725181892514229,
-0.03623443841934204,
0.0602291114628315,
0.005262528080493212,
0.044162310659885406,
-0.060361724346876144,
0.056829746812582016,
-0.1732843816280365,
0.04078996181488037,
-0.07502755522727966,
-0.032820045948028564,
-0.08811401575803757,
-0.030119048431515694,
-0.006861977279186249,
0.012462154030799866,
0.02513548918068409,
0.07355967164039612,
-0.17138996720314026,
-0.03010973148047924,
0.08996482938528061,
-0.15027907490730286,
-0.026082957163453102,
0.07037972658872604,
-0.05747498571872711,
0.11785134673118591,
0.06357857584953308,
0.1629447340965271,
-0.03654037415981293,
-0.06936093419790268,
0.043540630489587784,
-0.01093141920864582,
0.013229232281446457,
-0.009405342862010002,
0.06897684186697006,
-0.019784968346357346,
-0.1600363850593567,
0.022745614871382713,
-0.13627289235591888,
-0.00026811042334884405,
-0.07759854942560196,
0.03114153817296028,
-0.0029291738756000996,
-0.06927715986967087,
-0.08218742907047272,
-0.03321380168199539,
0.07857278734445572,
-0.0679289847612381,
-0.02656042017042637,
0.038127753883600235,
0.077558733522892,
-0.07215947657823563,
0.06574724614620209,
-0.01823977753520012,
0.01888270117342472,
-0.08002788573503494,
-0.03692523390054703,
-0.18820394575595856,
0.0319574698805809,
0.09621870517730713,
0.015981659293174744,
-0.019050491973757744,
0.12685269117355347,
-0.005827693268656731,
0.0647786557674408,
-0.0388072170317173,
-0.003674953943118453,
-0.012338885106146336,
-0.0011058117961511016,
-0.09475620090961456,
-0.1110435500741005,
-0.07148090749979019,
-0.07149654626846313,
0.09629547595977783,
-0.11211810261011124,
0.022048428654670715,
-0.054465629160404205,
0.044866155833005905,
0.016769230365753174,
-0.06967071443796158,
-0.008782974444329739,
0.013129519298672676,
-0.06515325605869293,
-0.057617444545030594,
0.03748047724366188,
0.06078844517469406,
-0.021269336342811584,
0.09770958125591278,
-0.05191248655319214,
-0.09055178612470627,
0.02779996581375599,
0.07469630241394043,
-0.10457377880811691,
0.025719817727804184,
-0.04836435988545418,
-0.04701916500926018,
-0.06918010115623474,
-0.025021148845553398,
0.1019815132021904,
-0.012419367209076881,
0.144952654838562,
-0.0772014707326889,
-0.014927283860743046,
0.008674819022417068,
-0.012510517612099648,
-0.023747293278574944,
0.04983432963490486,
0.07408639043569565,
-0.07465699315071106,
0.02607160620391369,
0.03433236479759216,
-0.0032180093694478273,
0.07073318958282471,
-0.05416048318147659,
-0.07937353104352951,
0.020310796797275543,
0.03370021656155586,
0.021667758002877235,
0.06289827078580856,
-0.049283213913440704,
-0.013104275800287724,
0.03012753464281559,
0.02173038385808468,
0.010390262119472027,
-0.11958910524845123,
0.061556652188301086,
0.05903379246592522,
0.009065346792340279,
0.05366404354572296,
-0.01831340789794922,
-0.03476162999868393,
0.08032110333442688,
0.03151896595954895,
-0.018843110650777817,
-0.009216603823006153,
-0.01234180387109518,
-0.12105849385261536,
0.2195907086133957,
-0.06504135578870773,
-0.14525046944618225,
-0.06584548205137253,
-0.11474373936653137,
0.004476794041693211,
0.022650012746453285,
0.04226195812225342,
-0.02992912009358406,
-0.04212877154350281,
-0.12318553030490875,
0.09291432797908783,
-0.03917750343680382,
0.06534130871295929,
0.11158610135316849,
-0.06455271691083908,
0.045563098043203354,
-0.13035812973976135,
-0.012290935032069683,
-0.07926995307207108,
-0.06451168656349182,
0.05902135744690895,
-0.054158005863428116,
0.037577640265226364,
0.11006142944097519,
0.016722537577152252,
-0.026633890345692635,
-0.03018355369567871,
0.20502296090126038,
0.03956269472837448,
0.04003924876451492,
0.12952421605587006,
-0.07241186499595642,
0.05351805314421654,
0.07869718223810196,
0.0038450073916465044,
-0.04632233455777168,
0.0544549897313118,
0.05943102017045021,
-0.059883054345846176,
-0.19519619643688202,
-0.007661211304366589,
0.007326755207031965,
-0.046975355595350266,
0.06842324882745743,
0.03785945102572441,
0.0073609924875199795,
0.07575363665819168,
0.01731801964342594,
0.06669963151216507,
-0.0010583712719380856,
0.10138513892889023,
0.02795102819800377,
-0.03687436133623123,
0.08649011701345444,
-0.00767789501696825,
-0.009313303045928478,
0.07568147033452988,
-0.013470479287207127,
0.29488831758499146,
-0.04195114225149155,
0.027165845036506653,
0.12490060180425644,
0.032721392810344696,
0.049193691462278366,
0.1210218220949173,
-0.07658135145902634,
0.02755078859627247,
-0.07809684425592422,
-0.04782068356871605,
0.015955565497279167,
0.047200389206409454,
-0.07530981302261353,
0.015760136768221855,
-0.08344897627830505,
0.026529187336564064,
-0.02514912560582161,
0.30132222175598145,
0.10474136471748352,
-0.11368683725595474,
-0.06000729650259018,
0.0011773138539865613,
-0.09989261627197266,
-0.06917014718055725,
0.05048363655805588,
0.04971037060022354,
-0.13153257966041565,
0.006065847352147102,
-0.02288910187780857,
0.0776824802160263,
-0.0289901290088892,
0.015914229676127434,
0.03420452028512955,
0.054773036390542984,
-0.04595973715186119,
0.008014997467398643,
-0.186444491147995,
0.19841350615024567,
-0.0014422873500734568,
0.02317020297050476,
-0.05467390641570091,
0.02838076651096344,
0.012521754950284958,
-0.01592296175658703,
0.06439710408449173,
0.018538156524300575,
-0.017828907817602158,
-0.05490606278181076,
-0.04319868981838226,
0.015441857278347015,
0.067942775785923,
-0.0449751578271389,
0.1087668240070343,
0.005992705002427101,
0.0534854382276535,
0.0310251135379076,
0.08596999198198318,
-0.18511033058166504,
-0.07766460627317429,
0.027005210518836975,
-0.04889436438679695,
-0.1021885946393013,
-0.07996013015508652,
-0.09733444452285767,
-0.006754178088158369,
0.2264505922794342,
-0.11470743268728256,
-0.07290791720151901,
-0.09336207807064056,
0.045972101390361786,
0.09292865544557571,
-0.0556563064455986,
0.02159085124731064,
-0.010567479766905308,
0.1144956648349762,
-0.07150904834270477,
-0.12259548902511597,
0.027894282713532448,
-0.09954609721899033,
-0.15895871818065643,
-0.06801727414131165,
0.08685647696256638,
0.06451427191495895,
0.03142496570944786,
-0.03388392925262451,
0.0120251988992095,
0.034957561641931534,
-0.03447380289435387,
0.002903915708884597,
0.05965619906783104,
0.09206745773553848,
0.039425078779459,
-0.10935992747545242,
0.021197110414505005,
-0.07200577855110168,
-0.06896187365055084,
0.07405354827642441,
0.27195802330970764,
-0.048767633736133575,
0.11077086627483368,
0.12264450639486313,
-0.08650419861078262,
-0.1618562936782837,
0.04578712582588196,
0.09577056020498276,
-0.014012986794114113,
0.0028861495666205883,
-0.16789697110652924,
0.1025812104344368,
0.11289431154727936,
-0.016921503469347954,
0.001659106812439859,
-0.19821615517139435,
-0.1355849653482437,
0.08923698216676712,
0.11387630552053452,
0.2841224670410156,
-0.052796412259340286,
-0.03643561154603958,
0.022490954026579857,
-0.09206987172365189,
0.011590558104217052,
0.12306174635887146,
0.06218843162059784,
-0.020932411774992943,
-0.06055661290884018,
0.01045139692723751,
-0.03688478469848633,
0.08867382258176804,
0.06573153287172318,
0.07074316591024399,
-0.003640894778072834,
-0.0038010072894394398,
-0.030968252569437027,
-0.04183557629585266,
0.07045558094978333,
0.03208712115883827,
0.049629174172878265,
-0.08678409457206726,
-0.03571643307805061,
-0.07462336122989655,
0.036292847245931625,
-0.031017925590276718,
-0.07833100110292435,
-0.059337154030799866,
0.07657543569803238,
0.05687790364027023,
-0.03166941925883293,
0.03807096183300018,
0.03313957899808884,
0.09661182016134262,
0.14717832207679749,
0.0001542294630780816,
-0.04149924963712692,
-0.06462923437356949,
-0.02846801094710827,
-0.012297544628381729,
0.0756840705871582,
-0.04942167177796364,
0.01428209524601698,
0.07092490792274475,
0.023142671212553978,
0.10669635981321335,
0.061547376215457916,
-0.12461112439632416,
-0.019339995458722115,
0.024490011855959892,
-0.15345799922943115,
0.006987069267779589,
0.00141492101829499,
0.021230805665254593,
-0.02039210870862007,
0.02601919323205948,
0.1489088237285614,
-0.06770075112581253,
-0.031167689710855484,
-0.04623635858297348,
0.06060301885008812,
0.036383770406246185,
0.14589017629623413,
0.03991999849677086,
0.03797692060470581,
-0.07716815918684006,
0.14324216544628143,
0.04318513348698616,
-0.04416590929031372,
0.029398208484053612,
-0.029262827709317207,
-0.10882614552974701,
0.0152134383097291,
0.062926284968853,
0.08037369698286057,
-0.07796648889780045,
-0.018363917246460915,
-0.04327411577105522,
-0.07628436386585236,
0.07124147564172745,
0.21514174342155457,
0.06499069184064865,
0.06508889049291611,
-0.05140422657132149,
-0.03468471020460129,
-0.0774751603603363,
0.05050094425678253,
0.04782776162028313,
0.08024798333644867,
-0.07228322327136993,
0.09740503877401352,
0.013063928112387657,
0.05336511880159378,
-0.026821454986929893,
-0.048257727175951004,
-0.10375630855560303,
-0.055822815746068954,
-0.1051839143037796,
0.014854099601507187,
-0.06183589994907379,
-0.042999204248189926,
0.006934172008186579,
-0.0026632477529346943,
-0.00406628055498004,
0.05910280719399452,
-0.06276724487543106,
-0.01169500034302473,
-0.013453289866447449,
0.03471071645617485,
-0.06304829567670822,
-0.05249917134642601,
0.020909924060106277,
-0.09663266688585281,
0.09829840809106827,
0.04877830669283867,
0.01241760142147541,
0.002139688702300191,
0.07021753489971161,
-0.011389642022550106,
0.022486431524157524,
0.008240794762969017,
-0.04000142216682434,
-0.10159569978713989,
0.005688523408025503,
-0.026996778324246407,
-0.032046087086200714,
-0.021230867132544518,
0.09078510105609894,
-0.08148796856403351,
0.03334842622280121,
-0.0002496758825145662,
-0.004113363102078438,
-0.08078654855489731,
-0.003200831124559045,
0.09135138988494873,
0.08501122891902924,
0.05301715061068535,
-0.08093097805976868,
0.014164519496262074,
-0.1238718256354332,
-0.035253945738077164,
0.014067343436181545,
-0.01585252210497856,
-0.13322941958904266,
-0.013393901288509369,
0.018366152420639992,
-0.014654234983026981,
0.18907050788402557,
-0.05715026706457138,
-0.026943281292915344,
0.01951976679265499,
-0.09378353506326675,
0.10545334219932556,
-0.024056650698184967,
0.1671854853630066,
-0.02814616821706295,
-0.03463674336671829,
-0.013647381216287613,
0.04768446087837219,
0.027483530342578888,
-0.004349986091256142,
0.1857282966375351,
0.13416142761707306,
0.03677094727754593,
0.05582117289304733,
-0.023951973766088486,
-0.010859931819140911,
-0.06472470611333847,
-0.01329893246293068,
0.041749294847249985,
0.03482195734977722,
0.023493412882089615,
0.15578238666057587,
0.06685996800661087,
-0.15979716181755066,
0.03662264347076416,
-0.026398055255413055,
-0.039917800575494766,
-0.11793601512908936,
-0.09991847723722458,
-0.03298221901059151,
-0.06110088899731636,
0.01733577623963356,
-0.1332009881734848,
0.004516418091952801,
0.1794167011976242,
0.06686539202928543,
0.031322527676820755,
0.019074678421020508,
-0.1298086941242218,
-0.0456552617251873,
0.05642150342464447,
0.011277624405920506,
0.018848221749067307,
0.041210711002349854,
-0.0037869312800467014,
0.06833575665950775,
0.02443169616162777,
0.002809242345392704,
-0.005709751509130001,
0.08244863152503967,
0.02290423959493637,
0.04790954664349556,
-0.05507298931479454,
-0.0003694651531986892,
-0.03949490934610367,
0.08080420643091202,
0.11478415131568909,
0.043877825140953064,
-0.04880524426698685,
-0.011957063339650631,
0.15827372670173645,
-0.028904639184474945,
0.007384348660707474,
-0.11778006702661514,
0.32813897728919983,
0.015541290864348412,
0.012243704870343208,
0.05788329616189003,
-0.08137480169534683,
-0.04444168135523796,
0.20148642361164093,
0.06636948138475418,
-0.020992238074541092,
-0.028595900163054466,
0.004581878427416086,
-0.03079095482826233,
-0.01610264740884304,
0.13678039610385895,
0.04465547949075699,
0.12259301543235779,
-0.05861030891537666,
-0.0502079539000988,
-0.034678857773542404,
0.0009539465536363423,
-0.11145613342523575,
0.14519855380058289,
-0.015899434685707092,
-0.02230199985206127,
-0.07858528941869736,
0.010315604507923126,
0.07415742427110672,
-0.34685322642326355,
0.007143449038267136,
-0.029423745349049568,
-0.10463905334472656,
-0.01437345240265131,
-0.028565898537635803,
-0.025618936866521835,
0.048537686467170715,
-0.04154593497514725,
0.06468124687671661,
0.039858829230070114,
0.039150964468717575,
-0.023910347372293472,
-0.09857388585805893,
0.16136124730110168,
0.05848860740661621,
0.09964106231927872,
0.01689922623336315,
0.08938907086849213,
0.06292770802974701,
0.03584703058004379,
-0.09090005606412888,
0.05610133334994316,
0.010461468249559402,
-0.06846627593040466,
-0.056716687977313995,
0.11958202719688416,
-0.0019703723955899477,
0.0655258521437645,
0.032009318470954895,
-0.12344153970479965,
0.02662232145667076,
0.0847957581281662,
-0.08883900195360184,
-0.09747431427240372,
-0.0006947204819880426,
-0.09943666309118271,
0.15787586569786072,
0.14750976860523224,
-0.009216558188199997,
0.0164104625582695,
-0.06684304773807526,
-0.007805947680026293,
0.053079817444086075,
0.006072806194424629,
-0.014414946548640728,
-0.18120767176151276,
0.05380510538816452,
-0.08136195689439774,
-0.002794471103698015,
-0.2172023057937622,
-0.1003027856349945,
-0.013746035285294056,
-0.054573629051446915,
-0.024859366938471794,
0.0540778674185276,
0.027569448575377464,
0.07627774775028229,
-0.02641751617193222,
-0.025261860340833664,
-0.03848112002015114,
0.09306695312261581,
-0.10360381752252579,
-0.07470571249723434
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 700k (max: 2000k, i.e., 2M steps).
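Because the collection exposes many intermediate checkpoints, a common workflow is to sweep over several of them and track how a quantity of interest evolves during pre-training. The sketch below is not part of the original release notes; it assumes the repository ids follow the `google/multiberts-seed_<seed>-step_<step>k` pattern used by this card, and the two step values are merely examples drawn from this collection, not an exhaustive list.
```
from transformers import BertModel

seed = 1
for step_k in [700, 800]:  # example steps only; the release provides many more
    ckpt = f"google/multiberts-seed_{seed}-step_{step_k}k"
    model = BertModel.from_pretrained(ckpt)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{ckpt}: {n_params} parameters")
```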
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_700k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_700k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
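Since the Model Description above mentions both pretraining objectives (MLM and NSP), it can also be useful to load the checkpoint with its pretraining heads attached rather than as a bare encoder. The following is a minimal sketch, not taken from the original card; it assumes the released weights include both heads (if they do not, `from_pretrained` will initialize the missing parameters randomly and print a warning).
```
import torch
from transformers import BertTokenizer, BertForPreTraining

name = "google/multiberts-seed_1-step_700k"
tokenizer = BertTokenizer.from_pretrained(name)
pretraining_model = BertForPreTraining.from_pretrained(name)

inputs = tokenizer("The cat sat on the mat.", "It was a sunny day.", return_tensors='pt')
with torch.no_grad():
    outputs = pretraining_model(**inputs)

# MLM scores over the vocabulary for every input position
print(outputs.prediction_logits.shape)        # (1, sequence_length, vocab_size)
# 2-way NSP scores (is-next vs. not-next) for the sentence pair
print(outputs.seq_relationship_logits.shape)  # (1, 2)
```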
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_700k"]}
| null |
google/multiberts-seed_1-step_700k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_700k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 700k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08157617598772049,
0.06710530072450638,
-0.0019776683766394854,
0.04452967643737793,
0.07396359741687775,
-0.021139303222298622,
0.06199365854263306,
0.09382341802120209,
-0.01178874634206295,
0.02170053869485855,
0.08637015521526337,
0.02478933334350586,
0.011950739659368992,
0.10218696296215057,
0.0210028737783432,
-0.21500886976718903,
0.032858435064554214,
-0.026659496128559113,
-0.08258480578660965,
0.07522749900817871,
0.10406926274299622,
-0.08530315011739731,
0.04209721460938454,
0.03121134266257286,
-0.10828429460525513,
0.053833283483982086,
-0.010430981405079365,
-0.025764860212802887,
0.1364383101463318,
0.0019383836770430207,
0.049148522317409515,
0.05789190158247948,
0.04907769709825516,
-0.13409169018268585,
0.003988850861787796,
0.05642532557249069,
0.050412341952323914,
0.03722309321165085,
0.02122248150408268,
0.08641604334115982,
-0.026796190068125725,
0.03469673544168472,
0.051746249198913574,
0.01760672591626644,
-0.067417211830616,
-0.0695914775133133,
-0.09700819849967957,
0.03493789955973625,
0.03440667316317558,
0.020897086709737778,
0.007114310283213854,
0.11871767789125443,
-0.03575875237584114,
0.045995526015758514,
0.16912347078323364,
-0.3080558776855469,
-0.004341209307312965,
0.06997925788164139,
0.021016158163547516,
0.11776464432477951,
-0.004398454446345568,
-0.03174088895320892,
0.08180373162031174,
0.020280862227082253,
0.09737616032361984,
-0.03728917986154556,
0.024537283927202225,
-0.059483129531145096,
-0.15665322542190552,
-0.039568111300468445,
0.0965147539973259,
0.009242772124707699,
-0.13943375647068024,
-0.02270093746483326,
-0.045447688549757004,
0.03450712934136391,
0.019232235848903656,
-0.04376424103975296,
0.04191235452890396,
0.004994016140699387,
-0.008096901699900627,
-0.0036396088544279337,
-0.10136406868696213,
-0.04699970781803131,
0.024661751464009285,
0.08678334951400757,
0.10652075707912445,
0.058453842997550964,
-0.003015659749507904,
0.10997749865055084,
-0.19429868459701538,
-0.04773252457380295,
-0.031839072704315186,
-0.03592710569500923,
-0.04738616198301315,
-0.012274407781660557,
-0.09919356554746628,
-0.04586021974682808,
-0.002885257126763463,
0.13666552305221558,
-0.0056604137644171715,
0.034330300986766815,
-0.025852374732494354,
0.0055634621530771255,
0.05808498337864876,
0.05617730692028999,
-0.022418882697820663,
0.01466814149171114,
0.036108411848545074,
-0.01155631709843874,
-0.018375780433416367,
0.009285476990044117,
-0.0008777235634624958,
0.023257974535226822,
0.14027689397335052,
0.01101087685674429,
-0.10159668326377869,
0.0747002512216568,
-0.015171893872320652,
-0.04677566513419151,
0.0028696211520582438,
-0.08342944830656052,
-0.06067671999335289,
-0.04293151572346687,
-0.01716822199523449,
0.0018805601866915822,
0.0018419804982841015,
-0.006775161251425743,
-0.026509560644626617,
-0.019543631002306938,
-0.0900179073214531,
-0.06093905121088028,
-0.056890517473220825,
-0.13349147140979767,
0.006749434396624565,
-0.1863899677991867,
-0.029897063970565796,
-0.11422222852706909,
-0.21108631789684296,
-0.03837347775697708,
0.0462162084877491,
0.00039418876986019313,
-0.06504559516906738,
0.060102976858615875,
0.03606180101633072,
-0.03131359815597534,
-0.003648655489087105,
0.08504381030797958,
-0.00518984068185091,
0.03915633633732796,
-0.038270220160484314,
0.060600075870752335,
0.004246297758072615,
0.040909793227910995,
-0.06126122549176216,
0.05843149870634079,
-0.17718006670475006,
0.041744474321603775,
-0.07487763464450836,
-0.02840028703212738,
-0.08700808882713318,
-0.035684484988451004,
-0.010454715229570866,
0.008575033396482468,
0.028000975027680397,
0.07255488634109497,
-0.17015299201011658,
-0.025940513238310814,
0.08633159846067429,
-0.1521526575088501,
-0.029505416750907898,
0.0774795413017273,
-0.05343739688396454,
0.11466086655855179,
0.06541378796100616,
0.1614627242088318,
-0.031230038031935692,
-0.06551244854927063,
0.04881052300333977,
-0.01385569293051958,
0.01057867705821991,
-0.012437896803021431,
0.0657695084810257,
-0.019094396382570267,
-0.16820980608463287,
0.024720793589949608,
-0.12456098198890686,
-0.0005353066953830421,
-0.08055863529443741,
0.02779543586075306,
-0.005003368016332388,
-0.06995927542448044,
-0.08376956731081009,
-0.03239678218960762,
0.0769500881433487,
-0.07024508714675903,
-0.024104507640004158,
0.02744387648999691,
0.07674699276685715,
-0.07087350636720657,
0.0673362985253334,
-0.015459742397069931,
0.02044670470058918,
-0.07936394214630127,
-0.0352596752345562,
-0.18063583970069885,
0.03622344136238098,
0.09808071702718735,
0.021434545516967773,
-0.022719917818903923,
0.11998820304870605,
-0.006522681564092636,
0.06647767126560211,
-0.04422694072127342,
-0.005775707308202982,
-0.010893798433244228,
0.0006122642662376165,
-0.09743067622184753,
-0.10423921048641205,
-0.07816950231790543,
-0.06760083884000778,
0.09125127643346786,
-0.11801949143409729,
0.022500958293676376,
-0.05971271172165871,
0.04340917617082596,
0.016755810007452965,
-0.07279931008815765,
-0.01155468076467514,
0.014792852103710175,
-0.06286244094371796,
-0.058799900114536285,
0.0350557379424572,
0.05891287699341774,
-0.02512863092124462,
0.08644974231719971,
-0.04853849485516548,
-0.07781161367893219,
0.026885436847805977,
0.07373863458633423,
-0.10533903539180756,
0.021328171715140343,
-0.04770417883992195,
-0.04886539280414581,
-0.06989026069641113,
-0.03321697562932968,
0.09352312237024307,
-0.014112942852079868,
0.14314033091068268,
-0.07837481796741486,
-0.008579934947192669,
0.011861728504300117,
-0.01163396518677473,
-0.026056542992591858,
0.04336336627602577,
0.06906316429376602,
-0.08155065774917603,
0.021594302728772163,
0.03373020142316818,
0.0006492497632279992,
0.06613411009311676,
-0.04930444806814194,
-0.07470986247062683,
0.021932130679488182,
0.03187733516097069,
0.022906512022018433,
0.062288798391819,
-0.05000430718064308,
-0.008113490417599678,
0.028682703152298927,
0.025840600952506065,
0.010383940301835537,
-0.11876549571752548,
0.06000960245728493,
0.06042075157165527,
0.008841117843985558,
0.05278012529015541,
-0.01720941811800003,
-0.03491710498929024,
0.07956749945878983,
0.030453240498900414,
-0.019929470494389534,
-0.007467154413461685,
-0.011514262296259403,
-0.12132307142019272,
0.21856246888637543,
-0.06451108306646347,
-0.14779900014400482,
-0.07029808312654495,
-0.10925410687923431,
-0.000021074247342767194,
0.024083521217107773,
0.04299435392022133,
-0.02599535509943962,
-0.04287377744913101,
-0.12670427560806274,
0.09381783753633499,
-0.04099442437291145,
0.06748376786708832,
0.11403408646583557,
-0.0648944154381752,
0.04806586727499962,
-0.13276070356369019,
-0.012206335552036762,
-0.07756194472312927,
-0.06644434481859207,
0.05702585354447365,
-0.05519581958651543,
0.03768046572804451,
0.1102878600358963,
0.016577545553445816,
-0.026506736874580383,
-0.0316816121339798,
0.21529658138751984,
0.041536618024110794,
0.04331285506486893,
0.12824533879756927,
-0.06871949136257172,
0.05317842960357666,
0.08010111004114151,
0.005988978780806065,
-0.04988045617938042,
0.055360786616802216,
0.05360573157668114,
-0.06256350874900818,
-0.18733835220336914,
-0.01053977757692337,
0.006390952505171299,
-0.040253277868032455,
0.07097075134515762,
0.03898368403315544,
0.017103316262364388,
0.07551204413175583,
0.0175445768982172,
0.06633570790290833,
0.004967178218066692,
0.10184972733259201,
0.030941566452383995,
-0.03621270880103111,
0.08861909061670303,
-0.004252449609339237,
-0.0057608275674283504,
0.07530530542135239,
-0.016932807862758636,
0.2959033250808716,
-0.04208402708172798,
0.01785069890320301,
0.12156642228364944,
0.04022087901830673,
0.051090482622385025,
0.12530264258384705,
-0.07827667891979218,
0.02522028423845768,
-0.0759926438331604,
-0.04589688032865524,
0.014203806407749653,
0.04634924605488777,
-0.06925816833972931,
0.020205989480018616,
-0.08657577633857727,
0.022677531465888023,
-0.02631647139787674,
0.298104465007782,
0.1019260585308075,
-0.11358395963907242,
-0.05881783366203308,
0.0019226388540118933,
-0.10264997184276581,
-0.07289455085992813,
0.05118028074502945,
0.048490364104509354,
-0.13410912454128265,
0.004172721412032843,
-0.01941174827516079,
0.07928820699453354,
-0.02647515945136547,
0.015674935653805733,
0.03490522876381874,
0.05234076827764511,
-0.04723038151860237,
0.007527287118136883,
-0.17463387548923492,
0.19927175343036652,
-0.0021775015629827976,
0.02538049779832363,
-0.05547889322042465,
0.026321660727262497,
0.0068818349391222,
-0.011094820685684681,
0.06289252638816833,
0.018996739760041237,
-0.0071289376355707645,
-0.06707293540239334,
-0.044346217066049576,
0.012035302817821503,
0.06531932950019836,
-0.04029970243573189,
0.10279153287410736,
0.007231771945953369,
0.05341507866978645,
0.030188310891389847,
0.09745638817548752,
-0.18821214139461517,
-0.07664678245782852,
0.025997662916779518,
-0.04704417288303375,
-0.09213800728321075,
-0.08255898952484131,
-0.09743352979421616,
-0.005896671209484339,
0.22594942152500153,
-0.10526483505964279,
-0.07125325500965118,
-0.0936078280210495,
0.04908597469329834,
0.10102061182260513,
-0.05303424596786499,
0.025693684816360474,
-0.008301345631480217,
0.10640463978052139,
-0.06755764037370682,
-0.11902015656232834,
0.028751272708177567,
-0.10006365180015564,
-0.16226711869239807,
-0.06810241937637329,
0.08857187628746033,
0.06168489158153534,
0.02988946996629238,
-0.028538063168525696,
0.01101529598236084,
0.039798591285943985,
-0.03899747505784035,
-0.00046413842937909067,
0.059129081666469574,
0.08154324442148209,
0.035681646317243576,
-0.10153461247682571,
0.026104876771569252,
-0.07042435556650162,
-0.07120432704687119,
0.07005646824836731,
0.27666884660720825,
-0.0500982329249382,
0.11003844439983368,
0.12593629956245422,
-0.08519292622804642,
-0.1576005071401596,
0.04281352087855339,
0.09374333918094635,
-0.014037790708243847,
0.0009139092289842665,
-0.15711472928524017,
0.10340873152017593,
0.12019672989845276,
-0.01743434928357601,
0.00034687950392253697,
-0.2032485157251358,
-0.13977140188217163,
0.08804482966661453,
0.1133604422211647,
0.2875722050666809,
-0.051659949123859406,
-0.03459960222244263,
0.025178290903568268,
-0.0938585102558136,
0.007028636056929827,
0.13193415105342865,
0.06237691640853882,
-0.022686392068862915,
-0.07589983195066452,
0.011901904828846455,
-0.03738413751125336,
0.09349797666072845,
0.06307974457740784,
0.06978267431259155,
-0.002862007124349475,
0.0035545018035918474,
-0.032060492783784866,
-0.03928975760936737,
0.068058542907238,
0.04160987213253975,
0.05112532153725624,
-0.08943810313940048,
-0.03824201226234436,
-0.07422585040330887,
0.03152571618556976,
-0.030890004709362984,
-0.07958797365427017,
-0.060303352773189545,
0.07727058231830597,
0.05873265489935875,
-0.03449090197682381,
0.03811167553067207,
0.034050390124320984,
0.10034757852554321,
0.15733271837234497,
-0.002185074845328927,
-0.05588259920477867,
-0.060234177857637405,
-0.026776457205414772,
-0.015434673056006432,
0.07574024796485901,
-0.035996466875076294,
0.01751595363020897,
0.07150353491306305,
0.02054297737777233,
0.10474903881549835,
0.06211162731051445,
-0.12004708498716354,
-0.018105151131749153,
0.02717691659927368,
-0.15052244067192078,
0.0026193098165094852,
-0.0014394544996321201,
0.019007166847586632,
-0.021524790674448013,
0.024944987148046494,
0.14548441767692566,
-0.07066643238067627,
-0.030975932255387306,
-0.04604572802782059,
0.05987929552793503,
0.03284694626927376,
0.15119151771068573,
0.037899259477853775,
0.038307074457407,
-0.07634624093770981,
0.1412484049797058,
0.0475105419754982,
-0.044570229947566986,
0.024239493533968925,
-0.027834607288241386,
-0.10879825800657272,
0.017669107764959335,
0.054126087576150894,
0.06755118817090988,
-0.07750418037176132,
-0.014210307039320469,
-0.03696981444954872,
-0.07795252650976181,
0.06836213916540146,
0.20734523236751556,
0.06490126252174377,
0.0647631287574768,
-0.05253634974360466,
-0.034613169729709625,
-0.07720516622066498,
0.04777492582798004,
0.05483631044626236,
0.08054951578378677,
-0.07360267639160156,
0.09344518184661865,
0.012489719316363335,
0.05028732120990753,
-0.027328625321388245,
-0.051813263446092606,
-0.10502707958221436,
-0.05487746745347977,
-0.10446106642484665,
0.009751716628670692,
-0.06439600139856339,
-0.0413227453827858,
0.005936646834015846,
-0.005111316684633493,
-0.0035554233472794294,
0.05597969517111778,
-0.06007053703069687,
-0.013415442779660225,
-0.012857913039624691,
0.03263775631785393,
-0.06009151414036751,
-0.05253821983933449,
0.020464202389121056,
-0.09397394210100174,
0.09887818247079849,
0.04356543347239494,
0.013126993551850319,
-0.002356823068112135,
0.07837363332509995,
-0.008045690134167671,
0.023038452491164207,
0.009411415085196495,
-0.03912476822733879,
-0.09796416759490967,
0.006354765500873327,
-0.02416592836380005,
-0.03602981939911842,
-0.018746592104434967,
0.08810539543628693,
-0.08536304533481598,
0.02572598122060299,
-0.003586052218452096,
-0.007236979436129332,
-0.0788554921746254,
-0.004583938978612423,
0.09284185618162155,
0.08337946981191635,
0.05088995024561882,
-0.08282386511564255,
0.014014932326972485,
-0.1281794011592865,
-0.03478105738759041,
0.013510122895240784,
-0.01543227769434452,
-0.12603817880153656,
-0.011114966124296188,
0.020665213465690613,
-0.010695046745240688,
0.187871515750885,
-0.059784237295389175,
-0.025673897936940193,
0.02064337581396103,
-0.09805313497781754,
0.10632769763469696,
-0.02417626604437828,
0.17458173632621765,
-0.02677825465798378,
-0.035646479576826096,
-0.01176438294351101,
0.047706488519907,
0.028950493782758713,
0.003279896918684244,
0.1817110925912857,
0.13037607073783875,
0.034811533987522125,
0.054612401872873306,
-0.02894468791782856,
-0.008932956494390965,
-0.0637178048491478,
-0.024524182081222534,
0.04316428676247597,
0.03589851036667824,
0.020668471232056618,
0.14129452407360077,
0.06449227780103683,
-0.16143997013568878,
0.03429517149925232,
-0.02463744953274727,
-0.043098051100969315,
-0.11615528166294098,
-0.10267554968595505,
-0.03204261139035225,
-0.06790880113840103,
0.018175307661294937,
-0.13521529734134674,
0.006699742749333382,
0.1667821705341339,
0.06822313368320465,
0.03009076416492462,
0.019820092245936394,
-0.12743578851222992,
-0.04037024825811386,
0.05661243200302124,
0.012721610255539417,
0.02166518196463585,
0.04417170584201813,
-0.0059572444297373295,
0.06307771801948547,
0.023845957592129707,
0.0014134322991594672,
-0.005021181423217058,
0.07650264352560043,
0.0218593031167984,
0.05047643929719925,
-0.055960576981306076,
-0.001247661653906107,
-0.04255448654294014,
0.08197904378175735,
0.11989234387874603,
0.04339075833559036,
-0.0479046106338501,
-0.013798895291984081,
0.16258040070533752,
-0.029145194217562675,
0.005351064261049032,
-0.11731556057929993,
0.3319298326969147,
0.01885191723704338,
0.011893855407834053,
0.05929602310061455,
-0.081080861389637,
-0.04632582515478134,
0.20629553496837616,
0.0683295950293541,
-0.019328823313117027,
-0.028759196400642395,
0.0041273669339716434,
-0.033139195293188095,
-0.015642797574400902,
0.1435442566871643,
0.042481839656829834,
0.1204146221280098,
-0.05717222020030022,
-0.03997371718287468,
-0.035659294575452805,
-0.001137827755883336,
-0.1126573383808136,
0.14760756492614746,
-0.015091092325747013,
-0.01996101625263691,
-0.08152558654546738,
0.013766479678452015,
0.07232879102230072,
-0.33412325382232666,
0.00679729413241148,
-0.028713621199131012,
-0.10358638316392899,
-0.014473461546003819,
-0.03102291002869606,
-0.02746395952999592,
0.0486619807779789,
-0.0400078222155571,
0.0687607005238533,
0.037189971655607224,
0.03780543804168701,
-0.019870491698384285,
-0.1057153046131134,
0.16186699271202087,
0.06805869936943054,
0.10383739322423935,
0.01783023029565811,
0.08733665943145752,
0.06331652402877808,
0.03708493337035179,
-0.09565666317939758,
0.05533122643828392,
0.009728486649692059,
-0.06537782400846481,
-0.05589764192700386,
0.11982894688844681,
0.001287351711653173,
0.06844513863325119,
0.027905786409974098,
-0.12746939063072205,
0.022968292236328125,
0.0789678767323494,
-0.0816785916686058,
-0.09507851302623749,
-0.0022392617538571358,
-0.09518834948539734,
0.16018199920654297,
0.14422523975372314,
-0.013467364944517612,
0.014069969765841961,
-0.06304407864809036,
-0.007449669763445854,
0.05199228972196579,
0.007948797196149826,
-0.013825557194650173,
-0.1870667189359665,
0.05156174674630165,
-0.08713026344776154,
-0.006654584780335426,
-0.2072272151708603,
-0.10523761808872223,
-0.01460009440779686,
-0.05678071081638336,
-0.025354916229844093,
0.06059379503130913,
0.026517076417803764,
0.07138480246067047,
-0.023906106129288673,
-0.02355560101568699,
-0.041219621896743774,
0.09334291517734528,
-0.10385113209486008,
-0.07485447078943253
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_800k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_800k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
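As a small follow-up to the PyTorch example above (not part of the original card), the `model` and `encoded_input` already defined there can be reused to extract sentence-level features, for instance when probing intermediate checkpoints:
```
import torch

with torch.no_grad():
    outputs = model(**encoded_input)

# Hidden state of the [CLS] token: shape (batch_size, hidden_size) = (1, 768)
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)
```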
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_800k"]}
| null |
google/multiberts-seed_1-step_800k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_800k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 800k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0810418576002121,
0.07919927686452866,
-0.0020004762336611748,
0.04640015959739685,
0.07849084585905075,
-0.019400477409362793,
0.06824496388435364,
0.09294040501117706,
-0.005992456339299679,
0.022900249809026718,
0.0819934606552124,
0.020234452560544014,
0.011494963429868221,
0.09605499356985092,
0.021993586793541908,
-0.2112894207239151,
0.029686549678444862,
-0.025570321828126907,
-0.07943053543567657,
0.07292980700731277,
0.10296764969825745,
-0.08426014333963394,
0.04322522506117821,
0.0313417948782444,
-0.11171042174100876,
0.05485739931464195,
-0.011776326224207878,
-0.02458307519555092,
0.13776789605617523,
0.0023092608898878098,
0.05384409427642822,
0.0556676909327507,
0.04763195291161537,
-0.13686740398406982,
0.003336947876960039,
0.05459688976407051,
0.051092516630887985,
0.036392856389284134,
0.022927749902009964,
0.08322721719741821,
-0.02242748998105526,
0.038195692002773285,
0.0531051941215992,
0.01590130291879177,
-0.06370598822832108,
-0.06120121479034424,
-0.09633105993270874,
0.02938334085047245,
0.034463852643966675,
0.022884326055645943,
0.00866191927343607,
0.11743814498186111,
-0.03367847204208374,
0.04612372815608978,
0.17103253304958344,
-0.3073127567768097,
-0.001189535018056631,
0.0575856938958168,
0.0165439173579216,
0.11769060045480728,
-0.0020164961460977793,
-0.036495957523584366,
0.08082559704780579,
0.024502724409103394,
0.09248948097229004,
-0.03852783888578415,
0.012938561849296093,
-0.06088804081082344,
-0.15464036166667938,
-0.03931652009487152,
0.09450411051511765,
0.00461924634873867,
-0.13836044073104858,
-0.020690729841589928,
-0.044143639504909515,
0.02977730520069599,
0.020790472626686096,
-0.039408788084983826,
0.04491802304983139,
0.0035729564260691404,
-0.00307806977070868,
0.0009906369959935546,
-0.10271003842353821,
-0.04375084862112999,
0.02242395654320717,
0.08622054010629654,
0.10856799781322479,
0.058430593460798264,
-0.0035113138146698475,
0.11016510426998138,
-0.18878255784511566,
-0.04846017062664032,
-0.028764931485056877,
-0.03051592968404293,
-0.043223362416028976,
-0.009706007316708565,
-0.1013626828789711,
-0.04638619348406792,
-0.0012587063247337937,
0.13733185827732086,
-0.012460908852517605,
0.03358215093612671,
-0.024181662127375603,
0.007037730887532234,
0.05857188627123833,
0.05604361370205879,
-0.020687522366642952,
0.019798796623945236,
0.03384812921285629,
-0.012373887933790684,
-0.020100731402635574,
0.008830335922539234,
-0.0035278797149658203,
0.028326965868473053,
0.14149588346481323,
0.010949915274977684,
-0.10364257544279099,
0.07191377878189087,
-0.016040531918406487,
-0.04565016180276871,
-0.003905107267200947,
-0.08789040893316269,
-0.06269218772649765,
-0.04309328272938728,
-0.015580004081130028,
0.004785125143826008,
0.007357588037848473,
-0.005065948236733675,
-0.028186140581965446,
-0.01866450160741806,
-0.08858487010002136,
-0.05928311496973038,
-0.05606461316347122,
-0.13151274621486664,
0.00880941841751337,
-0.1929192841053009,
-0.02616295963525772,
-0.117439404129982,
-0.20156323909759521,
-0.03946372866630554,
0.047954265028238297,
0.003143730340525508,
-0.0647435411810875,
0.058551739901304245,
0.03674072399735451,
-0.032066166400909424,
-0.0036323070526123047,
0.08237547427415848,
-0.006094621494412422,
0.037323299795389175,
-0.03968847915530205,
0.05795884132385254,
0.0019759417045861483,
0.044768210500478745,
-0.05972805246710777,
0.055964983999729156,
-0.17396226525306702,
0.039563052356243134,
-0.0720183476805687,
-0.026744013652205467,
-0.08389218151569366,
-0.03058718703687191,
-0.001261589815840125,
0.013734179548919201,
0.02574795112013817,
0.0718313604593277,
-0.16562259197235107,
-0.02791881188750267,
0.08235769718885422,
-0.15177224576473236,
-0.028104716911911964,
0.0745011419057846,
-0.05380775406956673,
0.11084479838609695,
0.06648357212543488,
0.16109901666641235,
-0.029606951400637627,
-0.06249173358082771,
0.04575193673372269,
-0.014275387860834599,
0.01336474996060133,
-0.014129475690424442,
0.06767018139362335,
-0.02026274986565113,
-0.16312864422798157,
0.02576323039829731,
-0.127947598695755,
0.001567543251439929,
-0.07764097303152084,
0.029495244845747948,
-0.0034982822835445404,
-0.07063157856464386,
-0.08909566700458527,
-0.03294958174228668,
0.07594410330057144,
-0.0696442574262619,
-0.025517011061310768,
0.030524497851729393,
0.07692687958478928,
-0.07003170251846313,
0.06674909591674805,
-0.01718907617032528,
0.017075827345252037,
-0.0794355571269989,
-0.03833484649658203,
-0.1838248074054718,
0.03626936674118042,
0.09666978567838669,
0.01977541483938694,
-0.021934470161795616,
0.12548018991947174,
-0.00889960303902626,
0.06624498218297958,
-0.03927837312221527,
-0.004416159354150295,
-0.009997637011110783,
-0.00022399923182092607,
-0.0989377424120903,
-0.10394862294197083,
-0.07451563328504562,
-0.06886931508779526,
0.09284771978855133,
-0.1153632253408432,
0.021469950675964355,
-0.05640103667974472,
0.04217419773340225,
0.016919195652008057,
-0.07169344276189804,
-0.01269152294844389,
0.016172943636775017,
-0.06491811573505402,
-0.05894256383180618,
0.03452380746603012,
0.06136466935276985,
-0.023020965978503227,
0.09003123641014099,
-0.05162564665079117,
-0.08024046570062637,
0.02694631554186344,
0.07685490697622299,
-0.10388562828302383,
0.02593223750591278,
-0.04718306288123131,
-0.04956308379769325,
-0.06627871841192245,
-0.029989905655384064,
0.10710573941469193,
-0.012927214615046978,
0.14530077576637268,
-0.07511680573225021,
-0.008790251798927784,
0.009568086825311184,
-0.012043326161801815,
-0.02601034939289093,
0.04701145365834236,
0.06916983425617218,
-0.06633428484201431,
0.02081765979528427,
0.030793912708759308,
-0.001056145061738789,
0.06625937670469284,
-0.05026866868138313,
-0.07696371525526047,
0.023188771679997444,
0.030861282721161842,
0.022611891850829124,
0.06023530289530754,
-0.04827820509672165,
-0.01145324669778347,
0.030457988381385803,
0.02318844571709633,
0.013513581827282906,
-0.12036579102277756,
0.06208948791027069,
0.06202869117259979,
0.011956816539168358,
0.05354835093021393,
-0.022790614515542984,
-0.03424336761236191,
0.08093633502721786,
0.028921877965331078,
-0.020771905779838562,
-0.009577780961990356,
-0.012764252722263336,
-0.12383861839771271,
0.21701614558696747,
-0.0655592530965805,
-0.14787405729293823,
-0.07435855269432068,
-0.11280213296413422,
0.003981518093496561,
0.024732861667871475,
0.04346972703933716,
-0.0255839005112648,
-0.04300630837678909,
-0.127405047416687,
0.09082556515932083,
-0.03821801394224167,
0.06579604744911194,
0.11556107550859451,
-0.06362923979759216,
0.047293584793806076,
-0.13114891946315765,
-0.01391118485480547,
-0.0757058784365654,
-0.06823406368494034,
0.05933380126953125,
-0.054667551070451736,
0.03904392197728157,
0.11468978971242905,
0.015604963526129723,
-0.029055163264274597,
-0.028970520943403244,
0.20205359160900116,
0.04075729101896286,
0.04170536249876022,
0.1287030726671219,
-0.07563121616840363,
0.05300871282815933,
0.0792899802327156,
0.004629733040928841,
-0.04580780863761902,
0.05451612174510956,
0.053820837289094925,
-0.06208314746618271,
-0.19169706106185913,
-0.008795609697699547,
0.009828737936913967,
-0.043932393193244934,
0.07248593866825104,
0.03977024555206299,
0.007989036850631237,
0.07653044164180756,
0.019625432789325714,
0.0674980878829956,
-0.0006801344570703804,
0.09816210716962814,
0.020424775779247284,
-0.036467473953962326,
0.0860598087310791,
-0.007519761100411415,
-0.007441877853125334,
0.07753393054008484,
-0.017409363761544228,
0.29565665125846863,
-0.04353176802396774,
0.016032638028264046,
0.12360592186450958,
0.0339541919529438,
0.05040457472205162,
0.12429177016019821,
-0.07688184082508087,
0.028082745149731636,
-0.07638796418905258,
-0.04574773088097572,
0.011465263552963734,
0.04896651580929756,
-0.07333651185035706,
0.02120768092572689,
-0.08881720900535583,
0.027939071878790855,
-0.02793167345225811,
0.30384156107902527,
0.10069058835506439,
-0.11520043015480042,
-0.056935831904411316,
0.0029098964296281338,
-0.1028139740228653,
-0.07283459603786469,
0.053638506680727005,
0.051544446498155594,
-0.13308154046535492,
0.0029019778594374657,
-0.01963135413825512,
0.07829158753156662,
-0.033867862075567245,
0.015261201187968254,
0.03983309492468834,
0.051501620560884476,
-0.04504357650876045,
0.007793323136866093,
-0.1837940216064453,
0.19451184570789337,
-0.0018312294268980622,
0.023607967421412468,
-0.0570758618414402,
0.029818031936883926,
0.009313279762864113,
-0.014017505571246147,
0.0623154491186142,
0.019274473190307617,
-0.01522643119096756,
-0.06622044742107391,
-0.04426221549510956,
0.016161596402525902,
0.061921752989292145,
-0.04410229250788689,
0.10604259371757507,
0.007686178665608168,
0.05549003556370735,
0.029300043359398842,
0.08450449258089066,
-0.1813739538192749,
-0.08206316828727722,
0.027936872094869614,
-0.04493973031640053,
-0.10040351003408432,
-0.0827493965625763,
-0.09730491042137146,
-0.003354500513523817,
0.2195788472890854,
-0.11403293162584305,
-0.07572266459465027,
-0.09279420226812363,
0.04253755509853363,
0.09629330039024353,
-0.05566243827342987,
0.024462854489684105,
-0.008020790293812752,
0.11069971323013306,
-0.0704297125339508,
-0.11956718564033508,
0.02756555564701557,
-0.09973247349262238,
-0.16007359325885773,
-0.066872738301754,
0.08750960230827332,
0.06399830430746078,
0.02971000038087368,
-0.03303748369216919,
0.011747719720005989,
0.03696111589670181,
-0.03641389310359955,
-0.0010368977673351765,
0.06302449852228165,
0.08434305340051651,
0.03802451491355896,
-0.10883744060993195,
0.023640282452106476,
-0.07291558384895325,
-0.07175607234239578,
0.0704479068517685,
0.2722029983997345,
-0.048826396465301514,
0.1082470640540123,
0.12502384185791016,
-0.08804141730070114,
-0.15737982094287872,
0.04284483194351196,
0.0932512879371643,
-0.01354245375841856,
0.0066996971145272255,
-0.16191036999225616,
0.10326575487852097,
0.11748789995908737,
-0.015434954315423965,
0.011422894895076752,
-0.19787369668483734,
-0.13647350668907166,
0.09303015470504761,
0.11197622120380402,
0.28299450874328613,
-0.05345867946743965,
-0.03536466881632805,
0.020853346213698387,
-0.09902097284793854,
0.000534209655597806,
0.1315751075744629,
0.06106371060013771,
-0.02231121063232422,
-0.06670080125331879,
0.012735304422676563,
-0.03693059831857681,
0.09089168161153793,
0.06733020395040512,
0.07024507969617844,
-0.004955712705850601,
-0.007370836101472378,
-0.04163428768515587,
-0.04181365668773651,
0.07045037299394608,
0.034752991050481796,
0.049141012132167816,
-0.09559574723243713,
-0.03636641427874565,
-0.07171161472797394,
0.032645538449287415,
-0.03012012504041195,
-0.07673697173595428,
-0.05811062082648277,
0.07430017739534378,
0.05735933780670166,
-0.03328954800963402,
0.03533933684229851,
0.03391318768262863,
0.09518272429704666,
0.15570300817489624,
-0.007021291181445122,
-0.05723101645708084,
-0.060141291469335556,
-0.030952362343668938,
-0.013951531611382961,
0.07390966266393661,
-0.04085327684879303,
0.017712783068418503,
0.0730743259191513,
0.01950184628367424,
0.10525532811880112,
0.061270903795957565,
-0.12162093073129654,
-0.017774514853954315,
0.02724655345082283,
-0.15162016451358795,
0.011673113331198692,
-0.003202596912160516,
0.026332242414355278,
-0.020876096561551094,
0.022712765261530876,
0.14892743527889252,
-0.07087008655071259,
-0.03136204928159714,
-0.046947453171014786,
0.0615856796503067,
0.034875936806201935,
0.14736731350421906,
0.03989069536328316,
0.037485215812921524,
-0.07696624100208282,
0.14674924314022064,
0.04583229124546051,
-0.04181938245892525,
0.02719518542289734,
-0.03136995807290077,
-0.10705319046974182,
0.016627667471766472,
0.05864308401942253,
0.075987309217453,
-0.07766484469175339,
-0.018252866342663765,
-0.042688339948654175,
-0.07888540625572205,
0.06767690926790237,
0.20284438133239746,
0.06462565064430237,
0.06531728804111481,
-0.052003778517246246,
-0.03589886054396629,
-0.07868637889623642,
0.04879307001829147,
0.05193185806274414,
0.08072948455810547,
-0.07303004711866379,
0.09851065278053284,
0.013876548036932945,
0.05250385031104088,
-0.025280578061938286,
-0.04906903952360153,
-0.10238028317689896,
-0.055339086800813675,
-0.10296130925416946,
0.012771843932569027,
-0.061436280608177185,
-0.04175283759832382,
0.007918165996670723,
-0.0031265397556126118,
-0.005029543302953243,
0.05739615857601166,
-0.061189889907836914,
-0.012753304094076157,
-0.015042673796415329,
0.03447360545396805,
-0.0577327162027359,
-0.05465563014149666,
0.020193280652165413,
-0.09518691152334213,
0.09791898727416992,
0.042625200003385544,
0.011232670396566391,
-0.0007793629774823785,
0.07727496325969696,
-0.010362077504396439,
0.02155407704412937,
0.008989173918962479,
-0.04046839103102684,
-0.09679196774959564,
0.005131987389177084,
-0.02648305706679821,
-0.03424805402755737,
-0.021088218316435814,
0.08856678009033203,
-0.08413881063461304,
0.02690451592206955,
-0.0004392676055431366,
-0.00589542742818594,
-0.07989819347858429,
-0.004766672383993864,
0.09598111361265182,
0.08212414383888245,
0.05242554098367691,
-0.08300165832042694,
0.016099397093057632,
-0.12512406706809998,
-0.034824810922145844,
0.014440517872571945,
-0.011993118561804295,
-0.12643520534038544,
-0.012624013237655163,
0.01791209727525711,
-0.012594521977007389,
0.19121438264846802,
-0.05881514027714729,
-0.026230545714497566,
0.020286478102207184,
-0.09289491176605225,
0.10446099191904068,
-0.022664839401841164,
0.16727544367313385,
-0.029346244409680367,
-0.03516411408782005,
-0.016244977712631226,
0.045509930700063705,
0.02955823764204979,
-0.00562475249171257,
0.1817125827074051,
0.1350800096988678,
0.038953594863414764,
0.056872788816690445,
-0.023893993347883224,
-0.007441042456775904,
-0.060151051729917526,
-0.024518834426999092,
0.04435846954584122,
0.03613150864839554,
0.023553060367703438,
0.15014596283435822,
0.062433548271656036,
-0.15952886641025543,
0.034579597413539886,
-0.025746723636984825,
-0.03918055444955826,
-0.11419820040464401,
-0.09200172126293182,
-0.03229738399386406,
-0.06210997700691223,
0.01541160512715578,
-0.13211184740066528,
0.0056059956550598145,
0.18213656544685364,
0.06534665077924728,
0.028017982840538025,
0.017667844891548157,
-0.13416136801242828,
-0.04116785153746605,
0.05944995582103729,
0.012769166380167007,
0.022446103394031525,
0.04374734312295914,
-0.005274823401123285,
0.06673675775527954,
0.02964426949620247,
0.0032000939827412367,
-0.005635470151901245,
0.07964801043272018,
0.02330099605023861,
0.049346163868904114,
-0.05690555274486542,
-0.0007263101288117468,
-0.044865988194942474,
0.08120822161436081,
0.11023972183465958,
0.042439963668584824,
-0.047387804836034775,
-0.012385874055325985,
0.1595923900604248,
-0.03054662048816681,
0.009327447973191738,
-0.11653616279363632,
0.3211197555065155,
0.01747528836131096,
0.011951359920203686,
0.05676710233092308,
-0.08263006061315536,
-0.045488256961107254,
0.20451077818870544,
0.06363902986049652,
-0.024254415184259415,
-0.03020629659295082,
0.0016849697567522526,
-0.03105228580534458,
-0.01604592241346836,
0.1417001187801361,
0.04223945364356041,
0.1174747496843338,
-0.05511094257235527,
-0.045640524476766586,
-0.03572028875350952,
0.0021988446824252605,
-0.11202052235603333,
0.14835964143276215,
-0.018064655363559723,
-0.018604589626193047,
-0.0799855962395668,
0.009464922361075878,
0.07747086137533188,
-0.33880358934402466,
0.005872196983546019,
-0.03070073388516903,
-0.10646075010299683,
-0.011727646924555302,
-0.024458875879645348,
-0.025428686290979385,
0.04965158924460411,
-0.04059062898159027,
0.06705118715763092,
0.04359125718474388,
0.03841346129775047,
-0.024066977202892303,
-0.10176155716180801,
0.15806081891059875,
0.06781568378210068,
0.10562331229448318,
0.018876081332564354,
0.08648399263620377,
0.06111882999539375,
0.038833264261484146,
-0.09374243021011353,
0.0557037778198719,
0.007462684530764818,
-0.06721299886703491,
-0.053980544209480286,
0.1155330240726471,
-0.0005170020740479231,
0.06092358008027077,
0.03196793794631958,
-0.12263590097427368,
0.023199142888188362,
0.0774802640080452,
-0.08644551783800125,
-0.0959780365228653,
-0.005403735209256411,
-0.09689312428236008,
0.15897859632968903,
0.14506590366363525,
-0.011679801158607006,
0.013257717713713646,
-0.06114199757575989,
-0.00780464569106698,
0.05045948550105095,
0.013009014539420605,
-0.01174623891711235,
-0.1834837943315506,
0.053318481892347336,
-0.08754956722259521,
-0.0042045884765684605,
-0.2164393961429596,
-0.10085943341255188,
-0.015344316139817238,
-0.05963529273867607,
-0.025890110060572624,
0.05938195437192917,
0.028440101072192192,
0.07490935921669006,
-0.02615775726735592,
-0.022999808192253113,
-0.038183897733688354,
0.09441481530666351,
-0.10350612550973892,
-0.07568692415952682
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 80k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters to
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 80k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
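To make the two pretraining objectives concrete, the sketch below loads the checkpoint with `BertForPreTraining` and inspects the MLM and NSP heads. This is a minimal illustration added for clarity, not part of the original release; it assumes the converted checkpoint retains both pretraining heads (if not, Transformers will initialize them randomly and warn accordingly).
```
from transformers import BertTokenizer, BertForPreTraining
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_80k')
model = BertForPreTraining.from_pretrained('google/multiberts-seed_1-step_80k')

# Two sentences, as used for the NSP objective.
encoding = tokenizer("The cat sat on the mat.", "It was a sunny day.", return_tensors='pt')
with torch.no_grad():
    outputs = model(**encoding)

# MLM head: one score per vocabulary entry at every input position.
print(outputs.prediction_logits.shape)        # (1, seq_len, vocab_size)
# NSP head: two scores (is-next vs. not-next) for the sentence pair.
print(outputs.seq_relationship_logits.shape)  # (1, 2)
```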
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_80k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_80k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
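Because the checkpoint was trained with the MLM objective, it can also be queried for masked-token predictions. The snippet below is a hedged sketch rather than part of the official card; it assumes the MLM head weights are present in the converted checkpoint, and note that an early checkpoint (80k steps) may still produce weak predictions.
```
from transformers import BertTokenizer, BertForMaskedLM
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_80k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_1-step_80k')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and decode the highest-scoring token.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```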
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_80k"]}
| null |
google/multiberts-seed_1-step_80k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_80k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 80k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 80k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 80k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 80k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 80k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08160880208015442,
0.08634693920612335,
-0.002069514710456133,
0.041571859270334244,
0.07534389942884445,
-0.020570671185851097,
0.07137273252010345,
0.0931074246764183,
-0.0038530377205461264,
0.026134556159377098,
0.08230981975793839,
0.028065865859389305,
0.010637426748871803,
0.10096589475870132,
0.017184894531965256,
-0.21132372319698334,
0.03077772818505764,
-0.023824559524655342,
-0.08364351838827133,
0.07374946027994156,
0.10303468257188797,
-0.08509605377912521,
0.04148939996957779,
0.031119512394070625,
-0.11167877912521362,
0.05455581098794937,
-0.010668731294572353,
-0.024933652952313423,
0.13764408230781555,
-0.000013061796380497981,
0.05537667125463486,
0.05588662996888161,
0.04528513923287392,
-0.13576771318912506,
0.0044400133192539215,
0.055646445602178574,
0.050294868648052216,
0.038301724940538406,
0.024592503905296326,
0.08262067288160324,
-0.01598619669675827,
0.03338543698191643,
0.05302182212471962,
0.01573931612074375,
-0.06247849389910698,
-0.06952972710132599,
-0.09580802917480469,
0.032389894127845764,
0.03240836784243584,
0.019829560071229935,
0.007002914324402809,
0.11937842518091202,
-0.032433778047561646,
0.04677671939134598,
0.17765958607196808,
-0.3161795735359192,
0.0005787459085695446,
0.057636525481939316,
0.02192922681570053,
0.11584953963756561,
-0.0011614857940003276,
-0.03391311690211296,
0.0808638408780098,
0.02203988842666149,
0.09527952969074249,
-0.03948509693145752,
0.020223883911967278,
-0.058626800775527954,
-0.1559695452451706,
-0.03991299495100975,
0.10017857700586319,
0.006335646379739046,
-0.1388172209262848,
-0.020107612013816833,
-0.04490642249584198,
0.042686961591243744,
0.019320841878652573,
-0.04145769029855728,
0.045285701751708984,
0.004251017235219479,
-0.002174902707338333,
0.001841184333898127,
-0.10328895598649979,
-0.043160077184438705,
0.026659952476620674,
0.08750373125076294,
0.11025582998991013,
0.05640077963471413,
-0.005589716602116823,
0.10725517570972443,
-0.19200193881988525,
-0.04954060912132263,
-0.025147005915641785,
-0.030982008203864098,
-0.04456789419054985,
-0.009028772823512554,
-0.10295820236206055,
-0.04738654941320419,
-0.00017609904170967638,
0.13639557361602783,
-0.01106862910091877,
0.03310138359665871,
-0.02845788560807705,
0.007931733503937721,
0.05998432636260986,
0.05459076911211014,
-0.01922648400068283,
0.016019416972994804,
0.03334327042102814,
-0.014320643618702888,
-0.021002676337957382,
0.009033693000674248,
-0.006298130843788385,
0.028192482888698578,
0.14337992668151855,
0.011503603309392929,
-0.10599295049905777,
0.07368028908967972,
-0.015102444216609001,
-0.04661158472299576,
-0.007668332662433386,
-0.08717133849859238,
-0.062363673001527786,
-0.04069916903972626,
-0.016735397279262543,
0.006474316120147705,
0.00733410008251667,
-0.006598806474357843,
-0.027881450951099396,
-0.01961539499461651,
-0.0893624871969223,
-0.056469887495040894,
-0.05506783723831177,
-0.13489612936973572,
0.008914108388125896,
-0.2015770524740219,
-0.027234584093093872,
-0.11430952697992325,
-0.20053987205028534,
-0.04022540897130966,
0.04702393710613251,
0.005619645584374666,
-0.06436608731746674,
0.05735502392053604,
0.036589737981557846,
-0.03117857128381729,
-0.004702421836555004,
0.08023922890424728,
-0.007819614373147488,
0.03722776472568512,
-0.04078177362680435,
0.058712586760520935,
0.0060822064988315105,
0.04499257355928421,
-0.057811763137578964,
0.05685952678322792,
-0.17704416811466217,
0.03662516549229622,
-0.07266147434711456,
-0.030109303072094917,
-0.08559171855449677,
-0.029869835823774338,
0.002401145175099373,
0.01454074401408434,
0.024527503177523613,
0.07072547823190689,
-0.17516253888607025,
-0.030296703800559044,
0.09299243241548538,
-0.15192148089408875,
-0.030696455389261246,
0.07466243207454681,
-0.055468473583459854,
0.11215110868215561,
0.06700453162193298,
0.1607571393251419,
-0.03878749907016754,
-0.06803449988365173,
0.04551612213253975,
-0.013375208713114262,
0.014184481464326382,
-0.01151423342525959,
0.07049424946308136,
-0.02056013233959675,
-0.16197383403778076,
0.023823264986276627,
-0.13115689158439636,
0.00055285869166255,
-0.07698699086904526,
0.031640466302633286,
-0.004628341645002365,
-0.06802000105381012,
-0.09089098125696182,
-0.0322241447865963,
0.07603535801172256,
-0.07066843658685684,
-0.029616203159093857,
0.03862404078245163,
0.07814088463783264,
-0.07387053966522217,
0.06536021828651428,
-0.020042812451720238,
0.020533697679638863,
-0.0828419178724289,
-0.037464872002601624,
-0.18597577512264252,
0.037698548287153244,
0.09569426625967026,
0.020488793030381203,
-0.019520871341228485,
0.13318093121051788,
-0.008107168599963188,
0.06568054109811783,
-0.03737892583012581,
-0.004994498565793037,
-0.009408736601471901,
0.0004052795411553234,
-0.09986528754234314,
-0.11114059388637543,
-0.07243088632822037,
-0.06931841373443604,
0.09337116032838821,
-0.1184200569987297,
0.0214407779276371,
-0.061978843063116074,
0.04495793581008911,
0.016087664291262627,
-0.07085968554019928,
-0.012348361313343048,
0.013601209037005901,
-0.06760655343532562,
-0.05849534273147583,
0.03635219857096672,
0.06129181757569313,
-0.023211318999528885,
0.09375138580799103,
-0.056261949241161346,
-0.07928509265184402,
0.027402836829423904,
0.07305378466844559,
-0.10255520045757294,
0.025165589526295662,
-0.046978842467069626,
-0.048234205693006516,
-0.06181945651769638,
-0.026061220094561577,
0.10186014324426651,
-0.013348538428544998,
0.14676912128925323,
-0.07681349664926529,
-0.011685308068990707,
0.008753742091357708,
-0.012950249947607517,
-0.02591242827475071,
0.04731671139597893,
0.07186856865882874,
-0.0761198028922081,
0.021962355822324753,
0.03678135946393013,
-0.0027004112489521503,
0.06501128524541855,
-0.051704756915569305,
-0.080562062561512,
0.020566603168845177,
0.031961988657712936,
0.021877456456422806,
0.06044001504778862,
-0.056766971945762634,
-0.014665370807051659,
0.03196720406413078,
0.021132703870534897,
0.012678899802267551,
-0.11769568920135498,
0.06211879476904869,
0.061100855469703674,
0.01120815146714449,
0.055048901587724686,
-0.02191551774740219,
-0.034112900495529175,
0.08242549002170563,
0.03041066601872444,
-0.02529754303395748,
-0.010967922396957874,
-0.012420505285263062,
-0.12270781397819519,
0.21682369709014893,
-0.06600884348154068,
-0.15100674331188202,
-0.06842370331287384,
-0.11115992814302444,
0.005549812689423561,
0.026013551279902458,
0.04465385153889656,
-0.026192719116806984,
-0.04478791728615761,
-0.12556512653827667,
0.08902373909950256,
-0.03980614244937897,
0.0660412535071373,
0.11148369312286377,
-0.06394223868846893,
0.04599759727716446,
-0.13161784410476685,
-0.014348389580845833,
-0.0779140442609787,
-0.06039970740675926,
0.058454208076000214,
-0.055432260036468506,
0.037279821932315826,
0.11630275100469589,
0.01567772962152958,
-0.02712346613407135,
-0.027487367391586304,
0.19885501265525818,
0.04247119277715683,
0.039273567497730255,
0.1314694583415985,
-0.07849559932947159,
0.05249050259590149,
0.07413679361343384,
0.0020669226069003344,
-0.046221841126680374,
0.052357450127601624,
0.05613256245851517,
-0.058069199323654175,
-0.19379889965057373,
-0.007182684261351824,
0.010727830231189728,
-0.04286148026585579,
0.07171633839607239,
0.037941113114356995,
0.009133834391832352,
0.07693592458963394,
0.017277492210268974,
0.06631366908550262,
-0.0007384351338259876,
0.10054834187030792,
0.019806677475571632,
-0.038862645626068115,
0.0878974199295044,
-0.006490745116025209,
-0.006278542801737785,
0.07747815549373627,
-0.019192194566130638,
0.2913942039012909,
-0.03953704982995987,
0.022123148664832115,
0.12563876807689667,
0.03146345168352127,
0.04989628121256828,
0.12273421883583069,
-0.0766913890838623,
0.02749636583030224,
-0.07901695370674133,
-0.04800184443593025,
0.010941845364868641,
0.050777141004800797,
-0.07410705834627151,
0.016470134258270264,
-0.08819878846406937,
0.027313701808452606,
-0.027625244110822678,
0.2992587685585022,
0.10076600313186646,
-0.11657626181840897,
-0.05789514631032944,
0.003911128733307123,
-0.10140898823738098,
-0.07532583177089691,
0.05162937939167023,
0.05254579707980156,
-0.13037711381912231,
0.0012822780990973115,
-0.02079775743186474,
0.07942505180835724,
-0.03159821033477783,
0.015625102445483208,
0.031282562762498856,
0.05117940902709961,
-0.04490220919251442,
0.008977721445262432,
-0.1841275542974472,
0.1918255090713501,
-0.0016243788413703442,
0.023579806089401245,
-0.05560152605175972,
0.030030375346541405,
0.010095394216477871,
-0.012376761995255947,
0.06269829720258713,
0.016920482739806175,
-0.01580023765563965,
-0.06099051982164383,
-0.042322516441345215,
0.01664186827838421,
0.06446756422519684,
-0.04631812870502472,
0.10852258652448654,
0.006241723895072937,
0.05456192046403885,
0.03285367414355278,
0.08205500990152359,
-0.1826637089252472,
-0.07899253070354462,
0.02755848877131939,
-0.0474015548825264,
-0.09726478904485703,
-0.08190746605396271,
-0.09886582940816879,
-0.013569311238825321,
0.2327825278043747,
-0.11107157915830612,
-0.0742177814245224,
-0.09307750314474106,
0.04289378225803375,
0.09596244990825653,
-0.05553065985441208,
0.022498933598399162,
-0.009384420700371265,
0.1151081770658493,
-0.07108215242624283,
-0.11923003941774368,
0.03058786503970623,
-0.10022211819887161,
-0.15846838057041168,
-0.06810634583234787,
0.08904260396957397,
0.06543917208909988,
0.029298938810825348,
-0.03114984929561615,
0.011261193081736565,
0.0347604863345623,
-0.033938560634851456,
0.0004827541415579617,
0.06388576328754425,
0.09574363380670547,
0.036875851452350616,
-0.10921339690685272,
0.02378668263554573,
-0.07135461270809174,
-0.06862885504961014,
0.07167592644691467,
0.2738097608089447,
-0.048701610416173935,
0.10767244547605515,
0.12190526723861694,
-0.0880357176065445,
-0.1532740741968155,
0.04439515620470047,
0.0925258919596672,
-0.01558748446404934,
0.004897101782262325,
-0.16449616849422455,
0.10380073636770248,
0.11529964953660965,
-0.015991495922207832,
0.009189222939312458,
-0.1952105313539505,
-0.1347390115261078,
0.09895700961351395,
0.11422939598560333,
0.28382471203804016,
-0.05567243695259094,
-0.034026145935058594,
0.019317058846354485,
-0.09328925609588623,
0.009047606028616428,
0.12697665393352509,
0.060008708387613297,
-0.021397897973656654,
-0.0687875896692276,
0.013651530258357525,
-0.03714173287153244,
0.090782530605793,
0.06611878424882889,
0.0717206746339798,
-0.0037667634896934032,
-0.0035312152467668056,
-0.03946743160486221,
-0.04306553304195404,
0.07303950190544128,
0.029645798727869987,
0.048796750605106354,
-0.0951007828116417,
-0.037456072866916656,
-0.07061860710382462,
0.03639601543545723,
-0.030270162969827652,
-0.07800251990556717,
-0.05689634755253792,
0.07246500253677368,
0.05502704530954361,
-0.030795423313975334,
0.044190309941768646,
0.031934771686792374,
0.09736791998147964,
0.14475944638252258,
-0.0025834976695477962,
-0.05115538090467453,
-0.06453341245651245,
-0.031585633754730225,
-0.012961040250957012,
0.07545977085828781,
-0.04584848880767822,
0.017745167016983032,
0.07157975435256958,
0.01950385980308056,
0.10976914316415787,
0.06037735193967819,
-0.12349513173103333,
-0.02038547396659851,
0.0245195422321558,
-0.15515974164009094,
0.012541504576802254,
-0.0008767540566623211,
0.020466020330786705,
-0.018505724146962166,
0.02519323118031025,
0.14953112602233887,
-0.06788155436515808,
-0.032092221081256866,
-0.04780632629990578,
0.061277907341718674,
0.03734637424349785,
0.1439351886510849,
0.04061993956565857,
0.0370708629488945,
-0.07687842845916748,
0.14324058592319489,
0.044158414006233215,
-0.04699785262346268,
0.0314105786383152,
-0.033238258212804794,
-0.10747017711400986,
0.016024647280573845,
0.06245168298482895,
0.08825639635324478,
-0.07761440426111221,
-0.02280726470053196,
-0.04493836686015129,
-0.07559241354465485,
0.07043147832155228,
0.2118796408176422,
0.06609196960926056,
0.066413015127182,
-0.05148996785283089,
-0.035763923078775406,
-0.077867291867733,
0.051949821412563324,
0.052180804312229156,
0.07995781302452087,
-0.07468922436237335,
0.104760080575943,
0.014227899722754955,
0.05302082747220993,
-0.02617701143026352,
-0.04814966395497322,
-0.10330508649349213,
-0.05754384025931358,
-0.11455119401216507,
0.015895040705800056,
-0.062224339693784714,
-0.042602796107530594,
0.008329716511070728,
-0.0020294366404414177,
-0.0008942187996581197,
0.05705801397562027,
-0.062131382524967194,
-0.010705220513045788,
-0.013379182666540146,
0.03619634732604027,
-0.062120795249938965,
-0.05276848375797272,
0.01771577075123787,
-0.0969943031668663,
0.09959255158901215,
0.047554679214954376,
0.012590127065777779,
0.001066776574589312,
0.07359997183084488,
-0.007335687056183815,
0.024593746289610863,
0.00807288195937872,
-0.04019887000322342,
-0.09916664659976959,
0.006530196405947208,
-0.02605161815881729,
-0.034843672066926956,
-0.023131398484110832,
0.08922971040010452,
-0.08224689215421677,
0.02800718881189823,
0.0025023575872182846,
-0.007305880077183247,
-0.07923208922147751,
-0.004997923504561186,
0.09152808040380478,
0.08510629087686539,
0.05296432599425316,
-0.0800282210111618,
0.015304061584174633,
-0.12480123341083527,
-0.034961920231580734,
0.01422673650085926,
-0.012086992152035236,
-0.1240990161895752,
-0.011879968456923962,
0.017182588577270508,
-0.015431230887770653,
0.18713130056858063,
-0.05568091571331024,
-0.02465568110346794,
0.01855541206896305,
-0.0971105620265007,
0.10963567346334457,
-0.0242384672164917,
0.16683124005794525,
-0.028919288888573647,
-0.03370526060461998,
-0.016276637092232704,
0.04684343934059143,
0.031184008345007896,
-0.009402521885931492,
0.1821240335702896,
0.13400854170322418,
0.0332084596157074,
0.057261303067207336,
-0.025671182200312614,
-0.011985725723206997,
-0.07018861174583435,
-0.01897117681801319,
0.04558555781841278,
0.03620729595422745,
0.024355856701731682,
0.16207578778266907,
0.06712517887353897,
-0.1586497724056244,
0.032345470041036606,
-0.024956485256552696,
-0.038845352828502655,
-0.11632998287677765,
-0.09720570594072342,
-0.03449908643960953,
-0.06008756533265114,
0.015992114320397377,
-0.13268885016441345,
0.006161162164062262,
0.17889806628227234,
0.0659325122833252,
0.027417829260230064,
0.012193658389151096,
-0.13073870539665222,
-0.04416118562221527,
0.05627143755555153,
0.014045941643416882,
0.01792430318892002,
0.042400408536195755,
-0.003970684949308634,
0.06808631122112274,
0.028096117079257965,
0.003571409499272704,
-0.006344378460198641,
0.08413760364055634,
0.02379273995757103,
0.04761863127350807,
-0.056871239095926285,
-0.0014564346056431532,
-0.04239799454808235,
0.0828312486410141,
0.11218095570802689,
0.04446309059858322,
-0.048955775797367096,
-0.012493900023400784,
0.15770456194877625,
-0.030126012861728668,
0.009804848581552505,
-0.11491834372282028,
0.32665619254112244,
0.014893056824803352,
0.01230622548609972,
0.057655077427625656,
-0.08068616688251495,
-0.043332215398550034,
0.20099574327468872,
0.05826639384031296,
-0.019785229116678238,
-0.029413439333438873,
0.002987256972119212,
-0.03170617297291756,
-0.019246641546487808,
0.13865706324577332,
0.045026566833257675,
0.12387418746948242,
-0.05519880726933479,
-0.04776884615421295,
-0.03695469722151756,
0.0029437646735459566,
-0.11321750283241272,
0.14409880340099335,
-0.017236733809113503,
-0.019757043570280075,
-0.07531708478927612,
0.009934943169355392,
0.07770887017250061,
-0.34595516324043274,
0.0037162702064961195,
-0.02355807088315487,
-0.10482324659824371,
-0.011904271319508553,
-0.02217835560441017,
-0.027697743847966194,
0.04782150685787201,
-0.041204363107681274,
0.06226183846592903,
0.04585938900709152,
0.03795525059103966,
-0.02407894842326641,
-0.09901394695043564,
0.15831626951694489,
0.058282677084207535,
0.10060864686965942,
0.017749356105923653,
0.0900929719209671,
0.06120547279715538,
0.03937855735421181,
-0.09015404433012009,
0.05527489632368088,
0.00855826586484909,
-0.06589935719966888,
-0.05298786982893944,
0.11627383530139923,
-0.0007533982861787081,
0.06419558078050613,
0.03448940068483353,
-0.12129510194063187,
0.02184997871518135,
0.07369891554117203,
-0.09177049249410629,
-0.09657624363899231,
-0.003551542991772294,
-0.09880421310663223,
0.15862098336219788,
0.1441381722688675,
-0.011547869071364403,
0.013010661117732525,
-0.06586310267448425,
-0.005550077650696039,
0.05317731201648712,
0.010510317049920559,
-0.012322388589382172,
-0.1810334026813507,
0.054664649069309235,
-0.08047167211771011,
-0.0033032705541700125,
-0.21618545055389404,
-0.09880099445581436,
-0.01472804881632328,
-0.05674044415354729,
-0.02609860897064209,
0.05303492397069931,
0.02736741676926613,
0.07566927373409271,
-0.028147507458925247,
-0.02338462695479393,
-0.03760772570967674,
0.09471390396356583,
-0.10454168170690536,
-0.07525999844074249
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters to
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1, captured at step 900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
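As a concrete illustration of the NSP objective mentioned above, the checkpoint can be loaded with a next-sentence-prediction head. This is a sketch added for clarity, not part of the original release; it assumes the NSP head weights were kept during conversion (otherwise they are randomly initialized and the scores are meaningless).
```
from transformers import BertTokenizer, BertForNextSentencePrediction
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_900k')
model = BertForNextSentencePrediction.from_pretrained('google/multiberts-seed_1-step_900k')

prompt = "The children walked to the park."
next_sentence = "They played on the swings for an hour."
encoding = tokenizer(prompt, next_sentence, return_tensors='pt')

with torch.no_grad():
    logits = model(**encoding).logits  # shape (1, 2)

# Index 0 scores "sentence B follows sentence A"; index 1 scores "sentence B is random".
print(torch.softmax(logits, dim=-1))
```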
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_900k')
model = TFBertModel.from_pretrained("google/multiberts-seed_1-step_900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_900k')
model = BertModel.from_pretrained("google/multiberts-seed_1-step_900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
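To turn the hidden states returned above into a single fixed-size sentence vector, one common approach is mean pooling over token embeddings. The snippet below is a minimal sketch of that idea; the pooling strategy is a choice made here for illustration, not something prescribed by the MultiBERTs release.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1-step_900k')
model = BertModel.from_pretrained('google/multiberts-seed_1-step_900k')

encoded = tokenizer(["MultiBERTs supports robustness analysis."], return_tensors='pt', padding=True)
with torch.no_grad():
    hidden = model(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Average token embeddings, ignoring padding positions.
mask = encoded.attention_mask.unsqueeze(-1).float()
sentence_embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embeddings.shape)  # torch.Size([1, 768])
```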
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1", "multiberts-seed_1-step_900k"]}
| null |
google/multiberts-seed_1-step_900k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"multiberts-seed_1-step_900k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1, captured at step 900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
81,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #multiberts-seed_1-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 1, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1, captured at step 900k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07951818406581879,
0.0740937739610672,
-0.0019335148390382528,
0.04269079864025116,
0.07464990764856339,
-0.019869886338710785,
0.0642804354429245,
0.0925932303071022,
-0.008233809843659401,
0.02407180704176426,
0.08445887267589569,
0.02629300020635128,
0.012023480609059334,
0.0982125923037529,
0.021179506555199623,
-0.21137793362140656,
0.03484974429011345,
-0.02460421994328499,
-0.08323951810598373,
0.07419121265411377,
0.10384470969438553,
-0.0855192095041275,
0.041546501219272614,
0.03252221271395683,
-0.1115773469209671,
0.05345994234085083,
-0.012946161441504955,
-0.0243763979524374,
0.13545310497283936,
0.0009196874452754855,
0.05470331385731697,
0.05568833276629448,
0.047489359974861145,
-0.13726989924907684,
0.004228494130074978,
0.05492037162184715,
0.05214042216539383,
0.03608749806880951,
0.019684432074427605,
0.08207082003355026,
-0.02668936736881733,
0.035688381642103195,
0.05452464520931244,
0.014565317891538143,
-0.06220005825161934,
-0.062091801315546036,
-0.09782972186803818,
0.040445711463689804,
0.035829026252031326,
0.01856396347284317,
0.009581035003066063,
0.11709655821323395,
-0.03810270503163338,
0.04399237781763077,
0.17295140027999878,
-0.3054404556751251,
-0.001410765922628343,
0.06042565405368805,
0.021680770441889763,
0.10826638340950012,
-0.002912181429564953,
-0.03580508753657341,
0.07976845651865005,
0.019635088741779327,
0.09710393846035004,
-0.03896243870258331,
0.012475895695388317,
-0.06543712317943573,
-0.15655255317687988,
-0.03859252482652664,
0.09314480423927307,
0.007317536976188421,
-0.13750144839286804,
-0.02154242806136608,
-0.043828584253787994,
0.03848034515976906,
0.019600095227360725,
-0.04065896198153496,
0.04411536082625389,
0.0014524174621328712,
0.0007083775708451867,
-0.0033074659295380116,
-0.10241420567035675,
-0.046657197177410126,
0.023886213079094887,
0.1009424701333046,
0.11056263744831085,
0.05655202269554138,
-0.003370195161551237,
0.1116848811507225,
-0.18958692252635956,
-0.04904627054929733,
-0.03058239072561264,
-0.0340527780354023,
-0.04339216277003288,
-0.010836134664714336,
-0.10045837610960007,
-0.046902529895305634,
0.00018096595886163414,
0.13585412502288818,
-0.003671287326142192,
0.03313348442316055,
-0.01648043468594551,
0.004977944307029247,
0.06040786951780319,
0.0558600015938282,
-0.021087493747472763,
0.021124115213751793,
0.03744441643357277,
-0.0073488266207277775,
-0.019360443577170372,
0.008950220420956612,
-0.0033744550310075283,
0.025573577731847763,
0.14016173779964447,
0.013575952500104904,
-0.1030263900756836,
0.07508402317762375,
-0.013517927378416061,
-0.04446977376937866,
-0.0022463626228272915,
-0.08740907162427902,
-0.0646354928612709,
-0.041648056358098984,
-0.016915209591388702,
0.006558137014508247,
0.006412255112081766,
-0.00889586191624403,
-0.027394738048315048,
-0.016929874196648598,
-0.09073309600353241,
-0.059113241732120514,
-0.057156309485435486,
-0.1330287605524063,
0.008696242235600948,
-0.19713297486305237,
-0.026408201083540916,
-0.11779007315635681,
-0.20483475923538208,
-0.04156654700636864,
0.046660978347063065,
0.005593536887317896,
-0.06402638554573059,
0.058369915932416916,
0.033680420368909836,
-0.031244678422808647,
-0.005671379156410694,
0.08183028548955917,
-0.006150035187602043,
0.03768588602542877,
-0.04318014532327652,
0.058802008628845215,
0.001793811796233058,
0.04425469785928726,
-0.06053897738456726,
0.05913722515106201,
-0.17699594795703888,
0.04117899760603905,
-0.074080690741539,
-0.026878463104367256,
-0.08390305191278458,
-0.031243840232491493,
-0.004016976337879896,
0.01231955923140049,
0.02613522857427597,
0.07288365811109543,
-0.17328163981437683,
-0.026212163269519806,
0.08477532863616943,
-0.1532740592956543,
-0.03180726617574692,
0.07022082060575485,
-0.05674632266163826,
0.11742398887872696,
0.06720256805419922,
0.16521528363227844,
-0.022653311491012573,
-0.0700208768248558,
0.04323965311050415,
-0.01195729523897171,
0.008616134524345398,
-0.015594332478940487,
0.06628458946943283,
-0.021540600806474686,
-0.16058576107025146,
0.023611821234226227,
-0.13417461514472961,
0.0034451461397111416,
-0.07685503363609314,
0.029813619330525398,
-0.0033436950761824846,
-0.07229675352573395,
-0.0884246900677681,
-0.034752409905195236,
0.07721599191427231,
-0.06813778728246689,
-0.02216705121099949,
0.04442296549677849,
0.08040391653776169,
-0.07354199141263962,
0.06886205822229385,
-0.015061247162520885,
0.025226643308997154,
-0.07986675947904587,
-0.03542916104197502,
-0.18444302678108215,
0.03531702235341072,
0.09752544015645981,
0.005376270040869713,
-0.02047058567404747,
0.11935894936323166,
-0.010026445612311363,
0.06350243091583252,
-0.03927014395594597,
-0.005862151738256216,
-0.01151834987103939,
0.003376195440068841,
-0.09830222278833389,
-0.10766199976205826,
-0.07671665400266647,
-0.06874576956033707,
0.08765483647584915,
-0.10702647268772125,
0.02228923887014389,
-0.060526568442583084,
0.04545283317565918,
0.01450339425355196,
-0.07129272073507309,
-0.010280318558216095,
0.015058671124279499,
-0.06317919492721558,
-0.0588257722556591,
0.035234007984399796,
0.06051815301179886,
-0.02619180828332901,
0.08945036679506302,
-0.048941150307655334,
-0.08473522216081619,
0.028510387986898422,
0.0774892121553421,
-0.10626637190580368,
0.02751912549138069,
-0.04597466439008713,
-0.04796556010842323,
-0.06602470576763153,
-0.030234282836318016,
0.10342060774564743,
-0.013020733371376991,
0.1430467665195465,
-0.07914663851261139,
-0.012330038473010063,
0.00810539722442627,
-0.014198275282979012,
-0.028247570618987083,
0.04369554668664932,
0.06736589968204498,
-0.07391490042209625,
0.020992614328861237,
0.03626351058483124,
-0.0027684203814715147,
0.0676436722278595,
-0.05130121111869812,
-0.0777709111571312,
0.01941791921854019,
0.03092057816684246,
0.01909293606877327,
0.06276098638772964,
-0.05007162317633629,
-0.012077000923454762,
0.02838117815554142,
0.02284725196659565,
0.01304337102919817,
-0.1195823922753334,
0.061840612441301346,
0.05913487449288368,
0.011925142258405685,
0.052065085619688034,
-0.0197460874915123,
-0.035109542310237885,
0.08048062026500702,
0.03428978472948074,
-0.018127325922250748,
-0.010791175998747349,
-0.013382785022258759,
-0.12101365625858307,
0.21899597346782684,
-0.06493566930294037,
-0.14877496659755707,
-0.07067309319972992,
-0.10635083168745041,
0.001977751962840557,
0.023320885375142097,
0.042148321866989136,
-0.02579689212143421,
-0.04270230233669281,
-0.12523293495178223,
0.09326554834842682,
-0.03540949150919914,
0.06717821955680847,
0.1132935956120491,
-0.06352797895669937,
0.044031884521245956,
-0.132134810090065,
-0.012580506503582,
-0.08076261729001999,
-0.057922251522541046,
0.052291810512542725,
-0.05154946446418762,
0.04054151847958565,
0.1163901537656784,
0.014974555000662804,
-0.029560454189777374,
-0.0297930259257555,
0.2056710571050644,
0.04130317643284798,
0.04095714911818504,
0.1312882900238037,
-0.07742968946695328,
0.053385764360427856,
0.07953427731990814,
0.005031913984566927,
-0.046307120472192764,
0.05211269110441208,
0.052888937294483185,
-0.06013655662536621,
-0.18718528747558594,
-0.007960059680044651,
0.00911032222211361,
-0.051263269037008286,
0.07040992379188538,
0.036062780767679214,
0.007334795314818621,
0.07751172035932541,
0.01940668374300003,
0.06104716286063194,
0.0012795113725587726,
0.1018761396408081,
0.026803744956851006,
-0.03718513995409012,
0.08501694351434708,
-0.008993208408355713,
-0.006408430635929108,
0.07481662929058075,
-0.016028068959712982,
0.2938258647918701,
-0.04578813537955284,
0.00829259678721428,
0.127329021692276,
0.034553803503513336,
0.05030268430709839,
0.12459216266870499,
-0.07877001911401749,
0.028432689607143402,
-0.07534035295248032,
-0.045904744416475296,
0.011424456723034382,
0.047115176916122437,
-0.06932274252176285,
0.02067495323717594,
-0.089445561170578,
0.023737050592899323,
-0.026108374819159508,
0.30189549922943115,
0.10399934649467468,
-0.10864721983671188,
-0.060435641556978226,
0.0038903518579900265,
-0.10379045456647873,
-0.07322510331869125,
0.05234001204371452,
0.0477941632270813,
-0.13263875246047974,
0.005637454334646463,
-0.01587013714015484,
0.076832115650177,
-0.031876035034656525,
0.012899246998131275,
0.034657370299100876,
0.05328718572854996,
-0.0455266609787941,
0.006331612356007099,
-0.17793722450733185,
0.19654521346092224,
-0.0019108393462374806,
0.02769017405807972,
-0.05458010733127594,
0.02988496981561184,
0.009628798812627792,
-0.01604722999036312,
0.06371204555034637,
0.019315345212817192,
-0.004894958343356848,
-0.0637916773557663,
-0.04349753260612488,
0.012956359423696995,
0.06419996917247772,
-0.04288570210337639,
0.1062975749373436,
0.008356242440640926,
0.05427797511219978,
0.02944050170481205,
0.08071275800466537,
-0.18282712996006012,
-0.07774526625871658,
0.027449671179056168,
-0.051059648394584656,
-0.09887608140707016,
-0.08454904705286026,
-0.09857229143381119,
-0.007233408745378256,
0.21754871308803558,
-0.11119119822978973,
-0.07608825713396072,
-0.09026416391134262,
0.038188740611076355,
0.10184230655431747,
-0.05599348992109299,
0.027723874896764755,
-0.011161180213093758,
0.10890977084636688,
-0.07041753083467484,
-0.12211824208498001,
0.02587268315255642,
-0.09994640201330185,
-0.15982358157634735,
-0.06675467640161514,
0.09077157080173492,
0.06212379038333893,
0.02924278751015663,
-0.030347611755132675,
0.010451307520270348,
0.0382457934319973,
-0.03911159560084343,
-0.0036585566122084856,
0.061127446591854095,
0.08154337853193283,
0.04256949573755264,
-0.10830815136432648,
0.013486996293067932,
-0.07420797646045685,
-0.06815323233604431,
0.07023970782756805,
0.275249183177948,
-0.049804337322711945,
0.10957682877779007,
0.13225972652435303,
-0.08773253113031387,
-0.1597316414117813,
0.04258214682340622,
0.09343738108873367,
-0.016216423362493515,
-0.003325539641082287,
-0.15730392932891846,
0.10202965885400772,
0.11811165511608124,
-0.016157370060682297,
0.0015032603405416012,
-0.2037501037120819,
-0.14063730835914612,
0.09713201224803925,
0.11462379992008209,
0.28021547198295593,
-0.05167767032980919,
-0.03601988032460213,
0.019685417413711548,
-0.09476182609796524,
0.009737232699990273,
0.13444335758686066,
0.06173371523618698,
-0.02327362447977066,
-0.07441052049398422,
0.011973715387284756,
-0.03827957808971405,
0.09190475940704346,
0.06722839921712875,
0.07166754454374313,
-0.0051320879720151424,
-0.005628586746752262,
-0.03437383845448494,
-0.044373881071805954,
0.07290019094944,
0.04244795814156532,
0.05021112039685249,
-0.08804336190223694,
-0.036089833825826645,
-0.0749739557504654,
0.03451139107346535,
-0.030366133898496628,
-0.07680714875459671,
-0.062390293926000595,
0.07643861323595047,
0.05940599367022514,
-0.03373806178569794,
0.03660271689295769,
0.033619243651628494,
0.09587208926677704,
0.14627699553966522,
-0.004490810912102461,
-0.05134451016783714,
-0.06386081874370575,
-0.026363160461187363,
-0.01632753573358059,
0.07510696351528168,
-0.04187271371483803,
0.014774962328374386,
0.0720406174659729,
0.02209956757724285,
0.10469172149896622,
0.061457086354494095,
-0.12185223400592804,
-0.017799999564886093,
0.02668152190744877,
-0.15094251930713654,
0.009706055745482445,
0.0016360150184482336,
0.01264447346329689,
-0.023227984085679054,
0.02497793547809124,
0.14491139352321625,
-0.0691705197095871,
-0.03194617107510567,
-0.04812190681695938,
0.06063924729824066,
0.033299874514341354,
0.14826098084449768,
0.043168000876903534,
0.03656407818198204,
-0.0769275426864624,
0.1467156559228897,
0.04411919787526131,
-0.045055657625198364,
0.02754218876361847,
-0.034654125571250916,
-0.10817248374223709,
0.01765381172299385,
0.05599404126405716,
0.07546267658472061,
-0.07918146997690201,
-0.017623867839574814,
-0.04122402146458626,
-0.08086959272623062,
0.0679599791765213,
0.22070260345935822,
0.06735410541296005,
0.06681039184331894,
-0.05210345238447189,
-0.03366461768746376,
-0.07825493812561035,
0.049795962870121,
0.05516301095485687,
0.0796644389629364,
-0.07621818035840988,
0.10020212829113007,
0.015320591628551483,
0.053139857947826385,
-0.026609398424625397,
-0.04860857501626015,
-0.10544876009225845,
-0.0567021407186985,
-0.10572357475757599,
0.011742080561816692,
-0.06817589700222015,
-0.03779178857803345,
0.007027225103229284,
-0.002759719965979457,
-0.004648113157600164,
0.056870684027671814,
-0.060543254017829895,
-0.012032964266836643,
-0.013239298947155476,
0.03564300388097763,
-0.059947412461042404,
-0.05475052446126938,
0.0200995784252882,
-0.09632279723882675,
0.09926291555166245,
0.045207444578409195,
0.015409584157168865,
0.0006745674763806164,
0.07893967628479004,
-0.009716200642287731,
0.02388050965964794,
0.01109540369361639,
-0.041301578283309937,
-0.10012614727020264,
0.006082509644329548,
-0.026402413845062256,
-0.0331650972366333,
-0.019437871873378754,
0.0905439481139183,
-0.08302184194326401,
0.026679830625653267,
0.0004450193664524704,
-0.003972615581005812,
-0.07862581312656403,
-0.0027559634763747454,
0.10164051502943039,
0.08392670750617981,
0.05432451143860817,
-0.08150161057710648,
0.01655716262757778,
-0.12339229881763458,
-0.034579552710056305,
0.01245165802538395,
-0.014448652043938637,
-0.12589454650878906,
-0.01149379275739193,
0.018499460071325302,
-0.012384509667754173,
0.19297967851161957,
-0.06209661811590195,
-0.025109680369496346,
0.021002264693379402,
-0.08936480432748795,
0.10612247884273529,
-0.019706053659319878,
0.16971930861473083,
-0.029665280133485794,
-0.03671329468488693,
-0.014788627624511719,
0.047541119158267975,
0.02896113134920597,
-0.010557729750871658,
0.1830662190914154,
0.13386565446853638,
0.03992706909775734,
0.05995072424411774,
-0.021839285269379616,
-0.00660445261746645,
-0.056426338851451874,
-0.02618124708533287,
0.04335722699761391,
0.033728908747434616,
0.02388194389641285,
0.14880019426345825,
0.05795242264866829,
-0.15966564416885376,
0.03414485603570938,
-0.02301010861992836,
-0.04159313440322876,
-0.11368668079376221,
-0.09346958249807358,
-0.03062782995402813,
-0.058203063905239105,
0.016943322494626045,
-0.13254931569099426,
0.004028025083243847,
0.17170299589633942,
0.0643058717250824,
0.02814091555774212,
0.024361150339245796,
-0.1355224847793579,
-0.04278634116053581,
0.06037889048457146,
0.01290527917444706,
0.01922987774014473,
0.04326979070901871,
-0.004070702474564314,
0.06722171604633331,
0.027027586475014687,
-0.00009013101953314617,
-0.005587867461144924,
0.07841534167528152,
0.02189384400844574,
0.04940818250179291,
-0.05548886954784393,
0.0009647791739553213,
-0.04213327169418335,
0.08631694316864014,
0.11706209927797318,
0.04464830830693245,
-0.048614054918289185,
-0.013744438998401165,
0.15699346363544464,
-0.029694553464651108,
0.005990245845168829,
-0.1160067468881607,
0.3208985924720764,
0.018239280208945274,
0.010568521916866302,
0.05875304341316223,
-0.08257218450307846,
-0.04783196374773979,
0.20651875436306,
0.06341759115457535,
-0.020892668515443802,
-0.028247077018022537,
0.0034995984751731157,
-0.031200846657156944,
-0.017179658636450768,
0.14146453142166138,
0.043198827654123306,
0.1193390041589737,
-0.053840573877096176,
-0.0434003584086895,
-0.03529181703925133,
0.0007960466318763793,
-0.1102103665471077,
0.1470274180173874,
-0.017318036407232285,
-0.021998686715960503,
-0.08032847940921783,
0.0114731565117836,
0.07208092510700226,
-0.3438413441181183,
0.0069634318351745605,
-0.030857132747769356,
-0.10612504184246063,
-0.013311697170138359,
-0.02691071666777134,
-0.02677963674068451,
0.0481116846203804,
-0.038668613880872726,
0.0661960318684578,
0.04400966688990593,
0.0395289845764637,
-0.02355979010462761,
-0.10903450101613998,
0.16140931844711304,
0.06775853782892227,
0.10571251809597015,
0.018667344003915787,
0.08857749402523041,
0.06169922649860382,
0.03799934685230255,
-0.09620192646980286,
0.05658549815416336,
0.010553055442869663,
-0.06835301220417023,
-0.05264029651880264,
0.11617973446846008,
0.00103322125505656,
0.06461062282323837,
0.030906276777386665,
-0.12492750585079193,
0.023908348754048347,
0.07545989006757736,
-0.08863251656293869,
-0.09914903342723846,
-0.0025480706244707108,
-0.09846436977386475,
0.1597142219543457,
0.14417101442813873,
-0.012601212598383427,
0.012258564122021198,
-0.061541661620140076,
-0.008599660359323025,
0.05401867255568504,
0.010977902449667454,
-0.014431614428758621,
-0.1836434006690979,
0.0546443946659565,
-0.09517092257738113,
-0.004323950503021479,
-0.2097022980451584,
-0.10191483050584793,
-0.01431364193558693,
-0.05850629881024361,
-0.024396339431405067,
0.058328211307525635,
0.024037925526499748,
0.07716196775436401,
-0.025136617943644524,
-0.02742181345820427,
-0.04007440060377121,
0.09246891736984253,
-0.10535629838705063,
-0.07580108195543289
] |
null | null |
transformers
|
# MultiBERTs - Seed 1
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #1.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
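For a quick feel for the MLM objective, the checkpoint can be probed with the `fill-mask` pipeline. The snippet below is a minimal sketch only; it assumes the released weights still include the masked-language-modelling head (otherwise that head would be newly initialized and the scores uninformative):
```
from transformers import pipeline

# Sketch only: assumes the checkpoint ships an MLM head usable by fill-mask.
unmasker = pipeline('fill-mask', model='google/multiberts-seed_1')

# Each returned dict contains a candidate token and its score for the [MASK] slot.
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction['token_str'], prediction['score'])
```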
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1')
model = TFBertModel.from_pretrained("google/multiberts-seed_1")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_1')
model = BertModel.from_pretrained("google/multiberts-seed_1")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
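The returned object follows the standard `BertModel` output format; the short sketch below (variable names are illustrative) shows how the token-level and pooled representations can be read off:
```
# Continuing from the PyTorch snippet above.
token_embeddings = output.last_hidden_state  # (batch_size, sequence_length, 768)
pooled_output = output.pooler_output         # (batch_size, 768), pooled [CLS] representation

print(token_embeddings.shape, pooled_output.shape)
```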
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_1"]}
| null |
google/multiberts-seed_1
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_1",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 1
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #1.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 1\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 1\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
68,
189,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_1 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 1\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #1.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06148228794336319,
0.08722120523452759,
-0.003930053673684597,
0.056895557790994644,
0.07989533245563507,
0.002887546317651868,
0.04320130497217178,
0.061932459473609924,
-0.08437158912420273,
0.018504129722714424,
-0.0022429132368415594,
-0.04735155031085014,
0.06828399002552032,
-0.014778755605220795,
0.04950994625687599,
-0.23276115953922272,
0.033025916665792465,
-0.032599542289972305,
-0.01623014733195305,
0.031053626909852028,
0.1139993742108345,
-0.10576082766056061,
0.07578396052122116,
0.047889214009046555,
0.013025613501667976,
0.011438091285526752,
-0.02343384176492691,
0.0017770999111235142,
0.07556544989347458,
0.024872081354260445,
0.0839683935046196,
-0.013390595093369484,
0.08109406381845474,
-0.12391956895589828,
0.00877411849796772,
0.0504969023168087,
0.06013458967208862,
0.04855522885918617,
0.10876139253377914,
0.024361232295632362,
0.09343080967664719,
-0.013557755388319492,
0.03948689252138138,
0.055416714400053024,
-0.055936623364686966,
-0.1731785386800766,
-0.10298243910074234,
0.03663616627454758,
-0.004567342810332775,
0.022209107875823975,
-0.007242781575769186,
-0.02439465932548046,
-0.028176013380289078,
0.02473601885139942,
0.11325609683990479,
-0.24385282397270203,
-0.013789412565529346,
-0.0263708233833313,
0.04368487000465393,
0.07000482082366943,
-0.04440005496144295,
-0.04464242234826088,
0.037102360278367996,
0.06608714908361435,
0.06280001997947693,
-0.022785283625125885,
0.02710818313062191,
-0.019944583997130394,
-0.1550416350364685,
-0.014614731073379517,
0.09944126009941101,
-0.034437838941812515,
-0.12410128116607666,
-0.053740743547677994,
-0.026153651997447014,
0.11075068265199661,
0.014231929555535316,
-0.031373582780361176,
0.0525350347161293,
0.020866168662905693,
0.054951440542936325,
-0.09197097271680832,
-0.11508968472480774,
0.04249229654669762,
-0.08241976797580719,
0.11556584388017654,
0.08175411820411682,
0.060134585946798325,
-0.004065765533596277,
0.03737109899520874,
-0.06330684572458267,
-0.07331212610006332,
-0.04955973103642464,
-0.08117564767599106,
-0.00942213088274002,
-0.03193856403231621,
-0.06884551048278809,
-0.1519806981086731,
-0.016943804919719696,
0.07305771112442017,
-0.0792773962020874,
-0.018438054248690605,
-0.07999693602323532,
-0.021327398717403412,
0.0965760201215744,
0.1652248650789261,
-0.11623924970626831,
0.046867430210113525,
0.017733421176671982,
-0.0050296299159526825,
-0.016411418095231056,
0.037703488022089005,
0.007223985623568296,
0.00457014562562108,
0.02984245866537094,
0.03372720256447792,
-0.017257679253816605,
0.039154306054115295,
-0.010245917364954948,
-0.043460410088300705,
0.07181145995855331,
-0.1446814388036728,
-0.002456669695675373,
0.007333348970860243,
-0.013999157585203648,
0.057572346180677414,
0.05754682049155235,
-0.0419437438249588,
-0.09082856774330139,
0.010979791171848774,
-0.08460541069507599,
-0.03653540834784508,
-0.0582316592335701,
-0.1567794382572174,
0.03585115820169449,
-0.08395703136920929,
-0.055398352444171906,
-0.10027935355901718,
-0.0987020879983902,
-0.017374735325574875,
0.050341714173555374,
-0.017369955778121948,
0.03591112792491913,
0.028174716979265213,
-0.016184663400053978,
-0.035450417548418045,
0.04172636568546295,
-0.0010537590133026242,
-0.014233930967748165,
0.00788150541484356,
-0.04524391517043114,
0.04373250901699066,
-0.02894110232591629,
0.05167745053768158,
-0.06772567331790924,
0.01867271587252617,
-0.13316480815410614,
0.07195200771093369,
-0.09296026080846786,
-0.06149720400571823,
-0.054328348487615585,
-0.05233096703886986,
-0.05609483644366264,
0.023624049499630928,
0.00739647401496768,
0.06688328832387924,
-0.1576600968837738,
-0.04970671236515045,
0.1303403377532959,
-0.12512966990470886,
0.025712355971336365,
0.10385974496603012,
-0.054390691220760345,
0.05164749175310135,
0.1281675398349762,
0.037456098943948746,
0.06473430246114731,
-0.0450613871216774,
-0.038214411586523056,
0.022061819210648537,
0.031129511073231697,
0.10212413966655731,
0.07775086164474487,
-0.0642356351017952,
-0.05830436944961548,
0.03162598982453346,
-0.07737596333026886,
-0.02086600847542286,
-0.05395093932747841,
-0.012753190472722054,
-0.005258148070424795,
-0.07224547863006592,
-0.005900456104427576,
-0.030541684478521347,
-0.004000379703938961,
-0.006240289192646742,
-0.05514872446656227,
0.058693576604127884,
0.059003278613090515,
-0.06896104663610458,
0.056448377668857574,
-0.061473339796066284,
0.020173827186226845,
-0.07999635487794876,
-0.0023520688991993666,
-0.16963283717632294,
0.01097309123724699,
0.10110846906900406,
-0.10148577392101288,
0.06457587331533432,
0.14034110307693481,
0.025676626712083817,
0.05493693798780441,
-0.050391875207424164,
0.06684622168540955,
0.0029105311259627342,
-0.03035634756088257,
-0.05014951527118683,
-0.09764348715543747,
-0.05802861228585243,
-0.05717739462852478,
-0.010303993709385395,
-0.08079401403665543,
-0.014889839105308056,
-0.025175664573907852,
0.003943802323192358,
0.036844149231910706,
-0.05561700463294983,
0.01867668516933918,
0.02772044576704502,
-0.03137798607349396,
-0.024587297812104225,
-0.0306427963078022,
0.036992114037275314,
-0.01034606248140335,
0.11747685819864273,
-0.07433553785085678,
-0.047630827873945236,
0.042329926043748856,
0.04628833010792732,
-0.04228909686207771,
0.10870406031608582,
-0.05764446035027504,
-0.02158086746931076,
-0.08249976485967636,
-0.091095931828022,
0.168067067861557,
-0.0044850376434624195,
0.09717237204313278,
-0.0979929193854332,
-0.03385873883962631,
-0.0028216219507157803,
0.004061316605657339,
-0.017079953104257584,
0.05603189393877983,
0.0006112022092565894,
-0.1322658210992813,
-0.0011877779616042972,
0.028784213587641716,
0.002591802040114999,
0.10171789675951004,
-0.010820198804140091,
-0.11760714650154114,
0.02215190976858139,
0.00500412005931139,
-0.006706082262098789,
0.05979716405272484,
-0.023906508460640907,
0.0014629822690039873,
0.051318418234586716,
0.053830645978450775,
0.0503123477101326,
-0.0564703531563282,
0.09035465121269226,
0.059911035001277924,
-0.044655896723270416,
-0.050169892609119415,
-0.08763851970434189,
0.011245431378483772,
0.12078866362571716,
0.04485123232007027,
0.059211716055870056,
-0.0284099318087101,
-0.023380594328045845,
-0.09527652710676193,
0.1521100103855133,
-0.09778806567192078,
-0.1557781994342804,
-0.144865021109581,
0.002891978481784463,
-0.05723510682582855,
0.05692373216152191,
0.014947849325835705,
-0.06741220504045486,
-0.09398738294839859,
-0.09253107756376266,
0.14830952882766724,
-0.04581175744533539,
-0.017329394817352295,
0.03141862154006958,
-0.02584720402956009,
0.03565964102745056,
-0.1721535176038742,
-0.0008551486534997821,
-0.048383887857198715,
-0.12202942371368408,
-0.0499003529548645,
0.022090023383498192,
0.06551807373762131,
0.07499386370182037,
-0.036942850798368454,
-0.06941981613636017,
0.015906857326626778,
0.15391084551811218,
0.034374307841062546,
0.07539237290620804,
0.0907837375998497,
-0.09258304536342621,
0.043527375906705856,
0.04356367141008377,
0.03700839355587959,
-0.0017308209789916873,
0.004554880782961845,
0.0573575422167778,
-0.027440950274467468,
-0.30177992582321167,
-0.013497062027454376,
-0.02781512774527073,
-0.019106188789010048,
0.04838515445590019,
0.04857693612575531,
-0.10675399750471115,
0.04489526152610779,
-0.04965456202626228,
0.025046920403838158,
0.06942999362945557,
0.03068315051496029,
0.09806915372610092,
-0.04421892762184143,
0.08030427992343903,
-0.06142469868063927,
-0.024355053901672363,
0.12226356565952301,
-0.07054130733013153,
0.2105501890182495,
-0.07030719518661499,
0.07601552456617355,
0.08702288568019867,
-0.016078738495707512,
0.020968034863471985,
0.14553524553775787,
-0.05443597584962845,
0.06994814425706863,
-0.049476705491542816,
-0.05300503969192505,
-0.03150077536702156,
0.018352612853050232,
0.011069629341363907,
0.049449365586042404,
-0.03214288502931595,
-0.0032117199152708054,
0.0026772336568683386,
0.2490294873714447,
0.046111706644296646,
-0.11680521070957184,
-0.0802474394440651,
0.010016366839408875,
-0.11237291246652603,
-0.06664124131202698,
0.04677389934659004,
0.1013571172952652,
-0.06984476745128632,
0.04133935645222664,
0.015279997140169144,
0.07180054485797882,
-0.11351121217012405,
0.014913933351635933,
0.050302546471357346,
0.04961119592189789,
-0.01271851360797882,
0.03307522460818291,
-0.13617055118083954,
0.0921553373336792,
0.028370065614581108,
0.057582564651966095,
-0.055624477565288544,
0.06650470197200775,
0.02725658379495144,
-0.04056178778409958,
0.03789978474378586,
0.019323153421282768,
-0.014589883387088776,
-0.030435608699917793,
-0.07305443286895752,
0.07184005528688431,
0.07430890202522278,
-0.04881837218999863,
0.11540938913822174,
-0.04681669920682907,
0.011359378695487976,
-0.007407056633383036,
0.07971152663230896,
-0.16368627548217773,
-0.12989486753940582,
0.031876515597105026,
-0.12434181571006775,
-0.042402107268571854,
-0.05659405514597893,
-0.0633743405342102,
-0.04827212542295456,
0.18002957105636597,
-0.13156963884830475,
-0.1396608203649521,
-0.0942877009510994,
-0.01730140671133995,
0.15274131298065186,
-0.05437059700489044,
0.01210461463779211,
-0.019342411309480667,
0.12374840676784515,
-0.03563639149069786,
-0.15303611755371094,
-0.04450373724102974,
-0.07286028563976288,
-0.1410699039697647,
-0.01419753022491932,
0.06582555919885635,
0.12197522073984146,
0.05136566236615181,
-0.00025257040397264063,
0.02150311879813671,
-0.010645081289112568,
-0.05889119580388069,
-0.013576198369264603,
0.1842963844537735,
0.05236276239156723,
0.09928694367408752,
-0.15392297506332397,
-0.07993799448013306,
-0.03955735266208649,
0.018941104412078857,
-0.007057101000100374,
0.10083392262458801,
-0.03522530198097229,
0.08605039864778519,
0.23032017052173615,
-0.12083052843809128,
-0.21152135729789734,
0.0030788853764533997,
0.033491864800453186,
-0.002434872090816498,
0.015738047659397125,
-0.22561171650886536,
0.12244391441345215,
0.07627572119235992,
-0.008900030516088009,
0.017925357446074486,
-0.14958618581295013,
-0.07985056191682816,
0.08334324508905411,
0.006108241621404886,
0.16886945068836212,
-0.08941490203142166,
-0.02815060317516327,
-0.001203732332214713,
-0.0647696778178215,
0.06020846217870712,
0.061082687228918076,
0.08484842628240585,
-0.0007299414137378335,
-0.04021499305963516,
0.044131580740213394,
-0.01572793908417225,
0.07257681339979172,
0.028815163299441338,
0.038968414068222046,
-0.04975401237607002,
0.09500230848789215,
0.0158438291400671,
-0.029920678585767746,
0.14598827064037323,
0.09647077322006226,
0.052446793764829636,
-0.03878426551818848,
-0.060990285128355026,
-0.07744453847408295,
0.012971769087016582,
-0.017031952738761902,
-0.03707481175661087,
-0.06443654000759125,
0.04703774303197861,
0.0663132444024086,
-0.004889736883342266,
-0.015263257548213005,
-0.019803453236818314,
0.05726853385567665,
0.08765501528978348,
0.20453602075576782,
-0.04108082875609398,
-0.0020074080675840378,
-0.026496490463614464,
-0.03431928530335426,
0.06311444938182831,
-0.019157184287905693,
0.06689632683992386,
0.08655072748661041,
0.01027013175189495,
0.08184468001127243,
0.058199990540742874,
-0.11994148790836334,
-0.014424229972064495,
0.05552411079406738,
-0.08162868767976761,
-0.14550887048244476,
-0.02933424338698387,
-0.09636704623699188,
-0.15762300789356232,
0.0072767473757267,
0.1694600135087967,
-0.038163766264915466,
-0.04659107327461243,
-0.023321352899074554,
0.09284511208534241,
0.007789097726345062,
0.1091737300157547,
0.045482683926820755,
-0.018855025991797447,
-0.05933583900332451,
0.15683569014072418,
0.09098684042692184,
-0.08261395990848541,
0.019528714939951897,
0.03956255689263344,
-0.05136115849018097,
-0.019198628142476082,
-0.05270552635192871,
0.0563395619392395,
-0.04896286875009537,
-0.030755354091525078,
0.01549341157078743,
-0.10959713160991669,
0.06123783811926842,
0.14798888564109802,
0.0010163788683712482,
0.1589534878730774,
-0.04768163338303566,
0.06848384439945221,
-0.05935244634747505,
0.08150032162666321,
0.026960117742419243,
0.04683484509587288,
-0.0266488678753376,
0.04783734306693077,
-0.04534382373094559,
0.02096627652645111,
-0.016133341938257217,
0.00932312197983265,
-0.0797056332230568,
-0.05343825742602348,
-0.24043233692646027,
0.04003063961863518,
-0.05506954714655876,
-0.026454521343111992,
-0.0022447267547249794,
-0.0065007940866053104,
-0.007988426834344864,
0.04537542536854744,
-0.032983243465423584,
-0.029606979340314865,
-0.019511720165610313,
0.05739555135369301,
-0.12446462363004684,
0.02449220046401024,
0.048333022743463516,
-0.06595192104578018,
0.09472634643316269,
0.006694660522043705,
-0.053479962050914764,
-0.0017023845575749874,
0.011551488190889359,
-0.04285362735390663,
-0.03772783651947975,
-0.0022404396440833807,
-0.042963460087776184,
-0.10636696964502335,
0.02775484509766102,
0.02604597434401512,
-0.03101223334670067,
-0.02593669667840004,
0.0646645575761795,
-0.0624687485396862,
0.05687594786286354,
0.04896062612533569,
0.012520614080131054,
-0.05227690562605858,
0.00010899382323259488,
0.10192740708589554,
0.06391943991184235,
0.051452312618494034,
-0.04460475593805313,
-0.018256129696965218,
-0.1552014946937561,
0.00038089905865490437,
-0.002833779202774167,
0.0032444496173411608,
-0.06272435933351517,
-0.03317480906844139,
0.03305361792445183,
0.01005571149289608,
0.18146197497844696,
0.013721693307161331,
-0.023424936458468437,
0.015097095631062984,
0.0011913944035768509,
0.00540731567889452,
0.024478822946548462,
0.07499297708272934,
-0.04079895839095116,
-0.08484850823879242,
-0.05993124097585678,
0.008299929089844227,
-0.03427177295088768,
-0.019284071400761604,
0.15667659044265747,
0.1394558697938919,
0.13019494712352753,
0.01947890780866146,
0.017968958243727684,
-0.03138573467731476,
-0.05159508436918259,
-0.01569833979010582,
0.057349566370248795,
0.049103543162345886,
-0.03202894330024719,
0.016062624752521515,
0.0632563903927803,
-0.12340836971998215,
0.12388508766889572,
-0.03774141147732735,
-0.016688961535692215,
-0.10797618329524994,
-0.09445131570100784,
-0.02242226153612137,
-0.007247318979352713,
-0.02075221575796604,
-0.15799933671951294,
0.04193761944770813,
0.0887770727276802,
0.02340633235871792,
-0.039374493062496185,
0.03170923888683319,
-0.14596815407276154,
-0.088050976395607,
0.08015286177396774,
0.01015445776283741,
0.03188752755522728,
0.11181522160768509,
-0.00439812708646059,
0.07766121625900269,
0.12668420374393463,
0.06735698878765106,
0.06425945460796356,
0.06373243778944016,
0.014766184613108635,
-0.02997957170009613,
-0.044504713267087936,
-0.0018454577075317502,
-0.034679483622312546,
0.050420112907886505,
0.18503369390964508,
0.026908833533525467,
-0.05798095092177391,
0.028343861922621727,
0.17194898426532745,
-0.04188565909862518,
-0.04050695151090622,
-0.15409409999847412,
0.2093774974346161,
0.025740252807736397,
0.027688706293702126,
0.042977239936590195,
-0.08528423309326172,
-0.02622617594897747,
0.19051051139831543,
0.13711385428905487,
0.011495843529701233,
-0.018766365945339203,
0.008350012823939323,
-0.010102460160851479,
-0.0003342268755659461,
0.08529553562402725,
0.01997409574687481,
0.2679206132888794,
-0.04257582500576973,
0.027903400361537933,
0.029087286442518234,
0.043787140399217606,
-0.07305760681629181,
0.1374789923429489,
-0.053771380335092545,
0.016501180827617645,
-0.059863705188035965,
0.011357693932950497,
-0.00470147468149662,
-0.2995203137397766,
-0.10841703414916992,
-0.007286398205906153,
-0.05861763283610344,
-0.01875145174562931,
-0.01877177506685257,
0.01754107140004635,
0.04440371319651604,
0.0022838078439235687,
0.020122140645980835,
0.1765022873878479,
-0.006421263795346022,
-0.04063725471496582,
-0.03518343344330788,
0.1322377473115921,
0.005473940633237362,
0.14900687336921692,
0.0615764744579792,
-0.015128782950341702,
0.047549717128276825,
0.014393733814358711,
-0.12329696118831635,
-0.019793208688497543,
-0.0321771577000618,
-0.012959683313965797,
-0.01857110671699047,
0.14784468710422516,
0.008155860006809235,
0.05292763188481331,
0.040561337023973465,
-0.012670314870774746,
0.03634670749306679,
0.059827134013175964,
-0.06515374034643173,
-0.055972788482904434,
0.04858681559562683,
-0.10480109602212906,
0.14813755452632904,
0.18301039934158325,
0.004779448267072439,
0.023759735748171806,
-0.06014746055006981,
-0.009385443292558193,
-0.005723989102989435,
0.10931532829999924,
-0.010708911344408989,
-0.1527652144432068,
-0.007986616343259811,
-0.05772337317466736,
0.05780478194355965,
-0.2407963126897812,
-0.044328827410936356,
0.0885748565196991,
-0.008832438848912716,
-0.017003443092107773,
0.035176511853933334,
0.020474061369895935,
0.05621545389294624,
-0.007357884664088488,
-0.03924616798758507,
0.0028762295842170715,
0.05672295019030571,
-0.08247591555118561,
-0.013685773126780987
] |
null | null |
transformers
|
# MultiBERTs - Seed 10
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #10.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
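As a rough illustration of the NSP objective, the checkpoint can be loaded with `BertForNextSentencePrediction`. This is a sketch only: it assumes the released weights retain the NSP head (if they do not, that head would be newly initialized and the scores meaningless):
```
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_10')
model = BertForNextSentencePrediction.from_pretrained('google/multiberts-seed_10')

# Index 0 = "sentence B follows sentence A", index 1 = "sentence B is random".
encoding = tokenizer("The weather was terrible.", "We stayed inside all day.", return_tensors='pt')
with torch.no_grad():
    probs = model(**encoding).logits.softmax(dim=-1)
print(probs)
```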
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_10')
model = TFBertModel.from_pretrained("google/multiberts-seed_10")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_10')
model = BertModel.from_pretrained("google/multiberts-seed_10")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
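Since MultiBERTs exists to support robustness analysis across seeds, a common pattern is to feed the same input through several seeds and compare the resulting representations. The snippet below is a sketch of that idea, not part of the official release; the use of `pooler_output` and cosine similarity are illustrative choices:
```
import torch
from transformers import BertTokenizer, BertModel

text = "Replace me by any text you'd like."
pooled = []
for checkpoint in ("google/multiberts-seed_1", "google/multiberts-seed_10"):
    tokenizer = BertTokenizer.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint)
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        pooled.append(model(**encoded_input).pooler_output)

# Cosine similarity between the two seeds' pooled outputs (illustrative only).
print(torch.nn.functional.cosine_similarity(pooled[0], pooled[1]).item())
```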
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_10"]}
| null |
google/multiberts-seed_10
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_10",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_10 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 10
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #10.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 10\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #10.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_10 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 10\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #10.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_10 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 10\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #10.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06322506070137024,
0.09174113720655441,
-0.004048601258546114,
0.043335504829883575,
0.07659560441970825,
0.015214581042528152,
0.05550542101264,
0.07390975207090378,
-0.08843372762203217,
0.022225823253393173,
-0.012702255509793758,
-0.045918308198451996,
0.07817411422729492,
-0.043969057500362396,
0.05988605320453644,
-0.235249325633049,
0.050267159938812256,
-0.02929670736193657,
-0.024750199168920517,
0.02766764536499977,
0.11226939409971237,
-0.09617776423692703,
0.07496779412031174,
0.05589091032743454,
0.005725325085222721,
0.01663609966635704,
-0.015967439860105515,
0.004588527604937553,
0.08672135323286057,
0.03109123185276985,
0.08414578437805176,
-0.0024023938458412886,
0.08550011366605759,
-0.1413741111755371,
0.006583259906619787,
0.05950547009706497,
0.06356583535671234,
0.04286767169833183,
0.11713255941867828,
0.007768961135298014,
0.08758905529975891,
0.017527299001812935,
0.05270928516983986,
0.045564841479063034,
-0.07439300417900085,
-0.1698054075241089,
-0.09265998005867004,
0.018610447645187378,
-0.0003221835067961365,
0.0060331388376653194,
-0.007476104889065027,
-0.019082048907876015,
-0.01958136260509491,
0.020504048094153404,
0.12044830620288849,
-0.26612481474876404,
-0.01601102389395237,
0.010344348847866058,
0.05970079451799393,
0.05757338926196098,
-0.03895055130124092,
-0.04179168492555618,
0.0435478538274765,
0.052331406623125076,
0.04103381559252739,
-0.024754570797085762,
0.04074515402317047,
-0.015594826079905033,
-0.15384060144424438,
-0.01866580732166767,
0.1071157306432724,
-0.04870478808879852,
-0.1176147609949112,
-0.04913242906332016,
-0.033056192100048065,
0.12247262895107269,
0.008253728970885277,
-0.0368514284491539,
0.046354006975889206,
0.030147869139909744,
0.06393340229988098,
-0.06310898810625076,
-0.11610817164182663,
0.026278536766767502,
-0.05133584886789322,
0.10732296109199524,
0.09434675425291061,
0.0480581670999527,
-0.007591978181153536,
0.056397728621959686,
-0.08514022082090378,
-0.0778537467122078,
-0.050979986786842346,
-0.08950775116682053,
-0.04096473008394241,
-0.039064254611730576,
-0.08491095155477524,
-0.16543631255626678,
-0.004120900295674801,
0.11054916679859161,
-0.05997007340192795,
0.008723004721105099,
-0.08967137336730957,
-0.02225850336253643,
0.09457267820835114,
0.16219015419483185,
-0.10951252281665802,
0.04652131721377373,
-0.010853665880858898,
0.010690690949559212,
-0.02362576499581337,
0.03246522322297096,
0.011434519663453102,
-0.010374979116022587,
0.051256656646728516,
0.02391880936920643,
-0.0197152029722929,
0.043597541749477386,
-0.020038185641169548,
-0.042630720883607864,
0.05536282807588577,
-0.13409875333309174,
-0.010502766817808151,
0.002018097322434187,
-0.004631276708096266,
0.06154562532901764,
0.06399595737457275,
-0.0272710882127285,
-0.08996222168207169,
0.02300936169922352,
-0.08204150199890137,
-0.04701734334230423,
-0.060847409069538116,
-0.15708832442760468,
0.027893556281924248,
-0.07832516729831696,
-0.04886355996131897,
-0.09254304319620132,
-0.09896880388259888,
-0.02696494571864605,
0.06017166003584862,
-0.016643311828374863,
0.03830542787909508,
0.029353244230151176,
-0.008016638457775116,
-0.0425407737493515,
0.046947140246629715,
0.007840152829885483,
-0.014991219155490398,
0.007447449024766684,
-0.04484831914305687,
0.054699547588825226,
-0.009416096843779087,
0.04439103603363037,
-0.07106049358844757,
0.02108851633965969,
-0.14200028777122498,
0.06128165125846863,
-0.09725315868854523,
-0.08272701501846313,
-0.05005913972854614,
-0.04199482128024101,
-0.07328233122825623,
0.030705686658620834,
0.009430287405848503,
0.06140736863017082,
-0.15035957098007202,
-0.04948077350854874,
0.13877318799495697,
-0.1356799155473709,
0.035243675112724304,
0.09493588656187057,
-0.05148620158433914,
0.04485888034105301,
0.11745206266641617,
0.05826577544212341,
0.07126577198505402,
-0.04809289798140526,
-0.015633007511496544,
0.007936752401292324,
0.03468208387494087,
0.14490626752376556,
0.065992072224617,
-0.06860600411891937,
-0.08092264086008072,
0.035807181149721146,
-0.07300881296396255,
-0.044265151023864746,
-0.05891473591327667,
-0.004860282875597477,
-0.009943909011781216,
-0.05631043389439583,
-0.006411614827811718,
-0.024811377748847008,
-0.012574859894812107,
-0.017343709245324135,
-0.05151290446519852,
0.05198025330901146,
0.06239673122763634,
-0.08809391409158707,
0.0568259172141552,
-0.05504538118839264,
0.0167847853153944,
-0.0799529105424881,
-0.0013032618444412947,
-0.17943494021892548,
0.009229473769664764,
0.11154219508171082,
-0.10114341974258423,
0.05135909840464592,
0.16276097297668457,
0.02216608263552189,
0.06790108978748322,
-0.051845990121364594,
0.07117560505867004,
0.006185060832649469,
-0.024483729153871536,
-0.04613850265741348,
-0.11740141361951828,
-0.06488862633705139,
-0.06127234175801277,
0.009813474491238594,
-0.08413729071617126,
-0.004858918488025665,
-0.03882201015949249,
0.021656077355146408,
0.023168861865997314,
-0.06345321983098984,
0.020647983998060226,
0.024392353370785713,
-0.03867628425359726,
-0.027630606666207314,
-0.026234934106469154,
0.043972715735435486,
0.015562986955046654,
0.1151781752705574,
-0.09532198309898376,
-0.06955105066299438,
0.04656898230314255,
0.05344432219862938,
-0.05185128003358841,
0.09239886701107025,
-0.05458436906337738,
-0.033659789711236954,
-0.09850408136844635,
-0.09897046536207199,
0.17226232588291168,
-0.005398591514676809,
0.09892573952674866,
-0.09676072746515274,
-0.026520291343331337,
-0.00030744331888854504,
-0.007912064902484417,
-0.0033791738096624613,
0.051286906003952026,
0.013176820240914822,
-0.09498540312051773,
-0.002897566882893443,
0.01639293134212494,
0.018550865352153778,
0.07674846798181534,
-0.019833041355013847,
-0.11481700092554092,
0.030662016943097115,
-0.003016676288098097,
-0.00662624929100275,
0.06547226756811142,
-0.04994908347725868,
-0.005410891491919756,
0.05530919134616852,
0.05604822188615799,
0.055869076400995255,
-0.0657874271273613,
0.09630269557237625,
0.06569962948560715,
-0.04299221932888031,
-0.0444013737142086,
-0.08467866480350494,
0.010666027665138245,
0.11539581418037415,
0.025229820981621742,
0.05959715321660042,
-0.04792249575257301,
-0.023632919415831566,
-0.10270882397890091,
0.15686942636966705,
-0.08804676681756973,
-0.15959763526916504,
-0.15103405714035034,
0.006036451086401939,
-0.05508400872349739,
0.06283114850521088,
0.01579388789832592,
-0.04983699321746826,
-0.09778529405593872,
-0.07863842695951462,
0.15956661105155945,
-0.03907047212123871,
-0.006904273759573698,
0.01826944202184677,
-0.028565028682351112,
0.036230262368917465,
-0.18219469487667084,
-0.0006273916224017739,
-0.0400182269513607,
-0.12552864849567413,
-0.03890954703092575,
-0.00008248489757534117,
0.0684814527630806,
0.0716877356171608,
-0.03805211931467056,
-0.07654326409101486,
0.01835041120648384,
0.16378802061080933,
0.03302193433046341,
0.07891213893890381,
0.09339571744203568,
-0.09873245656490326,
0.04290273040533066,
0.04744420945644379,
0.03041565604507923,
-0.013363292440772057,
0.008471113629639149,
0.056862302124500275,
-0.02521052025258541,
-0.2849282920360565,
-0.008691737428307533,
-0.018809176981449127,
-0.017679093405604362,
0.06612932682037354,
0.041780970990657806,
-0.08447550237178802,
0.049491290003061295,
-0.05822381377220154,
0.03295161947607994,
0.08731873333454132,
0.04500832036137581,
0.09439681470394135,
-0.0391080379486084,
0.09279010444879532,
-0.053997837007045746,
-0.018470358103513718,
0.1077679917216301,
-0.05029211565852165,
0.19878096878528595,
-0.05561727285385132,
0.05344393849372864,
0.09753405302762985,
-0.013631826266646385,
0.03794807940721512,
0.13919299840927124,
-0.05209625884890556,
0.07000940293073654,
-0.057865552604198456,
-0.04505614936351776,
-0.03709052875638008,
0.024120716378092766,
-0.0003452529781498015,
0.03665246069431305,
-0.03588353842496872,
-0.01741243153810501,
-0.0035338178277015686,
0.2386191487312317,
0.06905992329120636,
-0.12348481267690659,
-0.0679527297616005,
0.007557661738246679,
-0.10818696767091751,
-0.07061556726694107,
0.05069620534777641,
0.08999186009168625,
-0.08286196738481522,
0.0476316474378109,
0.009877986274659634,
0.06811892986297607,
-0.1268015056848526,
0.021108124405145645,
0.03955555334687233,
0.05067382752895355,
-0.026000017300248146,
0.033899713307619095,
-0.15626871585845947,
0.08158814162015915,
0.03612980991601944,
0.053734924644231796,
-0.052238114178180695,
0.06394335627555847,
0.020805224776268005,
-0.01387846190482378,
0.025724949315190315,
0.010435418225824833,
-0.01999608986079693,
-0.02727343514561653,
-0.06753978878259659,
0.08373541384935379,
0.07632866501808167,
-0.05155598372220993,
0.12030832469463348,
-0.04960956424474716,
0.011656759306788445,
-0.009791303426027298,
0.07609449326992035,
-0.1733090877532959,
-0.13081836700439453,
0.045331742614507675,
-0.14239490032196045,
-0.02431354857981205,
-0.06748553365468979,
-0.05431361868977547,
-0.0704788863658905,
0.16644537448883057,
-0.12321074306964874,
-0.13345494866371155,
-0.08484503626823425,
-0.012532966211438179,
0.1540737897157669,
-0.030357131734490395,
0.008648176677525043,
-0.01722516492009163,
0.13267838954925537,
-0.036673467606306076,
-0.15224160254001617,
-0.0494924858212471,
-0.07032603770494461,
-0.15106512606143951,
-0.03363121673464775,
0.07120456546545029,
0.10965552181005478,
0.05186730995774269,
0.005996332503855228,
0.026118852198123932,
0.0030707684345543385,
-0.05181609094142914,
-0.01661057025194168,
0.18107768893241882,
0.053263451904058456,
0.07126540690660477,
-0.16113460063934326,
-0.05872053653001785,
-0.05016527697443962,
0.024407414719462395,
-0.04592791572213173,
0.09630598872900009,
-0.02930559776723385,
0.078638955950737,
0.24151834845542908,
-0.12954625487327576,
-0.2033366560935974,
0.008425623178482056,
0.029366882517933846,
0.004124576225876808,
0.0072677829302847385,
-0.22431804239749908,
0.12196911126375198,
0.08843695372343063,
0.0003058300353586674,
-0.005966021213680506,
-0.18554987013339996,
-0.08196385204792023,
0.08085809648036957,
0.008646718226373196,
0.14406608045101166,
-0.09235380589962006,
-0.031363096088171005,
0.007620699238032103,
-0.08569896966218948,
0.053395826369524,
0.046653881669044495,
0.08347053080797195,
-0.00019835247076116502,
-0.07610221207141876,
0.05015759542584419,
-0.014639495871961117,
0.08573739230632782,
0.04624957591295242,
0.0469193160533905,
-0.0341867096722126,
0.1300317943096161,
0.004062834195792675,
-0.01673794724047184,
0.13796499371528625,
0.11445120722055435,
0.05613938346505165,
-0.02398858405649662,
-0.06237171217799187,
-0.07290803641080856,
0.011515132151544094,
-0.02132582850754261,
-0.0392625592648983,
-0.064235158264637,
0.03971431404352188,
0.06363067030906677,
0.0004780783492606133,
-0.04286296293139458,
-0.024920573458075523,
0.05781371518969536,
0.08880883455276489,
0.19402927160263062,
-0.05407555028796196,
-0.005993062164634466,
-0.01859854720532894,
-0.0227491594851017,
0.06865715235471725,
-0.018909843638539314,
0.06501425057649612,
0.08970416337251663,
0.009301438927650452,
0.08209032565355301,
0.06258852034807205,
-0.13104602694511414,
-0.022642994299530983,
0.05446542054414749,
-0.10044267773628235,
-0.13716216385364532,
-0.026938708499073982,
-0.10776619613170624,
-0.1350208818912506,
0.0005237340228632092,
0.17033207416534424,
-0.03615257143974304,
-0.04704214632511139,
-0.015857907012104988,
0.07970304787158966,
0.020011045038700104,
0.13173651695251465,
0.03392743691802025,
-0.015590298920869827,
-0.06237319856882095,
0.172303706407547,
0.08917050808668137,
-0.09474498778581619,
0.009741827845573425,
0.0154733806848526,
-0.060089386999607086,
-0.004796861670911312,
-0.06524805724620819,
0.07539217174053192,
-0.026360992342233658,
-0.038994476199150085,
0.0012501691235229373,
-0.10031566023826599,
0.050062086433172226,
0.15000838041305542,
0.006976671051234007,
0.15837009251117706,
-0.0378437377512455,
0.0634286031126976,
-0.07538360357284546,
0.07356801629066467,
0.054134830832481384,
0.07727273553609848,
-0.017469901591539383,
0.048791419714689255,
-0.04620033875107765,
-0.0014485918218269944,
-0.014455853961408138,
0.0015557287260890007,
-0.08961386978626251,
-0.05619846284389496,
-0.22298754751682281,
0.026011737063527107,
-0.0575200617313385,
-0.03610248118638992,
0.010257191024720669,
-0.014337997883558273,
0.003929599188268185,
0.03599891811609268,
-0.025779251009225845,
-0.032621074467897415,
-0.02661008946597576,
0.06218666955828667,
-0.12245040386915207,
0.026864225044846535,
0.06662517786026001,
-0.0874861478805542,
0.07708295434713364,
-0.001311355852521956,
-0.05267658457159996,
-0.0008731425041332841,
0.012213684618473053,
-0.04733821377158165,
-0.029845954850316048,
0.008655395358800888,
-0.05320031940937042,
-0.11008869856595993,
0.027773790061473846,
0.011634363792836666,
-0.02652656100690365,
-0.030056608840823174,
0.0754687488079071,
-0.06475573778152466,
0.05207786336541176,
0.03707381710410118,
0.005545913707464933,
-0.043397318571805954,
-0.015862511470913887,
0.1187080666422844,
0.0756230428814888,
0.05691184476017952,
-0.05059358477592468,
-0.018780646845698357,
-0.15511901676654816,
-0.0016493354924023151,
-0.0026078689843416214,
-0.004276896361261606,
-0.03977561369538307,
-0.038008686155080795,
0.03105756640434265,
0.01137010008096695,
0.17860351502895355,
0.008213015273213387,
0.014815017580986023,
0.009174621663987637,
0.006560756824910641,
0.008651242591440678,
0.03461039811372757,
0.07233867794275284,
-0.01477511040866375,
-0.07851631939411163,
-0.07978300005197525,
0.034127529710531235,
-0.030589276924729347,
-0.02135731466114521,
0.13355903327465057,
0.13514399528503418,
0.10379386693239212,
0.0224537905305624,
0.00003498850855976343,
-0.029847024008631706,
-0.02827252447605133,
0.02344859205186367,
0.05561526119709015,
0.05122966691851616,
-0.014038842171430588,
0.014436982572078705,
0.0663062036037445,
-0.12493041157722473,
0.1204276755452156,
-0.03509027883410454,
-0.028500394895672798,
-0.1059139147400856,
-0.07910099625587463,
-0.018426913768053055,
-0.019004974514245987,
-0.02040729857981205,
-0.1648663729429245,
0.04722918942570686,
0.10559777170419693,
0.022006066516041756,
-0.031944796442985535,
0.03748440369963646,
-0.15060749650001526,
-0.09024322777986526,
0.0652168020606041,
0.0134740574285388,
0.043582484126091,
0.10970138013362885,
-0.014015305787324905,
0.07970104366540909,
0.14203885197639465,
0.061836645007133484,
0.0544486828148365,
0.08278178423643112,
0.009519546292722225,
-0.02033570036292076,
-0.039635758846998215,
0.00788381788879633,
-0.062082912772893906,
0.038299571722745895,
0.16092270612716675,
0.03217208757996559,
-0.05065644904971123,
0.035200681537389755,
0.1815207153558731,
-0.03620228171348572,
-0.05307180806994438,
-0.1777987778186798,
0.2067660242319107,
0.02439132332801819,
0.04271338880062103,
0.05047661066055298,
-0.09017660468816757,
-0.034993354231119156,
0.20772075653076172,
0.10828644037246704,
0.024255774915218353,
-0.023362072184681892,
0.018584825098514557,
-0.010538785718381405,
0.008708945475518703,
0.0826885998249054,
0.005695987027138472,
0.21766507625579834,
-0.03723457455635071,
0.01888377033174038,
0.03315127268433571,
0.0345543697476387,
-0.07282059639692307,
0.15236181020736694,
-0.047476477921009064,
-0.0029500171076506376,
-0.055542632937431335,
0.022002123296260834,
0.021254252642393112,
-0.30658891797065735,
-0.11189386248588562,
0.0014775391900911927,
-0.06284192204475403,
-0.017612561583518982,
-0.02101985737681389,
-0.004662093706429005,
0.051795843988657,
-0.003646374447271228,
0.0314580537378788,
0.1832907497882843,
-0.005590432323515415,
-0.037939514964818954,
-0.04065033793449402,
0.1214689314365387,
0.018366074189543724,
0.1380757987499237,
0.059000369161367416,
-0.015449462458491325,
0.04427207633852959,
0.02313285879790783,
-0.11248646676540375,
-0.033322859555482864,
-0.026238663122057915,
-0.009019170887768269,
-0.02226053550839424,
0.1328316032886505,
0.020253220573067665,
0.04649367928504944,
0.03800661489367485,
-0.04348166659474373,
0.05146971717476845,
0.04233190044760704,
-0.06962811201810837,
-0.052582092583179474,
0.044220566749572754,
-0.09612362086772919,
0.14127539098262787,
0.1759064793586731,
0.012283059768378735,
0.02553882636129856,
-0.056376710534095764,
-0.009539061225950718,
0.0014507521409541368,
0.11433056741952896,
-0.00433950824663043,
-0.15771818161010742,
-0.014052710495889187,
-0.08864516764879227,
0.040081173181533813,
-0.22120194137096405,
-0.044231973588466644,
0.10146907716989517,
-0.011465382762253284,
-0.010028477758169174,
0.0485471673309803,
0.005666384939104319,
0.0598050020635128,
-0.016002951189875603,
-0.04744889959692955,
0.009596533142030239,
0.06783121079206467,
-0.0838351845741272,
-0.03208443149924278
] |
null | null |
transformers
|
# MultiBERTs - Seed 11
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #11.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_11')
model = TFBertModel.from_pretrained("google/multiberts-seed_11")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_11')
model = BertModel.from_pretrained("google/multiberts-seed_11")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
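The card above describes pre-training with the Masked Language Modelling (MLM) objective. As an illustration only (this snippet is not part of the original MultiBERTs release), the checkpoint can be queried through the `fill-mask` pipeline, assuming the masked-language-modelling head loads as it does for standard BERT checkpoints:
```
from transformers import pipeline

# Hedged usage sketch: predict the most likely tokens for a masked position.
unmasker = pipeline("fill-mask", model="google/multiberts-seed_11")
for prediction in unmasker("The capital of France is [MASK]."):
    # Each prediction holds the candidate token and its probability score.
    print(prediction["token_str"], round(prediction["score"], 4))
```
The top predictions give a quick sanity check that the MLM head was restored correctly.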
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_11"]}
| null |
google/multiberts-seed_11
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_11",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_11 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 11
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #11.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 11\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #11.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_11 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 11\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #11.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_11 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 11\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #11.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06299635767936707,
0.09133198857307434,
-0.0040674046613276005,
0.0445249080657959,
0.077020063996315,
0.015037116594612598,
0.05515846237540245,
0.07398531585931778,
-0.08837822079658508,
0.02234802022576332,
-0.012453993782401085,
-0.046223726123571396,
0.07701954245567322,
-0.0435037799179554,
0.05980313569307327,
-0.2354651540517807,
0.04938885569572449,
-0.030583756044507027,
-0.02473222278058529,
0.027308320626616478,
0.11222989857196808,
-0.09530767053365707,
0.07461190968751907,
0.05581982433795929,
0.006439489778131247,
0.01680617593228817,
-0.016692524775862694,
0.004405578598380089,
0.08679269254207611,
0.03203342482447624,
0.08438586443662643,
-0.003001917153596878,
0.08489593118429184,
-0.14203684031963348,
0.006426720879971981,
0.059106092900037766,
0.06348103284835815,
0.04285989701747894,
0.11699627339839935,
0.007546754088252783,
0.0860539898276329,
0.017584290355443954,
0.05264539271593094,
0.045195188373327255,
-0.07470832765102386,
-0.1702466607093811,
-0.09298931062221527,
0.019576428458094597,
-0.0003447743656579405,
0.005856925621628761,
-0.007200018968433142,
-0.018971402198076248,
-0.018794525414705276,
0.02048133872449398,
0.12004490196704865,
-0.2660486698150635,
-0.016418375074863434,
0.008138765580952168,
0.05799055099487305,
0.05705466493964195,
-0.03873562440276146,
-0.04195787012577057,
0.04347138851881027,
0.052225492894649506,
0.039916835725307465,
-0.024698670953512192,
0.03837941586971283,
-0.015636533498764038,
-0.15364493429660797,
-0.018699949607253075,
0.10724122077226639,
-0.049018558114767075,
-0.11742454767227173,
-0.048247311264276505,
-0.03341922163963318,
0.12194675207138062,
0.009097610600292683,
-0.036522846668958664,
0.046354006975889206,
0.030122362077236176,
0.065363809466362,
-0.06287568062543869,
-0.11630389094352722,
0.02640273980796337,
-0.05188526585698128,
0.10861605405807495,
0.09397043287754059,
0.04849214851856232,
-0.006907268892973661,
0.05695994570851326,
-0.08447401225566864,
-0.0779951885342598,
-0.050874415785074234,
-0.08904910087585449,
-0.04027349874377251,
-0.039093755185604095,
-0.08427738398313522,
-0.16468040645122528,
-0.00497404346242547,
0.11136708408594131,
-0.060766544193029404,
0.008597185835242271,
-0.09026982635259628,
-0.021264955401420593,
0.09462281316518784,
0.1618431657552719,
-0.1093510314822197,
0.04797346144914627,
-0.010575554333627224,
0.011072826571762562,
-0.023174697533249855,
0.032525788992643356,
0.012183517217636108,
-0.009637798182666302,
0.05081271380186081,
0.024352287873625755,
-0.019564341753721237,
0.04306052625179291,
-0.021160777658224106,
-0.04220827296376228,
0.05560417100787163,
-0.13448165357112885,
-0.010703184641897678,
0.0017749950056895614,
-0.00494892755523324,
0.06118190288543701,
0.06530167162418365,
-0.027499670162796974,
-0.08958984166383743,
0.023144787177443504,
-0.0820709615945816,
-0.04644046723842621,
-0.060885656625032425,
-0.15629208087921143,
0.02796229161322117,
-0.07526688277721405,
-0.04873399809002876,
-0.09295869618654251,
-0.09919745475053787,
-0.02704857662320137,
0.06107074022293091,
-0.016669737175107002,
0.038268618285655975,
0.029805386438965797,
-0.007823733612895012,
-0.042124297469854355,
0.04632191359996796,
0.0076198638416826725,
-0.014881099574267864,
0.007365234196186066,
-0.044771771878004074,
0.05422930046916008,
-0.009358099661767483,
0.044304702430963516,
-0.07122549414634705,
0.021243518218398094,
-0.14369963109493256,
0.06158619746565819,
-0.09818309545516968,
-0.08253249526023865,
-0.050509821623563766,
-0.04170751944184303,
-0.07258150726556778,
0.03055294044315815,
0.009694465436041355,
0.06127403303980827,
-0.14808475971221924,
-0.048645976930856705,
0.1371030956506729,
-0.13613323867321014,
0.035110510885715485,
0.09574324637651443,
-0.051409848034381866,
0.04475629702210426,
0.11668768525123596,
0.057642944157123566,
0.07151318341493607,
-0.04707418382167816,
-0.015687182545661926,
0.008360766805708408,
0.03484909236431122,
0.14567016065120697,
0.0665603056550026,
-0.06805045157670975,
-0.08171362429857254,
0.03557464852929115,
-0.07397346198558807,
-0.04401980713009834,
-0.05901498720049858,
-0.005271486006677151,
-0.009561645798385143,
-0.056319210678339005,
-0.0058896797709167,
-0.024196168407797813,
-0.012441597878932953,
-0.017677318304777145,
-0.05207565054297447,
0.051438793540000916,
0.06231389194726944,
-0.0871417373418808,
0.05674760416150093,
-0.05487009882926941,
0.01634572073817253,
-0.07979664206504822,
-0.0016884013311937451,
-0.17915445566177368,
0.007250793278217316,
0.11058919876813889,
-0.09946973621845245,
0.05157845467329025,
0.1627293974161148,
0.021764183416962624,
0.06776949018239975,
-0.051956962794065475,
0.07074058055877686,
0.006135967560112476,
-0.02481238543987274,
-0.045880142599344254,
-0.11734423786401749,
-0.06488590687513351,
-0.06152714788913727,
0.010508127510547638,
-0.0834551528096199,
-0.005265239160507917,
-0.038821443915367126,
0.02059648185968399,
0.022944362834095955,
-0.06302579492330551,
0.020824385806918144,
0.024483514949679375,
-0.03835810348391533,
-0.027782458811998367,
-0.025895731523633003,
0.04426243528723717,
0.015803443267941475,
0.11613671481609344,
-0.09457642585039139,
-0.06807874143123627,
0.045966360718011856,
0.05290643125772476,
-0.05226671323180199,
0.0920468270778656,
-0.05458427965641022,
-0.034125782549381256,
-0.09926632046699524,
-0.09883283823728561,
0.17533031105995178,
-0.005457029212266207,
0.09875863045454025,
-0.09649807959794998,
-0.026432866230607033,
-0.0007002092315815389,
-0.008601908572018147,
-0.003310540923848748,
0.05187780782580376,
0.012946066446602345,
-0.09666957706212997,
-0.0027016105595976114,
0.015045720152556896,
0.017876887694001198,
0.07665883004665375,
-0.019379114732146263,
-0.11435429751873016,
0.030789054930210114,
-0.0019229311728850007,
-0.0063005415722727776,
0.06521543115377426,
-0.04939878731966019,
-0.005001995246857405,
0.055005405098199844,
0.056014738976955414,
0.056046515703201294,
-0.06613694876432419,
0.09584707021713257,
0.06559164822101593,
-0.04227091372013092,
-0.04389563202857971,
-0.0852089449763298,
0.011246435344219208,
0.11536962538957596,
0.02500089816749096,
0.059752367436885834,
-0.04756105691194534,
-0.02344026230275631,
-0.10309542715549469,
0.1567694991827011,
-0.08773305267095566,
-0.15912766754627228,
-0.15242643654346466,
0.006087581627070904,
-0.05463506281375885,
0.06251192837953568,
0.016396837309002876,
-0.04922083020210266,
-0.09801626205444336,
-0.07833285629749298,
0.1601964384317398,
-0.0390503853559494,
-0.007697375025600195,
0.019520100206136703,
-0.029098056256771088,
0.03615742549300194,
-0.18204350769519806,
-0.0006559108733199537,
-0.039444535970687866,
-0.12602704763412476,
-0.0386066772043705,
0.00019959193014074117,
0.06762425601482391,
0.07127829641103745,
-0.038450855761766434,
-0.07713008671998978,
0.018282713368535042,
0.16315993666648865,
0.03367091342806816,
0.07857631146907806,
0.09261628985404968,
-0.09975609928369522,
0.04329114407300949,
0.04671165347099304,
0.0304678063839674,
-0.013491639867424965,
0.009354141540825367,
0.057201214134693146,
-0.025622952729463577,
-0.2861534655094147,
-0.008630488067865372,
-0.018642952665686607,
-0.016688527539372444,
0.06631208211183548,
0.04188244044780731,
-0.08690807968378067,
0.04960674047470093,
-0.05799746885895729,
0.033203236758708954,
0.08687899261713028,
0.044723622500896454,
0.09516942501068115,
-0.039215490221977234,
0.09235337376594543,
-0.05373537167906761,
-0.018425917252898216,
0.10792996734380722,
-0.04950011521577835,
0.19857728481292725,
-0.056257214397192,
0.05235714465379715,
0.09736612439155579,
-0.01405317336320877,
0.03758024796843529,
0.13966819643974304,
-0.05302473157644272,
0.0703834518790245,
-0.058023955672979355,
-0.044793032109737396,
-0.038026489317417145,
0.025003451853990555,
-0.001007283921353519,
0.03710479661822319,
-0.03510480746626854,
-0.016075147315859795,
-0.0037745677400380373,
0.23882649838924408,
0.06895464658737183,
-0.12329771369695663,
-0.06784576922655106,
0.00773588428273797,
-0.1088763102889061,
-0.07039423286914825,
0.050860434770584106,
0.09080953150987625,
-0.08262933045625687,
0.04746415093541145,
0.009872421622276306,
0.06782316416501999,
-0.12694647908210754,
0.020259998738765717,
0.04017634689807892,
0.04944946616888046,
-0.025674859061837196,
0.03409562259912491,
-0.1542111486196518,
0.08277000486850739,
0.036252427846193314,
0.053373437374830246,
-0.052264727652072906,
0.06468762457370758,
0.021878356114029884,
-0.014649460092186928,
0.025682931765913963,
0.010736740194261074,
-0.019793549552559853,
-0.0271003358066082,
-0.06827083975076675,
0.08359237760305405,
0.07531816512346268,
-0.051745157688856125,
0.12037596106529236,
-0.049660686403512955,
0.01223017368465662,
-0.009874660521745682,
0.0778069868683815,
-0.17255602777004242,
-0.13085505366325378,
0.04524964094161987,
-0.14230120182037354,
-0.024281589314341545,
-0.06759500503540039,
-0.0548616386950016,
-0.06986132264137268,
0.16574320197105408,
-0.12221100181341171,
-0.13381443917751312,
-0.08455432206392288,
-0.012037647888064384,
0.15411870181560516,
-0.03041318617761135,
0.008208523504436016,
-0.0167896319180727,
0.13139936327934265,
-0.037071019411087036,
-0.15157456696033478,
-0.048612404614686966,
-0.0703129917383194,
-0.15091454982757568,
-0.033811427652835846,
0.07049214094877243,
0.10942280292510986,
0.05205746740102768,
0.005377411376684904,
0.02621987648308277,
0.0031506670638918877,
-0.052269600331783295,
-0.016259975731372833,
0.1798662394285202,
0.05294094979763031,
0.07184865325689316,
-0.16173957288265228,
-0.05849205330014229,
-0.050265513360500336,
0.023856373503804207,
-0.04548964649438858,
0.09716495126485825,
-0.029185613617300987,
0.07766290754079819,
0.2416228950023651,
-0.12977921962738037,
-0.2031843066215515,
0.00781300663948059,
0.030011039227247238,
0.004184942226856947,
0.008074019104242325,
-0.22462885081768036,
0.12238411605358124,
0.0895780399441719,
-0.0001647894678171724,
-0.0044245910830795765,
-0.18401776254177094,
-0.08161561191082001,
0.08088661730289459,
0.008199874311685562,
0.14548999071121216,
-0.09284953027963638,
-0.03126176819205284,
0.008625165559351444,
-0.08629810810089111,
0.05165312811732292,
0.047360196709632874,
0.08405382931232452,
-0.00004959596481057815,
-0.07516449689865112,
0.049939852207899094,
-0.014264317229390144,
0.08608245104551315,
0.04610702022910118,
0.047158047556877136,
-0.03417380526661873,
0.1298854798078537,
0.0036387559957802296,
-0.017232313752174377,
0.13861627876758575,
0.11464564502239227,
0.05661465600132942,
-0.026419896632432938,
-0.06232431158423424,
-0.07240356504917145,
0.01150634977966547,
-0.021392205730080605,
-0.039025578647851944,
-0.06411668658256531,
0.03959457203745842,
0.0636015459895134,
0.0005510873743332922,
-0.04288829118013382,
-0.0245414599776268,
0.05805114656686783,
0.08991441875696182,
0.1937505304813385,
-0.0531441755592823,
-0.005860095843672752,
-0.018930360674858093,
-0.022791555151343346,
0.06896606832742691,
-0.019521070644259453,
0.06512448191642761,
0.09024155139923096,
0.00950587261468172,
0.08196794241666794,
0.06265593320131302,
-0.13077501952648163,
-0.02164117991924286,
0.054076533764600754,
-0.1003364846110344,
-0.13700084388256073,
-0.027154654264450073,
-0.10493113100528717,
-0.13518208265304565,
-0.0007355277775786817,
0.1709124892950058,
-0.03688975051045418,
-0.04725611209869385,
-0.015514067374169827,
0.07968170940876007,
0.019752515479922295,
0.13153953850269318,
0.033841822296381,
-0.015356672927737236,
-0.061992187052965164,
0.17238306999206543,
0.08860268443822861,
-0.09457409381866455,
0.009981208480894566,
0.016731690615415573,
-0.05900140851736069,
-0.005236501339823008,
-0.06495256721973419,
0.07542674988508224,
-0.027214085683226585,
-0.03862173855304718,
0.00143899186514318,
-0.10006337612867355,
0.05004701763391495,
0.15107622742652893,
0.006907746661454439,
0.15785324573516846,
-0.03765663132071495,
0.06356090307235718,
-0.07480115443468094,
0.07360807806253433,
0.0539335273206234,
0.07720254361629486,
-0.017395347356796265,
0.04707767441868782,
-0.04628598317503929,
-0.0017740584444254637,
-0.014481812715530396,
0.0014095297083258629,
-0.08927912265062332,
-0.05572598800063133,
-0.22129380702972412,
0.02698293700814247,
-0.0569438636302948,
-0.03531162813305855,
0.01065833494067192,
-0.01415078155696392,
0.003629262326285243,
0.036567553877830505,
-0.026020482182502747,
-0.032498884946107864,
-0.02641531080007553,
0.06224633753299713,
-0.12256614863872528,
0.026105675846338272,
0.06681127846240997,
-0.0874238908290863,
0.07718536257743835,
-0.0011361419456079602,
-0.052828337997198105,
-0.0011349195847287774,
0.012051288038492203,
-0.047089025378227234,
-0.029753556475043297,
0.00846536923199892,
-0.053168173879384995,
-0.11123675107955933,
0.02756212092936039,
0.011022989638149738,
-0.026232579723000526,
-0.030174620449543,
0.07597294449806213,
-0.0649666041135788,
0.051782477647066116,
0.03745217248797417,
0.004491888917982578,
-0.04342878982424736,
-0.0163620263338089,
0.11937325447797775,
0.07520383596420288,
0.05687593296170235,
-0.050697892904281616,
-0.018675709143280983,
-0.1555122435092926,
-0.001497566350735724,
-0.0022791465744376183,
-0.004144100937992334,
-0.03927573934197426,
-0.037263624370098114,
0.0303399208933115,
0.01142104808241129,
0.17810653150081635,
0.007996966131031513,
0.013403801247477531,
0.009468987584114075,
0.0047941794618964195,
0.0074680098332464695,
0.03420090302824974,
0.07064926624298096,
-0.01495487429201603,
-0.07787398993968964,
-0.08001557737588882,
0.03349893167614937,
-0.03083333745598793,
-0.02161320112645626,
0.13402128219604492,
0.13504676520824432,
0.10592681914567947,
0.02310602180659771,
0.0002488495083525777,
-0.03017609938979149,
-0.02967326156795025,
0.02142457850277424,
0.055986564606428146,
0.051149893552064896,
-0.013973414897918701,
0.012383569963276386,
0.06671962141990662,
-0.1245855763554573,
0.12042801827192307,
-0.03586883097887039,
-0.027952907606959343,
-0.10592999309301376,
-0.07728009670972824,
-0.01819826290011406,
-0.019522512331604958,
-0.02022777684032917,
-0.16483573615550995,
0.047448016703128815,
0.10754705220460892,
0.021748580038547516,
-0.031251076608896255,
0.03746340051293373,
-0.1504615694284439,
-0.09032067656517029,
0.06520910561084747,
0.01307899784296751,
0.04433966055512428,
0.1090523824095726,
-0.01473289541900158,
0.07940112799406052,
0.142080619931221,
0.061607833951711655,
0.05494321882724762,
0.08303780853748322,
0.009545596316456795,
-0.020043691620230675,
-0.04016178101301193,
0.007146854884922504,
-0.0620696023106575,
0.03757033497095108,
0.16035975515842438,
0.0316411517560482,
-0.0496402382850647,
0.03548010438680649,
0.1805681735277176,
-0.03582454472780228,
-0.052063390612602234,
-0.17853771150112152,
0.20730827748775482,
0.023473074659705162,
0.04292583093047142,
0.050065163522958755,
-0.08988314121961594,
-0.0347423292696476,
0.20510323345661163,
0.10686836391687393,
0.023661574348807335,
-0.023425742983818054,
0.018363598734140396,
-0.010387265123426914,
0.008664129301905632,
0.08277034759521484,
0.006097974255681038,
0.21768634021282196,
-0.0381074883043766,
0.01780349761247635,
0.03265341371297836,
0.03491753339767456,
-0.07242106646299362,
0.15107464790344238,
-0.04814797639846802,
-0.001961051020771265,
-0.05606343224644661,
0.02148408256471157,
0.02087894268333912,
-0.3057970106601715,
-0.11028388142585754,
0.0007941905059851706,
-0.06375909596681595,
-0.017521008849143982,
-0.020198985934257507,
-0.005079295486211777,
0.052847545593976974,
-0.003658892586827278,
0.03135604411363602,
0.1828632354736328,
-0.005564791616052389,
-0.03798985108733177,
-0.040707241743803024,
0.12088452279567719,
0.019751356914639473,
0.13918638229370117,
0.059022579342126846,
-0.01611841656267643,
0.044299643486738205,
0.022995272651314735,
-0.11319872736930847,
-0.03379449620842934,
-0.02630789205431938,
-0.009284025058150291,
-0.022790584713220596,
0.13240566849708557,
0.02007623203098774,
0.04862054064869881,
0.037946686148643494,
-0.04379262030124664,
0.050853826105594635,
0.0443253330886364,
-0.06882661581039429,
-0.05269736424088478,
0.0433342270553112,
-0.09665905684232712,
0.14126724004745483,
0.17613814771175385,
0.012702352367341518,
0.025252975523471832,
-0.055727217346429825,
-0.009599579498171806,
0.0008745574159547687,
0.1157233864068985,
-0.004223823547363281,
-0.1579671949148178,
-0.014056317508220673,
-0.08774463832378387,
0.0403410866856575,
-0.22232159972190857,
-0.04456331208348274,
0.10161338001489639,
-0.011322174221277237,
-0.010101103223860264,
0.04853587597608566,
0.005838731769472361,
0.059670332819223404,
-0.015700647607445717,
-0.04952621832489967,
0.00964309647679329,
0.06727728992700577,
-0.08347780257463455,
-0.03211941570043564
] |
null | null |
transformers
|
# MultiBERTs - Seed 12
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #12.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_12')
model = TFBertModel.from_pretrained("google/multiberts-seed_12")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_12')
model = BertModel.from_pretrained("google/multiberts-seed_12")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
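Beyond the loading snippets above, one common follow-up is turning the encoder output into fixed-size sentence vectors. The sketch below is a hedged example, not part of the original card: mean pooling over `last_hidden_state` is an arbitrary choice made here for illustration, and the sentence strings are placeholders.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_12")
model = BertModel.from_pretrained("google/multiberts-seed_12")
model.eval()

sentences = ["Replace me by any text you'd like.", "Any other sentence works too."]
encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**encoded)

# Mean-pool token embeddings, masking out padding positions.
mask = encoded["attention_mask"].unsqueeze(-1).float()
sentence_vectors = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_vectors.shape)  # torch.Size([2, 768])
```
Using `pooler_output` instead of mean pooling is another option; it reflects the transform applied to the [CLS] token rather than an average over all tokens.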
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_12"]}
| null |
google/multiberts-seed_12
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_12",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_12 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 12
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #12.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 12\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #12.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_12 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 12\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #12.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_12 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 12\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #12.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06307473033666611,
0.09311683475971222,
-0.0041016582399606705,
0.044108957052230835,
0.077341727912426,
0.015271356329321861,
0.055418990552425385,
0.07374808937311172,
-0.08899832516908646,
0.022153237834572792,
-0.013250190764665604,
-0.046652957797050476,
0.07721944898366928,
-0.045230019837617874,
0.06038452312350273,
-0.2357598841190338,
0.049550529569387436,
-0.030146239325404167,
-0.02557283081114292,
0.027877086773514748,
0.11167608946561813,
-0.09547652304172516,
0.07453787326812744,
0.05541728064417839,
0.005260366015136242,
0.017145421355962753,
-0.015348218381404877,
0.004972308874130249,
0.08738734573125839,
0.03212420642375946,
0.08440835773944855,
-0.0030115724075585604,
0.08470708876848221,
-0.1420651525259018,
0.0065417238511145115,
0.05926641821861267,
0.06330198794603348,
0.04240189865231514,
0.11690555512905121,
0.008227119222283363,
0.0888494923710823,
0.018121760338544846,
0.05220755934715271,
0.04529489949345589,
-0.07432188838720322,
-0.16895562410354614,
-0.09239806234836578,
0.01893271878361702,
-0.0002939770929515362,
0.006625388283282518,
-0.007259313017129898,
-0.018981197848916054,
-0.019346874207258224,
0.02042900025844574,
0.11936721950769424,
-0.26556193828582764,
-0.016630923375487328,
0.009126833640038967,
0.0584397129714489,
0.057773202657699585,
-0.0384104959666729,
-0.0422961451113224,
0.04285992681980133,
0.05312548950314522,
0.03954288363456726,
-0.024422844871878624,
0.038271449506282806,
-0.015940513461828232,
-0.15350203216075897,
-0.01935471035540104,
0.1063031479716301,
-0.04949561506509781,
-0.1172860711812973,
-0.0470203161239624,
-0.03358175605535507,
0.12376924604177475,
0.008943144232034683,
-0.03571458160877228,
0.045935727655887604,
0.030194586142897606,
0.06391081213951111,
-0.06298074126243591,
-0.11613687127828598,
0.026064442470669746,
-0.05200706049799919,
0.10771280527114868,
0.0942353829741478,
0.04843982309103012,
-0.0077324858866631985,
0.05652981251478195,
-0.0839485451579094,
-0.07668410241603851,
-0.0507136806845665,
-0.08975072205066681,
-0.0408603735268116,
-0.03848445042967796,
-0.08418449759483337,
-0.16400697827339172,
-0.004238756839185953,
0.10974563658237457,
-0.06253071129322052,
0.007998593151569366,
-0.09013036638498306,
-0.021500451490283012,
0.09308651089668274,
0.16038434207439423,
-0.1094341054558754,
0.04923269897699356,
-0.011402642354369164,
0.010717367753386497,
-0.023423543199896812,
0.03246424347162247,
0.011940217576920986,
-0.0103523600846529,
0.05056064575910568,
0.02397894859313965,
-0.019443128257989883,
0.04283664748072624,
-0.02071101777255535,
-0.04235775023698807,
0.054912857711315155,
-0.13389721512794495,
-0.011113204061985016,
0.001608360093086958,
-0.004410891328006983,
0.06138644367456436,
0.0649527981877327,
-0.02770994044840336,
-0.08990666270256042,
0.02354154922068119,
-0.08189408481121063,
-0.04650919511914253,
-0.060775045305490494,
-0.1567419469356537,
0.0285164937376976,
-0.07641927897930145,
-0.0486120767891407,
-0.09282944351434708,
-0.09891042858362198,
-0.026733465492725372,
0.06039327755570412,
-0.01665264554321766,
0.03722815960645676,
0.03032616339623928,
-0.008022993803024292,
-0.042318928986787796,
0.04597685858607292,
0.006182544864714146,
-0.014868796803057194,
0.008016848005354404,
-0.045165013521909714,
0.05448317900300026,
-0.009760702028870583,
0.04408147558569908,
-0.07065144926309586,
0.02113931253552437,
-0.14286814630031586,
0.06147041916847229,
-0.0975407138466835,
-0.08244207501411438,
-0.05013582855463028,
-0.04242498427629471,
-0.07285380363464355,
0.030986767262220383,
0.00960643868893385,
0.0610351487994194,
-0.14995698630809784,
-0.04918134957551956,
0.13988260924816132,
-0.13662847876548767,
0.0349951796233654,
0.09541670233011246,
-0.051498331129550934,
0.04478713870048523,
0.1169714704155922,
0.05716096982359886,
0.07245137542486191,
-0.04675436392426491,
-0.015971891582012177,
0.008394934237003326,
0.035405684262514114,
0.1438753753900528,
0.0661141574382782,
-0.06835483014583588,
-0.08288804441690445,
0.03561516851186752,
-0.07344819605350494,
-0.043612297624349594,
-0.05922131612896919,
-0.00470740906894207,
-0.00975846778601408,
-0.056615907698869705,
-0.0050348443910479546,
-0.023992914706468582,
-0.012567412108182907,
-0.017368437722325325,
-0.05137860029935837,
0.05423000454902649,
0.06255167722702026,
-0.08729971945285797,
0.05682113394141197,
-0.05503307655453682,
0.016136517748236656,
-0.07923360168933868,
-0.0016998880309984088,
-0.17953112721443176,
0.006868785247206688,
0.1104268729686737,
-0.09906267374753952,
0.05110234022140503,
0.16240081191062927,
0.022154157981276512,
0.06775246560573578,
-0.05216056481003761,
0.07123337686061859,
0.005853242706507444,
-0.025055252015590668,
-0.04506828263401985,
-0.11676503717899323,
-0.0644802451133728,
-0.0610445998609066,
0.009462252259254456,
-0.08326912671327591,
-0.0049455477856099606,
-0.03883711248636246,
0.020619288086891174,
0.022857334464788437,
-0.06338859349489212,
0.020710568875074387,
0.024891799315810204,
-0.038368090987205505,
-0.027208013460040092,
-0.025535661727190018,
0.04429488629102707,
0.015576266683638096,
0.11602199822664261,
-0.09374997019767761,
-0.06841050833463669,
0.04629632085561752,
0.05366828665137291,
-0.05172362178564072,
0.09241732954978943,
-0.05468156561255455,
-0.033785078674554825,
-0.09914858639240265,
-0.09843146055936813,
0.17251704633235931,
-0.00552580738440156,
0.09778790920972824,
-0.09674736857414246,
-0.025614982470870018,
-0.000447401573183015,
-0.008233475498855114,
-0.002657196717336774,
0.05214610695838928,
0.015745045617222786,
-0.0946621522307396,
-0.0026265622582286596,
0.014576056972146034,
0.017642946913838387,
0.07674103230237961,
-0.01994140073657036,
-0.11424822360277176,
0.031361911445856094,
-0.00196839589625597,
-0.006131779868155718,
0.06455700844526291,
-0.048877421766519547,
-0.004210716113448143,
0.0549178272485733,
0.05626644194126129,
0.05572592467069626,
-0.06632747501134872,
0.09488442540168762,
0.06548221409320831,
-0.0429127998650074,
-0.04487353935837746,
-0.08438379317522049,
0.011170897632837296,
0.1152796596288681,
0.02475341595709324,
0.05942663177847862,
-0.04739757627248764,
-0.023396814242005348,
-0.10318470746278763,
0.15689009428024292,
-0.08808650821447372,
-0.15958373248577118,
-0.15225671231746674,
0.006096133962273598,
-0.0549485869705677,
0.06255442649126053,
0.016419319435954094,
-0.05009865760803223,
-0.09691084921360016,
-0.07827314734458923,
0.1599273979663849,
-0.03900342807173729,
-0.006687369663268328,
0.01818147860467434,
-0.02912866324186325,
0.036976661533117294,
-0.1817137897014618,
-0.00027420767582952976,
-0.039488330483436584,
-0.12622986733913422,
-0.03809934854507446,
-0.0005523605505004525,
0.06773486733436584,
0.07157257199287415,
-0.038655102252960205,
-0.07736526429653168,
0.01825661025941372,
0.16425815224647522,
0.03344589099287987,
0.07785402983427048,
0.0934177115559578,
-0.09890357404947281,
0.04340176656842232,
0.047095149755477905,
0.03061825968325138,
-0.013321401551365852,
0.00933142751455307,
0.05766119062900543,
-0.025824347510933876,
-0.2865970730781555,
-0.008739766664803028,
-0.018501587212085724,
-0.016003098338842392,
0.06559329479932785,
0.04219067841768265,
-0.08720692992210388,
0.04928391054272652,
-0.057760994881391525,
0.0333806611597538,
0.08650389313697815,
0.0442681722342968,
0.09505188465118408,
-0.03867270424962044,
0.09250795841217041,
-0.05372873321175575,
-0.01772245578467846,
0.10811089724302292,
-0.050326112657785416,
0.19889578223228455,
-0.056660961359739304,
0.05112689360976219,
0.09769145399332047,
-0.01331778708845377,
0.03822121396660805,
0.13892342150211334,
-0.052769213914871216,
0.07049877196550369,
-0.057467713952064514,
-0.04469051957130432,
-0.03807315602898598,
0.024454550817608833,
-0.0021341454703360796,
0.03609534353017807,
-0.03574254736304283,
-0.015376145951449871,
-0.00402789656072855,
0.23853595554828644,
0.06901154667139053,
-0.1224227100610733,
-0.06856483966112137,
0.007364377379417419,
-0.10846567153930664,
-0.07044752687215805,
0.05165015906095505,
0.0916072279214859,
-0.08222696930170059,
0.046537335962057114,
0.009684611111879349,
0.06803879886865616,
-0.12697315216064453,
0.020472349599003792,
0.04046957194805145,
0.04912089183926582,
-0.025438491255044937,
0.03362373262643814,
-0.15400637686252594,
0.08239182829856873,
0.03591957315802574,
0.05284520983695984,
-0.050888270139694214,
0.06389231234788895,
0.021259980276226997,
-0.012976613827049732,
0.025508567690849304,
0.011370058171451092,
-0.02134029008448124,
-0.0263193529099226,
-0.06798229366540909,
0.08396898210048676,
0.07527778297662735,
-0.05145982652902603,
0.1202532947063446,
-0.04921353980898857,
0.012506464496254921,
-0.009563323110342026,
0.07519828528165817,
-0.17199969291687012,
-0.1305517554283142,
0.04473143070936203,
-0.1423238217830658,
-0.024072876200079918,
-0.06795495003461838,
-0.05493167042732239,
-0.0680798664689064,
0.16558566689491272,
-0.12217437475919724,
-0.13363410532474518,
-0.08438204228878021,
-0.012950238771736622,
0.15407131612300873,
-0.03049338236451149,
0.008835977874696255,
-0.01657480001449585,
0.13141688704490662,
-0.036835476756095886,
-0.15169571340084076,
-0.04873848333954811,
-0.07032858580350876,
-0.1505132019519806,
-0.03331354632973671,
0.07033293694257736,
0.1094876378774643,
0.05229376628994942,
0.004772997926920652,
0.02570566162467003,
0.0038436625618487597,
-0.0523291677236557,
-0.01603231020271778,
0.18100838363170624,
0.05320744961500168,
0.0712009146809578,
-0.1609046906232834,
-0.05684441700577736,
-0.04989941045641899,
0.024099789559841156,
-0.04598953202366829,
0.09674657136201859,
-0.0294380821287632,
0.07787983864545822,
0.24093908071517944,
-0.12960022687911987,
-0.2035389542579651,
0.00817312952131033,
0.029763460159301758,
0.003952930215746164,
0.008412688039243221,
-0.2252107709646225,
0.12163576483726501,
0.08846215903759003,
-0.000060534290241776034,
-0.00605958653613925,
-0.1862945258617401,
-0.08155648410320282,
0.07994344085454941,
0.008386646397411823,
0.14604175090789795,
-0.09227916598320007,
-0.03161392733454704,
0.008042390458285809,
-0.08529303967952728,
0.052735719829797745,
0.04635786637663841,
0.0834239050745964,
-0.00021556807041633874,
-0.07443831115961075,
0.04985181987285614,
-0.014394894242286682,
0.08535992354154587,
0.04599006474018097,
0.045922745019197464,
-0.03426630422472954,
0.13081780076026917,
0.00138876645360142,
-0.01670100912451744,
0.13735666871070862,
0.11456292867660522,
0.057019833475351334,
-0.02639702893793583,
-0.06245215982198715,
-0.07327739894390106,
0.01106909941881895,
-0.021502619609236717,
-0.038760215044021606,
-0.06350976973772049,
0.04020851105451584,
0.06354568898677826,
0.0007232413627207279,
-0.043975070118904114,
-0.02504643239080906,
0.05869016423821449,
0.08937977254390717,
0.19383983314037323,
-0.051787398755550385,
-0.006801183335483074,
-0.018450070172548294,
-0.0224614217877388,
0.0690777599811554,
-0.020467568188905716,
0.0646258220076561,
0.08993367850780487,
0.009321991354227066,
0.0813777968287468,
0.06279879808425903,
-0.13122832775115967,
-0.021847331896424294,
0.054556529968976974,
-0.10107952356338501,
-0.13804516196250916,
-0.02725973352789879,
-0.10537625104188919,
-0.1352996528148651,
-0.0001341150637017563,
0.17128901183605194,
-0.037083949893713,
-0.04678808152675629,
-0.015396368689835072,
0.07988783717155457,
0.020062284544110298,
0.13056938350200653,
0.034299086779356,
-0.015412780456244946,
-0.06167079508304596,
0.17141997814178467,
0.08842287957668304,
-0.0950794443488121,
0.009813611395657063,
0.016247032210230827,
-0.05936675891280174,
-0.005098132882267237,
-0.06512290239334106,
0.07313106954097748,
-0.028099799528717995,
-0.039272721856832504,
0.001723283901810646,
-0.09973140060901642,
0.05005265772342682,
0.15096110105514526,
0.006782101467251778,
0.15817347168922424,
-0.03788337856531143,
0.06325581669807434,
-0.0746396854519844,
0.07338427752256393,
0.053879786282777786,
0.07732346653938293,
-0.01766599342226982,
0.047323260456323624,
-0.046135835349559784,
-0.0012118869926780462,
-0.014118514023721218,
0.001859335578046739,
-0.0896054357290268,
-0.05561811476945877,
-0.22468550503253937,
0.026277948170900345,
-0.05681971460580826,
-0.035502612590789795,
0.010502788238227367,
-0.014116642996668816,
0.003835465759038925,
0.0358869768679142,
-0.026316531002521515,
-0.03235173970460892,
-0.026500217616558075,
0.062224868685007095,
-0.12244156002998352,
0.026989920064806938,
0.06714390963315964,
-0.08730509877204895,
0.07713299989700317,
-0.0014335201121866703,
-0.05273205414414406,
-0.00015859221457503736,
0.014891846105456352,
-0.04682905226945877,
-0.02981564775109291,
0.008127884939312935,
-0.05277673900127411,
-0.10950911790132523,
0.02770584635436535,
0.011148006655275822,
-0.025796210393309593,
-0.02972719632089138,
0.07477141171693802,
-0.0647178441286087,
0.05293376371264458,
0.03807707503437996,
0.004693581257015467,
-0.04327647387981415,
-0.016627835109829903,
0.11909090727567673,
0.07608270645141602,
0.05628316476941109,
-0.05084475502371788,
-0.01832767203450203,
-0.1547766476869583,
-0.0016493055736646056,
-0.0024276417680084705,
-0.0034257282968610525,
-0.03883373737335205,
-0.03728603571653366,
0.03057377226650715,
0.011377614922821522,
0.17713415622711182,
0.008608941920101643,
0.01506397221237421,
0.009251369163393974,
0.003873090259730816,
0.008534546941518784,
0.03428281098604202,
0.0703638568520546,
-0.014743213541805744,
-0.07805802673101425,
-0.07937119156122208,
0.033248353749513626,
-0.03155802562832832,
-0.021130194887518883,
0.1347856968641281,
0.13457289338111877,
0.10465346276760101,
0.022565314546227455,
0.0007227372261695564,
-0.028853371739387512,
-0.029164234176278114,
0.02040831558406353,
0.055569764226675034,
0.051177412271499634,
-0.014537807554006577,
0.0130717558786273,
0.06673528254032135,
-0.12410850822925568,
0.12021670490503311,
-0.03563926741480827,
-0.02861775830388069,
-0.10657516121864319,
-0.07651231437921524,
-0.01804940588772297,
-0.020176270976662636,
-0.020182842388749123,
-0.16462475061416626,
0.046507902443408966,
0.1078685075044632,
0.021810034289956093,
-0.03166433796286583,
0.03766689449548721,
-0.14834511280059814,
-0.08953258395195007,
0.06492000073194504,
0.013313640840351582,
0.0445266030728817,
0.10972791910171509,
-0.014960164204239845,
0.07983037084341049,
0.14225293695926666,
0.06116945296525955,
0.05509359389543533,
0.08244510740041733,
0.010026711970567703,
-0.019882287830114365,
-0.03937774524092674,
0.007021013181656599,
-0.0612492710351944,
0.0379779078066349,
0.1606571078300476,
0.03177810460329056,
-0.04987211152911186,
0.03555701673030853,
0.18118393421173096,
-0.03663467988371849,
-0.052442166954278946,
-0.17824147641658783,
0.20838773250579834,
0.02328278124332428,
0.04318155720829964,
0.04993690922856331,
-0.08959048241376877,
-0.03537215664982796,
0.20627816021442413,
0.10711592435836792,
0.02357597090303898,
-0.023496044799685478,
0.018422845751047134,
-0.010227729566395283,
0.00885689165443182,
0.08401752263307571,
0.006430084351450205,
0.21908774971961975,
-0.03764205798506737,
0.017322666943073273,
0.032392293214797974,
0.03508032113313675,
-0.07190033793449402,
0.15111243724822998,
-0.047620587050914764,
-0.002017335034906864,
-0.05612742528319359,
0.022429222241044044,
0.01986077055335045,
-0.3060375452041626,
-0.11133569478988647,
0.002217598957940936,
-0.06305141001939774,
-0.017184875905513763,
-0.020239735022187233,
-0.005224229767918587,
0.05244511738419533,
-0.004137357696890831,
0.0312044695019722,
0.18286526203155518,
-0.005682266782969236,
-0.0386558398604393,
-0.04079508036375046,
0.1218547597527504,
0.0213098656386137,
0.1376083791255951,
0.05891162157058716,
-0.01560156513005495,
0.04368256777524948,
0.02333681285381317,
-0.11297579854726791,
-0.0338953398168087,
-0.02602623775601387,
-0.00968148559331894,
-0.02322464808821678,
0.13263125717639923,
0.01981506496667862,
0.047585126012563705,
0.03818485885858536,
-0.04304185509681702,
0.05055013671517372,
0.04240097105503082,
-0.06871947646141052,
-0.053320761770009995,
0.04363345727324486,
-0.0967404767870903,
0.14108148217201233,
0.17688928544521332,
0.01244607288390398,
0.025770902633666992,
-0.05577843636274338,
-0.009787159971892834,
0.00016705968300811946,
0.11447694897651672,
-0.003986700903624296,
-0.1577584147453308,
-0.01434869784861803,
-0.08720830082893372,
0.04034644737839699,
-0.22312451899051666,
-0.04409269616007805,
0.10233192145824432,
-0.01089810486882925,
-0.010263506323099136,
0.04939604550600052,
0.004860921297222376,
0.05924391746520996,
-0.016111942008137703,
-0.04791659116744995,
0.009171584621071815,
0.06731128692626953,
-0.08308887481689453,
-0.03269657865166664
] |
null | null |
transformers
|
# MultiBERTs - Seed 13
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #13.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_13')
model = TFBertModel.from_pretrained("google/multiberts-seed_13")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_13')
model = BertModel.from_pretrained("google/multiberts-seed_13")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
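As a minimal sketch of what you can do with the outputs (not part of the original card): the `BertModel` forward pass returns, among other things, `last_hidden_state` with shape `[batch_size, sequence_length, 768]`, which can be mean-pooled into a rough sentence representation. The pooling choice below is illustrative, not something prescribed by the MultiBERTs authors.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_13')
model = BertModel.from_pretrained("google/multiberts-seed_13")
model.eval()

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# last_hidden_state holds one 768-dimensional vector per input token.
token_embeddings = output.last_hidden_state        # shape [1, seq_len, 768]
# Illustrative mean pooling over tokens to obtain a single sentence vector.
sentence_embedding = token_embeddings.mean(dim=1)  # shape [1, 768]
print(sentence_embedding.shape)
```
For feature extraction, the `[CLS]` vector (`output.last_hidden_state[:, 0]`) or `output.pooler_output` are common alternatives to mean pooling.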
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_13"]}
| null |
google/multiberts-seed_13
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_13",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_13 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 13
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #13.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 13\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #13.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_13 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 13\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #13.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_13 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 13\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #13.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06242985278367996,
0.09185265749692917,
-0.004090392030775547,
0.044010382145643234,
0.07698194682598114,
0.01542715635150671,
0.05516494810581207,
0.07437306642532349,
-0.0893261581659317,
0.02243860438466072,
-0.012587227858603,
-0.04612746089696884,
0.07727782428264618,
-0.04490276798605919,
0.06056754291057587,
-0.23516720533370972,
0.050095848739147186,
-0.030852550640702248,
-0.025153180584311485,
0.027793582528829575,
0.11221136152744293,
-0.09496858716011047,
0.07464098185300827,
0.05581384152173996,
0.006093657109886408,
0.016665803268551826,
-0.01585981249809265,
0.004948704969137907,
0.08687655627727509,
0.03230028972029686,
0.08407460153102875,
-0.0025521889328956604,
0.0850420594215393,
-0.1421760469675064,
0.006355904042720795,
0.05931248888373375,
0.06330148875713348,
0.04295499995350838,
0.1167847290635109,
0.008262556977570057,
0.08724892139434814,
0.017664363607764244,
0.052951879799366,
0.04527377709746361,
-0.07487998157739639,
-0.16848960518836975,
-0.09370630234479904,
0.01888057217001915,
0.000008907570190785918,
0.00647158594802022,
-0.007085707969963551,
-0.018268510699272156,
-0.019152073189616203,
0.02060394361615181,
0.12142810970544815,
-0.2646235227584839,
-0.016872253268957138,
0.01164241973310709,
0.05885935202240944,
0.057662613689899445,
-0.037959009408950806,
-0.04117869958281517,
0.042870599776506424,
0.05258716270327568,
0.040323756635189056,
-0.024599961936473846,
0.03998396918177605,
-0.015338467434048653,
-0.15340188145637512,
-0.019808094948530197,
0.10583648085594177,
-0.04949314519762993,
-0.11683434993028641,
-0.04867152124643326,
-0.033276356756687164,
0.12255921959877014,
0.008857423439621925,
-0.03661799430847168,
0.04630279168486595,
0.030082333832979202,
0.06417115777730942,
-0.06316883116960526,
-0.11582762002944946,
0.026071704924106598,
-0.051480840891599655,
0.10717473179101944,
0.093907430768013,
0.04790203645825386,
-0.006959859747439623,
0.056553177535533905,
-0.08480016887187958,
-0.07702320069074631,
-0.051221925765275955,
-0.08923236280679703,
-0.04110359773039818,
-0.039201561361551285,
-0.08479060977697372,
-0.16481831669807434,
-0.003984249662607908,
0.1112975925207138,
-0.06289125233888626,
0.008528485894203186,
-0.08949732035398483,
-0.02141615003347397,
0.09385533630847931,
0.16105866432189941,
-0.1097988709807396,
0.04935217276215553,
-0.010927001014351845,
0.010569312609732151,
-0.02308979444205761,
0.03193601965904236,
0.012193767353892326,
-0.01045410055667162,
0.050870079547166824,
0.024358641356229782,
-0.01897815242409706,
0.043094538152217865,
-0.020676041021943092,
-0.04267818480730057,
0.055744998157024384,
-0.13394905626773834,
-0.011284451931715012,
0.0016693522920832038,
-0.0036219328176230192,
0.06056487187743187,
0.06481004506349564,
-0.027105925604701042,
-0.08970043063163757,
0.023589493706822395,
-0.08172577619552612,
-0.04714976251125336,
-0.06040944904088974,
-0.15583819150924683,
0.02827550657093525,
-0.07467545568943024,
-0.04880063980817795,
-0.0926109254360199,
-0.09949451684951782,
-0.027142992243170738,
0.06077241897583008,
-0.017152508720755577,
0.03657632693648338,
0.030869154259562492,
-0.007644936442375183,
-0.04184979200363159,
0.046256422996520996,
0.006945859640836716,
-0.014921887777745724,
0.007876517251133919,
-0.044555433094501495,
0.05401042848825455,
-0.008928665891289711,
0.04404580593109131,
-0.07063505798578262,
0.02164289355278015,
-0.14402788877487183,
0.061655670404434204,
-0.09730213135480881,
-0.08229827135801315,
-0.049993906170129776,
-0.04250282794237137,
-0.07263974845409393,
0.030917704105377197,
0.009764541871845722,
0.061330486088991165,
-0.14749561250209808,
-0.04909835010766983,
0.137466162443161,
-0.13636937737464905,
0.03472260385751724,
0.09543992578983307,
-0.0513530969619751,
0.04466479644179344,
0.11627387255430222,
0.05818101018667221,
0.07209936529397964,
-0.047117192298173904,
-0.014102976769208908,
0.008174099028110504,
0.035268303006887436,
0.14501143991947174,
0.06641045957803726,
-0.06877759099006653,
-0.0830543115735054,
0.03569494187831879,
-0.07447618991136551,
-0.043900929391384125,
-0.05915341153740883,
-0.005322179291397333,
-0.009700090624392033,
-0.05600963905453682,
-0.005599642638117075,
-0.02418062463402748,
-0.012329038232564926,
-0.017086025327444077,
-0.05098966136574745,
0.05212898179888725,
0.06251133233308792,
-0.08772485703229904,
0.05703066289424896,
-0.0543191097676754,
0.016257354989647865,
-0.07886922359466553,
-0.0012959050945937634,
-0.17884372174739838,
0.006724060047417879,
0.11028095334768295,
-0.09989890456199646,
0.050351761281490326,
0.16160011291503906,
0.022100310772657394,
0.06735134869813919,
-0.05252975970506668,
0.0715627372264862,
0.006215090863406658,
-0.02509569190442562,
-0.04544847086071968,
-0.11741422861814499,
-0.06503359973430634,
-0.06120467185974121,
0.009489511139690876,
-0.08415138721466064,
-0.004935072269290686,
-0.04002038389444351,
0.020800434052944183,
0.022270068526268005,
-0.06367331743240356,
0.020904742181301117,
0.02453012950718403,
-0.038551684468984604,
-0.027434227988123894,
-0.02571849897503853,
0.04421696066856384,
0.015746334567666054,
0.11591195315122604,
-0.09450242668390274,
-0.06826449930667877,
0.045931700617074966,
0.0534697026014328,
-0.05230037122964859,
0.09107720106840134,
-0.05484124645590782,
-0.03410802036523819,
-0.09916655719280243,
-0.09894073754549026,
0.172980397939682,
-0.005434831604361534,
0.09799286723136902,
-0.09681165218353271,
-0.026386532932519913,
-0.0004502868396230042,
-0.008581487461924553,
-0.003311408217996359,
0.05194263905286789,
0.014521628618240356,
-0.09402452409267426,
-0.0020732348784804344,
0.014284401200711727,
0.01811208948493004,
0.07634470611810684,
-0.01973935402929783,
-0.11419837176799774,
0.031052710488438606,
-0.0016696108505129814,
-0.00581338768824935,
0.06416072696447372,
-0.04920124262571335,
-0.004403614439070225,
0.054908134043216705,
0.056076690554618835,
0.05579566955566406,
-0.06574233621358871,
0.09468069672584534,
0.06584015488624573,
-0.0431864969432354,
-0.04324871674180031,
-0.08442463725805283,
0.010756627656519413,
0.114948570728302,
0.02530430071055889,
0.05949683114886284,
-0.04707790166139603,
-0.023929528892040253,
-0.10385886579751968,
0.1569192260503769,
-0.08707253634929657,
-0.16012988984584808,
-0.15269741415977478,
0.005473561584949493,
-0.05562615394592285,
0.062418315559625626,
0.01618150621652603,
-0.04982001334428787,
-0.09735739976167679,
-0.07854415476322174,
0.16003569960594177,
-0.039528753608465195,
-0.00714300200343132,
0.01919868029654026,
-0.02960362657904625,
0.03743968531489372,
-0.18174880743026733,
-0.00047107055434025824,
-0.03970209136605263,
-0.12634088099002838,
-0.03833673521876335,
-0.0005001382669433951,
0.06824663281440735,
0.07134663313627243,
-0.03785710781812668,
-0.07732230424880981,
0.01818811148405075,
0.16505958139896393,
0.0336264967918396,
0.0773644670844078,
0.09247308224439621,
-0.09880995750427246,
0.042954880744218826,
0.047456249594688416,
0.030661702156066895,
-0.013620426878333092,
0.009138483554124832,
0.057234399020671844,
-0.02621924690902233,
-0.28630200028419495,
-0.00899473112076521,
-0.019118672236800194,
-0.016676243394613266,
0.06555166095495224,
0.04206191375851631,
-0.08512021601200104,
0.04901532083749771,
-0.05801307410001755,
0.0329018235206604,
0.0878494381904602,
0.04491886496543884,
0.09615808725357056,
-0.03837919980287552,
0.09249062836170197,
-0.05309351161122322,
-0.017671972513198853,
0.10828979313373566,
-0.05009140074253082,
0.19834288954734802,
-0.05602329969406128,
0.05274149775505066,
0.09769568592309952,
-0.012816845439374447,
0.037971168756484985,
0.13860638439655304,
-0.052640464156866074,
0.0699387937784195,
-0.05777478590607643,
-0.045072589069604874,
-0.038173701614141464,
0.023927699774503708,
-0.0022665418218821287,
0.0361286923289299,
-0.03601458668708801,
-0.01647513173520565,
-0.003643458243459463,
0.23902295529842377,
0.06817076355218887,
-0.1226855218410492,
-0.06779391318559647,
0.007320142351090908,
-0.10830486565828323,
-0.06989103555679321,
0.0511767752468586,
0.0896616131067276,
-0.08294010162353516,
0.046959493309259415,
0.009520028717815876,
0.06805001944303513,
-0.12695731222629547,
0.02073817327618599,
0.04039984196424484,
0.05004629120230675,
-0.025600416585803032,
0.033730629831552505,
-0.15444453060626984,
0.08328636735677719,
0.03580751270055771,
0.05276751145720482,
-0.051078032702207565,
0.06432648003101349,
0.021039092913269997,
-0.013152719475328922,
0.026102671399712563,
0.011247698217630386,
-0.02021756023168564,
-0.027134686708450317,
-0.06759858131408691,
0.08358949422836304,
0.0750732570886612,
-0.05211165174841881,
0.11976049840450287,
-0.04960718750953674,
0.012461842969059944,
-0.00940515473484993,
0.0773451030254364,
-0.17185290157794952,
-0.13089503347873688,
0.045444704592227936,
-0.14304621517658234,
-0.025174472481012344,
-0.06768210232257843,
-0.0547616071999073,
-0.07065458595752716,
0.1677330583333969,
-0.12112800031900406,
-0.1335420161485672,
-0.08462432771921158,
-0.013063658028841019,
0.15415805578231812,
-0.030314793810248375,
0.008199290372431278,
-0.016735589131712914,
0.13110752403736115,
-0.03706676512956619,
-0.15218836069107056,
-0.0486982986330986,
-0.07051575928926468,
-0.1512095183134079,
-0.033600062131881714,
0.07035622745752335,
0.10932107269763947,
0.05221378430724144,
0.0048990026116371155,
0.026213737204670906,
0.004149068612605333,
-0.052542541176080704,
-0.016323242336511612,
0.18135373294353485,
0.052687469869852066,
0.07082116603851318,
-0.16023480892181396,
-0.056774891912937164,
-0.050395987927913666,
0.023603331297636032,
-0.04599630832672119,
0.09871737658977509,
-0.02948116324841976,
0.07891476899385452,
0.24122007191181183,
-0.1295984536409378,
-0.20292824506759644,
0.007518661208450794,
0.029266247525811195,
0.00355434394441545,
0.008782505989074707,
-0.2243899255990982,
0.1215192973613739,
0.0897672176361084,
-0.00045881286496296525,
-0.008627285249531269,
-0.1863439381122589,
-0.08204253762960434,
0.07993819564580917,
0.008845732547342777,
0.14558351039886475,
-0.09248649328947067,
-0.03190351277589798,
0.009069749154150486,
-0.08650464564561844,
0.051489368081092834,
0.04767298698425293,
0.08329787850379944,
-0.0002710750559344888,
-0.07587577402591705,
0.05005936697125435,
-0.014590932987630367,
0.08672722429037094,
0.04542490094900131,
0.0461752712726593,
-0.034632179886102676,
0.13129092752933502,
0.0022235088981688023,
-0.0169933270663023,
0.13743740320205688,
0.11477387696504593,
0.05713946744799614,
-0.025874251499772072,
-0.06285953521728516,
-0.07327406108379364,
0.01114543154835701,
-0.02183103933930397,
-0.03916427865624428,
-0.06439650058746338,
0.040173064917325974,
0.06426288187503815,
0.0005547942128032446,
-0.042794812470674515,
-0.024755118414759636,
0.059534359723329544,
0.09045744687318802,
0.19406576454639435,
-0.053059276193380356,
-0.006043252069503069,
-0.01857619918882847,
-0.022659780457615852,
0.06881148368120193,
-0.018552301451563835,
0.06454238295555115,
0.08987626433372498,
0.009006302803754807,
0.08154270052909851,
0.06266274303197861,
-0.13111115992069244,
-0.022515742108225822,
0.05437931418418884,
-0.1010303869843483,
-0.13734418153762817,
-0.027515918016433716,
-0.10679107904434204,
-0.1355348825454712,
-0.00083158042980358,
0.17087002098560333,
-0.03682461008429527,
-0.04680556058883667,
-0.014598989859223366,
0.08022039383649826,
0.019768333062529564,
0.13113035261631012,
0.03345957398414612,
-0.015180587768554688,
-0.06219220533967018,
0.17150935530662537,
0.08899343758821487,
-0.09396454691886902,
0.009546414017677307,
0.01729597896337509,
-0.05956724286079407,
-0.004942701663821936,
-0.0656670406460762,
0.0733356699347496,
-0.02871723100543022,
-0.03823523223400116,
0.001847054110839963,
-0.09986638277769089,
0.04952739179134369,
0.15044838190078735,
0.007189855445176363,
0.15789487957954407,
-0.03699555993080139,
0.06338632106781006,
-0.07470361143350601,
0.07289646565914154,
0.05407402291893959,
0.07737553119659424,
-0.017510002478957176,
0.046894997358322144,
-0.046072643250226974,
-0.0015202946960926056,
-0.014478890225291252,
0.0014912497717887163,
-0.09045042097568512,
-0.05550624802708626,
-0.22216258943080902,
0.026029586791992188,
-0.05806875601410866,
-0.035417985171079636,
0.010430940426886082,
-0.013828279450535774,
0.0036001477856189013,
0.0362817645072937,
-0.025816461071372032,
-0.03283589333295822,
-0.026801100000739098,
0.062422629445791245,
-0.12242572009563446,
0.02652611956000328,
0.06729903817176819,
-0.08755900710821152,
0.07738921046257019,
-0.0014715626602992415,
-0.05270707234740257,
-0.0006259901565499604,
0.011864595115184784,
-0.04695536941289902,
-0.03000558167695999,
0.008019465021789074,
-0.053358253091573715,
-0.11106979101896286,
0.027024181559681892,
0.011600221507251263,
-0.026595231145620346,
-0.029332829639315605,
0.07437211275100708,
-0.06541946530342102,
0.0523008294403553,
0.03775523602962494,
0.004632431082427502,
-0.043011415749788284,
-0.016639865934848785,
0.11926796287298203,
0.07586938142776489,
0.055783264338970184,
-0.051156703382730484,
-0.018136605620384216,
-0.15531682968139648,
-0.001704836031422019,
-0.002365426393225789,
-0.003452091244980693,
-0.03999239206314087,
-0.03712404519319534,
0.030487967655062675,
0.0115331606939435,
0.17897628247737885,
0.007775078993290663,
0.014008300378918648,
0.00972355529665947,
0.004505369812250137,
0.010138250887393951,
0.03424154222011566,
0.07107912749052048,
-0.0146470433101058,
-0.0782419741153717,
-0.07852267473936081,
0.034002576023340225,
-0.031431835144758224,
-0.021377690136432648,
0.13470035791397095,
0.13495807349681854,
0.10516709834337234,
0.022541314363479614,
0.00033218940370716155,
-0.029129769653081894,
-0.0278711449354887,
0.021909646689891815,
0.055680252611637115,
0.05131605640053749,
-0.014103053137660027,
0.0115275327116251,
0.06770992279052734,
-0.12493907660245895,
0.12022726237773895,
-0.036126282066106796,
-0.028630182147026062,
-0.10668518394231796,
-0.07769206911325455,
-0.018197566270828247,
-0.0204328503459692,
-0.020265325903892517,
-0.16488797962665558,
0.0468890555202961,
0.10737761855125427,
0.021891841664910316,
-0.03152206167578697,
0.036799199879169464,
-0.14782366156578064,
-0.08943045139312744,
0.0640898123383522,
0.013145062141120434,
0.04484181106090546,
0.10876864939928055,
-0.0149528281763196,
0.07934319227933884,
0.1425880789756775,
0.06128155067563057,
0.05516590550541878,
0.0836014673113823,
0.009950608015060425,
-0.019848408177495003,
-0.04025429114699364,
0.007537965662777424,
-0.06164644658565521,
0.03777570277452469,
0.16070657968521118,
0.031580861657857895,
-0.05004367604851723,
0.035402100533246994,
0.18150335550308228,
-0.035735711455345154,
-0.051553886383771896,
-0.17887122929096222,
0.20867307484149933,
0.023280100896954536,
0.04292156174778938,
0.05001642182469368,
-0.0897267684340477,
-0.035536278039216995,
0.20675621926784515,
0.10808127373456955,
0.02346889115869999,
-0.02330130897462368,
0.018269618973135948,
-0.010244596749544144,
0.009271289221942425,
0.08356710523366928,
0.005791049916297197,
0.21851466596126556,
-0.037701960653066635,
0.01763768307864666,
0.032647114247083664,
0.03483656793832779,
-0.07203365117311478,
0.15111427009105682,
-0.0471937395632267,
-0.0020795245654881,
-0.05633596330881119,
0.022203516215085983,
0.019242219626903534,
-0.30525633692741394,
-0.11144768446683884,
0.0014185905456542969,
-0.06275419890880585,
-0.017362654209136963,
-0.021143347024917603,
-0.005814108531922102,
0.0525100976228714,
-0.0035609968472272158,
0.03155437484383583,
0.18175218999385834,
-0.005660612136125565,
-0.038061633706092834,
-0.04124593362212181,
0.12199918180704117,
0.02305629476904869,
0.13904565572738647,
0.0592462420463562,
-0.015836432576179504,
0.04395200312137604,
0.023053450509905815,
-0.11315733194351196,
-0.03439046069979668,
-0.025616444647312164,
-0.01017607282847166,
-0.022928427904844284,
0.13267379999160767,
0.020327145233750343,
0.04822308197617531,
0.038585271686315536,
-0.043278008699417114,
0.05070784315466881,
0.04264714568853378,
-0.06808316707611084,
-0.05269639939069748,
0.04325313866138458,
-0.09679695218801498,
0.14078538119792938,
0.176711767911911,
0.012651807628571987,
0.025648366659879684,
-0.055557139217853546,
-0.00956850778311491,
-0.000046365646994672716,
0.11470101773738861,
-0.004262703005224466,
-0.15866760909557343,
-0.013855247758328915,
-0.08824799954891205,
0.03991205617785454,
-0.22237372398376465,
-0.04516876861453056,
0.10220225155353546,
-0.011077224276959896,
-0.01024479977786541,
0.04972777143120766,
0.005058369133621454,
0.05887448787689209,
-0.015681663528084755,
-0.04703754559159279,
0.00906322617083788,
0.06767041236162186,
-0.08355556428432465,
-0.03241541236639023
] |
null | null |
transformers
|
# MultiBERTs - Seed 14
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #14.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_14')
model = TFBertModel.from_pretrained("google/multiberts-seed_14")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_14')
model = BertModel.from_pretrained("google/multiberts-seed_14")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
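Since the checkpoint was pre-trained with the MLM objective, it can also be loaded with a masked-language-modelling head to inspect token predictions. This is a hedged sketch rather than part of the original card: loading a pre-training checkpoint into `BertForMaskedLM` reuses the MLM head weights and simply ignores the NSP head, and transformers may warn about the unused weights.
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_14')
model = BertForMaskedLM.from_pretrained("google/multiberts-seed_14")  # NSP head weights are ignored
model.eval()

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**encoded_input).logits  # shape [1, seq_len, vocab_size]

# Locate the [MASK] position and list the five most likely fillers.
mask_positions = (encoded_input['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```
The exact predictions will differ from seed to seed; comparing them across the 25 MultiBERTs checkpoints is one simple way to probe seed-to-seed variation.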
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_14"]}
| null |
google/multiberts-seed_14
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_14",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_14 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 14
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #14.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 14\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #14.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_14 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 14\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #14.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_14 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 14\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #14.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06327477097511292,
0.09247004985809326,
-0.004097166005522013,
0.04394588619470596,
0.07707083225250244,
0.015765374526381493,
0.055195048451423645,
0.0743778645992279,
-0.0894586518406868,
0.022788411006331444,
-0.012126768007874489,
-0.046931348741054535,
0.07683500647544861,
-0.04497578740119934,
0.06048639863729477,
-0.2350468635559082,
0.049463722854852676,
-0.029950059950351715,
-0.027083614841103554,
0.0274893119931221,
0.11197644472122192,
-0.0949491411447525,
0.07444488257169724,
0.056177202612161636,
0.005011540371924639,
0.016719605773687363,
-0.015461555682122707,
0.005139269400388002,
0.08732490241527557,
0.03235257416963577,
0.08419152349233627,
-0.003084396943449974,
0.08489091694355011,
-0.14115110039710999,
0.006474969442933798,
0.059539955109357834,
0.06295567005872726,
0.04334457218647003,
0.11640869081020355,
0.00828645471483469,
0.08755805343389511,
0.01755533367395401,
0.05214099586009979,
0.04519319534301758,
-0.07490947097539902,
-0.17262695729732513,
-0.09303005784749985,
0.01863495074212551,
-0.0012525946367532015,
0.006531113293021917,
-0.007285608910024166,
-0.019542742520570755,
-0.019266020506620407,
0.02034892700612545,
0.11944876611232758,
-0.26651376485824585,
-0.016444982960820198,
0.009022383019328117,
0.0579664520919323,
0.056947533041238785,
-0.039107538759708405,
-0.04094693809747696,
0.04343462362885475,
0.05277587100863457,
0.041073743253946304,
-0.024583477526903152,
0.039950594305992126,
-0.015567990019917488,
-0.1528293639421463,
-0.01909204013645649,
0.10646423697471619,
-0.04907464608550072,
-0.11710110306739807,
-0.04771021753549576,
-0.03353680297732353,
0.12374385446310043,
0.008713634684681892,
-0.036325763911008835,
0.0464077927172184,
0.03025086410343647,
0.06470198929309845,
-0.06291639059782028,
-0.11568465828895569,
0.026712702587246895,
-0.05148518458008766,
0.10763870924711227,
0.09388255327939987,
0.047936778515577316,
-0.006984763778746128,
0.05587233975529671,
-0.08497479557991028,
-0.07745594531297684,
-0.05070866644382477,
-0.08877521753311157,
-0.04117939621210098,
-0.03848012164235115,
-0.0843222364783287,
-0.16250203549861908,
-0.004388850647956133,
0.11212586611509323,
-0.0641828402876854,
0.008349115028977394,
-0.08898264914751053,
-0.022021494805812836,
0.0927763357758522,
0.1602919101715088,
-0.11021947860717773,
0.048302989453077316,
-0.010109095834195614,
0.00959949940443039,
-0.023296698927879333,
0.03151928633451462,
0.011892091482877731,
-0.010842937976121902,
0.05083446949720383,
0.023367611691355705,
-0.01929917186498642,
0.04326091334223747,
-0.02006743848323822,
-0.041714947670698166,
0.05486022308468819,
-0.133912593126297,
-0.010688391514122486,
0.002043918240815401,
-0.003810984082520008,
0.060737494379282,
0.06525497138500214,
-0.02811233326792717,
-0.08987840265035629,
0.02365085296332836,
-0.082266665995121,
-0.04737674072384834,
-0.06012191250920296,
-0.15662886202335358,
0.028162771835923195,
-0.07488901168107986,
-0.048246342688798904,
-0.09268542379140854,
-0.09876010566949844,
-0.027207717299461365,
0.06064668670296669,
-0.017266500741243362,
0.03678897023200989,
0.03140469267964363,
-0.008044655434787273,
-0.041284073144197464,
0.04638509079813957,
0.007060708478093147,
-0.014898031949996948,
0.00781857781112194,
-0.044106293469667435,
0.05446595698595047,
-0.009523272514343262,
0.04400602728128433,
-0.07016833871603012,
0.02159297466278076,
-0.14375938475131989,
0.06052447855472565,
-0.09728984534740448,
-0.08238416910171509,
-0.04968900978565216,
-0.04264098405838013,
-0.07273770868778229,
0.031104428693652153,
0.010185928083956242,
0.06158705800771713,
-0.14746986329555511,
-0.04977180436253548,
0.1372026950120926,
-0.136674165725708,
0.034855857491493225,
0.09585694223642349,
-0.0511818528175354,
0.045032475143671036,
0.1161937490105629,
0.05836695805191994,
0.0723828598856926,
-0.04799598455429077,
-0.014772881753742695,
0.008783348836004734,
0.03453686460852623,
0.14483880996704102,
0.06612467765808105,
-0.06813590973615646,
-0.08234253525733948,
0.03571615740656853,
-0.07219241559505463,
-0.04358479753136635,
-0.05931985005736351,
-0.0049633486196398735,
-0.009762566536664963,
-0.0558130145072937,
-0.00513207633048296,
-0.02391929365694523,
-0.011820265091955662,
-0.017217770218849182,
-0.05157246068120003,
0.05305049568414688,
0.0626947283744812,
-0.08816661685705185,
0.05664456635713577,
-0.05447489395737648,
0.017582828179001808,
-0.07867815345525742,
-0.0007666592136956751,
-0.17910704016685486,
0.007184607908129692,
0.1099049523472786,
-0.10022857040166855,
0.05064956471323967,
0.16204923391342163,
0.0225418321788311,
0.06800979375839233,
-0.05278134346008301,
0.07177671790122986,
0.006533386651426554,
-0.025021785870194435,
-0.04540746286511421,
-0.11831971257925034,
-0.06490833312273026,
-0.06069597974419594,
0.01214060839265585,
-0.08615545183420181,
-0.00510111078619957,
-0.03928294777870178,
0.021407408639788628,
0.022431520745158195,
-0.0629182904958725,
0.020377574488520622,
0.024188615381717682,
-0.038341082632541656,
-0.02740381844341755,
-0.025598997250199318,
0.044800762087106705,
0.016062729060649872,
0.11622472107410431,
-0.09389118105173111,
-0.06728809326887131,
0.04677532613277435,
0.05226068198680878,
-0.05255919694900513,
0.09180594980716705,
-0.054431695491075516,
-0.03427853807806969,
-0.09819693863391876,
-0.09938028454780579,
0.17246848344802856,
-0.006131108850240707,
0.09872183948755264,
-0.0963088646531105,
-0.026017609983682632,
0.00019520788919180632,
-0.008409547619521618,
-0.0026015678886324167,
0.05230625718832016,
0.012852239422500134,
-0.09334222227334976,
-0.001999534899368882,
0.013813138008117676,
0.017955834046006203,
0.07664385437965393,
-0.020103491842746735,
-0.11417316645383835,
0.03055759146809578,
-0.002692045411095023,
-0.005910318344831467,
0.06541533768177032,
-0.04915758967399597,
-0.004558845888823271,
0.05529795587062836,
0.05604296550154686,
0.055791933089494705,
-0.0660230964422226,
0.09525653719902039,
0.06547993421554565,
-0.042885251343250275,
-0.04446778446435928,
-0.08454088866710663,
0.010694049298763275,
0.11550445109605789,
0.025425706058740616,
0.05938171222805977,
-0.04704926535487175,
-0.023439116775989532,
-0.10349010676145554,
0.156841441988945,
-0.08783555030822754,
-0.15953105688095093,
-0.1529945582151413,
0.00581037811934948,
-0.054742202162742615,
0.06269072741270065,
0.01599200628697872,
-0.04985694959759712,
-0.09738580137491226,
-0.07767938077449799,
0.15947814285755157,
-0.0394289456307888,
-0.006651606876403093,
0.018597230315208435,
-0.029368586838245392,
0.03809042274951935,
-0.1813100427389145,
-0.00019233614148106426,
-0.03975437581539154,
-0.12473704665899277,
-0.038204487413167953,
-0.00003529889727360569,
0.06838983297348022,
0.07092972099781036,
-0.03839206323027611,
-0.07748926430940628,
0.01865926943719387,
0.165824756026268,
0.03279424458742142,
0.07776368409395218,
0.09336031973361969,
-0.09889772534370422,
0.04262465983629227,
0.04700075834989548,
0.030835455283522606,
-0.013792742975056171,
0.008344607427716255,
0.05709591880440712,
-0.0257125124335289,
-0.2856137752532959,
-0.008645993657410145,
-0.01937740109860897,
-0.01746133342385292,
0.06605719774961472,
0.04173324257135391,
-0.08549223095178604,
0.048471059650182724,
-0.05756731331348419,
0.03402166813611984,
0.08728283643722534,
0.04463059827685356,
0.09784558415412903,
-0.038530778139829636,
0.09234830737113953,
-0.05361322686076164,
-0.01793459616601467,
0.108364038169384,
-0.05173027515411377,
0.19912859797477722,
-0.056407198309898376,
0.053231511265039444,
0.09715421497821808,
-0.012279228307306767,
0.03827143833041191,
0.13887029886245728,
-0.05309184640645981,
0.06978300958871841,
-0.057646650820970535,
-0.045206859707832336,
-0.037670210003852844,
0.02424062415957451,
-0.0028934096917510033,
0.03555848076939583,
-0.036091260612010956,
-0.017214281484484673,
-0.004130440764129162,
0.2386258989572525,
0.06924870610237122,
-0.12351202219724655,
-0.06906215846538544,
0.006964627653360367,
-0.10787557810544968,
-0.07061963528394699,
0.05041799694299698,
0.09121118485927582,
-0.08226528018712997,
0.04711557924747467,
0.009704917669296265,
0.06809937208890915,
-0.12646782398223877,
0.020920567214488983,
0.039880551397800446,
0.050631895661354065,
-0.02576892264187336,
0.03388027846813202,
-0.15369856357574463,
0.08292799443006516,
0.03620755299925804,
0.053196538239717484,
-0.05097927898168564,
0.06424278765916824,
0.021368583664298058,
-0.012160326354205608,
0.02553236298263073,
0.011487362906336784,
-0.021274100989103317,
-0.02683069370687008,
-0.0665150135755539,
0.08362610638141632,
0.07530847936868668,
-0.050901081413030624,
0.1199321523308754,
-0.04946505278348923,
0.01195540837943554,
-0.009078566916286945,
0.0756959393620491,
-0.1713418811559677,
-0.13078606128692627,
0.045071884989738464,
-0.14278684556484222,
-0.023964984342455864,
-0.06756066530942917,
-0.05505317822098732,
-0.06992889195680618,
0.1668010652065277,
-0.1207219734787941,
-0.13278047740459442,
-0.08523955196142197,
-0.012378181330859661,
0.15403592586517334,
-0.030365893617272377,
0.008707194589078426,
-0.01659647561609745,
0.13178043067455292,
-0.036700472235679626,
-0.15179841220378876,
-0.04859443008899689,
-0.07015110552310944,
-0.15091992914676666,
-0.03377184644341469,
0.07012169063091278,
0.10885437577962875,
0.05185819789767265,
0.005284607410430908,
0.026129452511668205,
0.0031515290029346943,
-0.05233752727508545,
-0.01564234495162964,
0.17903417348861694,
0.05364010110497475,
0.07018142193555832,
-0.1606960892677307,
-0.056185033172369,
-0.04932408034801483,
0.024048587307333946,
-0.04681725800037384,
0.09854234009981155,
-0.03002830035984516,
0.07762901484966278,
0.24239881336688995,
-0.12953394651412964,
-0.20326529443264008,
0.007582955993711948,
0.02925703302025795,
0.00406311359256506,
0.007287491112947464,
-0.2247365564107895,
0.12205755710601807,
0.08878932893276215,
-0.00021755858324468136,
-0.006449407432228327,
-0.1833972930908203,
-0.0819445326924324,
0.07977605611085892,
0.008331450633704662,
0.14663708209991455,
-0.09229901432991028,
-0.03188139572739601,
0.008149875327944756,
-0.08593426644802094,
0.05138051509857178,
0.04553108662366867,
0.08323661237955093,
-0.0003140019252896309,
-0.07581489533185959,
0.049915559589862823,
-0.01419626735150814,
0.08685971796512604,
0.04537605494260788,
0.04640005901455879,
-0.03452282398939133,
0.13237084448337555,
0.003002800513058901,
-0.016656966879963875,
0.13695025444030762,
0.11529123783111572,
0.056942492723464966,
-0.026698926463723183,
-0.06266563385725021,
-0.07315146923065186,
0.011882100254297256,
-0.02185114659368992,
-0.038749538362026215,
-0.06427060812711716,
0.03999343514442444,
0.06320459395647049,
0.00033424582215957344,
-0.04361530765891075,
-0.024749554693698883,
0.05961153283715248,
0.0904659777879715,
0.19272762537002563,
-0.05344469100236893,
-0.0063211978413164616,
-0.017419051378965378,
-0.022490428760647774,
0.06895986944437027,
-0.019995247945189476,
0.0645354762673378,
0.09021248668432236,
0.009715157561004162,
0.08191674947738647,
0.06243310868740082,
-0.13132531940937042,
-0.022641751915216446,
0.054779961705207825,
-0.10161367058753967,
-0.1368795782327652,
-0.02725379541516304,
-0.1098439022898674,
-0.13494761288166046,
-0.0013499833876267076,
0.17113761603832245,
-0.03684428706765175,
-0.04639966040849686,
-0.01479236502200365,
0.07932436466217041,
0.019528398290276527,
0.13117846846580505,
0.03408968821167946,
-0.015141828916966915,
-0.06217586249113083,
0.17079584300518036,
0.08913572132587433,
-0.09434824436903,
0.009609710425138474,
0.01693260483443737,
-0.05989577993750572,
-0.005051720887422562,
-0.06585046648979187,
0.0743226557970047,
-0.027223654091358185,
-0.03934168443083763,
0.0013864305801689625,
-0.1003178209066391,
0.049742087721824646,
0.14957913756370544,
0.007060974836349487,
0.157692089676857,
-0.037061166018247604,
0.06316906958818436,
-0.0752071812748909,
0.0729927346110344,
0.05392773449420929,
0.07695692777633667,
-0.017502382397651672,
0.04782574251294136,
-0.045751746743917465,
-0.0012310371967032552,
-0.014836925081908703,
0.0014599633868783712,
-0.09099193662405014,
-0.05603594705462456,
-0.22301800549030304,
0.02633899450302124,
-0.05712450295686722,
-0.0353507399559021,
0.010168645530939102,
-0.014258935116231441,
0.003298427676782012,
0.036218758672475815,
-0.025803135707974434,
-0.032459259033203125,
-0.025667956098914146,
0.062479596585035324,
-0.12226687371730804,
0.026512911543250084,
0.06738056242465973,
-0.08778100460767746,
0.07773033529520035,
-0.0005752961151301861,
-0.052285335958004,
-0.0004873114812653512,
0.014083546586334705,
-0.04701706022024155,
-0.030190685763955116,
0.007884107530117035,
-0.05280224233865738,
-0.11150483787059784,
0.027300985530018806,
0.011328693479299545,
-0.02652936987578869,
-0.029674146324396133,
0.07444377988576889,
-0.06479483842849731,
0.05259304493665695,
0.03753871098160744,
0.00452136667445302,
-0.043234627693891525,
-0.016226934269070625,
0.11927047371864319,
0.0754568949341774,
0.05617624148726463,
-0.05110227316617966,
-0.018769927322864532,
-0.15561270713806152,
-0.001982414862141013,
-0.0024478433188050985,
-0.004288149066269398,
-0.038165170699357986,
-0.03735976666212082,
0.030785489827394485,
0.011291294358670712,
0.17793810367584229,
0.007592813577502966,
0.014611050486564636,
0.00943999458104372,
0.004119677934795618,
0.007589047309011221,
0.034408681094646454,
0.07082472741603851,
-0.014386240392923355,
-0.07816196978092194,
-0.0788959413766861,
0.0344114750623703,
-0.031100843101739883,
-0.019206814467906952,
0.13334326446056366,
0.13402236998081207,
0.10526691377162933,
0.022426148876547813,
0.000443263299530372,
-0.029660651460289955,
-0.028923572972416878,
0.02374688722193241,
0.05549098551273346,
0.05160816013813019,
-0.014163295738399029,
0.01149732992053032,
0.06806643307209015,
-0.12421638518571854,
0.12070364505052567,
-0.036449022591114044,
-0.02896914631128311,
-0.1064339429140091,
-0.07632173597812653,
-0.01819106563925743,
-0.02058127336204052,
-0.020167777314782143,
-0.16480334103107452,
0.04725643992424011,
0.10735172033309937,
0.02164003811776638,
-0.03154119476675987,
0.03649130091071129,
-0.1478285789489746,
-0.08961409330368042,
0.06400491297245026,
0.013256547041237354,
0.04442429915070534,
0.10922383517026901,
-0.015198715962469578,
0.07959745824337006,
0.1421274095773697,
0.06106564402580261,
0.05515887960791588,
0.08380883187055588,
0.01011220645159483,
-0.01996297761797905,
-0.039803966879844666,
0.007375732995569706,
-0.06095104664564133,
0.03805322200059891,
0.16013072431087494,
0.03125954791903496,
-0.05032534524798393,
0.03500797599554062,
0.18201160430908203,
-0.035639937967061996,
-0.051755234599113464,
-0.17839790880680084,
0.20912449061870575,
0.022884216159582138,
0.04277711361646652,
0.049855321645736694,
-0.08933374285697937,
-0.03480956703424454,
0.20751702785491943,
0.1079903170466423,
0.023592760786414146,
-0.023507481440901756,
0.018514549359679222,
-0.010399332270026207,
0.009335397742688656,
0.08369572460651398,
0.0060851131565868855,
0.2181927114725113,
-0.03733808547258377,
0.017540888860821724,
0.03188346326351166,
0.03503884747624397,
-0.07272616773843765,
0.15264680981636047,
-0.047394994646310806,
-0.00214307545684278,
-0.055929794907569885,
0.021773403510451317,
0.02055230922996998,
-0.3078469932079315,
-0.1107175350189209,
0.0015837001847103238,
-0.06289318948984146,
-0.0173419751226902,
-0.021714041009545326,
-0.005632271524518728,
0.05222303420305252,
-0.003897585906088352,
0.031142285093665123,
0.18284255266189575,
-0.005867891013622284,
-0.037631142884492874,
-0.041252877563238144,
0.12176381796598434,
0.02037561684846878,
0.1384326070547104,
0.05877716839313507,
-0.01587837003171444,
0.04420533776283264,
0.023548880591988564,
-0.11315696686506271,
-0.03481902927160263,
-0.026118414476513863,
-0.010601040907204151,
-0.022787079215049744,
0.13274289667606354,
0.02010253630578518,
0.04662857577204704,
0.03814969211816788,
-0.043482739478349686,
0.05023368075489998,
0.042393725365400314,
-0.06917455792427063,
-0.05212376266717911,
0.0422617606818676,
-0.09660842269659042,
0.14075371623039246,
0.176730215549469,
0.012598123401403427,
0.025739409029483795,
-0.055729687213897705,
-0.009786244481801987,
0.0005843035178259015,
0.11382268369197845,
-0.004034154117107391,
-0.15849705040454865,
-0.013650098815560341,
-0.08710497617721558,
0.039719920605421066,
-0.2221165895462036,
-0.04493257403373718,
0.10211581736803055,
-0.011138072237372398,
-0.009596897289156914,
0.049618784338235855,
0.0051567829214036465,
0.05924003943800926,
-0.01592337340116501,
-0.04868638888001442,
0.009183932095766068,
0.06767826527357101,
-0.0829823687672615,
-0.03286344185471535
] |
null | null |
transformers
|
# MultiBERTs - Seed 15
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #15.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_15')
model = TFBertModel.from_pretrained("google/multiberts-seed_15")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_15')
model = BertModel.from_pretrained("google/multiberts-seed_15")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
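As a quick, optional sanity check of the MLM objective (not part of the original card), the `fill-mask` pipeline can be pointed at this checkpoint. The example sentence and `top_k` value below are arbitrary illustration choices:
```
from transformers import pipeline

# Illustrative MLM check (not from the original card): the input sentence
# and top_k are arbitrary; any text containing a [MASK] token works.
unmasker = pipeline('fill-mask', model='google/multiberts-seed_15')
for prediction in unmasker("The capital of France is [MASK].", top_k=5):
    print(prediction['token_str'], round(prediction['score'], 4))
```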
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_15"]}
| null |
google/multiberts-seed_15
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_15",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_15 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 15
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #15.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
Citation info
|
[
"# MultiBERTs - Seed 15\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #15.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_15 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 15\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #15.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_15 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 15\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #15.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06332466006278992,
0.09200303256511688,
-0.004092170391231775,
0.043027449399232864,
0.07735883444547653,
0.01545252650976181,
0.05501613765954971,
0.0740608274936676,
-0.08909862488508224,
0.022761056199669838,
-0.011848441325128078,
-0.046927351504564285,
0.07714672386646271,
-0.044653329998254776,
0.059805721044540405,
-0.23416543006896973,
0.049968134611845016,
-0.02989405021071434,
-0.027786703780293465,
0.027734091505408287,
0.11220365017652512,
-0.09579554200172424,
0.07428361475467682,
0.056481070816516876,
0.004794291686266661,
0.01659565232694149,
-0.01584676466882229,
0.005241220351308584,
0.08673324435949326,
0.0330086350440979,
0.08380848914384842,
-0.0024705922696739435,
0.08430036157369614,
-0.14152710139751434,
0.00665248790755868,
0.05957683548331261,
0.06343230605125427,
0.04358838498592377,
0.11680831760168076,
0.00797996111214161,
0.08695952594280243,
0.018055329099297523,
0.05178960785269737,
0.04580405727028847,
-0.07470203191041946,
-0.16899751126766205,
-0.09211649000644684,
0.01939212903380394,
-0.0009580423939041793,
0.00603343453258276,
-0.007358975242823362,
-0.019110681489109993,
-0.02011851966381073,
0.020360177382826805,
0.12101003527641296,
-0.2649694085121155,
-0.016304418444633484,
0.009821740910410881,
0.05697586387395859,
0.0563337467610836,
-0.03893837705254555,
-0.040848858654499054,
0.04345487058162689,
0.051958274096250534,
0.04168330878019333,
-0.0244220569729805,
0.03983659669756889,
-0.015770459547638893,
-0.15316453576087952,
-0.01894984021782875,
0.1052388846874237,
-0.04863778501749039,
-0.11710884422063828,
-0.048727571964263916,
-0.03335362300276756,
0.1247725784778595,
0.008454750292003155,
-0.036642104387283325,
0.04657646641135216,
0.030218183994293213,
0.06488838791847229,
-0.06316246092319489,
-0.11549847573041916,
0.026125695556402206,
-0.0513092465698719,
0.10824134200811386,
0.0940905436873436,
0.04790739715099335,
-0.007337606977671385,
0.055493250489234924,
-0.0863460823893547,
-0.07725020498037338,
-0.05132902413606644,
-0.08890029042959213,
-0.04133870452642441,
-0.039244987070560455,
-0.0844084694981575,
-0.16598716378211975,
-0.003976577892899513,
0.11088535934686661,
-0.06307501345872879,
0.00790895614773035,
-0.08897172659635544,
-0.02208901196718216,
0.09411967545747757,
0.15981894731521606,
-0.1109859049320221,
0.04802124574780464,
-0.010273738764226437,
0.010277608409523964,
-0.02367297373712063,
0.0319477915763855,
0.012466244399547577,
-0.010908237658441067,
0.05107579380273819,
0.023628078401088715,
-0.019402895122766495,
0.042995382100343704,
-0.020541256293654442,
-0.04222848638892174,
0.055628933012485504,
-0.13305261731147766,
-0.011117184534668922,
0.001684084301814437,
-0.004594801459461451,
0.0603395514190197,
0.06512080132961273,
-0.02758977562189102,
-0.08976224809885025,
0.024480735883116722,
-0.08279314637184143,
-0.047019392251968384,
-0.06039450690150261,
-0.15690545737743378,
0.02795557864010334,
-0.07621684670448303,
-0.04816873371601105,
-0.09248486161231995,
-0.0989627093076706,
-0.026731479912996292,
0.060485802590847015,
-0.01733655482530594,
0.036900658160448074,
0.031023547053337097,
-0.007883687503635883,
-0.0413612462580204,
0.04634449630975723,
0.007886611856520176,
-0.01461056899279356,
0.007692293729633093,
-0.04436657950282097,
0.0541396364569664,
-0.008048130199313164,
0.04471955448389053,
-0.07077786326408386,
0.02129325643181801,
-0.14449648559093475,
0.061454884707927704,
-0.09739182144403458,
-0.08246048539876938,
-0.05027218535542488,
-0.042400043457746506,
-0.07346206903457642,
0.030800309032201767,
0.010718017816543579,
0.06176648661494255,
-0.14718791842460632,
-0.049763452261686325,
0.1374782919883728,
-0.1363637000322342,
0.034889645874500275,
0.09543587267398834,
-0.05149364098906517,
0.04447467252612114,
0.11620515584945679,
0.05775049328804016,
0.07185517251491547,
-0.0483371876180172,
-0.014535853639245033,
0.008636354468762875,
0.034055810421705246,
0.14411500096321106,
0.06614560633897781,
-0.06835230439901352,
-0.08260887861251831,
0.0354219414293766,
-0.07370462268590927,
-0.042576078325510025,
-0.059848930686712265,
-0.004656027536839247,
-0.009818281047046185,
-0.05675411969423294,
-0.006472875364124775,
-0.023761969059705734,
-0.011520855128765106,
-0.01670902781188488,
-0.05102565884590149,
0.05444255471229553,
0.0625414326786995,
-0.08767437934875488,
0.05638866871595383,
-0.05436092987656593,
0.0169533621519804,
-0.08030997216701508,
-0.0006303901318460703,
-0.1797935515642166,
0.008630627766251564,
0.11027447134256363,
-0.10144325345754623,
0.0509980246424675,
0.16181662678718567,
0.022236686199903488,
0.06809365749359131,
-0.05245375633239746,
0.07208921015262604,
0.0067515322007238865,
-0.024857675656676292,
-0.04584198445081711,
-0.11821211129426956,
-0.06447497755289078,
-0.06068529561161995,
0.010257539339363575,
-0.08470186591148376,
-0.004956322256475687,
-0.040126629173755646,
0.02147449553012848,
0.022555848583579063,
-0.06358499079942703,
0.02146313339471817,
0.024029279127717018,
-0.03848828747868538,
-0.02723701484501362,
-0.02558080479502678,
0.04459831118583679,
0.01644960232079029,
0.11654721200466156,
-0.09322687238454819,
-0.06754136085510254,
0.0464714840054512,
0.052724868059158325,
-0.052768416702747345,
0.09157712757587433,
-0.05399976298213005,
-0.03412441536784172,
-0.0993337482213974,
-0.09878513216972351,
0.1714373081922531,
-0.005777465179562569,
0.09896716475486755,
-0.09711898118257523,
-0.02614668942987919,
0.00006187101826071739,
-0.008098354563117027,
-0.0027629465330392122,
0.05224592611193657,
0.014209059998393059,
-0.09514471888542175,
-0.0018709093565121293,
0.014037679880857468,
0.018394075334072113,
0.07668723911046982,
-0.019679544493556023,
-0.11460201442241669,
0.03073134273290634,
-0.002497417852282524,
-0.006198112387210131,
0.06501660495996475,
-0.04986860975623131,
-0.0051009682938456535,
0.055103905498981476,
0.05558689683675766,
0.05579516291618347,
-0.06559900939464569,
0.09557690471410751,
0.06605851650238037,
-0.04285601153969765,
-0.04438501223921776,
-0.08420123904943466,
0.010318142361938953,
0.11534634977579117,
0.025254538282752037,
0.05983242765069008,
-0.04769892990589142,
-0.0233575701713562,
-0.10317496210336685,
0.1571374088525772,
-0.08700951188802719,
-0.16047537326812744,
-0.1517850011587143,
0.0068072667345404625,
-0.05544364079833031,
0.06212938576936722,
0.015821322798728943,
-0.05019592121243477,
-0.09795331954956055,
-0.0776318833231926,
0.1607639640569687,
-0.038790635764598846,
-0.006768211722373962,
0.018805844709277153,
-0.02924615703523159,
0.0376162976026535,
-0.18186023831367493,
-0.00021341849060263485,
-0.04010374844074249,
-0.1240011602640152,
-0.03887559100985527,
0.0002413501642877236,
0.06890618801116943,
0.07072191685438156,
-0.03850068897008896,
-0.0772860199213028,
0.018786096945405006,
0.1660531461238861,
0.033342424780130386,
0.07699444890022278,
0.09345069527626038,
-0.09797460585832596,
0.042540863156318665,
0.046790000051259995,
0.03092760592699051,
-0.013709813356399536,
0.00861017033457756,
0.05796220526099205,
-0.025401782244443893,
-0.28477588295936584,
-0.0086598452180624,
-0.01949704997241497,
-0.018194496631622314,
0.06624123454093933,
0.04153089225292206,
-0.08483747392892838,
0.048893220722675323,
-0.05823914706707001,
0.033556416630744934,
0.08731398731470108,
0.04466010257601738,
0.09575439244508743,
-0.038000110536813736,
0.09301557391881943,
-0.053513605147600174,
-0.018014658242464066,
0.1083221286535263,
-0.05213440582156181,
0.19896982610225677,
-0.05644820258021355,
0.053662870079278946,
0.09724700450897217,
-0.013234177604317665,
0.03850246220827103,
0.13953641057014465,
-0.05333146080374718,
0.06929438561201096,
-0.05755016580224037,
-0.04504867270588875,
-0.037652190774679184,
0.023959821090102196,
-0.001924454583786428,
0.03574812039732933,
-0.036029599606990814,
-0.015896033495664597,
-0.0037143025547266006,
0.24030917882919312,
0.06834804266691208,
-0.12211181968450546,
-0.06881725788116455,
0.006754667963832617,
-0.10861469060182571,
-0.07074134051799774,
0.05031067878007889,
0.09044741839170456,
-0.08236490935087204,
0.046822961419820786,
0.009534631855785847,
0.06869866698980331,
-0.12592072784900665,
0.020726822316646576,
0.03911392018198967,
0.05156191065907478,
-0.026187216863036156,
0.03346913680434227,
-0.15585032105445862,
0.08330103009939194,
0.03636709228157997,
0.05304080620408058,
-0.05122043192386627,
0.06373929232358932,
0.021541669964790344,
-0.01329395454376936,
0.025988725945353508,
0.011172707192599773,
-0.018899910151958466,
-0.027588477358222008,
-0.0668288916349411,
0.08279436826705933,
0.07505931705236435,
-0.050748568028211594,
0.11931738257408142,
-0.0493084117770195,
0.011715130880475044,
-0.00879130233079195,
0.07649244368076324,
-0.17306920886039734,
-0.1308993101119995,
0.04488598555326462,
-0.142909973859787,
-0.025057844817638397,
-0.06767313182353973,
-0.05492383986711502,
-0.0701737105846405,
0.1662575602531433,
-0.12152989208698273,
-0.13313078880310059,
-0.08528587967157364,
-0.010665802285075188,
0.15356869995594025,
-0.030343234539031982,
0.008827549405395985,
-0.017491377890110016,
0.13240757584571838,
-0.037267182022333145,
-0.15187209844589233,
-0.04893239587545395,
-0.07090936601161957,
-0.1504828780889511,
-0.03356736898422241,
0.06920149177312851,
0.10942667722702026,
0.05204106867313385,
0.004859032109379768,
0.025714291259646416,
0.003731529228389263,
-0.052149202674627304,
-0.016514338552951813,
0.17909279465675354,
0.053105324506759644,
0.0709538385272026,
-0.15967194736003876,
-0.05754062533378601,
-0.049993183463811874,
0.023779580369591713,
-0.04643712937831879,
0.09796798229217529,
-0.029702499508857727,
0.07780725508928299,
0.2426520437002182,
-0.12919025123119354,
-0.2037753015756607,
0.008666622452437878,
0.029942478984594345,
0.004302667919546366,
0.007197790313512087,
-0.22484180331230164,
0.12236598134040833,
0.08877803385257721,
-0.00010933159501291811,
-0.008104964159429073,
-0.18425403535366058,
-0.08234447240829468,
0.08038152009248734,
0.009064191952347755,
0.14665117859840393,
-0.09185827523469925,
-0.031643956899642944,
0.008385347202420235,
-0.08630845695734024,
0.053271498531103134,
0.04779260233044624,
0.08298952132463455,
-0.0005403977702371776,
-0.0774487629532814,
0.05027512088418007,
-0.014850989915430546,
0.0861976146697998,
0.04524306207895279,
0.047068383544683456,
-0.033897675573825836,
0.1315135657787323,
0.00260766944848001,
-0.01645778678357601,
0.1380576491355896,
0.11436376720666885,
0.05677468702197075,
-0.024719856679439545,
-0.062709279358387,
-0.07315408438444138,
0.011738687753677368,
-0.021328285336494446,
-0.03883831202983856,
-0.06425061076879501,
0.04041632264852524,
0.06339284032583237,
0.0006401644786819816,
-0.042316921055316925,
-0.02495911717414856,
0.058671124279499054,
0.08985893428325653,
0.19343838095664978,
-0.05475679039955139,
-0.00598370423540473,
-0.01795472390949726,
-0.022117700427770615,
0.0696035623550415,
-0.02014608308672905,
0.06492850929498672,
0.08991534262895584,
0.010151547379791737,
0.08221916109323502,
0.06202511116862297,
-0.13129375874996185,
-0.023107822984457016,
0.05431913956999779,
-0.10076601058244705,
-0.13824397325515747,
-0.02760918065905571,
-0.10783348977565765,
-0.13473893702030182,
-0.0018315947381779552,
0.17043271660804749,
-0.03729777783155441,
-0.046543851494789124,
-0.015080207027494907,
0.07956037670373917,
0.020195117220282555,
0.1318962574005127,
0.03404407948255539,
-0.01497770007699728,
-0.062499042600393295,
0.1715957075357437,
0.08928560465574265,
-0.09409268200397491,
0.009451213292777538,
0.01571587845683098,
-0.05975600704550743,
-0.004840195644646883,
-0.06515422463417053,
0.07513683289289474,
-0.02823452837765217,
-0.03898581862449646,
0.0015688976272940636,
-0.09994914382696152,
0.05026504024863243,
0.15090395510196686,
0.0073155565187335014,
0.15725940465927124,
-0.03705909848213196,
0.06323045492172241,
-0.07488369941711426,
0.0733971819281578,
0.054315850138664246,
0.07696573436260223,
-0.017358893528580666,
0.04714124649763107,
-0.045338645577430725,
-0.0018221283098682761,
-0.014795124530792236,
0.001782845240086317,
-0.09075469523668289,
-0.05614837259054184,
-0.2228754609823227,
0.02622312679886818,
-0.057853613048791885,
-0.03564894199371338,
0.010067624971270561,
-0.01475584413856268,
0.0033036733511835337,
0.03639746829867363,
-0.02578374743461609,
-0.03259771689772606,
-0.02608272060751915,
0.062408942729234695,
-0.12244585156440735,
0.02636493742465973,
0.06706055253744125,
-0.08740688860416412,
0.07722721248865128,
-0.001016430207528174,
-0.05217745900154114,
-0.0008914720383472741,
0.012634007260203362,
-0.04730933904647827,
-0.02999882772564888,
0.00806648563593626,
-0.05273623764514923,
-0.11136043071746826,
0.028055435046553612,
0.011688235215842724,
-0.02640264667570591,
-0.030214888975024223,
0.07454196363687515,
-0.0648849830031395,
0.052305933088064194,
0.03718230873346329,
0.005499035585671663,
-0.043490760028362274,
-0.01601402834057808,
0.11973556131124496,
0.0757327675819397,
0.05630899965763092,
-0.05050211399793625,
-0.018837550655007362,
-0.15560796856880188,
-0.0014987929025664926,
-0.002484179800376296,
-0.004318599123507738,
-0.04039793089032173,
-0.03721190616488457,
0.03088012896478176,
0.01124048512428999,
0.17786158621311188,
0.008384318090975285,
0.014954100362956524,
0.009799572639167309,
0.00591098191216588,
0.008411599323153496,
0.03402940556406975,
0.0706799253821373,
-0.014337060041725636,
-0.07799403369426727,
-0.07868827879428864,
0.035094235092401505,
-0.03035077266395092,
-0.02058662474155426,
0.13414575159549713,
0.13508552312850952,
0.10414331406354904,
0.022379683330655098,
0.00022937546600587666,
-0.03040825016796589,
-0.02764630876481533,
0.02410745806992054,
0.0553002692759037,
0.05117383599281311,
-0.014185327105224133,
0.012393519282341003,
0.06788963079452515,
-0.12469512224197388,
0.12108863145112991,
-0.0356300063431263,
-0.02845570258796215,
-0.10654068738222122,
-0.07750365883111954,
-0.017998551949858665,
-0.020232681185007095,
-0.020024606958031654,
-0.16492074728012085,
0.0470312163233757,
0.1063462495803833,
0.02178133651614189,
-0.03150218725204468,
0.037182293832302094,
-0.14849679172039032,
-0.08979272097349167,
0.06484772264957428,
0.012744007632136345,
0.044044338166713715,
0.10836905241012573,
-0.014540321193635464,
0.0801570937037468,
0.14207178354263306,
0.061296794563531876,
0.054665468633174896,
0.08299632370471954,
0.010013466700911522,
-0.01940923184156418,
-0.040194861590862274,
0.007707086857408285,
-0.06166791915893555,
0.03789857029914856,
0.1611788272857666,
0.03124934434890747,
-0.04941035434603691,
0.034609321504831314,
0.1817055493593216,
-0.03525121882557869,
-0.05177651718258858,
-0.17847420275211334,
0.20746304094791412,
0.023439979180693626,
0.04354046285152435,
0.050627533346414566,
-0.08946014195680618,
-0.03525093197822571,
0.20786191523075104,
0.10685398429632187,
0.023239916190505028,
-0.02346692606806755,
0.01836218126118183,
-0.010533105581998825,
0.009423702955245972,
0.08282873779535294,
0.0059467488899827,
0.21754923462867737,
-0.037449583411216736,
0.018431279808282852,
0.03207816928625107,
0.03434476628899574,
-0.07148479670286179,
0.15215067565441132,
-0.04668132960796356,
-0.002219460206106305,
-0.05566364526748657,
0.02195727825164795,
0.02070283330976963,
-0.3071568012237549,
-0.11173132061958313,
0.0014953649369999766,
-0.06265082955360413,
-0.018080977723002434,
-0.022974997758865356,
-0.0050072260200977325,
0.05179668590426445,
-0.003367692232131958,
0.03165961429476738,
0.18217158317565918,
-0.005663579795509577,
-0.03796634450554848,
-0.042332906275987625,
0.12147761136293411,
0.021201549097895622,
0.13778050243854523,
0.058870624750852585,
-0.01626593805849552,
0.04439513385295868,
0.023189838975667953,
-0.11325633525848389,
-0.03481103107333183,
-0.026345064863562584,
-0.009700347669422626,
-0.022764142602682114,
0.1334700882434845,
0.020449461415410042,
0.04733975976705551,
0.037650007754564285,
-0.044162727892398834,
0.05069880560040474,
0.041670214384794235,
-0.06925792247056961,
-0.052628446370363235,
0.04375885799527168,
-0.09597142785787582,
0.14080418646335602,
0.17637351155281067,
0.012722749263048172,
0.025188349187374115,
-0.056043051183223724,
-0.009497033432126045,
0.0005927020683884621,
0.11591871827840805,
-0.004044972360134125,
-0.15873099863529205,
-0.013022695668041706,
-0.08868297189474106,
0.04027677699923515,
-0.22194810211658478,
-0.04454420879483223,
0.10178721696138382,
-0.011264113709330559,
-0.009606636129319668,
0.04896024987101555,
0.006075829733163118,
0.05963410064578056,
-0.01580096036195755,
-0.049007803201675415,
0.009287563152611256,
0.0674506202340126,
-0.08255976438522339,
-0.03224799036979675
] |
null | null |
transformers
|
# MultiBERTs - Seed 16
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #16.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_16')
model = TFBertModel.from_pretrained("google/multiberts-seed_16")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_16')
model = BertModel.from_pretrained("google/multiberts-seed_16")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
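Beyond the raw forward pass above, token-level features can be read directly from the model output. The PyTorch sketch below is illustrative only (it is not from the original card), and treating the `[CLS]` hidden state as a fixed-size sentence representation is an assumption made here for brevity, not a recommendation:
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_16')
model = BertModel.from_pretrained("google/multiberts-seed_16")

# Encode one sentence and take the hidden state of the [CLS] token
# as a simple, illustrative fixed-size representation.
encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)
cls_vector = output.last_hidden_state[:, 0, :]  # shape: (1, 768)
print(cls_vector.shape)
```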
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_16"]}
| null |
google/multiberts-seed_16
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_16",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_16 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 16
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #16.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
Citation info
|
[
"# MultiBERTs - Seed 16\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #16.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_16 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 16\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #16.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_16 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 16\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #16.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06353480368852615,
0.09142616391181946,
-0.004101868718862534,
0.04388528689742088,
0.07765889912843704,
0.01569799706339836,
0.05543048679828644,
0.07401657104492188,
-0.08877194672822952,
0.022821299731731415,
-0.012134626507759094,
-0.04647412151098251,
0.07720378786325455,
-0.04617065563797951,
0.06069639325141907,
-0.23358674347400665,
0.04983681067824364,
-0.03004891611635685,
-0.026919659227132797,
0.02708493173122406,
0.1116112619638443,
-0.09579354524612427,
0.0746297687292099,
0.056418970227241516,
0.004313289653509855,
0.016745978966355324,
-0.015845950692892075,
0.005507004912942648,
0.08709011971950531,
0.03314105048775673,
0.08395111560821533,
-0.002167454920709133,
0.08514056354761124,
-0.14064057171344757,
0.006387699395418167,
0.059576358646154404,
0.06336703896522522,
0.04319487884640694,
0.11718498915433884,
0.008235247805714607,
0.08775999397039413,
0.018608270213007927,
0.05208534002304077,
0.0454559326171875,
-0.07463014870882034,
-0.1692783236503601,
-0.09285611659288406,
0.01742440275847912,
-0.001542109763249755,
0.005937768612056971,
-0.0071129086427390575,
-0.0193277969956398,
-0.01970021054148674,
0.020681846886873245,
0.11987446248531342,
-0.26545360684394836,
-0.016285978257656097,
0.010247939266264439,
0.057873982936143875,
0.05645351856946945,
-0.03828386217355728,
-0.04156046360731125,
0.043602362275123596,
0.052478086203336716,
0.04075650870800018,
-0.024593420326709747,
0.03947393596172333,
-0.015710990875959396,
-0.1529826819896698,
-0.018964102491736412,
0.10485860705375671,
-0.04992048442363739,
-0.11724450439214706,
-0.047774963080883026,
-0.032871998846530914,
0.12373558431863785,
0.009210162796080112,
-0.03589378297328949,
0.046428993344306946,
0.030499452725052834,
0.06516212224960327,
-0.06269125640392303,
-0.11541548371315002,
0.02603202871978283,
-0.05097091197967529,
0.10782845318317413,
0.09395932406187057,
0.04878992587327957,
-0.0066558970138430595,
0.0557149238884449,
-0.0850573405623436,
-0.07721540331840515,
-0.05140354856848717,
-0.08816380053758621,
-0.04141167923808098,
-0.038366954773664474,
-0.08436733484268188,
-0.16506274044513702,
-0.004395400173962116,
0.11019177734851837,
-0.06342282891273499,
0.008405741304159164,
-0.08988197147846222,
-0.02156064100563526,
0.09351243078708649,
0.16090016067028046,
-0.11038010567426682,
0.04978874325752258,
-0.010437743738293648,
0.010164044797420502,
-0.023622270673513412,
0.03157319501042366,
0.012234107591211796,
-0.010849460028111935,
0.05062295123934746,
0.023761847987771034,
-0.019885467365384102,
0.043220724910497665,
-0.020783070474863052,
-0.04217759892344475,
0.056034479290246964,
-0.13336797058582306,
-0.01074312999844551,
0.0018668799893930554,
-0.0040938155725598335,
0.060631051659584045,
0.06550450623035431,
-0.027816694229841232,
-0.09013502299785614,
0.024073956534266472,
-0.08216741681098938,
-0.047058746218681335,
-0.06069399416446686,
-0.15655167400836945,
0.027973739430308342,
-0.07637954503297806,
-0.04836704581975937,
-0.09260601550340652,
-0.09914232790470123,
-0.027232486754655838,
0.06014968082308769,
-0.017721660435199738,
0.037211399525403976,
0.03116511180996895,
-0.008336836472153664,
-0.04170484468340874,
0.04647901654243469,
0.00794418714940548,
-0.014873383566737175,
0.008135924115777016,
-0.04462023824453354,
0.05462897941470146,
-0.00887793768197298,
0.04454931244254112,
-0.0706457793712616,
0.021533329039812088,
-0.14305076003074646,
0.06166455149650574,
-0.09703380614519119,
-0.08274106681346893,
-0.04936997592449188,
-0.042364608496427536,
-0.07254062592983246,
0.030838536098599434,
0.010890742763876915,
0.06155189499258995,
-0.1476157009601593,
-0.04943184182047844,
0.13618767261505127,
-0.13694149255752563,
0.035225264728069305,
0.09599125385284424,
-0.05153084173798561,
0.044527146965265274,
0.11596471071243286,
0.057926226407289505,
0.07165174931287766,
-0.047101084142923355,
-0.014713248237967491,
0.007953662425279617,
0.03477771580219269,
0.144303560256958,
0.06573175638914108,
-0.06759930402040482,
-0.08366865664720535,
0.03606918826699257,
-0.07281270623207092,
-0.043514594435691833,
-0.059933654963970184,
-0.004586956929415464,
-0.01011225488036871,
-0.05680989474058151,
-0.005840911064296961,
-0.023687411099672318,
-0.012405607849359512,
-0.017232796177268028,
-0.05134102329611778,
0.051968809217214584,
0.062495969235897064,
-0.0879317969083786,
0.0565943568944931,
-0.05496852844953537,
0.01686561480164528,
-0.07983691245317459,
-0.0009887844789773226,
-0.17938973009586334,
0.008470849134027958,
0.11064736545085907,
-0.09799317270517349,
0.050621397793293,
0.16241739690303802,
0.0219434667378664,
0.06855353713035583,
-0.05206939950585365,
0.07153847068548203,
0.006478102412074804,
-0.025446400046348572,
-0.045576538890600204,
-0.11788199841976166,
-0.06516020745038986,
-0.061170995235443115,
0.008999166078865528,
-0.08559057861566544,
-0.00530912633985281,
-0.03936242684721947,
0.020329909399151802,
0.02237006463110447,
-0.06356330215930939,
0.020688405260443687,
0.024388043209910393,
-0.03874046728014946,
-0.027407420799136162,
-0.02580198645591736,
0.04493410140275955,
0.016256865113973618,
0.11678638309240341,
-0.09392967075109482,
-0.06751945614814758,
0.0462123267352581,
0.05239962413907051,
-0.05252668634057045,
0.09256323426961899,
-0.05417431518435478,
-0.03411759436130524,
-0.09854351729154587,
-0.09870626777410507,
0.17309436202049255,
-0.005951888393610716,
0.09892744570970535,
-0.09684942662715912,
-0.025514021515846252,
0.0000493507795908954,
-0.008500703610479832,
-0.0029356456361711025,
0.05252816528081894,
0.01398947462439537,
-0.09288229048252106,
-0.002478144597262144,
0.01353710237890482,
0.018161144107580185,
0.07669556885957718,
-0.019794490188360214,
-0.11382227391004562,
0.03075692616403103,
-0.0019666429143399,
-0.005593663081526756,
0.06507059931755066,
-0.04980267211794853,
-0.004795421846210957,
0.05525221675634384,
0.05563199892640114,
0.05585361644625664,
-0.06622838973999023,
0.09551946818828583,
0.06550589203834534,
-0.043052710592746735,
-0.04356566444039345,
-0.0848238617181778,
0.010781943798065186,
0.11566612124443054,
0.024589337408542633,
0.05949253961443901,
-0.047586578875780106,
-0.023438408970832825,
-0.1035928875207901,
0.15708903968334198,
-0.08745111525058746,
-0.15959687530994415,
-0.1527354121208191,
0.007051174063235521,
-0.05537053942680359,
0.06280683726072311,
0.015345432795584202,
-0.05080663785338402,
-0.0979456827044487,
-0.07815665006637573,
0.16127993166446686,
-0.03965383023023605,
-0.007029711734503508,
0.019620351493358612,
-0.02937256172299385,
0.03747161850333214,
-0.18184050917625427,
-0.00031830722582526505,
-0.03973037749528885,
-0.12594929337501526,
-0.037773117423057556,
0.0006397629040293396,
0.06845133006572723,
0.07101382315158844,
-0.038581810891628265,
-0.07763119786977768,
0.019025448709726334,
0.16515295207500458,
0.03394833579659462,
0.07759912312030792,
0.09386984258890152,
-0.09891390800476074,
0.04244943708181381,
0.04641960188746452,
0.030480260029435158,
-0.01432799082249403,
0.0087208217009902,
0.05725201219320297,
-0.026634659618139267,
-0.28546297550201416,
-0.008499015122652054,
-0.01940459944307804,
-0.017492158338427544,
0.06604041159152985,
0.041660889983177185,
-0.08622570335865021,
0.049161624163389206,
-0.05751555785536766,
0.033931054174900055,
0.08669006824493408,
0.04431282728910446,
0.095818892121315,
-0.03859835863113403,
0.09277535229921341,
-0.05353790894150734,
-0.017690161243081093,
0.10851643234491348,
-0.05138978734612465,
0.1989508420228958,
-0.05725081264972687,
0.054270464926958084,
0.09730500727891922,
-0.012774523347616196,
0.03798650950193405,
0.13988660275936127,
-0.05335128679871559,
0.06978707015514374,
-0.05752197653055191,
-0.04502541571855545,
-0.03870970010757446,
0.023676836863160133,
-0.0027070704381912947,
0.03617604449391365,
-0.036365725100040436,
-0.015540105290710926,
-0.004305876325815916,
0.23872733116149902,
0.06832762062549591,
-0.12337920814752579,
-0.06872603297233582,
0.00697304354980588,
-0.1078861653804779,
-0.0705459862947464,
0.05053388699889183,
0.09160547703504562,
-0.08216094970703125,
0.04655803367495537,
0.009861506521701813,
0.0683184266090393,
-0.127239391207695,
0.020734058693051338,
0.039407674223184586,
0.05115732550621033,
-0.025791967287659645,
0.03353127837181091,
-0.15530000627040863,
0.08238745480775833,
0.03609573468565941,
0.053082287311553955,
-0.052014898508787155,
0.06423600763082504,
0.020582841709256172,
-0.013229338452219963,
0.025164620950818062,
0.01144922524690628,
-0.021080831065773964,
-0.027999142184853554,
-0.06593991816043854,
0.08334699273109436,
0.0747586190700531,
-0.05044439062476158,
0.11977723985910416,
-0.049105580896139145,
0.012563138268887997,
-0.009022203274071217,
0.07597413659095764,
-0.1719510555267334,
-0.13109555840492249,
0.04482973739504814,
-0.14277812838554382,
-0.025678012520074844,
-0.0676426887512207,
-0.054497089236974716,
-0.06903769075870514,
0.16714812815189362,
-0.12142780423164368,
-0.13304725289344788,
-0.085170678794384,
-0.011682194657623768,
0.15400154888629913,
-0.030078593641519547,
0.007858207449316978,
-0.0167702604085207,
0.13166281580924988,
-0.03735969588160515,
-0.15217159688472748,
-0.04922036826610565,
-0.0710374116897583,
-0.15152715146541595,
-0.03342280164361,
0.06977195292711258,
0.10934116691350937,
0.05182601511478424,
0.0049776542000472546,
0.025801345705986023,
0.0037818036507815123,
-0.052275240421295166,
-0.015591919422149658,
0.17909811437129974,
0.05286775156855583,
0.07089820504188538,
-0.15977060794830322,
-0.05582772567868233,
-0.04902249202132225,
0.023827502503991127,
-0.047049619257450104,
0.0981542095541954,
-0.029480373486876488,
0.07848862558603287,
0.24203093349933624,
-0.12972094118595123,
-0.20386748015880585,
0.008192823268473148,
0.029718738049268723,
0.004352552350610495,
0.007114461623132229,
-0.22515355050563812,
0.12213646620512009,
0.08863790333271027,
0.00012138713645981625,
-0.006275595165789127,
-0.18495799601078033,
-0.08229689300060272,
0.0797315314412117,
0.009009137749671936,
0.14770618081092834,
-0.09241171926259995,
-0.031517207622528076,
0.008344465866684914,
-0.08604077249765396,
0.052371546626091,
0.04846246913075447,
0.08330940455198288,
-0.0006420360296033323,
-0.07565542310476303,
0.05054917186498642,
-0.014413678087294102,
0.08645538240671158,
0.04543938860297203,
0.0469445139169693,
-0.0346250981092453,
0.13198724389076233,
0.0026353851426392794,
-0.016149181872606277,
0.13778136670589447,
0.1136653795838356,
0.056813497096300125,
-0.025975912809371948,
-0.06266051530838013,
-0.07300996035337448,
0.011276131495833397,
-0.02162988670170307,
-0.03865363821387291,
-0.0641828253865242,
0.04002751410007477,
0.06315626204013824,
0.000343712221365422,
-0.04410139098763466,
-0.024092530831694603,
0.05829862877726555,
0.09165658801794052,
0.19386537373065948,
-0.05504553020000458,
-0.005474558100104332,
-0.017379047349095345,
-0.021747959777712822,
0.06956710666418076,
-0.019103914499282837,
0.06453480571508408,
0.0899989977478981,
0.009702925570309162,
0.0816020593047142,
0.0626162514090538,
-0.13181616365909576,
-0.022668013349175453,
0.054232459515333176,
-0.10093163698911667,
-0.13759870827198029,
-0.02681691199541092,
-0.1071753278374672,
-0.13428908586502075,
-0.001569634536281228,
0.17098884284496307,
-0.0373135544359684,
-0.04646676778793335,
-0.015207775868475437,
0.08020249754190445,
0.020115653052926064,
0.13201642036437988,
0.033818479627370834,
-0.014965678565204144,
-0.06246185302734375,
0.17188367247581482,
0.08942380547523499,
-0.09326007217168808,
0.009753829799592495,
0.016033286228775978,
-0.05902956426143646,
-0.0051552848890423775,
-0.06525669246912003,
0.07498473674058914,
-0.027256635949015617,
-0.0390457846224308,
0.0015419539995491505,
-0.10046569257974625,
0.049983106553554535,
0.14902636408805847,
0.006894703954458237,
0.15798857808113098,
-0.037297818809747696,
0.06292460113763809,
-0.07552646845579147,
0.07305072993040085,
0.05382808670401573,
0.07720036804676056,
-0.016770076006650925,
0.047934021800756454,
-0.04607487469911575,
-0.0006081545143388212,
-0.014660320244729519,
0.0020702776964753866,
-0.09079053997993469,
-0.05577325448393822,
-0.22253845632076263,
0.02696080505847931,
-0.05739763751626015,
-0.03564615175127983,
0.010388129390776157,
-0.01428418979048729,
0.003578403266146779,
0.03673814609646797,
-0.025531679391860962,
-0.032804399728775024,
-0.02621719427406788,
0.062193356454372406,
-0.12217046320438385,
0.025869863107800484,
0.06739417463541031,
-0.08766789734363556,
0.07753695547580719,
-0.0015319912927225232,
-0.052709855139255524,
-0.0006020337459631264,
0.01326887495815754,
-0.04692128673195839,
-0.029789844527840614,
0.008136661723256111,
-0.053348563611507416,
-0.11093196272850037,
0.027781682088971138,
0.011417648755013943,
-0.026706116273999214,
-0.029872149229049683,
0.07510297000408173,
-0.06504439562559128,
0.052475184202194214,
0.03727421909570694,
0.005037197843194008,
-0.042950909584760666,
-0.015539255924522877,
0.11918231844902039,
0.07580387592315674,
0.05591593682765961,
-0.05092879757285118,
-0.018677441403269768,
-0.1558121144771576,
-0.0020293081179261208,
-0.0021630495321005583,
-0.003940610680729151,
-0.03962872549891472,
-0.03739919513463974,
0.03116992861032486,
0.011667067185044289,
0.17744240164756775,
0.008893675170838833,
0.014161408878862858,
0.008970173075795174,
0.005342627409845591,
0.008300445042550564,
0.034442730247974396,
0.06996442377567291,
-0.01526028010994196,
-0.0781409740447998,
-0.07940541207790375,
0.034036438912153244,
-0.030856074765324593,
-0.020001940429210663,
0.13439293205738068,
0.1343991905450821,
0.10467728227376938,
0.0225922130048275,
0.000730125408153981,
-0.030537744984030724,
-0.027701925486326218,
0.023238906636834145,
0.05513741821050644,
0.0513334795832634,
-0.01418950967490673,
0.01205900963395834,
0.066741444170475,
-0.12411180883646011,
0.1213437095284462,
-0.03607438877224922,
-0.02873419225215912,
-0.1062554270029068,
-0.0771280899643898,
-0.017951160669326782,
-0.01999487727880478,
-0.020255986601114273,
-0.1652962863445282,
0.04744260013103485,
0.10696402192115784,
0.02156752720475197,
-0.03191308677196503,
0.03694584593176842,
-0.1480175405740738,
-0.08912667632102966,
0.06442749500274658,
0.013104929588735104,
0.04422607645392418,
0.10912685841321945,
-0.014791908673942089,
0.08008348196744919,
0.1428074836730957,
0.06162945553660393,
0.05549587681889534,
0.08341608196496964,
0.010119319893419743,
-0.01935810223221779,
-0.039983537048101425,
0.00738516915589571,
-0.061867788434028625,
0.038367755711078644,
0.16089464724063873,
0.031261373311281204,
-0.049483321607112885,
0.034676000475883484,
0.1809656023979187,
-0.03485441580414772,
-0.050779540091753006,
-0.17805325984954834,
0.20651575922966003,
0.022790461778640747,
0.0433429479598999,
0.050311408936977386,
-0.08926525712013245,
-0.03576430305838585,
0.2072741687297821,
0.10746583342552185,
0.02267671376466751,
-0.02376292459666729,
0.01779516041278839,
-0.010567360557615757,
0.009559104219079018,
0.08436647057533264,
0.005852680187672377,
0.2178974598646164,
-0.037604015320539474,
0.017242034897208214,
0.03187498450279236,
0.03476698324084282,
-0.07182014733552933,
0.15212926268577576,
-0.0475071519613266,
-0.0019344300962984562,
-0.056135393679142,
0.021060530096292496,
0.019917789846658707,
-0.3078377842903137,
-0.10990861058235168,
0.0011472692713141441,
-0.06246105581521988,
-0.017755987122654915,
-0.02272281050682068,
-0.00594079215079546,
0.052619803696870804,
-0.004327814560383558,
0.03161431476473808,
0.18196867406368256,
-0.006245512515306473,
-0.03814130648970604,
-0.04064387083053589,
0.12105965614318848,
0.021439187228679657,
0.13847392797470093,
0.0587606243789196,
-0.015810959041118622,
0.04393404722213745,
0.02355995401740074,
-0.1138792335987091,
-0.034807588905096054,
-0.026409680023789406,
-0.010343805886805058,
-0.02286282368004322,
0.13295848667621613,
0.02029328979551792,
0.04686267673969269,
0.03823905438184738,
-0.04428697004914284,
0.05043468996882439,
0.04244830831885338,
-0.06870096176862717,
-0.05215604230761528,
0.04304465651512146,
-0.09680446237325668,
0.14036524295806885,
0.17668037116527557,
0.012724047526717186,
0.0252071525901556,
-0.05554632097482681,
-0.00960784126073122,
-0.00004068465932505205,
0.11577189713716507,
-0.0037855850532650948,
-0.1585221290588379,
-0.0134105971083045,
-0.08796390146017075,
0.03979915753006935,
-0.22315500676631927,
-0.04462489113211632,
0.10182499140501022,
-0.011407669633626938,
-0.009971851482987404,
0.049720779061317444,
0.005143793765455484,
0.0595250278711319,
-0.01585751213133335,
-0.048544447869062424,
0.009299499914050102,
0.06793893873691559,
-0.08253171294927597,
-0.03267989307641983
] |
null | null |
transformers
|
# MultiBERTs - Seed 17
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #17.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel

# Load the seed-17 tokenizer and encoder, then run one forward pass in TensorFlow.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_17')
model = TFBertModel.from_pretrained("google/multiberts-seed_17")

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)  # output.last_hidden_state holds the contextual embeddings
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Same forward pass as above, but with the PyTorch weights.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_17')
model = BertModel.from_pretrained("google/multiberts-seed_17")

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)  # output.last_hidden_state holds the contextual embeddings
```
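Because the checkpoint was trained with the MLM objective described above, it can also be loaded with a masked-language-modelling head. The snippet below is a minimal sketch, not part of the original release notes: it assumes the checkpoint's MLM weights load into `BertForMaskedLM` (the NSP head is simply ignored), and the example sentence is purely illustrative.
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

# Load the checkpoint together with its masked-language-modelling head.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_17')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_17')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the masked position.
mask_positions = (inputs.input_ids == tokenizer.mask_token_id)[0].nonzero(as_tuple=True)[0]
predicted_ids = logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(predicted_ids))
```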
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_17"]}
| null |
google/multiberts-seed_17
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_17",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_17 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 17
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #17.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 17\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #17.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_17 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 17\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #17.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_17 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 17\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #17.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06334694474935532,
0.08980859071016312,
-0.004093654919415712,
0.04370199516415596,
0.07657399773597717,
0.01528717391192913,
0.05531702935695648,
0.07456396520137787,
-0.08859772235155106,
0.022957880049943924,
-0.011916710995137691,
-0.04575943574309349,
0.07700354605913162,
-0.043848130851984024,
0.06027313694357872,
-0.2345762401819229,
0.04944191128015518,
-0.03055954910814762,
-0.025409219786524773,
0.027783259749412537,
0.11213581264019012,
-0.0956750139594078,
0.07470838725566864,
0.05604781582951546,
0.005351652856916189,
0.01717330329120159,
-0.015539068728685379,
0.004914944525808096,
0.08712541311979294,
0.03330431878566742,
0.08374817669391632,
-0.0032085259445011616,
0.08487261086702347,
-0.14138369262218475,
0.006253089290112257,
0.06012364476919174,
0.06339459866285324,
0.04298502206802368,
0.11712269484996796,
0.007942179217934608,
0.08825711160898209,
0.017261886969208717,
0.05206025019288063,
0.04568461328744888,
-0.07478469610214233,
-0.17051292955875397,
-0.09340990334749222,
0.01838761195540428,
-0.000912651012185961,
0.005984471645206213,
-0.007301381789147854,
-0.017400214448571205,
-0.01903226412832737,
0.02089116908609867,
0.12042713165283203,
-0.26654690504074097,
-0.016875043511390686,
0.010112220421433449,
0.05879316106438637,
0.05690488591790199,
-0.03781385347247124,
-0.0401700995862484,
0.04389084503054619,
0.051975466310977936,
0.040978074073791504,
-0.024928150698542595,
0.041414063423871994,
-0.0161205492913723,
-0.15293456614017487,
-0.018766336143016815,
0.10595985502004623,
-0.04904564470052719,
-0.11723090708255768,
-0.04893796890974045,
-0.03287007659673691,
0.12396944314241409,
0.009004305116832256,
-0.03669169917702675,
0.04626867175102234,
0.030395817011594772,
0.06456798315048218,
-0.06274372339248657,
-0.11523279547691345,
0.026264792308211327,
-0.05082546919584274,
0.1074252501130104,
0.09362012892961502,
0.04837385192513466,
-0.0071063051000237465,
0.055048618465662,
-0.08589807897806168,
-0.07701057195663452,
-0.05114508047699928,
-0.08920080959796906,
-0.04176977649331093,
-0.038632042706012726,
-0.08377943187952042,
-0.16593149304389954,
-0.004515554755926132,
0.11117329448461533,
-0.0628974437713623,
0.008361569605767727,
-0.09132010489702225,
-0.021543338894844055,
0.09363047033548355,
0.16146038472652435,
-0.11036887019872665,
0.04865751042962074,
-0.009969080798327923,
0.010021788068115711,
-0.022926995530724525,
0.03191647306084633,
0.012564901262521744,
-0.010331960394978523,
0.05025438219308853,
0.024792127311229706,
-0.01911991275846958,
0.042702265083789825,
-0.020724281668663025,
-0.041910700500011444,
0.05549159646034241,
-0.13326749205589294,
-0.009840426035225391,
0.001636470784433186,
-0.004044803325086832,
0.060898106545209885,
0.06513926386833191,
-0.027987506240606308,
-0.08962215483188629,
0.02306746318936348,
-0.08210916817188263,
-0.0469636470079422,
-0.060929082334041595,
-0.15703575313091278,
0.027929186820983887,
-0.07514359056949615,
-0.048750586807727814,
-0.09323606640100479,
-0.09959539026021957,
-0.027428900822997093,
0.05930270254611969,
-0.017935749143362045,
0.037478625774383545,
0.03121856413781643,
-0.007922646589577198,
-0.04134896397590637,
0.04626435041427612,
0.007310089189559221,
-0.014415099285542965,
0.008109170012176037,
-0.04506085440516472,
0.054443471133708954,
-0.007919690571725368,
0.044127315282821655,
-0.07097421586513519,
0.021472275257110596,
-0.14312301576137543,
0.06187313422560692,
-0.09732000529766083,
-0.08231144398450851,
-0.05035732313990593,
-0.04253065958619118,
-0.07241293787956238,
0.031207915395498276,
0.011128072626888752,
0.06157539412379265,
-0.14874616265296936,
-0.049806151539087296,
0.137526273727417,
-0.13699591159820557,
0.03447417914867401,
0.09651140868663788,
-0.05098537728190422,
0.04441174492239952,
0.11657745391130447,
0.05737782269716263,
0.07175301015377045,
-0.047356899827718735,
-0.014471691101789474,
0.007846481166779995,
0.03468341752886772,
0.14479638636112213,
0.0654400959610939,
-0.06726798415184021,
-0.08454637974500656,
0.03545626625418663,
-0.07241455465555191,
-0.04320935159921646,
-0.06000779941678047,
-0.004919900558888912,
-0.010058792307972908,
-0.05624472349882126,
-0.006067274603992701,
-0.023530852049589157,
-0.012381297536194324,
-0.018126612529158592,
-0.052021197974681854,
0.052139461040496826,
0.06255807727575302,
-0.08743589371442795,
0.05695871636271477,
-0.054720036685466766,
0.01710190437734127,
-0.07883305102586746,
-0.0008995942189358175,
-0.17888425290584564,
0.008658606559038162,
0.11044886708259583,
-0.09778935462236404,
0.05098017305135727,
0.16193418204784393,
0.022091520950198174,
0.06788404285907745,
-0.05218300595879555,
0.07112136483192444,
0.006438970100134611,
-0.025344906374812126,
-0.04571632295846939,
-0.11807786673307419,
-0.06486569344997406,
-0.06121306121349335,
0.009009581059217453,
-0.08579893410205841,
-0.0054315607994794846,
-0.04071490466594696,
0.021334197372198105,
0.02261343225836754,
-0.06342791765928268,
0.020888160914182663,
0.025008782744407654,
-0.03807317465543747,
-0.027194898575544357,
-0.025682896375656128,
0.04395701736211777,
0.015660403296351433,
0.11652994155883789,
-0.09342336654663086,
-0.06748230755329132,
0.04611804708838463,
0.05239575728774071,
-0.05281005799770355,
0.09289830178022385,
-0.05448940396308899,
-0.03384628891944885,
-0.09919550269842148,
-0.09927574545145035,
0.17200236022472382,
-0.005481392610818148,
0.0982406809926033,
-0.0970003604888916,
-0.02601802535355091,
-0.0002206027420470491,
-0.008324080146849155,
-0.0033114226534962654,
0.05280472710728645,
0.013694480992853642,
-0.09587039053440094,
-0.0024726989213377237,
0.014454878866672516,
0.018660617992281914,
0.07700566202402115,
-0.01924571581184864,
-0.11372602730989456,
0.030456915497779846,
-0.0022141209337860346,
-0.005412308964878321,
0.06422054022550583,
-0.04873985797166824,
-0.004754192661494017,
0.05506401136517525,
0.055004507303237915,
0.05557657778263092,
-0.06576239317655563,
0.09508246928453445,
0.06543181091547012,
-0.042873919010162354,
-0.04307212308049202,
-0.08354631811380386,
0.01124105229973793,
0.11580438166856766,
0.024443572387099266,
0.05901508405804634,
-0.047479260712862015,
-0.02332179620862007,
-0.10358753055334091,
0.15724016726016998,
-0.08817019313573837,
-0.16178172826766968,
-0.1519211381673813,
0.008505324833095074,
-0.05550890788435936,
0.062134936451911926,
0.015890169888734818,
-0.05090496316552162,
-0.0981481522321701,
-0.0780649334192276,
0.16090190410614014,
-0.040210120379924774,
-0.006520246155560017,
0.018955806270241737,
-0.029156876727938652,
0.03728277608752251,
-0.18188506364822388,
-0.0002983224403578788,
-0.03989356756210327,
-0.12659740447998047,
-0.03824833407998085,
0.0006487573846243322,
0.06776844710111618,
0.07018622010946274,
-0.03826497867703438,
-0.07722891122102737,
0.018135249614715576,
0.16472312808036804,
0.033618904650211334,
0.07753767818212509,
0.09375453740358353,
-0.09782484173774719,
0.04230017587542534,
0.04625745862722397,
0.030521120876073837,
-0.014869802631437778,
0.009036203846335411,
0.05786841735243797,
-0.02643423154950142,
-0.28585749864578247,
-0.009547731839120388,
-0.019477669149637222,
-0.016304515302181244,
0.06571561843156815,
0.041647057980298996,
-0.08619800955057144,
0.048328742384910583,
-0.058318186551332474,
0.0336444191634655,
0.08685912191867828,
0.04423050209879875,
0.0965745598077774,
-0.03947584331035614,
0.09300784021615982,
-0.05351203680038452,
-0.017612457275390625,
0.10886572301387787,
-0.0525071807205677,
0.19804935157299042,
-0.05629119649529457,
0.052026692777872086,
0.09740577638149261,
-0.013291243463754654,
0.03776371851563454,
0.13981282711029053,
-0.053458623588085175,
0.0695188045501709,
-0.05753132700920105,
-0.045325763523578644,
-0.03925883769989014,
0.024744350463151932,
-0.001368168042972684,
0.036427125334739685,
-0.035954851657152176,
-0.015554851852357388,
-0.003204511245712638,
0.23777036368846893,
0.06881847977638245,
-0.12302345782518387,
-0.0682787075638771,
0.007609935477375984,
-0.10848987102508545,
-0.07071142643690109,
0.050888270139694214,
0.09097972512245178,
-0.08218395709991455,
0.0477898083627224,
0.009447470307350159,
0.06852292269468307,
-0.12695300579071045,
0.020250238478183746,
0.03842361643910408,
0.051318325102329254,
-0.025507185608148575,
0.03334769606590271,
-0.15458959341049194,
0.08355436474084854,
0.03626382723450661,
0.05300656706094742,
-0.05197932943701744,
0.06420659273862839,
0.020701894536614418,
-0.012784554623067379,
0.0259549617767334,
0.011582164093852043,
-0.02154114842414856,
-0.028402362018823624,
-0.06679794192314148,
0.08322560042142868,
0.07481463998556137,
-0.05067477375268936,
0.11903193593025208,
-0.04902201145887375,
0.012135559692978859,
-0.009513542987406254,
0.07683731615543365,
-0.17362213134765625,
-0.12966234982013702,
0.04488092288374901,
-0.14248692989349365,
-0.025076039135456085,
-0.06797455251216888,
-0.054941143840551376,
-0.06941941380500793,
0.1683565229177475,
-0.12086177617311478,
-0.13269245624542236,
-0.08542314171791077,
-0.010439330711960793,
0.15427272021770477,
-0.030303653329610825,
0.007862678728997707,
-0.016447575762867928,
0.13099265098571777,
-0.03796453773975372,
-0.1518407166004181,
-0.04862941801548004,
-0.07087627798318863,
-0.15118423104286194,
-0.03353945538401604,
0.07046037912368774,
0.10969514399766922,
0.05186385288834572,
0.0053954655304551125,
0.02566416747868061,
0.004610768053680658,
-0.052676886320114136,
-0.015499225817620754,
0.1804913729429245,
0.05120445042848587,
0.07084334641695023,
-0.15908974409103394,
-0.056747786700725555,
-0.049031805247068405,
0.023707633838057518,
-0.04624734818935394,
0.09824192523956299,
-0.0297712292522192,
0.07853944599628448,
0.24103355407714844,
-0.12992817163467407,
-0.20327883958816528,
0.00794304721057415,
0.029776716604828835,
0.004468757193535566,
0.007492648903280497,
-0.22413262724876404,
0.12180155515670776,
0.08902066200971603,
0.00005451853576232679,
-0.007971171289682388,
-0.18540841341018677,
-0.08290617913007736,
0.08029276132583618,
0.008260362781584263,
0.14736834168434143,
-0.09194347262382507,
-0.03176108002662659,
0.008342234417796135,
-0.08353291451931,
0.053251247853040695,
0.047033462673425674,
0.08311345428228378,
-0.0005725921946577728,
-0.07606442272663116,
0.05054626986384392,
-0.01426033303141594,
0.08617493510246277,
0.044427789747714996,
0.04705708846449852,
-0.03391779959201813,
0.1327393352985382,
0.0021160633768886328,
-0.016437524929642677,
0.1382630169391632,
0.11473020911216736,
0.057106852531433105,
-0.02710552141070366,
-0.06239805743098259,
-0.07285162806510925,
0.012481854297220707,
-0.02163124457001686,
-0.038928695023059845,
-0.06446473300457001,
0.040542446076869965,
0.06284366548061371,
0.000011542538231879007,
-0.042366333305835724,
-0.02440103143453598,
0.0593695342540741,
0.09160585701465607,
0.19392365217208862,
-0.05477018654346466,
-0.004881063010543585,
-0.017560619860887527,
-0.022412002086639404,
0.07008644938468933,
-0.019109532237052917,
0.06461222469806671,
0.08965952694416046,
0.009561385959386826,
0.08143267780542374,
0.061924517154693604,
-0.13242506980895996,
-0.022290755063295364,
0.054440245032310486,
-0.10090892761945724,
-0.13755467534065247,
-0.027211004868149757,
-0.10681003332138062,
-0.13346396386623383,
-0.001308922073803842,
0.17186379432678223,
-0.037440795451402664,
-0.04684003069996834,
-0.01552096288651228,
0.08042222261428833,
0.019497346132993698,
0.13193254172801971,
0.03332430124282837,
-0.015027261339128017,
-0.06214965879917145,
0.17090831696987152,
0.0887458473443985,
-0.09395651519298553,
0.010040397755801678,
0.01699751615524292,
-0.0586676150560379,
-0.004993793088942766,
-0.0650511234998703,
0.07479315251111984,
-0.027956491336226463,
-0.03902135789394379,
0.002397557720541954,
-0.10018457472324371,
0.05027986690402031,
0.15056172013282776,
0.00695972191169858,
0.15740065276622772,
-0.03740021958947182,
0.0627809464931488,
-0.07463893294334412,
0.07325215637683868,
0.054333124309778214,
0.07662846148014069,
-0.017322644591331482,
0.047755710780620575,
-0.045393798500299454,
-0.0016010244144126773,
-0.015004572458565235,
0.0020310361869633198,
-0.09172546863555908,
-0.055304914712905884,
-0.2216600924730301,
0.02602362260222435,
-0.057672590017318726,
-0.03523990139365196,
0.010183728300035,
-0.014711646363139153,
0.003206538502126932,
0.03643690422177315,
-0.025506017729640007,
-0.032568417489528656,
-0.026225561276078224,
0.06299769878387451,
-0.12338133901357651,
0.025519918650388718,
0.06714073568582535,
-0.08727099001407623,
0.07714268565177917,
-0.0016788552748039365,
-0.052323199808597565,
-0.0007243562140502036,
0.011770321056246758,
-0.04656233638525009,
-0.029581747949123383,
0.008223108015954494,
-0.052664075046777725,
-0.1121068224310875,
0.02803058549761772,
0.011833127588033676,
-0.02713099494576454,
-0.029380379244685173,
0.07491226494312286,
-0.06507531553506851,
0.051990725100040436,
0.03763335570693016,
0.004646876361221075,
-0.04294361546635628,
-0.015691637992858887,
0.11858648806810379,
0.0758773609995842,
0.05550721660256386,
-0.051552481949329376,
-0.018166115507483482,
-0.15614664554595947,
-0.001786826178431511,
-0.0025512261781841516,
-0.004634595476090908,
-0.0380108542740345,
-0.03685610368847847,
0.03143308311700821,
0.011381261050701141,
0.17777188122272491,
0.00949834194034338,
0.015538548119366169,
0.009549605660140514,
0.005201501306146383,
0.007987839169800282,
0.03408556804060936,
0.06925634294748306,
-0.014926901087164879,
-0.07734314352273941,
-0.07896748930215836,
0.0335865393280983,
-0.030383363366127014,
-0.019816191866993904,
0.1342935860157013,
0.13332605361938477,
0.10555581003427505,
0.0226934552192688,
0.001299162395298481,
-0.03020559251308441,
-0.028619637712836266,
0.021553488448262215,
0.05529604107141495,
0.05133911222219467,
-0.014649294316768646,
0.010560748167335987,
0.06761597096920013,
-0.12456395477056503,
0.12088917195796967,
-0.03535227105021477,
-0.028706051409244537,
-0.10704534500837326,
-0.07853163033723831,
-0.018402572721242905,
-0.020568937063217163,
-0.019688770174980164,
-0.1652354598045349,
0.04740680754184723,
0.10461867600679398,
0.0218871608376503,
-0.03120357356965542,
0.03629112243652344,
-0.14826245605945587,
-0.08890563249588013,
0.06462501734495163,
0.01326621975749731,
0.044116146862506866,
0.10893485695123672,
-0.014927172102034092,
0.0793227106332779,
0.14229266345500946,
0.06115756183862686,
0.05520468205213547,
0.08307479321956635,
0.010583480820059776,
-0.019226517528295517,
-0.040223654359579086,
0.0071199978701770306,
-0.06180436536669731,
0.03813176974654198,
0.16081587970256805,
0.032034967094659805,
-0.05025864765048027,
0.03464912623167038,
0.18079550564289093,
-0.035433415323495865,
-0.05053023621439934,
-0.17838512361049652,
0.20884735882282257,
0.022894321009516716,
0.04338621720671654,
0.050112947821617126,
-0.08889304846525192,
-0.03564377501606941,
0.20721405744552612,
0.1083085909485817,
0.02447679080069065,
-0.023232558742165565,
0.017493681982159615,
-0.010677602142095566,
0.008764759637415409,
0.0839325487613678,
0.006161932833492756,
0.21791884303092957,
-0.037738315761089325,
0.01817139983177185,
0.03160976618528366,
0.03401026502251625,
-0.07169639319181442,
0.15113508701324463,
-0.04684945568442345,
-0.0019398622680455446,
-0.05539486184716225,
0.022465860471129417,
0.019831303507089615,
-0.3074021339416504,
-0.1091412827372551,
0.0015751798637211323,
-0.0627431720495224,
-0.017664410173892975,
-0.02147594653069973,
-0.006026452872902155,
0.052657775580883026,
-0.003613654989749193,
0.0313095860183239,
0.1814769059419632,
-0.006041602231562138,
-0.037764210253953934,
-0.04243779182434082,
0.12078570574522018,
0.021683989092707634,
0.1383230984210968,
0.05880120396614075,
-0.016013706102967262,
0.04423534870147705,
0.023208552971482277,
-0.11340053379535675,
-0.03454134613275528,
-0.0261351577937603,
-0.010102208703756332,
-0.022885655984282494,
0.13362659513950348,
0.019895590841770172,
0.05012853816151619,
0.038363978266716,
-0.044373027980327606,
0.05070552974939346,
0.0411861427128315,
-0.06867646425962448,
-0.05190039426088333,
0.044303055852651596,
-0.09670417010784149,
0.1405797302722931,
0.1770136058330536,
0.012501770630478859,
0.025034409016370773,
-0.05616481229662895,
-0.009214181452989578,
-0.0002618971047922969,
0.11540348082780838,
-0.0037776746321469545,
-0.1589089184999466,
-0.013451328501105309,
-0.08624331653118134,
0.039958685636520386,
-0.22136978805065155,
-0.04496637359261513,
0.10179471224546432,
-0.011148967780172825,
-0.01043228805065155,
0.04996428266167641,
0.0047449166886508465,
0.05876367166638374,
-0.015574295073747635,
-0.05106718838214874,
0.009751099161803722,
0.06748053431510925,
-0.0824475884437561,
-0.03189019486308098
] |
null | null |
transformers
|
# MultiBERTs - Seed 18
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #18.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel

# Load the seed-18 tokenizer and encoder, then run one forward pass in TensorFlow.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_18')
model = TFBertModel.from_pretrained("google/multiberts-seed_18")

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Same forward pass with the PyTorch weights.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_18')
model = BertModel.from_pretrained("google/multiberts-seed_18")

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
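The checkpoint also retains the NSP head mentioned in the model description. The snippet below is a minimal sketch, not part of the original release notes: it assumes the full pre-training heads load via `BertForPreTraining`, and the sentence pair is purely illustrative.
```
import torch
from transformers import BertTokenizer, BertForPreTraining

# Load the checkpoint with both of its pre-training heads (MLM and NSP).
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_18')
model = BertForPreTraining.from_pretrained('google/multiberts-seed_18')

sentence_a = "The cat sat on the mat."
sentence_b = "It then fell asleep in the sun."
inputs = tokenizer(sentence_a, sentence_b, return_tensors='pt')

with torch.no_grad():
    outputs = model(**inputs)

# seq_relationship_logits: index 0 = sentence B follows sentence A, index 1 = random pair.
nsp_probs = torch.softmax(outputs.seq_relationship_logits, dim=-1)
print(nsp_probs)
```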
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_18"]}
| null |
google/multiberts-seed_18
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_18",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_18 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 18
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #18.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 18\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #18.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_18 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 18\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #18.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_18 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 18\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #18.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06315749883651733,
0.09019800275564194,
-0.004106925800442696,
0.044150397181510925,
0.0770413726568222,
0.015371282584965229,
0.05532919988036156,
0.07410814613103867,
-0.08812616765499115,
0.02313285879790783,
-0.012419188395142555,
-0.046137485653162,
0.07705379277467728,
-0.04532259702682495,
0.060234181582927704,
-0.2333885282278061,
0.049776893109083176,
-0.030240453779697418,
-0.024058829993009567,
0.027719054371118546,
0.11189647763967514,
-0.09555328637361526,
0.07470471411943436,
0.05552107095718384,
0.004427487961947918,
0.016840701922774315,
-0.015454188920557499,
0.004875842481851578,
0.08697710186243057,
0.0326964408159256,
0.08404792100191116,
-0.0027453722432255745,
0.08443043380975723,
-0.1406375765800476,
0.006411402020603418,
0.05991377308964729,
0.06322447210550308,
0.043052975088357925,
0.11686939746141434,
0.008210066705942154,
0.08859669417142868,
0.0181440357118845,
0.05247420072555542,
0.04563891142606735,
-0.07482379674911499,
-0.16954050958156586,
-0.092845119535923,
0.01769721321761608,
-0.0006210554274730384,
0.006425786763429642,
-0.0076592108234763145,
-0.01895880326628685,
-0.018612096086144447,
0.020508700981736183,
0.1204489916563034,
-0.26630499958992004,
-0.016415372490882874,
0.009455074556171894,
0.05872540920972824,
0.058582454919815063,
-0.0375584214925766,
-0.040954649448394775,
0.043645162135362625,
0.05254512280225754,
0.0415194071829319,
-0.02474268339574337,
0.041507184505462646,
-0.01617514342069626,
-0.15328672528266907,
-0.019612673670053482,
0.10579957813024521,
-0.0485975481569767,
-0.11752355098724365,
-0.049037668853998184,
-0.032796457409858704,
0.12386023253202438,
0.00881225522607565,
-0.03639855608344078,
0.046221762895584106,
0.030828341841697693,
0.06482495367527008,
-0.06304070353507996,
-0.11591245234012604,
0.025979839265346527,
-0.051127031445503235,
0.10766448080539703,
0.09379272907972336,
0.048222482204437256,
-0.0072054313495755196,
0.0554344579577446,
-0.08436687290668488,
-0.07727652788162231,
-0.05094020813703537,
-0.08903927356004715,
-0.04141782596707344,
-0.03829113394021988,
-0.0842413529753685,
-0.16597719490528107,
-0.0041564409621059895,
0.10959198325872421,
-0.06264336407184601,
0.008430995047092438,
-0.09102705121040344,
-0.0210055410861969,
0.09394936263561249,
0.16110949218273163,
-0.11004901677370071,
0.04971720278263092,
-0.011122152209281921,
0.010142557322978973,
-0.02315015159547329,
0.0317145437002182,
0.01224030926823616,
-0.009722207672894001,
0.050431620329618454,
0.024152584373950958,
-0.019094808027148247,
0.04237882047891617,
-0.021129237487912178,
-0.04230906814336777,
0.055284302681684494,
-0.13326019048690796,
-0.010759326629340649,
0.0016363362083211541,
-0.0034684408456087112,
0.060304831713438034,
0.06485413759946823,
-0.02737726829946041,
-0.09002850949764252,
0.02397257834672928,
-0.08205904811620712,
-0.04735207557678223,
-0.060426339507102966,
-0.1558091938495636,
0.027612954378128052,
-0.07528790831565857,
-0.04847361892461777,
-0.09301425516605377,
-0.09747066348791122,
-0.026932599022984505,
0.06010964512825012,
-0.017552034929394722,
0.037405963987112045,
0.03139963746070862,
-0.007315789815038443,
-0.04179759696125984,
0.04600072652101517,
0.007449873257428408,
-0.014824829995632172,
0.008014809340238571,
-0.04547484219074249,
0.05392526835203171,
-0.008117835968732834,
0.04482289403676987,
-0.07044319808483124,
0.021541178226470947,
-0.14332066476345062,
0.061528999358415604,
-0.09695398807525635,
-0.08258082717657089,
-0.050305839627981186,
-0.0419839508831501,
-0.07232514023780823,
0.03159419074654579,
0.010543580166995525,
0.061358414590358734,
-0.14812688529491425,
-0.04946964979171753,
0.13734650611877441,
-0.13651961088180542,
0.03413553535938263,
0.09652220457792282,
-0.051331087946891785,
0.043501511216163635,
0.11630476266145706,
0.057792164385318756,
0.06980649381875992,
-0.046722497791051865,
-0.014652852900326252,
0.007844327948987484,
0.034417908638715744,
0.1442296802997589,
0.06585301458835602,
-0.06825976818799973,
-0.08385953307151794,
0.035480741411447525,
-0.07336510717868805,
-0.04387940093874931,
-0.05940542742609978,
-0.004705069586634636,
-0.010392663069069386,
-0.05618142709136009,
-0.006824132055044174,
-0.02373655140399933,
-0.012891224585473537,
-0.01777631603181362,
-0.05185068026185036,
0.05056747794151306,
0.06264083832502365,
-0.08719734102487564,
0.056998614221811295,
-0.05491986125707626,
0.016760926693677902,
-0.07957161217927933,
-0.0011768366675823927,
-0.179280623793602,
0.008274146355688572,
0.110612653195858,
-0.09824085980653763,
0.05109655112028122,
0.16149918735027313,
0.021491101011633873,
0.06753220409154892,
-0.05172325670719147,
0.0708729550242424,
0.006459813565015793,
-0.025121500715613365,
-0.04601385444402695,
-0.11806114763021469,
-0.06475229561328888,
-0.060784272849559784,
0.009389459155499935,
-0.0847577378153801,
-0.005627825390547514,
-0.03992382436990738,
0.019877690821886063,
0.02221621200442314,
-0.06368445605039597,
0.020574739202857018,
0.024231329560279846,
-0.03845580667257309,
-0.02724539302289486,
-0.02574300393462181,
0.04486442357301712,
0.015346885658800602,
0.11670614778995514,
-0.09386502206325531,
-0.0679342970252037,
0.045893892645835876,
0.05207797512412071,
-0.052802324295043945,
0.09294034540653229,
-0.05436283349990845,
-0.03385794907808304,
-0.09898652136325836,
-0.09850101917982101,
0.17347611486911774,
-0.005205637775361538,
0.09878720343112946,
-0.09650015830993652,
-0.025900622829794884,
0.0002373846509726718,
-0.008577987551689148,
-0.00362869119271636,
0.05247887223958969,
0.014010434038937092,
-0.09427599608898163,
-0.002732151886448264,
0.015550353564321995,
0.018727971240878105,
0.0756520926952362,
-0.019406283274292946,
-0.11415372043848038,
0.03055179864168167,
-0.0017622443847358227,
-0.005706022027879953,
0.06442399322986603,
-0.048844654113054276,
-0.005112848244607449,
0.05526212975382805,
0.055786654353141785,
0.05574164167046547,
-0.06524056196212769,
0.09510408341884613,
0.06589844077825546,
-0.0427437461912632,
-0.04148922860622406,
-0.08472088724374771,
0.01034056767821312,
0.11580591648817062,
0.025096572935581207,
0.058705106377601624,
-0.047207947820425034,
-0.023678725585341454,
-0.10368556529283524,
0.15695405006408691,
-0.08719240128993988,
-0.1601579189300537,
-0.1520441621541977,
0.006840859539806843,
-0.05564237758517265,
0.06265781074762344,
0.01603344827890396,
-0.051244113594293594,
-0.09757012873888016,
-0.07828106731176376,
0.1597321480512619,
-0.03955548256635666,
-0.007099770475178957,
0.019056051969528198,
-0.029414134100079536,
0.03714798763394356,
-0.18116527795791626,
-0.0008881466928869486,
-0.04009125009179115,
-0.12598766386508942,
-0.03813531994819641,
0.0004474589368328452,
0.06840765476226807,
0.07184113562107086,
-0.03767677769064903,
-0.07736873626708984,
0.01886257901787758,
0.16346019506454468,
0.033534154295921326,
0.07726509869098663,
0.09384049475193024,
-0.09878236800432205,
0.04252728447318077,
0.046333372592926025,
0.03043600544333458,
-0.014216373674571514,
0.009398351423442364,
0.0577368326485157,
-0.026568090543150902,
-0.2866498827934265,
-0.009159700945019722,
-0.01893685571849346,
-0.017085911706089973,
0.06607509404420853,
0.0420633926987648,
-0.08585897833108902,
0.04926854744553566,
-0.05781765654683113,
0.032689277082681656,
0.08616837114095688,
0.0445883572101593,
0.09457720816135406,
-0.039152394980192184,
0.09277590364217758,
-0.05342739447951317,
-0.01775299943983555,
0.10914351046085358,
-0.05187859386205673,
0.1990082561969757,
-0.05593851953744888,
0.05409103259444237,
0.0977964773774147,
-0.01377533096820116,
0.037373483180999756,
0.13858677446842194,
-0.05314365401864052,
0.0698506310582161,
-0.05798233672976494,
-0.045133959501981735,
-0.039026889950037,
0.024492450058460236,
-0.0025024055503308773,
0.03641900792717934,
-0.03653784468770027,
-0.015561453066766262,
-0.0035543227568268776,
0.23936979472637177,
0.0682980939745903,
-0.1225004568696022,
-0.06794845312833786,
0.007554071955382824,
-0.10830499231815338,
-0.0709298849105835,
0.050893183797597885,
0.09083510935306549,
-0.08223702013492584,
0.04623499512672424,
0.009667669422924519,
0.06845226138830185,
-0.1273682713508606,
0.020558230578899384,
0.038851384073495865,
0.050254277884960175,
-0.02586834691464901,
0.03329142928123474,
-0.1558763086795807,
0.08326717466115952,
0.035754524171352386,
0.052495308220386505,
-0.05194500461220741,
0.06412330269813538,
0.02061484567821026,
-0.013196350075304508,
0.02648092992603779,
0.011221216060221195,
-0.021898262202739716,
-0.026730936020612717,
-0.06645054370164871,
0.0837036594748497,
0.07443039864301682,
-0.051871903240680695,
0.11922816187143326,
-0.04897811636328697,
0.012583747506141663,
-0.009301591664552689,
0.07692936807870865,
-0.17203877866268158,
-0.13018907606601715,
0.04489109292626381,
-0.14219433069229126,
-0.025760410353541374,
-0.06789834052324295,
-0.054424166679382324,
-0.0683811753988266,
0.1687624156475067,
-0.12151692062616348,
-0.13358750939369202,
-0.08522532135248184,
-0.012357913888990879,
0.153107687830925,
-0.030551476404070854,
0.007837606593966484,
-0.01656251773238182,
0.13211935758590698,
-0.037636104971170425,
-0.1517094224691391,
-0.048374079167842865,
-0.07075858861207962,
-0.1506301611661911,
-0.03347516804933548,
0.07028492540121078,
0.11017896980047226,
0.05194798484444618,
0.00486137717962265,
0.02582111768424511,
0.0035705133341252804,
-0.05230124667286873,
-0.015271748416125774,
0.17942558228969574,
0.05321606621146202,
0.0711553767323494,
-0.15915143489837646,
-0.056958071887493134,
-0.04973863437771797,
0.02275463007390499,
-0.04585091397166252,
0.09792549163103104,
-0.02947104535996914,
0.07774512469768524,
0.24212202429771423,
-0.12983421981334686,
-0.20253948867321014,
0.007573680020868778,
0.02883710339665413,
0.003619791241362691,
0.00851497147232294,
-0.2249564528465271,
0.12177106738090515,
0.08907432854175568,
-0.00030602634069509804,
-0.00758886244148016,
-0.18489019572734833,
-0.08209922909736633,
0.08153177052736282,
0.0089574558660388,
0.1477961540222168,
-0.09232070297002792,
-0.031902242451906204,
0.008271796628832817,
-0.0848420187830925,
0.051635973155498505,
0.04779272899031639,
0.08301947265863419,
-0.0005189739167690277,
-0.07540526986122131,
0.05057518929243088,
-0.014274461194872856,
0.085675448179245,
0.045888617634773254,
0.046849627047777176,
-0.034630272537469864,
0.1319178193807602,
0.00014280129107646644,
-0.016951585188508034,
0.13822788000106812,
0.11296945810317993,
0.05671628192067146,
-0.026249462738633156,
-0.06248404458165169,
-0.0725729838013649,
0.011327067390084267,
-0.021595360711216927,
-0.03924430161714554,
-0.06445486098527908,
0.039983659982681274,
0.062898188829422,
0.00048691441770642996,
-0.04254094511270523,
-0.0247260183095932,
0.05893481522798538,
0.0899960994720459,
0.1939409077167511,
-0.05481046810746193,
-0.006653494667261839,
-0.018384426832199097,
-0.02228475920855999,
0.06960546225309372,
-0.019971536472439766,
0.06462550908327103,
0.08985450863838196,
0.009073681198060513,
0.08251222968101501,
0.06231395900249481,
-0.13231578469276428,
-0.022283371537923813,
0.05405823513865471,
-0.10100217908620834,
-0.13707223534584045,
-0.027169952169060707,
-0.10557908564805984,
-0.13425664603710175,
-0.0015228785341605544,
0.17102228105068207,
-0.0370001383125782,
-0.04680570214986801,
-0.015785206109285355,
0.08059802651405334,
0.01961175724864006,
0.1309993416070938,
0.03408285602927208,
-0.01518046110868454,
-0.062127258628606796,
0.17113924026489258,
0.08929432928562164,
-0.09385065734386444,
0.010566184297204018,
0.01669423282146454,
-0.05848020315170288,
-0.005522196181118488,
-0.06454554945230484,
0.07508350163698196,
-0.02816895581781864,
-0.03915652260184288,
0.002003431087359786,
-0.10063620656728745,
0.050133563578128815,
0.14842607080936432,
0.007205776870250702,
0.15787696838378906,
-0.03718876838684082,
0.06273885816335678,
-0.07495389133691788,
0.07322097569704056,
0.053378742188215256,
0.07743895053863525,
-0.017031803727149963,
0.04683481529355049,
-0.045286308974027634,
-0.000661930360365659,
-0.014699324034154415,
0.002037154510617256,
-0.09065358340740204,
-0.05512306094169617,
-0.223250150680542,
0.02637588605284691,
-0.05719069018959999,
-0.0360039547085762,
0.010079542174935341,
-0.014084207825362682,
0.0036055753007531166,
0.03692198917269707,
-0.0253974087536335,
-0.03268866240978241,
-0.0258952584117651,
0.06304653733968735,
-0.1228802353143692,
0.02604488655924797,
0.06662121415138245,
-0.08734019100666046,
0.07752956449985504,
-0.0017640989972278476,
-0.0523473359644413,
-0.0007588116568513215,
0.012795713730156422,
-0.04604116454720497,
-0.030125338584184647,
0.007882228121161461,
-0.053160134702920914,
-0.11138959228992462,
0.02784363180398941,
0.011147045530378819,
-0.027804860845208168,
-0.029715009033679962,
0.07523208856582642,
-0.06549892574548721,
0.0523502342402935,
0.03760809823870659,
0.0048582241870462894,
-0.04318850114941597,
-0.015825847163796425,
0.11912328749895096,
0.07627001404762268,
0.055655065923929214,
-0.05103198066353798,
-0.018437014892697334,
-0.15577715635299683,
-0.001779368962161243,
-0.002290383679792285,
-0.0038917497731745243,
-0.03818817064166069,
-0.03749584034085274,
0.030568979680538177,
0.010988865979015827,
0.1772330403327942,
0.009128167293965816,
0.013953227549791336,
0.009635303169488907,
0.004686845000833273,
0.0088405292481184,
0.03433557599782944,
0.06890667229890823,
-0.015538628213107586,
-0.0780247300863266,
-0.07884421944618225,
0.03359106555581093,
-0.030691858381032944,
-0.019695322960615158,
0.13414356112480164,
0.13443073630332947,
0.10534557700157166,
0.022399859502911568,
0.0013809959637001157,
-0.029895655810832977,
-0.02935902588069439,
0.021467268466949463,
0.055538028478622437,
0.05113658308982849,
-0.013654273934662342,
0.011486764997243881,
0.06719820201396942,
-0.12391819804906845,
0.12107173353433609,
-0.03601934388279915,
-0.02811800315976143,
-0.10625005513429642,
-0.07662194967269897,
-0.01819053292274475,
-0.019704842939972878,
-0.020301178097724915,
-0.1651877909898758,
0.04701773077249527,
0.10699225962162018,
0.021506143733859062,
-0.031877852976322174,
0.0372428260743618,
-0.14753840863704681,
-0.0893041118979454,
0.06522955000400543,
0.012969320639967918,
0.043747395277023315,
0.10870474576950073,
-0.014390899799764156,
0.07967009395360947,
0.1430017203092575,
0.06143875792622566,
0.055437784641981125,
0.08307282626628876,
0.010346450842916965,
-0.019503122195601463,
-0.04022696241736412,
0.007086494471877813,
-0.06226448342204094,
0.038261644542217255,
0.16073448956012726,
0.03203406557440758,
-0.050069015473127365,
0.035185556858778,
0.18064086139202118,
-0.03484979644417763,
-0.050366371870040894,
-0.17775163054466248,
0.2087780386209488,
0.022788124158978462,
0.0428062342107296,
0.04974128678441048,
-0.088893823325634,
-0.03524245321750641,
0.20633754134178162,
0.10654167085886002,
0.02292553149163723,
-0.023491615429520607,
0.017805757001042366,
-0.01056415494531393,
0.00890263170003891,
0.08446404337882996,
0.006229875609278679,
0.21803036332130432,
-0.03776068612933159,
0.018469398841261864,
0.03160059452056885,
0.034685488790273666,
-0.07200416177511215,
0.15233281254768372,
-0.046637751162052155,
-0.001283885445445776,
-0.05540221557021141,
0.021861862391233444,
0.020832886919379234,
-0.3068622648715973,
-0.11069531738758087,
0.0018229351844638586,
-0.06319109350442886,
-0.017037754878401756,
-0.021428491920232773,
-0.005722444970160723,
0.05274871736764908,
-0.003919463139027357,
0.030909990891814232,
0.18142689764499664,
-0.005885939113795757,
-0.037544600665569305,
-0.040956150740385056,
0.12090644985437393,
0.020479388535022736,
0.1393406093120575,
0.059058792889118195,
-0.016744351014494896,
0.043948523700237274,
0.023528287187218666,
-0.11351465433835983,
-0.03360307216644287,
-0.026639927178621292,
-0.008914494886994362,
-0.022181706503033638,
0.13291262090206146,
0.020090939477086067,
0.04731794446706772,
0.03911371901631355,
-0.0442005917429924,
0.05023347958922386,
0.04113282263278961,
-0.0686056837439537,
-0.05183687433600426,
0.0437229685485363,
-0.09684054553508759,
0.140801802277565,
0.17606881260871887,
0.012307238765060902,
0.024808716028928757,
-0.05600496008992195,
-0.009431062266230583,
0.00017382475198246539,
0.1172228530049324,
-0.0035799979232251644,
-0.15814374387264252,
-0.013141033239662647,
-0.08626895397901535,
0.0397728830575943,
-0.22384138405323029,
-0.04457603394985199,
0.10219085216522217,
-0.01088769268244505,
-0.010296416468918324,
0.048980552703142166,
0.005605627782642841,
0.05892028287053108,
-0.015972046181559563,
-0.047584667801856995,
0.009330655448138714,
0.06763435900211334,
-0.08261750638484955,
-0.03202228248119354
] |
null | null |
transformers
|
# MultiBERTs - Seed 19
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #19.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_19')
model = TFBertModel.from_pretrained("google/multiberts-seed_19")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Load the tokenizer and encoder weights for this seed.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_19')
model = BertModel.from_pretrained("google/multiberts-seed_19")

# Encode the input text and run a forward pass to obtain the hidden states.
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
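The MLM objective mentioned above can also be exercised directly. The snippet below is a minimal sketch using the Transformers `fill-mask` pipeline; it assumes the hosted checkpoint exposes the masked-language-modelling head (the NSP head is simply unused here), and the example sentence and printed candidates are illustrative only.
```
from transformers import pipeline

# Load this seed's checkpoint behind a fill-mask pipeline (uses the MLM head).
unmasker = pipeline("fill-mask", model="google/multiberts-seed_19")

# Predict candidates for the masked token; each entry carries the token string
# and its score. The concrete outputs depend on the checkpoint.
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```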
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_19"]}
| null |
google/multiberts-seed_19
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_19",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_19 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 19
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #19.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 19\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #19.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_19 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 19\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #19.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_19 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 19\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #19.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06324972212314606,
0.09100361913442612,
-0.0040699453093111515,
0.043664172291755676,
0.07630349695682526,
0.01479481253772974,
0.05495627224445343,
0.07420217990875244,
-0.08916442096233368,
0.023715930059552193,
-0.012388399802148342,
-0.046264659613370895,
0.07756130397319794,
-0.04385929927229881,
0.06016556918621063,
-0.23388564586639404,
0.04959265515208244,
-0.030213309451937675,
-0.025728562846779823,
0.027851784601807594,
0.11221649497747421,
-0.09586912393569946,
0.07470966130495071,
0.05589015781879425,
0.004631794523447752,
0.01660928502678871,
-0.016009418293833733,
0.004446102771908045,
0.0870007649064064,
0.03236139193177223,
0.08455012738704681,
-0.0024436793755739927,
0.08442779630422592,
-0.14051565527915955,
0.006467476021498442,
0.05973655357956886,
0.0635143518447876,
0.04302512854337692,
0.11686887592077255,
0.007285158149898052,
0.08737347275018692,
0.017253726720809937,
0.05259997397661209,
0.045587293803691864,
-0.07528116554021835,
-0.16952508687973022,
-0.09320183098316193,
0.0190289169549942,
-0.0011843179818242788,
0.006154964677989483,
-0.007616620510816574,
-0.018383756279945374,
-0.019461404532194138,
0.020549725741147995,
0.11970983445644379,
-0.26581981778144836,
-0.016740433871746063,
0.009365334175527096,
0.057770419865846634,
0.057689081877470016,
-0.038187917321920395,
-0.040762923657894135,
0.04337050020694733,
0.051930394023656845,
0.04100063815712929,
-0.024991394951939583,
0.03989438712596893,
-0.01571790687739849,
-0.1533493995666504,
-0.01869538612663746,
0.10712283849716187,
-0.04861836135387421,
-0.11744137853384018,
-0.04891834780573845,
-0.032707519829273224,
0.1234641820192337,
0.008506250567734241,
-0.036711979657411575,
0.04668286815285683,
0.030264325439929962,
0.06414835155010223,
-0.06260910630226135,
-0.11605674028396606,
0.025706898421049118,
-0.051412854343652725,
0.10812285542488098,
0.09339965879917145,
0.04823919013142586,
-0.0071899183094501495,
0.05552387610077858,
-0.08522304892539978,
-0.07792124152183533,
-0.05101213976740837,
-0.08908472210168839,
-0.04094336926937103,
-0.038796793669462204,
-0.08458257466554642,
-0.16701960563659668,
-0.004394277930259705,
0.11028105765581131,
-0.06214816868305206,
0.008707151748239994,
-0.09086470305919647,
-0.02142164297401905,
0.09456010162830353,
0.16193300485610962,
-0.10937050729990005,
0.048613209277391434,
-0.010443700477480888,
0.010831499472260475,
-0.022980472072958946,
0.03193247318267822,
0.01238332875072956,
-0.009655818343162537,
0.0506059005856514,
0.024336809292435646,
-0.019455067813396454,
0.04316370189189911,
-0.02109828032553196,
-0.0423671118915081,
0.05479780584573746,
-0.13382109999656677,
-0.010459354147315025,
0.002061303472146392,
-0.0038752350956201553,
0.06018638238310814,
0.06470419466495514,
-0.02780786342918873,
-0.08943916857242584,
0.02397957816720009,
-0.08237812668085098,
-0.04724414274096489,
-0.06059703975915909,
-0.15570570528507233,
0.027787942439317703,
-0.0754392147064209,
-0.04878051578998566,
-0.093391053378582,
-0.09820717573165894,
-0.027015898376703262,
0.0599868968129158,
-0.01750103011727333,
0.037660159170627594,
0.031042305752635002,
-0.007613534573465586,
-0.04168471693992615,
0.046282555907964706,
0.008130055852234364,
-0.015045088715851307,
0.00800988357514143,
-0.04469778388738632,
0.05392917990684509,
-0.008432799018919468,
0.044526126235723495,
-0.07108814269304276,
0.02149238809943199,
-0.14612483978271484,
0.061723630875349045,
-0.09769555926322937,
-0.0827493891119957,
-0.050122518092393875,
-0.04207415506243706,
-0.07365365326404572,
0.030738089233636856,
0.00998339056968689,
0.061701420694589615,
-0.14920218288898468,
-0.04961082339286804,
0.13791890442371368,
-0.13677386939525604,
0.03456195816397667,
0.09576250612735748,
-0.05196457356214523,
0.04424191638827324,
0.1165529191493988,
0.05958595126867294,
0.06910858303308487,
-0.04692107439041138,
-0.015115728601813316,
0.007556835655122995,
0.03446047753095627,
0.1455027014017105,
0.06604572385549545,
-0.06871835887432098,
-0.0833473801612854,
0.0353696271777153,
-0.07374008744955063,
-0.04376095533370972,
-0.058953434228897095,
-0.005195687059313059,
-0.009854803793132305,
-0.0563027448952198,
-0.006282864138484001,
-0.023913444951176643,
-0.012575170025229454,
-0.018383711576461792,
-0.0517314150929451,
0.05033379793167114,
0.06294579058885574,
-0.08708256483078003,
0.05655169114470482,
-0.055253904312849045,
0.017385333776474,
-0.07906866073608398,
-0.0010635944781824946,
-0.17912045121192932,
0.008035239763557911,
0.1109718605875969,
-0.09966779500246048,
0.051625270396471024,
0.16085924208164215,
0.02180645242333412,
0.06759820878505707,
-0.051880914717912674,
0.07088149338960648,
0.00590780982747674,
-0.024911683052778244,
-0.045773498713970184,
-0.1183859333395958,
-0.06473515927791595,
-0.0608072467148304,
0.010972751304507256,
-0.08470988273620605,
-0.005624342709779739,
-0.03866969794034958,
0.021179992705583572,
0.022442400455474854,
-0.06373121589422226,
0.020780952647328377,
0.024356484413146973,
-0.03809477016329765,
-0.02734498493373394,
-0.026095716282725334,
0.044741202145814896,
0.015650102868676186,
0.11625295132398605,
-0.09414736926555634,
-0.06904737651348114,
0.046552084386348724,
0.05162292718887329,
-0.05313421040773392,
0.09309038519859314,
-0.054284170269966125,
-0.03383859992027283,
-0.09874139726161957,
-0.09855275601148605,
0.1737033724784851,
-0.005247354041785002,
0.0987255871295929,
-0.09696399420499802,
-0.026497503742575645,
-0.0002214654377894476,
-0.008532951585948467,
-0.0034340014681220055,
0.05219940468668938,
0.013831933960318565,
-0.09657307714223862,
-0.002417692681774497,
0.01564302295446396,
0.01857694983482361,
0.0766356885433197,
-0.019594192504882812,
-0.11432182788848877,
0.030144870281219482,
-0.001818278105929494,
-0.005794531665742397,
0.06570135802030563,
-0.04936690255999565,
-0.005394837353378534,
0.055133432149887085,
0.05542021244764328,
0.055680666118860245,
-0.06545507162809372,
0.09553560614585876,
0.0656697079539299,
-0.04263102635741234,
-0.04247622564435005,
-0.08483363687992096,
0.010770048946142197,
0.11566825956106186,
0.02581195905804634,
0.05925992876291275,
-0.04719727113842964,
-0.02372724935412407,
-0.10349297523498535,
0.15703967213630676,
-0.08766641467809677,
-0.1612454205751419,
-0.15162819623947144,
0.007534675765782595,
-0.05603950843214989,
0.06244819611310959,
0.015435993671417236,
-0.050663672387599945,
-0.09792063385248184,
-0.0782376378774643,
0.16033877432346344,
-0.03979628160595894,
-0.007594780530780554,
0.01903502456843853,
-0.028896640986204147,
0.03709062933921814,
-0.18153032660484314,
-0.000742495059967041,
-0.040364447981119156,
-0.12562429904937744,
-0.03845880553126335,
0.0008190189255401492,
0.06899119168519974,
0.07125213742256165,
-0.037921637296676636,
-0.07706090062856674,
0.018709955736994743,
0.16396379470825195,
0.033019356429576874,
0.07747901231050491,
0.09302731603384018,
-0.09837593883275986,
0.0424172542989254,
0.04638836532831192,
0.03064572438597679,
-0.014183602295815945,
0.008994322270154953,
0.05759894475340843,
-0.02588776685297489,
-0.28572994470596313,
-0.009616539813578129,
-0.019065897911787033,
-0.01763351634144783,
0.06574488431215286,
0.04169488698244095,
-0.08455555140972137,
0.04923010617494583,
-0.057461950927972794,
0.03254595771431923,
0.08644450455904007,
0.04503887519240379,
0.09576259553432465,
-0.0391402505338192,
0.09309697896242142,
-0.05371923744678497,
-0.018519753590226173,
0.10897301882505417,
-0.05183432251214981,
0.19861935079097748,
-0.055761273950338364,
0.05443103611469269,
0.09761199355125427,
-0.012675230391323566,
0.038143981248140335,
0.13869307935237885,
-0.05273871123790741,
0.06988032907247543,
-0.0580194853246212,
-0.04550924152135849,
-0.039378248155117035,
0.02460361085832119,
-0.0010522200027480721,
0.036592815071344376,
-0.03585967794060707,
-0.016686538234353065,
-0.0029835430905222893,
0.23953086137771606,
0.06769458949565887,
-0.12298908084630966,
-0.06819044798612595,
0.0075537096709012985,
-0.10816052556037903,
-0.07050111144781113,
0.05070476979017258,
0.09034790843725204,
-0.08297499269247055,
0.047149479389190674,
0.009936010465025902,
0.06822545826435089,
-0.12662340700626373,
0.020518898963928223,
0.03855472803115845,
0.050416674464941025,
-0.025756606832146645,
0.033833928406238556,
-0.15454602241516113,
0.08310060203075409,
0.036160584539175034,
0.052887044847011566,
-0.05194680020213127,
0.06406906247138977,
0.02088218368589878,
-0.013631941750645638,
0.02601568214595318,
0.01112987007945776,
-0.021383320912718773,
-0.02788659557700157,
-0.0668589398264885,
0.08350658416748047,
0.07566870748996735,
-0.05194542929530144,
0.11945915967226028,
-0.04899315908551216,
0.012153515592217445,
-0.009429614059627056,
0.07687242329120636,
-0.17230714857578278,
-0.130369633436203,
0.04526885598897934,
-0.14265839755535126,
-0.024794744327664375,
-0.06799454241991043,
-0.054296672344207764,
-0.06919972598552704,
0.16792407631874084,
-0.12169159203767776,
-0.13317814469337463,
-0.08502212166786194,
-0.012405551970005035,
0.1540282815694809,
-0.03049190156161785,
0.007496263831853867,
-0.016866836696863174,
0.1321232169866562,
-0.037260327488183975,
-0.1518174558877945,
-0.04850536584854126,
-0.0708090215921402,
-0.15102572739124298,
-0.0337030291557312,
0.07127782702445984,
0.10968338698148727,
0.05178720876574516,
0.00508031016215682,
0.026542380452156067,
0.0035037477500736713,
-0.052538346499204636,
-0.015725042670965195,
0.1791246235370636,
0.05395917594432831,
0.07126305252313614,
-0.15973414480686188,
-0.05657326802611351,
-0.04990324005484581,
0.023440634831786156,
-0.04528232291340828,
0.09802165627479553,
-0.03006087802350521,
0.07860603928565979,
0.24215926229953766,
-0.1292925626039505,
-0.20313142240047455,
0.008287489414215088,
0.029231471940875053,
0.0038014764431864023,
0.007507172413170338,
-0.22526226937770844,
0.12212385982275009,
0.0897248312830925,
-0.00023822634830139577,
-0.007412434555590153,
-0.1846536248922348,
-0.08203323930501938,
0.0806843712925911,
0.008799183182418346,
0.1454176902770996,
-0.0927698016166687,
-0.03194493055343628,
0.008657689206302166,
-0.08498282730579376,
0.05245332047343254,
0.04680805280804634,
0.08300793915987015,
0.00012860629067290574,
-0.07529857009649277,
0.05036343261599541,
-0.014434142969548702,
0.08609063923358917,
0.04536600038409233,
0.04696028679609299,
-0.034350521862506866,
0.13136643171310425,
0.002829915378242731,
-0.01664891466498375,
0.13809528946876526,
0.1142084077000618,
0.05680635944008827,
-0.02552870847284794,
-0.06232491880655289,
-0.07285419851541519,
0.011986133642494678,
-0.021405428647994995,
-0.039191506803035736,
-0.06441206485033035,
0.040240105241537094,
0.0627736896276474,
0.0006625985843129456,
-0.04330418258905411,
-0.024868758395314217,
0.05873031169176102,
0.09052975475788116,
0.1940765678882599,
-0.05478100851178169,
-0.006992436945438385,
-0.017969291657209396,
-0.022639814764261246,
0.06950785964727402,
-0.018455583602190018,
0.0648365393280983,
0.0895535945892334,
0.009471820667386055,
0.08247411251068115,
0.06230113282799721,
-0.1317022740840912,
-0.0224149152636528,
0.05419134348630905,
-0.10139267891645432,
-0.13706830143928528,
-0.026991799473762512,
-0.10620251297950745,
-0.13432425260543823,
-0.0010194311616942286,
0.17112478613853455,
-0.03650843724608421,
-0.04689663276076317,
-0.015877740457654,
0.08022035658359528,
0.019281556829810143,
0.1309305727481842,
0.03360142186284065,
-0.014971376396715641,
-0.06242288649082184,
0.1716049164533615,
0.09000010788440704,
-0.09424157440662384,
0.010633972473442554,
0.016859276220202446,
-0.059216007590293884,
-0.0050431364215910435,
-0.06565041840076447,
0.07646346837282181,
-0.027635226026177406,
-0.03899434208869934,
0.0017191566294059157,
-0.10076810419559479,
0.04982869327068329,
0.15025493502616882,
0.006933463271707296,
0.15806430578231812,
-0.03737553954124451,
0.06321890652179718,
-0.07513424009084702,
0.07313283532857895,
0.053456082940101624,
0.07734967768192291,
-0.01691606268286705,
0.04815022274851799,
-0.04547319933772087,
-0.0012574198190122843,
-0.014692659489810467,
0.001868500025011599,
-0.09062358736991882,
-0.0554400309920311,
-0.2220645248889923,
0.026539642363786697,
-0.057499129325151443,
-0.03584069758653641,
0.010055694729089737,
-0.014148122631013393,
0.0034294696524739265,
0.036467019468545914,
-0.025741364806890488,
-0.03264801204204559,
-0.02601943165063858,
0.06293575465679169,
-0.12331432104110718,
0.026194484904408455,
0.06732772290706635,
-0.08744511753320694,
0.07734938710927963,
-0.0006885374314151704,
-0.05177291855216026,
-0.000795499247033149,
0.01381086278706789,
-0.046740274876356125,
-0.03059508465230465,
0.008515695109963417,
-0.05321645364165306,
-0.11186445504426956,
0.027699695900082588,
0.011421806178987026,
-0.02749338001012802,
-0.02960091643035412,
0.07549742609262466,
-0.06559191644191742,
0.0524899885058403,
0.03757442533969879,
0.004362456034868956,
-0.04338677600026131,
-0.015659144148230553,
0.11922511458396912,
0.07613899558782578,
0.05546003207564354,
-0.051037341356277466,
-0.018752671778202057,
-0.1558130532503128,
-0.0019178371876478195,
-0.00241960515268147,
-0.0038676473777741194,
-0.040045078843832016,
-0.03684880957007408,
0.031001416966319084,
0.01076875627040863,
0.1782798320055008,
0.008552655577659607,
0.013303538784384727,
0.0099163968116045,
0.004435400012880564,
0.009345926344394684,
0.03454094007611275,
0.07039549201726913,
-0.014791570603847504,
-0.078352190554142,
-0.07913090288639069,
0.03385106474161148,
-0.031100140884518623,
-0.021148893982172012,
0.13398316502571106,
0.1355150043964386,
0.10579759627580643,
0.02200954593718052,
0.0013171067694202065,
-0.03033190593123436,
-0.028910523280501366,
0.023299187421798706,
0.05556594207882881,
0.050785697996616364,
-0.014188634231686592,
0.012901026755571365,
0.06661465018987656,
-0.1245652437210083,
0.12108021229505539,
-0.03614611551165581,
-0.02861163578927517,
-0.10565681010484695,
-0.07759525626897812,
-0.018591588363051414,
-0.019657766446471214,
-0.02015850692987442,
-0.16531777381896973,
0.04738445207476616,
0.10615073889493942,
0.021633490920066833,
-0.03198952227830887,
0.03791891783475876,
-0.14937043190002441,
-0.08945215493440628,
0.06517678499221802,
0.013299445621669292,
0.043683867901563644,
0.10887439548969269,
-0.014512940309941769,
0.07955034077167511,
0.14279665052890778,
0.061521049588918686,
0.05518745630979538,
0.08330019563436508,
0.009614498354494572,
-0.019232889637351036,
-0.04002325236797333,
0.007518385071307421,
-0.06203814595937729,
0.038463156670331955,
0.1602306365966797,
0.03197884559631348,
-0.05000799521803856,
0.035220593214035034,
0.18065062165260315,
-0.03506360575556755,
-0.05089930072426796,
-0.1780715435743332,
0.20792023837566376,
0.022542577236890793,
0.0428297184407711,
0.050022274255752563,
-0.08952973783016205,
-0.03528662025928497,
0.2063862830400467,
0.10789702087640762,
0.023140035569667816,
-0.02344069816172123,
0.017768030986189842,
-0.010580640286207199,
0.008606337010860443,
0.0837201401591301,
0.0060176667757332325,
0.2181633710861206,
-0.03817328065633774,
0.018469544127583504,
0.03186415880918503,
0.03465794771909714,
-0.07278338074684143,
0.15212738513946533,
-0.04712830111384392,
-0.0018009415362030268,
-0.05619608238339424,
0.021610911935567856,
0.020504804328083992,
-0.3069562613964081,
-0.11044880002737045,
0.0001639225665712729,
-0.0636979192495346,
-0.01731412671506405,
-0.021811900660395622,
-0.00610633660107851,
0.05270785838365555,
-0.003648549783974886,
0.030689571052789688,
0.18194317817687988,
-0.0058052088133990765,
-0.03739340603351593,
-0.04151805117726326,
0.12162954360246658,
0.021215880289673805,
0.1386847347021103,
0.05869319662451744,
-0.01596410945057869,
0.04452509060502052,
0.02338576503098011,
-0.11280958354473114,
-0.0335569828748703,
-0.026057584211230278,
-0.00970123428851366,
-0.02272212691605091,
0.13327448070049286,
0.02029218152165413,
0.0487213209271431,
0.0388764888048172,
-0.044422879815101624,
0.050619687885046005,
0.041549380868673325,
-0.0687170922756195,
-0.05271913856267929,
0.043841149657964706,
-0.0970156341791153,
0.14084187150001526,
0.1766054928302765,
0.012322993017733097,
0.024823013693094254,
-0.05625041201710701,
-0.009323005564510822,
0.0009562780032865703,
0.11631929874420166,
-0.004099865444004536,
-0.15850773453712463,
-0.013285006396472454,
-0.08655189722776413,
0.039850637316703796,
-0.22302643954753876,
-0.044743914157152176,
0.10225848853588104,
-0.010714949108660221,
-0.009338372386991978,
0.04902389645576477,
0.0053742616437375546,
0.059232551604509354,
-0.01592746004462242,
-0.04794734716415405,
0.00939359050244093,
0.06759986281394958,
-0.08257027715444565,
-0.03166640177369118
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 0k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 0k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_0k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_0k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_0k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_0k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
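Because this checkpoint is taken at step 0k, before any pre-training updates, it can serve as a baseline when studying how representations change over training. The sketch below mean-pools the final hidden states and compares this checkpoint against the fully trained seed-2 model; it assumes the `google/multiberts-seed_2` checkpoint is available and is only an illustration of the comparison, not a prescribed analysis.
```
import torch
from transformers import BertTokenizer, BertModel

def mean_pooled(model_name, text):
    # Encode the text and mean-pool the final hidden states into a single vector.
    tokenizer = BertTokenizer.from_pretrained(model_name)
    model = BertModel.from_pretrained(model_name)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)

text = "Replace me by any text you'd like."
v_step0 = mean_pooled("google/multiberts-seed_2-step_0k", text)
v_final = mean_pooled("google/multiberts-seed_2", text)  # assumed fully trained seed-2 checkpoint

# Cosine similarity between the untrained and fully trained representations.
print(torch.nn.functional.cosine_similarity(v_step0, v_final, dim=0).item())
```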
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_0k"]}
| null |
google/multiberts-seed_2-step_0k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_0k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 0k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 0k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 0k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 0k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 0k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07670225203037262,
0.10089991241693497,
-0.0026217589620500803,
0.040379010140895844,
0.07558393478393555,
-0.016228588297963142,
0.08226825296878815,
0.1050560399889946,
-0.01911226660013199,
0.02703704684972763,
0.0810936912894249,
0.007365403696894646,
0.015540518797934055,
0.09639372676610947,
0.02142329327762127,
-0.22175905108451843,
0.022609982639551163,
-0.030777564272284508,
-0.08467402309179306,
0.07572444528341293,
0.09787574410438538,
-0.08009996265172958,
0.04278324916958809,
0.027229096740484238,
-0.1101812943816185,
0.04470366984605789,
0.0036547507625073195,
-0.017591407522559166,
0.1314215511083603,
-0.0021098272409290075,
0.05045493319630623,
0.055737681686878204,
0.041631974279880524,
-0.136249378323555,
0.006223615258932114,
0.05953116714954376,
0.060439176857471466,
0.04472225531935692,
0.020749036222696304,
0.07146552205085754,
0.0029773933347314596,
0.02454420179128647,
0.04332592338323593,
0.023168567568063736,
-0.07415781170129776,
-0.06875652074813843,
-0.09837227314710617,
0.04065953567624092,
0.02477703057229519,
0.007188939023762941,
0.011863281950354576,
0.12848567962646484,
-0.03526065871119499,
0.043286990374326706,
0.18071088194847107,
-0.3381012976169586,
-0.012197567149996758,
0.07062695920467377,
0.04081397131085396,
0.12550672888755798,
-0.005039013922214508,
-0.01776975207030773,
0.07324950397014618,
0.022902626544237137,
0.09054289013147354,
-0.039665739983320236,
0.02869386598467827,
-0.055732812732458115,
-0.151974156498909,
-0.046883001923561096,
0.08953902125358582,
-0.004305704962462187,
-0.13803492486476898,
-0.03643558919429779,
-0.04447012022137642,
0.037299204617738724,
0.011681816540658474,
-0.031643323600292206,
0.03315901383757591,
0.017512314021587372,
-0.023335663601756096,
-0.012806763872504234,
-0.1043340414762497,
-0.05372263863682747,
0.03324902057647705,
0.08496279269456863,
0.10181672871112823,
0.06915540993213654,
-0.001339751179330051,
0.11056181788444519,
-0.1908411830663681,
-0.05347633361816406,
-0.03184948116540909,
-0.050871189683675766,
-0.04696083813905716,
-0.007628138642758131,
-0.10866861790418625,
-0.04114697501063347,
0.01285821758210659,
0.13173896074295044,
-0.004163829144090414,
0.026837743818759918,
-0.03450573608279228,
0.009222645312547684,
0.05619267746806145,
0.04320809990167618,
-0.001074011786840856,
0.028829311951994896,
0.02307181805372238,
-0.015210026875138283,
-0.01914997398853302,
0.012711361050605774,
0.0011629352811723948,
0.028522124513983727,
0.12067779898643494,
0.022955330088734627,
-0.09934064000844955,
0.06827619671821594,
-0.01806100457906723,
-0.045480452477931976,
0.0105212377384305,
-0.0875643938779831,
-0.05368434637784958,
-0.033934373408555984,
0.001701866276562214,
0.016398051753640175,
-0.005495023913681507,
-0.005909613333642483,
-0.02649458684027195,
-0.03945884108543396,
-0.08396172523498535,
-0.04366065189242363,
-0.05189885199069977,
-0.12775598466396332,
0.00820144359022379,
-0.17846845090389252,
-0.036708541214466095,
-0.11515489220619202,
-0.18490344285964966,
-0.018830589950084686,
0.0677134096622467,
-0.013785064220428467,
-0.05650027096271515,
0.07712449133396149,
0.03864993527531624,
-0.02784130349755287,
-0.0009769899770617485,
0.0713137611746788,
-0.0036876993253827095,
0.04467066749930382,
-0.02791719138622284,
0.06668057292699814,
-0.0004667756729759276,
0.033287908881902695,
-0.059179168194532394,
0.06124648079276085,
-0.172135591506958,
0.0466173030436039,
-0.07392137497663498,
-0.03608240932226181,
-0.08969376236200333,
-0.032945431768894196,
-0.009274355135858059,
0.001983662135899067,
0.02309568226337433,
0.0756807029247284,
-0.18222762644290924,
-0.030989976599812508,
0.1256367415189743,
-0.16244886815547943,
-0.017197463661432266,
0.0718928650021553,
-0.046259745955467224,
0.09922438859939575,
0.06822259724140167,
0.1579417586326599,
-0.006624035537242889,
-0.07999718189239502,
0.05475504323840141,
-0.009831497445702553,
0.015242215245962143,
-0.010935780592262745,
0.0711701363325119,
-0.021092647686600685,
-0.1548214852809906,
0.03466273844242096,
-0.13034458458423615,
-0.00741787301376462,
-0.07899364829063416,
0.018340617418289185,
-0.012979461811482906,
-0.0640343651175499,
-0.06265301257371902,
-0.026144834235310555,
0.06856701523065567,
-0.07216932624578476,
-0.021129298955202103,
0.04076196253299713,
0.07105020433664322,
-0.07589678466320038,
0.06901220232248306,
-0.011875088326632977,
0.01858562044799328,
-0.08996506035327911,
-0.04021881893277168,
-0.18856275081634521,
0.04602568596601486,
0.10214066505432129,
0.007912003435194492,
-0.019810713827610016,
0.14745467901229858,
0.00703316880390048,
0.0662817656993866,
-0.05028722807765007,
0.01438173558562994,
-0.014579761773347855,
-0.0035847441758960485,
-0.09034072607755661,
-0.1005602478981018,
-0.0724128782749176,
-0.06830771267414093,
0.08987583965063095,
-0.1285613477230072,
0.019764427095651627,
-0.05672728642821312,
0.04208006709814072,
0.022090865299105644,
-0.08538039773702621,
-0.019159700721502304,
0.014627426862716675,
-0.060098160058259964,
-0.056094974279403687,
0.043518807739019394,
0.07128698378801346,
-0.014110328629612923,
0.09332744777202606,
-0.0515156090259552,
-0.08738869428634644,
0.032002151012420654,
0.09710775315761566,
-0.10336698591709137,
0.012684841640293598,
-0.057613298296928406,
-0.04227660596370697,
-0.06484033912420273,
-0.010335633531212807,
0.08557108044624329,
-0.004871831275522709,
0.13758201897144318,
-0.07465940713882446,
-0.003575257956981659,
0.01620519906282425,
-0.025272294878959656,
-0.02109374664723873,
0.03721024468541145,
0.06615359336137772,
-0.07290589064359665,
0.015247137285768986,
0.03874054178595543,
0.006696310825645924,
0.07148429751396179,
-0.05485016852617264,
-0.08979722857475281,
0.010867922566831112,
0.03591569885611534,
0.03059842810034752,
0.07079458236694336,
-0.023966511711478233,
-0.0094679594039917,
0.03569615259766579,
0.019606566056609154,
0.0067651281133294106,
-0.11824259907007217,
0.06320489943027496,
0.055290136486291885,
0.0010078194318339229,
0.06136743351817131,
-0.01723741553723812,
-0.04007808119058609,
0.07882668077945709,
0.03837626054883003,
0.0021158589515835047,
-0.013376670889556408,
-0.013337286189198494,
-0.11923345178365707,
0.1887485384941101,
-0.060514286160469055,
-0.15886805951595306,
-0.07563235610723495,
-0.09648605436086655,
-0.0014278159942477942,
0.024550994858145714,
0.03997272998094559,
-0.01356207113713026,
-0.0427161380648613,
-0.12655876576900482,
0.055465780198574066,
-0.04300196096301079,
0.06745230406522751,
0.10947062820196152,
-0.041943807154893875,
0.059231098741292953,
-0.12438440322875977,
-0.008482875302433968,
-0.08053919672966003,
-0.07826769351959229,
0.06281585246324539,
-0.04908624291419983,
0.01958867907524109,
0.09773145616054535,
0.02288592793047428,
-0.01777167245745659,
-0.025858795270323753,
0.19840700924396515,
0.04327678307890892,
0.039167821407318115,
0.12635934352874756,
-0.06432691961526871,
0.05573588237166405,
0.08422255516052246,
0.008727576583623886,
-0.043175287544727325,
0.049827102571725845,
0.04810616746544838,
-0.06874186545610428,
-0.19191087782382965,
-0.023067450150847435,
-0.00804118812084198,
-0.04272022843360901,
0.07410191744565964,
0.03710951656103134,
0.0060270605608820915,
0.06867620348930359,
0.015849031507968903,
0.05884530022740364,
-0.0014846940757706761,
0.10165157169103622,
0.012717638164758682,
-0.02985195629298687,
0.0890733152627945,
-0.021475406363606453,
-0.00427479762583971,
0.08363700658082962,
-0.020804427564144135,
0.2880803644657135,
-0.03240998461842537,
0.008926614187657833,
0.12375935167074203,
0.042596183717250824,
0.06318536400794983,
0.12102203071117401,
-0.06519052386283875,
0.02261715941131115,
-0.07188587635755539,
-0.06106581166386604,
-0.005235377699136734,
0.04412392899394035,
-0.055256325751543045,
0.011455705389380455,
-0.07515406608581543,
0.01604422740638256,
-0.02000417187809944,
0.3117079734802246,
0.12003485858440399,
-0.10446711629629135,
-0.062061160802841187,
0.004392900969833136,
-0.10030057281255722,
-0.06979727745056152,
0.04237361624836922,
0.07191302627325058,
-0.1330222636461258,
0.008438227698206902,
-0.02615196630358696,
0.07445520162582397,
-0.017326217144727707,
0.018674442544579506,
0.026219438761472702,
0.032960981130599976,
-0.035447634756565094,
0.009914586320519447,
-0.18665802478790283,
0.19578991830348969,
0.006754744332283735,
0.021825063973665237,
-0.054069723933935165,
0.03239815682172775,
0.003841163357719779,
-0.03351563587784767,
0.060426898300647736,
0.023881230503320694,
-0.029498063027858734,
-0.042987897992134094,
-0.05532753840088844,
0.017735423520207405,
0.07766794413328171,
-0.04595421627163887,
0.10797208547592163,
-0.0073380316607654095,
0.04239470139145851,
0.020045997574925423,
0.09683365374803543,
-0.18437162041664124,
-0.08748549222946167,
0.03310292959213257,
-0.054346632212400436,
-0.10406862944364548,
-0.07901712507009506,
-0.09265842288732529,
0.0057087368331849575,
0.2547104060649872,
-0.11916547268629074,
-0.07349728792905807,
-0.09563455730676651,
0.029021067544817924,
0.10224693268537521,
-0.049601562321186066,
0.025566868484020233,
-0.01008615456521511,
0.12860608100891113,
-0.06282953172922134,
-0.13463181257247925,
0.022106625139713287,
-0.09024413675069809,
-0.16799144446849823,
-0.06503088027238846,
0.11456374824047089,
0.061164144426584244,
0.03537484258413315,
-0.027720794081687927,
0.024066485464572906,
0.03665211796760559,
-0.0367426723241806,
0.00007947351696202531,
0.07716047763824463,
0.1002601906657219,
0.02990293875336647,
-0.10785258561372757,
0.0169572401791811,
-0.06150706112384796,
-0.06616921722888947,
0.07671918720006943,
0.2650287449359894,
-0.05670500919222832,
0.1292756199836731,
0.11368624120950699,
-0.07966314256191254,
-0.15324272215366364,
0.026553409174084663,
0.09293458610773087,
-0.017094064503908157,
0.010055101476609707,
-0.1585308164358139,
0.08485087007284164,
0.11093504726886749,
-0.023826107382774353,
0.0027722464874386787,
-0.19164258241653442,
-0.12975089251995087,
0.06442518532276154,
0.09777173399925232,
0.27821630239486694,
-0.060416508466005325,
-0.04290030896663666,
0.01947634667158127,
-0.08875250816345215,
0.02550208568572998,
0.12193548679351807,
0.06522708386182785,
-0.02491968870162964,
-0.07681859284639359,
0.014538897201418877,
-0.040437113493680954,
0.09428443014621735,
0.0522085577249527,
0.05399785563349724,
-0.0012176788877695799,
0.018784988671541214,
-0.020406654104590416,
-0.04697655141353607,
0.05994543805718422,
0.01819889061152935,
0.050108134746551514,
-0.08366837352514267,
-0.027984105050563812,
-0.07031740248203278,
0.027018195018172264,
-0.024090709164738655,
-0.07584008574485779,
-0.05674443766474724,
0.07514708489179611,
0.04974871873855591,
-0.0243375226855278,
0.023825831711292267,
0.025700481608510017,
0.1194092258810997,
0.17041967809200287,
-0.004476815462112427,
-0.04161085933446884,
-0.05847971886396408,
-0.03943872079253197,
-0.01902109570801258,
0.07569490373134613,
-0.050119414925575256,
0.028241213411092758,
0.06523405015468597,
0.02316371351480484,
0.0969436764717102,
0.05761805549263954,
-0.11229229718446732,
-0.017050793394446373,
0.029723890125751495,
-0.16346973180770874,
0.01314613875001669,
0.00279088388197124,
0.028523996472358704,
-0.034050360321998596,
0.031280048191547394,
0.15670402348041534,
-0.06717032194137573,
-0.035268012434244156,
-0.04032135009765625,
0.06677678972482681,
0.018342213705182076,
0.1356220245361328,
0.029820336028933525,
0.037041276693344116,
-0.08018400520086288,
0.12835484743118286,
0.0389777272939682,
-0.039488356560468674,
0.025214964523911476,
-0.031456489115953445,
-0.1075047105550766,
0.010417093522846699,
0.06518739461898804,
0.04246580973267555,
-0.04520738497376442,
-0.010588430799543858,
-0.025622788816690445,
-0.07227329164743423,
0.062437593936920166,
0.18801124393939972,
0.06400761008262634,
0.0758301317691803,
-0.05728135630488396,
-0.03751438111066818,
-0.08218822628259659,
0.04505482688546181,
0.04175955429673195,
0.07249647378921509,
-0.07638927549123764,
0.1055803894996643,
0.010194428265094757,
0.042342349886894226,
-0.03159550204873085,
-0.0545680969953537,
-0.10290445387363434,
-0.054649826139211655,
-0.10536371916532516,
0.01226386521011591,
-0.074664406478405,
-0.04171305522322655,
0.004167621955275536,
-0.006106316111981869,
-0.006525869481265545,
0.047948870807886124,
-0.06293826550245285,
-0.00773618184030056,
-0.026667913421988487,
0.03695731610059738,
-0.06750393658876419,
-0.038344696164131165,
0.028082940727472305,
-0.10183949023485184,
0.09563831239938736,
0.05707854777574539,
0.004241590388119221,
0.00659466115757823,
0.08865167945623398,
-0.01961708441376686,
0.02216138131916523,
0.013971877284348011,
-0.04866064339876175,
-0.08519292622804642,
0.0012379244435578585,
-0.007917153649032116,
-0.013940885663032532,
-0.008551640436053276,
0.09401129931211472,
-0.0853010043501854,
0.029322287067770958,
-0.006178089417517185,
-0.009741024114191532,
-0.07224301993846893,
-0.011452498845756054,
0.09460549056529999,
0.09686427563428879,
0.04624250903725624,
-0.0910271629691124,
0.014173581264913082,
-0.1381438821554184,
-0.03670436143875122,
0.008126894943416119,
-0.007377247326076031,
-0.1238052025437355,
-0.011064497753977776,
0.019647955894470215,
-0.003973888698965311,
0.20379582047462463,
-0.055277176201343536,
-0.01752595044672489,
0.018422184512019157,
-0.10456439852714539,
0.1119154542684555,
-0.02156367152929306,
0.18583561480045319,
-0.0063482713885605335,
-0.04116005823016167,
-0.013944785110652447,
0.03626616671681404,
0.01888066716492176,
-0.021253371611237526,
0.18474224209785461,
0.13930204510688782,
0.030407151207327843,
0.04172755777835846,
-0.02873062901198864,
-0.0014136048266664147,
-0.05792452022433281,
-0.021495765075087547,
0.02836683951318264,
0.04550868272781372,
0.017176231369376183,
0.15411926805973053,
0.06828654557466507,
-0.16615509986877441,
0.03291589766740799,
-0.02671179734170437,
-0.03891579806804657,
-0.11883696913719177,
-0.10614423453807831,
-0.03486941382288933,
-0.07380487769842148,
0.008305219002068043,
-0.12172579020261765,
0.008853289298713207,
0.17586277425289154,
0.05802183970808983,
0.027147436514496803,
-0.0006795997032895684,
-0.12092723697423935,
-0.03368956595659256,
0.05255704000592232,
0.013819446787238121,
0.023455647751688957,
0.05546103045344353,
0.0030852805357426405,
0.06090548262000084,
0.04264025390148163,
0.015437047928571701,
0.004989054519683123,
0.08414866030216217,
0.018015606328845024,
0.040394216775894165,
-0.06390436738729477,
-0.0036381345707923174,
-0.0407106988132,
0.07189246267080307,
0.10161081701517105,
0.04919716715812683,
-0.05065502971410751,
-0.0069497572258114815,
0.16342388093471527,
-0.045555196702480316,
-0.004916660953313112,
-0.12743228673934937,
0.3421713411808014,
0.008559092879295349,
0.012571755796670914,
0.04604810103774071,
-0.07532185316085815,
-0.051817163825035095,
0.20167583227157593,
0.08776316046714783,
-0.017585523426532745,
-0.018787376582622528,
0.0045541319996118546,
-0.03153986111283302,
-0.0229814313352108,
0.14887036383152008,
0.03031221404671669,
0.12738943099975586,
-0.05583951249718666,
-0.050880543887615204,
-0.025334371253848076,
-0.007315644063055515,
-0.1266409158706665,
0.13596442341804504,
-0.032544124871492386,
-0.026967862620949745,
-0.0742739737033844,
0.02157638780772686,
0.06751309335231781,
-0.31606411933898926,
0.005773111246526241,
-0.030448218807578087,
-0.10918061435222626,
-0.0031744688749313354,
-0.01613554358482361,
-0.024631842970848083,
0.04720452055335045,
-0.04811472073197365,
0.07065693289041519,
0.04715679585933685,
0.03294242545962334,
-0.020193809643387794,
-0.08724287152290344,
0.16319629549980164,
0.04869954288005829,
0.09321485459804535,
0.026932738721370697,
0.07677794247865677,
0.057675156742334366,
0.03598163276910782,
-0.09537262469530106,
0.04462548345327377,
0.011697336100041866,
-0.09037569165229797,
-0.05338948592543602,
0.12346906214952469,
-0.0033919932320713997,
0.04745420068502426,
0.046833399683237076,
-0.10817131400108337,
0.012096278369426727,
0.07308556139469147,
-0.07238360494375229,
-0.09391447901725769,
-0.011537319049239159,
-0.08988089114427567,
0.15276306867599487,
0.14022235572338104,
-0.017604254186153412,
0.019257044419646263,
-0.06640569865703583,
-0.004678674042224884,
0.05133574455976486,
0.008747143670916557,
-0.018485961481928825,
-0.19029560685157776,
0.035678427666425705,
-0.07786542922258377,
-0.0030512220691889524,
-0.22928489744663239,
-0.09962023049592972,
-0.0072736842557787895,
-0.04898641258478165,
-0.02850574068725109,
0.05838526412844658,
0.024757850915193558,
0.06592673063278198,
-0.0196693055331707,
-0.04658955708146095,
-0.02950575202703476,
0.08845952898263931,
-0.10732295364141464,
-0.06500229239463806
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters to
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1000k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1000k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
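The snippet below goes one step further and probes the masked language modelling head of this intermediate checkpoint. It is a minimal sketch rather than part of the original release: the example sentence and the top-5 cut-off are illustrative, and it assumes the MLM head weights are included in the checkpoint (if they are not, `from_pretrained` will warn that the head is freshly initialised and the predictions will be meaningless).
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1000k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_2-step_1000k')
model.eval()

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**encoded_input).logits

# Find the [MASK] position and list the five most likely fillers.
mask_positions = (encoded_input.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions[0]].topk(5).indices
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```
Running the same probe against earlier or later checkpoints of the same seed is one way to watch the MLM predictions sharpen over the course of pre-training.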
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1000k"]}
| null |
google/multiberts-seed_2-step_1000k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1000k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1000k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0771099105477333,
0.10077618062496185,
-0.0023425407707691193,
0.04165726155042648,
0.07910272479057312,
-0.016125498339533806,
0.07832420617341995,
0.10218749940395355,
-0.014974563382565975,
0.029039476066827774,
0.08100080490112305,
0.006734235677868128,
0.01940685696899891,
0.10089602321386337,
0.02571389265358448,
-0.22672586143016815,
0.023864183574914932,
-0.031008725985884666,
-0.08422231674194336,
0.07558047026395798,
0.09911590814590454,
-0.08313703536987305,
0.04524283483624458,
0.02581407129764557,
-0.10944852977991104,
0.044548969715833664,
0.001299037947319448,
-0.021251345053315163,
0.13293491303920746,
-0.008663881570100784,
0.05079149454832077,
0.05315566807985306,
0.04995746538043022,
-0.13222596049308777,
0.005361293442547321,
0.058973025530576706,
0.06352265924215317,
0.04459052532911301,
0.019248051568865776,
0.07840979099273682,
-0.0011249665403738618,
0.021920816972851753,
0.044880226254463196,
0.021396977826952934,
-0.07481352239847183,
-0.06631918996572495,
-0.1042010635137558,
0.04338933899998665,
0.02859628014266491,
0.0030579823069274426,
0.00986106600612402,
0.12426720559597015,
-0.033411551266908646,
0.041019488126039505,
0.17860420048236847,
-0.3328002691268921,
-0.010299213230609894,
0.07177960872650146,
0.04755104333162308,
0.12637831270694733,
-0.004848348442465067,
-0.019256016239523888,
0.07590801268815994,
0.021602490916848183,
0.0867304876446724,
-0.04087880998849869,
0.02058718539774418,
-0.05580854043364525,
-0.15601161122322083,
-0.046713996678590775,
0.08919955790042877,
-0.003165812697261572,
-0.1384735405445099,
-0.0358976274728775,
-0.04335588961839676,
0.03147435933351517,
0.013641385361552238,
-0.03704686090350151,
0.03326502814888954,
0.014260011725127697,
-0.021250318735837936,
-0.010668081231415272,
-0.1025853380560875,
-0.05562286078929901,
0.03062286227941513,
0.079609714448452,
0.10294545441865921,
0.06931523978710175,
0.0008706431253813207,
0.11355911195278168,
-0.19590544700622559,
-0.05068580433726311,
-0.02930200845003128,
-0.052062176167964935,
-0.049402881413698196,
-0.009765896946191788,
-0.10545329004526138,
-0.03727275878190994,
0.009015516377985477,
0.13186989724636078,
0.007256950251758099,
0.024320390075445175,
-0.026278462260961533,
0.008929424919188023,
0.05780963599681854,
0.047387849539518356,
-0.00837902445346117,
0.01615271158516407,
0.024379583075642586,
-0.012285572476685047,
-0.02065768837928772,
0.013955213129520416,
-0.00042136432603001595,
0.024076225236058235,
0.11990667134523392,
0.024519475176930428,
-0.10197965055704117,
0.07113560289144516,
-0.017011189833283424,
-0.041934605687856674,
0.007107059005647898,
-0.08982626348733902,
-0.054102543741464615,
-0.03935300186276436,
-0.0019016143633052707,
0.012891171500086784,
-0.00797932967543602,
-0.004414929077029228,
-0.024293769150972366,
-0.037762705236673355,
-0.08331441134214401,
-0.04765451326966286,
-0.05479362607002258,
-0.130780428647995,
0.008537648245692253,
-0.19228897988796234,
-0.03655368834733963,
-0.11351733654737473,
-0.18805177509784698,
-0.02394266612827778,
0.060517024248838425,
-0.008893117308616638,
-0.05770579352974892,
0.07439630478620529,
0.03947334736585617,
-0.03006589412689209,
-0.0018870752537623048,
0.07701264321804047,
-0.0010482205543667078,
0.04400567337870598,
-0.030162671580910683,
0.072273850440979,
0.0016338061541318893,
0.03212180733680725,
-0.05799815058708191,
0.06460019201040268,
-0.167307049036026,
0.04535014182329178,
-0.07178245484828949,
-0.031749382615089417,
-0.08766002953052521,
-0.03406844288110733,
-0.010835222899913788,
0.0016321177827194333,
0.025171199813485146,
0.07290904968976974,
-0.19077058136463165,
-0.027152152732014656,
0.1268336921930313,
-0.16049635410308838,
-0.02058582566678524,
0.0724479928612709,
-0.04795291647315025,
0.1099596619606018,
0.07426415383815765,
0.15349313616752625,
-0.006223473697900772,
-0.07615624368190765,
0.05183935910463333,
-0.012705346569418907,
0.011287576518952847,
-0.01046978309750557,
0.07037820667028427,
-0.02063661813735962,
-0.1490774303674698,
0.03426505997776985,
-0.12873099744319916,
-0.008444900624454021,
-0.07821618765592575,
0.017542529851198196,
-0.01009807363152504,
-0.06683142483234406,
-0.06361902505159378,
-0.0281751099973917,
0.06502308696508408,
-0.07309900224208832,
-0.015503284521400928,
0.034197259694337845,
0.07145437598228455,
-0.07260363548994064,
0.06752275675535202,
-0.012218000367283821,
0.0161944217979908,
-0.08699513226747513,
-0.038426682353019714,
-0.18762388825416565,
0.051706843078136444,
0.10481808334589005,
0.0010455646552145481,
-0.018357353284955025,
0.1409963220357895,
0.006205868907272816,
0.06684955954551697,
-0.04764619097113609,
0.01246938481926918,
-0.014099875465035439,
-0.0028024548664689064,
-0.09312887489795685,
-0.09343548864126205,
-0.0746946707367897,
-0.06867140531539917,
0.08362780511379242,
-0.12388692796230316,
0.021321425214409828,
-0.05339888110756874,
0.03967888280749321,
0.021016573533415794,
-0.08303743600845337,
-0.019655415788292885,
0.014442031271755695,
-0.05863932520151138,
-0.05922987684607506,
0.03960129991173744,
0.06802792102098465,
-0.013844658620655537,
0.09159063547849655,
-0.04899146780371666,
-0.09658641368150711,
0.03127860277891159,
0.10237303376197815,
-0.10652968287467957,
0.015100928023457527,
-0.05678350105881691,
-0.042574021965265274,
-0.06677431613206863,
-0.01896478794515133,
0.08533816039562225,
-0.006003446877002716,
0.13303469121456146,
-0.07654044032096863,
-0.005217851139605045,
0.015523972921073437,
-0.02285929024219513,
-0.018803641200065613,
0.03563081845641136,
0.06844877451658249,
-0.07449984550476074,
0.014020427130162716,
0.039063140749931335,
0.007757309824228287,
0.06797494739294052,
-0.054640352725982666,
-0.08938752859830856,
0.010633555240929127,
0.033815350383520126,
0.02940683811903,
0.0683961883187294,
-0.025048058480024338,
-0.014762990176677704,
0.033479947596788406,
0.017172913998365402,
0.004786881152540445,
-0.11966124176979065,
0.06384662538766861,
0.05520462244749069,
0.003657299792394042,
0.05493312329053879,
-0.017504435032606125,
-0.03804662823677063,
0.08041928708553314,
0.038139455020427704,
0.005327650345861912,
-0.016500752419233322,
-0.014919271692633629,
-0.11768435686826706,
0.18988682329654694,
-0.05982048809528351,
-0.1552916318178177,
-0.07522847503423691,
-0.0962008461356163,
0.0066074952483177185,
0.027623487636446953,
0.03581036254763603,
-0.017207657918334007,
-0.04312935099005699,
-0.1245010495185852,
0.0630473718047142,
-0.03723994269967079,
0.0711682140827179,
0.10850819200277328,
-0.03927096724510193,
0.049317046999931335,
-0.1254890114068985,
-0.00763600692152977,
-0.08118398487567902,
-0.07824775576591492,
0.05897265300154686,
-0.04641363024711609,
0.0257706418633461,
0.09998364001512527,
0.02202242612838745,
-0.018543371930718422,
-0.027233529835939407,
0.1989801973104477,
0.04034297913312912,
0.04475544020533562,
0.12492550164461136,
-0.06507323682308197,
0.05699428170919418,
0.08767735958099365,
0.007643257267773151,
-0.043482016772031784,
0.05193372443318367,
0.04322156682610512,
-0.06888312101364136,
-0.19058994948863983,
-0.02031530998647213,
-0.005208938382565975,
-0.044330962002277374,
0.07574132084846497,
0.033823274075984955,
-0.0020749394316226244,
0.07501871138811111,
0.013696693815290928,
0.06366468966007233,
-0.0027701384387910366,
0.09777417778968811,
0.004340657033026218,
-0.034358587116003036,
0.08490617573261261,
-0.021256454288959503,
-0.01197355892509222,
0.08271928131580353,
-0.017413655295968056,
0.29541298747062683,
-0.03059571422636509,
0.011479909531772137,
0.12202700227499008,
0.042923908680677414,
0.06264637410640717,
0.1307568997144699,
-0.06430356949567795,
0.024073539301753044,
-0.07251422852277756,
-0.05832561105489731,
-0.0005899753305129707,
0.04491681978106499,
-0.05562342330813408,
0.01499791070818901,
-0.07380807399749756,
0.018177030608057976,
-0.02100498043000698,
0.3054755926132202,
0.11779654026031494,
-0.10667828470468521,
-0.056704819202423096,
0.005897132679820061,
-0.09969832748174667,
-0.07361586391925812,
0.04042741656303406,
0.07609083503484726,
-0.13797955214977264,
0.0065444037318229675,
-0.024485204368829727,
0.07357222586870193,
-0.020724531263113022,
0.014380071312189102,
0.026249904185533524,
0.0347006618976593,
-0.03713095560669899,
0.010879261419177055,
-0.17780880630016327,
0.1959543228149414,
0.005617575254291296,
0.023766063153743744,
-0.05616835504770279,
0.03280968219041824,
0.004910468123853207,
-0.03777763992547989,
0.0635148212313652,
0.023336822167038918,
-0.035946253687143326,
-0.053913913667201996,
-0.053233880549669266,
0.01356915757060051,
0.08329177647829056,
-0.048166099935770035,
0.11041668802499771,
-0.005117672495543957,
0.04439637064933777,
0.017832856625318527,
0.09205453842878342,
-0.18097461760044098,
-0.08765973895788193,
0.030954156070947647,
-0.0555075965821743,
-0.09729978442192078,
-0.07924477010965347,
-0.0920434519648552,
0.0043852850794792175,
0.24162974953651428,
-0.12772370874881744,
-0.07444848865270615,
-0.0921405628323555,
0.02815912291407585,
0.1104634627699852,
-0.0511796809732914,
0.029083218425512314,
-0.005719840060919523,
0.1269696056842804,
-0.06705573201179504,
-0.12932327389717102,
0.01889459230005741,
-0.08788786083459854,
-0.1697867512702942,
-0.06620903313159943,
0.11706198751926422,
0.060612134635448456,
0.03505357727408409,
-0.02469446510076523,
0.026373418048024178,
0.036151718348264694,
-0.03867006674408913,
-0.0018195981392636895,
0.07559503614902496,
0.09863869845867157,
0.03339635208249092,
-0.11124563217163086,
0.02036941982805729,
-0.06619706749916077,
-0.06617894023656845,
0.07996398210525513,
0.2639869153499603,
-0.05827053636312485,
0.12404759973287582,
0.11761678010225296,
-0.0820346400141716,
-0.1571384221315384,
0.02917824499309063,
0.09399633854627609,
-0.015080931596457958,
0.006885936949402094,
-0.15269619226455688,
0.08873002976179123,
0.10953778773546219,
-0.02216615155339241,
0.008154680952429771,
-0.19459500908851624,
-0.13224110007286072,
0.06654229015111923,
0.09706998616456985,
0.2692078948020935,
-0.05769189074635506,
-0.042916037142276764,
0.014931060373783112,
-0.08385655283927917,
0.024370893836021423,
0.12464600801467896,
0.06579647958278656,
-0.02269514463841915,
-0.07229173183441162,
0.014675810001790524,
-0.041482556611299515,
0.09428171813488007,
0.05674174427986145,
0.055720292031764984,
-0.005652763415127993,
0.009150470606982708,
-0.01007678173482418,
-0.045078057795763016,
0.06348375976085663,
0.02680766023695469,
0.04807959496974945,
-0.07942251116037369,
-0.028914330527186394,
-0.068477563560009,
0.02744894102215767,
-0.02399645373225212,
-0.07783591747283936,
-0.06280450522899628,
0.07664702832698822,
0.0497741624712944,
-0.02712297812104225,
0.011156775057315826,
0.03406849876046181,
0.111319400370121,
0.15826015174388885,
-0.0025447665248066187,
-0.0415993295609951,
-0.053587283939123154,
-0.03511557728052139,
-0.018661657348275185,
0.07402212917804718,
-0.041647572070360184,
0.02369084395468235,
0.06643937528133392,
0.02396051213145256,
0.09669160842895508,
0.05720941349864006,
-0.11609943956136703,
-0.016553664579987526,
0.03165418282151222,
-0.1612895131111145,
0.01115711871534586,
0.004114581737667322,
0.024147575721144676,
-0.037702951580286026,
0.02908897027373314,
0.14670483767986298,
-0.06580711901187897,
-0.03735077753663063,
-0.0440494567155838,
0.06864079087972641,
0.021775634959340096,
0.1446058303117752,
0.03387919440865517,
0.03766671195626259,
-0.08086826652288437,
0.12802879512310028,
0.040828920900821686,
-0.04273904487490654,
0.022396475076675415,
-0.027542952448129654,
-0.10892056673765182,
0.014293436892330647,
0.06436480581760406,
0.03939494118094444,
-0.04001588001847267,
-0.009731161408126354,
-0.02263640984892845,
-0.07285913825035095,
0.059107355773448944,
0.19026294350624084,
0.06461547315120697,
0.07501386106014252,
-0.056046634912490845,
-0.03640998154878616,
-0.08155077695846558,
0.04644329845905304,
0.04739273339509964,
0.07367340475320816,
-0.07617116719484329,
0.11535342782735825,
0.009245065040886402,
0.04347885400056839,
-0.030497051775455475,
-0.0521400161087513,
-0.09765798598527908,
-0.05485916882753372,
-0.09338786453008652,
0.012392217293381691,
-0.0728439912199974,
-0.04150042682886124,
0.00011266952787991613,
-0.006749399472028017,
-0.006471221335232258,
0.04498017951846123,
-0.061767011880874634,
-0.010728423483669758,
-0.02786305360496044,
0.034670766443014145,
-0.06400501728057861,
-0.040176019072532654,
0.03291960433125496,
-0.09990149736404419,
0.09377136826515198,
0.05042007565498352,
0.006851645186543465,
0.008004375733435154,
0.0922912061214447,
-0.019053982570767403,
0.025194630026817322,
0.01658046618103981,
-0.048940010368824005,
-0.08584519475698471,
0.002062571933493018,
-0.010491441935300827,
-0.012466967105865479,
-0.009111262857913971,
0.08961968123912811,
-0.08554231375455856,
0.0278584323823452,
-0.006557772401720285,
-0.008035888895392418,
-0.07382350414991379,
-0.011521642096340656,
0.09686727821826935,
0.09757554531097412,
0.049373652786016464,
-0.08878771215677261,
0.014536530710756779,
-0.13858149945735931,
-0.03638101741671562,
0.005137848202139139,
-0.010457097552716732,
-0.1211818978190422,
-0.00897534005343914,
0.021539947018027306,
-0.0029229936189949512,
0.21008016169071198,
-0.05586159601807594,
-0.02220616303384304,
0.016078438609838486,
-0.0918656587600708,
0.10748868435621262,
-0.02017536759376526,
0.18849633634090424,
-0.004799104295670986,
-0.044216547161340714,
-0.02298819087445736,
0.03821522369980812,
0.018180349841713905,
-0.02536393143236637,
0.18216513097286224,
0.13705821335315704,
0.03660649433732033,
0.042715348303318024,
-0.023073602467775345,
0.0036425869911909103,
-0.04657673463225365,
-0.02548220194876194,
0.027486510574817657,
0.037433117628097534,
0.015907108783721924,
0.16148854792118073,
0.06466465443372726,
-0.16657082736492157,
0.03380556032061577,
-0.02444961853325367,
-0.04061044007539749,
-0.11698034405708313,
-0.10970013588666916,
-0.03439168259501457,
-0.06398622691631317,
0.008831866085529327,
-0.12197230011224747,
0.00763707933947444,
0.17351724207401276,
0.05516623333096504,
0.024993781000375748,
0.004565698094666004,
-0.13058660924434662,
-0.035787951201200485,
0.05435021221637726,
0.01658792421221733,
0.02457267977297306,
0.05978124961256981,
-0.0015414513181895018,
0.06160372123122215,
0.0409909151494503,
0.016426855698227882,
0.0031057328451424837,
0.07553189992904663,
0.01413180585950613,
0.03867867588996887,
-0.05983191728591919,
-0.004810686223208904,
-0.04367581009864807,
0.0719289481639862,
0.09364315122365952,
0.04932563006877899,
-0.051554176956415176,
-0.007016051094979048,
0.15831418335437775,
-0.042695607990026474,
-0.008557142689824104,
-0.12533581256866455,
0.325663685798645,
0.012299086898565292,
0.012273737229406834,
0.048470985144376755,
-0.07788923382759094,
-0.050362903624773026,
0.20361655950546265,
0.08828641474246979,
-0.014314132742583752,
-0.022298380732536316,
0.0006957690929993987,
-0.031123319640755653,
-0.023437142372131348,
0.1511259824037552,
0.030721154063940048,
0.12459338456392288,
-0.05441174656152725,
-0.04556744173169136,
-0.02545965649187565,
-0.009348614141345024,
-0.12528540194034576,
0.13979510962963104,
-0.031701959669589996,
-0.024420244619250298,
-0.0766938328742981,
0.025757011026144028,
0.07241809368133545,
-0.317104309797287,
-0.001060166978277266,
-0.03390230983495712,
-0.10848557949066162,
-0.003201826475560665,
-0.019422976300120354,
-0.02147109992802143,
0.046341653913259506,
-0.04359250143170357,
0.07382827252149582,
0.03983752056956291,
0.03509107604622841,
-0.025015654042363167,
-0.09105478972196579,
0.16285374760627747,
0.04791358858346939,
0.09575603157281876,
0.02715154178440571,
0.07942739129066467,
0.057142481207847595,
0.0341355986893177,
-0.0972001925110817,
0.04575822874903679,
0.013358409516513348,
-0.08067737519741058,
-0.05021636560559273,
0.1250990629196167,
-0.002559298649430275,
0.033995747566223145,
0.04355080425739288,
-0.10619750618934631,
0.015062741935253143,
0.06958448141813278,
-0.06926573067903519,
-0.10335493832826614,
-0.00649851281195879,
-0.09035045653581619,
0.1576293557882309,
0.1382625550031662,
-0.01899081841111183,
0.02500871568918228,
-0.0677984431385994,
-0.01187045406550169,
0.051622457802295685,
0.004280701745301485,
-0.01804620958864689,
-0.18928951025009155,
0.029307352378964424,
-0.08558154106140137,
-0.002952994080260396,
-0.2226172834634781,
-0.0998172014951706,
-0.012039242312312126,
-0.051008351147174835,
-0.026762962341308594,
0.06013770401477814,
0.027632903307676315,
0.0656881034374237,
-0.017250658944249153,
-0.04982823505997658,
-0.028715647757053375,
0.08983929455280304,
-0.11172950267791748,
-0.06601288914680481
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters to
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_100k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_100k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
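As a complementary sketch (again illustrative, not part of the official documentation), the hidden states returned by `BertModel` can be reduced to one [CLS] vector per sentence, which is convenient when comparing representations across intermediate checkpoints; the example sentences and the choice of [CLS] pooling are assumptions made here, not recommendations from the MultiBERTs authors.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_100k')
model = BertModel.from_pretrained('google/multiberts-seed_2-step_100k')
model.eval()

sentences = ["The cat sat on the mat.", "A dog slept on the rug."]
encoded_input = tokenizer(sentences, padding=True, return_tensors='pt')
with torch.no_grad():
    outputs = model(**encoded_input)

# [CLS] hidden state for each sentence: shape (2, 768) for a BERT-base checkpoint.
cls_vectors = outputs.last_hidden_state[:, 0, :]
similarity = torch.nn.functional.cosine_similarity(cls_vectors[0], cls_vectors[1], dim=0)
print(cls_vectors.shape, similarity.item())
```
The same code applies unchanged to any other MultiBERTs seed or step, so representations from, say, step 100k and step 1000k can be compared directly.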
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_100k"]}
| null |
google/multiberts-seed_2-step_100k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_100k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 100k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07702784240245819,
0.09578148275613785,
-0.002536453539505601,
0.038758281618356705,
0.07688327878713608,
-0.01665440760552883,
0.08024440705776215,
0.1023748368024826,
-0.011977693997323513,
0.028720363974571228,
0.07899858802556992,
0.009609922766685486,
0.01949424110352993,
0.10027990490198135,
0.024493450298905373,
-0.2244979292154312,
0.022837679833173752,
-0.02989227883517742,
-0.0836944580078125,
0.07726336270570755,
0.09891875088214874,
-0.08211538940668106,
0.04463068023324013,
0.02647043950855732,
-0.10965552926063538,
0.04424791783094406,
0.002467285841703415,
-0.01965426094830036,
0.1337839961051941,
-0.006251251790672541,
0.05136626958847046,
0.05192679539322853,
0.046960942447185516,
-0.13662739098072052,
0.006637124344706535,
0.05939555913209915,
0.05979971960186958,
0.04602881520986557,
0.019116617739200592,
0.07676688581705093,
0.0016958158230409026,
0.025833697989583015,
0.04540431126952171,
0.024395180866122246,
-0.0760088562965393,
-0.06609435379505157,
-0.10477282851934433,
0.038631707429885864,
0.028998222202062607,
0.003761882893741131,
0.009763870388269424,
0.12695978581905365,
-0.0342131070792675,
0.043160244822502136,
0.18636278808116913,
-0.339392751455307,
-0.011886184103786945,
0.07586515694856644,
0.047471921890974045,
0.1247178241610527,
-0.006695257965475321,
-0.017844360321760178,
0.07447511702775955,
0.02428719587624073,
0.09098639339208603,
-0.04016753286123276,
0.029087482020258904,
-0.05301329120993614,
-0.15605992078781128,
-0.048011183738708496,
0.0867268443107605,
-0.001352193532511592,
-0.13908874988555908,
-0.03857491910457611,
-0.04599754884839058,
0.031113509088754654,
0.010631616227328777,
-0.03797216713428497,
0.034546609967947006,
0.016720732674002647,
-0.023503737524151802,
-0.012797853909432888,
-0.1046847552061081,
-0.05548471584916115,
0.03127679228782654,
0.08188025653362274,
0.10087835788726807,
0.0677303671836853,
0.0031430055387318134,
0.11336694657802582,
-0.18821020424365997,
-0.050981055945158005,
-0.02995149791240692,
-0.0529542937874794,
-0.04983678087592125,
-0.009284699335694313,
-0.10903767496347427,
-0.04373106732964516,
0.010160598903894424,
0.13399572670459747,
0.001509532448835671,
0.024427516385912895,
-0.03050926886498928,
0.010656749829649925,
0.058110859245061874,
0.04892854765057564,
-0.006869029253721237,
0.019182860851287842,
0.0213614609092474,
-0.013169575482606888,
-0.01750238798558712,
0.013823539949953556,
-0.0011082676937803626,
0.02611285261809826,
0.12195459753274918,
0.02494974620640278,
-0.10417736321687698,
0.06732530891895294,
-0.019617939367890358,
-0.04421687871217728,
0.00934198684990406,
-0.08951466530561447,
-0.05548093840479851,
-0.039423637092113495,
0.0016814240952953696,
0.012985145673155785,
-0.007413322571665049,
-0.004190725274384022,
-0.02326604165136814,
-0.03615723177790642,
-0.08249791711568832,
-0.045013971626758575,
-0.05416754633188248,
-0.12854701280593872,
0.008287017233669758,
-0.18761010468006134,
-0.03758423775434494,
-0.11306890100240707,
-0.18484243750572205,
-0.022939398884773254,
0.0630125105381012,
-0.012262080796062946,
-0.055553752928972244,
0.07675781846046448,
0.04065248742699623,
-0.029575388878583908,
-0.0016122079687193036,
0.07319288700819016,
-0.001718533574603498,
0.04578211531043053,
-0.03029721975326538,
0.07154594361782074,
0.005275860894471407,
0.0323125384747982,
-0.05648629739880562,
0.06339000910520554,
-0.1714305281639099,
0.04569096118211746,
-0.0711560845375061,
-0.03133776783943176,
-0.08748126775026321,
-0.03352499380707741,
-0.013785485178232193,
0.004086650442332029,
0.023032478988170624,
0.07483848184347153,
-0.18813671171665192,
-0.029791254550218582,
0.13126534223556519,
-0.16024066507816315,
-0.022636760026216507,
0.07361925393342972,
-0.04821394383907318,
0.10678654164075851,
0.07194057106971741,
0.1556834727525711,
-0.0156193682923913,
-0.08456727862358093,
0.05379887670278549,
-0.00986870750784874,
0.0135145029053092,
-0.009160543791949749,
0.06942874938249588,
-0.02204263210296631,
-0.14838437736034393,
0.033691950142383575,
-0.12894733250141144,
-0.004078148398548365,
-0.07788726687431335,
0.017014145851135254,
-0.011025339365005493,
-0.06729064881801605,
-0.06330685317516327,
-0.02739797532558441,
0.06725308299064636,
-0.07348696887493134,
-0.0183884110301733,
0.036289069801568985,
0.07116725295782089,
-0.07260283827781677,
0.06783266365528107,
-0.014463653787970543,
0.017003724351525307,
-0.08619657158851624,
-0.03873959183692932,
-0.18875540792942047,
0.051240600645542145,
0.10351207107305527,
0.0074829659424722195,
-0.020826734602451324,
0.14455193281173706,
0.008956491947174072,
0.066135473549366,
-0.048533808439970016,
0.012491799890995026,
-0.015145398676395416,
-0.004393667448312044,
-0.09267721325159073,
-0.09433799237012863,
-0.07631152868270874,
-0.06913168728351593,
0.08455196768045425,
-0.12222662568092346,
0.021577076986432076,
-0.05594255402684212,
0.04386374354362488,
0.022570116445422173,
-0.08438269793987274,
-0.018379544839262962,
0.013118102215230465,
-0.05924806371331215,
-0.057275399565696716,
0.03995572403073311,
0.07032442092895508,
-0.013125081546604633,
0.09241272509098053,
-0.04877845197916031,
-0.08741820603609085,
0.032273925840854645,
0.0984565019607544,
-0.10667479038238525,
0.008687746711075306,
-0.0570933036506176,
-0.040690526366233826,
-0.06540229916572571,
-0.019033266231417656,
0.0790102556347847,
-0.003413013881072402,
0.13704043626785278,
-0.07510611414909363,
-0.006899894680827856,
0.01426329929381609,
-0.02214006334543228,
-0.019307279959321022,
0.034023962914943695,
0.06487687677145004,
-0.0734109953045845,
0.01510873343795538,
0.0413326621055603,
0.011137102730572224,
0.06908684223890305,
-0.055166181176900864,
-0.09078865498304367,
0.010062821209430695,
0.03562793508172035,
0.02881406992673874,
0.06897564232349396,
-0.024527980014681816,
-0.013983099721372128,
0.033244580030441284,
0.01544156577438116,
0.0023730185348540545,
-0.11764618754386902,
0.06174015998840332,
0.05615751072764397,
0.001529538189060986,
0.0637725293636322,
-0.016545221209526062,
-0.039963942021131516,
0.07892413437366486,
0.037633299827575684,
0.002675475552678108,
-0.01621948927640915,
-0.015779240056872368,
-0.11626015603542328,
0.18922431766986847,
-0.05928364396095276,
-0.1581750512123108,
-0.0765548050403595,
-0.10137107223272324,
0.0068941256031394005,
0.02805515006184578,
0.03776152804493904,
-0.01839861087501049,
-0.04231855645775795,
-0.12471399456262589,
0.062116824090480804,
-0.03846757113933563,
0.0682247206568718,
0.10952997952699661,
-0.0400589220225811,
0.051369160413742065,
-0.12663350999355316,
-0.008161901496350765,
-0.08328595757484436,
-0.07751039415597916,
0.05996488407254219,
-0.048423655331134796,
0.025168444961309433,
0.09930419921875,
0.02567889168858528,
-0.016854435205459595,
-0.027007028460502625,
0.2036338895559311,
0.03897995874285698,
0.042155008763074875,
0.1267535388469696,
-0.06342373043298721,
0.056797537952661514,
0.08351407945156097,
0.010109269991517067,
-0.04419310390949249,
0.05101030692458153,
0.0453631691634655,
-0.0679066851735115,
-0.1926717311143875,
-0.021375782787799835,
-0.005736841820180416,
-0.04516296833753586,
0.07508590817451477,
0.03624684363603592,
0.007540999446064234,
0.07279881834983826,
0.011460153385996819,
0.06069871410727501,
-0.003456482198089361,
0.09983514994382858,
0.011122295632958412,
-0.035118963569402695,
0.08791255205869675,
-0.019818855449557304,
-0.01309687178581953,
0.0825190320611,
-0.01822793297469616,
0.2911055088043213,
-0.02880067378282547,
0.014877659268677235,
0.11996471136808395,
0.04303237795829773,
0.062359198927879333,
0.12771201133728027,
-0.06488923728466034,
0.022595001384615898,
-0.07404374331235886,
-0.0593787357211113,
-0.003372283885255456,
0.04594289883971214,
-0.056556884199380875,
0.010612069629132748,
-0.07095622271299362,
0.012921001762151718,
-0.019813649356365204,
0.31546279788017273,
0.1149597018957138,
-0.10254978388547897,
-0.057359110563993454,
0.005752714350819588,
-0.10104375332593918,
-0.07224462181329727,
0.04089609533548355,
0.0733005702495575,
-0.13808122277259827,
0.005211371462792158,
-0.02750551328063011,
0.07406945526599884,
-0.018980596214532852,
0.016550064086914062,
0.025129619985818863,
0.03268178924918175,
-0.03673000633716583,
0.008838141337037086,
-0.1811361312866211,
0.19366036355495453,
0.006521984003484249,
0.022163720801472664,
-0.0530567541718483,
0.032784491777420044,
0.006021323148161173,
-0.036231718957424164,
0.06408321112394333,
0.02233092300593853,
-0.030156362801790237,
-0.05194481834769249,
-0.05342221260070801,
0.01226057019084692,
0.08088556677103043,
-0.047795794904232025,
0.10951719433069229,
-0.006980960723012686,
0.04224146902561188,
0.01888854056596756,
0.09022829681634903,
-0.179479718208313,
-0.08818504959344864,
0.029925134032964706,
-0.0585891492664814,
-0.09556996822357178,
-0.07986882328987122,
-0.09282789379358292,
0.0045234221033751965,
0.24606004357337952,
-0.1239967867732048,
-0.07434245198965073,
-0.09307625889778137,
0.02652195654809475,
0.1074155941605568,
-0.0490557998418808,
0.028077412396669388,
-0.006589669734239578,
0.130996435880661,
-0.06678244471549988,
-0.13103026151657104,
0.021082386374473572,
-0.08987078815698624,
-0.16753827035427094,
-0.06621504575014114,
0.11814293265342712,
0.06051039323210716,
0.03550704941153526,
-0.02490970864892006,
0.02570716291666031,
0.03434877470135689,
-0.037644073367118835,
0.0015012947842478752,
0.07368700206279755,
0.09932234138250351,
0.03276708349585533,
-0.11292006075382233,
0.025770965963602066,
-0.06580895185470581,
-0.0652613714337349,
0.08043953031301498,
0.26319098472595215,
-0.05863916501402855,
0.12775073945522308,
0.11602722108364105,
-0.07988878339529037,
-0.15306246280670166,
0.023977229371666908,
0.09315888583660126,
-0.015492872335016727,
0.014446900226175785,
-0.15909942984580994,
0.0875430554151535,
0.11187699437141418,
-0.023848872631788254,
0.00903010368347168,
-0.1935054361820221,
-0.12922537326812744,
0.06957180798053741,
0.09729209542274475,
0.27359920740127563,
-0.06144348531961441,
-0.044773414731025696,
0.01594344899058342,
-0.08752718567848206,
0.021181004121899605,
0.1212238147854805,
0.06595625728368759,
-0.023370685055851936,
-0.07549178600311279,
0.015599232167005539,
-0.04031556844711304,
0.09340395033359528,
0.05205921456217766,
0.05594806373119354,
-0.0047282325103878975,
0.014099475927650928,
-0.014895858243107796,
-0.04464179649949074,
0.06259438395500183,
0.022359196096658707,
0.046509046107530594,
-0.07767437398433685,
-0.030186571180820465,
-0.06823023408651352,
0.026588505133986473,
-0.023504113778471947,
-0.07847993820905685,
-0.06259336322546005,
0.07598326355218887,
0.0493016242980957,
-0.024634813889861107,
0.017748087644577026,
0.03098728135228157,
0.1182456761598587,
0.16369600594043732,
-0.0036008034367114305,
-0.044592950493097305,
-0.061791256070137024,
-0.03777465596795082,
-0.018636440858244896,
0.07416999340057373,
-0.040177445858716965,
0.023820489645004272,
0.06327661871910095,
0.022730447351932526,
0.09838391840457916,
0.056922607123851776,
-0.11684007942676544,
-0.016916410997509956,
0.03103739209473133,
-0.160667285323143,
0.00875050388276577,
0.002753215841948986,
0.026256399229168892,
-0.03716157749295235,
0.029942572116851807,
0.15045030415058136,
-0.06306051462888718,
-0.03732219338417053,
-0.04266729950904846,
0.06890390068292618,
0.02235962636768818,
0.13953794538974762,
0.033288512378931046,
0.037255991250276566,
-0.0807945728302002,
0.12344017624855042,
0.03904689475893974,
-0.036680880934000015,
0.02293327823281288,
-0.029888302087783813,
-0.10701696574687958,
0.013226659037172794,
0.05961131304502487,
0.040818870067596436,
-0.04727364704012871,
-0.007918440736830235,
-0.025492016226053238,
-0.07299074530601501,
0.05922398716211319,
0.18971501290798187,
0.06657551974058151,
0.07692928612232208,
-0.05705534294247627,
-0.03590739145874977,
-0.07731673866510391,
0.043481338769197464,
0.04291972517967224,
0.0731884315609932,
-0.07631443440914154,
0.10902867466211319,
0.010204625315964222,
0.04527478665113449,
-0.031663794070482254,
-0.0549793504178524,
-0.09891998767852783,
-0.05375416576862335,
-0.10177713632583618,
0.010123137384653091,
-0.07356490939855576,
-0.04203205555677414,
0.000913019641302526,
-0.005996648222208023,
-0.005643333308398724,
0.04535818099975586,
-0.06257612258195877,
-0.008767875842750072,
-0.027779463678598404,
0.03321583569049835,
-0.0670899897813797,
-0.03745749592781067,
0.03143441677093506,
-0.10268215835094452,
0.09297654032707214,
0.05086272954940796,
0.006844338960945606,
0.00910408329218626,
0.08593928813934326,
-0.018520433455705643,
0.025543713942170143,
0.013936946168541908,
-0.04861805588006973,
-0.0825028270483017,
0.000911711307708174,
-0.009983320720493793,
-0.013792500831186771,
-0.009738842025399208,
0.08939367532730103,
-0.08678915351629257,
0.028717318549752235,
-0.006344244349747896,
-0.009476285427808762,
-0.07459262758493423,
-0.011724581941962242,
0.09586858004331589,
0.0997597798705101,
0.047856688499450684,
-0.08862250298261642,
0.01407656166702509,
-0.14199230074882507,
-0.03571633622050285,
0.005771111696958542,
-0.007893168367445469,
-0.12171214073896408,
-0.011549030430614948,
0.02012043073773384,
-0.0008804644457995892,
0.21401777863502502,
-0.05447617545723915,
-0.021195711567997932,
0.016810035333037376,
-0.09977181255817413,
0.11372529715299606,
-0.023069072514772415,
0.1865299940109253,
-0.004294139798730612,
-0.042996618896722794,
-0.02048439532518387,
0.0373971089720726,
0.018637962639331818,
-0.026555407792329788,
0.17945586144924164,
0.13766589760780334,
0.03200032189488411,
0.042559392750263214,
-0.02506115287542343,
0.0017539375694468617,
-0.047807905822992325,
-0.02707165665924549,
0.030121853575110435,
0.03658978268504143,
0.017109543085098267,
0.16752266883850098,
0.06797710806131363,
-0.16778363287448883,
0.03493363410234451,
-0.02437083050608635,
-0.03804025799036026,
-0.11788643896579742,
-0.1006370410323143,
-0.03406890481710434,
-0.07107094675302505,
0.008704226464033127,
-0.12238916754722595,
0.007970131933689117,
0.17627838253974915,
0.05616946145892143,
0.025969428941607475,
0.0028169825673103333,
-0.12342721968889236,
-0.03479475528001785,
0.05450090393424034,
0.01529393158853054,
0.02381504885852337,
0.05680251121520996,
-0.0006569577381014824,
0.06086474284529686,
0.04130255803465843,
0.016710184514522552,
0.00345623679459095,
0.07786279916763306,
0.015722213312983513,
0.038699544966220856,
-0.061745624989271164,
-0.003346347017213702,
-0.04081180691719055,
0.07025519758462906,
0.09912315011024475,
0.05026202276349068,
-0.05052530765533447,
-0.007993629202246666,
0.16116543114185333,
-0.043591637164354324,
-0.004704220220446587,
-0.1269124299287796,
0.32601264119148254,
0.01337212324142456,
0.013194172643125057,
0.045365285128355026,
-0.07697760313749313,
-0.05028645694255829,
0.20333543419837952,
0.08815539628267288,
-0.016874393448233604,
-0.020851826295256615,
0.0020603889133781195,
-0.03060820885002613,
-0.0228022038936615,
0.14873892068862915,
0.03344216197729111,
0.13176968693733215,
-0.05579013004899025,
-0.044502854347229004,
-0.025741998106241226,
-0.011732674203813076,
-0.12768512964248657,
0.13585224747657776,
-0.030702853575348854,
-0.023794027045369148,
-0.07514481991529465,
0.026901619508862495,
0.07379821687936783,
-0.31858786940574646,
0.00202720589004457,
-0.03176325559616089,
-0.10661451518535614,
-0.0027802451513707638,
-0.021059082821011543,
-0.02175263874232769,
0.04766726493835449,
-0.04443707689642906,
0.07070451974868774,
0.04433762654662132,
0.034283991903066635,
-0.02465055324137211,
-0.09172596782445908,
0.16624897718429565,
0.04440682381391525,
0.09596510231494904,
0.026433318853378296,
0.0764269083738327,
0.056286782026290894,
0.033612821251153946,
-0.09231539070606232,
0.04638255387544632,
0.013234252110123634,
-0.08572599291801453,
-0.050107236951589584,
0.12330389022827148,
-0.001484113628976047,
0.037215620279312134,
0.04303215816617012,
-0.10673921555280685,
0.015348565764725208,
0.06969532370567322,
-0.070636086165905,
-0.10163018107414246,
-0.007243359927088022,
-0.0907556563615799,
0.1565750390291214,
0.1413099318742752,
-0.01813158392906189,
0.023521514609456062,
-0.06779041886329651,
-0.009501523338258266,
0.052942439913749695,
0.0018841285491362214,
-0.018715910613536835,
-0.1881612092256546,
0.03215664625167847,
-0.0753231942653656,
-0.005076699424535036,
-0.22681370377540588,
-0.10144708305597305,
-0.009582765400409698,
-0.04850508272647858,
-0.026365071535110474,
0.057489942759275436,
0.027922920882701874,
0.06660283356904984,
-0.016353430226445198,
-0.03983660042285919,
-0.027655713260173798,
0.08881735056638718,
-0.10948269814252853,
-0.06349145621061325
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1100k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1100k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
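Beyond extracting hidden states, the checkpoint can also be queried through the masked-language-modelling head it was pre-trained with. The snippet below is a minimal sketch rather than part of the official instructions: it assumes the standard `BertForMaskedLM` class from `transformers`, and the example sentence and top-5 readout are arbitrary illustrations.
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

checkpoint = "google/multiberts-seed_2-step_1100k"
tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertForMaskedLM.from_pretrained(checkpoint)
model.eval()

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Position of the [MASK] token in the tokenized input.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos].topk(5, dim=-1).indices[0].tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))  # candidate fillers for [MASK]
```
Because this is an intermediate checkpoint, the predictions may be noticeably weaker than those of the fully trained model.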
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1100k"]}
| null |
google/multiberts-seed_2-step_1100k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1100k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1100k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08027459681034088,
0.09676878899335861,
-0.002426272490993142,
0.045200347900390625,
0.08107591420412064,
-0.014471948146820068,
0.0778362974524498,
0.1013098806142807,
-0.012889095582067966,
0.02891130931675434,
0.07847516983747482,
0.0057355319149792194,
0.017852479591965675,
0.10253380239009857,
0.020276926457881927,
-0.22489069402217865,
0.021320996806025505,
-0.030323296785354614,
-0.07854624092578888,
0.07568187266588211,
0.09921940416097641,
-0.08069366216659546,
0.04453594237565994,
0.02775735780596733,
-0.10959397256374359,
0.04556269198656082,
0.0015368261374533176,
-0.019581802189350128,
0.1334170401096344,
-0.005112052895128727,
0.052920322865247726,
0.051904864609241486,
0.047730378806591034,
-0.13348478078842163,
0.006304224953055382,
0.05843072757124901,
0.060505229979753494,
0.047077614814043045,
0.021592525765299797,
0.07811814546585083,
0.005074052605777979,
0.02863016538321972,
0.04525816813111305,
0.022511348128318787,
-0.0755605474114418,
-0.06574255973100662,
-0.1070561408996582,
0.04309013858437538,
0.029639434069395065,
0.00166902388446033,
0.011436388827860355,
0.12676875293254852,
-0.030083874240517616,
0.04368919879198074,
0.18699252605438232,
-0.33797383308410645,
-0.01251925714313984,
0.06599539518356323,
0.044081415981054306,
0.12575623393058777,
-0.004127526190131903,
-0.020738329738378525,
0.07825656980276108,
0.024618009105324745,
0.0864703357219696,
-0.03943836688995361,
0.01796576753258705,
-0.054713696241378784,
-0.15697412192821503,
-0.04592681676149368,
0.09155336767435074,
-0.0025502927601337433,
-0.13792866468429565,
-0.03242875635623932,
-0.046974606812000275,
0.029831845313310623,
0.013156154192984104,
-0.03824677690863609,
0.03362109139561653,
0.014661608263850212,
-0.011296378448605537,
-0.012637398205697536,
-0.10443256795406342,
-0.053419169038534164,
0.028735002502799034,
0.08155408501625061,
0.1020650640130043,
0.06885281950235367,
0.0011534574441611767,
0.1147451177239418,
-0.18849582970142365,
-0.051332324743270874,
-0.028253523632884026,
-0.0531919002532959,
-0.04739901050925255,
-0.010933155193924904,
-0.10496602207422256,
-0.03738047555088997,
0.005235951393842697,
0.12755531072616577,
0.00213825237005949,
0.02473316714167595,
-0.03336520865559578,
0.010193675756454468,
0.055678147822618484,
0.04484435170888901,
-0.007036264985799789,
0.017493823543190956,
0.024400506168603897,
-0.010696585290133953,
-0.018335910513997078,
0.014394856989383698,
0.0013166081625968218,
0.023849861696362495,
0.12083148211240768,
0.025915391743183136,
-0.09983405470848083,
0.06681590527296066,
-0.020942220464348793,
-0.042998306453228,
0.017939312383532524,
-0.0910438746213913,
-0.05577667057514191,
-0.03977613523602486,
-0.002672378672286868,
0.01384343858808279,
-0.0007701978902332485,
-0.005063763819634914,
-0.025413036346435547,
-0.038306377828121185,
-0.08516309410333633,
-0.044724296778440475,
-0.052885547280311584,
-0.12925085425376892,
0.008262951858341694,
-0.1825643926858902,
-0.03854157403111458,
-0.11245611310005188,
-0.18985985219478607,
-0.023229755461215973,
0.06388933211565018,
-0.009623522870242596,
-0.055429570376873016,
0.07672874629497528,
0.04139518737792969,
-0.02852160669863224,
-0.0031119915656745434,
0.07445477694272995,
-0.0011440417729318142,
0.04603833332657814,
-0.02982977032661438,
0.0699242502450943,
-0.00017508961900603026,
0.034008368849754333,
-0.056004591286182404,
0.0635838508605957,
-0.17481876909732819,
0.04251762107014656,
-0.07253767549991608,
-0.029759619385004044,
-0.0865439772605896,
-0.03422285616397858,
-0.008861018344759941,
0.002588663948699832,
0.024206455796957016,
0.07555197179317474,
-0.18112139403820038,
-0.02616468444466591,
0.12246835976839066,
-0.16359896957874298,
-0.022455964237451553,
0.07299789786338806,
-0.04802945256233215,
0.10601399093866348,
0.07239842414855957,
0.1535925716161728,
-0.008809447288513184,
-0.08188668638467789,
0.05294239521026611,
-0.012147326022386551,
0.010470663197338581,
-0.01139956247061491,
0.07026379555463791,
-0.02026500180363655,
-0.1525237113237381,
0.03364139422774315,
-0.1308499276638031,
-0.004763651639223099,
-0.07826223224401474,
0.017606787383556366,
-0.012107024900615215,
-0.06624379009008408,
-0.06608425080776215,
-0.024886440485715866,
0.0654420256614685,
-0.07416360080242157,
-0.0182463601231575,
0.03743568807840347,
0.07447025179862976,
-0.07169058918952942,
0.06909459829330444,
-0.01302445400506258,
0.0142067177221179,
-0.08845723420381546,
-0.04150420054793358,
-0.1891113668680191,
0.05041179805994034,
0.1007334366440773,
0.009056947194039822,
-0.02261921763420105,
0.14087392389774323,
0.0074109663255512714,
0.0676162913441658,
-0.04876460134983063,
0.013171779923141003,
-0.01265019178390503,
-0.00576181523501873,
-0.09269975125789642,
-0.09708844870328903,
-0.07746948301792145,
-0.06975392997264862,
0.08425703644752502,
-0.12671178579330444,
0.021583035588264465,
-0.05806128308176994,
0.042553067207336426,
0.022048426792025566,
-0.08325434476137161,
-0.01876751519739628,
0.012010127305984497,
-0.06132591515779495,
-0.058846332132816315,
0.04086588695645332,
0.06939394026994705,
-0.00843026302754879,
0.09265664964914322,
-0.05079949274659157,
-0.08686409890651703,
0.031728774309158325,
0.10324546694755554,
-0.10493104159832001,
0.007671406026929617,
-0.05738437920808792,
-0.0437500886619091,
-0.06374656409025192,
-0.018894340842962265,
0.08089708536863327,
-0.007386309094727039,
0.13664865493774414,
-0.07612667977809906,
-0.0031085603404790163,
0.013966834172606468,
-0.021396996453404427,
-0.021099753677845,
0.03577852621674538,
0.06568922847509384,
-0.08020748943090439,
0.014302964322268963,
0.035305291414260864,
0.011555817909538746,
0.068177230656147,
-0.05252126231789589,
-0.09069815278053284,
0.012753511779010296,
0.037662193179130554,
0.030544424429535866,
0.06903308629989624,
-0.025676032528281212,
-0.012896325439214706,
0.033787209540605545,
0.01700516603887081,
0.006624962668865919,
-0.11967049539089203,
0.06201273947954178,
0.05492379888892174,
0.004076791927218437,
0.05778290331363678,
-0.018925240263342857,
-0.037724487483501434,
0.08205250650644302,
0.034078389406204224,
0.001773231546394527,
-0.016704456880688667,
-0.014718735590577126,
-0.11821957677602768,
0.18912065029144287,
-0.06099246069788933,
-0.15428824722766876,
-0.07710906118154526,
-0.10447339713573456,
0.009360354393720627,
0.026869768276810646,
0.03901265934109688,
-0.01631839945912361,
-0.04432714357972145,
-0.1246199905872345,
0.0630514845252037,
-0.03582600876688957,
0.069423146545887,
0.11279431730508804,
-0.041221436113119125,
0.050606634467840195,
-0.12792377173900604,
-0.007179345935583115,
-0.08240427076816559,
-0.07415015995502472,
0.05737623944878578,
-0.05010725185275078,
0.02476727031171322,
0.10228738933801651,
0.022566646337509155,
-0.018849801272153854,
-0.026484480127692223,
0.19442953169345856,
0.04144512489438057,
0.04286959022283554,
0.12677635252475739,
-0.06578654050827026,
0.05832237750291824,
0.08416811376810074,
0.007444657385349274,
-0.044529449194669724,
0.05345921218395233,
0.040863361209630966,
-0.06759478151798248,
-0.1988062709569931,
-0.02262037806212902,
-0.0056853522546589375,
-0.041987884789705276,
0.07601459324359894,
0.034844327718019485,
-0.0049264803528785706,
0.07290054112672806,
0.010373580269515514,
0.06279580295085907,
-0.0031670311000198126,
0.09756749123334885,
0.008361090905964375,
-0.03434288874268532,
0.08417794108390808,
-0.02245110273361206,
-0.007934315130114555,
0.08500310033559799,
-0.017305878922343254,
0.29752233624458313,
-0.03590502217411995,
0.011122885160148144,
0.11963603645563126,
0.04192456230521202,
0.059252943843603134,
0.131073996424675,
-0.06668216735124588,
0.023771386593580246,
-0.07321152091026306,
-0.05869090184569359,
-0.0016374639235436916,
0.04684225097298622,
-0.05832056328654289,
0.015591900795698166,
-0.0739678367972374,
0.022282902151346207,
-0.021360283717513084,
0.30724334716796875,
0.11221328377723694,
-0.10683548450469971,
-0.05627492442727089,
0.00551895285025239,
-0.10072005540132523,
-0.07473140209913254,
0.04156443476676941,
0.07635506987571716,
-0.1345735639333725,
0.006805785465985537,
-0.02635543793439865,
0.07312154769897461,
-0.01706702448427677,
0.013719681650400162,
0.0283686351031065,
0.03503541275858879,
-0.037267204374074936,
0.00842675007879734,
-0.17782263457775116,
0.19803598523139954,
0.006350592710077763,
0.02200525626540184,
-0.05347263067960739,
0.034731265157461166,
0.010287807323038578,
-0.03485281765460968,
0.06427428126335144,
0.022166511043906212,
-0.02555408887565136,
-0.04601847752928734,
-0.05393580347299576,
0.01256984006613493,
0.07812071591615677,
-0.04201144352555275,
0.10782904922962189,
-0.00586716877296567,
0.04353480413556099,
0.01933135651051998,
0.0867135301232338,
-0.18121196329593658,
-0.08794853836297989,
0.031218426302075386,
-0.05883988365530968,
-0.10418243706226349,
-0.08015867322683334,
-0.0960748940706253,
-0.00029559890390373766,
0.244480699300766,
-0.11576341092586517,
-0.07464803755283356,
-0.09264373779296875,
0.027120530605316162,
0.10738478600978851,
-0.050133880227804184,
0.02930229716002941,
-0.004461813718080521,
0.12615641951560974,
-0.06783502548933029,
-0.12942248582839966,
0.023870863020420074,
-0.08809192478656769,
-0.1666766256093979,
-0.06885684281587601,
0.11499155312776566,
0.06100822985172272,
0.03496956825256348,
-0.02724463678896427,
0.023262621834874153,
0.035670265555381775,
-0.03700192645192146,
-0.002563169226050377,
0.07275497168302536,
0.09804746508598328,
0.036487530916929245,
-0.1123589277267456,
0.011874675750732422,
-0.06416406482458115,
-0.06585735082626343,
0.07783731818199158,
0.2678992748260498,
-0.05746344476938248,
0.1255131959915161,
0.11220362037420273,
-0.08178333193063736,
-0.1537742167711258,
0.030973846092820168,
0.09448958188295364,
-0.015125916339457035,
0.014703037217259407,
-0.1546880155801773,
0.0900755524635315,
0.11297404021024704,
-0.024393105879426003,
0.015622632578015327,
-0.18969665467739105,
-0.1288176327943802,
0.06882770359516144,
0.09992942214012146,
0.2716521620750427,
-0.06134895980358124,
-0.04253881424665451,
0.01836995594203472,
-0.09937828779220581,
0.017338087782263756,
0.12228304892778397,
0.0644879937171936,
-0.02449610084295273,
-0.07759932428598404,
0.014702556654810905,
-0.04111170768737793,
0.09627416729927063,
0.053627774119377136,
0.0567951463162899,
-0.004571520257741213,
0.007745284587144852,
-0.012064519338309765,
-0.04617287963628769,
0.06515011936426163,
0.02559138834476471,
0.04860297217965126,
-0.08488713949918747,
-0.02976861596107483,
-0.06820974498987198,
0.028452277183532715,
-0.02433246746659279,
-0.07597463577985764,
-0.05984294041991234,
0.07562047243118286,
0.04903047904372215,
-0.024577202275395393,
0.01943877711892128,
0.032313354313373566,
0.11491431295871735,
0.16254931688308716,
-0.001980738714337349,
-0.042846761643886566,
-0.052067212760448456,
-0.03905211389064789,
-0.018459729850292206,
0.0734630897641182,
-0.05010983347892761,
0.02485247515141964,
0.06606962531805038,
0.024041922762989998,
0.09693887829780579,
0.056412890553474426,
-0.11522810161113739,
-0.013722676783800125,
0.03342359885573387,
-0.16078822314739227,
0.016398388892412186,
0.0020608867052942514,
0.029874000698328018,
-0.037417907267808914,
0.03101501241326332,
0.14822253584861755,
-0.0642743930220604,
-0.03799157217144966,
-0.04183131083846092,
0.06858060508966446,
0.022003522142767906,
0.14072424173355103,
0.033826977014541626,
0.03683558851480484,
-0.08174571394920349,
0.12829068303108215,
0.04103108122944832,
-0.04096318036317825,
0.021490689367055893,
-0.024403711780905724,
-0.10691864788532257,
0.01314706914126873,
0.0591835118830204,
0.04213935136795044,
-0.043654147535562515,
-0.009170860052108765,
-0.0265605840831995,
-0.07152754068374634,
0.06085139140486717,
0.19112448394298553,
0.06630914658308029,
0.0745524913072586,
-0.05638788267970085,
-0.03536255657672882,
-0.07995599508285522,
0.044149015098810196,
0.04550454020500183,
0.07511374354362488,
-0.07931582629680634,
0.10991580039262772,
0.010052650235593319,
0.0441870242357254,
-0.030519578605890274,
-0.05238533020019531,
-0.09460873901844025,
-0.05476181209087372,
-0.10032191872596741,
0.012810416519641876,
-0.0746840238571167,
-0.03943758085370064,
0.0008712171111255884,
-0.006873048841953278,
-0.00868033617734909,
0.048254236578941345,
-0.06315837800502777,
-0.009523014537990093,
-0.026198841631412506,
0.03464825451374054,
-0.06495288759469986,
-0.03989383578300476,
0.03104805201292038,
-0.10101302713155746,
0.09474867582321167,
0.051199257373809814,
0.00589760160073638,
0.007895877584815025,
0.08869346231222153,
-0.021073387935757637,
0.025734424591064453,
0.014608499594032764,
-0.046613723039627075,
-0.08348780125379562,
0.0018968897638842463,
-0.009761953726410866,
-0.012053626589477062,
-0.01067110151052475,
0.09332378953695297,
-0.08705464005470276,
0.028394130989909172,
-0.007599784526973963,
-0.00793809536844492,
-0.07319030910730362,
-0.01264038123190403,
0.0971289575099945,
0.09618505835533142,
0.04748084768652916,
-0.08705732226371765,
0.01267140544950962,
-0.14316272735595703,
-0.03644811362028122,
0.004441333469003439,
-0.009364251978695393,
-0.12055177986621857,
-0.007923208177089691,
0.01944887824356556,
-0.0014369681011885405,
0.2087593376636505,
-0.05858178809285164,
-0.024784136563539505,
0.018320268020033836,
-0.09724602848291397,
0.10420239716768265,
-0.02396872080862522,
0.18541164696216583,
-0.010365602560341358,
-0.0410357266664505,
-0.015827780589461327,
0.036965806037187576,
0.021513380110263824,
-0.023215830326080322,
0.18556441366672516,
0.13819000124931335,
0.03800918534398079,
0.046017978340387344,
-0.025939105078577995,
-0.0011380902724340558,
-0.05406256765127182,
-0.03111208789050579,
0.03173758462071419,
0.037460681051015854,
0.019737187772989273,
0.1580992192029953,
0.069699227809906,
-0.16626296937465668,
0.03145754709839821,
-0.02563384547829628,
-0.03826971352100372,
-0.11767150461673737,
-0.09527065604925156,
-0.03244378790259361,
-0.06918741017580032,
0.01011425070464611,
-0.12316390872001648,
0.009050623513758183,
0.18205635249614716,
0.05557387322187424,
0.026481471955776215,
0.0031931037083268166,
-0.12608256936073303,
-0.03607987239956856,
0.05383007973432541,
0.014970729127526283,
0.025289976969361305,
0.05691688507795334,
-0.0030997367575764656,
0.06123032420873642,
0.038475483655929565,
0.015592905692756176,
0.0015857492107897997,
0.07789346575737,
0.01664680428802967,
0.04079337418079376,
-0.06222880631685257,
-0.004754434339702129,
-0.04063063859939575,
0.07166317850351334,
0.10082017630338669,
0.049938611686229706,
-0.04681328684091568,
-0.007331689819693565,
0.16078223288059235,
-0.043475471436977386,
-0.002438941039144993,
-0.12685014307498932,
0.33136892318725586,
0.01387060433626175,
0.012425797060132027,
0.04832978546619415,
-0.07708238065242767,
-0.04954010993242264,
0.19779540598392487,
0.08435520529747009,
-0.01665252447128296,
-0.02226264216005802,
0.0018766351277008653,
-0.030989540740847588,
-0.02649742364883423,
0.14977672696113586,
0.036254286766052246,
0.12420105934143066,
-0.054806508123874664,
-0.04979108273983002,
-0.027620917186141014,
-0.007385659962892532,
-0.1240866556763649,
0.13269194960594177,
-0.033027976751327515,
-0.022238438948988914,
-0.07251058518886566,
0.02514837123453617,
0.07480210810899734,
-0.3211038112640381,
0.000547156436368823,
-0.0340762659907341,
-0.11155448853969574,
-0.0031549592968076468,
-0.01436945702880621,
-0.02086886204779148,
0.04730132967233658,
-0.04506721347570419,
0.07199874520301819,
0.04597661271691322,
0.034689243882894516,
-0.026804683730006218,
-0.09396331012248993,
0.1635816991329193,
0.0381823368370533,
0.09972363710403442,
0.026007238775491714,
0.07639218866825104,
0.055313266813755035,
0.035803861916065216,
-0.09761404991149902,
0.04219663888216019,
0.013310578651726246,
-0.08443336933851242,
-0.05054997652769089,
0.1210160106420517,
-0.0037280512042343616,
0.04168034344911575,
0.04282538592815399,
-0.10768972337245941,
0.011445930227637291,
0.0771331712603569,
-0.06949339807033539,
-0.1010718122124672,
-0.008964982815086842,
-0.09006249159574509,
0.1573178470134735,
0.13952788710594177,
-0.01758330874145031,
0.021967750042676926,
-0.06650695204734802,
-0.009827934205532074,
0.05305833742022514,
0.012485518120229244,
-0.01697300374507904,
-0.19009903073310852,
0.03276349976658821,
-0.0816325917840004,
-0.0015345833962783217,
-0.2262365072965622,
-0.09919228404760361,
-0.01303013600409031,
-0.05438215658068657,
-0.029713662341237068,
0.05932128429412842,
0.030007751658558846,
0.06795575469732285,
-0.01677585579454899,
-0.0437324158847332,
-0.02901814877986908,
0.09096377342939377,
-0.11117533594369888,
-0.06541476398706436
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1200k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1200k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
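Since the release includes many checkpoints along the same pre-training run, a natural use of this model is to measure how representations change between neighbouring steps. The sketch below is illustrative only and not part of the original release: it loads this checkpoint alongside the seed-2 step-1100k checkpoint and compares their [CLS] vectors with cosine similarity, an arbitrary choice of probe.
```
import torch
from transformers import BertTokenizer, BertModel

# Two intermediate checkpoints from the same pre-training run (seed 2).
ckpt_a = "google/multiberts-seed_2-step_1100k"
ckpt_b = "google/multiberts-seed_2-step_1200k"

tokenizer = BertTokenizer.from_pretrained(ckpt_b)
model_a = BertModel.from_pretrained(ckpt_a).eval()
model_b = BertModel.from_pretrained(ckpt_b).eval()

inputs = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
with torch.no_grad():
    # [CLS] representation from the final hidden layer of each checkpoint.
    cls_a = model_a(**inputs).last_hidden_state[:, 0]
    cls_b = model_b(**inputs).last_hidden_state[:, 0]

print(float(torch.nn.functional.cosine_similarity(cls_a, cls_b)))
```
Repeating this over a set of sentences and over all saved steps gives a rough picture of how quickly the encoder stabilises late in pre-training.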
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1200k"]}
| null |
google/multiberts-seed_2-step_1200k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1200k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1200k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0800904631614685,
0.10018417239189148,
-0.0024332525208592415,
0.0465015284717083,
0.07972875237464905,
-0.015600098296999931,
0.07635342329740524,
0.10245075076818466,
-0.025815598666667938,
0.028059570118784904,
0.07734618335962296,
0.0006546393851749599,
0.01804790273308754,
0.09539556503295898,
0.023528311401605606,
-0.22773057222366333,
0.023067662492394447,
-0.031495802104473114,
-0.0912521481513977,
0.07714247703552246,
0.09849490970373154,
-0.076529860496521,
0.04399556294083595,
0.026559948921203613,
-0.11423397809267044,
0.047418657690286636,
0.0032454850152134895,
-0.018319573253393173,
0.13498063385486603,
-0.00219165813177824,
0.05215117335319519,
0.0544830784201622,
0.04724122956395149,
-0.13089808821678162,
0.00752323679625988,
0.057477544993162155,
0.061250239610672,
0.04624438285827637,
0.020289292559027672,
0.07843057811260223,
0.003916231449693441,
0.030889278277754784,
0.044197723269462585,
0.02309829741716385,
-0.07542634755373001,
-0.05476031452417374,
-0.1054116040468216,
0.04446052014827728,
0.02756170555949211,
0.0032559395767748356,
0.012243459932506084,
0.12538990378379822,
-0.03478974476456642,
0.041233230382204056,
0.18877743184566498,
-0.334717720746994,
-0.014268963597714901,
0.07422054558992386,
0.03990083560347557,
0.12904536724090576,
-0.00413817772641778,
-0.022359143942594528,
0.07756242901086807,
0.025375163182616234,
0.08370101451873779,
-0.03900784254074097,
0.013935628347098827,
-0.05564508214592934,
-0.15687577426433563,
-0.04566425457596779,
0.08535804599523544,
-0.0019034401047974825,
-0.135950967669487,
-0.029318323358893394,
-0.04752387851476669,
0.030692799016833305,
0.014514506794512272,
-0.03558817505836487,
0.03204202279448509,
0.01437018159776926,
-0.01816577836871147,
-0.010561111383140087,
-0.1022300124168396,
-0.05553378909826279,
0.027032075449824333,
0.08250707387924194,
0.10224840044975281,
0.06810850650072098,
0.0005799306672997773,
0.11545360833406448,
-0.19142889976501465,
-0.04997184872627258,
-0.031104031950235367,
-0.05648012086749077,
-0.04946790263056755,
-0.00895504280924797,
-0.10729353129863739,
-0.03285452350974083,
0.0075060902163386345,
0.13216833770275116,
-0.006149591412395239,
0.0257413312792778,
-0.031207779422402382,
0.008894049562513828,
0.05146468058228493,
0.03997892513871193,
-0.010764961130917072,
0.02420821785926819,
0.02265060320496559,
-0.009109647944569588,
-0.022447267547249794,
0.0162176713347435,
0.0018832901259884238,
0.020614096894860268,
0.1185988113284111,
0.023714084178209305,
-0.10232769697904587,
0.06551901251077652,
-0.01817137934267521,
-0.04389255493879318,
0.015250704251229763,
-0.09009463340044022,
-0.059249330312013626,
-0.041051074862480164,
-0.0022701192647218704,
0.01484011858701706,
-0.002168508945032954,
-0.007331570144742727,
-0.023574963212013245,
-0.03746198117733002,
-0.08420715481042862,
-0.04647009074687958,
-0.05300271883606911,
-0.12530335783958435,
0.008381993509829044,
-0.18501795828342438,
-0.03749123960733414,
-0.11425596475601196,
-0.18594154715538025,
-0.023927997797727585,
0.06387877464294434,
-0.011811078526079655,
-0.060331493616104126,
0.07971491664648056,
0.039396733045578,
-0.030223727226257324,
-0.004226358607411385,
0.07185526192188263,
-0.00011398005153751001,
0.045866020023822784,
-0.029486775398254395,
0.07135166227817535,
-0.00034660412347875535,
0.033943433314561844,
-0.05559694394469261,
0.06356014311313629,
-0.1755194514989853,
0.045206233859062195,
-0.07244665175676346,
-0.030432572588324547,
-0.08633550256490707,
-0.034997377544641495,
-0.015407659113407135,
0.0031750532798469067,
0.02530488558113575,
0.07525195926427841,
-0.18529774248600006,
-0.024425644427537918,
0.12968648970127106,
-0.16334518790245056,
-0.02208746410906315,
0.071705162525177,
-0.04962485283613205,
0.1060900092124939,
0.07017278671264648,
0.15391027927398682,
-0.004056552425026894,
-0.07692287117242813,
0.054202865809202194,
-0.011474551633000374,
0.012746088206768036,
-0.013999571092426777,
0.0691092312335968,
-0.02030218578875065,
-0.15417683124542236,
0.03317086771130562,
-0.13104115426540375,
-0.003270594170317054,
-0.07773088663816452,
0.017814939841628075,
-0.011847805231809616,
-0.06914056837558746,
-0.06819289177656174,
-0.025886422023177147,
0.06637254357337952,
-0.07101955264806747,
-0.016507122665643692,
0.043576449155807495,
0.07617315649986267,
-0.07210840284824371,
0.06869377195835114,
-0.011047146283090115,
0.014865431003272533,
-0.08685671538114548,
-0.041358061134815216,
-0.19464415311813354,
0.049303025007247925,
0.0972985252737999,
0.005563816521316767,
-0.019454307854175568,
0.14152531325817108,
0.008516372181475163,
0.0655389204621315,
-0.05197266861796379,
0.015042279846966267,
-0.01182734128087759,
-0.0054410421289503574,
-0.08992405980825424,
-0.09531594067811966,
-0.0736560970544815,
-0.06827546656131744,
0.08563437312841415,
-0.12238265573978424,
0.02245062217116356,
-0.056813471019268036,
0.044412605464458466,
0.019734729081392288,
-0.08297882974147797,
-0.01824449747800827,
0.01243093516677618,
-0.060071881860494614,
-0.0599970705807209,
0.03960046544671059,
0.07051879912614822,
-0.008819155395030975,
0.09206546097993851,
-0.04749995842576027,
-0.08909302204847336,
0.031220639124512672,
0.11085370928049088,
-0.10798151046037674,
0.007929974235594273,
-0.0580563060939312,
-0.04297466576099396,
-0.06775402277708054,
-0.015466216020286083,
0.07847893238067627,
-0.007415363099426031,
0.13441923260688782,
-0.07780878245830536,
0.000017504482457297854,
0.014500081539154053,
-0.021144311875104904,
-0.016968144103884697,
0.037661295384168625,
0.07244238257408142,
-0.07232791930437088,
0.014738869853317738,
0.03328324481844902,
0.0077960193157196045,
0.06580852717161179,
-0.05593184009194374,
-0.088956318795681,
0.014391185715794563,
0.038942210376262665,
0.031003478914499283,
0.06948933750391006,
-0.01014937087893486,
-0.009578250348567963,
0.031834058463573456,
0.019493650645017624,
0.006492007989436388,
-0.11940541118383408,
0.05999593809247017,
0.05525672435760498,
0.0035255851689726114,
0.060094334185123444,
-0.016752302646636963,
-0.03809031844139099,
0.08198980242013931,
0.03616180270910263,
-0.0007071635336615145,
-0.015782184898853302,
-0.012986877001821995,
-0.11490330845117569,
0.1925019472837448,
-0.05890706554055214,
-0.15430061519145966,
-0.07688803970813751,
-0.10013207793235779,
0.005263091996312141,
0.02424374595284462,
0.0377061627805233,
-0.02000458538532257,
-0.0408589169383049,
-0.12346399575471878,
0.060220617800951004,
-0.032981522381305695,
0.0707935318350792,
0.10888340324163437,
-0.04088054969906807,
0.05628054216504097,
-0.12644946575164795,
-0.00574525585398078,
-0.08211260288953781,
-0.07288097590208054,
0.05828198045492172,
-0.0530712828040123,
0.02555922046303749,
0.10279100388288498,
0.020083125680685043,
-0.018511950969696045,
-0.02577788755297661,
0.199013352394104,
0.040843117982149124,
0.04094717279076576,
0.12748998403549194,
-0.06445293873548508,
0.05777459219098091,
0.08861258625984192,
0.008953046053647995,
-0.04390017315745354,
0.05436704680323601,
0.044896334409713745,
-0.06970318406820297,
-0.19862878322601318,
-0.02036171220242977,
-0.005078270100057125,
-0.04290458559989929,
0.07573194056749344,
0.035631436854600906,
0.0002121781144523993,
0.07278811186552048,
0.01222661416977644,
0.059466198086738586,
-0.004734092857688665,
0.09647279977798462,
0.0058556511066854,
-0.02951963245868683,
0.08552633970975876,
-0.02181464247405529,
-0.008391043171286583,
0.08445989340543747,
-0.018180791288614273,
0.2994825541973114,
-0.037576474249362946,
0.005481761880218983,
0.11873311549425125,
0.044903721660375595,
0.06386458873748779,
0.12806221842765808,
-0.06669721752405167,
0.02332134172320366,
-0.07159185409545898,
-0.05863643437623978,
-0.002781905699521303,
0.045961130410432816,
-0.06162191554903984,
0.01351458765566349,
-0.07260032743215561,
0.026390019804239273,
-0.02333436720073223,
0.3077302575111389,
0.11220984160900116,
-0.10526442527770996,
-0.05748075619339943,
0.003998205531388521,
-0.10030095279216766,
-0.07411690056324005,
0.04624216631054878,
0.07509350031614304,
-0.13375639915466309,
0.003745872527360916,
-0.02747735194861889,
0.07586308568716049,
-0.01789971999824047,
0.01464115735143423,
0.03054729476571083,
0.03506067767739296,
-0.03834862634539604,
0.007539147045463324,
-0.17478038370609283,
0.1988763064146042,
0.004981362726539373,
0.019299902021884918,
-0.048208970576524734,
0.03274233639240265,
0.011027775704860687,
-0.031437065452337265,
0.06282884627580643,
0.023927997797727585,
-0.03009830042719841,
-0.03980972245335579,
-0.05540281906723976,
0.01305121649056673,
0.07678737491369247,
-0.04238015413284302,
0.10702876001596451,
-0.004397512413561344,
0.043333958834409714,
0.01911286637187004,
0.080141581594944,
-0.18051555752754211,
-0.0879581943154335,
0.02953251451253891,
-0.062298599630594254,
-0.09690719097852707,
-0.08028140664100647,
-0.09627168625593185,
0.0031762556172907352,
0.2412351667881012,
-0.11210133880376816,
-0.07528332620859146,
-0.09394020587205887,
0.029692191630601883,
0.10955911874771118,
-0.049971938133239746,
0.0279750544577837,
-0.006067402195185423,
0.12438417226076126,
-0.06627490371465683,
-0.1321815848350525,
0.02287103421986103,
-0.08676178008317947,
-0.1651342362165451,
-0.06569606810808182,
0.11309194564819336,
0.061276499181985855,
0.035971030592918396,
-0.03007984533905983,
0.02228773944079876,
0.038456931710243225,
-0.037768684327602386,
-0.005841467063874006,
0.0679854154586792,
0.09467873722314835,
0.034690972417593,
-0.10885307192802429,
0.01637563481926918,
-0.06436002999544144,
-0.0632987767457962,
0.07478335499763489,
0.2644132077693939,
-0.05872722715139389,
0.12319410592317581,
0.11457633972167969,
-0.0795709565281868,
-0.15421345829963684,
0.03283734247088432,
0.0952901616692543,
-0.014969498850405216,
0.015171673148870468,
-0.1558910459280014,
0.0892660841345787,
0.11187151074409485,
-0.02402467280626297,
0.004641602281481028,
-0.19187313318252563,
-0.1287405788898468,
0.06564018875360489,
0.09623628854751587,
0.2762313187122345,
-0.06064953655004501,
-0.04303634166717529,
0.020189965143799782,
-0.09844409674406052,
0.017667649313807487,
0.1234099343419075,
0.06420350819826126,
-0.026257384568452835,
-0.07419213652610779,
0.014034101739525795,
-0.040619656443595886,
0.09529199451208115,
0.05422109737992287,
0.05546478182077408,
-0.004798027686774731,
0.013779783621430397,
-0.022029655054211617,
-0.04659280925989151,
0.06198493018746376,
0.025876816362142563,
0.05119708925485611,
-0.08448369801044464,
-0.028359588235616684,
-0.07041634619235992,
0.025761762633919716,
-0.02499612420797348,
-0.0757814422249794,
-0.06068005412817001,
0.08031510561704636,
0.04966062679886818,
-0.02424471639096737,
0.017707714810967445,
0.03130776807665825,
0.11782167106866837,
0.15835966169834137,
-0.0042602065950632095,
-0.03992683067917824,
-0.058438632637262344,
-0.036022406071424484,
-0.016187438741326332,
0.07350623607635498,
-0.05910387262701988,
0.024252377450466156,
0.06745912879705429,
0.023220928385853767,
0.09457286447286606,
0.05701484531164169,
-0.11514187604188919,
-0.016323508694767952,
0.03354989364743233,
-0.16232436895370483,
0.016761384904384613,
0.002585451817139983,
0.029736246913671494,
-0.038248226046562195,
0.030339602380990982,
0.14948664605617523,
-0.06754015386104584,
-0.036759134382009506,
-0.03915742412209511,
0.06921279430389404,
0.021709294989705086,
0.14008328318595886,
0.03678031265735626,
0.036920029670000076,
-0.08083496987819672,
0.12488535791635513,
0.03946443274617195,
-0.04421210289001465,
0.019360337406396866,
-0.02469612844288349,
-0.10933881253004074,
0.014894002117216587,
0.06580459326505661,
0.031369034200906754,
-0.05221190303564072,
-0.010499678552150726,
-0.02699197269976139,
-0.07314816862344742,
0.0605940967798233,
0.1937025934457779,
0.06710325926542282,
0.07313239574432373,
-0.0544142946600914,
-0.03760481998324394,
-0.07935648411512375,
0.04288394749164581,
0.045569952577352524,
0.07729868590831757,
-0.07771455496549606,
0.10197703540325165,
0.01025604922324419,
0.04445621371269226,
-0.02974308840930462,
-0.05185076594352722,
-0.09744323045015335,
-0.05480508133769035,
-0.1118059977889061,
0.009659817442297935,
-0.07257958501577377,
-0.03780524060130119,
-0.0002906503505073488,
-0.005104033276438713,
-0.008774672634899616,
0.04706620052456856,
-0.06418365985155106,
-0.008684937842190266,
-0.027573511004447937,
0.03287264332175255,
-0.06307335197925568,
-0.0353284552693367,
0.03186775743961334,
-0.1003476083278656,
0.09342561662197113,
0.05009198933839798,
0.006859758868813515,
0.007220047060400248,
0.10055822879076004,
-0.022203508764505386,
0.023099713027477264,
0.015797685831785202,
-0.046394892036914825,
-0.08132325112819672,
0.0031647421419620514,
-0.00999370589852333,
-0.01046857051551342,
-0.010312811471521854,
0.08768170326948166,
-0.08657617121934891,
0.035335246473550797,
-0.009057514369487762,
-0.004977513570338488,
-0.07444395124912262,
-0.013306666165590286,
0.09854038804769516,
0.09928294271230698,
0.04592318832874298,
-0.08682431280612946,
0.011768263764679432,
-0.14106906950473785,
-0.03742697834968567,
0.005584315396845341,
-0.008940489962697029,
-0.12013455480337143,
-0.008853373117744923,
0.020636433735489845,
-0.00026297950535081327,
0.20518611371517181,
-0.0589006282389164,
-0.019936690106987953,
0.019343335181474686,
-0.09861557185649872,
0.10771714150905609,
-0.02360367216169834,
0.17727066576480865,
-0.007088426500558853,
-0.04105067253112793,
-0.01177771482616663,
0.035870831459760666,
0.017935574054718018,
-0.023897042497992516,
0.1893693506717682,
0.1380375623703003,
0.034726012498140335,
0.042964741587638855,
-0.02243359014391899,
0.002892371267080307,
-0.04821350425481796,
-0.02978154644370079,
0.029056567698717117,
0.03520606458187103,
0.018565209582448006,
0.1528654396533966,
0.07048895955085754,
-0.1619134545326233,
0.03263389691710472,
-0.029160166159272194,
-0.039570167660713196,
-0.11863576620817184,
-0.09000454843044281,
-0.03145161271095276,
-0.07271463423967361,
0.009750071913003922,
-0.12286897748708725,
0.006340800318866968,
0.19020362198352814,
0.05775630474090576,
0.028511395677924156,
0.006843813229352236,
-0.12262824177742004,
-0.035324178636074066,
0.05230816826224327,
0.01450701616704464,
0.028370602056384087,
0.06043402850627899,
-0.0021703392267227173,
0.06298637390136719,
0.03945093974471092,
0.013972668908536434,
0.00270896521396935,
0.07977186143398285,
0.01700931042432785,
0.04107546806335449,
-0.05958138406276703,
-0.004568831529468298,
-0.039596956223249435,
0.06990636140108109,
0.10386183112859726,
0.04988700523972511,
-0.04626442864537239,
-0.007506401743739843,
0.16158895194530487,
-0.04529368504881859,
-0.007878720760345459,
-0.12961445748806,
0.32927536964416504,
0.012263622134923935,
0.013177154585719109,
0.046060673892498016,
-0.07711772620677948,
-0.05207279697060585,
0.20326824486255646,
0.0845271572470665,
-0.01795143634080887,
-0.023343853652477264,
0.003302827477455139,
-0.030771782621741295,
-0.02439362369477749,
0.15080519020557404,
0.03671424835920334,
0.1269841343164444,
-0.055148858577013016,
-0.05431061610579491,
-0.027378641068935394,
-0.008567760698497295,
-0.12304939329624176,
0.13206163048744202,
-0.03253750130534172,
-0.021658245474100113,
-0.07325895875692368,
0.028340965509414673,
0.06959084421396255,
-0.3202437460422516,
-0.0006802177522331476,
-0.03327787294983864,
-0.11285307258367538,
-0.004435698501765728,
-0.01612246222794056,
-0.022305907681584358,
0.045179832726716995,
-0.04731438308954239,
0.07148875296115875,
0.04446113482117653,
0.034327927976846695,
-0.02512638457119465,
-0.0964030921459198,
0.16570144891738892,
0.047614965587854385,
0.09089449048042297,
0.02663038857281208,
0.07740878313779831,
0.055918555706739426,
0.0348808579146862,
-0.09851044416427612,
0.03935324773192406,
0.015149503014981747,
-0.08882474154233932,
-0.05525680631399155,
0.12170693278312683,
-0.0014829542487859726,
0.0401744581758976,
0.04259386658668518,
-0.10398523509502411,
0.008569672703742981,
0.07605107873678207,
-0.06551704555749893,
-0.10464027523994446,
-0.008361037820577621,
-0.08840139210224152,
0.1575322300195694,
0.14220376312732697,
-0.017345018684864044,
0.024111047387123108,
-0.06641946732997894,
-0.011264271102845669,
0.051419831812381744,
0.006968058180063963,
-0.01743069477379322,
-0.19047367572784424,
0.030819309875369072,
-0.08405839651823044,
-0.003168003633618355,
-0.2281910479068756,
-0.09938517212867737,
-0.009791625663638115,
-0.052412595599889755,
-0.02667638286948204,
0.06348539888858795,
0.030238937586545944,
0.06805762648582458,
-0.015585135668516159,
-0.038903165608644485,
-0.03139074519276619,
0.0894700214266777,
-0.1101723462343216,
-0.06681625545024872
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 120k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 120k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_120k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_120k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
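Since this is a pretraining checkpoint, the masked language modelling head can also be exercised directly. The following is a minimal sketch (not part of the original release notes; it assumes the MLM head weights are present in this snapshot) using the `fill-mask` pipeline:
```
from transformers import pipeline

# Sketch: query the MLM head of this intermediate checkpoint.
# Predictions at step 120k may be noisier than those of the fully trained model.
unmasker = pipeline("fill-mask", model="google/multiberts-seed_2-step_120k")
print(unmasker("The capital of France is [MASK]."))
```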
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_120k"]}
| null |
google/multiberts-seed_2-step_120k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_120k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 120k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 120k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 120k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 120k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 120k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07703307271003723,
0.10387948900461197,
-0.002532246755436063,
0.041981518268585205,
0.080973319709301,
-0.016371188685297966,
0.07737854868173599,
0.10287197679281235,
-0.02189868874847889,
0.027225280180573463,
0.07774214446544647,
0.005796066485345364,
0.017181551083922386,
0.09263655543327332,
0.02202208712697029,
-0.22891339659690857,
0.021426234394311905,
-0.03142803534865379,
-0.09295804798603058,
0.07754557579755783,
0.09918714314699173,
-0.0765317752957344,
0.04428030550479889,
0.02924494259059429,
-0.11236357688903809,
0.04732635244727135,
0.002384516643360257,
-0.016487620770931244,
0.13495518267154694,
-0.0033968843054026365,
0.0523424968123436,
0.05436978489160538,
0.046014249324798584,
-0.1344609409570694,
0.007843641564249992,
0.05909902974963188,
0.05903034657239914,
0.04581247270107269,
0.02239118330180645,
0.07624553143978119,
0.009655511938035488,
0.027047764509916306,
0.04272402077913284,
0.023610148578882217,
-0.07516222447156906,
-0.06033803150057793,
-0.10464923828840256,
0.03713366389274597,
0.025363072752952576,
0.005511135794222355,
0.010996678844094276,
0.1278194636106491,
-0.03588447347283363,
0.043174225836992264,
0.1895376294851303,
-0.342470645904541,
-0.012395106256008148,
0.07107376307249069,
0.041499070823192596,
0.12139368802309036,
-0.006489778868854046,
-0.019374769181013107,
0.07530464977025986,
0.024961331859230995,
0.08656477183103561,
-0.03934807702898979,
0.023784883320331573,
-0.05380760878324509,
-0.15776276588439941,
-0.04589684680104256,
0.0829230472445488,
-0.002404075348749757,
-0.13709089159965515,
-0.030018657445907593,
-0.04738476499915123,
0.03615369275212288,
0.013799159787595272,
-0.03414374217391014,
0.033312130719423294,
0.015288901515305042,
-0.01914907433092594,
-0.01115274429321289,
-0.10436970740556717,
-0.054674677550792694,
0.02937362715601921,
0.0804625153541565,
0.10280238091945648,
0.06813232600688934,
0.0005751431453973055,
0.11325458437204361,
-0.1908426433801651,
-0.050908297300338745,
-0.029764190316200256,
-0.054274845868349075,
-0.04979146644473076,
-0.008185303770005703,
-0.10901056230068207,
-0.03693685308098793,
0.0095816720277071,
0.1322658360004425,
-0.006980405189096928,
0.026243992149829865,
-0.03071087785065174,
0.00879226066172123,
0.053057003766298294,
0.04318619892001152,
-0.00679354090243578,
0.023080848157405853,
0.02196529507637024,
-0.013191385194659233,
-0.022328859195113182,
0.01671963930130005,
0.00014802832447458059,
0.022555096074938774,
0.11901416629552841,
0.023776309564709663,
-0.1049155667424202,
0.06731533259153366,
-0.018361058086156845,
-0.04460196942090988,
0.014705352485179901,
-0.08868712931871414,
-0.05866488814353943,
-0.03781760483980179,
-0.0013062607031315565,
0.016643866896629333,
-0.002204440999776125,
-0.006313051097095013,
-0.022060353308916092,
-0.0385887436568737,
-0.08220656961202621,
-0.04439311847090721,
-0.05420515313744545,
-0.12648577988147736,
0.009314549155533314,
-0.18849651515483856,
-0.037133459001779556,
-0.11357155442237854,
-0.18501073122024536,
-0.022807613015174866,
0.06395278126001358,
-0.011293809860944748,
-0.05647088959813118,
0.07836564630270004,
0.041223954409360886,
-0.028741609305143356,
-0.002214340493083,
0.07377775758504868,
-0.0018463305896148086,
0.04554649069905281,
-0.027369054034352303,
0.07097024470567703,
0.001762065221555531,
0.03399982675909996,
-0.05472010001540184,
0.0635795146226883,
-0.17103473842144012,
0.04200594499707222,
-0.07117689400911331,
-0.03390061482787132,
-0.0880325511097908,
-0.034862127155065536,
-0.011744166724383831,
0.003149941796436906,
0.023231904953718185,
0.07355041801929474,
-0.18418827652931213,
-0.02691042050719261,
0.1283200979232788,
-0.16194066405296326,
-0.019493669271469116,
0.0735817477107048,
-0.049098655581474304,
0.10455403476953506,
0.06925447285175323,
0.15216656029224396,
-0.008691280148923397,
-0.08352739363908768,
0.0550088994204998,
-0.01105768047273159,
0.012714887037873268,
-0.010588232427835464,
0.07198255509138107,
-0.019436562433838844,
-0.15407675504684448,
0.03366803005337715,
-0.13249409198760986,
-0.002806170377880335,
-0.0780811756849289,
0.019890503957867622,
-0.010536065325140953,
-0.06856390088796616,
-0.0648069754242897,
-0.02609352394938469,
0.06729042530059814,
-0.07121004164218903,
-0.01888742670416832,
0.045535940676927567,
0.07554292678833008,
-0.07302655279636383,
0.06668532639741898,
-0.01357825007289648,
0.01378865446895361,
-0.08707000315189362,
-0.0413396917283535,
-0.19311346113681793,
0.04884044826030731,
0.09696168452501297,
0.006844138726592064,
-0.019082333892583847,
0.14779506623744965,
0.00991113856434822,
0.06738458573818207,
-0.05028649419546127,
0.014792473055422306,
-0.013698875904083252,
-0.0056794011034071445,
-0.0891433134675026,
-0.09626764059066772,
-0.0750473141670227,
-0.06920460611581802,
0.09189652651548386,
-0.12718915939331055,
0.02077057957649231,
-0.059006497263908386,
0.046625468879938126,
0.021507197991013527,
-0.08154355734586716,
-0.0178263857960701,
0.011498681269586086,
-0.06015356257557869,
-0.05773203447461128,
0.04206300899386406,
0.07049594074487686,
-0.008654342964291573,
0.09588945657014847,
-0.048251714557409286,
-0.087077297270298,
0.030779065564274788,
0.10463740676641464,
-0.10570251941680908,
0.010134135372936726,
-0.057486217468976974,
-0.04203001782298088,
-0.06639637798070908,
-0.014355900697410107,
0.07763803005218506,
-0.006610758136957884,
0.13717639446258545,
-0.07675717025995255,
-0.0032881551887840033,
0.013219479471445084,
-0.02357526309788227,
-0.01746460795402527,
0.04024971276521683,
0.06984639912843704,
-0.08308946341276169,
0.016058053821325302,
0.03768712654709816,
0.005752262659370899,
0.07177659124135971,
-0.05474017187952995,
-0.09040121734142303,
0.012682872824370861,
0.03847804665565491,
0.0310879684984684,
0.06793862581253052,
-0.01432693749666214,
-0.010649607516825199,
0.03296710178256035,
0.016829457134008408,
0.005245822481811047,
-0.11890113353729248,
0.06062081828713417,
0.055034369230270386,
0.002703088102862239,
0.060030169785022736,
-0.015008104033768177,
-0.03904079273343086,
0.08098380267620087,
0.0358855165541172,
-0.0005015399656258523,
-0.01591489277780056,
-0.01405247487127781,
-0.11454498767852783,
0.19166432321071625,
-0.059586942195892334,
-0.1590331345796585,
-0.07664243131875992,
-0.10094102472066879,
0.0052489228546619415,
0.025417909026145935,
0.03866454213857651,
-0.019290054216980934,
-0.041724082082509995,
-0.12209814786911011,
0.06246940791606903,
-0.03508991748094559,
0.06944300979375839,
0.10579892992973328,
-0.0391426607966423,
0.057013578712940216,
-0.12582845985889435,
-0.006647287402302027,
-0.08288007974624634,
-0.07443192601203918,
0.06006280705332756,
-0.05328216403722763,
0.025925101712346077,
0.098726786673069,
0.022243017330765724,
-0.01877085119485855,
-0.025213297456502914,
0.19875717163085938,
0.04015834629535675,
0.03867868706583977,
0.13192886114120483,
-0.0650734156370163,
0.05766824260354042,
0.08101169764995575,
0.007772628217935562,
-0.04347734525799751,
0.05381127446889877,
0.048144251108169556,
-0.06855364888906479,
-0.19797927141189575,
-0.020233016461133957,
-0.006518949288874865,
-0.04488430917263031,
0.07528478652238846,
0.036071278154850006,
-0.0017049112357199192,
0.07014551013708115,
0.011628132313489914,
0.05990281328558922,
-0.004795373417437077,
0.09748726338148117,
0.010297844186425209,
-0.030914004892110825,
0.08679983764886856,
-0.022946447134017944,
-0.011221606284379959,
0.08318385481834412,
-0.017701486125588417,
0.29269158840179443,
-0.03388083353638649,
0.013137190602719784,
0.12018986791372299,
0.04483122006058693,
0.06358980387449265,
0.12753884494304657,
-0.06679297983646393,
0.022141192108392715,
-0.07339948415756226,
-0.05929212644696236,
-0.004725873935967684,
0.04582258686423302,
-0.05671051889657974,
0.010224426165223122,
-0.07150537520647049,
0.023257536813616753,
-0.02330605871975422,
0.3099575340747833,
0.11499527096748352,
-0.10693475604057312,
-0.057503506541252136,
0.003396327141672373,
-0.09968192875385284,
-0.07255630195140839,
0.04319874942302704,
0.0726754441857338,
-0.13501965999603271,
0.0059483665972948074,
-0.029014745727181435,
0.0760750025510788,
-0.018511980772018433,
0.01630392298102379,
0.029411302879452705,
0.03517292067408562,
-0.03670160844922066,
0.010275818407535553,
-0.17880026996135712,
0.19577208161354065,
0.005703283939510584,
0.01907140202820301,
-0.050440676510334015,
0.03082595206797123,
0.010631111450493336,
-0.033453844487667084,
0.06199944019317627,
0.021721886470913887,
-0.03134699910879135,
-0.04244678467512131,
-0.05268608033657074,
0.013455541804432869,
0.08046411722898483,
-0.0442371629178524,
0.1099831834435463,
-0.007495018187910318,
0.041124533861875534,
0.01932814158499241,
0.08311906456947327,
-0.18034817278385162,
-0.0873747318983078,
0.028517885133624077,
-0.06178852915763855,
-0.09970352053642273,
-0.07856500148773193,
-0.09483860433101654,
0.0014772703871130943,
0.2452044039964676,
-0.11300431936979294,
-0.07248463481664658,
-0.09461881965398788,
0.027822235599160194,
0.10653190314769745,
-0.050696417689323425,
0.02607312612235546,
-0.006857896223664284,
0.1292526125907898,
-0.06655649095773697,
-0.13420531153678894,
0.02279147319495678,
-0.08854646980762482,
-0.16665372252464294,
-0.06518377363681793,
0.11520452052354813,
0.061738256365060806,
0.03581603616476059,
-0.027857668697834015,
0.022600114345550537,
0.03562535345554352,
-0.036043036729097366,
-0.0015794233186170459,
0.06842329353094101,
0.09831997007131577,
0.034058332443237305,
-0.11043485999107361,
0.023399166762828827,
-0.06357592344284058,
-0.06243351846933365,
0.07691733539104462,
0.265311062335968,
-0.05681341513991356,
0.1255795955657959,
0.1101185530424118,
-0.07903963327407837,
-0.15406018495559692,
0.029456546530127525,
0.09722578525543213,
-0.014979670755565166,
0.018004201352596283,
-0.1584840714931488,
0.08799264580011368,
0.11009914427995682,
-0.02486775442957878,
0.009070458821952343,
-0.18691548705101013,
-0.12701526284217834,
0.06670989841222763,
0.09643162041902542,
0.27390211820602417,
-0.06130455806851387,
-0.041857220232486725,
0.016801981255412102,
-0.09741362184286118,
0.023177916184067726,
0.1143355518579483,
0.06675020605325699,
-0.025391960516572,
-0.07244080305099487,
0.01412759255617857,
-0.04018222540616989,
0.09436433762311935,
0.05050751939415932,
0.05478942394256592,
-0.0033136317506432533,
0.016721686348319054,
-0.020687347277998924,
-0.04589344933629036,
0.06348161399364471,
0.020130684599280357,
0.04931644722819328,
-0.08402227610349655,
-0.029495423659682274,
-0.0704067051410675,
0.02929667942225933,
-0.024550527334213257,
-0.07490770518779755,
-0.05856101214885712,
0.07895802706480026,
0.0477365106344223,
-0.023160388693213463,
0.023382993414998055,
0.030184246599674225,
0.11811165511608124,
0.16437023878097534,
-0.0052080401219427586,
-0.03446114435791969,
-0.060220781713724136,
-0.038556262850761414,
-0.015936456620693207,
0.07411305606365204,
-0.054915934801101685,
0.024870870634913445,
0.06407655775547028,
0.02334691397845745,
0.09671001881361008,
0.05550764128565788,
-0.11735901236534119,
-0.01666228659451008,
0.0318731851875782,
-0.16371680796146393,
0.015013043768703938,
0.001819900586269796,
0.029889065772294998,
-0.035327401012182236,
0.03185103461146355,
0.1531457006931305,
-0.06572927534580231,
-0.03614329546689987,
-0.03980383276939392,
0.06785120815038681,
0.021786067634820938,
0.13802340626716614,
0.035636965185403824,
0.037544459104537964,
-0.08049414306879044,
0.12356717884540558,
0.03747383877635002,
-0.04000430554151535,
0.020628705620765686,
-0.028588173910975456,
-0.1081017479300499,
0.012646530754864216,
0.06619403511285782,
0.04038073495030403,
-0.05273551121354103,
-0.01262422651052475,
-0.028689244762063026,
-0.0712600126862526,
0.0607081837952137,
0.19229502975940704,
0.06910453736782074,
0.07422914355993271,
-0.05498537793755531,
-0.03696455806493759,
-0.07915481179952621,
0.04238390550017357,
0.04328688234090805,
0.07482253015041351,
-0.07762384414672852,
0.1010824665427208,
0.011098326183855534,
0.046779658645391464,
-0.031063809990882874,
-0.05269294232130051,
-0.09932533651590347,
-0.055075503885746,
-0.10834772139787674,
0.010925772599875927,
-0.070672407746315,
-0.038188010454177856,
0.0011078561656177044,
-0.005472936201840639,
-0.006699908059090376,
0.048109252005815506,
-0.06499271839857101,
-0.008742492645978928,
-0.027257224544882774,
0.03325005620718002,
-0.06564347445964813,
-0.03704722225666046,
0.032327890396118164,
-0.10174249857664108,
0.0925384908914566,
0.05182887241244316,
0.007599025033414364,
0.011294483207166195,
0.09325605630874634,
-0.022199196740984917,
0.02471954934298992,
0.014067327603697777,
-0.04769401624798775,
-0.08319804817438126,
0.0023384857922792435,
-0.01025380752980709,
-0.011583052575588226,
-0.011659009382128716,
0.08820640295743942,
-0.08600309491157532,
0.03402171656489372,
-0.008646921254694462,
-0.006178883370012045,
-0.0735039934515953,
-0.013691281899809837,
0.09301192313432693,
0.10041280835866928,
0.04863690957427025,
-0.08751490712165833,
0.013256533071398735,
-0.1405036598443985,
-0.03758420795202255,
0.006526962388306856,
-0.00877283327281475,
-0.1223715990781784,
-0.009496643207967281,
0.019958052784204483,
-0.0017966553568840027,
0.20409198105335236,
-0.056751400232315063,
-0.013709800317883492,
0.017130576074123383,
-0.09803380072116852,
0.10848737508058548,
-0.025316687300801277,
0.17694275081157684,
-0.007452104706317186,
-0.04000845178961754,
-0.014240723103284836,
0.03600461035966873,
0.018315205350518227,
-0.025112126022577286,
0.18669147789478302,
0.13923366367816925,
0.031423844397068024,
0.04203961044549942,
-0.022894538938999176,
0.0028279966209083796,
-0.054903384298086166,
-0.02684897370636463,
0.030907107517123222,
0.036365944892168045,
0.018218569457530975,
0.15961399674415588,
0.07298662513494492,
-0.16442768275737762,
0.031974952667951584,
-0.02972383424639702,
-0.037371426820755005,
-0.11979745328426361,
-0.09595201164484024,
-0.03436458110809326,
-0.07138535380363464,
0.010857870802283287,
-0.12096143513917923,
0.007957379333674908,
0.18816302716732025,
0.05781112611293793,
0.02767554484307766,
0.0021847481839358807,
-0.1217237263917923,
-0.036936867982149124,
0.05276940390467644,
0.013900574296712875,
0.025628985837101936,
0.06046922877430916,
0.0010273873340338469,
0.06392570585012436,
0.038404472172260284,
0.016094310209155083,
0.001784139545634389,
0.08064988255500793,
0.018213944509625435,
0.0401463620364666,
-0.061582621186971664,
-0.003828610060736537,
-0.03591284528374672,
0.07101297378540039,
0.10258505493402481,
0.04969869926571846,
-0.04693424329161644,
-0.008441277779638767,
0.16185055673122406,
-0.04366244748234749,
-0.0046783690340816975,
-0.12787456810474396,
0.33361533284187317,
0.012777404859662056,
0.014789106324315071,
0.04551563411951065,
-0.07648388296365738,
-0.049509234726428986,
0.20145738124847412,
0.08477114140987396,
-0.0197472982108593,
-0.021462930366396904,
0.0021302022505551577,
-0.03006700798869133,
-0.022607387974858284,
0.14888739585876465,
0.03750521317124367,
0.13016290962696075,
-0.05693869665265083,
-0.05519664287567139,
-0.02803538180887699,
-0.008387680165469646,
-0.12407104671001434,
0.13123834133148193,
-0.03165619075298309,
-0.02255439944565296,
-0.07183101773262024,
0.026662878692150116,
0.06870727241039276,
-0.32762035727500916,
-0.0010300448630005121,
-0.03102538175880909,
-0.11159143596887589,
-0.004627087619155645,
-0.014840047806501389,
-0.021869009360671043,
0.04701561480760574,
-0.04904339462518692,
0.06956002861261368,
0.044861771166324615,
0.03472055122256279,
-0.024934254586696625,
-0.09152255207300186,
0.16592951118946075,
0.047733087092638016,
0.08849415928125381,
0.025829914957284927,
0.07755503803491592,
0.05604279041290283,
0.03441188111901283,
-0.09491130709648132,
0.04277907684445381,
0.013406570069491863,
-0.09300772845745087,
-0.05326196923851967,
0.12302158027887344,
-0.002275422215461731,
0.04041365906596184,
0.04271836206316948,
-0.10554006695747375,
0.011989252641797066,
0.0753173828125,
-0.06714620441198349,
-0.101448655128479,
-0.005771992262452841,
-0.09035257250070572,
0.1568475067615509,
0.14367534220218658,
-0.01523116696625948,
0.023620430380105972,
-0.06855125725269318,
-0.007055702619254589,
0.05371394753456116,
0.0016498554032295942,
-0.019252164289355278,
-0.1861298829317093,
0.03141835331916809,
-0.07950436323881149,
-0.005072443746030331,
-0.2289879024028778,
-0.10019544512033463,
-0.009531666524708271,
-0.051388777792453766,
-0.0264523196965456,
0.0587785579264164,
0.02906906232237816,
0.0685638040304184,
-0.016565054655075073,
-0.042841386049985886,
-0.02984544262290001,
0.08813575655221939,
-0.11114177852869034,
-0.0671735629439354
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1300k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1300k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
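Beyond loading a single snapshot, a common use of these intermediate checkpoints is to track how representations change over the course of pre-training. Below is a small sketch, not part of the original card: it compares the pooled sentence representation of this step-1300k snapshot against the final checkpoint for the same seed (the name `google/multiberts-seed_2-step_2000k` is assumed here from the release's naming scheme):
```
import torch
from transformers import BertTokenizer, BertModel

# Assumed checkpoint names: this snapshot and the (assumed) final 2000k snapshot.
checkpoints = ["google/multiberts-seed_2-step_1300k", "google/multiberts-seed_2-step_2000k"]
text = "Replace me by any text you'd like."

pooled = []
for name in checkpoints:
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    with torch.no_grad():
        output = model(**tokenizer(text, return_tensors="pt"))
    pooled.append(output.pooler_output)  # [CLS]-based sentence representation

# Cosine similarity between the two checkpoints' representations of the same text.
print(torch.nn.functional.cosine_similarity(pooled[0], pooled[1]))
```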
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1300k"]}
| null |
google/multiberts-seed_2-step_1300k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1300k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1300k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07661601901054382,
0.0983516201376915,
-0.0024010741617530584,
0.04610903561115265,
0.07735855132341385,
-0.015899568796157837,
0.07292217016220093,
0.10241318494081497,
-0.02442634105682373,
0.026067888364195824,
0.07916810363531113,
0.001252902438864112,
0.016815464943647385,
0.1005169004201889,
0.02763901837170124,
-0.23271778225898743,
0.023364316672086716,
-0.03613637760281563,
-0.08576293289661407,
0.07672220468521118,
0.10020920634269714,
-0.07692163437604904,
0.043516840785741806,
0.029088187962770462,
-0.10437734425067902,
0.04948015883564949,
0.0028941589407622814,
-0.01966973766684532,
0.13146451115608215,
-0.002078823046758771,
0.05224468186497688,
0.05415726453065872,
0.04598681628704071,
-0.13773275911808014,
0.006181980948895216,
0.05632861703634262,
0.06114231050014496,
0.04572908952832222,
0.017573954537510872,
0.07777056843042374,
0.007917963899672031,
0.02664237469434738,
0.04771124944090843,
0.02144460752606392,
-0.07692321389913559,
-0.0564645379781723,
-0.10780506581068039,
0.04444793239235878,
0.02849264070391655,
0.004872739315032959,
0.013400290161371231,
0.12702776491641998,
-0.032190561294555664,
0.044215019792318344,
0.18309910595417023,
-0.32701146602630615,
-0.015693292021751404,
0.07935981452465057,
0.045088864862918854,
0.12852483987808228,
-0.004016098100692034,
-0.019474979490041733,
0.07394258677959442,
0.02265215292572975,
0.08187271654605865,
-0.0394788421690464,
0.015987370163202286,
-0.053938642144203186,
-0.15778550505638123,
-0.05057704076170921,
0.08586244285106659,
-0.0016286632744595408,
-0.13261890411376953,
-0.03242700546979904,
-0.047594230622053146,
0.02388516440987587,
0.011146236211061478,
-0.04018678143620491,
0.033169522881507874,
0.01540201436728239,
-0.018064303323626518,
-0.013557049445807934,
-0.10139580070972443,
-0.0540006048977375,
0.026459285989403725,
0.07775883376598358,
0.09917879104614258,
0.06735686212778091,
0.0019061819184571505,
0.11267019808292389,
-0.1965918093919754,
-0.048507947474718094,
-0.030044712126255035,
-0.05376012995839119,
-0.049795784056186676,
-0.013103960081934929,
-0.10699895769357681,
-0.03691655769944191,
0.009332568384706974,
0.14118842780590057,
-0.013774733059108257,
0.027654264122247696,
-0.029683316126465797,
0.008788044564425945,
0.05276935547590256,
0.04273797944188118,
-0.012743514031171799,
0.025434207171201706,
0.026878075674176216,
-0.013561132363975048,
-0.01684979721903801,
0.013389422558248043,
0.004367041867226362,
0.019331682473421097,
0.1171170175075531,
0.021736524999141693,
-0.09672754257917404,
0.06915921717882156,
-0.017103826627135277,
-0.04202016070485115,
0.020560406148433685,
-0.09173227101564407,
-0.0551987923681736,
-0.04052095487713814,
0.0017779232002794743,
0.011578126810491085,
-0.000812836573459208,
-0.001584403682500124,
-0.02118283323943615,
-0.044058285653591156,
-0.08257913589477539,
-0.05070192739367485,
-0.053255971521139145,
-0.12690764665603638,
0.011810713447630405,
-0.1613008677959442,
-0.03815705329179764,
-0.11124639213085175,
-0.1929621696472168,
-0.02202075719833374,
0.06543729454278946,
-0.010132801719009876,
-0.06260819733142853,
0.07770141959190369,
0.041082870215177536,
-0.028049413114786148,
-0.0012972422409802675,
0.07383913546800613,
0.0009349522297270596,
0.04394235834479332,
-0.02868930995464325,
0.06722485274076462,
0.0006195887108333409,
0.03024866431951523,
-0.0580197237432003,
0.06435241550207138,
-0.18183216452598572,
0.04368235915899277,
-0.07408275455236435,
-0.029158255085349083,
-0.08491534739732742,
-0.03489423170685768,
-0.017838487401604652,
0.0031605451367795467,
0.023500952869653702,
0.07460515946149826,
-0.16988727450370789,
-0.027237655594944954,
0.11780183762311935,
-0.16493083536624908,
-0.020661141723394394,
0.07346571981906891,
-0.048507336527109146,
0.10428681969642639,
0.0680001825094223,
0.15519103407859802,
0.005390714854001999,
-0.07931474596261978,
0.059990040957927704,
-0.006873418111354113,
0.009055154398083687,
-0.010315099731087685,
0.07053804397583008,
-0.019757062196731567,
-0.15739378333091736,
0.03509290888905525,
-0.13442125916481018,
-0.0024184745270758867,
-0.07804513722658157,
0.014216096140444279,
-0.009291494265198708,
-0.0643165111541748,
-0.06666047871112823,
-0.026172906160354614,
0.06624352186918259,
-0.07104271650314331,
-0.017029687762260437,
0.03954894095659256,
0.07415524870157242,
-0.07238729298114777,
0.07237666100263596,
-0.007386108860373497,
0.015479601919651031,
-0.08267537504434586,
-0.04222786799073219,
-0.18629781901836395,
0.045369330793619156,
0.09798959642648697,
0.005093266721814871,
-0.023793937638401985,
0.13386142253875732,
0.009872126393020153,
0.06278249621391296,
-0.05165548250079155,
0.019939016550779343,
-0.013808428309857845,
-0.003460593754425645,
-0.08687139302492142,
-0.09551014006137848,
-0.0748399943113327,
-0.06877664476633072,
0.09398908913135529,
-0.12815409898757935,
0.024068960919976234,
-0.0548025518655777,
0.041522275656461716,
0.01768072322010994,
-0.08514219522476196,
-0.01694076880812645,
0.011648355983197689,
-0.05998856946825981,
-0.05616983398795128,
0.03993424400687218,
0.07102632522583008,
-0.013700731098651886,
0.09100309759378433,
-0.05041772499680519,
-0.08985712379217148,
0.03057256154716015,
0.10036493092775345,
-0.10937516391277313,
0.001301536220125854,
-0.059214599430561066,
-0.04521834850311279,
-0.06716794520616531,
-0.02180773951113224,
0.07651565968990326,
-0.00870247557759285,
0.13304269313812256,
-0.07556602358818054,
-0.0035454323515295982,
0.01499454490840435,
-0.022316094487905502,
-0.022806372493505478,
0.037132613360881805,
0.06531797349452972,
-0.06309599429368973,
0.015969807282090187,
0.027609867975115776,
0.005190877243876457,
0.0678386241197586,
-0.05326102301478386,
-0.0874401330947876,
0.015100463293492794,
0.04020441696047783,
0.030952971428632736,
0.06356051564216614,
-0.018055060878396034,
-0.005649095866829157,
0.03206498175859451,
0.020344264805316925,
0.00843968614935875,
-0.11728990823030472,
0.05944512039422989,
0.05820172652602196,
0.0009146149386651814,
0.053900014609098434,
-0.01660054549574852,
-0.0385587215423584,
0.07814386487007141,
0.03522365167737007,
0.0044012959115207195,
-0.014544093981385231,
-0.014648649841547012,
-0.11974886804819107,
0.19044549763202667,
-0.06052897498011589,
-0.1575716733932495,
-0.081135094165802,
-0.1027136817574501,
0.004960311576724052,
0.023394010961055756,
0.03552539646625519,
-0.014093155972659588,
-0.0414188876748085,
-0.12647105753421783,
0.056883860379457474,
-0.03861277550458908,
0.06932742893695831,
0.11266885697841644,
-0.0422518327832222,
0.05718123912811279,
-0.12495611608028412,
-0.00499437702819705,
-0.08081073313951492,
-0.07765018194913864,
0.05573882907629013,
-0.05299464240670204,
0.028580915182828903,
0.1013861894607544,
0.020495302975177765,
-0.01816321164369583,
-0.02789594605565071,
0.20284315943717957,
0.04035920277237892,
0.04236876592040062,
0.12547940015792847,
-0.06212659552693367,
0.057335447520017624,
0.0893612653017044,
0.008933709003031254,
-0.04150858521461487,
0.052922435104846954,
0.04229162260890007,
-0.07163205742835999,
-0.19312097132205963,
-0.023326516151428223,
-0.007857886143028736,
-0.04073605686426163,
0.07363014668226242,
0.03688985854387283,
-0.005010418128222227,
0.06938605010509491,
0.012274599634110928,
0.06238085776567459,
0.004568221978843212,
0.09736356884241104,
0.01877746731042862,
-0.03130991756916046,
0.08197885751724243,
-0.019057974219322205,
-0.007458684500306845,
0.08466658741235733,
-0.012220976874232292,
0.28203415870666504,
-0.03637523204088211,
0.009751717559993267,
0.11656340956687927,
0.043991025537252426,
0.06207211688160896,
0.12482248246669769,
-0.06824342906475067,
0.022518962621688843,
-0.07039462774991989,
-0.05971762537956238,
-0.0010355141712352633,
0.04496403783559799,
-0.060977861285209656,
0.01413390226662159,
-0.0737554281949997,
0.023559316992759705,
-0.017570331692695618,
0.29600948095321655,
0.10899870097637177,
-0.10619557648897171,
-0.057441625744104385,
0.003460991894826293,
-0.10040568560361862,
-0.06890920549631119,
0.04519684612751007,
0.069418765604496,
-0.13468226790428162,
0.0096691669896245,
-0.026839381083846092,
0.07318604737520218,
-0.014824487268924713,
0.014587122946977615,
0.032194722443819046,
0.03807874023914337,
-0.038163088262081146,
0.010848664678633213,
-0.17481127381324768,
0.20032261312007904,
0.005418439395725727,
0.01756225898861885,
-0.04894881695508957,
0.034840382635593414,
0.006920004263520241,
-0.03243095427751541,
0.0660252720117569,
0.02540326490998268,
-0.0333082340657711,
-0.041131917387247086,
-0.05431831628084183,
0.014772755093872547,
0.07284452021121979,
-0.042026277631521225,
0.10392869263887405,
-0.005653614178299904,
0.04222708195447922,
0.020701292902231216,
0.08563306927680969,
-0.18275976181030273,
-0.08980761468410492,
0.031780242919921875,
-0.05962654948234558,
-0.10689685493707657,
-0.07900635898113251,
-0.09522690623998642,
-0.007472363766282797,
0.24867892265319824,
-0.1166934072971344,
-0.07355380803346634,
-0.09583650529384613,
0.03720346465706825,
0.10873475670814514,
-0.050050582736730576,
0.028710685670375824,
-0.004707247018814087,
0.126807302236557,
-0.06892798095941544,
-0.13077184557914734,
0.024980716407299042,
-0.08780267834663391,
-0.16692738234996796,
-0.06552433222532272,
0.11210303008556366,
0.059401735663414,
0.036110106855630875,
-0.030638782307505608,
0.023203780874609947,
0.03818810358643532,
-0.040531568229198456,
-0.008438058197498322,
0.07784353196620941,
0.08724863827228546,
0.03136468306183815,
-0.10793621093034744,
0.02015792578458786,
-0.060872986912727356,
-0.06510328501462936,
0.07683402299880981,
0.2726118564605713,
-0.05972008407115936,
0.12757278978824615,
0.11629118025302887,
-0.08303679525852203,
-0.15713289380073547,
0.02905883640050888,
0.09239780157804489,
-0.016762681305408478,
0.021588223055005074,
-0.15038564801216125,
0.08674308657646179,
0.1193835660815239,
-0.025643356144428253,
-0.005232773721218109,
-0.19701224565505981,
-0.13078051805496216,
0.06378114968538284,
0.09773889183998108,
0.2715682089328766,
-0.057413674890995026,
-0.04513245075941086,
0.022731715813279152,
-0.09980393946170807,
0.009573161602020264,
0.12273833155632019,
0.06112593039870262,
-0.022679690271615982,
-0.0747210904955864,
0.014520492404699326,
-0.04213816672563553,
0.09616117924451828,
0.05253351107239723,
0.05292947217822075,
-0.004885170143097639,
0.01617106981575489,
-0.010768311098217964,
-0.045860908925533295,
0.06119760498404503,
0.03101702220737934,
0.05089501664042473,
-0.08020149171352386,
-0.028512774035334587,
-0.07099340111017227,
0.030550779774785042,
-0.02689267508685589,
-0.0752430185675621,
-0.06400390714406967,
0.07929475605487823,
0.054051317274570465,
-0.026880953460931778,
0.022007746621966362,
0.030279645696282387,
0.11720292270183563,
0.17119200527668,
-0.0038396292366087437,
-0.039073746651411057,
-0.045759402215480804,
-0.033790118992328644,
-0.01813557930290699,
0.07054075598716736,
-0.04434121027588844,
0.0258135087788105,
0.06733821332454681,
0.024314984679222107,
0.09098812192678452,
0.056877415627241135,
-0.11464076489210129,
-0.01715126261115074,
0.03392428532242775,
-0.16397438943386078,
0.017312632873654366,
-0.0012414617231115699,
0.025732826441526413,
-0.04186245799064636,
0.02418268658220768,
0.14972898364067078,
-0.06816175580024719,
-0.035764388740062714,
-0.03602176532149315,
0.06986124813556671,
0.021110694855451584,
0.139473557472229,
0.03325100988149643,
0.03551842272281647,
-0.07974823564291,
0.12601026892662048,
0.041658926755189896,
-0.050343189388513565,
0.0171789713203907,
-0.022450314834713936,
-0.10776432603597641,
0.015005558729171753,
0.061403170228004456,
0.0292728953063488,
-0.056000567972660065,
-0.006561917252838612,
-0.02769426256418228,
-0.0702950581908226,
0.06039856746792793,
0.1919519007205963,
0.06720121204853058,
0.07466831058263779,
-0.05172622203826904,
-0.03279181569814682,
-0.08188699930906296,
0.045276109129190445,
0.042793504893779755,
0.07502744346857071,
-0.07945969700813293,
0.10151167213916779,
0.009454903192818165,
0.03981655463576317,
-0.031229501590132713,
-0.054916854947805405,
-0.10226579010486603,
-0.0534939281642437,
-0.08516339957714081,
0.00813028123229742,
-0.07985688000917435,
-0.037451837211847305,
0.0007679970585741103,
-0.0051376656629145145,
-0.012097141705453396,
0.048646584153175354,
-0.061518121510744095,
-0.008782802149653435,
-0.03082149848341942,
0.0351102314889431,
-0.06508652865886688,
-0.03595024719834328,
0.03556331247091293,
-0.0984875038266182,
0.09310701489448547,
0.04837146773934364,
0.008581672795116901,
0.006778845563530922,
0.07967209815979004,
-0.019002186134457588,
0.02157905325293541,
0.014209765940904617,
-0.047524090856313705,
-0.08479262888431549,
0.0010038024047389627,
-0.0076491148211061954,
-0.01568160578608513,
-0.00884493999183178,
0.09166418015956879,
-0.08827957510948181,
0.03153605759143829,
-0.005612038541585207,
-0.009757420979440212,
-0.0742299035191536,
-0.013152141124010086,
0.09731225669384003,
0.0961146205663681,
0.04327431693673134,
-0.09116426110267639,
0.010195286013185978,
-0.1436464786529541,
-0.03736284002661705,
0.007082611788064241,
-0.008417574688792229,
-0.12229085713624954,
-0.004495921544730663,
0.0206394512206316,
0.0010741224978119135,
0.21875286102294922,
-0.05872346833348274,
-0.021776042878627777,
0.021158305928111076,
-0.10369577258825302,
0.11244338750839233,
-0.024002470076084137,
0.18891644477844238,
-0.006723265163600445,
-0.0411926694214344,
-0.009786141104996204,
0.0378606840968132,
0.019939204677939415,
-0.02381295897066593,
0.1850925087928772,
0.134304940700531,
0.03661390021443367,
0.04466133937239647,
-0.02191554382443428,
0.008225108496844769,
-0.04593925550580025,
-0.02935640513896942,
0.030150441452860832,
0.03802698105573654,
0.01621747948229313,
0.14061328768730164,
0.07897753268480301,
-0.16933684051036835,
0.03371695801615715,
-0.02679264172911644,
-0.04098494350910187,
-0.12279393523931503,
-0.10428120940923691,
-0.03213544562458992,
-0.07852686196565628,
0.009890355169773102,
-0.12595613300800323,
0.008294527418911457,
0.1798045039176941,
0.056538987904787064,
0.03153756633400917,
0.005227158311754465,
-0.12429217249155045,
-0.03280866518616676,
0.04622773453593254,
0.012736506760120392,
0.02977162040770054,
0.06017865613102913,
-0.008019750006496906,
0.06247162073850632,
0.03986005112528801,
0.012946836650371552,
-0.00044658800470642745,
0.08526284992694855,
0.017484109848737717,
0.039557673037052155,
-0.06060546636581421,
-0.005868629086762667,
-0.04150781035423279,
0.07044538855552673,
0.09463857859373093,
0.04973205551505089,
-0.048667024821043015,
-0.007238050922751427,
0.1642296463251114,
-0.04352238401770592,
-0.001972345868125558,
-0.1328062117099762,
0.3359214961528778,
0.016659606248140335,
0.012216671369969845,
0.04610176384449005,
-0.07845190912485123,
-0.05099251866340637,
0.20526644587516785,
0.09234151244163513,
-0.016288770362734795,
-0.02125214971601963,
0.001842609024606645,
-0.03028656542301178,
-0.020838521420955658,
0.14992862939834595,
0.032623641192913055,
0.12002922594547272,
-0.05271320790052414,
-0.057530373334884644,
-0.027636051177978516,
-0.009626024402678013,
-0.12477704137563705,
0.13436171412467957,
-0.030694210901856422,
-0.02309887483716011,
-0.07237161695957184,
0.03059827908873558,
0.06912845373153687,
-0.3173801004886627,
-0.00563339376822114,
-0.033535152673721313,
-0.10741215199232101,
-0.0036632781848311424,
-0.015949102118611336,
-0.021284008398652077,
0.04316763952374458,
-0.045153602957725525,
0.07164009660482407,
0.03721078485250473,
0.03701392188668251,
-0.02342846430838108,
-0.09584125131368637,
0.16557367146015167,
0.061120737344026566,
0.1031571626663208,
0.028483672067523003,
0.07179738581180573,
0.056354910135269165,
0.03538937121629715,
-0.09914755821228027,
0.04174334555864334,
0.017470207065343857,
-0.08624560385942459,
-0.055206842720508575,
0.1221022978425026,
-0.002487804275006056,
0.04978169873356819,
0.04114843159914017,
-0.10776795446872711,
0.013356706127524376,
0.06721743941307068,
-0.06379184871912003,
-0.10040238499641418,
-0.010192323476076126,
-0.08860712498426437,
0.1585586965084076,
0.1427941620349884,
-0.015111997723579407,
0.025235068053007126,
-0.0657329112291336,
-0.011311323381960392,
0.04863458499312401,
0.01342752669006586,
-0.017615094780921936,
-0.1905527412891388,
0.029628360643982887,
-0.08262450248003006,
-0.00448707165196538,
-0.22013480961322784,
-0.1032211035490036,
-0.011539378203451633,
-0.05174281448125839,
-0.02464219368994236,
0.06562257558107376,
0.029721714556217194,
0.06395704299211502,
-0.015074829570949078,
-0.03452245146036148,
-0.028508784249424934,
0.09163545072078705,
-0.11285507678985596,
-0.06470565497875214
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters to
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1400k (max: 2000k, i.e., 2M steps).
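The repository name encodes both the seed and the pre-training step, so other checkpoints in the release can be addressed programmatically. The helper below is only an illustrative sketch (the function name is ours, not part of the release), assuming the `google/multiberts-seed_<seed>-step_<step>k` naming pattern used by this checkpoint:
```
# Hypothetical helper (not part of the official release): builds a Hugging Face
# repository name from a seed and a pre-training step, assuming the
# "google/multiberts-seed_<seed>-step_<step>k" pattern used by this checkpoint.
def multiberts_checkpoint_name(seed: int, step_k: int) -> str:
    return f"google/multiberts-seed_{seed}-step_{step_k}k"

print(multiberts_checkpoint_name(2, 1400))  # google/multiberts-seed_2-step_1400k
```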
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, so it is likely that some differences from
the original model have gone unnoticed. After full training, the performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1400k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1400k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
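Because the checkpoint was pre-trained with the MLM objective, it can also be loaded with a masked-language-modelling head. The snippet below is a minimal sketch using the `fill-mask` pipeline; it assumes the uploaded weights include the pre-training heads, and predictions from an intermediate checkpoint may differ noticeably from those of the fully trained model:
```
from transformers import pipeline

# Minimal masked-LM sketch: load the same intermediate checkpoint with an MLM head
# and fill in a masked token. Predictions may be weaker than those of the fully
# trained model, since this is an intermediate checkpoint.
unmasker = pipeline('fill-mask', model='google/multiberts-seed_2-step_1400k')
print(unmasker("The capital of France is [MASK]."))
```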
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1400k"]}
| null |
google/multiberts-seed_2-step_1400k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1400k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1400k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0798455998301506,
0.10067134350538254,
-0.0026154767256230116,
0.04468807950615883,
0.07974309474229813,
-0.011797654442489147,
0.07604966312646866,
0.10198478400707245,
-0.022050000727176666,
0.025722678750753403,
0.07727465033531189,
-0.0036845370195806026,
0.016517452895641327,
0.09241434186697006,
0.026416411623358727,
-0.2266392707824707,
0.020623445510864258,
-0.03319935500621796,
-0.09386685490608215,
0.07592745125293732,
0.10084063559770584,
-0.07865267992019653,
0.04467747360467911,
0.030758148059248924,
-0.11183227598667145,
0.04654795676469803,
0.0035668988712131977,
-0.018109312281012535,
0.1349257230758667,
-0.001633702078834176,
0.05265715345740318,
0.05313805863261223,
0.04612809792160988,
-0.1319858878850937,
0.0069870916195213795,
0.05877838283777237,
0.0586327463388443,
0.04708480462431908,
0.02090364880859852,
0.07856565713882446,
0.004664799198508263,
0.02720341645181179,
0.04443400353193283,
0.02362104132771492,
-0.07616105675697327,
-0.07352098077535629,
-0.1030980795621872,
0.04461072012782097,
0.025296911597251892,
0.0036274802405387163,
0.012281294912099838,
0.11931146681308746,
-0.032280225306749344,
0.04165436699986458,
0.17557764053344727,
-0.334860622882843,
-0.012886162847280502,
0.06723222881555557,
0.039864346385002136,
0.12723447382450104,
-0.006278065498918295,
-0.01867794431746006,
0.07589534670114517,
0.026710325852036476,
0.09000027179718018,
-0.039906587451696396,
0.02707032673060894,
-0.05370756611227989,
-0.1548585146665573,
-0.04645256698131561,
0.08677065372467041,
-0.003480283310636878,
-0.1361168771982193,
-0.03418072685599327,
-0.04617379233241081,
0.03912641480565071,
0.01070303563028574,
-0.03692622110247612,
0.03398556634783745,
0.016489284113049507,
-0.01408481877297163,
-0.013810254633426666,
-0.10423847287893295,
-0.0522613525390625,
0.029775822535157204,
0.0820075124502182,
0.10135076940059662,
0.0685330182313919,
0.0015327776782214642,
0.11038856208324432,
-0.19079004228115082,
-0.05223875120282173,
-0.02837260253727436,
-0.05415581539273262,
-0.05001169815659523,
-0.009853602387011051,
-0.10775525867938995,
-0.03238854184746742,
0.00935499370098114,
0.13119271397590637,
-0.014068218879401684,
0.0264572910964489,
-0.024968404322862625,
0.007772873621433973,
0.05403205007314682,
0.038395337760448456,
-0.010669074952602386,
0.020464720204472542,
0.024677664041519165,
-0.014561283402144909,
-0.02018057182431221,
0.014399579726159573,
0.001945872325450182,
0.021024314686655998,
0.11771821975708008,
0.021285802125930786,
-0.09871649742126465,
0.06611280143260956,
-0.015649665147066116,
-0.04341118782758713,
0.015161514282226562,
-0.09013815969228745,
-0.05509687587618828,
-0.03762591630220413,
0.0006433988455682993,
0.012636722065508366,
-0.0020677531138062477,
-0.005841516423970461,
-0.023662906140089035,
-0.03842611238360405,
-0.08508802205324173,
-0.04764498770236969,
-0.05042847618460655,
-0.1277804970741272,
0.00908169150352478,
-0.1686611920595169,
-0.03763974457979202,
-0.11516834050416946,
-0.18766452372074127,
-0.022715726867318153,
0.06439121067523956,
-0.01123170368373394,
-0.05891109257936478,
0.08392538875341415,
0.043227676302194595,
-0.02801978960633278,
-0.0015319314552471042,
0.06883087754249573,
-0.0006829928606748581,
0.045225534588098526,
-0.02625318244099617,
0.06977210193872452,
0.0017100689001381397,
0.033039312809705734,
-0.05511302128434181,
0.06408955156803131,
-0.17981590330600739,
0.04513569548726082,
-0.07111214846372604,
-0.027596747502684593,
-0.08559021353721619,
-0.03874657675623894,
-0.01268325187265873,
0.004553820472210646,
0.023663658648729324,
0.0759970173239708,
-0.17704221606254578,
-0.02944173291325569,
0.11850769817829132,
-0.16575177013874054,
-0.02373848669230938,
0.07451633363962173,
-0.04741466045379639,
0.1056162640452385,
0.06918681412935257,
0.15758223831653595,
0.004343684762716293,
-0.08242611587047577,
0.05955464392900467,
-0.009242793545126915,
0.008898616768419743,
-0.009989161044359207,
0.07052639871835709,
-0.01985800266265869,
-0.1532038450241089,
0.0344761498272419,
-0.13068051636219025,
-0.0016498238546773791,
-0.07820670306682587,
0.016362257301807404,
-0.011127334088087082,
-0.06599225848913193,
-0.06381857395172119,
-0.0261137243360281,
0.06810247153043747,
-0.07120351493358612,
-0.015205936506390572,
0.046769868582487106,
0.0756424069404602,
-0.07185971736907959,
0.07014653831720352,
-0.010150271467864513,
0.018266549333930016,
-0.08253344148397446,
-0.03875582292675972,
-0.19033174216747284,
0.04711339250206947,
0.09650358557701111,
-0.0006361492560245097,
-0.02164679393172264,
0.13948161900043488,
0.009484121575951576,
0.06945571303367615,
-0.05318189039826393,
0.01622510701417923,
-0.011849567294120789,
-0.004102714825421572,
-0.08909931033849716,
-0.1009429395198822,
-0.0759267807006836,
-0.06496820598840714,
0.09593461453914642,
-0.1292400360107422,
0.023184122517704964,
-0.05842828378081322,
0.0442408062517643,
0.019149813801050186,
-0.08161341398954391,
-0.02018585614860058,
0.011604931205511093,
-0.05971749499440193,
-0.05638762190937996,
0.041613154113292694,
0.0724555253982544,
-0.010449486784636974,
0.09424202144145966,
-0.04568994417786598,
-0.08046701550483704,
0.03142685815691948,
0.09537158161401749,
-0.1071760505437851,
0.007634339854121208,
-0.056907735764980316,
-0.04426174983382225,
-0.0636695846915245,
-0.019979672506451607,
0.07702333480119705,
-0.00952993892133236,
0.1386994570493698,
-0.07515337318181992,
-0.001736957230605185,
0.01528877206146717,
-0.022424161434173584,
-0.02025294490158558,
0.03514004498720169,
0.05886734277009964,
-0.06752157211303711,
0.014510939829051495,
0.035742416977882385,
0.011049563065171242,
0.06853105127811432,
-0.053968802094459534,
-0.08824712783098221,
0.010834152810275555,
0.035291437059640884,
0.02849290333688259,
0.06759289652109146,
-0.019316278398036957,
-0.008860591799020767,
0.03484731912612915,
0.019233688712120056,
0.007047280669212341,
-0.11668884754180908,
0.06118365377187729,
0.0583173893392086,
0.003655154723674059,
0.05256626009941101,
-0.018680311739444733,
-0.038548633456230164,
0.08064153045415878,
0.03684007003903389,
0.006277202628552914,
-0.014843047596514225,
-0.014329221099615097,
-0.11849886178970337,
0.1897120326757431,
-0.06250544637441635,
-0.16012433171272278,
-0.08430696278810501,
-0.1037229374051094,
0.004894754383713007,
0.02563438192009926,
0.03904161602258682,
-0.01769447885453701,
-0.04175478219985962,
-0.12061402201652527,
0.05990612506866455,
-0.03716294467449188,
0.0692112147808075,
0.11020340025424957,
-0.0401933416724205,
0.06107918918132782,
-0.12488622963428497,
-0.005033510271459818,
-0.08126070350408554,
-0.06492845714092255,
0.05848946422338486,
-0.048256367444992065,
0.02682722918689251,
0.10071524977684021,
0.022678453475236893,
-0.020351557061076164,
-0.024460935965180397,
0.20794150233268738,
0.038898248225450516,
0.04308653622865677,
0.13177475333213806,
-0.0639529600739479,
0.058069273829460144,
0.08453462272882462,
0.011511394754052162,
-0.04184991493821144,
0.05042559280991554,
0.04196065291762352,
-0.06752242147922516,
-0.19356456398963928,
-0.024560119956731796,
-0.00878253486007452,
-0.04687678441405296,
0.07445009052753448,
0.034871842712163925,
0.0025615720078349113,
0.06712909042835236,
0.013101471588015556,
0.0643177255988121,
-0.0010080802021548152,
0.09770895540714264,
0.022693028673529625,
-0.032322321087121964,
0.08404302597045898,
-0.021180767565965652,
-0.00774840684607625,
0.08714189380407333,
-0.022657981142401695,
0.28964170813560486,
-0.035681188106536865,
0.012817246839404106,
0.11864734441041946,
0.047089941799640656,
0.06346892565488815,
0.12380640208721161,
-0.06726659834384918,
0.023396510630846024,
-0.07114673405885696,
-0.06029890850186348,
-0.00021017844846937805,
0.04600299894809723,
-0.061592377722263336,
0.01247742772102356,
-0.07544980943202972,
0.020358948037028313,
-0.020230084657669067,
0.30791833996772766,
0.11269887536764145,
-0.10501691699028015,
-0.06362096220254898,
0.004138330463320017,
-0.09959343820810318,
-0.07294648140668869,
0.040840789675712585,
0.0773300975561142,
-0.1329069584608078,
0.007547395769506693,
-0.023992639034986496,
0.07439413666725159,
-0.012719220481812954,
0.017660066485404968,
0.03406992927193642,
0.03799097612500191,
-0.038829583674669266,
0.0097890580072999,
-0.17497891187667847,
0.19486577808856964,
0.0071817790158092976,
0.018708746880292892,
-0.04812317714095116,
0.0328112430870533,
0.0093449167907238,
-0.022978218272328377,
0.060479506850242615,
0.027059998363256454,
-0.02953237108886242,
-0.041647203266620636,
-0.04978811368346214,
0.011990807019174099,
0.07646159827709198,
-0.04159536585211754,
0.10512986779212952,
-0.006858161650598049,
0.041898153722286224,
0.020085256546735764,
0.08587461709976196,
-0.17853350937366486,
-0.0874805599451065,
0.03152124211192131,
-0.05967269837856293,
-0.10066717863082886,
-0.08037928491830826,
-0.09533605724573135,
-0.0001585787395015359,
0.24525980651378632,
-0.11580885946750641,
-0.07635540515184402,
-0.09745343774557114,
0.02940884791314602,
0.10793478786945343,
-0.05004138872027397,
0.029005398973822594,
-0.00501156784594059,
0.1310250461101532,
-0.06516804546117783,
-0.13293452560901642,
0.02144305594265461,
-0.08867309242486954,
-0.16494488716125488,
-0.06625650078058243,
0.11582133919000626,
0.05817173793911934,
0.03569050133228302,
-0.02826547995209694,
0.02182815410196781,
0.03320564329624176,
-0.038977086544036865,
-0.002729391446337104,
0.06481962651014328,
0.09894969314336777,
0.03145580738782883,
-0.11034592986106873,
0.01685440167784691,
-0.06132585182785988,
-0.06384526938199997,
0.0744105875492096,
0.26723840832710266,
-0.06003090739250183,
0.12551109492778778,
0.12322042882442474,
-0.0806865468621254,
-0.15392710268497467,
0.028911715373396873,
0.09242124110460281,
-0.01590311899781227,
0.014460353180766106,
-0.15644624829292297,
0.0887676253914833,
0.1127152293920517,
-0.023742755874991417,
0.006475682836025953,
-0.1881374716758728,
-0.13191212713718414,
0.06379425525665283,
0.09494180977344513,
0.27344831824302673,
-0.060721248388290405,
-0.04489896073937416,
0.018313229084014893,
-0.10018780082464218,
0.010974152944982052,
0.11839437484741211,
0.062046393752098083,
-0.024959327653050423,
-0.07599857449531555,
0.01416232530027628,
-0.03998975083231926,
0.09926305711269379,
0.05388938635587692,
0.055225010961294174,
-0.004552360624074936,
0.020702052861452103,
-0.013486240059137344,
-0.045260295271873474,
0.05976502597332001,
0.028879916295409203,
0.05037363991141319,
-0.0872194841504097,
-0.027856076136231422,
-0.07083039730787277,
0.02736368402838707,
-0.026202045381069183,
-0.07544174790382385,
-0.060858432203531265,
0.07862538844347,
0.049754612147808075,
-0.026281531900167465,
0.016057468950748444,
0.030547451227903366,
0.11804112046957016,
0.16754481196403503,
-0.008218531496822834,
-0.043854400515556335,
-0.05874891206622124,
-0.035340651869773865,
-0.017495950683951378,
0.07263997197151184,
-0.05477828159928322,
0.02753346599638462,
0.06691381335258484,
0.024711083620786667,
0.09627829492092133,
0.05613492801785469,
-0.11352292448282242,
-0.018035635352134705,
0.035058751702308655,
-0.1651800572872162,
0.013588444329798222,
-0.0008160123834386468,
0.01826162450015545,
-0.038174428045749664,
0.024306464940309525,
0.1504114717245102,
-0.06598944216966629,
-0.03541811928153038,
-0.03910967335104942,
0.06754815578460693,
0.01965678669512272,
0.14079855382442474,
0.036145683377981186,
0.03749177232384682,
-0.08170369267463684,
0.12393616139888763,
0.04111354053020477,
-0.04262365773320198,
0.01732822135090828,
-0.02386106550693512,
-0.10994607955217361,
0.012364501133561134,
0.05832478404045105,
0.03512108698487282,
-0.04949890449643135,
-0.01257680356502533,
-0.027349546551704407,
-0.07508469372987747,
0.06048651784658432,
0.18389248847961426,
0.0661703422665596,
0.07270622253417969,
-0.05398990213871002,
-0.03656810149550438,
-0.08058237284421921,
0.04458082094788551,
0.04371443763375282,
0.07311874628067017,
-0.07920730113983154,
0.11013634502887726,
0.01045693177729845,
0.04621399939060211,
-0.0324050635099411,
-0.05446986109018326,
-0.09960553050041199,
-0.055791307240724564,
-0.11035806685686111,
0.008662853389978409,
-0.07485488057136536,
-0.037555571645498276,
-0.0002997790288645774,
-0.008347513154149055,
-0.010115426033735275,
0.048529356718063354,
-0.06176963448524475,
-0.008887073956429958,
-0.024863610044121742,
0.03487367182970047,
-0.06528694182634354,
-0.03752245754003525,
0.031993087381124496,
-0.10100696235895157,
0.09614359587430954,
0.05393263325095177,
0.007395481690764427,
0.006803132127970457,
0.10273922234773636,
-0.021028276532888412,
0.022350601851940155,
0.01396241970360279,
-0.045390743762254715,
-0.08709380030632019,
0.001611063489690423,
-0.009738210588693619,
-0.017288941890001297,
-0.008786579594016075,
0.08971782773733139,
-0.08519110083580017,
0.03560551255941391,
-0.007575140334665775,
-0.007836679928004742,
-0.07420644909143448,
-0.014356985688209534,
0.09892856329679489,
0.0965084657073021,
0.04834706708788872,
-0.08896665275096893,
0.011394760571420193,
-0.1448569893836975,
-0.0384315587580204,
0.005491688847541809,
-0.00848335400223732,
-0.11707314848899841,
-0.009813857264816761,
0.01922699064016342,
-0.0002592343953438103,
0.21063292026519775,
-0.05976101756095886,
-0.01760399155318737,
0.020731858909130096,
-0.09577495604753494,
0.10248289257287979,
-0.023086510598659515,
0.18054594099521637,
-0.009235670790076256,
-0.041543666273355484,
-0.012470642104744911,
0.03927699476480484,
0.021201523020863533,
-0.01807997189462185,
0.179975226521492,
0.1381656974554062,
0.037362147122621536,
0.040915749967098236,
-0.02348780445754528,
0.003843847429379821,
-0.05470655485987663,
-0.02496190369129181,
0.029001612216234207,
0.0383274182677269,
0.018488463014364243,
0.14890551567077637,
0.07351420074701309,
-0.16777655482292175,
0.03517964482307434,
-0.030685290694236755,
-0.03956323862075806,
-0.11612828820943832,
-0.08796077966690063,
-0.030334167182445526,
-0.07549645006656647,
0.006695905700325966,
-0.12351400405168533,
0.007825633510947227,
0.17873218655586243,
0.05571003258228302,
0.026795262470841408,
0.003141069784760475,
-0.12344297766685486,
-0.036147452890872955,
0.049767252057790756,
0.014222358353435993,
0.027778467163443565,
0.06237579137086868,
-0.0032140384428203106,
0.062087103724479675,
0.04003671184182167,
0.01278909482061863,
0.0015220829518511891,
0.08235367387533188,
0.017772292718291283,
0.03854960948228836,
-0.06217602640390396,
-0.004637222737073898,
-0.03882226720452309,
0.06952963769435883,
0.09533515572547913,
0.04739735648036003,
-0.04753043130040169,
-0.008612828329205513,
0.16654454171657562,
-0.04230813682079315,
-0.004473770037293434,
-0.13078457117080688,
0.3337175250053406,
0.013653984293341637,
0.011689091101288795,
0.047227904200553894,
-0.07752174139022827,
-0.05091576650738716,
0.20662368834018707,
0.0897175669670105,
-0.02104160562157631,
-0.021982965990900993,
0.0029808294493705034,
-0.030672740191221237,
-0.020161623135209084,
0.15120020508766174,
0.03449184447526932,
0.12571479380130768,
-0.05135715380311012,
-0.050268642604351044,
-0.030797095969319344,
-0.007809664122760296,
-0.12605531513690948,
0.143406480550766,
-0.032355379313230515,
-0.023411598056554794,
-0.07123034447431564,
0.027854932472109795,
0.07334695756435394,
-0.3271406590938568,
0.0009873202070593834,
-0.03262341022491455,
-0.11000284552574158,
-0.004301175009459257,
-0.018650947138667107,
-0.02159113995730877,
0.0454300157725811,
-0.045736849308013916,
0.06993040442466736,
0.04738700017333031,
0.03444533795118332,
-0.023141736164689064,
-0.09618198126554489,
0.16903863847255707,
0.047179896384477615,
0.09616163372993469,
0.02646343968808651,
0.07465090602636337,
0.05489807948470116,
0.0359966941177845,
-0.0983586236834526,
0.03801041841506958,
0.013212090358138084,
-0.08886625617742538,
-0.05420706048607826,
0.12136281281709671,
-0.0016915167216211557,
0.0350455567240715,
0.04329677298665047,
-0.10947687178850174,
0.00906227994710207,
0.07215022295713425,
-0.06941942125558853,
-0.09778532385826111,
-0.011548815295100212,
-0.08737237751483917,
0.1549721211194992,
0.14118243753910065,
-0.017004135996103287,
0.023533279076218605,
-0.0657673254609108,
-0.009440218098461628,
0.05440327525138855,
0.010488243773579597,
-0.017673436552286148,
-0.19168245792388916,
0.03289926424622536,
-0.08616778999567032,
-0.0049339113757014275,
-0.2284073382616043,
-0.10294021666049957,
-0.008466931991279125,
-0.04995635896921158,
-0.023951420560479164,
0.06434997916221619,
0.030337901785969734,
0.0666964128613472,
-0.015371322631835938,
-0.03915100544691086,
-0.02952423319220543,
0.09112296998500824,
-0.1078813448548317,
-0.06552460044622421
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 140k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters to
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 140k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, so it is likely that some differences from
the original model have gone unnoticed. After full training, the performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_140k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_140k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_140k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_140k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
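As a further example, the hidden states returned by the PyTorch snippet above can be pooled into a fixed-size sentence vector. This is only a sketch: mean pooling over the last hidden layer is one common choice, not an official recommendation, and it reuses the `model` and `encoded_input` objects defined above:
```
import torch

# Mean-pool the final hidden layer over the token dimension to obtain one
# 768-dimensional vector per sentence (a common, but unofficial, pooling choice).
with torch.no_grad():
    outputs = model(**encoded_input)
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```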
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_140k"]}
| null |
google/multiberts-seed_2-step_140k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_140k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 140k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 140k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 140k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 140k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 140k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07770475000143051,
0.10056504607200623,
-0.002634833799675107,
0.04189005866646767,
0.07965799421072006,
-0.012383861467242241,
0.0769602581858635,
0.10265593975782394,
-0.020472118631005287,
0.026183323934674263,
0.07824911177158356,
0.0019986520055681467,
0.016457557678222656,
0.09407862275838852,
0.025092002004384995,
-0.22779475152492523,
0.020734263584017754,
-0.03160152584314346,
-0.09158185869455338,
0.07604394853115082,
0.1009778380393982,
-0.07888270169496536,
0.04403143748641014,
0.031493473798036575,
-0.1129436120390892,
0.04606996849179268,
0.004159827250987291,
-0.01718256250023842,
0.13398899137973785,
-0.0008367881528101861,
0.052835021167993546,
0.05317385122179985,
0.0444546714425087,
-0.13447675108909607,
0.007319858763366938,
0.05949346348643303,
0.05858587101101875,
0.04750886186957359,
0.02177811786532402,
0.0783245861530304,
0.004969126544892788,
0.026663674041628838,
0.04326851665973663,
0.02339746616780758,
-0.07538340985774994,
-0.07752912491559982,
-0.10214712470769882,
0.03967700153589249,
0.025169625878334045,
0.0034886719658970833,
0.011506152339279652,
0.12139248102903366,
-0.03307664021849632,
0.04199400544166565,
0.17807355523109436,
-0.3409729599952698,
-0.012102576903998852,
0.0679185763001442,
0.04251910373568535,
0.12209656834602356,
-0.0056584798730909824,
-0.017261812463402748,
0.07486102730035782,
0.025588516145944595,
0.092735156416893,
-0.03927427530288696,
0.027110841125249863,
-0.05343785136938095,
-0.15624184906482697,
-0.04631870612502098,
0.08799108117818832,
-0.0036928218323737383,
-0.13815610110759735,
-0.03515752777457237,
-0.0457414947450161,
0.04314018785953522,
0.010436831042170525,
-0.036846887320280075,
0.0330662839114666,
0.01866322010755539,
-0.016536924988031387,
-0.013987837359309196,
-0.10644270479679108,
-0.05402682349085808,
0.03060123324394226,
0.08074067533016205,
0.10167977213859558,
0.06839066743850708,
0.0011305673979222775,
0.11181607097387314,
-0.19270069897174835,
-0.05159732326865196,
-0.028012892231345177,
-0.0536826029419899,
-0.04926219582557678,
-0.009351317770779133,
-0.10913495719432831,
-0.037314917892217636,
0.010965768247842789,
0.13083072006702423,
-0.009947801008820534,
0.02549147605895996,
-0.02823966182768345,
0.008507090620696545,
0.0554334931075573,
0.04283268004655838,
-0.009010815061628819,
0.02428313158452511,
0.024439893662929535,
-0.016669919714331627,
-0.020604364573955536,
0.014396223239600658,
0.00018628516409080476,
0.022867824882268906,
0.11785046011209488,
0.022012289613485336,
-0.10180673003196716,
0.06796116381883621,
-0.01757727935910225,
-0.04457894340157509,
0.016632214188575745,
-0.08907083421945572,
-0.05462460219860077,
-0.035770442336797714,
0.0017023792024701834,
0.015265173278748989,
-0.003154144622385502,
-0.00511776702478528,
-0.02249538153409958,
-0.03812570497393608,
-0.08328521251678467,
-0.04577547684311867,
-0.051495857536792755,
-0.12864314019680023,
0.008913933299481869,
-0.17213870584964752,
-0.036631111055612564,
-0.1144222542643547,
-0.18648885190486908,
-0.020615950226783752,
0.06399484723806381,
-0.011225077323615551,
-0.057069748640060425,
0.08253870159387589,
0.04353058338165283,
-0.02740911766886711,
-0.0011971909552812576,
0.0727979764342308,
-0.002053249394521117,
0.04515215754508972,
-0.02568131871521473,
0.0696767196059227,
0.002266919007524848,
0.032959192991256714,
-0.0547306090593338,
0.06432566791772842,
-0.1718686819076538,
0.04439542815089226,
-0.07152334600687027,
-0.032334424555301666,
-0.0867268294095993,
-0.0368848517537117,
-0.010555978864431381,
0.005048027262091637,
0.02256755344569683,
0.07379384338855743,
-0.18110394477844238,
-0.029836704954504967,
0.1253596395254135,
-0.16382010281085968,
-0.02130582556128502,
0.0756320059299469,
-0.04814503341913223,
0.106171615421772,
0.06927810609340668,
0.15571711957454681,
0.0005040651885792613,
-0.08528647571802139,
0.06001267582178116,
-0.008234123699367046,
0.010189983062446117,
-0.006497317459434271,
0.07140211015939713,
-0.018847813829779625,
-0.15453025698661804,
0.03516177460551262,
-0.13246549665927887,
-0.002002654131501913,
-0.07821033149957657,
0.017724016681313515,
-0.012130706571042538,
-0.06572133302688599,
-0.061456385999917984,
-0.027185721322894096,
0.06844744831323624,
-0.07065021246671677,
-0.01779095269739628,
0.04623705893754959,
0.07454050332307816,
-0.07387851923704147,
0.06797460466623306,
-0.012073973193764687,
0.017019370570778847,
-0.08387789130210876,
-0.038847316056489944,
-0.18949049711227417,
0.0489102378487587,
0.0995146781206131,
0.002089726272970438,
-0.02236391231417656,
0.14366792142391205,
0.009030214510858059,
0.06952190399169922,
-0.05157392844557762,
0.015636399388313293,
-0.013225890696048737,
-0.0043851593509316444,
-0.08888062089681625,
-0.09949073195457458,
-0.07676196098327637,
-0.06706101447343826,
0.09380394965410233,
-0.1310162991285324,
0.022075241431593895,
-0.057651352137327194,
0.04404187574982643,
0.02030079998075962,
-0.08106592297554016,
-0.01936851255595684,
0.011277474462985992,
-0.06016688793897629,
-0.055745817720890045,
0.04164471477270126,
0.0720042884349823,
-0.011445961892604828,
0.09636828303337097,
-0.04747316241264343,
-0.08151450008153915,
0.030879775062203407,
0.09505882859230042,
-0.10600738227367401,
0.007293861825019121,
-0.05709114670753479,
-0.04343164339661598,
-0.06358831375837326,
-0.01934157684445381,
0.07777750492095947,
-0.009367105551064014,
0.1391948163509369,
-0.07501494139432907,
-0.0036726410035043955,
0.014028850011527538,
-0.023371616378426552,
-0.020438311621546745,
0.03452235087752342,
0.06094883754849434,
-0.07305657863616943,
0.01542294304817915,
0.039624009281396866,
0.009018156677484512,
0.07182551920413971,
-0.05457137152552605,
-0.08985695987939835,
0.010337593033909798,
0.03627348318696022,
0.028979992493987083,
0.06773635745048523,
-0.021186083555221558,
-0.009289571084082127,
0.03520495817065239,
0.01731606386601925,
0.006369544193148613,
-0.11807884275913239,
0.061404988169670105,
0.05809490382671356,
0.0015340319368988276,
0.05853113904595375,
-0.017595376819372177,
-0.03953556343913078,
0.08011899888515472,
0.03647076338529587,
0.006372931879013777,
-0.01467759907245636,
-0.014093109406530857,
-0.11806025356054306,
0.19042088091373444,
-0.06280422955751419,
-0.1630164384841919,
-0.08131147176027298,
-0.10138308256864548,
0.00722249411046505,
0.02666272409260273,
0.03964735195040703,
-0.017406735569238663,
-0.0424971729516983,
-0.12208659201860428,
0.06183456629514694,
-0.03877313807606697,
0.06933993101119995,
0.10787662118673325,
-0.04149531200528145,
0.05981753394007683,
-0.1250336766242981,
-0.005185763817280531,
-0.08083868771791458,
-0.06922638416290283,
0.05985438823699951,
-0.048460010439157486,
0.02575698308646679,
0.09847821295261383,
0.02368200570344925,
-0.0199549812823534,
-0.02474808879196644,
0.20523066818714142,
0.04088669642806053,
0.042141396552324295,
0.13323082029819489,
-0.0654463917016983,
0.057787153869867325,
0.07928424328565598,
0.009867174550890923,
-0.04212222248315811,
0.05041297525167465,
0.04357424005866051,
-0.06768417358398438,
-0.1934409737586975,
-0.023410173133015633,
-0.007503671571612358,
-0.04556842893362045,
0.07357870787382126,
0.03520726040005684,
0.003183288499712944,
0.06886190176010132,
0.012369363568723202,
0.06323060393333435,
-0.001425186637789011,
0.09794184565544128,
0.02207041159272194,
-0.03329060599207878,
0.08630327135324478,
-0.02096235565841198,
-0.009031851775944233,
0.08587776124477386,
-0.020415492355823517,
0.28572192788124084,
-0.033343520015478134,
0.01335757877677679,
0.12002385407686234,
0.04678020626306534,
0.06429175287485123,
0.12438774108886719,
-0.06773164868354797,
0.02265256457030773,
-0.0726792961359024,
-0.060664743185043335,
-0.0010282035218551755,
0.04592520743608475,
-0.05934973433613777,
0.01048869825899601,
-0.07421772927045822,
0.01796397939324379,
-0.021419577300548553,
0.30750763416290283,
0.1151440218091011,
-0.10446189343929291,
-0.06347799301147461,
0.0037744808942079544,
-0.0982048287987709,
-0.07331550121307373,
0.039691053330898285,
0.0768490731716156,
-0.13451144099235535,
0.006310688331723213,
-0.026438700035214424,
0.07467973232269287,
-0.013799958862364292,
0.017793113365769386,
0.033132217824459076,
0.037685591727495193,
-0.037263352423906326,
0.01084712240844965,
-0.17983874678611755,
0.19117240607738495,
0.007525213528424501,
0.019281992688775063,
-0.05023183301091194,
0.03358861431479454,
0.007841336540877819,
-0.027499588206410408,
0.05916544795036316,
0.024189025163650513,
-0.029674343764781952,
-0.04420527443289757,
-0.0505807101726532,
0.013020177371799946,
0.07808414101600647,
-0.04383233189582825,
0.10644825547933578,
-0.0076629179529845715,
0.041724998503923416,
0.021135523915290833,
0.08934327960014343,
-0.18097537755966187,
-0.08831128478050232,
0.031126346439123154,
-0.05734352767467499,
-0.09963131695985794,
-0.07993286848068237,
-0.09425424784421921,
0.0016796588897705078,
0.2498558908700943,
-0.1195598691701889,
-0.07466056197881699,
-0.09620554000139236,
0.029858700931072235,
0.10665994882583618,
-0.04934222996234894,
0.028657546266913414,
-0.007162780500948429,
0.13257679343223572,
-0.06449127197265625,
-0.13502994179725647,
0.021893395110964775,
-0.08997286856174469,
-0.16692908108234406,
-0.06681335717439651,
0.11727870255708694,
0.058849141001701355,
0.03471602872014046,
-0.02562839351594448,
0.0216938816010952,
0.03334301337599754,
-0.0385233610868454,
-0.0009297064389102161,
0.06861921399831772,
0.0986965224146843,
0.030659137293696404,
-0.11278052628040314,
0.019899144768714905,
-0.060943953692913055,
-0.0640595406293869,
0.07708266377449036,
0.2653699517250061,
-0.05993033945560455,
0.12724153697490692,
0.1159600019454956,
-0.08066988736391068,
-0.15228131413459778,
0.026135258376598358,
0.09503474831581116,
-0.015359422191977501,
0.01631764881312847,
-0.160436749458313,
0.08853017538785934,
0.11209899187088013,
-0.024099335074424744,
0.009963211603462696,
-0.1879992038011551,
-0.12990663945674896,
0.06525816023349762,
0.09613823890686035,
0.2721916437149048,
-0.0620628297328949,
-0.04420868307352066,
0.017967183142900467,
-0.09536578506231308,
0.020694652572274208,
0.11609229445457458,
0.06374071538448334,
-0.02431465871632099,
-0.07539556175470352,
0.014473176561295986,
-0.03928669914603233,
0.09854589402675629,
0.05092979967594147,
0.054985228925943375,
-0.0035447957925498486,
0.022448087111115456,
-0.012711303308606148,
-0.04366514831781387,
0.061556167900562286,
0.02292398363351822,
0.049954935908317566,
-0.08685598522424698,
-0.029641715809702873,
-0.06911477446556091,
0.028109444305300713,
-0.02551775611937046,
-0.07566745579242706,
-0.05957694351673126,
0.07563665509223938,
0.04827636852860451,
-0.02470587193965912,
0.01632949337363243,
0.03095012903213501,
0.11772235482931137,
0.16667278110980988,
-0.007754485588520765,
-0.03726738318800926,
-0.05856624245643616,
-0.03831440210342407,
-0.017521720379590988,
0.07496737688779831,
-0.05209846422076225,
0.02802078053355217,
0.06435911357402802,
0.023089954629540443,
0.09699078649282455,
0.055225782096385956,
-0.11599940806627274,
-0.01717664673924446,
0.03257380798459053,
-0.1663254201412201,
0.013851940631866455,
0.00020165614841971546,
0.022525208070874214,
-0.03586159646511078,
0.026647835969924927,
0.1516764909029007,
-0.06432321667671204,
-0.03614572435617447,
-0.039952706545591354,
0.0672154352068901,
0.021223215386271477,
0.14023616909980774,
0.03430354222655296,
0.037116456776857376,
-0.08038143813610077,
0.12151926010847092,
0.03952234983444214,
-0.039117321372032166,
0.020349888131022453,
-0.02933325432240963,
-0.10797808319330215,
0.011617864482104778,
0.05974027141928673,
0.03635568916797638,
-0.04760558903217316,
-0.01404202077537775,
-0.02567368559539318,
-0.07379865646362305,
0.06019168347120285,
0.18527722358703613,
0.06689350306987762,
0.07624882459640503,
-0.05533604696393013,
-0.03634955734014511,
-0.0796508640050888,
0.044114407151937485,
0.0432903952896595,
0.07311708480119705,
-0.07923971116542816,
0.10976113379001617,
0.010879572480916977,
0.04581350460648537,
-0.03270566090941429,
-0.05461452156305313,
-0.10016924887895584,
-0.05691229924559593,
-0.10988789051771164,
0.011697175912559032,
-0.07283465564250946,
-0.0391746461391449,
0.0011504123685881495,
-0.007404467556625605,
-0.00672675808891654,
0.04883871600031853,
-0.061437077820301056,
-0.008826159872114658,
-0.026626206934452057,
0.03455892950296402,
-0.0671621561050415,
-0.03819829225540161,
0.03152548894286156,
-0.10185923427343369,
0.09574341028928757,
0.054778147488832474,
0.008096791803836823,
0.008722690865397453,
0.09146527945995331,
-0.019518699496984482,
0.024352271109819412,
0.013707391917705536,
-0.04817518964409828,
-0.0851125419139862,
0.002472404157742858,
-0.009620090015232563,
-0.016422521322965622,
-0.010434391908347607,
0.09294626116752625,
-0.08580198138952255,
0.03252177685499191,
-0.0068140714429318905,
-0.007050594314932823,
-0.07252270728349686,
-0.01437566988170147,
0.09574684500694275,
0.09854716807603836,
0.04754676669836044,
-0.08979783952236176,
0.012730764225125313,
-0.14376579225063324,
-0.037900812923908234,
0.0047521330416202545,
-0.00773548474535346,
-0.11975690722465515,
-0.009901261888444424,
0.01839223876595497,
-0.0014284771168604493,
0.21198002994060516,
-0.056813038885593414,
-0.015691597014665604,
0.017824148759245872,
-0.09911985695362091,
0.11144907027482986,
-0.022590767592191696,
0.18286341428756714,
-0.008697562851011753,
-0.040638267993927,
-0.01569310389459133,
0.03864995017647743,
0.02008483000099659,
-0.021836787462234497,
0.17701923847198486,
0.1374215930700302,
0.028685513883829117,
0.04064115881919861,
-0.024870609864592552,
0.0026526611763983965,
-0.054031483829021454,
-0.026371419429779053,
0.028734240680933,
0.03836030885577202,
0.017285097390413284,
0.15666399896144867,
0.07087991386651993,
-0.16774731874465942,
0.033157531172037125,
-0.02848123013973236,
-0.03893309086561203,
-0.11720690876245499,
-0.09618660807609558,
-0.033130355179309845,
-0.07343745976686478,
0.0070151411928236485,
-0.1230820044875145,
0.007856231182813644,
0.17540818452835083,
0.05511748790740967,
0.025542225688695908,
0.0015755961649119854,
-0.11862929165363312,
-0.03693484887480736,
0.05225657671689987,
0.014317529276013374,
0.02517733722925186,
0.06185052916407585,
-0.0010630277683958411,
0.06357953697443008,
0.0414724163711071,
0.013705774210393429,
0.0014693740522488952,
0.08321794867515564,
0.017714668065309525,
0.03801663592457771,
-0.06221381202340126,
-0.004836471285670996,
-0.03847116604447365,
0.07030831277370453,
0.09410760551691055,
0.04828568920493126,
-0.0487460158765316,
-0.008470028638839722,
0.16445334255695343,
-0.04081769287586212,
-0.0009574590367265046,
-0.12971992790699005,
0.33409637212753296,
0.011510846205055714,
0.013556621968746185,
0.04724637791514397,
-0.07559817284345627,
-0.04980592429637909,
0.20369642972946167,
0.08760427683591843,
-0.02100978046655655,
-0.020793886855244637,
0.0023464225232601166,
-0.03071661852300167,
-0.020285068079829216,
0.1514962911605835,
0.035175904631614685,
0.12744155526161194,
-0.05345670506358147,
-0.05205536261200905,
-0.030031029134988785,
-0.0081711420789361,
-0.12715986371040344,
0.1383049339056015,
-0.031838931143283844,
-0.02416703850030899,
-0.07087812572717667,
0.026274889707565308,
0.07382932305335999,
-0.3245113492012024,
-0.000947608845308423,
-0.030143141746520996,
-0.1089329943060875,
-0.00402560131624341,
-0.01975978910923004,
-0.021696772426366806,
0.04521886259317398,
-0.04735011234879494,
0.06983564049005508,
0.04790734499692917,
0.03449200093746185,
-0.023464636877179146,
-0.09002801775932312,
0.16690288484096527,
0.043177224695682526,
0.09483041614294052,
0.02625395357608795,
0.07751638442277908,
0.05435226112604141,
0.03557778522372246,
-0.09521789103746414,
0.0411522202193737,
0.013109470717608929,
-0.09125224500894547,
-0.05289285257458687,
0.12093734741210938,
-0.0012235019821673632,
0.03663860261440277,
0.04348769038915634,
-0.10851433873176575,
0.011459290981292725,
0.07045870274305344,
-0.07101427763700485,
-0.0968380942940712,
-0.010256201960146427,
-0.0898362472653389,
0.1545625478029251,
0.1418645828962326,
-0.018074821680784225,
0.022376179695129395,
-0.06710587441921234,
-0.006626596674323082,
0.05432656779885292,
0.008249347098171711,
-0.018305981531739235,
-0.1891273856163025,
0.03275264799594879,
-0.08234531432390213,
-0.004960750695317984,
-0.22733287513256073,
-0.10230106860399246,
-0.00913187675178051,
-0.04917314276099205,
-0.02407091110944748,
0.06097213178873062,
0.02884332463145256,
0.06740954518318176,
-0.01636236160993576,
-0.04200199991464615,
-0.029281185939908028,
0.08992714434862137,
-0.10878477990627289,
-0.06544115394353867
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1500k (max: 2000k, i.e., 2M steps).
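If you want to track how representations evolve over pre-training for this seed, the released checkpoints follow the repository naming pattern `google/multiberts-seed_2-step_<STEP>k`. The sketch below is only illustrative: the two step values are taken from other checkpoints in this collection (140k and this card's 1500k), and you should substitute whichever steps you actually need.

```
from transformers import BertModel

# Illustrative steps only; substitute the intermediate checkpoints you need.
steps = ["140k", "1500k"]
checkpoints = {
    step: BertModel.from_pretrained(f"google/multiberts-seed_2-step_{step}")
    for step in steps
}
```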
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1500k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1500k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
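The `output` object returned by the snippets above is a standard `transformers` model output. Here is a minimal, self-contained sketch of inspecting it in PyTorch; it relies only on the documented `BertModel` output fields (`last_hidden_state` and `pooler_output`) and assumes nothing specific to this checkpoint:

```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1500k')
model = BertModel.from_pretrained('google/multiberts-seed_2-step_1500k')

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# Per-token representations: (batch_size, sequence_length, hidden_size=768)
print(output.last_hidden_state.shape)
# [CLS]-derived sentence vector: (batch_size, hidden_size=768)
print(output.pooler_output.shape)
```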
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1500k"]}
| null |
google/multiberts-seed_2-step_1500k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1500k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1500k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08090256154537201,
0.09823481738567352,
-0.002780925715342164,
0.04552459716796875,
0.08324122428894043,
-0.014097125269472599,
0.07506359368562698,
0.10214608162641525,
-0.02118564583361149,
0.024871258065104485,
0.07788797467947006,
-0.00040252311737276614,
0.019194945693016052,
0.09430275857448578,
0.02198762446641922,
-0.21682676672935486,
0.02132822759449482,
-0.03511376678943634,
-0.1053234413266182,
0.07478383928537369,
0.09984473139047623,
-0.08079405128955841,
0.044989898800849915,
0.029217738658189774,
-0.11503975093364716,
0.047862742096185684,
0.0037266763392835855,
-0.01680578850209713,
0.13147473335266113,
0.0034038894809782505,
0.05077558755874634,
0.055181171745061874,
0.04479223117232323,
-0.13632988929748535,
0.007569241337478161,
0.05722976475954056,
0.059218816459178925,
0.04702925309538841,
0.024327928200364113,
0.07823416590690613,
-0.007454389240592718,
0.02939108945429325,
0.041755419224500656,
0.026121102273464203,
-0.07468809932470322,
-0.052856869995594025,
-0.10235185921192169,
0.04867654666304588,
0.02535729855298996,
-0.0005112991784699261,
0.014624632894992828,
0.1124814823269844,
-0.03520343080163002,
0.042930133640766144,
0.1804618388414383,
-0.3244260549545288,
-0.014529017731547356,
0.07669244706630707,
0.03720211982727051,
0.1285557895898819,
-0.001488359528593719,
-0.016048520803451538,
0.07644680142402649,
0.024201462045311928,
0.08954503387212753,
-0.04032329097390175,
0.027155645191669464,
-0.05304906144738197,
-0.15460282564163208,
-0.04844556003808975,
0.08514747023582458,
-0.003795234952121973,
-0.13606372475624084,
-0.031552765518426895,
-0.044321026653051376,
0.038445983082056046,
0.010541781783103943,
-0.03977327421307564,
0.032323312014341354,
0.015613864175975323,
-0.010757343843579292,
-0.016039138659834862,
-0.10061399638652802,
-0.05656551569700241,
0.035348497331142426,
0.0842147171497345,
0.1015963926911354,
0.07049471139907837,
-0.0009533982956781983,
0.10659535974264145,
-0.19641771912574768,
-0.04986730217933655,
-0.030578937381505966,
-0.05313738062977791,
-0.05033230409026146,
-0.011849512346088886,
-0.10707349330186844,
-0.04801955074071884,
0.010736961849033833,
0.13380300998687744,
-0.0143357552587986,
0.02545536495745182,
-0.028393801301717758,
0.007065474521368742,
0.05503708869218826,
0.03704333305358887,
-0.014207462780177593,
0.0240262933075428,
0.026521913707256317,
-0.010380520485341549,
-0.025879373773932457,
0.014004180207848549,
0.0011280798353254795,
0.022101903334259987,
0.11613065749406815,
0.025721237063407898,
-0.1015448346734047,
0.0647735521197319,
-0.0167987197637558,
-0.04534521326422691,
0.017722753807902336,
-0.08782816678285599,
-0.05676627904176712,
-0.03967198729515076,
0.0006227431003935635,
0.0033543920144438744,
-0.002754346700385213,
-0.003713797079399228,
-0.025837412104010582,
-0.03146245703101158,
-0.08333335816860199,
-0.045909855514764786,
-0.05202765390276909,
-0.13054920732975006,
0.007560623809695244,
-0.17966052889823914,
-0.03601062297821045,
-0.1161666288971901,
-0.1921185553073883,
-0.02115384116768837,
0.06480765342712402,
-0.013746640644967556,
-0.06165661662817001,
0.08226978778839111,
0.03979409113526344,
-0.028667867183685303,
0.000004205243840260664,
0.07355396449565887,
0.000864511588588357,
0.04312949255108833,
-0.026078345254063606,
0.07019147276878357,
0.005529540125280619,
0.03533477336168289,
-0.05162340775132179,
0.06175076216459274,
-0.17586347460746765,
0.0505293570458889,
-0.07494334131479263,
-0.02751798927783966,
-0.08578314632177353,
-0.033797916024923325,
-0.014885997399687767,
0.004371939226984978,
0.028493667021393776,
0.0790763571858406,
-0.18370653688907623,
-0.02878919616341591,
0.11980997025966644,
-0.16487465798854828,
-0.025545407086610794,
0.07350564002990723,
-0.047191962599754333,
0.1028747409582138,
0.06695301830768585,
0.15274585783481598,
0.004536193795502186,
-0.08027263730764389,
0.06254053860902786,
-0.010138175450265408,
0.00803358294069767,
-0.013070865534245968,
0.07089430093765259,
-0.018361400812864304,
-0.16037915647029877,
0.03429148718714714,
-0.13003183901309967,
0.0007446389063261449,
-0.08122286945581436,
0.015443295240402222,
-0.01180358324199915,
-0.07241746783256531,
-0.0657605528831482,
-0.028794951736927032,
0.06814519315958023,
-0.07053584605455399,
-0.012120429426431656,
0.046289823949337006,
0.07104913145303726,
-0.06747294217348099,
0.06711707264184952,
-0.010090605355799198,
0.01790783368051052,
-0.08949152380228043,
-0.03744116798043251,
-0.19239665567874908,
0.060137681663036346,
0.09835896641016006,
0.0024113720282912254,
-0.020424285903573036,
0.14037179946899414,
0.009038865566253662,
0.06671568006277084,
-0.05254078656435013,
0.016727017238736153,
-0.012676884420216084,
-0.005862714257091284,
-0.08880525082349777,
-0.09599459171295166,
-0.07092557102441788,
-0.06466459482908249,
0.07431360334157944,
-0.12124691158533096,
0.024378027766942978,
-0.06009168550372124,
0.041081465780735016,
0.021696340292692184,
-0.0836866945028305,
-0.017547141760587692,
0.014201398007571697,
-0.05806184187531471,
-0.05483368784189224,
0.040697235614061356,
0.07047731429338455,
-0.004528685938566923,
0.09187623858451843,
-0.037220872938632965,
-0.07809749990701675,
0.0283798947930336,
0.1016213446855545,
-0.10709664970636368,
0.011647265404462814,
-0.05547022819519043,
-0.04117799177765846,
-0.06839444488286972,
-0.017954520881175995,
0.07457278668880463,
-0.0058622779324650764,
0.1379607766866684,
-0.07833260297775269,
-0.000042171650420641527,
0.018332237377762794,
-0.02008720114827156,
-0.021162111312150955,
0.036045003682374954,
0.07195622473955154,
-0.06536506861448288,
0.01467195339500904,
0.028891725465655327,
0.012275573797523975,
0.06689818203449249,
-0.05131121352314949,
-0.0860801637172699,
0.011629398912191391,
0.03739297017455101,
0.031161006540060043,
0.06769958883523941,
-0.025215672329068184,
-0.008746429346501827,
0.0326162651181221,
0.018432622775435448,
0.007866065949201584,
-0.11629275232553482,
0.060134369879961014,
0.06129477918148041,
0.001991651486605406,
0.05395721644163132,
-0.01754739135503769,
-0.037617139518260956,
0.08044134080410004,
0.03499400615692139,
0.007925521582365036,
-0.016712216660380363,
-0.015209140256047249,
-0.11822176724672318,
0.18805071711540222,
-0.06265873461961746,
-0.1733606457710266,
-0.08351520448923111,
-0.09292306005954742,
0.0028498524334281683,
0.025416169315576553,
0.038768164813518524,
-0.019440488889813423,
-0.04441889747977257,
-0.12011217325925827,
0.06730686873197556,
-0.03749720752239227,
0.06985288858413696,
0.11674684286117554,
-0.03994322195649147,
0.059530094265937805,
-0.12613658607006073,
-0.0043602087534964085,
-0.08049798011779785,
-0.0669940859079361,
0.05715414509177208,
-0.04271475970745087,
0.029963674023747444,
0.09657756239175797,
0.01863773725926876,
-0.018527615815401077,
-0.025338653475046158,
0.21513545513153076,
0.042301371693611145,
0.04011666402220726,
0.13066262006759644,
-0.06460035592317581,
0.05668704956769943,
0.0903453603386879,
0.012853823602199554,
-0.04408175125718117,
0.05209137871861458,
0.04700513556599617,
-0.06741610169410706,
-0.19305400550365448,
-0.021800342947244644,
-0.00925610214471817,
-0.04522892087697983,
0.07201585173606873,
0.03406189754605293,
0.004542144946753979,
0.06815163046121597,
0.013156330212950706,
0.06312786042690277,
-0.0014707411173731089,
0.09386304765939713,
0.01353191677480936,
-0.03262563794851303,
0.08623701333999634,
-0.017878089100122452,
-0.006186080630868673,
0.08579284697771072,
-0.021993299946188927,
0.28750747442245483,
-0.03439171239733696,
0.01149466447532177,
0.1201760545372963,
0.04254446551203728,
0.06573802977800369,
0.13018788397312164,
-0.07098188251256943,
0.021107440814375877,
-0.06940645724534988,
-0.05703737959265709,
-0.00454326905310154,
0.04256822541356087,
-0.05833839625120163,
0.015389573760330677,
-0.07575508952140808,
0.03133611008524895,
-0.01974743977189064,
0.3098337948322296,
0.10608821362257004,
-0.09312597662210464,
-0.06056399643421173,
0.001988127361983061,
-0.09949196875095367,
-0.07360566407442093,
0.04207092151045799,
0.08374136686325073,
-0.13147231936454773,
0.003485203953459859,
-0.026990098878741264,
0.07771597057580948,
-0.008403285406529903,
0.01756865158677101,
0.030472740530967712,
0.039577893912792206,
-0.039128925651311874,
0.008036555722355843,
-0.1910228729248047,
0.19184426963329315,
0.007914621382951736,
0.019521260634064674,
-0.048259150236845016,
0.03432799503207207,
0.006465424317866564,
-0.02782497927546501,
0.063021220266819,
0.02710792049765587,
-0.025164887309074402,
-0.04879891499876976,
-0.05122266337275505,
0.01112157478928566,
0.07615412771701813,
-0.043078262358903885,
0.10128819197416306,
-0.005590819288045168,
0.041763778775930405,
0.019518347457051277,
0.09452252835035324,
-0.18852385878562927,
-0.08976143598556519,
0.03209059685468674,
-0.06480119377374649,
-0.10491635650396347,
-0.08020836114883423,
-0.09376371651887894,
-0.0007805913919582963,
0.2401823103427887,
-0.12261192500591278,
-0.07637585699558258,
-0.09789057075977325,
0.032173071056604385,
0.10524021834135056,
-0.051063138991594315,
0.03090042620897293,
-0.0076295193284749985,
0.12976352870464325,
-0.07099690288305283,
-0.13202878832817078,
0.018405595794320107,
-0.09232378005981445,
-0.16342060267925262,
-0.06483369320631027,
0.11222925037145615,
0.05959651619195938,
0.03500283882021904,
-0.03079560585319996,
0.019930537790060043,
0.04147296026349068,
-0.037893619388341904,
-0.002797007095068693,
0.06612994521856308,
0.08907833695411682,
0.03464888781309128,
-0.106687530875206,
0.01485081110149622,
-0.06219693273305893,
-0.0666956752538681,
0.07535931468009949,
0.26288264989852905,
-0.058156825602054596,
0.1276913285255432,
0.11619749665260315,
-0.08121146261692047,
-0.1598302721977234,
0.03230305016040802,
0.09704669564962387,
-0.01457219012081623,
0.013165120035409927,
-0.15700985491275787,
0.08698339015245438,
0.10907138139009476,
-0.021293293684720993,
0.0019720238633453846,
-0.19121403992176056,
-0.1341003179550171,
0.0618211068212986,
0.09895306080579758,
0.27482545375823975,
-0.05792641639709473,
-0.045187126845121384,
0.021574223414063454,
-0.10049638897180557,
0.01988326944410801,
0.13428783416748047,
0.05950203537940979,
-0.02538446895778179,
-0.0802239403128624,
0.01654159277677536,
-0.041194964200258255,
0.09421607851982117,
0.051517680287361145,
0.05895378440618515,
-0.0023479729425162077,
0.014417354017496109,
-0.025667304173111916,
-0.04533333703875542,
0.060773495584726334,
0.023160725831985474,
0.052790310233831406,
-0.07717564702033997,
-0.028080638498067856,
-0.06916425377130508,
0.026036038994789124,
-0.02706153132021427,
-0.07489075511693954,
-0.05903457850217819,
0.0792667418718338,
0.049178458750247955,
-0.024798264726996422,
0.021702831611037254,
0.03136434406042099,
0.11580780148506165,
0.17093870043754578,
-0.00548880361020565,
-0.057476747781038284,
-0.05545147508382797,
-0.035708438605070114,
-0.016208836808800697,
0.07619070261716843,
-0.05248793959617615,
0.03144359588623047,
0.06725870072841644,
0.024285778403282166,
0.09972263872623444,
0.05506390705704689,
-0.11172334849834442,
-0.02090456895530224,
0.03543887659907341,
-0.16176734864711761,
0.0049841031432151794,
-0.0040283240377902985,
0.021103914827108383,
-0.037179846316576004,
0.020546134561300278,
0.15002156794071198,
-0.0682426169514656,
-0.03559916466474533,
-0.041988421231508255,
0.06993814557790756,
0.023265907540917397,
0.14601127803325653,
0.03378579393029213,
0.040072519332170486,
-0.08240167796611786,
0.12709859013557434,
0.03923500329256058,
-0.0387350395321846,
0.017748836427927017,
-0.028151484206318855,
-0.10740073025226593,
0.014290105551481247,
0.06578207015991211,
0.04038491100072861,
-0.04965071752667427,
-0.009470201097428799,
-0.026414060965180397,
-0.07526792585849762,
0.06332442164421082,
0.1948186159133911,
0.06390687823295593,
0.0688130259513855,
-0.05280466750264168,
-0.03576549142599106,
-0.07720047235488892,
0.04969169944524765,
0.044755905866622925,
0.07179020345211029,
-0.07901842147111893,
0.10850906372070312,
0.009320155717432499,
0.044293832033872604,
-0.031052904203534126,
-0.05085551738739014,
-0.1015995591878891,
-0.055467743426561356,
-0.11430668085813522,
0.012241142801940441,
-0.07405208796262741,
-0.036561526358127594,
0.0002970023488160223,
-0.008578883484005928,
-0.008802366442978382,
0.04982226714491844,
-0.062026627361774445,
-0.009262030012905598,
-0.026449374854564667,
0.033320002257823944,
-0.06471046060323715,
-0.038002632558345795,
0.03180951252579689,
-0.09788087010383606,
0.09277412295341492,
0.048058535903692245,
0.005166473798453808,
0.006484838668256998,
0.09092110395431519,
-0.02164219133555889,
0.02192617952823639,
0.016118137165904045,
-0.04573357105255127,
-0.08576211333274841,
0.007137152832001448,
-0.009157363325357437,
-0.01882111094892025,
-0.009577062912285328,
0.08627394586801529,
-0.0856701210141182,
0.03644412010908127,
-0.007849571295082569,
-0.0024492451921105385,
-0.07223495841026306,
-0.011479518376290798,
0.10114123672246933,
0.09732905775308609,
0.045511890202760696,
-0.08745460212230682,
0.013233756646513939,
-0.1433328539133072,
-0.036635592579841614,
0.006711806170642376,
-0.007582532241940498,
-0.13256072998046875,
-0.008040840737521648,
0.019413426518440247,
-0.0017719089519232512,
0.20728106796741486,
-0.05134078115224838,
-0.018328193575143814,
0.019793057814240456,
-0.09176011383533478,
0.10583145916461945,
-0.022686796262860298,
0.17361073195934296,
-0.008011966943740845,
-0.04257923737168312,
-0.015252926386892796,
0.04042729362845421,
0.022107230499386787,
-0.02725794166326523,
0.1860480159521103,
0.13731132447719574,
0.030903423205018044,
0.03876176476478577,
-0.022042496129870415,
0.000567741459235549,
-0.04388812184333801,
-0.025470824912190437,
0.026070769876241684,
0.036121003329753876,
0.01633213832974434,
0.14350366592407227,
0.07016702741384506,
-0.16848453879356384,
0.03940436616539955,
-0.027740465477108955,
-0.039386261254549026,
-0.11850070208311081,
-0.09565499424934387,
-0.030468491837382317,
-0.07440488040447235,
0.00921829417347908,
-0.1269810050725937,
0.0057242438197135925,
0.17545513808727264,
0.055584926158189774,
0.026279449462890625,
-0.0014195321127772331,
-0.12499845772981644,
-0.031064089387655258,
0.05133935064077377,
0.013278940692543983,
0.028062645345926285,
0.059254832565784454,
-0.001145603833720088,
0.06445667892694473,
0.04394742101430893,
0.013950474560260773,
0.0018249377608299255,
0.07707628607749939,
0.016117941588163376,
0.040356993675231934,
-0.061414338648319244,
-0.0033084803726524115,
-0.04481497406959534,
0.06651677936315536,
0.10549455881118774,
0.04812842607498169,
-0.04391395300626755,
-0.007513254880905151,
0.16761161386966705,
-0.039173342287540436,
-0.0032825095113366842,
-0.1333920806646347,
0.31661614775657654,
0.015898626297712326,
0.020995602011680603,
0.04916863143444061,
-0.0775371715426445,
-0.05643630027770996,
0.21160021424293518,
0.08750080317258835,
-0.019615961238741875,
-0.021764123812317848,
0.002760907867923379,
-0.0306661669164896,
-0.020179281011223793,
0.1487671285867691,
0.034518297761678696,
0.12053398787975311,
-0.05145999416708946,
-0.046593114733695984,
-0.02811856009066105,
-0.01117687113583088,
-0.11769887059926987,
0.1353449672460556,
-0.029407916590571404,
-0.02209845930337906,
-0.07099779695272446,
0.02467484213411808,
0.0670449361205101,
-0.32247474789619446,
-0.002179978182539344,
-0.033999793231487274,
-0.10703770071268082,
-0.008088518865406513,
-0.022286266088485718,
-0.022705359384417534,
0.042295798659324646,
-0.04340432956814766,
0.07135555148124695,
0.04635760188102722,
0.033413756638765335,
-0.021297279745340347,
-0.10227994620800018,
0.16504378616809845,
0.05075711011886597,
0.09190435707569122,
0.028015311807394028,
0.06856321543455124,
0.05466223880648613,
0.037952765822410583,
-0.09890321642160416,
0.03434386104345322,
0.011881566606462002,
-0.08515026420354843,
-0.056797195225954056,
0.12634950876235962,
-0.0006026949267834425,
0.04435909539461136,
0.04161902889609337,
-0.10961799323558807,
0.007050903979688883,
0.06858092546463013,
-0.06831420958042145,
-0.09792532026767731,
-0.006089285481721163,
-0.0876840129494667,
0.151569664478302,
0.14297941327095032,
-0.015458722598850727,
0.02559518814086914,
-0.06761437654495239,
-0.006541662849485874,
0.05088888108730316,
0.019222870469093323,
-0.018114862963557243,
-0.19482988119125366,
0.03500206395983696,
-0.08877827972173691,
-0.005495554767549038,
-0.22588123381137848,
-0.1021316722035408,
-0.009767220355570316,
-0.04926747828722,
-0.022889139130711555,
0.0646924152970314,
0.03712005540728569,
0.06924010813236237,
-0.01498037576675415,
-0.045177534222602844,
-0.029643263667821884,
0.09160145372152328,
-0.10129310190677643,
-0.06126130744814873
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1600k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1600k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
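Because this checkpoint is tagged `pretraining`, you can also probe it with masked-token prediction. The sketch below assumes the hosted weights include the MLM head; if they do not, `transformers` will initialize that head randomly (and warn you), and the predictions will be meaningless.

```
from transformers import pipeline

# Assumes the MLM head weights ship with this checkpoint (see note above).
unmasker = pipeline('fill-mask', model='google/multiberts-seed_2-step_1600k')
print(unmasker("The capital of France is [MASK]."))
```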
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1600k"]}
| null |
google/multiberts-seed_2-step_1600k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1600k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1600k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08043377101421356,
0.0899607315659523,
-0.002596484962850809,
0.04875752702355385,
0.08132443577051163,
-0.013677356764674187,
0.07072215527296066,
0.10200387239456177,
-0.021827060729265213,
0.02517426200211048,
0.07678060978651047,
0.002166241407394409,
0.019656851887702942,
0.09522471576929092,
0.024256117641925812,
-0.21588172018527985,
0.017760949209332466,
-0.036597494035959244,
-0.0973391905426979,
0.07545823603868484,
0.10030815005302429,
-0.0811445415019989,
0.04718560352921486,
0.028491593897342682,
-0.1207084059715271,
0.05225498229265213,
0.006247288081794977,
-0.01961507648229599,
0.13471364974975586,
-0.0016874057473614812,
0.05380113050341606,
0.0572197251021862,
0.048763323575258255,
-0.1270262598991394,
0.006289878394454718,
0.059044770896434784,
0.060235440731048584,
0.04486583173274994,
0.02610161155462265,
0.07930505275726318,
0.007581964135169983,
0.025699935853481293,
0.04404853284358978,
0.024777807295322418,
-0.07627759873867035,
-0.06553616374731064,
-0.10255411267280579,
0.03328913822770119,
0.02305932156741619,
0.002378998091444373,
0.012238450348377228,
0.11639167368412018,
-0.03143147751688957,
0.04419159144163132,
0.17708231508731842,
-0.3270355463027954,
-0.014235438778996468,
0.07136175036430359,
0.03865377977490425,
0.1348709911108017,
-0.0018707284471020103,
-0.01841532625257969,
0.07653538137674332,
0.024635838344693184,
0.08920595049858093,
-0.042107343673706055,
0.02138938196003437,
-0.05278339982032776,
-0.15303632616996765,
-0.04704564809799194,
0.09308453649282455,
-0.007354321889579296,
-0.13732463121414185,
-0.02842501364648342,
-0.046505723148584366,
0.03851645812392235,
0.012621228583157063,
-0.03860843926668167,
0.03418957442045212,
0.016775669530034065,
-0.016670074313879013,
-0.012954531237483025,
-0.10519931465387344,
-0.055500391870737076,
0.02992861531674862,
0.07500636577606201,
0.10146167874336243,
0.07240233570337296,
0.0006034004036337137,
0.10604830086231232,
-0.187586709856987,
-0.050765931606292725,
-0.026507502421736717,
-0.05442004278302193,
-0.052910223603248596,
-0.008215434849262238,
-0.10595158487558365,
-0.03928812965750694,
0.008810441941022873,
0.1262088268995285,
-0.019782740622758865,
0.029527748003602028,
-0.038009777665138245,
0.007131468970328569,
0.0520671121776104,
0.03740239143371582,
-0.007269380148500204,
0.024218503385782242,
0.02548304945230484,
-0.01606622524559498,
-0.02230481617152691,
0.013524430803954601,
0.0038516479544341564,
0.02507488802075386,
0.10996687412261963,
0.02250734716653824,
-0.1009758934378624,
0.06587691605091095,
-0.019196074455976486,
-0.0426429808139801,
0.016788601875305176,
-0.0918707326054573,
-0.05545039102435112,
-0.037788406014442444,
0.0008888881420716643,
0.006869734730571508,
-0.0020840843208134174,
-0.007887108251452446,
-0.0233903955668211,
-0.04110654443502426,
-0.08335278928279877,
-0.050102923065423965,
-0.0539497546851635,
-0.1298719346523285,
0.009219963103532791,
-0.1745101362466812,
-0.03683435544371605,
-0.1154390498995781,
-0.1891362965106964,
-0.020188283175230026,
0.061238259077072144,
-0.012829053215682507,
-0.05658479407429695,
0.0819813460111618,
0.039601705968379974,
-0.030341986566781998,
-0.00012926923227496445,
0.08284357190132141,
-0.0005756881437264383,
0.044195305556058884,
-0.023183055222034454,
0.07253775000572205,
0.007579233963042498,
0.03385144844651222,
-0.05394994840025902,
0.06115747615695,
-0.17224276065826416,
0.047234851866960526,
-0.07498173415660858,
-0.030907904729247093,
-0.08441358804702759,
-0.03603352978825569,
-0.01224267203360796,
0.005813729017972946,
0.024792123585939407,
0.07679641246795654,
-0.1808546632528305,
-0.030857549980282784,
0.11933691799640656,
-0.1677657961845398,
-0.021250780671834946,
0.07533387839794159,
-0.04721379280090332,
0.10502631962299347,
0.0668642595410347,
0.1590825468301773,
-0.007944418117403984,
-0.07846115529537201,
0.06294385343790054,
-0.011443216353654861,
0.007182038854807615,
-0.010126982815563679,
0.06962885707616806,
-0.016980862244963646,
-0.15822790563106537,
0.03574177995324135,
-0.12540222704410553,
-0.005615144036710262,
-0.07938139140605927,
0.016497816890478134,
-0.01205309946089983,
-0.06716810166835785,
-0.06443566083908081,
-0.023894818499684334,
0.06489786505699158,
-0.07712005078792572,
-0.01649601384997368,
0.03157579526305199,
0.07153579592704773,
-0.06992616504430771,
0.06935176998376846,
-0.01322965044528246,
0.010459337383508682,
-0.08004190027713776,
-0.03807128593325615,
-0.1910065859556198,
0.05319298803806305,
0.10004536807537079,
0.010338305495679379,
-0.021945424377918243,
0.14374268054962158,
0.0073461150750517845,
0.06865549832582474,
-0.05052239075303078,
0.017071446403861046,
-0.013789940625429153,
-0.005267785862088203,
-0.08610891550779343,
-0.1012381911277771,
-0.07293203473091125,
-0.06748224049806595,
0.08405279368162155,
-0.12303479760885239,
0.02317666821181774,
-0.05195820704102516,
0.039080724120140076,
0.02028227597475052,
-0.08426851034164429,
-0.021365970373153687,
0.012999318540096283,
-0.05844597518444061,
-0.05697626993060112,
0.03955603763461113,
0.07092518359422684,
-0.006035507656633854,
0.09340957552194595,
-0.04503096640110016,
-0.08822210133075714,
0.033223558217287064,
0.0945892333984375,
-0.10829755663871765,
0.02056030184030533,
-0.05748496949672699,
-0.04363958537578583,
-0.06534331291913986,
-0.020167943090200424,
0.07569964975118637,
-0.01126504223793745,
0.13598433136940002,
-0.07555467635393143,
0.00029931895551271737,
0.018309904262423515,
-0.0198591947555542,
-0.020721526816487312,
0.040192555636167526,
0.06261500716209412,
-0.06541776657104492,
0.013509837910532951,
0.03433593362569809,
0.012827246449887753,
0.06537268310785294,
-0.05295824632048607,
-0.08726252615451813,
0.012513026595115662,
0.03411942720413208,
0.03283509612083435,
0.06656312942504883,
-0.029384108260273933,
-0.010159128345549107,
0.034489259123802185,
0.016422277316451073,
0.0074882833287119865,
-0.11818515509366989,
0.06212775409221649,
0.0588100403547287,
0.0024946187622845173,
0.05803376063704491,
-0.018451718613505363,
-0.03834477812051773,
0.08278496563434601,
0.030962346121668816,
0.004597456194460392,
-0.015839459374547005,
-0.016143716871738434,
-0.1193624883890152,
0.18835170567035675,
-0.06497330963611603,
-0.16683049499988556,
-0.08129224181175232,
-0.10052816569805145,
0.00872142892330885,
0.02884209156036377,
0.03930116817355156,
-0.022703319787979126,
-0.045682378113269806,
-0.12212541699409485,
0.062863789498806,
-0.04378654062747955,
0.06900838762521744,
0.11739633232355118,
-0.03981949761509895,
0.05831829458475113,
-0.12390535324811935,
-0.0066237738355994225,
-0.0803179144859314,
-0.07816793769598007,
0.06220252811908722,
-0.046597111970186234,
0.029005907475948334,
0.0969659760594368,
0.01917755976319313,
-0.02053646557033062,
-0.024631686508655548,
0.19733591377735138,
0.041861847043037415,
0.04343792423605919,
0.13448359072208405,
-0.06336689740419388,
0.0561370849609375,
0.08451312780380249,
0.009944192133843899,
-0.043710317462682724,
0.05309329926967621,
0.04165186360478401,
-0.07272183150053024,
-0.20057977735996246,
-0.02092347852885723,
-0.009555413387715816,
-0.038758937269449234,
0.07079999893903732,
0.03352423757314682,
-0.006479224190115929,
0.06824219971895218,
0.01576928049325943,
0.062068745493888855,
-0.0017974498914554715,
0.0942905992269516,
0.021986668929457664,
-0.036541614681482315,
0.08713392168283463,
-0.01954592578113079,
-0.006137704011052847,
0.08858595043420792,
-0.02617029659450054,
0.28879058361053467,
-0.034564804285764694,
0.015992388129234314,
0.11937570571899414,
0.04704255238175392,
0.061344247311353683,
0.1259617805480957,
-0.06369764357805252,
0.024674605578184128,
-0.07230299711227417,
-0.058179207146167755,
-0.0074530490674078465,
0.04529444873332977,
-0.06618659198284149,
0.01029623206704855,
-0.07578867673873901,
0.0314817950129509,
-0.019960930570960045,
0.29802843928337097,
0.1083153486251831,
-0.10352872312068939,
-0.06084442138671875,
0.003523255465552211,
-0.09519659727811813,
-0.07106009870767593,
0.042558010667562485,
0.08473390340805054,
-0.13126668334007263,
0.005353470798581839,
-0.024359576404094696,
0.07224956899881363,
-0.008897869847714901,
0.01787465810775757,
0.03158911317586899,
0.04007817804813385,
-0.037076760083436966,
0.007700915448367596,
-0.1812427043914795,
0.1947827935218811,
0.007564583793282509,
0.01791711337864399,
-0.05120493099093437,
0.03652185946702957,
0.00765635073184967,
-0.03166751563549042,
0.06499689072370529,
0.025316014885902405,
-0.04764952138066292,
-0.04541840776801109,
-0.04873038828372955,
0.014031213708221912,
0.07801298052072525,
-0.03923243656754494,
0.10238900780677795,
-0.005019418895244598,
0.043575581163167953,
0.021185312420129776,
0.0939539447426796,
-0.18176455795764923,
-0.09022293239831924,
0.0307938065379858,
-0.062350254505872726,
-0.11570524424314499,
-0.08095608651638031,
-0.09229335933923721,
0.005202551372349262,
0.2545953094959259,
-0.1191239058971405,
-0.07643129676580429,
-0.09723318368196487,
0.035430312156677246,
0.10533714294433594,
-0.04920053109526634,
0.026705149561166763,
-0.003858714597299695,
0.12973551452159882,
-0.0679979994893074,
-0.1332302987575531,
0.02149266004562378,
-0.0909208357334137,
-0.16620472073554993,
-0.06740887463092804,
0.11587072163820267,
0.060250911861658096,
0.035963594913482666,
-0.029052279889583588,
0.023416345939040184,
0.03648184612393379,
-0.037052225321531296,
0.004090458620339632,
0.06428855657577515,
0.09509482979774475,
0.0326327346265316,
-0.10809996724128723,
0.011921162717044353,
-0.05958329513669014,
-0.06839949637651443,
0.07138308137655258,
0.264352947473526,
-0.05836436152458191,
0.1284169703722,
0.11552582681179047,
-0.08311837911605835,
-0.15886066854000092,
0.033726610243320465,
0.09282661974430084,
-0.016223054379224777,
0.017992818728089333,
-0.15676096081733704,
0.08900847285985947,
0.11146100610494614,
-0.021732235327363014,
0.00045642489567399025,
-0.18834303319454193,
-0.1313895583152771,
0.06022024527192116,
0.09928981214761734,
0.272830605506897,
-0.0599210262298584,
-0.04332158714532852,
0.02043374814093113,
-0.09562507271766663,
0.012527395971119404,
0.12040182203054428,
0.06011795997619629,
-0.024986958131194115,
-0.07271242886781693,
0.015540726482868195,
-0.040559180080890656,
0.09436846524477005,
0.05382132530212402,
0.055795252323150635,
-0.0034443966578692198,
0.019085176289081573,
-0.013032372109591961,
-0.040939074009656906,
0.05665527284145355,
0.027433155104517937,
0.05196356773376465,
-0.07988225668668747,
-0.027770275250077248,
-0.06876761466264725,
0.028969231992959976,
-0.02746902033686638,
-0.07525976747274399,
-0.06094307824969292,
0.0776783898472786,
0.04740701615810394,
-0.02752716653048992,
0.012260165065526962,
0.031996965408325195,
0.11642570793628693,
0.18021488189697266,
-0.0064049395732581615,
-0.0492238886654377,
-0.046462032943964005,
-0.033491071313619614,
-0.01722852699458599,
0.07081586867570877,
-0.05494370311498642,
0.02766447514295578,
0.06627444177865982,
0.02198871597647667,
0.09362347424030304,
0.057292621582746506,
-0.11544764786958694,
-0.01782609522342682,
0.03635137900710106,
-0.16531702876091003,
0.01324667688459158,
0.00027589601813815534,
0.023490052670240402,
-0.03584114834666252,
0.019848313182592392,
0.1499345302581787,
-0.06464873999357224,
-0.03502362221479416,
-0.04272906109690666,
0.07247114926576614,
0.024468375369906425,
0.14622509479522705,
0.031516674906015396,
0.03666653111577034,
-0.08312910050153732,
0.12610958516597748,
0.04159844294190407,
-0.0426965169608593,
0.018212104216217995,
-0.023377003148198128,
-0.10831159353256226,
0.013679008930921555,
0.06392425298690796,
0.035390544682741165,
-0.04799012467265129,
-0.009588399901986122,
-0.025961464270949364,
-0.07778500765562057,
0.060669317841529846,
0.1787334680557251,
0.06444696336984634,
0.07206930965185165,
-0.05163250118494034,
-0.03417733311653137,
-0.0798058733344078,
0.04726170003414154,
0.04335467144846916,
0.07275712490081787,
-0.0726090744137764,
0.11447811871767044,
0.008828047662973404,
0.0457925945520401,
-0.03126439079642296,
-0.0484163761138916,
-0.0977487564086914,
-0.053788501769304276,
-0.10194326937198639,
0.013945897109806538,
-0.0720420554280281,
-0.041236765682697296,
0.0007857312448322773,
-0.00817013718187809,
-0.010630049742758274,
0.051650118082761765,
-0.061349645256996155,
-0.010570869781076908,
-0.02725185453891754,
0.035240426659584045,
-0.06718230992555618,
-0.03923661634325981,
0.033271171152591705,
-0.10140364617109299,
0.09284795820713043,
0.05240920931100845,
0.006298094987869263,
0.009485282003879547,
0.09047345072031021,
-0.02053428441286087,
0.02199470065534115,
0.018304571509361267,
-0.04529339820146561,
-0.08319316059350967,
0.005516596604138613,
-0.008011634461581707,
-0.022204233333468437,
-0.011458228342235088,
0.08899452537298203,
-0.0859442800283432,
0.04008166491985321,
-0.009271477349102497,
-0.008099403232336044,
-0.07318908721208572,
-0.011511018499732018,
0.10185591131448746,
0.09831216186285019,
0.04248509556055069,
-0.09148656576871872,
0.010670635849237442,
-0.14583434164524078,
-0.03869602829217911,
0.0070974272675812244,
-0.010697206482291222,
-0.1200992614030838,
-0.010153691284358501,
0.022571485489606857,
-0.0028957161121070385,
0.20916888117790222,
-0.050299134105443954,
-0.018129467964172363,
0.020729416981339455,
-0.09934575110673904,
0.11111444979906082,
-0.022966556251049042,
0.17380346357822418,
-0.012971465475857258,
-0.04178515821695328,
-0.011816214770078659,
0.03510519117116928,
0.023023447021842003,
-0.019305111840367317,
0.18876323103904724,
0.13681693375110626,
0.032127685844898224,
0.03755604848265648,
-0.022264588624238968,
-0.0002562273293733597,
-0.04776406288146973,
-0.02504386007785797,
0.025656603276729584,
0.03641461208462715,
0.01789803057909012,
0.1438356339931488,
0.06970897316932678,
-0.16792932152748108,
0.0367746539413929,
-0.03225792944431305,
-0.04011952504515648,
-0.11853423714637756,
-0.08598805218935013,
-0.03306341543793678,
-0.07076144218444824,
0.008406872861087322,
-0.1265670210123062,
0.010745844803750515,
0.17725765705108643,
0.05541967228055,
0.025206532329320908,
0.0013132902095094323,
-0.12131880223751068,
-0.03183881938457489,
0.04889695346355438,
0.01452152244746685,
0.024349592626094818,
0.06511938571929932,
-0.005110591184347868,
0.06370853632688522,
0.046075914055109024,
0.013907239772379398,
0.0012934344122186303,
0.07965700328350067,
0.014935843646526337,
0.03859592229127884,
-0.05923394113779068,
-0.00620881374925375,
-0.04445326328277588,
0.06918153911828995,
0.09549002349376678,
0.04903759807348251,
-0.04667748883366585,
-0.00636841868981719,
0.16374172270298004,
-0.04035565257072449,
0.0022268053144216537,
-0.12708747386932373,
0.32228028774261475,
0.014114965684711933,
0.013798301108181477,
0.05050664022564888,
-0.07723409682512283,
-0.05300242453813553,
0.20727825164794922,
0.09038052707910538,
-0.01827528141438961,
-0.022733982652425766,
-0.0009417851106263697,
-0.030642559751868248,
-0.020425468683242798,
0.15193988382816315,
0.034419573843479156,
0.1176241859793663,
-0.0517120398581028,
-0.051348112523555756,
-0.029861606657505035,
-0.00831642746925354,
-0.12541161477565765,
0.13848447799682617,
-0.02959304116666317,
-0.02374022640287876,
-0.07028598338365555,
0.02308858558535576,
0.07554766535758972,
-0.33429884910583496,
0.00909514632076025,
-0.03438869118690491,
-0.10494022816419601,
-0.0022756988182663918,
-0.017061060294508934,
-0.023341381922364235,
0.046665776520967484,
-0.047586433589458466,
0.07320797443389893,
0.039201267063617706,
0.035471878945827484,
-0.024848710745573044,
-0.09164793789386749,
0.16477207839488983,
0.041687216609716415,
0.09699226915836334,
0.026324545964598656,
0.0751563310623169,
0.053913865238428116,
0.037715718150138855,
-0.09884956479072571,
0.040941692888736725,
0.012538905255496502,
-0.08724227547645569,
-0.05381263792514801,
0.12530463933944702,
-0.004210581537336111,
0.03820446878671646,
0.043439581990242004,
-0.10549256205558777,
0.00935298576951027,
0.07051797956228256,
-0.06692090630531311,
-0.09425488859415054,
-0.012238782830536366,
-0.09129167348146439,
0.1523638665676117,
0.14331787824630737,
-0.014420821331441402,
0.025814594700932503,
-0.06829854846000671,
-0.010397274047136307,
0.05285754054784775,
0.019516685977578163,
-0.015746675431728363,
-0.1936882883310318,
0.03195415809750557,
-0.0795150101184845,
-0.004292509518563747,
-0.22661146521568298,
-0.10151229053735733,
-0.01136095356196165,
-0.05020963028073311,
-0.026949873194098473,
0.06527385115623474,
0.030984463170170784,
0.06628004461526871,
-0.014902417548000813,
-0.03511153534054756,
-0.027402734383940697,
0.09327394515275955,
-0.10398240387439728,
-0.06218574941158295
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 160k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 160k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_160k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_160k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_160k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_160k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
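Not part of the original card: as a quick sanity check of the MLM objective described above, here is a minimal sketch using the `fill-mask` pipeline. It assumes the MLM head stored in this pretraining checkpoint loads correctly through the pipeline; the example sentence is purely illustrative.

```
# Hedged sketch: masked-token prediction with this intermediate checkpoint.
# Assumes the checkpoint's MLM head is picked up by the fill-mask pipeline.
from transformers import pipeline

unmasker = pipeline('fill-mask', model='google/multiberts-seed_2-step_160k')
predictions = unmasker("The capital of France is [MASK].")

for p in predictions:
    # Each prediction dict contains the filled sequence and its score.
    print(p["sequence"], p["score"])
```

Because this is an early checkpoint (160k of 2M steps), the top predictions may be noticeably noisier than those of the fully trained model.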
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_160k"]}
| null |
google/multiberts-seed_2-step_160k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_160k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 160k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 160k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 160k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 160k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 160k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07776134461164474,
0.09578780829906464,
-0.0026814399752765894,
0.04437217861413956,
0.08191405981779099,
-0.01645676977932453,
0.07422352582216263,
0.10191016644239426,
-0.018777448683977127,
0.026278121396899223,
0.07790222018957138,
0.00304117938503623,
0.01905185543000698,
0.10065922886133194,
0.023009127005934715,
-0.22265861928462982,
0.017990190535783768,
-0.03294292837381363,
-0.09170927107334137,
0.075876884162426,
0.10126486420631409,
-0.0793590322136879,
0.04597336798906326,
0.02994001843035221,
-0.11878583580255508,
0.051457446068525314,
0.00328615284524858,
-0.01525974553078413,
0.13234011828899384,
-0.0010343915782868862,
0.05357404425740242,
0.05606919899582863,
0.04468211159110069,
-0.13138478994369507,
0.006145342718809843,
0.05869107320904732,
0.059383440762758255,
0.045599643141031265,
0.02809094451367855,
0.07732003927230835,
0.011728684417903423,
0.024739079177379608,
0.041334476321935654,
0.024126868695020676,
-0.07486966997385025,
-0.062654510140419,
-0.10229738801717758,
0.03075561672449112,
0.023541556671261787,
0.0026044517289847136,
0.010583968833088875,
0.12156203389167786,
-0.03177162632346153,
0.0444316565990448,
0.18663103878498077,
-0.3368266224861145,
-0.013660019263625145,
0.06933515518903732,
0.04118813946843147,
0.12503884732723236,
-0.0011579832062125206,
-0.018472084775567055,
0.07614404708147049,
0.024966396391391754,
0.08751648664474487,
-0.039764657616615295,
0.024519475176930428,
-0.055091340094804764,
-0.1554272621870041,
-0.04680268466472626,
0.09017646312713623,
-0.00759535189718008,
-0.1376199871301651,
-0.030429890379309654,
-0.04693650081753731,
0.04276224970817566,
0.013625404797494411,
-0.03807397559285164,
0.03433270752429962,
0.017753032967448235,
-0.014855694957077503,
-0.012535465881228447,
-0.1058310866355896,
-0.05464719608426094,
0.02902628667652607,
0.07289131730794907,
0.1022566705942154,
0.06948982179164886,
0.0010507687693461776,
0.11019986867904663,
-0.19142749905586243,
-0.051254455000162125,
-0.027294345200061798,
-0.05210835486650467,
-0.04807853326201439,
-0.008682609535753727,
-0.1066298857331276,
-0.04573425278067589,
0.010183664038777351,
0.12167104333639145,
-0.014847063459455967,
0.02437560446560383,
-0.03970373421907425,
0.007840857841074467,
0.05547028407454491,
0.04208366572856903,
-0.005221391562372446,
0.024506736546754837,
0.023786386474967003,
-0.016175702214241028,
-0.024240275844931602,
0.014896642416715622,
0.001164300017990172,
0.023750711232423782,
0.11620038002729416,
0.023934809491038322,
-0.10069718956947327,
0.06620575487613678,
-0.02118375152349472,
-0.043877676129341125,
0.015070648863911629,
-0.09075450152158737,
-0.055641598999500275,
-0.037573061883449554,
0.001657630316913128,
0.012879690155386925,
-0.0007666824967600405,
-0.007034752983599901,
-0.023862052708864212,
-0.04265314340591431,
-0.08174814283847809,
-0.04544723778963089,
-0.05494328588247299,
-0.12940631806850433,
0.008330418728291988,
-0.1829935610294342,
-0.03731170669198036,
-0.11614397913217545,
-0.18726585805416107,
-0.020522400736808777,
0.060464292764663696,
-0.01196636725217104,
-0.05492643639445305,
0.07789646834135056,
0.040893640369176865,
-0.030570348724722862,
-0.0006297187646850944,
0.08311980217695236,
-0.0012415752280503511,
0.04485021159052849,
-0.024253806099295616,
0.0695507675409317,
0.006592493504285812,
0.03438296914100647,
-0.05510615557432175,
0.061930544674396515,
-0.16633380949497223,
0.04643487557768822,
-0.07201757282018661,
-0.03366106376051903,
-0.08680182695388794,
-0.0334576852619648,
-0.009152044542133808,
0.005905304569751024,
0.023044323548674583,
0.07607848197221756,
-0.18704338371753693,
-0.030120817944407463,
0.12342669814825058,
-0.164934903383255,
-0.01664421334862709,
0.07553551346063614,
-0.04780014976859093,
0.10574192553758621,
0.06991759687662125,
0.1553552746772766,
-0.007924200966954231,
-0.08032182604074478,
0.05863272771239281,
-0.010397376492619514,
0.011815928854048252,
-0.006598904263228178,
0.07082977890968323,
-0.01752011850476265,
-0.15588398277759552,
0.03585373982787132,
-0.13270843029022217,
-0.0038890154100954533,
-0.07969994843006134,
0.019409481436014175,
-0.012577995657920837,
-0.06711488962173462,
-0.06437886506319046,
-0.02446848340332508,
0.06625829637050629,
-0.07407316565513611,
-0.0183704886585474,
0.04142855480313301,
0.07117009907960892,
-0.07113324105739594,
0.06800322234630585,
-0.015205685980618,
0.010893180035054684,
-0.08419809490442276,
-0.03866765648126602,
-0.1918865442276001,
0.053536396473646164,
0.10031317174434662,
0.013826395384967327,
-0.021227030083537102,
0.1518886685371399,
0.008697974495589733,
0.06780056655406952,
-0.04858710989356041,
0.015103664249181747,
-0.013650047592818737,
-0.00577230378985405,
-0.08856711536645889,
-0.09964951872825623,
-0.07464437931776047,
-0.06997327506542206,
0.08257249742746353,
-0.12546668946743011,
0.02096245251595974,
-0.052755724638700485,
0.044191874563694,
0.022751888260245323,
-0.08338429778814316,
-0.018414955586194992,
0.012606771662831306,
-0.0608009397983551,
-0.05712096393108368,
0.0394303984940052,
0.07015527039766312,
-0.00798326451331377,
0.09581118077039719,
-0.04637369140982628,
-0.08474408090114594,
0.03088572435081005,
0.09948944300413132,
-0.10481676459312439,
0.017858317121863365,
-0.05765495449304581,
-0.04236000403761864,
-0.06580958515405655,
-0.017796648666262627,
0.07766924798488617,
-0.009911284781992435,
0.1364537477493286,
-0.0754266306757927,
-0.002855042228475213,
0.013004042208194733,
-0.023497987538576126,
-0.022046899423003197,
0.041685476899147034,
0.05845337733626366,
-0.07575242221355438,
0.01649581640958786,
0.03852342814207077,
0.010115974582731724,
0.07311207801103592,
-0.05333111807703972,
-0.08983533829450607,
0.01163097657263279,
0.039127167314291,
0.033811330795288086,
0.06383268535137177,
-0.02416965924203396,
-0.011510031297802925,
0.03454839065670967,
0.014711149036884308,
0.006437505129724741,
-0.11784857511520386,
0.06292344629764557,
0.0581291988492012,
0.000267910014372319,
0.06494463235139847,
-0.01741611957550049,
-0.038735952228307724,
0.08215261250734329,
0.03168993815779686,
0.004221756011247635,
-0.016519350931048393,
-0.015117249451577663,
-0.11768869310617447,
0.18910212814807892,
-0.06309355050325394,
-0.16651217639446259,
-0.07555124908685684,
-0.09850507974624634,
0.007823443971574306,
0.029096761718392372,
0.03813923895359039,
-0.02058420702815056,
-0.044490423053503036,
-0.12381688505411148,
0.06612797826528549,
-0.041352707892656326,
0.07005669921636581,
0.11074630916118622,
-0.03837749361991882,
0.05879418924450874,
-0.12444832921028137,
-0.006642757449299097,
-0.08070157468318939,
-0.08168874680995941,
0.060111649334430695,
-0.048616934567689896,
0.026421992108225822,
0.09353452920913696,
0.021623756736516953,
-0.01933233253657818,
-0.024366330355405807,
0.1946825534105301,
0.04434499889612198,
0.039828281849622726,
0.1364406794309616,
-0.06472665816545486,
0.05523113161325455,
0.08069974929094315,
0.006766077596694231,
-0.043184712529182434,
0.05281856656074524,
0.043351657688617706,
-0.07098085433244705,
-0.19813482463359833,
-0.020069211721420288,
-0.008366037160158157,
-0.03928679600358009,
0.07376587390899658,
0.035524968057870865,
-0.00891706719994545,
0.0684068351984024,
0.012413487769663334,
0.06511068344116211,
-0.0031415431294590235,
0.09391116350889206,
0.013700089417397976,
-0.03561725094914436,
0.08790034800767899,
-0.02065877430140972,
-0.008149764500558376,
0.08658242970705032,
-0.02210642769932747,
0.28826865553855896,
-0.03414550796151161,
0.01376806478947401,
0.12057515233755112,
0.04474648833274841,
0.06251843273639679,
0.12618401646614075,
-0.06585978716611862,
0.023710617795586586,
-0.07380208373069763,
-0.059570830315351486,
-0.007834934629499912,
0.04631432145833969,
-0.05673811957240105,
0.01249639317393303,
-0.07257018238306046,
0.02713479846715927,
-0.020975984632968903,
0.29920288920402527,
0.11118842661380768,
-0.10346505790948868,
-0.058984119445085526,
0.005812323186546564,
-0.0979037955403328,
-0.07333943992853165,
0.04011167585849762,
0.080669105052948,
-0.13344833254814148,
0.004561758600175381,
-0.02682657726109028,
0.07379408180713654,
-0.01560417003929615,
0.017772451043128967,
0.028120480477809906,
0.03950053080916405,
-0.03528599068522453,
0.009626028127968311,
-0.1913418173789978,
0.19347217679023743,
0.007195580285042524,
0.019254393875598907,
-0.05161336064338684,
0.036304254084825516,
0.006913072895258665,
-0.033513035625219345,
0.06310924142599106,
0.023058509454131126,
-0.038119059056043625,
-0.04264412820339203,
-0.049986831843853,
0.013986099511384964,
0.07683587819337845,
-0.04496591538190842,
0.10494238138198853,
-0.00635520089417696,
0.04380866140127182,
0.019570844247937202,
0.09138559550046921,
-0.18474186956882477,
-0.08909395337104797,
0.030592529103159904,
-0.06278415024280548,
-0.10988970100879669,
-0.08122583478689194,
-0.09290313720703125,
0.006879156921058893,
0.255831241607666,
-0.12342824786901474,
-0.07601883262395859,
-0.09568069130182266,
0.03727284073829651,
0.10260578244924545,
-0.049955952912569046,
0.025259416550397873,
-0.00545827392488718,
0.13176612555980682,
-0.06956739723682404,
-0.13532601296901703,
0.021842999383807182,
-0.09192220121622086,
-0.16781732439994812,
-0.0664813369512558,
0.1160409227013588,
0.06252944469451904,
0.03655455633997917,
-0.027519993484020233,
0.021832330152392387,
0.035551976412534714,
-0.03470885381102562,
0.001212102361023426,
0.07283087074756622,
0.09360812604427338,
0.03341137617826462,
-0.10760984569787979,
0.010276220738887787,
-0.06015472114086151,
-0.06634334474802017,
0.07434946298599243,
0.2639552354812622,
-0.0566418282687664,
0.1297464370727539,
0.10872483998537064,
-0.08261562138795853,
-0.15522781014442444,
0.031229792162775993,
0.09925568848848343,
-0.01478642225265503,
0.02158379554748535,
-0.15905733406543732,
0.087325319647789,
0.10886447876691818,
-0.023093990981578827,
0.0070565566420555115,
-0.18831905722618103,
-0.12849384546279907,
0.06592442840337753,
0.09817589819431305,
0.2723003327846527,
-0.0608508475124836,
-0.04402777552604675,
0.017420612275600433,
-0.09498586505651474,
0.020158590748906136,
0.12011542916297913,
0.06175679340958595,
-0.02390453964471817,
-0.07105611264705658,
0.01590675674378872,
-0.038266975432634354,
0.09452208131551743,
0.05101470649242401,
0.05539269000291824,
-0.0034101265482604504,
0.018624505028128624,
-0.01896592602133751,
-0.0424501933157444,
0.05987196043133736,
0.02064022794365883,
0.04871717095375061,
-0.08372911810874939,
-0.029264988377690315,
-0.06814415752887726,
0.03257770091295242,
-0.025185279548168182,
-0.0758214145898819,
-0.05809176340699196,
0.0762806162238121,
0.04611632600426674,
-0.024417350068688393,
0.01430061087012291,
0.029642188921570778,
0.11600621789693832,
0.1760568618774414,
-0.004483347292989492,
-0.039858169853687286,
-0.04876330494880676,
-0.03766928240656853,
-0.01553310640156269,
0.07400307059288025,
-0.051971595734357834,
0.02655669115483761,
0.06350017338991165,
0.021131543442606926,
0.0943942442536354,
0.05542954429984093,
-0.12000863254070282,
-0.0168045312166214,
0.03284472972154617,
-0.1662868857383728,
0.014591376297175884,
0.00020884220430161804,
0.03185100108385086,
-0.03445041924715042,
0.024872764945030212,
0.15421977639198303,
-0.06599092483520508,
-0.036337077617645264,
-0.04152035713195801,
0.07042756676673889,
0.02267579548060894,
0.14251980185508728,
0.031961698085069656,
0.03639109060168266,
-0.08135587722063065,
0.12342321127653122,
0.03878137096762657,
-0.03588280826807022,
0.0211946964263916,
-0.029271388426423073,
-0.1055087074637413,
0.013359935022890568,
0.06542184948921204,
0.04109025001525879,
-0.04741682857275009,
-0.01113197859376669,
-0.025570470839738846,
-0.07540479302406311,
0.061118949204683304,
0.1838192641735077,
0.06709064543247223,
0.07612258195877075,
-0.05420299619436264,
-0.03416934236884117,
-0.07758474349975586,
0.045049771666526794,
0.04489939659833908,
0.07404282689094543,
-0.07485222071409225,
0.11182290315628052,
0.008553599938750267,
0.045391593128442764,
-0.031670164316892624,
-0.048474449664354324,
-0.09899043291807175,
-0.05422716960310936,
-0.0970868468284607,
0.014896059408783913,
-0.07127329707145691,
-0.04162835329771042,
0.0021516173146665096,
-0.008684402331709862,
-0.007169610355049372,
0.05203559994697571,
-0.06152735650539398,
-0.00972552876919508,
-0.02919822558760643,
0.035631101578474045,
-0.0677642971277237,
-0.03942535072565079,
0.03233521431684494,
-0.10195290297269821,
0.0922989472746849,
0.05194779857993126,
0.006589358206838369,
0.010809485800564289,
0.07905939966440201,
-0.02141890488564968,
0.02439109981060028,
0.015405042096972466,
-0.04668677970767021,
-0.08216048777103424,
0.0047739907167851925,
-0.007015475071966648,
-0.018519282341003418,
-0.012386971153318882,
0.09326478838920593,
-0.08619162440299988,
0.03575921431183815,
-0.007837492041289806,
-0.005394351202994585,
-0.07209829241037369,
-0.010035369545221329,
0.09809348732233047,
0.10074307024478912,
0.04476775974035263,
-0.09084997326135635,
0.012112553231418133,
-0.14366395771503448,
-0.03793518990278244,
0.006535007618367672,
-0.010430804453790188,
-0.12457669526338577,
-0.009838799014687538,
0.021466832607984543,
-0.002656039781868458,
0.2078731805086136,
-0.04901622608304024,
-0.014966492541134357,
0.01802769862115383,
-0.09929196536540985,
0.11554688215255737,
-0.024021029472351074,
0.17510922253131866,
-0.012830154970288277,
-0.038563072681427,
-0.015466026030480862,
0.034549281001091,
0.020767077803611755,
-0.024778107181191444,
0.18535172939300537,
0.13607531785964966,
0.02616085670888424,
0.04061626270413399,
-0.024667544290423393,
0.0027440963312983513,
-0.05293218418955803,
-0.025516338646411896,
0.028246454894542694,
0.03707626461982727,
0.01783924549818039,
0.15355956554412842,
0.0708182230591774,
-0.16881074011325836,
0.03329140692949295,
-0.029074283316731453,
-0.03736988082528114,
-0.11887875944375992,
-0.09776778519153595,
-0.034963179379701614,
-0.06974088400602341,
0.0094061940908432,
-0.1242949515581131,
0.012206677347421646,
0.17965930700302124,
0.05501209944486618,
0.025698676705360413,
0.00043685489799827337,
-0.11934539675712585,
-0.03478361666202545,
0.053170956671237946,
0.013174830004572868,
0.024360068142414093,
0.061170030385255814,
-0.0017513129860162735,
0.06420183181762695,
0.045935772359371185,
0.014415360987186432,
0.0013344279723241925,
0.08107050508260727,
0.015750015154480934,
0.039724767208099365,
-0.06127476692199707,
-0.00497381342574954,
-0.04297095537185669,
0.06830263882875443,
0.09894748032093048,
0.05018596723675728,
-0.04868130013346672,
-0.00687676016241312,
0.16015689074993134,
-0.04029013589024544,
0.0013443682109937072,
-0.12619353830814362,
0.32507479190826416,
0.012606920674443245,
0.015577881596982479,
0.048263732343912125,
-0.07507359236478806,
-0.049812041223049164,
0.19824932515621185,
0.08550985902547836,
-0.019401507452130318,
-0.020980237051844597,
-0.0018539732554927468,
-0.03032226487994194,
-0.0214735995978117,
0.15010623633861542,
0.034750014543533325,
0.12366105616092682,
-0.05486871674656868,
-0.05295542627573013,
-0.02945712022483349,
-0.0090131014585495,
-0.12605813145637512,
0.13238035142421722,
-0.030420737341046333,
-0.021827522665262222,
-0.06977720558643341,
0.023565327748656273,
0.07375446707010269,
-0.333383709192276,
0.00444983784109354,
-0.0318826399743557,
-0.10632447898387909,
-0.004250739701092243,
-0.011527772061526775,
-0.022166414186358452,
0.046166256070137024,
-0.049631353467702866,
0.07307608425617218,
0.040707021951675415,
0.03536122664809227,
-0.02379816584289074,
-0.08703722059726715,
0.16272571682929993,
0.04287974536418915,
0.09227573871612549,
0.025707634165883064,
0.0736585184931755,
0.05258066952228546,
0.03638594225049019,
-0.09641460329294205,
0.04261135309934616,
0.012183799408376217,
-0.08887051045894623,
-0.0526711530983448,
0.12498113512992859,
-0.002818322740495205,
0.04127047583460808,
0.0446961484849453,
-0.10606127232313156,
0.012277352623641491,
0.0724082738161087,
-0.06856591254472733,
-0.09692183136940002,
-0.007055224850773811,
-0.09241177886724472,
0.15211503207683563,
0.14298856258392334,
-0.014811892993748188,
0.02233986184000969,
-0.07044368237257004,
-0.006112110335379839,
0.05178065598011017,
0.018343990668654442,
-0.016269149258732796,
-0.18796394765377045,
0.03120153397321701,
-0.07277623564004898,
-0.003179413266479969,
-0.22364714741706848,
-0.1008441373705864,
-0.013146805576980114,
-0.05112005025148392,
-0.028615234419703484,
0.061583563685417175,
0.029359664767980576,
0.06731874495744705,
-0.015648001804947853,
-0.04286478832364082,
-0.027457069605588913,
0.09040055423974991,
-0.10648481547832489,
-0.06302710622549057
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1700k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1700k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
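Not part of the original card: a minimal sketch of turning the encoder outputs into a fixed-size sentence vector, which is one common way vectors like the 768-dimensional embeddings stored in this dump are produced. Mean pooling over tokens is an illustrative assumption here, not a documented procedure for any published embeddings.

```
# Hedged sketch: extract a 768-dim sentence vector from the PyTorch model.
# Mean pooling over the token dimension is an assumption for illustration.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1700k')
model = BertModel.from_pretrained('google/multiberts-seed_2-step_1700k')
model.eval()

encoded = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    outputs = model(**encoded)

# outputs.last_hidden_state has shape (batch, seq_len, 768); average over tokens.
sentence_vector = outputs.last_hidden_state.mean(dim=1)
print(sentence_vector.shape)  # torch.Size([1, 768])
```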
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1700k"]}
| null |
google/multiberts-seed_2-step_1700k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1700k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1700k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0804850310087204,
0.09349039196968079,
-0.002443421632051468,
0.04460965842008591,
0.07555528730154037,
-0.015478670597076416,
0.0691060870885849,
0.10028912127017975,
-0.014210173860192299,
0.02659735269844532,
0.08292638510465622,
0.004308972507715225,
0.01940999925136566,
0.09993471950292587,
0.023499371483922005,
-0.22149772942066193,
0.020976882427930832,
-0.0313911996781826,
-0.08529628068208694,
0.07731728255748749,
0.100089892745018,
-0.08192582428455353,
0.045451339334249496,
0.029700273647904396,
-0.11597567051649094,
0.04870142787694931,
0.003723646281287074,
-0.020897192880511284,
0.1322980374097824,
-0.00023486099962610751,
0.04861219599843025,
0.05774436891078949,
0.04487985745072365,
-0.12853795289993286,
0.006475407630205154,
0.05775422230362892,
0.059453677386045456,
0.04539448767900467,
0.023722413927316666,
0.0787227526307106,
0.013904892839491367,
0.028668927028775215,
0.04520058631896973,
0.0255594402551651,
-0.0750654861330986,
-0.06667326390743256,
-0.10284440964460373,
0.04383914917707443,
0.02484883740544319,
0.00007295491377590224,
0.011921904049813747,
0.12538602948188782,
-0.030004121363162994,
0.044716157019138336,
0.17802150547504425,
-0.3354773223400116,
-0.015924163162708282,
0.07485170662403107,
0.04367145150899887,
0.13509736955165863,
-0.001914756721816957,
-0.01774616539478302,
0.0772346705198288,
0.024383533746004105,
0.09132616966962814,
-0.04002344608306885,
0.03160674124956131,
-0.054734352976083755,
-0.15651409327983856,
-0.04892741143703461,
0.08618466556072235,
-0.003413915168493986,
-0.13538597524166107,
-0.035849496722221375,
-0.04574758559465408,
0.035908523947000504,
0.012396886013448238,
-0.041151221841573715,
0.03153066337108612,
0.017140762880444527,
-0.014155986718833447,
-0.015264593996107578,
-0.10427650064229965,
-0.05443836748600006,
0.03182404488325119,
0.07812611013650894,
0.10231497138738632,
0.0709686353802681,
0.0007550838054157794,
0.10897330939769745,
-0.1852630227804184,
-0.049075089395046234,
-0.029533391818404198,
-0.05535213649272919,
-0.05014416575431824,
-0.010421071201562881,
-0.10339538007974625,
-0.03883443400263786,
0.0059376428835093975,
0.1252114325761795,
-0.010495807975530624,
0.029010748490691185,
-0.03839268535375595,
0.009210951626300812,
0.055617231875658035,
0.04163477569818497,
-0.007059949450194836,
0.02034970186650753,
0.023440644145011902,
-0.011531897820532322,
-0.02217784710228443,
0.013677176088094711,
0.005683066789060831,
0.022005772218108177,
0.1181124895811081,
0.024590548127889633,
-0.09694741666316986,
0.06218276172876358,
-0.017503023147583008,
-0.045993126928806305,
0.025504378601908684,
-0.0872001126408577,
-0.055144451558589935,
-0.03951329365372658,
0.0035833027213811874,
0.012416599318385124,
-0.002760704141110182,
-0.006291011348366737,
-0.023983215913176537,
-0.044517699629068375,
-0.08611828833818436,
-0.051123227924108505,
-0.051327940076589584,
-0.1255079060792923,
0.007314859423786402,
-0.17712467908859253,
-0.039794377982616425,
-0.11426398158073425,
-0.18994230031967163,
-0.022750133648514748,
0.06065572425723076,
-0.011997702531516552,
-0.05730979144573212,
0.08210164308547974,
0.04457971453666687,
-0.02892235852777958,
-0.0019124463433399796,
0.0786462351679802,
0.000989785767160356,
0.04521318897604942,
-0.029406625777482986,
0.0721496120095253,
0.007309553679078817,
0.033128660172224045,
-0.0562271848320961,
0.06263882666826248,
-0.17570878565311432,
0.04689222201704979,
-0.07380464673042297,
-0.029882559552788734,
-0.0867677852511406,
-0.03615403175354004,
-0.01149366982281208,
0.005125593394041061,
0.0206869188696146,
0.07381880283355713,
-0.17809374630451202,
-0.029712574556469917,
0.12781046330928802,
-0.16766665875911713,
-0.022094624117016792,
0.0763625055551529,
-0.04583660140633583,
0.1019451841711998,
0.07127434015274048,
0.1517840325832367,
-0.003710867604240775,
-0.07405878603458405,
0.06044737994670868,
-0.012957654893398285,
0.0077185179106891155,
-0.015443742275238037,
0.06937382370233536,
-0.018978595733642578,
-0.16699200868606567,
0.03498527407646179,
-0.13077601790428162,
-0.0035788039676845074,
-0.07897122949361801,
0.017083289101719856,
-0.013587935827672482,
-0.06689177453517914,
-0.0715389996767044,
-0.0241668950766325,
0.0651487484574318,
-0.07680030167102814,
-0.015888314694166183,
0.029640281572937965,
0.0736689567565918,
-0.07094424962997437,
0.06933385878801346,
-0.01183791272342205,
0.008612199686467648,
-0.07900899648666382,
-0.03703569993376732,
-0.18912464380264282,
0.05834058299660683,
0.10089670866727829,
0.00816065538674593,
-0.022183645516633987,
0.1400422751903534,
0.007391606457531452,
0.06841906160116196,
-0.049363430589437485,
0.014057474210858345,
-0.009595799259841442,
-0.0053585185669362545,
-0.08823787420988083,
-0.10260482132434845,
-0.07445373386144638,
-0.06900791823863983,
0.08337268978357315,
-0.12539446353912354,
0.021566683426499367,
-0.05969894677400589,
0.04300646856427193,
0.02096276730298996,
-0.08539939671754837,
-0.019556615501642227,
0.012351463548839092,
-0.05843956023454666,
-0.05632510036230087,
0.03901912271976471,
0.06932453066110611,
-0.011432039551436901,
0.0949188694357872,
-0.04911631718277931,
-0.08835320174694061,
0.03138567879796028,
0.100046806037426,
-0.10448187589645386,
0.011564071290194988,
-0.05823487043380737,
-0.04399119317531586,
-0.06445260345935822,
-0.018842853605747223,
0.071088045835495,
-0.009389174170792103,
0.13534396886825562,
-0.07858116924762726,
-0.002140217926353216,
0.017420336604118347,
-0.018761325627565384,
-0.026189716532826424,
0.03818200156092644,
0.0638992190361023,
-0.08560767769813538,
0.012966143898665905,
0.038954056799411774,
0.016816645860671997,
0.07140517234802246,
-0.05019799619913101,
-0.08685383200645447,
0.011976931244134903,
0.037958551198244095,
0.030720748007297516,
0.0633779764175415,
-0.024546826258301735,
-0.010808492079377174,
0.03310571610927582,
0.017687911167740822,
0.008137037977576256,
-0.11740215867757797,
0.06124936416745186,
0.057848043739795685,
0.0038771566469222307,
0.06239242106676102,
-0.01680281199514866,
-0.03865731880068779,
0.08246633410453796,
0.03361918404698372,
0.0006856528925709426,
-0.016997648403048515,
-0.014452281408011913,
-0.12007120251655579,
0.19086331129074097,
-0.06185164302587509,
-0.16792400181293488,
-0.07658162713050842,
-0.09845707565546036,
0.007102206815034151,
0.025865286588668823,
0.038675591349601746,
-0.020413069054484367,
-0.044179145246744156,
-0.12528203427791595,
0.059112198650836945,
-0.043783362954854965,
0.07134826481342316,
0.11581453680992126,
-0.0406038723886013,
0.05404704809188843,
-0.1250828206539154,
-0.007723470218479633,
-0.08427757024765015,
-0.07359451055526733,
0.05966315045952797,
-0.05033549293875694,
0.025217782706022263,
0.10009606927633286,
0.023828931152820587,
-0.018392525613307953,
-0.025600502267479897,
0.19874173402786255,
0.04375981539487839,
0.04136717692017555,
0.12972362339496613,
-0.061773549765348434,
0.05583970248699188,
0.07942270487546921,
0.008737960830330849,
-0.04527265205979347,
0.05138593167066574,
0.04007057100534439,
-0.0715126097202301,
-0.1967308223247528,
-0.022926028817892075,
-0.008140388876199722,
-0.037853654474020004,
0.07870949804782867,
0.03538835048675537,
-0.0014363336376845837,
0.06782954186201096,
0.010439516045153141,
0.06023070961236954,
-0.00015486903430428356,
0.09608720988035202,
0.0135630052536726,
-0.03596537560224533,
0.08751577138900757,
-0.019940050318837166,
-0.0036188706289976835,
0.08780788630247116,
-0.022732850164175034,
0.29034852981567383,
-0.030165597796440125,
0.011857868172228336,
0.12090479582548141,
0.04346425086259842,
0.06205306947231293,
0.12431962788105011,
-0.06662706285715103,
0.022323409095406532,
-0.0736737847328186,
-0.0590018555521965,
-0.0056545729748904705,
0.04795706644654274,
-0.05779257044196129,
0.012114755809307098,
-0.07424993067979813,
0.0285189226269722,
-0.02011944353580475,
0.30391737818717957,
0.10603121668100357,
-0.10288634151220322,
-0.06034492701292038,
0.0061361296102404594,
-0.09955707937479019,
-0.0733654797077179,
0.04421558976173401,
0.07666388154029846,
-0.13137707114219666,
0.00789208896458149,
-0.023730436339974403,
0.07429604977369308,
-0.012515727430582047,
0.015423858538269997,
0.024443047121167183,
0.03870411962270737,
-0.03648028522729874,
0.005952981300652027,
-0.17607064545154572,
0.19912564754486084,
0.006199415773153305,
0.017870977520942688,
-0.05317864567041397,
0.03380601108074188,
0.006553911603987217,
-0.02825157344341278,
0.06465902179479599,
0.02557416446506977,
-0.040891315788030624,
-0.04271094501018524,
-0.049477603286504745,
0.012330302968621254,
0.07328437268733978,
-0.04078124091029167,
0.1012062057852745,
-0.004896633327007294,
0.042132340371608734,
0.020272839814424515,
0.08514630049467087,
-0.18363358080387115,
-0.08711555600166321,
0.03067311830818653,
-0.05870448797941208,
-0.11368601024150848,
-0.08182433247566223,
-0.09343860298395157,
0.002808936173096299,
0.2536589503288269,
-0.11989836394786835,
-0.07379554212093353,
-0.0949607640504837,
0.03575309365987778,
0.10678552836179733,
-0.050548113882541656,
0.02740061841905117,
-0.007078668102622032,
0.12879176437854767,
-0.06965322047472,
-0.13156737387180328,
0.02401399053633213,
-0.09285930544137955,
-0.16711843013763428,
-0.06625040620565414,
0.11662229895591736,
0.06317824125289917,
0.034861985594034195,
-0.026607394218444824,
0.021606754511594772,
0.03606307506561279,
-0.03892285376787186,
0.0041922773234546185,
0.07066131383180618,
0.08985710144042969,
0.033440764993429184,
-0.10184123367071152,
0.00566435419023037,
-0.06046108901500702,
-0.0688716396689415,
0.07401629537343979,
0.2698538601398468,
-0.059362173080444336,
0.12982353568077087,
0.11947145313024521,
-0.07953531295061111,
-0.15107578039169312,
0.03213632479310036,
0.08986169099807739,
-0.017299683764576912,
0.01648099720478058,
-0.15367022156715393,
0.08803617209196091,
0.11206241697072983,
-0.024076925590634346,
-0.00031065696384757757,
-0.19444212317466736,
-0.13058394193649292,
0.06959899514913559,
0.09910093247890472,
0.27907806634902954,
-0.05772564932703972,
-0.044067587703466415,
0.02177436277270317,
-0.09228475391864777,
0.018159380182623863,
0.12649397552013397,
0.059729669243097305,
-0.027059296146035194,
-0.08254919201135635,
0.015222433023154736,
-0.041663940995931625,
0.09444074332714081,
0.05138501152396202,
0.05502699688076973,
-0.0005059504183009267,
0.022606035694479942,
-0.022592656314373016,
-0.04269373416900635,
0.05904830992221832,
0.02417820319533348,
0.05266591161489487,
-0.08007185161113739,
-0.03070608153939247,
-0.07121185958385468,
0.028621751815080643,
-0.02529209852218628,
-0.07477281242609024,
-0.06025194004178047,
0.07920797169208527,
0.04867831617593765,
-0.02624700590968132,
0.014883694238960743,
0.02878241054713726,
0.11843986809253693,
0.1749514937400818,
-0.004639559891074896,
-0.05370285362005234,
-0.046961698681116104,
-0.038602884858846664,
-0.019043974578380585,
0.07432898879051208,
-0.050780970603227615,
0.0249905064702034,
0.06347234547138214,
0.021869678050279617,
0.09506064653396606,
0.0554109662771225,
-0.11744800955057144,
-0.019288554787635803,
0.0352531336247921,
-0.16420835256576538,
0.010506751947104931,
0.00024211952404584736,
0.03181954845786095,
-0.033291030675172806,
0.023508775979280472,
0.14842091500759125,
-0.06497815996408463,
-0.036865007132291794,
-0.0422140397131443,
0.0710945725440979,
0.02463570050895214,
0.1388184130191803,
0.030797528102993965,
0.03690270707011223,
-0.08195069432258606,
0.12323232740163803,
0.04193035885691643,
-0.04496527835726738,
0.018526770174503326,
-0.020260872319340706,
-0.10741620510816574,
0.01297166757285595,
0.05672714114189148,
0.03230902552604675,
-0.05385560169816017,
-0.008925765752792358,
-0.02273526042699814,
-0.07524963468313217,
0.06562907248735428,
0.18334829807281494,
0.06533823162317276,
0.07182500511407852,
-0.05432696267962456,
-0.035202328115701675,
-0.07737403362989426,
0.04328195005655289,
0.04091152548789978,
0.0731833428144455,
-0.0741250067949295,
0.10848989337682724,
0.01290938537567854,
0.044602882117033005,
-0.030766254290938377,
-0.051850996911525726,
-0.09786422550678253,
-0.05339982733130455,
-0.10147664695978165,
0.009520106948912144,
-0.0799553170800209,
-0.03915540874004364,
-0.0007715333485975862,
-0.0090062590315938,
-0.01073413249105215,
0.0495842806994915,
-0.060858774930238724,
-0.009518122300505638,
-0.0257972851395607,
0.03710607811808586,
-0.0680433064699173,
-0.03807317838072777,
0.032624829560518265,
-0.10045073926448822,
0.09416513890028,
0.04926903918385506,
0.006174607202410698,
0.006940918043255806,
0.09368856251239777,
-0.020001934841275215,
0.022923840209841728,
0.014987723901867867,
-0.04636051878333092,
-0.08589497953653336,
0.004215076565742493,
-0.006467235274612904,
-0.021759305149316788,
-0.008985403925180435,
0.08906204253435135,
-0.08647192269563675,
0.033433277159929276,
-0.0062699769623577595,
-0.009671790525317192,
-0.07283104211091995,
-0.011328226886689663,
0.10024554282426834,
0.0949121043086052,
0.041634585708379745,
-0.0896618664264679,
0.010096275247633457,
-0.14618343114852905,
-0.03733038902282715,
0.005079828668385744,
-0.010447578504681587,
-0.12127522379159927,
-0.006425903178751469,
0.020864402875304222,
-0.0025874401908367872,
0.20973330736160278,
-0.05786479264497757,
-0.020769448950886726,
0.02243695594370365,
-0.09990457445383072,
0.10731706768274307,
-0.02512975037097931,
0.1806078404188156,
-0.010368735529482365,
-0.040502291172742844,
-0.007580332458019257,
0.03502008691430092,
0.022870078682899475,
-0.01737298257648945,
0.19028042256832123,
0.13696056604385376,
0.037693895399570465,
0.04073739051818848,
-0.02401040494441986,
0.002023128792643547,
-0.05762157589197159,
-0.028119739145040512,
0.028019580990076065,
0.03774779289960861,
0.019879736006259918,
0.14243163168430328,
0.07506252825260162,
-0.16844542324543,
0.03260309249162674,
-0.027271103113889694,
-0.03723977506160736,
-0.12012113630771637,
-0.09291239082813263,
-0.03276876360177994,
-0.07061256468296051,
0.00848793052136898,
-0.125116229057312,
0.010692578740417957,
0.16524486243724823,
0.056626494973897934,
0.026840034872293472,
-0.00012985203647986054,
-0.11775663495063782,
-0.03256932646036148,
0.05270649865269661,
0.01623205654323101,
0.02628426067531109,
0.059975337237119675,
-0.002820459892973304,
0.06138625741004944,
0.04187268391251564,
0.01245327852666378,
0.00004591856122715399,
0.07640525698661804,
0.01847189851105213,
0.03985394537448883,
-0.06224872171878815,
-0.0055181169882416725,
-0.043812479823827744,
0.07098621875047684,
0.0969415158033371,
0.051577720791101456,
-0.04784644395112991,
-0.007356400601565838,
0.16061227023601532,
-0.03985247761011124,
0.0024969845544546843,
-0.12584994733333588,
0.3352486491203308,
0.01523655652999878,
0.012062917463481426,
0.05173886567354202,
-0.0729614794254303,
-0.05219404771924019,
0.20213666558265686,
0.08864495903253555,
-0.016294803470373154,
-0.020869499072432518,
-0.0013476123567670584,
-0.03139572963118553,
-0.02200246974825859,
0.15414172410964966,
0.03452229127287865,
0.1257963925600052,
-0.05125836282968521,
-0.04788795858621597,
-0.03099238872528076,
-0.010714647360146046,
-0.12248322367668152,
0.13477188348770142,
-0.026863079518079758,
-0.022653402760624886,
-0.06995978206396103,
0.027005314826965332,
0.07499532401561737,
-0.3181912899017334,
0.004826826974749565,
-0.03193912282586098,
-0.10718866437673569,
-0.0033117954153567553,
-0.015482296235859394,
-0.025156617164611816,
0.04501540586352348,
-0.04574957117438316,
0.07127366960048676,
0.037038110196590424,
0.034496769309043884,
-0.024762917309999466,
-0.09784416854381561,
0.16621030867099762,
0.05630716308951378,
0.09843180328607559,
0.026186924427747726,
0.07294954359531403,
0.05525851994752884,
0.03780578449368477,
-0.10173529386520386,
0.037833068519830704,
0.012493002228438854,
-0.08084305375814438,
-0.053238194435834885,
0.1235065683722496,
-0.0021114307455718517,
0.04679664969444275,
0.045521847903728485,
-0.1090838611125946,
0.008652039803564548,
0.0670415610074997,
-0.06609642505645752,
-0.09549079835414886,
-0.006199665833264589,
-0.08745930343866348,
0.15632830560207367,
0.14021286368370056,
-0.016969988122582436,
0.019261980429291725,
-0.0663277730345726,
-0.0074497307650744915,
0.052005164325237274,
0.022523246705532074,
-0.01618964783847332,
-0.19353994727134705,
0.03381214663386345,
-0.07878580689430237,
-0.0011579931015148759,
-0.2247626632452011,
-0.1029510647058487,
-0.012294100597500801,
-0.05098658427596092,
-0.029459629207849503,
0.06463732570409775,
0.026924768462777138,
0.06543046981096268,
-0.015045860782265663,
-0.03688559681177139,
-0.02923484519124031,
0.08986574411392212,
-0.10843507945537567,
-0.062285903841257095
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1800k (max: 2000k, i.e., 2M steps).
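Because every checkpoint in the collection is addressed by the same `google/multiberts-seed_<seed>-step_<step>k` naming pattern, sibling checkpoints of this run can be loaded simply by changing the suffix. The sketch below is illustrative only and is not part of the original release notes: the step values are placeholders, since the exact list of released steps is documented in the MultiBERTs release rather than here, and not every (seed, step) combination exists.
```
from transformers import BertModel

seed = 2
for step_k in [20, 200, 1800]:  # placeholder steps; check the release for the real list
    checkpoint = f"google/multiberts-seed_{seed}-step_{step_k}k"
    print(checkpoint)

# Loading one intermediate checkpoint, e.g. the one described on this page:
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1800k")
```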
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to those of [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1800k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1800k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
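The `output` object returned above is a standard `transformers` model output rather than a single tensor. As a small usage sketch added for illustration (assuming the usual `BertModel` output fields), the per-token and pooled sentence representations can be read like this:
```
from transformers import BertTokenizer, BertModel
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1800k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1800k")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# Contextual embedding for every token: (batch_size, sequence_length, 768)
print(output.last_hidden_state.shape)
# Pooled [CLS] representation, one vector per input: (batch_size, 768)
print(output.pooler_output.shape)
```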
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1800k"]}
| null |
google/multiberts-seed_2-step_1800k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1800k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to those of BERT-base uncased. Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1800k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07638708502054214,
0.1028418019413948,
-0.002597078215330839,
0.0419437475502491,
0.07694916427135468,
-0.01484489906579256,
0.07759273797273636,
0.10077356547117233,
-0.017882969230413437,
0.024676235392689705,
0.07789377123117447,
-0.0014922389527782798,
0.01717465929687023,
0.09721887856721878,
0.025023603811860085,
-0.21813832223415375,
0.020031308755278587,
-0.03137301653623581,
-0.09068998694419861,
0.07644812017679214,
0.10000436007976532,
-0.0801793560385704,
0.044949181377887726,
0.030404645949602127,
-0.11234834790229797,
0.047916095703840256,
0.0031497222371399403,
-0.02099643647670746,
0.13164329528808594,
-0.0030241655185818672,
0.054729361087083817,
0.0545799545943737,
0.04304268956184387,
-0.13402217626571655,
0.00607320899143815,
0.05831020325422287,
0.06006086245179176,
0.04496363177895546,
0.021420827135443687,
0.07192578911781311,
0.003057992085814476,
0.02852260135114193,
0.047200750559568405,
0.025462834164500237,
-0.07722410559654236,
-0.05719577521085739,
-0.10171359032392502,
0.03597554191946983,
0.025303607806563377,
0.007197179831564426,
0.011983435600996017,
0.1250670701265335,
-0.03301815688610077,
0.04132812097668648,
0.18038733303546906,
-0.32855090498924255,
-0.013328826054930687,
0.07125009596347809,
0.03784552589058876,
0.13121944665908813,
-0.0022352878004312515,
-0.020282074809074402,
0.07474809885025024,
0.024044286459684372,
0.08983495831489563,
-0.04018707200884819,
0.029032226651906967,
-0.051858190447092056,
-0.154673770070076,
-0.048387013375759125,
0.08928582072257996,
-0.0034731486812233925,
-0.135429248213768,
-0.033382806926965714,
-0.043133992701768875,
0.02818678319454193,
0.0117014916613698,
-0.03680695965886116,
0.03423844650387764,
0.015579290688037872,
-0.018179306760430336,
-0.013477038592100143,
-0.10557112097740173,
-0.051783040165901184,
0.03126298263669014,
0.07882558554410934,
0.10017742961645126,
0.06845703721046448,
-0.0009902530582621694,
0.1080300360918045,
-0.1891224980354309,
-0.050875768065452576,
-0.028027964755892754,
-0.053351301699876785,
-0.0523989237844944,
-0.010122508741915226,
-0.11193099617958069,
-0.04493085294961929,
0.00636383518576622,
0.13116630911827087,
-0.01129861269146204,
0.027734724804759026,
-0.034613896161317825,
0.00958478543907404,
0.05753643065690994,
0.04201047122478485,
-0.005503173917531967,
0.02726907655596733,
0.019055604934692383,
-0.013084370642900467,
-0.022216180339455605,
0.013226778246462345,
0.0023234731052070856,
0.028979258611798286,
0.11809098720550537,
0.022643379867076874,
-0.09923837333917618,
0.06295762956142426,
-0.020956091582775116,
-0.04837643727660179,
0.01025821641087532,
-0.09079182147979736,
-0.05730101466178894,
-0.04011164605617523,
0.0024732525926083326,
0.008684379979968071,
-0.0026658002752810717,
-0.0020351677667349577,
-0.023572325706481934,
-0.03792569413781166,
-0.08390217274427414,
-0.04784562438726425,
-0.05465765669941902,
-0.1222226545214653,
0.007738406304270029,
-0.1814536303281784,
-0.03801080957055092,
-0.1136329397559166,
-0.18616604804992676,
-0.02158569172024727,
0.06710474193096161,
-0.011592135764658451,
-0.05854678899049759,
0.08117314428091049,
0.04302937909960747,
-0.029813619330525398,
-0.0017233690014109015,
0.07260461896657944,
-0.00012139206955907866,
0.04286766052246094,
-0.031513508409261703,
0.06765232980251312,
0.003560525132343173,
0.03603789582848549,
-0.056841857731342316,
0.060831218957901,
-0.17750269174575806,
0.04467339441180229,
-0.07091920077800751,
-0.028085000813007355,
-0.0859348326921463,
-0.03424161672592163,
-0.013164129108190536,
0.00509006530046463,
0.021225396543741226,
0.0726587101817131,
-0.17745563387870789,
-0.030458973720669746,
0.11602895706892014,
-0.16358478367328644,
-0.021327489987015724,
0.0723772644996643,
-0.049124348908662796,
0.10173242539167404,
0.06783106923103333,
0.158035546541214,
-0.005998888984322548,
-0.07336167991161346,
0.06084483116865158,
-0.0166622381657362,
0.010795565322041512,
-0.008699363097548485,
0.07086679339408875,
-0.021853065118193626,
-0.1542910486459732,
0.03333880752325058,
-0.1313389390707016,
-0.004344469401985407,
-0.07676195353269577,
0.014214799739420414,
-0.010670897550880909,
-0.06820596009492874,
-0.06808732450008392,
-0.025224948301911354,
0.06552833318710327,
-0.07497964799404144,
-0.018291695043444633,
0.035577572882175446,
0.07322685420513153,
-0.06906847655773163,
0.07004249840974808,
-0.014584372751414776,
0.007174412254244089,
-0.07916219532489777,
-0.03993914648890495,
-0.19105641543865204,
0.05083560571074486,
0.09864982962608337,
0.007645476143807173,
-0.016457824036478996,
0.1407642811536789,
0.006795336026698351,
0.06234479323029518,
-0.04941992089152336,
0.014706632122397423,
-0.016876189038157463,
-0.004409827291965485,
-0.08515892922878265,
-0.10041812062263489,
-0.0723237693309784,
-0.06724610924720764,
0.09237474203109741,
-0.12095660716295242,
0.021821722388267517,
-0.05659082159399986,
0.04202871024608612,
0.02177225612103939,
-0.08563157916069031,
-0.02012646198272705,
0.012433549389243126,
-0.06035474315285683,
-0.05472284182906151,
0.04005809500813484,
0.0722406879067421,
-0.010430256836116314,
0.09564435482025146,
-0.04998219013214111,
-0.09031709283590317,
0.03273867443203926,
0.09453960508108139,
-0.10545776784420013,
0.014411192387342453,
-0.0569954589009285,
-0.04332035779953003,
-0.06335382908582687,
-0.017093507573008537,
0.08093148469924927,
-0.007089170161634684,
0.13551098108291626,
-0.07574687153100967,
-0.0035291239619255066,
0.015744337812066078,
-0.02177545614540577,
-0.02354772388935089,
0.03529265522956848,
0.06544958055019379,
-0.06866415590047836,
0.01423319336026907,
0.04095402732491493,
0.011870705522596836,
0.07018265873193741,
-0.05219069495797157,
-0.08892835676670074,
0.012546584941446781,
0.034715112298727036,
0.031888410449028015,
0.06653379648923874,
-0.021479332819581032,
-0.01095554418861866,
0.033789634704589844,
0.017992716282606125,
0.007823319174349308,
-0.1158863753080368,
0.061788223683834076,
0.05901733785867691,
0.005460201296955347,
0.06556340306997299,
-0.02132989466190338,
-0.0380944162607193,
0.07966973632574081,
0.03644362837076187,
0.0012710406444966793,
-0.015493497252464294,
-0.016102571040391922,
-0.11964914947748184,
0.18884362280368805,
-0.0626162514090538,
-0.16844744980335236,
-0.07981320470571518,
-0.10698666423559189,
0.005169316660612822,
0.027035346254706383,
0.03743107244372368,
-0.021279126405715942,
-0.044775690883398056,
-0.12446390092372894,
0.054998233914375305,
-0.04242338240146637,
0.067196324467659,
0.11546240001916885,
-0.037954650819301605,
0.05706587806344032,
-0.12387464195489883,
-0.008026108145713806,
-0.08281300961971283,
-0.07375474274158478,
0.06202448531985283,
-0.048075102269649506,
0.02610447257757187,
0.1051015630364418,
0.02314046025276184,
-0.017542777583003044,
-0.02422979474067688,
0.19490540027618408,
0.039279356598854065,
0.039881810545921326,
0.1279839128255844,
-0.06736229360103607,
0.0551287904381752,
0.08431747555732727,
0.009211249649524689,
-0.0427304245531559,
0.053648874163627625,
0.045380301773548126,
-0.06998861581087112,
-0.1996547281742096,
-0.026163984090089798,
-0.007560067810118198,
-0.04321203753352165,
0.07745160907506943,
0.03617466986179352,
0.008866759948432446,
0.06806635111570358,
0.013219624757766724,
0.05721386522054672,
-0.001499962992966175,
0.09748297929763794,
0.012915089726448059,
-0.03263891860842705,
0.08656913042068481,
-0.019666289910674095,
-0.009371898137032986,
0.08592066913843155,
-0.02378838136792183,
0.29365506768226624,
-0.028672311455011368,
0.01688234880566597,
0.12186337262392044,
0.03749759867787361,
0.06357911974191666,
0.12015146762132645,
-0.0659276694059372,
0.022649168968200684,
-0.073039211332798,
-0.059364818036556244,
-0.00395919056609273,
0.04909472540020943,
-0.05972806736826897,
0.012675684876739979,
-0.0722908303141594,
0.02411497011780739,
-0.018853571265935898,
0.30766451358795166,
0.11023958772420883,
-0.10207968205213547,
-0.05565768480300903,
0.004506607074290514,
-0.09786076098680496,
-0.07005900144577026,
0.043761737644672394,
0.06788857281208038,
-0.13331830501556396,
0.006991542410105467,
-0.026588037610054016,
0.07402895390987396,
-0.010986587963998318,
0.017509281635284424,
0.031432684510946274,
0.03359255567193031,
-0.036888692528009415,
0.008491825312376022,
-0.18367719650268555,
0.19689059257507324,
0.005843614228069782,
0.015661532059311867,
-0.05087278410792351,
0.034138623625040054,
0.007726912386715412,
-0.032852400094270706,
0.06294509023427963,
0.025401949882507324,
-0.030750099569559097,
-0.038301534950733185,
-0.0509893037378788,
0.012801352888345718,
0.0779171958565712,
-0.04498280957341194,
0.10467340052127838,
-0.004587661474943161,
0.04286141321063042,
0.021510276943445206,
0.08819536119699478,
-0.18165719509124756,
-0.0915323793888092,
0.03358469903469086,
-0.05761542171239853,
-0.10675659775733948,
-0.08151676505804062,
-0.09381985664367676,
0.0015669558197259903,
0.2540406882762909,
-0.11286720633506775,
-0.07419247925281525,
-0.09453567862510681,
0.030388332903385162,
0.10528136789798737,
-0.04821668937802315,
0.029922230169177055,
-0.005493059754371643,
0.1325587034225464,
-0.06794966012239456,
-0.13156667351722717,
0.02140800654888153,
-0.08971016854047775,
-0.1636740118265152,
-0.0667840987443924,
0.1168859675526619,
0.06208094581961632,
0.03717949986457825,
-0.03130718320608139,
0.026386834681034088,
0.03414883837103844,
-0.036363255232572556,
-0.0004395894065964967,
0.06734641641378403,
0.09486941248178482,
0.03777606412768364,
-0.10716158151626587,
0.009867102839052677,
-0.0644872710108757,
-0.06915922462940216,
0.07888142019510269,
0.26376068592071533,
-0.05781415477395058,
0.13054808974266052,
0.11709413677453995,
-0.08064902573823929,
-0.15267786383628845,
0.03243134915828705,
0.09103450924158096,
-0.018080133944749832,
0.019222317263484,
-0.15979236364364624,
0.08926030993461609,
0.10936705768108368,
-0.02354525960981846,
-0.0009000421268865466,
-0.1881195455789566,
-0.12976862490177155,
0.06958997994661331,
0.09641927480697632,
0.2726430296897888,
-0.058911047875881195,
-0.04454413428902626,
0.018980341032147408,
-0.09896285086870193,
0.019829383119940758,
0.12329366058111191,
0.06004271283745766,
-0.023691801354289055,
-0.07575076073408127,
0.015139369294047356,
-0.041113853454589844,
0.09343982487916946,
0.054383039474487305,
0.05402219668030739,
-0.00321589526720345,
0.01665792427957058,
-0.019506504759192467,
-0.0455411821603775,
0.060226719826459885,
0.025945091620087624,
0.048861581832170486,
-0.0803542360663414,
-0.02899261564016342,
-0.06898178905248642,
0.028348566964268684,
-0.0254619512706995,
-0.07708388566970825,
-0.060942381620407104,
0.08023606240749359,
0.04871438443660736,
-0.024919753894209862,
0.01934218965470791,
0.0291560310870409,
0.11570697277784348,
0.16964487731456757,
-0.0040338123217225075,
-0.054906733334064484,
-0.05520812049508095,
-0.03849305957555771,
-0.01692650280892849,
0.0710965245962143,
-0.050944823771715164,
0.02722161076962948,
0.06496922671794891,
0.022037789225578308,
0.09760117530822754,
0.056748222559690475,
-0.1180822104215622,
-0.02034459076821804,
0.03503482788801193,
-0.16374458372592926,
0.018248500302433968,
-0.0025263179559260607,
0.037019647657871246,
-0.035495828837156296,
0.02421058714389801,
0.14918209612369537,
-0.06469399482011795,
-0.035573430359363556,
-0.03928660973906517,
0.06978236883878708,
0.024459945037961006,
0.1348622441291809,
0.0335218645632267,
0.03644566237926483,
-0.08167269080877304,
0.12657378613948822,
0.04366880655288696,
-0.043148305267095566,
0.021617859601974487,
-0.022523362189531326,
-0.10738685727119446,
0.01295842882245779,
0.060145601630210876,
0.03844909369945526,
-0.05136330425739288,
-0.009770546108484268,
-0.027300024405121803,
-0.07473597675561905,
0.06382948160171509,
0.1846267133951187,
0.0665270984172821,
0.07215841114521027,
-0.05437877029180527,
-0.036739032715559006,
-0.07750511169433594,
0.0447048544883728,
0.03967800736427307,
0.07448606193065643,
-0.07450402528047562,
0.10433954745531082,
0.013758626766502857,
0.04550950229167938,
-0.02992304600775242,
-0.05546910688281059,
-0.09680506587028503,
-0.05141390860080719,
-0.09686953574419022,
0.007947965525090694,
-0.07429930567741394,
-0.04125872626900673,
0.00023580707784276456,
-0.006551553029567003,
-0.01131089124828577,
0.0495138056576252,
-0.0614171139895916,
-0.009770624339580536,
-0.027503184974193573,
0.03474027290940285,
-0.06549136340618134,
-0.037006184458732605,
0.032293643802404404,
-0.10036876052618027,
0.09201555699110031,
0.04933464899659157,
0.006025835406035185,
0.007434865925461054,
0.09325314313173294,
-0.022389406338334084,
0.019100433215498924,
0.0158134326338768,
-0.047743234783411026,
-0.0888093113899231,
0.0011979839764535427,
-0.00801369734108448,
-0.0187554769217968,
-0.010514571331441402,
0.08812476694583893,
-0.08670490980148315,
0.033746931701898575,
-0.006065131165087223,
-0.00946204923093319,
-0.07432964444160461,
-0.013482000678777695,
0.09796512871980667,
0.09644828736782074,
0.045330118387937546,
-0.08579576015472412,
0.013213190250098705,
-0.14355242252349854,
-0.03675954043865204,
0.005012253299355507,
-0.0059037418104708195,
-0.12478362023830414,
-0.00858111958950758,
0.017448104918003082,
-0.003327451180666685,
0.21034570038318634,
-0.05605795979499817,
-0.02338261716067791,
0.022732723504304886,
-0.09905409812927246,
0.10832688212394714,
-0.021883506327867508,
0.17832492291927338,
-0.007373512256890535,
-0.04217168688774109,
-0.011765341274440289,
0.03338758647441864,
0.0216358695179224,
-0.02156851626932621,
0.18708953261375427,
0.14126721024513245,
0.04160795733332634,
0.03891308605670929,
-0.022328734397888184,
0.002939564175903797,
-0.04789051041007042,
-0.019718753173947334,
0.02866259030997753,
0.03618452325463295,
0.023574456572532654,
0.15136778354644775,
0.07371166348457336,
-0.16597825288772583,
0.036424148827791214,
-0.0304515790194273,
-0.03713792562484741,
-0.11509339511394501,
-0.08738932013511658,
-0.032639916986227036,
-0.07259897142648697,
0.00706349965184927,
-0.1250515878200531,
0.008822834119200706,
0.1802624762058258,
0.05638115108013153,
0.026735901832580566,
0.0031320350244641304,
-0.12787070870399475,
-0.03454415127635002,
0.05118453502655029,
0.013410834595561028,
0.02671218104660511,
0.05795411393046379,
-0.00038913614116609097,
0.059899017214775085,
0.04601321369409561,
0.013787563890218735,
0.002698551630601287,
0.0820758044719696,
0.017043547704815865,
0.04128488525748253,
-0.06336992233991623,
-0.0030563774053007364,
-0.04589422792196274,
0.07083945721387863,
0.09405320882797241,
0.04742119461297989,
-0.04665205255150795,
-0.006348244845867157,
0.16352543234825134,
-0.04066159203648567,
-0.004225396551191807,
-0.1290666162967682,
0.3265684247016907,
0.01422622799873352,
0.01138707809150219,
0.047821830958127975,
-0.07667853683233261,
-0.0510677769780159,
0.20326395332813263,
0.09015797823667526,
-0.020384011790156364,
-0.020946132019162178,
0.0026355900336056948,
-0.0301880594342947,
-0.022557683289051056,
0.1488116830587387,
0.03395594283938408,
0.1256859451532364,
-0.053773511201143265,
-0.04986006021499634,
-0.03067437931895256,
-0.008866541087627411,
-0.12436577677726746,
0.13656961917877197,
-0.026626277714967728,
-0.022721558809280396,
-0.07021293044090271,
0.026913190260529518,
0.07346182316541672,
-0.31588277220726013,
0.0008397040073759854,
-0.03885501250624657,
-0.1083444207906723,
-0.00309684406965971,
-0.013915777206420898,
-0.027462538331747055,
0.04650570824742317,
-0.04558003693819046,
0.06807713210582733,
0.0371423177421093,
0.03579855337738991,
-0.02330200746655464,
-0.09555783867835999,
0.16784124076366425,
0.05002708360552788,
0.09788987785577774,
0.02617965266108513,
0.07215665280818939,
0.05524507537484169,
0.03700483590364456,
-0.09739259630441666,
0.039871424436569214,
0.012399132363498211,
-0.0824570283293724,
-0.05330980196595192,
0.12196196615695953,
-0.002319005550816655,
0.04431547969579697,
0.04585900157690048,
-0.10743305832147598,
0.01022628415375948,
0.06905785948038101,
-0.06627300381660461,
-0.09523594379425049,
-0.009364387020468712,
-0.09101448208093643,
0.15439531207084656,
0.13891777396202087,
-0.016268745064735413,
0.02045811153948307,
-0.06607218086719513,
-0.009360944852232933,
0.05125473812222481,
0.01766263134777546,
-0.01673070713877678,
-0.1930021196603775,
0.03484627604484558,
-0.07726342976093292,
-0.0029945073183625937,
-0.23569433391094208,
-0.10220964252948761,
-0.008533690124750137,
-0.049671925604343414,
-0.02606850676238537,
0.06169816479086876,
0.03272265940904617,
0.06687233597040176,
-0.017527630552649498,
-0.02857665717601776,
-0.027292829006910324,
0.09016336500644684,
-0.10778731107711792,
-0.06377080082893372
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 180k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 180k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to those of [BERT-base uncased](https://github.com/google-research/bert). Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_180k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_180k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_180k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_180k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
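Since the checkpoint was trained with the MLM objective, it can also be queried directly for masked-token predictions. The sketch below is an illustrative addition, not part of the original card: it assumes the released checkpoint retains the masked-language-modelling head from pre-training (loaded here via `BertForMaskedLM`); if that head were missing, it would be randomly initialised and the predictions uninformative.
```
from transformers import BertTokenizer, BertForMaskedLM
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_180k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_2-step_180k')

inputs = tokenizer("The capital of France is [MASK].", return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Position of the [MASK] token, then the top-5 token predictions at that position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = torch.topk(logits[0, mask_pos], k=5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top5.tolist()))
```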
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_180k"]}
| null |
google/multiberts-seed_2-step_180k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_180k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 180k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 180k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to those of BERT-base uncased. Two major
differences from the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 180k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 180k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 180k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07583363354206085,
0.0914754644036293,
-0.0025821093004196882,
0.04073618724942207,
0.07481149584054947,
-0.017242539674043655,
0.07377652823925018,
0.10065759718418121,
-0.020771054551005363,
0.0269455686211586,
0.07927268743515015,
0.006784132681787014,
0.016553286463022232,
0.10226816684007645,
0.024322660639882088,
-0.22581110894680023,
0.02068740874528885,
-0.02990736998617649,
-0.09520290046930313,
0.07798502594232559,
0.09953386336565018,
-0.07784047722816467,
0.04486953839659691,
0.031681183725595474,
-0.11084554344415665,
0.045365527272224426,
0.002009548479691148,
-0.016817724332213402,
0.1347186267375946,
0.0004965075640939176,
0.052366819232702255,
0.05366193875670433,
0.04310235381126404,
-0.13395696878433228,
0.006786548066884279,
0.05976288765668869,
0.059325188398361206,
0.04646293446421623,
0.024029888212680817,
0.07400228828191757,
0.010773156769573689,
0.025315823033452034,
0.04361042007803917,
0.024751856923103333,
-0.07660365849733353,
-0.06203577294945717,
-0.1025748997926712,
0.03639455884695053,
0.02599528804421425,
0.008292204700410366,
0.009880779311060905,
0.12354258447885513,
-0.03350592032074928,
0.04157614707946777,
0.1872757524251938,
-0.3362855613231659,
-0.014539387077093124,
0.06440629065036774,
0.04150024428963661,
0.12431153655052185,
-0.003149206517264247,
-0.01747666858136654,
0.0772174820303917,
0.025416024029254913,
0.0935400202870369,
-0.03889404237270355,
0.03376700356602669,
-0.05587942525744438,
-0.15822185575962067,
-0.046865321695804596,
0.08442054688930511,
-0.001999570755288005,
-0.13526390492916107,
-0.036813512444496155,
-0.046344682574272156,
0.03500840440392494,
0.012719743885099888,
-0.03745992109179497,
0.0313480906188488,
0.01599782519042492,
-0.019561350345611572,
-0.014288779348134995,
-0.10740083456039429,
-0.05516725406050682,
0.027628028765320778,
0.08565502613782883,
0.1015896126627922,
0.0664861872792244,
0.0005097747780382633,
0.11017880588769913,
-0.1844140738248825,
-0.05078519508242607,
-0.027989182621240616,
-0.05528014898300171,
-0.05056029558181763,
-0.008908162824809551,
-0.10672885924577713,
-0.04340360686182976,
0.00847754254937172,
0.13043443858623505,
-0.012261440977454185,
0.02501830831170082,
-0.038210898637771606,
0.008299639448523521,
0.05600973591208458,
0.0446498766541481,
-0.007862776517868042,
0.030624546110630035,
0.018820319324731827,
-0.01134407613426447,
-0.021557826548814774,
0.014704649336636066,
0.004516636952757835,
0.022838611155748367,
0.11782818287611008,
0.025618858635425568,
-0.10183841735124588,
0.06660578399896622,
-0.01961897872388363,
-0.04786066338419914,
0.01888132281601429,
-0.08668342977762222,
-0.05945410206913948,
-0.037684496492147446,
0.003191909287124872,
0.01698373258113861,
-0.0023778676986694336,
-0.0061698853969573975,
-0.020920444279909134,
-0.03679490089416504,
-0.081719309091568,
-0.0462128147482872,
-0.05249376222491264,
-0.12336955219507217,
0.008948109112679958,
-0.18123741447925568,
-0.03739634528756142,
-0.11663945764303207,
-0.18019822239875793,
-0.020187752321362495,
0.06299635022878647,
-0.011290980502963066,
-0.05406314879655838,
0.08089040964841843,
0.04501865804195404,
-0.029012879356741905,
-0.002736520254984498,
0.08089179545640945,
-0.002349268877878785,
0.0447993166744709,
-0.0306716151535511,
0.06938222795724869,
0.007537854835391045,
0.034496452659368515,
-0.05691876262426376,
0.06234405189752579,
-0.17263205349445343,
0.04276972636580467,
-0.07263193279504776,
-0.031435124576091766,
-0.08906171470880508,
-0.03250298276543617,
-0.016770506277680397,
0.006192938890308142,
0.021097607910633087,
0.07094407826662064,
-0.18682582676410675,
-0.03019193932414055,
0.12109224498271942,
-0.16348516941070557,
-0.021277841180562973,
0.07582235336303711,
-0.04797369986772537,
0.10398374497890472,
0.06777466833591461,
0.15397800505161285,
-0.012556441128253937,
-0.08291139453649521,
0.061730772256851196,
-0.013537096790969372,
0.008684910833835602,
-0.005426844581961632,
0.07113054394721985,
-0.021552493795752525,
-0.15385298430919647,
0.032832931727170944,
-0.13319604098796844,
-0.001835134346038103,
-0.0776626467704773,
0.017137659713625908,
-0.01063917949795723,
-0.06703535467386246,
-0.0650189071893692,
-0.02601233683526516,
0.06598974764347076,
-0.07390186190605164,
-0.020506419241428375,
0.03396948426961899,
0.07438787072896957,
-0.07194028049707413,
0.06820279359817505,
-0.011313815601170063,
0.0098795834928751,
-0.07929184287786484,
-0.03792290389537811,
-0.19052168726921082,
0.05238719284534454,
0.09750895202159882,
0.004521322436630726,
-0.019341962411999702,
0.1466798335313797,
0.008841345086693764,
0.06466013193130493,
-0.04983232915401459,
0.014944713562726974,
-0.013474381528794765,
-0.004128576721996069,
-0.08897481858730316,
-0.09782074391841888,
-0.07429516315460205,
-0.06733448058366776,
0.09001637250185013,
-0.12361850589513779,
0.019092824310064316,
-0.059162937104701996,
0.04429444670677185,
0.022354187443852425,
-0.08056411147117615,
-0.017710374668240547,
0.011219145730137825,
-0.05955076590180397,
-0.054925426840782166,
0.04091506078839302,
0.07076021283864975,
-0.011154609732329845,
0.09727277606725693,
-0.05018378421664238,
-0.09044229239225388,
0.03087565489113331,
0.09410130977630615,
-0.10612461715936661,
0.01130114309489727,
-0.05825154483318329,
-0.04223130643367767,
-0.06969941407442093,
-0.018010927364230156,
0.07576704025268555,
-0.004558139480650425,
0.135177880525589,
-0.07444751262664795,
-0.003943847492337227,
0.013024707324802876,
-0.02344908006489277,
-0.021873440593481064,
0.04160727560520172,
0.058092013001441956,
-0.0795290544629097,
0.013762637972831726,
0.04680180549621582,
0.009677738882601261,
0.07323092222213745,
-0.0536823570728302,
-0.0904962345957756,
0.01270439475774765,
0.03789466992020607,
0.0313105545938015,
0.063748799264431,
-0.014349333941936493,
-0.011283906176686287,
0.03478270769119263,
0.018573658540844917,
0.005530079826712608,
-0.11701478064060211,
0.060216024518013,
0.0577746257185936,
0.0019490851555019617,
0.06908165663480759,
-0.015699489042162895,
-0.03991128131747246,
0.08016841858625412,
0.03793919086456299,
0.0012204113882035017,
-0.014860639348626137,
-0.014271912164986134,
-0.11638741195201874,
0.1903809756040573,
-0.0580868199467659,
-0.1656418740749359,
-0.07501115649938583,
-0.09571317583322525,
0.0039163692854344845,
0.025740379467606544,
0.03950975090265274,
-0.021682191640138626,
-0.04229169711470604,
-0.12202226370573044,
0.06926814466714859,
-0.039024725556373596,
0.06704119592905045,
0.10790380090475082,
-0.038254979997873306,
0.05548616871237755,
-0.12542316317558289,
-0.007699314970523119,
-0.08536064624786377,
-0.0755554661154747,
0.05914147570729256,
-0.05311398580670357,
0.028900720179080963,
0.09848558902740479,
0.025734875351190567,
-0.018679291009902954,
-0.024442806839942932,
0.19288857281208038,
0.0398864708840847,
0.03852289542555809,
0.13467207551002502,
-0.06375864893198013,
0.05458817631006241,
0.07866600155830383,
0.008695088326931,
-0.04260246455669403,
0.054221849888563156,
0.049164075404405594,
-0.07040571421384811,
-0.19690315425395966,
-0.02372574619948864,
-0.00624818354845047,
-0.04129306226968765,
0.07910054922103882,
0.03529658168554306,
0.004741753451526165,
0.07022600620985031,
0.00989231001585722,
0.05838288739323616,
-0.001201633014716208,
0.09626634418964386,
0.011765984818339348,
-0.0312211811542511,
0.08942994475364685,
-0.02147805318236351,
-0.011065102182328701,
0.08457260578870773,
-0.019068755209445953,
0.2937171161174774,
-0.031261857599020004,
0.019107231870293617,
0.12268456816673279,
0.04020701348781586,
0.06393022835254669,
0.1257493942975998,
-0.06377264112234116,
0.021534763276576996,
-0.07481861114501953,
-0.06101800128817558,
-0.009318683296442032,
0.047374315559864044,
-0.05941582843661308,
0.010391595773398876,
-0.07359201461076736,
0.022525357082486153,
-0.018233928829431534,
0.31167638301849365,
0.11129258573055267,
-0.10059400647878647,
-0.05853477865457535,
0.004581657703965902,
-0.09814664721488953,
-0.07015057653188705,
0.04450938105583191,
0.0672282949090004,
-0.13373728096485138,
0.004815664142370224,
-0.028243765234947205,
0.07611563056707382,
-0.020252106711268425,
0.018815966323018074,
0.025813156738877296,
0.03495720773935318,
-0.0374392606317997,
0.009989669546484947,
-0.1873699575662613,
0.20268648862838745,
0.005519141908735037,
0.01937054470181465,
-0.053540099412202835,
0.03121447004377842,
0.009305285289883614,
-0.03314786031842232,
0.06529173254966736,
0.02173593081533909,
-0.036237578839063644,
-0.04376436024904251,
-0.049501143395900726,
0.012825742363929749,
0.07562991976737976,
-0.04614763334393501,
0.103431835770607,
-0.0061852531507611275,
0.042173802852630615,
0.020158344879746437,
0.09359955787658691,
-0.18479222059249878,
-0.08642514050006866,
0.03052917867898941,
-0.06324266642332077,
-0.10415105521678925,
-0.08053772896528244,
-0.09458499401807785,
0.001742016989737749,
0.2464262992143631,
-0.1092676892876625,
-0.07430406659841537,
-0.09428000450134277,
0.02752181887626648,
0.10429354012012482,
-0.05163060873746872,
0.023952333256602287,
-0.007458026986569166,
0.13189995288848877,
-0.0693972185254097,
-0.13532932102680206,
0.022686637938022614,
-0.09146270900964737,
-0.16551166772842407,
-0.06381102651357651,
0.11779695749282837,
0.06394533067941666,
0.03617189824581146,
-0.027518967166543007,
0.02241748943924904,
0.033771857619285583,
-0.03693312779068947,
0.0016493238508701324,
0.06607896089553833,
0.09310880303382874,
0.03029431216418743,
-0.1096956804394722,
0.014795851893723011,
-0.06558069586753845,
-0.06642985343933105,
0.07669245451688766,
0.25995108485221863,
-0.05749064311385155,
0.12392264604568481,
0.11828607320785522,
-0.07758653908967972,
-0.15279924869537354,
0.027793070301413536,
0.09428089112043381,
-0.01651214435696602,
0.019026417285203934,
-0.15943177044391632,
0.0863582119345665,
0.11738619953393936,
-0.025473395362496376,
0.002507430501282215,
-0.18709811568260193,
-0.12893760204315186,
0.07314125448465347,
0.09457886219024658,
0.26848727464675903,
-0.060706112533807755,
-0.04256899654865265,
0.016737084835767746,
-0.09704490751028061,
0.026100771501660347,
0.11496356874704361,
0.06349720060825348,
-0.023258376866579056,
-0.07513923197984695,
0.013729827478528023,
-0.03945385292172432,
0.0931335985660553,
0.05285688117146492,
0.05516726151108742,
-0.0017904720734804869,
0.019978445023298264,
-0.030035972595214844,
-0.04603113234043121,
0.06313185393810272,
0.023196203634142876,
0.04752662032842636,
-0.07969844341278076,
-0.029895193874835968,
-0.07138269394636154,
0.03101184591650963,
-0.024651318788528442,
-0.07802147418260574,
-0.05968650430440903,
0.08030513674020767,
0.046656616032123566,
-0.024450788274407387,
0.023964956402778625,
0.027879776433110237,
0.11901631951332092,
0.1669192910194397,
-0.004161396995186806,
-0.04460625723004341,
-0.05787240341305733,
-0.039581362158060074,
-0.01740114390850067,
0.07436321675777435,
-0.0532064251601696,
0.026945970952510834,
0.06393569707870483,
0.021967580541968346,
0.09678687900304794,
0.0558745451271534,
-0.11890270560979843,
-0.017577681690454483,
0.030742354691028595,
-0.16198252141475677,
0.013364527374505997,
-0.0006962898187339306,
0.029527975246310234,
-0.03210882097482681,
0.030330583453178406,
0.1521434336900711,
-0.06529010087251663,
-0.0353693850338459,
-0.040271442383527756,
0.06820657849311829,
0.021501842886209488,
0.13815298676490784,
0.03430332988500595,
0.03678483888506889,
-0.08023885637521744,
0.12125945091247559,
0.038340743631124496,
-0.035128094255924225,
0.020214848220348358,
-0.026646684855222702,
-0.1062764897942543,
0.010905473493039608,
0.059610579162836075,
0.0368879996240139,
-0.05494305491447449,
-0.010871806181967258,
-0.021403267979621887,
-0.07591229677200317,
0.059894997626543045,
0.18467018008232117,
0.06889147311449051,
0.073958620429039,
-0.05374707654118538,
-0.03537390008568764,
-0.07521034777164459,
0.043068207800388336,
0.041162826120853424,
0.07398153096437454,
-0.07267526537179947,
0.10637394338846207,
0.012503858655691147,
0.04656780883669853,
-0.03199785575270653,
-0.05202740058302879,
-0.09916432946920395,
-0.05222382768988609,
-0.10546541213989258,
0.006830994039773941,
-0.07635548710823059,
-0.040838975459337234,
-0.0004266260948497802,
-0.005917968228459358,
-0.005992654245346785,
0.04848318547010422,
-0.06148491054773331,
-0.008279388770461082,
-0.027929510921239853,
0.03515075892210007,
-0.06759323179721832,
-0.03728015720844269,
0.031167160719633102,
-0.10156286507844925,
0.09118068963289261,
0.04991926625370979,
0.00883821863681078,
0.011133113875985146,
0.08767914772033691,
-0.020563529804348946,
0.02074556238949299,
0.013492802157998085,
-0.04691052436828613,
-0.08902860432863235,
0.00006496263813460246,
-0.009130483493208885,
-0.016790492460131645,
-0.011865228414535522,
0.08458425849676132,
-0.08733321726322174,
0.03428032249212265,
-0.005587506573647261,
-0.009343581274151802,
-0.0727320984005928,
-0.010656737722456455,
0.09589508920907974,
0.09902004152536392,
0.04680901765823364,
-0.08670194447040558,
0.013808838091790676,
-0.14202938973903656,
-0.03688307851552963,
0.006619058549404144,
-0.009366669692099094,
-0.12287061661481857,
-0.00896299909800291,
0.01917978562414646,
-0.004105013329535723,
0.20400287210941315,
-0.05703318491578102,
-0.013751006685197353,
0.020041460171341896,
-0.1005026325583458,
0.10992386937141418,
-0.023531461134552956,
0.1764245480298996,
-0.008864887990057468,
-0.039697933942079544,
-0.01475130021572113,
0.03440400958061218,
0.018371375277638435,
-0.02017376385629177,
0.18770906329154968,
0.13769391179084778,
0.0327637679874897,
0.04025799408555031,
-0.02128779888153076,
0.0034300517290830612,
-0.05757476016879082,
-0.02793130651116371,
0.032275766134262085,
0.0355842150747776,
0.020892376080155373,
0.15032680332660675,
0.0740809366106987,
-0.16566236317157745,
0.0336037240922451,
-0.029871944338083267,
-0.03654518350958824,
-0.11695756018161774,
-0.09683831036090851,
-0.03324029967188835,
-0.06999451667070389,
0.007959769107401371,
-0.12149389088153839,
0.00902561005204916,
0.1838303804397583,
0.05784231051802635,
0.0260050967335701,
0.005304242484271526,
-0.11885341256856918,
-0.03598549962043762,
0.054665230214595795,
0.014377251267433167,
0.026083877310156822,
0.058591149747371674,
0.0013061703648418188,
0.06293736398220062,
0.04201338812708855,
0.013423954136669636,
0.0010906736133620143,
0.07774754613637924,
0.01802184246480465,
0.038876522332429886,
-0.06286975741386414,
-0.0038771135732531548,
-0.04146113991737366,
0.07223761826753616,
0.10298017412424088,
0.050362423062324524,
-0.047704435884952545,
-0.006621736101806164,
0.16042350232601166,
-0.039657365530729294,
-0.003006347920745611,
-0.12874668836593628,
0.33253419399261475,
0.01168573647737503,
0.015245312824845314,
0.04644503816962242,
-0.07365821301937103,
-0.04942033812403679,
0.2036070078611374,
0.08577711135149002,
-0.01956009864807129,
-0.01951354555785656,
0.0016942322254180908,
-0.030023733153939247,
-0.022224117070436478,
0.14846354722976685,
0.036215733736753464,
0.12884382903575897,
-0.05672174319624901,
-0.05152581259608269,
-0.029969066381454468,
-0.01082900445908308,
-0.1261831820011139,
0.13288363814353943,
-0.027819225564599037,
-0.021228883415460587,
-0.07014307379722595,
0.02736375294625759,
0.07252008467912674,
-0.32160988450050354,
-0.00464056758210063,
-0.030801041051745415,
-0.10898790508508682,
-0.005160212516784668,
-0.014285978861153126,
-0.02303716167807579,
0.04879137501120567,
-0.047145601361989975,
0.06892000883817673,
0.04268110916018486,
0.03495784103870392,
-0.018763992935419083,
-0.09218191355466843,
0.16644881665706635,
0.058933332562446594,
0.09058195352554321,
0.02721741981804371,
0.07171466201543808,
0.054886408150196075,
0.036000773310661316,
-0.09555331617593765,
0.04661543667316437,
0.01303756795823574,
-0.08650636672973633,
-0.05152129754424095,
0.11993134021759033,
-0.0017660089069977403,
0.039415668696165085,
0.046426817774772644,
-0.11218337714672089,
0.013910075649619102,
0.07345374673604965,
-0.06741271913051605,
-0.09588100761175156,
-0.003573901951313019,
-0.09016384929418564,
0.1562768965959549,
0.1447421759366989,
-0.01476571150124073,
0.018765093758702278,
-0.06908359378576279,
-0.00820623803883791,
0.05211537703871727,
0.013573612086474895,
-0.01893286220729351,
-0.18792787194252014,
0.032443076372146606,
-0.07249883562326431,
-0.006322011351585388,
-0.22710199654102325,
-0.10100305825471878,
-0.008665450848639011,
-0.05082269757986069,
-0.0233357772231102,
0.0557524599134922,
0.02762308344244957,
0.06743627786636353,
-0.016098998486995697,
-0.03636865317821503,
-0.027189316228032112,
0.0864969864487648,
-0.1085854098200798,
-0.06291642785072327
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 1900k (max: 2000k, i.e., 2M steps).
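Checkpoints for other seeds and steps in this release appear to follow the same repository naming pattern as this one (`google/multiberts-seed_<seed>-step_<step>k`). The snippet below is a small convenience sketch based on that observed pattern, not an official index of the release: the helper name is made up for illustration, and a given (seed, step) combination only resolves if that checkpoint was actually published.

```
from transformers import BertModel, BertTokenizer

def multiberts_checkpoint(seed: int, step_k: int) -> str:
    # Hypothetical helper: builds the Hub id from the naming pattern
    # observed in this release, e.g. "google/multiberts-seed_2-step_1900k".
    return f"google/multiberts-seed_{seed}-step_{step_k}k"

checkpoint = multiberts_checkpoint(seed=2, step_k=1900)
tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertModel.from_pretrained(checkpoint)
```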
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1900k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_1900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1900k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_1900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
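Since the checkpoint was pre-trained with the masked-language-modelling objective, it can also be loaded with an MLM head for quick sanity checks. The following is a minimal sketch, not part of the original release notes: it assumes the standard `BertForMaskedLM` class in Transformers can load this pretraining checkpoint (the NSP head is simply ignored in that case), and the example sentence is arbitrary.

```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_1900k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_2-step_1900k')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Most likely token at the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```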
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_1900k"]}
| null |
google/multiberts-seed_2-step_1900k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_1900k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 1900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 1900k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07433538883924484,
0.09618877619504929,
-0.002528534969314933,
0.037388645112514496,
0.07196611911058426,
-0.013162008486688137,
0.08025570958852768,
0.10009131580591202,
-0.019832370802760124,
0.025360718369483948,
0.08065195381641388,
0.00026357913156971335,
0.018941903486847878,
0.09627499431371689,
0.020616240799427032,
-0.22406217455863953,
0.01930045150220394,
-0.030604099854826927,
-0.087826669216156,
0.07764849066734314,
0.10085494071245193,
-0.08194158226251602,
0.04322022199630737,
0.030203241854906082,
-0.11862021684646606,
0.047675322741270065,
0.003999591805040836,
-0.022598182782530785,
0.13275820016860962,
-0.00024903006851673126,
0.05587724968791008,
0.054596759378910065,
0.043666042387485504,
-0.13148029148578644,
0.007007977459579706,
0.05851541832089424,
0.057665012776851654,
0.046124160289764404,
0.016866382211446762,
0.0712500736117363,
0.01697864569723606,
0.02529098466038704,
0.04709823429584503,
0.021670356392860413,
-0.07690252363681793,
-0.06409075856208801,
-0.10392682999372482,
0.04611446335911751,
0.02459944225847721,
0.0006121285841800272,
0.014151906594634056,
0.12461920082569122,
-0.03259613364934921,
0.0439312644302845,
0.18151159584522247,
-0.3282696008682251,
-0.0166992899030447,
0.06997234374284744,
0.043213557451963425,
0.12645463645458221,
-0.006315567996352911,
-0.016990788280963898,
0.07155675441026688,
0.022905869409441948,
0.09527575224637985,
-0.043849263340234756,
0.020218344405293465,
-0.05197930335998535,
-0.15599162876605988,
-0.04717984050512314,
0.0896029993891716,
-0.005123891402035952,
-0.13521720468997955,
-0.03765491023659706,
-0.04455322027206421,
0.034192584455013275,
0.008797114714980125,
-0.036975882947444916,
0.03549604490399361,
0.012105577625334263,
-0.016118407249450684,
-0.01732119359076023,
-0.10660525411367416,
-0.052931200712919235,
0.027173098176717758,
0.09079635143280029,
0.09856230765581131,
0.07037030160427094,
-0.0023894386831671,
0.10785505175590515,
-0.1840052455663681,
-0.051644474267959595,
-0.026528477668762207,
-0.054289329797029495,
-0.048834722489118576,
-0.010433991439640522,
-0.11011330783367157,
-0.04554925113916397,
0.006637669634073973,
0.12878401577472687,
-0.011026128195226192,
0.02729703299701214,
-0.0317881815135479,
0.010346921160817146,
0.060519736260175705,
0.03990233689546585,
-0.005050587933510542,
0.013786490075290203,
0.023187577724456787,
-0.013065356761217117,
-0.01918192394077778,
0.012953055091202259,
0.0038834342267364264,
0.02764729969203472,
0.12007548660039902,
0.02293824777007103,
-0.09969428926706314,
0.06367000937461853,
-0.01508479192852974,
-0.04404836148023605,
0.012317492626607418,
-0.09162408113479614,
-0.05550169572234154,
-0.039750032126903534,
0.002386372769251466,
0.014159110374748707,
-0.005598717834800482,
-0.002987311454489827,
-0.019126292318105698,
-0.03787690028548241,
-0.08611646294593811,
-0.046490028500556946,
-0.05081988498568535,
-0.12511590123176575,
0.008990183472633362,
-0.17752574384212494,
-0.03719974309206009,
-0.1121237725019455,
-0.18647441267967224,
-0.021054426208138466,
0.06679333001375198,
-0.01270708255469799,
-0.05542536452412605,
0.08436264097690582,
0.044308554381132126,
-0.02760389819741249,
0.0000627063272986561,
0.07876790314912796,
-0.0007914919988252223,
0.044051092118024826,
-0.032548341900110245,
0.06954595446586609,
-0.0006120532052591443,
0.03185191750526428,
-0.05268830060958862,
0.06241288036108017,
-0.18133877217769623,
0.04414522275328636,
-0.07027871906757355,
-0.03068437986075878,
-0.08593712747097015,
-0.03224347531795502,
-0.012905430980026722,
0.006886725313961506,
0.01805359311401844,
0.07297122478485107,
-0.17795228958129883,
-0.03240112587809563,
0.1272791028022766,
-0.1635582447052002,
-0.021040543913841248,
0.06729166954755783,
-0.04984508454799652,
0.10567322373390198,
0.07035563886165619,
0.15495240688323975,
-0.01591651141643524,
-0.08306120336055756,
0.06250660866498947,
-0.013197607360780239,
0.009076154790818691,
-0.007700306363403797,
0.06954202800989151,
-0.02007085084915161,
-0.15788330137729645,
0.03384168818593025,
-0.13102252781391144,
-0.005050175823271275,
-0.07697755098342896,
0.01638667657971382,
-0.006470755208283663,
-0.06747542321681976,
-0.06194152683019638,
-0.026519862934947014,
0.06666260212659836,
-0.07399743795394897,
-0.01615062728524208,
0.04629942774772644,
0.07186348736286163,
-0.07025007158517838,
0.0676901564002037,
-0.011121238581836224,
0.007817687466740608,
-0.07676328718662262,
-0.03957906737923622,
-0.19013254344463348,
0.04950280115008354,
0.09843191504478455,
0.0025143201928585768,
-0.015958670526742935,
0.13963843882083893,
0.007675133645534515,
0.06727378815412521,
-0.05187274143099785,
0.019211789593100548,
-0.013216006569564342,
-0.003108501899987459,
-0.0899050235748291,
-0.1041775718331337,
-0.07365602254867554,
-0.06708643585443497,
0.09910757094621658,
-0.12752319872379303,
0.022442670539021492,
-0.0541708879172802,
0.04524749889969826,
0.02118980884552002,
-0.0831630751490593,
-0.019632669165730476,
0.011858788318932056,
-0.05779648572206497,
-0.053220584988594055,
0.039270754903554916,
0.07187013328075409,
-0.01079332735389471,
0.09606676548719406,
-0.050325483083724976,
-0.09432315081357956,
0.033659230917692184,
0.09629333019256592,
-0.10872457921504974,
0.002323684049770236,
-0.056628525257110596,
-0.04316779598593712,
-0.06382957845926285,
-0.02122001349925995,
0.07745447009801865,
-0.006871629506349564,
0.1374109536409378,
-0.07783418893814087,
-0.007876737974584103,
0.016297968104481697,
-0.02240065112709999,
-0.022129317745566368,
0.029327480122447014,
0.0622250996530056,
-0.08268163353204727,
0.015861187130212784,
0.03694884479045868,
0.010480082593858242,
0.07782124727964401,
-0.0525282584130764,
-0.08794479072093964,
0.00831606611609459,
0.031717125326395035,
0.028144799172878265,
0.07263005524873734,
-0.02907421998679638,
-0.014743862673640251,
0.03395097330212593,
0.012103657238185406,
0.006639487575739622,
-0.11652139574289322,
0.06304572522640228,
0.056363366544246674,
0.006456821225583553,
0.06532251089811325,
-0.018851324915885925,
-0.0376574769616127,
0.08180242031812668,
0.03597741946578026,
0.005298943724483252,
-0.017986701801419258,
-0.015638677403330803,
-0.12077642977237701,
0.18857219815254211,
-0.06271110475063324,
-0.1652933955192566,
-0.07744572311639786,
-0.10635709017515182,
0.0026896074414253235,
0.025039304047822952,
0.03784185275435448,
-0.010762643069028854,
-0.041246961802244186,
-0.12600885331630707,
0.055209796875715256,
-0.03784897178411484,
0.06878286600112915,
0.11187019944190979,
-0.040112391114234924,
0.05233700945973396,
-0.12595361471176147,
-0.007294106297194958,
-0.08460079878568649,
-0.06716401875019073,
0.06123988330364227,
-0.04683588072657585,
0.025029830634593964,
0.10357523709535599,
0.024875327944755554,
-0.017844658344984055,
-0.027622761204838753,
0.1978358030319214,
0.037137243896722794,
0.04562486708164215,
0.12405809760093689,
-0.06616955995559692,
0.056635692715644836,
0.079327791929245,
0.010572709143161774,
-0.04091934487223625,
0.05038481578230858,
0.04165886715054512,
-0.07190556079149246,
-0.1937006562948227,
-0.02504265122115612,
-0.009556773118674755,
-0.05108578875660896,
0.07688383013010025,
0.03389117121696472,
0.015750745311379433,
0.06697873771190643,
0.01302614901214838,
0.06570588797330856,
0.0038707880303263664,
0.09868700057268143,
0.011389490216970444,
-0.03291619196534157,
0.08624487370252609,
-0.020518245175480843,
-0.008055358193814754,
0.08573419600725174,
-0.028781775385141373,
0.2889100909233093,
-0.029265733435750008,
0.005125620402395725,
0.12359650433063507,
0.03375757485628128,
0.06606153398752213,
0.1250319629907608,
-0.06692983955144882,
0.023195423185825348,
-0.0729733482003212,
-0.06230083853006363,
-0.00581394461914897,
0.04983660206198692,
-0.05691153556108475,
0.009194561280310154,
-0.06857836991548538,
0.01238248124718666,
-0.01991649530827999,
0.3159042298793793,
0.1097235307097435,
-0.09990104287862778,
-0.05690678581595421,
0.004441909492015839,
-0.10169143229722977,
-0.06975450366735458,
0.0388287715613842,
0.07297054678201675,
-0.13687260448932648,
0.014150545932352543,
-0.025315864011645317,
0.07118352502584457,
-0.009699731133878231,
0.014061538502573967,
0.027527716010808945,
0.036785583943128586,
-0.036904312670230865,
0.004752434324473143,
-0.17125308513641357,
0.20002873241901398,
0.008547607809305191,
0.017093749716877937,
-0.05518295615911484,
0.03300394117832184,
0.009969141334295273,
-0.03401251882314682,
0.06058024615049362,
0.025899451225996017,
-0.026535380631685257,
-0.037475377321243286,
-0.04967474564909935,
0.008901999332010746,
0.08060873299837112,
-0.04459792748093605,
0.10752833634614944,
-0.004443312995135784,
0.04070090129971504,
0.020169232040643692,
0.08992372453212738,
-0.17870643734931946,
-0.09178739786148071,
0.031648073345422745,
-0.06235608831048012,
-0.10812750458717346,
-0.08247489482164383,
-0.09194877743721008,
0.008054297417402267,
0.24135756492614746,
-0.11874941736459732,
-0.07418449968099594,
-0.09590204060077667,
0.030154066160321236,
0.10650838911533356,
-0.04775138571858406,
0.030421825125813484,
-0.00768939359113574,
0.13460978865623474,
-0.06929156929254532,
-0.13126640021800995,
0.019331246614456177,
-0.09015049785375595,
-0.1677263379096985,
-0.06746389716863632,
0.12126213312149048,
0.06405407190322876,
0.03509727492928505,
-0.02624015510082245,
0.027408765628933907,
0.03829184174537659,
-0.0388677679002285,
0.001068665529601276,
0.07384231686592102,
0.0998924970626831,
0.0365927554666996,
-0.11010701209306717,
0.012959321029484272,
-0.06094801798462868,
-0.06685857474803925,
0.07984152436256409,
0.26718005537986755,
-0.0590677373111248,
0.1336062252521515,
0.12150375545024872,
-0.08103164285421371,
-0.15603606402873993,
0.028992649167776108,
0.09288368374109268,
-0.017561357468366623,
0.016785170882940292,
-0.1575344055891037,
0.08794188499450684,
0.11652550101280212,
-0.021831896156072617,
-0.0003385127056390047,
-0.18798761069774628,
-0.13056790828704834,
0.06552029401063919,
0.09871270507574081,
0.26800644397735596,
-0.06209510564804077,
-0.04622780531644821,
0.018256258219480515,
-0.09180391579866409,
0.022285787388682365,
0.11247687041759491,
0.06111673638224602,
-0.024013835936784744,
-0.07988943159580231,
0.014237698167562485,
-0.040449704974889755,
0.0956878513097763,
0.04759835824370384,
0.055226705968379974,
-0.0034037961158901453,
0.009269855916500092,
-0.006262120790779591,
-0.0464661680161953,
0.05739866569638252,
0.023290937766432762,
0.04763589799404144,
-0.07619684189558029,
-0.02882985584437847,
-0.07191282510757446,
0.03116590902209282,
-0.026666682213544846,
-0.07614906132221222,
-0.06311316788196564,
0.07789671421051025,
0.04882737249135971,
-0.027594806626439095,
0.019586753100156784,
0.027109410613775253,
0.11899007111787796,
0.17698705196380615,
-0.0024958292488008738,
-0.050961870700120926,
-0.05325690656900406,
-0.037958987057209015,
-0.018122009932994843,
0.06993670016527176,
-0.049043748527765274,
0.02610609494149685,
0.06209443137049675,
0.026344595476984978,
0.10053908079862595,
0.05487998574972153,
-0.11655016243457794,
-0.01941799558699131,
0.03496041148900986,
-0.16359776258468628,
0.015778031200170517,
-0.0006830145139247179,
0.032064288854599,
-0.037672702223062515,
0.021962754428386688,
0.14814579486846924,
-0.06268629431724548,
-0.03670533001422882,
-0.04197213053703308,
0.06775320321321487,
0.021888650953769684,
0.1346760392189026,
0.03197469562292099,
0.036948755383491516,
-0.08240475505590439,
0.120684415102005,
0.041160352528095245,
-0.034417733550071716,
0.02011801488697529,
-0.020717279985547066,
-0.10552702844142914,
0.013958120718598366,
0.055307786911726,
0.03830232098698616,
-0.051665641367435455,
-0.007415458094328642,
-0.02202741988003254,
-0.07722204178571701,
0.059090301394462585,
0.19600209593772888,
0.0633103996515274,
0.0732845738530159,
-0.053817786276340485,
-0.033987823873758316,
-0.07772526144981384,
0.04409131035208702,
0.037532221525907516,
0.07291525602340698,
-0.07109387964010239,
0.11287383735179901,
0.014101558364927769,
0.04328158497810364,
-0.032307710498571396,
-0.05533715337514877,
-0.09826042503118515,
-0.054618049412965775,
-0.10617569088935852,
0.008425806649029255,
-0.08189911395311356,
-0.04173586890101433,
0.001045553362928331,
-0.006944893393665552,
-0.011101730167865753,
0.04494107514619827,
-0.0618252158164978,
-0.006592978723347187,
-0.026408644393086433,
0.03493872284889221,
-0.06543344259262085,
-0.03842779994010925,
0.032249368727207184,
-0.10397375375032425,
0.09086092561483383,
0.05355488508939743,
0.01074271835386753,
0.01053476333618164,
0.10056749731302261,
-0.02040589042007923,
0.01990269124507904,
0.016460435464978218,
-0.04664679616689682,
-0.08639159053564072,
-0.00035220422432757914,
-0.008697383105754852,
-0.015858888626098633,
-0.007255441974848509,
0.08726314455270767,
-0.08475986123085022,
0.03406069055199623,
-0.007063678465783596,
-0.011746903881430626,
-0.0743933692574501,
-0.012859657406806946,
0.10373253375291824,
0.09515764564275742,
0.04508558288216591,
-0.0860963985323906,
0.012159669771790504,
-0.14266939461231232,
-0.0377630777657032,
0.0029219223652035,
-0.007821653038263321,
-0.11711376160383224,
-0.010768606327474117,
0.018089137971401215,
-0.0009125728975050151,
0.22172291576862335,
-0.05360974371433258,
-0.027225980535149574,
0.02237791195511818,
-0.10159249603748322,
0.11252161115407944,
-0.022517722100019455,
0.17992840707302094,
-0.0051384009420871735,
-0.042203810065984726,
-0.01608356088399887,
0.036428213119506836,
0.02073560282588005,
-0.025340164080262184,
0.19321070611476898,
0.13929492235183716,
0.03735318407416344,
0.040278058499097824,
-0.022302445024251938,
0.0010462820064276457,
-0.044445980340242386,
-0.02804616466164589,
0.031673431396484375,
0.03655148670077324,
0.019284481182694435,
0.1553090512752533,
0.07352877408266068,
-0.17416897416114807,
0.03701271861791611,
-0.02556772716343403,
-0.038746003061532974,
-0.11595863103866577,
-0.08696316927671432,
-0.0317588709294796,
-0.0672864317893982,
0.005651029292494059,
-0.12409915775060654,
0.008674981072545052,
0.166745126247406,
0.055796436965465546,
0.029347453266382217,
-0.0013538668863475323,
-0.12703551352024078,
-0.0367693267762661,
0.05509281903505325,
0.01861485093832016,
0.02475600689649582,
0.05402778089046478,
-0.0015620224876329303,
0.05784671753644943,
0.04140595346689224,
0.013214541599154472,
0.0018064773175865412,
0.08026624470949173,
0.013954942114651203,
0.040651608258485794,
-0.06288351118564606,
-0.001044496544636786,
-0.04489543288946152,
0.0736653134226799,
0.08483167737722397,
0.04891543090343475,
-0.04794058948755264,
-0.006941604427993298,
0.16237545013427734,
-0.040408022701740265,
-0.008795618079602718,
-0.1273881196975708,
0.3255290687084198,
0.01434005331248045,
0.013061467558145523,
0.046807922422885895,
-0.07664339244365692,
-0.05047418549656868,
0.20610420405864716,
0.09398313611745834,
-0.016738498583436012,
-0.01869901828467846,
0.0005486346781253815,
-0.030493607744574547,
-0.020698005333542824,
0.14920461177825928,
0.03262463957071304,
0.1369491070508957,
-0.05199166387319565,
-0.0451711006462574,
-0.029487725347280502,
-0.009210687130689621,
-0.12277932465076447,
0.14377376437187195,
-0.02707836963236332,
-0.02824319899082184,
-0.07017500698566437,
0.025210140272974968,
0.07421378046274185,
-0.3289640247821808,
0.007412337232381105,
-0.038535378873348236,
-0.11041571199893951,
-0.0017935604555532336,
-0.021429769694805145,
-0.024405984207987785,
0.04697410762310028,
-0.04133319854736328,
0.06726343929767609,
0.04187803342938423,
0.03651394322514534,
-0.026921315118670464,
-0.09753573685884476,
0.17343947291374207,
0.04507066681981087,
0.09757765382528305,
0.022485071793198586,
0.07842428237199783,
0.05440174788236618,
0.035972870886325836,
-0.09270390123128891,
0.04086078703403473,
0.014197070151567459,
-0.08944231271743774,
-0.05320962145924568,
0.12283208221197128,
-0.0024475373793393373,
0.03965159133076668,
0.044727277010679245,
-0.10628679394721985,
0.012813832610845566,
0.07279913127422333,
-0.07333672046661377,
-0.09592946618795395,
-0.011640403419733047,
-0.09040219336748123,
0.1554284244775772,
0.14115938544273376,
-0.017448371276259422,
0.02409142442047596,
-0.06412637233734131,
-0.009910994209349155,
0.05710107460618019,
0.015194093808531761,
-0.017566226422786713,
-0.1961270272731781,
0.032765768468379974,
-0.08289676159620285,
-0.002670880174264312,
-0.22528168559074402,
-0.10256075114011765,
-0.009152213111519814,
-0.04875829070806503,
-0.023162387311458588,
0.060846179723739624,
0.031539686024188995,
0.06682365387678146,
-0.016761451959609985,
-0.03541824221611023,
-0.029812706634402275,
0.09207794815301895,
-0.10766618698835373,
-0.06147334724664688
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 2000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 2000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
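As a rough illustration of the NSP objective mentioned above (a sketch rather than part of the original documentation), the checkpoint can be loaded with Transformers' `BertForNextSentencePrediction` head. This assumes the NSP head weights shipped with the pretraining checkpoint are picked up when loading, which may not hold for every export; the example sentences are arbitrary.

```
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_2-step_2000k")
model = BertForNextSentencePrediction.from_pretrained("google/multiberts-seed_2-step_2000k")

first = "The cat sat on the mat."
second = "It then fell asleep in the sun."
inputs = tokenizer(first, second, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Index 0: sentence B follows sentence A; index 1: sentence B is a random sentence.
print(torch.softmax(logits, dim=-1))
```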
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_2000k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_2000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_2000k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_2000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
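Because the intermediate checkpoints are meant to support analyses of how representations change over pre-training, one simple use is to compare hidden states from two nearby steps. The sketch below is illustrative only, not a prescribed workflow: it assumes both the 1900k and 2000k checkpoints of this seed are available, and the `cls_vector` helper is made up for the example.

```
import torch
from transformers import BertTokenizer, BertModel

text = "Replace me by any text you'd like."

def cls_vector(checkpoint):
    # Returns the final-layer [CLS] embedding of `text` under `checkpoint`.
    tokenizer = BertTokenizer.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint)
    model.eval()
    with torch.no_grad():
        outputs = model(**tokenizer(text, return_tensors='pt'))
    return outputs.last_hidden_state[0, 0]

v_1900k = cls_vector("google/multiberts-seed_2-step_1900k")
v_2000k = cls_vector("google/multiberts-seed_2-step_2000k")
print(float(torch.nn.functional.cosine_similarity(v_1900k, v_2000k, dim=0)))
```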
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_2000k"]}
| null |
google/multiberts-seed_2-step_2000k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_2000k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 2000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 2000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 2000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 2000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 2000k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0726495161652565,
0.10876529663801193,
-0.002546235453337431,
0.03883059322834015,
0.07521932572126389,
-0.015288169495761395,
0.08086666464805603,
0.10171249508857727,
-0.025298984721302986,
0.026079246774315834,
0.08316148817539215,
0.00028944891528226435,
0.01682703010737896,
0.09616474062204361,
0.022695593535900116,
-0.2257218211889267,
0.022755753248929977,
-0.03307541832327843,
-0.08738575875759125,
0.07642365992069244,
0.10129311680793762,
-0.08134004473686218,
0.04325114190578461,
0.029932770878076553,
-0.11706668883562088,
0.04734320193529129,
0.000992141547612846,
-0.017901385203003883,
0.13158611953258514,
-0.00041273937677033246,
0.055811069905757904,
0.055613841861486435,
0.04334332421422005,
-0.13423596322536469,
0.0069802869111299515,
0.05795667693018913,
0.05990669131278992,
0.04630747437477112,
0.020058106631040573,
0.07372327893972397,
0.0047264378517866135,
0.023369289934635162,
0.046116456389427185,
0.01980111002922058,
-0.07317224889993668,
-0.06278137862682343,
-0.10431049019098282,
0.04207443818449974,
0.025031181052327156,
0.0036752275191247463,
0.013694741763174534,
0.12528593838214874,
-0.032598525285720825,
0.04373513534665108,
0.18173106014728546,
-0.33315515518188477,
-0.01573130674660206,
0.07095625251531601,
0.040773168206214905,
0.12268678843975067,
-0.005294510163366795,
-0.01846979558467865,
0.07251781970262527,
0.021170079708099365,
0.08976896107196808,
-0.041458610445261,
0.019062522798776627,
-0.05153092369437218,
-0.15589748322963715,
-0.04710584878921509,
0.09340864419937134,
-0.004984503611922264,
-0.1345987617969513,
-0.036063794046640396,
-0.043649546802043915,
0.03939463198184967,
0.010981355793774128,
-0.03506416082382202,
0.03505459427833557,
0.013809144496917725,
-0.023269062861800194,
-0.01515572052448988,
-0.10521262884140015,
-0.052862465381622314,
0.027971692383289337,
0.08893968909978867,
0.10018721222877502,
0.06670225411653519,
0.0008446510182693601,
0.10939962416887283,
-0.1897314488887787,
-0.050618480890989304,
-0.031230131164193153,
-0.05150603502988815,
-0.05121103674173355,
-0.008419523946940899,
-0.11313701421022415,
-0.04674728214740753,
0.012119021266698837,
0.1340285688638687,
-0.0011255469871684909,
0.026528354734182358,
-0.03305155038833618,
0.009104947559535503,
0.05834340676665306,
0.04798261821269989,
-0.007887013256549835,
0.022809235379099846,
0.024391142651438713,
-0.017069609835743904,
-0.018963299691677094,
0.013588789850473404,
0.00026265508495271206,
0.031237855553627014,
0.1185089573264122,
0.024133455008268356,
-0.09935401380062103,
0.06817232817411423,
-0.014080427587032318,
-0.044741880148649216,
0.009264187887310982,
-0.0925714522600174,
-0.056089844554662704,
-0.03934299945831299,
0.002071517752483487,
0.010854111984372139,
-0.005825479049235582,
-0.0006731639732606709,
-0.019449496641755104,
-0.03607359901070595,
-0.08280296623706818,
-0.0467197485268116,
-0.052756596356630325,
-0.12612231075763702,
0.00918549858033657,
-0.17471355199813843,
-0.03584754467010498,
-0.11141891777515411,
-0.18594183027744293,
-0.02173002064228058,
0.0661061555147171,
-0.010386515408754349,
-0.057899389415979385,
0.07946863770484924,
0.040604010224342346,
-0.02802249789237976,
-0.0006393827497959137,
0.0800575241446495,
0.0011022149119526148,
0.043288398534059525,
-0.030920399352908134,
0.06361492723226547,
-0.0029282248578965664,
0.03372490033507347,
-0.05388006940484047,
0.06230992451310158,
-0.16267576813697815,
0.044593073427677155,
-0.07220056653022766,
-0.03436671197414398,
-0.08601148426532745,
-0.03199925646185875,
-0.012150201015174389,
0.005250736139714718,
0.021929027512669563,
0.06897646933794022,
-0.18331687152385712,
-0.03023243509232998,
0.12221945077180862,
-0.1610741764307022,
-0.020444922149181366,
0.0658423900604248,
-0.05131641775369644,
0.10440981388092041,
0.06774164736270905,
0.152510404586792,
-0.014128830283880234,
-0.0763339102268219,
0.059555958956480026,
-0.013678672723472118,
0.012508758343756199,
-0.005744941998273134,
0.07273513823747635,
-0.019436657428741455,
-0.15191051363945007,
0.036178696900606155,
-0.1384924054145813,
-0.005527552217245102,
-0.07618274539709091,
0.017117291688919067,
-0.010568792931735516,
-0.06632578372955322,
-0.0677836611866951,
-0.027509188279509544,
0.06887315958738327,
-0.07020238786935806,
-0.01762971468269825,
0.04695579409599304,
0.07259316742420197,
-0.07157611101865768,
0.06810051202774048,
-0.0101243881508708,
0.009896199218928814,
-0.0830741673707962,
-0.04000956565141678,
-0.1919543594121933,
0.04482570290565491,
0.10095109790563583,
0.005490864161401987,
-0.017521865665912628,
0.13974909484386444,
0.006082683801651001,
0.0633549615740776,
-0.0503048300743103,
0.01696551777422428,
-0.013192204758524895,
-0.00313759152777493,
-0.08692044019699097,
-0.09932417422533035,
-0.07275236397981644,
-0.06938781589269638,
0.09567827731370926,
-0.12160145491361618,
0.02005872130393982,
-0.0532945916056633,
0.04616602137684822,
0.01897623762488365,
-0.08405936509370804,
-0.017512122169137,
0.011798053048551083,
-0.061299193650484085,
-0.057096756994724274,
0.0389990508556366,
0.07245518267154694,
-0.01277833990752697,
0.09064353257417679,
-0.0446307472884655,
-0.08907446265220642,
0.03187455236911774,
0.10683189332485199,
-0.10469966381788254,
0.009050179272890091,
-0.058228012174367905,
-0.041500892490148544,
-0.06233576685190201,
-0.014899767935276031,
0.08455942571163177,
-0.006555515341460705,
0.13749802112579346,
-0.07818418741226196,
-0.0066658188588917255,
0.012694875709712505,
-0.02163529396057129,
-0.022556619718670845,
0.03322843089699745,
0.06324727088212967,
-0.08063303679227829,
0.01708393171429634,
0.034432552754879,
0.008356578648090363,
0.07030320912599564,
-0.054275449365377426,
-0.08752478659152985,
0.009351784363389015,
0.03810695558786392,
0.030109219253063202,
0.0685693696141243,
-0.019974440336227417,
-0.01329683605581522,
0.0352243036031723,
0.013697536662220955,
0.0074760569259524345,
-0.11783537268638611,
0.06268236041069031,
0.05591326579451561,
0.004853057209402323,
0.06303190439939499,
-0.01703902706503868,
-0.03840109705924988,
0.07988779991865158,
0.038979992270469666,
0.005651706364005804,
-0.014309462159872055,
-0.016337810084223747,
-0.11732403188943863,
0.18974941968917847,
-0.06425473839044571,
-0.16779880225658417,
-0.07589136064052582,
-0.10629965364933014,
-0.0014123168075457215,
0.024185918271541595,
0.038354501128196716,
-0.013141203671693802,
-0.04396266117691994,
-0.1234242171049118,
0.05627676844596863,
-0.0416904091835022,
0.06870971620082855,
0.11185961216688156,
-0.04022689536213875,
0.05409257486462593,
-0.1257229447364807,
-0.005218579899519682,
-0.08318102359771729,
-0.07143541425466537,
0.05779525637626648,
-0.04578673839569092,
0.026129793375730515,
0.09836228936910629,
0.02523449808359146,
-0.0156779233366251,
-0.025253333151340485,
0.19748562574386597,
0.0389929935336113,
0.04064114764332771,
0.12191398441791534,
-0.07012300938367844,
0.055629804730415344,
0.0843731090426445,
0.00954937469214201,
-0.04050009697675705,
0.050263818353414536,
0.043583210557699203,
-0.06967149674892426,
-0.19256702065467834,
-0.024365194141864777,
-0.009279061108827591,
-0.054822660982608795,
0.07411264628171921,
0.038657158613204956,
0.00735162990167737,
0.06688464432954788,
0.012446857988834381,
0.05812789499759674,
0.0005100609268993139,
0.10021207481622696,
0.0099116750061512,
-0.03374781832098961,
0.08693607896566391,
-0.020195188000798225,
-0.009898722171783447,
0.08472277969121933,
-0.025040296837687492,
0.28940436244010925,
-0.029125014320015907,
0.006466610357165337,
0.1237756535410881,
0.039652951061725616,
0.06490917503833771,
0.12433607876300812,
-0.06867581605911255,
0.023761849850416183,
-0.07309253513813019,
-0.06073408201336861,
-0.00128936767578125,
0.04964666813611984,
-0.05796026065945625,
0.011486823670566082,
-0.06778057664632797,
0.012249268591403961,
-0.021557392552495003,
0.31414711475372314,
0.10906638205051422,
-0.10098466277122498,
-0.056584183126688004,
0.005247128661721945,
-0.09911145269870758,
-0.07279805839061737,
0.038548994809389114,
0.06960487365722656,
-0.13650153577327728,
0.009119778871536255,
-0.027008524164557457,
0.07280664891004562,
-0.014692195691168308,
0.015133663080632687,
0.029960937798023224,
0.037802472710609436,
-0.03478191792964935,
0.009980769827961922,
-0.17863841354846954,
0.19916607439517975,
0.006989323068410158,
0.015670036897063255,
-0.054247576743364334,
0.03620150312781334,
0.006429056636989117,
-0.03439405933022499,
0.061439577490091324,
0.024371877312660217,
-0.03215527534484863,
-0.03826224058866501,
-0.04910442605614662,
0.013453206047415733,
0.08302045613527298,
-0.044799018651247025,
0.10764642059803009,
-0.005773280747234821,
0.041515979915857315,
0.022756177932024002,
0.09354903548955917,
-0.18014314770698547,
-0.0961388573050499,
0.03635356202721596,
-0.057039882987737656,
-0.10334520041942596,
-0.08281854540109634,
-0.09307849407196045,
0.0040236967615783215,
0.25624707341194153,
-0.12514983117580414,
-0.07392171770334244,
-0.09753469377756119,
0.034336578100919724,
0.108103908598423,
-0.04864233359694481,
0.029777389019727707,
-0.011738141067326069,
0.13504143059253693,
-0.06699633598327637,
-0.13369226455688477,
0.018539346754550934,
-0.08876743167638779,
-0.16610600054264069,
-0.06654946506023407,
0.11720264703035355,
0.05713248997926712,
0.033136796206235886,
-0.028053460642695427,
0.028746066614985466,
0.03542805835604668,
-0.0370103120803833,
-0.004690169356763363,
0.07755594700574875,
0.09397901594638824,
0.03151276335120201,
-0.11478354036808014,
0.021839885041117668,
-0.06093344837427139,
-0.06649541109800339,
0.08288796246051788,
0.2617179751396179,
-0.059684306383132935,
0.13258717954158783,
0.11118850111961365,
-0.08169209212064743,
-0.15459094941616058,
0.02523854933679104,
0.0952058956027031,
-0.015373507514595985,
0.013271898962557316,
-0.16043762862682343,
0.08640706539154053,
0.11211598664522171,
-0.02374127320945263,
-0.005044315010309219,
-0.18510094285011292,
-0.12807241082191467,
0.06547942012548447,
0.09580251574516296,
0.2652473747730255,
-0.06142689287662506,
-0.04662180691957474,
0.01788320764899254,
-0.0823572501540184,
0.021117577329277992,
0.1220845878124237,
0.0643722414970398,
-0.023107795044779778,
-0.07799571007490158,
0.013974805362522602,
-0.0406162366271019,
0.09435024857521057,
0.04890783131122589,
0.054485589265823364,
-0.0025935317389667034,
0.010027573443949223,
-0.015352793037891388,
-0.04594351351261139,
0.05903244763612747,
0.023872261866927147,
0.05235634744167328,
-0.07620862126350403,
-0.02846313640475273,
-0.06922352313995361,
0.02873304858803749,
-0.024761496111750603,
-0.07515054941177368,
-0.06160542741417885,
0.07813408225774765,
0.05062083527445793,
-0.025085696950554848,
0.023980356752872467,
0.030113520100712776,
0.11278543621301651,
0.16853363811969757,
-0.0072746542282402515,
-0.043380383402109146,
-0.05216933414340019,
-0.04025940224528313,
-0.01713351346552372,
0.06936341524124146,
-0.0497266910970211,
0.026462864130735397,
0.06328845769166946,
0.02620958723127842,
0.09883102774620056,
0.05557689070701599,
-0.11643274873495102,
-0.02068825252354145,
0.031884901225566864,
-0.16408070921897888,
0.01557982712984085,
0.0002372658345848322,
0.035243865102529526,
-0.03668869286775589,
0.023381663486361504,
0.15118694305419922,
-0.06237322837114334,
-0.0361209474503994,
-0.040435418486595154,
0.06730078160762787,
0.02449841983616352,
0.1341201812028885,
0.03042553924024105,
0.035275496542453766,
-0.08218926191329956,
0.1232592985033989,
0.04320088401436806,
-0.030260348692536354,
0.022003117948770523,
-0.0290926955640316,
-0.10652779042720795,
0.014646604657173157,
0.06162302568554878,
0.03985871374607086,
-0.04634846746921539,
-0.009797421284019947,
-0.02526617981493473,
-0.07465552538633347,
0.05916575342416763,
0.18984800577163696,
0.0650671198964119,
0.07556405663490295,
-0.055294815450906754,
-0.03788231685757637,
-0.07692164182662964,
0.04567296802997589,
0.04004593938589096,
0.07450799643993378,
-0.0739801898598671,
0.10313056409358978,
0.012354579754173756,
0.04104418680071831,
-0.031216945499181747,
-0.05536433309316635,
-0.09849164634943008,
-0.056021809577941895,
-0.09865228831768036,
0.008989374153316021,
-0.07311216741800308,
-0.041940223425626755,
0.00278037553653121,
-0.008876185864210129,
-0.010073247365653515,
0.04821104556322098,
-0.062323711812496185,
-0.009691660292446613,
-0.029659215360879898,
0.03547491133213043,
-0.06544368714094162,
-0.03949616476893425,
0.030649421736598015,
-0.10195579379796982,
0.09147322177886963,
0.05288040637969971,
0.008876909501850605,
0.009069615975022316,
0.08536243438720703,
-0.022946523502469063,
0.021953731775283813,
0.01784677989780903,
-0.046802740544080734,
-0.08747432380914688,
0.00036700430791825056,
-0.005823367275297642,
-0.0162639319896698,
-0.009749806486070156,
0.09141584485769272,
-0.08522837609052658,
0.028102491050958633,
-0.008639443665742874,
-0.007753749378025532,
-0.07161591202020645,
-0.013541404157876968,
0.09876246005296707,
0.09964612871408463,
0.04512440785765648,
-0.08646665513515472,
0.014087931253015995,
-0.13924366235733032,
-0.035886090248823166,
0.00348460441455245,
-0.009770097211003304,
-0.12662263214588165,
-0.012432903982698917,
0.017839796841144562,
-0.0014323245268315077,
0.22087563574314117,
-0.055955249816179276,
-0.021283695474267006,
0.019657867029309273,
-0.09542810171842575,
0.11842420697212219,
-0.021887904033064842,
0.18107378482818604,
-0.003122796071693301,
-0.04021979868412018,
-0.014988654293119907,
0.03621527552604675,
0.01679188199341297,
-0.028373422101140022,
0.1832408457994461,
0.14274628460407257,
0.0381542332470417,
0.039676398038864136,
-0.021002141758799553,
0.001054819324053824,
-0.03778277710080147,
-0.021603742614388466,
0.030687281861901283,
0.03759780526161194,
0.01739761419594288,
0.1574622094631195,
0.06670255959033966,
-0.1702621430158615,
0.03421356528997421,
-0.02383965626358986,
-0.03565605357289314,
-0.1151474118232727,
-0.09468862414360046,
-0.036670997738838196,
-0.06805798411369324,
0.007833370938897133,
-0.12444466352462769,
0.00777363171800971,
0.16999278962612152,
0.055660925805568695,
0.030150428414344788,
-0.002943061525002122,
-0.12449527531862259,
-0.03636116907000542,
0.0509839728474617,
0.015421774238348007,
0.025407614186406136,
0.05433563143014908,
-0.001977835316210985,
0.06183924525976181,
0.04370484501123428,
0.015032671391963959,
0.003660766175016761,
0.0866970345377922,
0.01476742047816515,
0.03927914798259735,
-0.06119118630886078,
-0.00216211611405015,
-0.0437687411904335,
0.07210662961006165,
0.09066903591156006,
0.047768231481313705,
-0.049532290548086166,
-0.007963700219988823,
0.15939493477344513,
-0.04103931784629822,
-0.0047218333929777145,
-0.12797899544239044,
0.32745033502578735,
0.010161787271499634,
0.013232266530394554,
0.046883001923561096,
-0.0763155147433281,
-0.05295370891690254,
0.20791591703891754,
0.08890683948993683,
-0.017151126638054848,
-0.02038310095667839,
0.003118132706731558,
-0.03012070059776306,
-0.02246968261897564,
0.14514292776584625,
0.03390954062342644,
0.1277766227722168,
-0.05011855065822601,
-0.05306468904018402,
-0.028272170573472977,
-0.00841666478663683,
-0.12260272353887558,
0.1414640098810196,
-0.028295988216996193,
-0.028497181832790375,
-0.07310296595096588,
0.023426489904522896,
0.07358450442552567,
-0.32502636313438416,
0.0007028080872260034,
-0.037472598254680634,
-0.11300822347402573,
-0.0015644936356693506,
-0.015656771138310432,
-0.024684472009539604,
0.045641712844371796,
-0.04459792375564575,
0.06945402175188065,
0.04179277643561363,
0.036953795701265335,
-0.02446134015917778,
-0.09180349856615067,
0.169721782207489,
0.04147201403975487,
0.09482420235872269,
0.02520124614238739,
0.07875222712755203,
0.05603938177227974,
0.03482351452112198,
-0.09487106651067734,
0.04129882529377937,
0.014302017167210579,
-0.09172727912664413,
-0.0518517829477787,
0.12595826387405396,
-0.0030805624555796385,
0.04320002719759941,
0.04442829638719559,
-0.10269945859909058,
0.0151868537068367,
0.07277967780828476,
-0.0724203810095787,
-0.09882649034261703,
-0.008501140400767326,
-0.09207578748464584,
0.15591244399547577,
0.14289486408233643,
-0.01650373823940754,
0.024174019694328308,
-0.06632202863693237,
-0.005678707268089056,
0.051606107503175735,
0.015294816344976425,
-0.016270430758595467,
-0.19234128296375275,
0.03431323543190956,
-0.08553487807512283,
-0.005035670939832926,
-0.22828425467014313,
-0.10302305221557617,
-0.011239546351134777,
-0.049896687269210815,
-0.02147970348596573,
0.06050112470984459,
0.030827859416604042,
0.0645965188741684,
-0.017511649057269096,
-0.0397103875875473,
-0.02810872532427311,
0.08593664318323135,
-0.10523197799921036,
-0.061013054102659225
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
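Since the checkpoint was trained with the MLM objective, a quick sanity check is a fill-mask query. The sketch below assumes the uploaded weights still include the pre-training MLM head; if they do not, Transformers will initialize a fresh head and the predictions will not be meaningful.
```
from transformers import pipeline

# Fill-mask sanity check. Assumes the checkpoint still carries the
# pre-training MLM head (otherwise the head is randomly initialized).
unmasker = pipeline("fill-mask", model="google/multiberts-seed_2-step_200k")
print(unmasker("The capital of France is [MASK]."))
```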
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_200k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_200k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
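If you need one fixed-size vector per sentence rather than per-token states, a common recipe is to mean-pool the final hidden states over non-padding tokens. This is a generic pooling choice, not something the MultiBERTs release prescribes; the sketch reuses `model` and `encoded_input` from the PyTorch snippet above.
```
import torch

# Mean-pool the last hidden layer over non-padding tokens to obtain one
# 768-dimensional vector per input sentence.
with torch.no_grad():
    output = model(**encoded_input)
mask = encoded_input["attention_mask"].unsqueeze(-1).float()
sentence_vec = (output.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_vec.shape)  # torch.Size([1, 768])
```
The `pooler_output` field is also available, but it is trained through the NSP head and is often a weaker sentence representation than simple pooling of the hidden states.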
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_200k"]}
| null |
google/multiberts-seed_2-step_200k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_200k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 200k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07677799463272095,
0.10402756184339523,
-0.002596095437183976,
0.03924419730901718,
0.07821153104305267,
-0.01544380933046341,
0.07928236573934555,
0.10237176716327667,
-0.012430669739842415,
0.028139881789684296,
0.07816006988286972,
0.002575141144916415,
0.019056832417845726,
0.1028696671128273,
0.02590017020702362,
-0.2261834293603897,
0.021785398945212364,
-0.03170953318476677,
-0.09208974987268448,
0.07658831030130386,
0.09880510717630386,
-0.07939200848340988,
0.043682124465703964,
0.02672075480222702,
-0.11740042269229889,
0.046280425041913986,
0.003916493616998196,
-0.017259368672966957,
0.13329589366912842,
-0.0020772942807525396,
0.05080978572368622,
0.05361822992563248,
0.04675808176398277,
-0.13342100381851196,
0.006206519436091185,
0.05844716355204582,
0.05985446274280548,
0.04567879065871239,
0.02372881956398487,
0.07727646827697754,
0.00980137474834919,
0.027489356696605682,
0.04250267520546913,
0.024319689720869064,
-0.07315130531787872,
-0.06433580070734024,
-0.10193008929491043,
0.0391986258327961,
0.02826686017215252,
0.004489954095333815,
0.009984386153519154,
0.12185963243246078,
-0.03147471323609352,
0.04006459563970566,
0.17876774072647095,
-0.34145015478134155,
-0.014063660986721516,
0.07026825845241547,
0.04220973700284958,
0.12476786226034164,
-0.003003893420100212,
-0.019816415384411812,
0.07524046301841736,
0.02626059390604496,
0.08779562264680862,
-0.03977179527282715,
0.02835753932595253,
-0.05525606498122215,
-0.15630000829696655,
-0.047882452607154846,
0.0927739292383194,
-0.003084347816184163,
-0.13887432217597961,
-0.03361262381076813,
-0.04643789306282997,
0.04124726727604866,
0.012879123911261559,
-0.03815401718020439,
0.03362765535712242,
0.018816104158759117,
-0.021040266379714012,
-0.012917472049593925,
-0.1060740277171135,
-0.054208192974328995,
0.029664142057299614,
0.08254387229681015,
0.10115780681371689,
0.06816175580024719,
0.00012881595466751605,
0.11193695664405823,
-0.18459628522396088,
-0.04994986578822136,
-0.028133364394307137,
-0.05421638488769531,
-0.04798150435090065,
-0.008674085140228271,
-0.10726656764745712,
-0.0470462329685688,
0.008490407839417458,
0.1255987286567688,
0.0022582041565328836,
0.024797018617391586,
-0.033622048795223236,
0.010199530981481075,
0.055835019797086716,
0.0437755323946476,
-0.007828284054994583,
0.02599448710680008,
0.020943840965628624,
-0.013916089199483395,
-0.02026987075805664,
0.014166236855089664,
-0.00028575790929608047,
0.02738155797123909,
0.12036155164241791,
0.024034643545746803,
-0.10109087079763412,
0.06312117725610733,
-0.01997010037302971,
-0.04524281993508339,
0.00452290428802371,
-0.08955308794975281,
-0.05448073893785477,
-0.04055658355355263,
0.00169078609906137,
0.012337494641542435,
-0.00555542204529047,
-0.006409337278455496,
-0.022661643102765083,
-0.036550428718328476,
-0.08413438498973846,
-0.044637300074100494,
-0.052254270762205124,
-0.1254207044839859,
0.009189276956021786,
-0.17677699029445648,
-0.03842461481690407,
-0.11509334295988083,
-0.17975252866744995,
-0.02046835608780384,
0.06396427005529404,
-0.012952038086950779,
-0.056854184716939926,
0.07880391925573349,
0.041633300483226776,
-0.02823195420205593,
-0.0016491500427946448,
0.07140523940324783,
-0.0013596138451248407,
0.04579753056168556,
-0.03146689385175705,
0.06999658793210983,
0.006522227078676224,
0.032852839678525925,
-0.05533184856176376,
0.06359753012657166,
-0.16794688999652863,
0.04381430894136429,
-0.07054741680622101,
-0.03369826823472977,
-0.08722997456789017,
-0.03387739136815071,
-0.012931370176374912,
0.005845905281603336,
0.02218650095164776,
0.0757620707154274,
-0.19485262036323547,
-0.03014439158141613,
0.13226620852947235,
-0.16228348016738892,
-0.01883402280509472,
0.07268564403057098,
-0.04641124978661537,
0.1063743308186531,
0.0700010433793068,
0.15409770607948303,
-0.01209616381675005,
-0.07913503795862198,
0.055627454072237015,
-0.011509633623063564,
0.012026974931359291,
-0.009914224967360497,
0.07036696374416351,
-0.020807217806577682,
-0.15530358254909515,
0.03359800949692726,
-0.1285814791917801,
-0.004280657507479191,
-0.07793592661619186,
0.01704826019704342,
-0.012543240562081337,
-0.0651019737124443,
-0.06265069544315338,
-0.026950906962156296,
0.06497018039226532,
-0.07399917393922806,
-0.017902176827192307,
0.04080581292510033,
0.07304193824529648,
-0.07087129354476929,
0.06618379056453705,
-0.013788276351988316,
0.011897744610905647,
-0.08486779034137726,
-0.03664275258779526,
-0.18922138214111328,
0.05040683224797249,
0.10083334147930145,
0.0054587568156421185,
-0.01957864686846733,
0.14898928999900818,
0.006782955024391413,
0.06767792999744415,
-0.049673739820718765,
0.012657025828957558,
-0.013994533568620682,
-0.0053856284357607365,
-0.09111333638429642,
-0.09561450034379959,
-0.07575821876525879,
-0.06712963432073593,
0.08792649954557419,
-0.12675271928310394,
0.020395712926983833,
-0.053316403180360794,
0.045679427683353424,
0.022641725838184357,
-0.08316975086927414,
-0.017835019156336784,
0.01217682845890522,
-0.05892280489206314,
-0.05846340209245682,
0.038956400007009506,
0.0714796930551529,
-0.013527710922062397,
0.09210187196731567,
-0.04716269299387932,
-0.08221492916345596,
0.03195498511195183,
0.10600647330284119,
-0.10396397113800049,
0.016422856599092484,
-0.05834861844778061,
-0.04126901552081108,
-0.0677080824971199,
-0.016955671831965446,
0.07474470883607864,
-0.004637038800865412,
0.1346370130777359,
-0.07437901943922043,
-0.0022441158071160316,
0.013915283605456352,
-0.02434227429330349,
-0.02255895920097828,
0.03564215078949928,
0.06344705075025558,
-0.07946474105119705,
0.015143787488341331,
0.04088548198342323,
0.012662379071116447,
0.06972633302211761,
-0.05262003839015961,
-0.08909192681312561,
0.009403897449374199,
0.03870682418346405,
0.03139294683933258,
0.06819292157888412,
-0.014505833387374878,
-0.011189879849553108,
0.03294375166296959,
0.017115429043769836,
0.0040183402597904205,
-0.11897137016057968,
0.06021376699209213,
0.05678162723779678,
0.0018258566269651055,
0.06418855488300323,
-0.016308490186929703,
-0.03981608524918556,
0.08204719424247742,
0.037233106791973114,
0.0035210149362683296,
-0.014365613460540771,
-0.015155518427491188,
-0.1181761771440506,
0.18959355354309082,
-0.06282119452953339,
-0.16707000136375427,
-0.07786526530981064,
-0.09448990225791931,
0.004536218475550413,
0.0283501036465168,
0.03798334300518036,
-0.021326685324311256,
-0.04278912395238876,
-0.12337861955165863,
0.05974584445357323,
-0.038424476981163025,
0.06820820271968842,
0.10692697763442993,
-0.03840314969420433,
0.05810843035578728,
-0.12471730262041092,
-0.008253178559243679,
-0.08179979026317596,
-0.07674496620893478,
0.060202013701200485,
-0.04791121929883957,
0.025937890633940697,
0.09966489672660828,
0.02545916847884655,
-0.018031621351838112,
-0.022960402071475983,
0.19855666160583496,
0.04233551025390625,
0.039970386773347855,
0.1292552500963211,
-0.06367845833301544,
0.05663931742310524,
0.08465559780597687,
0.009437077678740025,
-0.04501170665025711,
0.05341244488954544,
0.04376249387860298,
-0.06868555396795273,
-0.1959359347820282,
-0.024481669068336487,
-0.006791202817112207,
-0.044065531343221664,
0.07762479037046432,
0.03788815811276436,
0.0019886691588908434,
0.06872368603944778,
0.011512520723044872,
0.06172255054116249,
-0.003466755850240588,
0.09792932868003845,
0.00747070973739028,
-0.03336251527070999,
0.08788338303565979,
-0.020196232944726944,
-0.010169722139835358,
0.08544786274433136,
-0.021105289459228516,
0.2948204576969147,
-0.02843361534178257,
0.007145938463509083,
0.12078680098056793,
0.04330098256468773,
0.06422311067581177,
0.12496664375066757,
-0.06156450882554054,
0.023303542286157608,
-0.07358763366937637,
-0.05979886278510094,
-0.005502796731889248,
0.049626532942056656,
-0.05523191764950752,
0.013513850048184395,
-0.07442958652973175,
0.02183311991393566,
-0.020636381581425667,
0.3108159005641937,
0.11397659778594971,
-0.10091898590326309,
-0.058949656784534454,
0.00879939366132021,
-0.10093026608228683,
-0.07452784478664398,
0.04209918528795242,
0.08069972693920135,
-0.13654397428035736,
0.003382918192073703,
-0.02767101675271988,
0.07356245070695877,
-0.017644520848989487,
0.016672439873218536,
0.02741052582859993,
0.034599777311086655,
-0.034970834851264954,
0.0095436479896307,
-0.1801346242427826,
0.19759415090084076,
0.0064915623515844345,
0.01992260478436947,
-0.0512746125459671,
0.03349824622273445,
0.007576161064207554,
-0.03462240844964981,
0.06430122256278992,
0.024082312360405922,
-0.040090035647153854,
-0.045692961663007736,
-0.05115900933742523,
0.014198314398527145,
0.07881572097539902,
-0.047547996044158936,
0.10479521751403809,
-0.00680088996887207,
0.04155740514397621,
0.01916314847767353,
0.0914352610707283,
-0.18164142966270447,
-0.08819108456373215,
0.03197978064417839,
-0.059810612350702286,
-0.09681272506713867,
-0.08292306959629059,
-0.09369838237762451,
0.016986999660730362,
0.25148335099220276,
-0.12210343778133392,
-0.07456742972135544,
-0.09397023171186447,
0.03094329684972763,
0.10634154081344604,
-0.050200268626213074,
0.025882406160235405,
-0.004317710641771555,
0.13493120670318604,
-0.06616048514842987,
-0.13453678786754608,
0.02075052820146084,
-0.08998943120241165,
-0.1644938588142395,
-0.0672062337398529,
0.11845634877681732,
0.06205148622393608,
0.0363340862095356,
-0.026910601183772087,
0.023663846775889397,
0.038161445409059525,
-0.036898545920848846,
0.0018120252061635256,
0.0740346908569336,
0.09761884808540344,
0.02974461019039154,
-0.11095826327800751,
0.020828306674957275,
-0.06417084485292435,
-0.06632748246192932,
0.07827150076627731,
0.26010194420814514,
-0.05948035418987274,
0.127061128616333,
0.11179941147565842,
-0.08089885860681534,
-0.15378667414188385,
0.02842034213244915,
0.09340395778417587,
-0.014865703880786896,
0.013629274442791939,
-0.16084961593151093,
0.0885055735707283,
0.11326486617326736,
-0.02377266064286232,
0.00799661036580801,
-0.19397374987602234,
-0.12939053773880005,
0.06774038821458817,
0.09576801210641861,
0.27437740564346313,
-0.05987674742937088,
-0.04519340768456459,
0.015256745740771294,
-0.08940780162811279,
0.02170327492058277,
0.12213670462369919,
0.06360228359699249,
-0.023950647562742233,
-0.07362274825572968,
0.015307433903217316,
-0.03840689733624458,
0.09520747512578964,
0.052052270621061325,
0.05417672172188759,
-0.0031710003968328238,
0.015922820195555687,
-0.02744457684457302,
-0.044433873146772385,
0.06133118271827698,
0.02107812464237213,
0.047514982521533966,
-0.08019541949033737,
-0.028598738834261894,
-0.0680491104722023,
0.027738334611058235,
-0.023781152442097664,
-0.07673671096563339,
-0.05807981267571449,
0.07819220423698425,
0.048566244542598724,
-0.024649638682603836,
0.01273919828236103,
0.030262112617492676,
0.11996761709451675,
0.16229121387004852,
-0.00416603684425354,
-0.047887153923511505,
-0.06041577830910683,
-0.037610720843076706,
-0.016969533637166023,
0.07481390237808228,
-0.048258695751428604,
0.024716248735785484,
0.0650695264339447,
0.021766647696495056,
0.09753701835870743,
0.05670274421572685,
-0.11560162156820297,
-0.017873862758278847,
0.032818522304296494,
-0.16318227350711823,
0.004800685681402683,
-0.00007697974069742486,
0.03828025609254837,
-0.03674791380763054,
0.026221899315714836,
0.15098117291927338,
-0.06626960635185242,
-0.03738439455628395,
-0.04116804525256157,
0.06875612586736679,
0.020930428057909012,
0.13774867355823517,
0.032495155930519104,
0.03779371827840805,
-0.07947712391614914,
0.12309470772743225,
0.03922814130783081,
-0.03776266798377037,
0.023254945874214172,
-0.0291915126144886,
-0.1066036969423294,
0.013338887132704258,
0.06089957058429718,
0.03623713180422783,
-0.045487597584724426,
-0.009445278905332088,
-0.02199404127895832,
-0.07651808112859726,
0.059140123426914215,
0.18403728306293488,
0.06524858623743057,
0.07633259892463684,
-0.058190762996673584,
-0.038273025304079056,
-0.07641074806451797,
0.04211672767996788,
0.04664251580834389,
0.07296937704086304,
-0.07662419229745865,
0.10839949548244476,
0.01002898532897234,
0.04554160684347153,
-0.03078758902847767,
-0.05375538766384125,
-0.09959392994642258,
-0.054685767740011215,
-0.10623420774936676,
0.00922678504139185,
-0.07365459203720093,
-0.042556196451187134,
0.0006077661528252065,
-0.00692465715110302,
-0.0064676981419324875,
0.04700889810919762,
-0.06327877938747406,
-0.009584582410752773,
-0.028296535834670067,
0.034962497651576996,
-0.0655033141374588,
-0.03768807277083397,
0.0319751501083374,
-0.10344205796718597,
0.09079653769731522,
0.05317848175764084,
0.008189186453819275,
0.008952032774686813,
0.09482821077108383,
-0.022851258516311646,
0.02396218664944172,
0.014813287183642387,
-0.046036578714847565,
-0.08565886318683624,
0.0021436531096696854,
-0.00798796396702528,
-0.01660393737256527,
-0.010945319198071957,
0.08836086094379425,
-0.08525760471820831,
0.031194062903523445,
-0.009303142316639423,
-0.00841493345797062,
-0.07502920180559158,
-0.01146155595779419,
0.10098662972450256,
0.09975485503673553,
0.04796047881245613,
-0.09075288474559784,
0.01330006867647171,
-0.1423443704843521,
-0.03661845251917839,
0.004893070086836815,
-0.008688579313457012,
-0.12171640247106552,
-0.00991791021078825,
0.019101543352007866,
-0.0028129464481025934,
0.20270444452762604,
-0.056861672550439835,
-0.01745457388460636,
0.019042085856199265,
-0.10053911060094833,
0.11011701822280884,
-0.02210097759962082,
0.17731483280658722,
-0.005486547946929932,
-0.041044872254133224,
-0.016651425510644913,
0.03492853790521622,
0.018233904615044594,
-0.024890456348657608,
0.18570096790790558,
0.13560223579406738,
0.0341012179851532,
0.04200325906276703,
-0.024932004511356354,
0.00233623874373734,
-0.05794001743197441,
-0.02716122195124626,
0.02929636649787426,
0.03813362494111061,
0.018959050998091698,
0.1597655862569809,
0.06876083463430405,
-0.16876523196697235,
0.032894887030124664,
-0.029650770127773285,
-0.03786078467965126,
-0.11756580322980881,
-0.09793754667043686,
-0.03409057855606079,
-0.0717003121972084,
0.0077248080633580685,
-0.12215612828731537,
0.008370798081159592,
0.1749250441789627,
0.05478290468454361,
0.025551805272698402,
-0.00040138157783076167,
-0.12136080116033554,
-0.0345199815928936,
0.05351043492555618,
0.015319520607590675,
0.024049120023846626,
0.05821024253964424,
0.0010746159823611379,
0.05905766040086746,
0.04428122192621231,
0.015462878160178661,
0.004070769529789686,
0.08023598790168762,
0.016520323231816292,
0.0378001369535923,
-0.060554686933755875,
-0.0038959356024861336,
-0.041307657957077026,
0.06980189681053162,
0.09588521718978882,
0.05083823204040527,
-0.05209202691912651,
-0.006591874174773693,
0.1617926061153412,
-0.043628398329019547,
-0.0035363573115319014,
-0.1271437257528305,
0.33544057607650757,
0.01109287142753601,
0.014177665114402771,
0.04670790955424309,
-0.07578986138105392,
-0.05305581912398338,
0.19841155409812927,
0.08612440526485443,
-0.016971057280898094,
-0.021581457927823067,
0.0006161644705571234,
-0.030968397855758667,
-0.024621928110718727,
0.15125232934951782,
0.03509651869535446,
0.13407419621944427,
-0.05624186992645264,
-0.045815955847501755,
-0.028502488508820534,
-0.009654691442847252,
-0.12601108849048615,
0.13890956342220306,
-0.03218812122941017,
-0.02159278653562069,
-0.07248588651418686,
0.026399260386824608,
0.0725429579615593,
-0.3170880675315857,
0.00045211741235107183,
-0.03164053335785866,
-0.10888367891311646,
-0.001121422043070197,
-0.018486954271793365,
-0.025048676878213882,
0.04889228940010071,
-0.048553913831710815,
0.0697101503610611,
0.04467915743589401,
0.03487811237573624,
-0.02324698679149151,
-0.09287329018115997,
0.16654625535011292,
0.0467853918671608,
0.0954287126660347,
0.026355784386396408,
0.07320304960012436,
0.05399788171052933,
0.03687232360243797,
-0.0956178680062294,
0.04228701442480087,
0.013060430996119976,
-0.08578596264123917,
-0.05150403454899788,
0.12316546589136124,
-0.0034796264953911304,
0.040954601019620895,
0.04599735140800476,
-0.10736607760190964,
0.009657722897827625,
0.06649129837751389,
-0.06645331531763077,
-0.09927161037921906,
-0.005455535836517811,
-0.09128531068563461,
0.15532951056957245,
0.14227816462516785,
-0.018764974549412727,
0.02075084298849106,
-0.0667279064655304,
-0.006759637501090765,
0.05256596580147743,
0.012992179952561855,
-0.015309998765587807,
-0.18787959218025208,
0.03168976679444313,
-0.08049905300140381,
-0.004732516594231129,
-0.22804194688796997,
-0.10441843420267105,
-0.00955233070999384,
-0.05002189427614212,
-0.025767570361495018,
0.06100296229124069,
0.029046855866909027,
0.06581176072359085,
-0.018225878477096558,
-0.039718907326459885,
-0.028411174193024635,
0.08885882049798965,
-0.10589584708213806,
-0.06321252882480621
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 20k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 20k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_20k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_20k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_20k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_20k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
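Because this is a very early snapshot (20k of 2M steps), it can be instructive to compare its representations with a later checkpoint from the same seed. A rough sketch, assuming the fully trained `google/multiberts-seed_2` checkpoint is available on the Hub:
```
import torch
from transformers import BertTokenizer, BertModel

early_name = "google/multiberts-seed_2-step_20k"
late_name = "google/multiberts-seed_2"  # assumed fully trained seed-2 checkpoint

tokenizer = BertTokenizer.from_pretrained(early_name)
early = BertModel.from_pretrained(early_name)
late = BertModel.from_pretrained(late_name)

inputs = tokenizer("Replace me by any text you'd like.", return_tensors="pt")
with torch.no_grad():
    # Mean-pool each model's final hidden states into one vector per input.
    e = early(**inputs).last_hidden_state.mean(dim=1)
    l = late(**inputs).last_hidden_state.mean(dim=1)

# Cosine similarity of the pooled representations across training stages.
print(torch.nn.functional.cosine_similarity(e, l).item())
```
Early checkpoints typically diverge noticeably from the final model, so a similarity well below 1.0 is not surprising.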
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_20k"]}
| null |
google/multiberts-seed_2-step_20k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_20k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 20k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 20k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 20k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 20k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 20k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07565965503454208,
0.10373463481664658,
-0.002562273060902953,
0.03523759916424751,
0.07616200298070908,
-0.01728234812617302,
0.08617334812879562,
0.10300114005804062,
-0.01729215309023857,
0.03233017772436142,
0.08028404414653778,
0.00463893124833703,
0.016956588253378868,
0.10438623279333115,
0.02310211770236492,
-0.22782675921916962,
0.02150878496468067,
-0.03172188624739647,
-0.08419008553028107,
0.07710106670856476,
0.10048982501029968,
-0.08151381462812424,
0.04337511211633682,
0.029760463163256645,
-0.11852925270795822,
0.04762686789035797,
0.0035277048591524363,
-0.019254174083471298,
0.13308411836624146,
-0.0020400111097842455,
0.05142901465296745,
0.05420374497771263,
0.042745016515254974,
-0.13277150690555573,
0.00621422566473484,
0.05979613959789276,
0.055814750492572784,
0.04681747406721115,
0.026014136150479317,
0.07433487474918365,
0.015413305722177029,
0.017874808982014656,
0.04528475180268288,
0.022409817203879356,
-0.07339337468147278,
-0.07000043988227844,
-0.10224251449108124,
0.04111296311020851,
0.026966385543346405,
0.0048772734589874744,
0.009910857304930687,
0.1278948038816452,
-0.034197431057691574,
0.04569537192583084,
0.1867077499628067,
-0.341585248708725,
-0.014767533168196678,
0.0696888193488121,
0.044091951102018356,
0.1269134134054184,
-0.0030135365668684244,
-0.019096380099654198,
0.07503034919500351,
0.023302745074033737,
0.0862715095281601,
-0.04086164012551308,
0.0330386608839035,
-0.053373612463474274,
-0.1574499011039734,
-0.04869356378912926,
0.09728486835956573,
-0.003861368400976062,
-0.1354333758354187,
-0.03931456804275513,
-0.04756169766187668,
0.04226762428879738,
0.01005117129534483,
-0.0372791588306427,
0.03580548241734505,
0.01609102450311184,
-0.02144334279000759,
-0.009814206510782242,
-0.10464514791965485,
-0.05311133712530136,
0.031691208481788635,
0.08112603425979614,
0.10230440646409988,
0.06539130210876465,
-0.0037339820992201567,
0.1085512638092041,
-0.19623999297618866,
-0.05081750825047493,
-0.026403089985251427,
-0.050330884754657745,
-0.047795794904232025,
-0.010418218560516834,
-0.1111634224653244,
-0.053027208894491196,
0.009726340882480145,
0.13350775837898254,
-0.0018065250478684902,
0.027356160804629326,
-0.0393148772418499,
0.01014320831745863,
0.05809752270579338,
0.0449196919798851,
-0.005968373268842697,
0.023107409477233887,
0.021813541650772095,
-0.019135909155011177,
-0.016087813302874565,
0.013473300263285637,
0.000759222952183336,
0.029063265770673752,
0.12070322781801224,
0.022078964859247208,
-0.10103710740804672,
0.06661058217287064,
-0.017088791355490685,
-0.04491715505719185,
0.0071911895647645,
-0.09068772941827774,
-0.054521411657333374,
-0.03782693296670914,
0.0015964831691235304,
0.01244360487908125,
-0.005645894445478916,
-0.00579097680747509,
-0.0215475894510746,
-0.038718026131391525,
-0.08575700968503952,
-0.04609109088778496,
-0.05171557143330574,
-0.1274392157793045,
0.01003868319094181,
-0.17808489501476288,
-0.0365230031311512,
-0.10957356542348862,
-0.18689629435539246,
-0.02193802408874035,
0.06243490055203438,
-0.011214776895940304,
-0.057783931493759155,
0.07762797921895981,
0.042121656239032745,
-0.027888460084795952,
-0.0006422584410756826,
0.0743185430765152,
-0.0009743271511979401,
0.04452833905816078,
-0.02867801859974861,
0.06825491786003113,
0.004752046428620815,
0.03353167697787285,
-0.053359098732471466,
0.06493974477052689,
-0.17317843437194824,
0.044161438941955566,
-0.07317008078098297,
-0.0338154211640358,
-0.08644098043441772,
-0.0339416041970253,
-0.005828372668474913,
0.006156248040497303,
0.021565305069088936,
0.0708400309085846,
-0.1872914731502533,
-0.0344872809946537,
0.13576988875865936,
-0.162973091006279,
-0.01652660220861435,
0.07062005251646042,
-0.047896891832351685,
0.10491649061441422,
0.06952027976512909,
0.16059720516204834,
-0.014031202532351017,
-0.07674024254083633,
0.05631312355399132,
-0.011939985677599907,
0.013255560770630836,
-0.009019877761602402,
0.07225412875413895,
-0.018847079947590828,
-0.14828306436538696,
0.034389678388834,
-0.13101764023303986,
-0.0062038106843829155,
-0.07745157182216644,
0.018587816506624222,
-0.013235596939921379,
-0.06201300397515297,
-0.06814014911651611,
-0.02543519251048565,
0.06757514923810959,
-0.07644473016262054,
-0.017672641202807426,
0.04375261813402176,
0.0739431157708168,
-0.07239463180303574,
0.06462923437356949,
-0.014445166103541851,
0.012938612140715122,
-0.08289661258459091,
-0.037456005811691284,
-0.19011127948760986,
0.04931877180933952,
0.10136241465806961,
0.0066970945335924625,
-0.01807357929646969,
0.15065164864063263,
0.008209616877138615,
0.0678490474820137,
-0.04651106148958206,
0.014151637442409992,
-0.011355862952768803,
-0.005734242033213377,
-0.09114358574151993,
-0.10237551480531693,
-0.07237856835126877,
-0.06961733847856522,
0.09265094250440598,
-0.13118572533130646,
0.02131318859755993,
-0.05028664693236351,
0.047976840287446976,
0.020719313994050026,
-0.08373825252056122,
-0.016769547015428543,
0.010118012316524982,
-0.05961045250296593,
-0.05820942670106888,
0.03906434029340744,
0.07057935744524002,
-0.01280855294317007,
0.09129395335912704,
-0.05344101041555405,
-0.08485406637191772,
0.03228660672903061,
0.10201467573642731,
-0.1034095510840416,
0.01063993014395237,
-0.0602584034204483,
-0.042173344641923904,
-0.061508335173130035,
-0.014099420048296452,
0.07870087027549744,
-0.006160942371934652,
0.13832326233386993,
-0.07525572925806046,
-0.005220446735620499,
0.01503078918904066,
-0.02310553938150406,
-0.023258162662386894,
0.03891923651099205,
0.06070651859045029,
-0.08410296589136124,
0.017312398180365562,
0.04009341821074486,
0.01173108909279108,
0.07160558551549911,
-0.05619899928569794,
-0.09117742627859116,
0.00993047934025526,
0.04049012064933777,
0.030331511050462723,
0.06581781059503555,
-0.02389354817569256,
-0.016074301674962044,
0.035858072340488434,
0.014466413296759129,
0.005900735501199961,
-0.11942959576845169,
0.06168002262711525,
0.056688226759433746,
0.004138750024139881,
0.05962067469954491,
-0.01574435457587242,
-0.04003790766000748,
0.08028174191713333,
0.03700162097811699,
0.0009492678800597787,
-0.013526060618460178,
-0.015990637242794037,
-0.11764390766620636,
0.18939588963985443,
-0.06326606869697571,
-0.1643371731042862,
-0.07611098885536194,
-0.10297150164842606,
0.000996066490188241,
0.026076607406139374,
0.04000146687030792,
-0.019170226529240608,
-0.045207779854536057,
-0.12133287638425827,
0.057446498423814774,
-0.04373547434806824,
0.06719265878200531,
0.10926003754138947,
-0.04056764021515846,
0.05600081384181976,
-0.1261695772409439,
-0.00736759789288044,
-0.08186769485473633,
-0.07500516623258591,
0.061632007360458374,
-0.048567112535238266,
0.022924700751900673,
0.09612128883600235,
0.02686367556452751,
-0.015606257133185863,
-0.024178791791200638,
0.19716107845306396,
0.03949211910367012,
0.03597879409790039,
0.1297009438276291,
-0.06386955082416534,
0.055995043367147446,
0.08004581183195114,
0.007832205854356289,
-0.043699026107788086,
0.04997914284467697,
0.04508044198155403,
-0.0671132281422615,
-0.19794560968875885,
-0.024451078847050667,
-0.006564413197338581,
-0.04491386562585831,
0.07740563154220581,
0.037427935749292374,
0.009110599756240845,
0.06696023046970367,
0.010963205248117447,
0.06637993454933167,
-0.0027404665015637875,
0.10046660900115967,
0.015338700264692307,
-0.0364266112446785,
0.08849570900201797,
-0.019748913124203682,
-0.008615120314061642,
0.08494146168231964,
-0.02463725581765175,
0.29091936349868774,
-0.025595391169190407,
0.018305666744709015,
0.1209818422794342,
0.03981461375951767,
0.06295625120401382,
0.12408926337957382,
-0.061351172626018524,
0.02136555127799511,
-0.07507120072841644,
-0.060954198241233826,
-0.003836636431515217,
0.051705874502658844,
-0.058386702090501785,
0.009421117603778839,
-0.07110676914453506,
0.01859886571764946,
-0.019681215286254883,
0.31089383363723755,
0.1083718091249466,
-0.10520016402006149,
-0.05790627375245094,
0.006579477805644274,
-0.10334222763776779,
-0.07369499653577805,
0.041883088648319244,
0.0750117152929306,
-0.13585005700588226,
0.008925147354602814,
-0.027619536966085434,
0.07391264289617538,
-0.01697535067796707,
0.017966443672776222,
0.02207302860915661,
0.03847325220704079,
-0.03462330251932144,
0.011808183044195175,
-0.1857319325208664,
0.19668425619602203,
0.008256549015641212,
0.01589028351008892,
-0.05503016337752342,
0.034093745052814484,
0.005853163078427315,
-0.03237522393465042,
0.06413013488054276,
0.02233222872018814,
-0.03967521712183952,
-0.0393652617931366,
-0.052283987402915955,
0.014575330540537834,
0.08223547041416168,
-0.04590176045894623,
0.1052078828215599,
-0.006247739307582378,
0.0418887622654438,
0.021413009613752365,
0.09077655524015427,
-0.17751984298229218,
-0.0882124975323677,
0.03482716158032417,
-0.057041436433792114,
-0.10227373242378235,
-0.08402934670448303,
-0.09350714087486267,
0.0033333913888782263,
0.25908777117729187,
-0.12096075713634491,
-0.07279665023088455,
-0.09645380079746246,
0.03643721714615822,
0.10398586839437485,
-0.05059812217950821,
0.026636263355612755,
-0.006202341057360172,
0.13658462464809418,
-0.06772324442863464,
-0.13413342833518982,
0.023839691653847694,
-0.0905386209487915,
-0.16518497467041016,
-0.06624852865934372,
0.11881613731384277,
0.05923961102962494,
0.03631756082177162,
-0.029125886037945747,
0.02569049596786499,
0.033689986914396286,
-0.03513124957680702,
0.0007791774696670473,
0.07745078951120377,
0.09733058512210846,
0.02756386250257492,
-0.1060587540268898,
0.02506968192756176,
-0.06124754995107651,
-0.06530710309743881,
0.08093822747468948,
0.26249098777770996,
-0.05743962898850441,
0.1283682882785797,
0.10757327824831009,
-0.07802063226699829,
-0.15037140250205994,
0.029718520119786263,
0.09422524273395538,
-0.015515037812292576,
0.012634471990168095,
-0.16189256310462952,
0.08800673484802246,
0.11308149248361588,
-0.0233988668769598,
0.004127754829823971,
-0.18929456174373627,
-0.12701475620269775,
0.06990084797143936,
0.0969875305891037,
0.27191779017448425,
-0.060878921300172806,
-0.046928904950618744,
0.018173225224018097,
-0.08088404685258865,
0.02261979505419731,
0.11095529794692993,
0.06317206472158432,
-0.022543691098690033,
-0.07407047599554062,
0.014857715927064419,
-0.040566377341747284,
0.09218057990074158,
0.052089180797338486,
0.056254006922245026,
-0.002513346029445529,
0.018148336559534073,
-0.016490796580910683,
-0.043566104024648666,
0.06137906014919281,
0.02027144841849804,
0.04778067767620087,
-0.07998117804527283,
-0.02754228003323078,
-0.06846551597118378,
0.02834967151284218,
-0.024523740634322166,
-0.07544445246458054,
-0.05780094116926193,
0.07461950182914734,
0.04754561558365822,
-0.02371181920170784,
0.024430327117443085,
0.030878815799951553,
0.11756432056427002,
0.1659363955259323,
-0.005284237675368786,
-0.04501577839255333,
-0.05914885178208351,
-0.041363105177879333,
-0.015155326575040817,
0.07364103198051453,
-0.052981406450271606,
0.02227986976504326,
0.06276322156190872,
0.025667954236268997,
0.10023009777069092,
0.05470169335603714,
-0.11855325102806091,
-0.019212668761610985,
0.03125658631324768,
-0.16767676174640656,
0.005781815852969885,
-0.001030500279739499,
0.03704705834388733,
-0.03282145783305168,
0.02887772209942341,
0.15168166160583496,
-0.06323690712451935,
-0.036696288734674454,
-0.041177064180374146,
0.06687372177839279,
0.022369343787431717,
0.13541057705879211,
0.030699897557497025,
0.037779800593853,
-0.0834440067410469,
0.11970286816358566,
0.042132310569286346,
-0.039477042853832245,
0.025062989443540573,
-0.02541179396212101,
-0.10663417726755142,
0.014633233658969402,
0.062408845871686935,
0.043946199119091034,
-0.04514317214488983,
-0.013198764994740486,
-0.026636498048901558,
-0.07241165637969971,
0.060265105217695236,
0.18905441462993622,
0.06554853171110153,
0.07497235387563705,
-0.05732439085841179,
-0.0369943343102932,
-0.07742739468812943,
0.0427294597029686,
0.03847556188702583,
0.07516035437583923,
-0.0761728435754776,
0.1131032332777977,
0.011244041845202446,
0.042886264622211456,
-0.030946657061576843,
-0.054731257259845734,
-0.09953731298446655,
-0.05574242025613785,
-0.09966153651475906,
0.008111360482871532,
-0.07546001672744751,
-0.042253974825143814,
0.001519126002676785,
-0.004859276115894318,
-0.0063456022180616856,
0.047249965369701385,
-0.06316778063774109,
-0.009326404891908169,
-0.02648770622909069,
0.03770475089550018,
-0.0689382404088974,
-0.03611419349908829,
0.02955322340130806,
-0.10255575180053711,
0.09200326353311539,
0.05652030557394028,
0.009525380097329617,
0.009542904794216156,
0.09049294888973236,
-0.02254771627485752,
0.025152387097477913,
0.013500921428203583,
-0.043354395776987076,
-0.08383157849311829,
0.00039115294930525124,
-0.007842853665351868,
-0.01855497434735298,
-0.010674036107957363,
0.08187273889780045,
-0.08491241186857224,
0.031042303889989853,
-0.007699157577008009,
-0.012407476082444191,
-0.0754299908876419,
-0.011476087383925915,
0.09414386004209518,
0.09719035774469376,
0.04349461942911148,
-0.08904960751533508,
0.013316520489752293,
-0.141042098402977,
-0.03732653707265854,
0.004661542363464832,
-0.01075100526213646,
-0.12095605581998825,
-0.010550430044531822,
0.018394090235233307,
-0.0052007343620061874,
0.2111819088459015,
-0.0548231266438961,
-0.01971401646733284,
0.01845625974237919,
-0.09780710190534592,
0.11255083233118057,
-0.025242669507861137,
0.18015407025814056,
-0.004323917906731367,
-0.03834708034992218,
-0.01541069895029068,
0.03713579848408699,
0.020385514944791794,
-0.029861658811569214,
0.1807319074869156,
0.13675542175769806,
0.028800273314118385,
0.04038258641958237,
-0.022858044132590294,
-0.0016664095455780625,
-0.05883721634745598,
-0.017675600945949554,
0.031108831986784935,
0.0411226712167263,
0.018817134201526642,
0.1591247320175171,
0.07238028943538666,
-0.1701735109090805,
0.033343520015478134,
-0.02580464817583561,
-0.03671428561210632,
-0.11627230048179626,
-0.09954919666051865,
-0.0359768383204937,
-0.07408959418535233,
0.007927415892481804,
-0.12388533353805542,
0.010834510438144207,
0.16863827407360077,
0.05590558424592018,
0.02601703628897667,
0.0013604635605588555,
-0.11939020454883575,
-0.035651735961437225,
0.05131113901734352,
0.015412705950438976,
0.022552112117409706,
0.05143881216645241,
0.0005785332177765667,
0.060671139508485794,
0.04083874821662903,
0.015000037848949432,
0.0008625267655588686,
0.08682582527399063,
0.01712266355752945,
0.03767368942499161,
-0.06141233816742897,
-0.00233383197337389,
-0.04126247763633728,
0.06916540861129761,
0.08926992118358612,
0.04796716198325157,
-0.052464913576841354,
-0.008340957574546337,
0.15640543401241302,
-0.042820099741220474,
0.0007476790924556553,
-0.1255904883146286,
0.34033942222595215,
0.008972043171525002,
0.012756705284118652,
0.046664752066135406,
-0.07541687786579132,
-0.050736021250486374,
0.19883060455322266,
0.08516383916139603,
-0.012434156611561775,
-0.019654441624879837,
0.0015317352954298258,
-0.031278230249881744,
-0.026509135961532593,
0.14467597007751465,
0.035180024802684784,
0.13240677118301392,
-0.05407410115003586,
-0.049310214817523956,
-0.028546109795570374,
-0.009988187812268734,
-0.12589848041534424,
0.13594898581504822,
-0.025633681565523148,
-0.025581520050764084,
-0.0684681385755539,
0.024859147146344185,
0.0753195583820343,
-0.3167078197002411,
-0.0009237253107130527,
-0.03279809653759003,
-0.10936334729194641,
-0.002558324486017227,
-0.01661684550344944,
-0.022726578637957573,
0.04750543832778931,
-0.04664177820086479,
0.06718980520963669,
0.03972053900361061,
0.03482222184538841,
-0.02521343156695366,
-0.0900111123919487,
0.167863130569458,
0.04130980372428894,
0.09278445690870285,
0.0247830580919981,
0.07649994641542435,
0.05604766309261322,
0.03504588454961777,
-0.09140484780073166,
0.04144071042537689,
0.014376698061823845,
-0.088516004383564,
-0.05506058782339096,
0.12668132781982422,
-0.0024249448906630278,
0.04281236603856087,
0.04673068970441818,
-0.10909899324178696,
0.011869409121572971,
0.06953489035367966,
-0.06921856105327606,
-0.099420927464962,
-0.006306483875960112,
-0.0911938026547432,
0.15541845560073853,
0.1439313441514969,
-0.01740408129990101,
0.021328764036297798,
-0.06918630748987198,
-0.005280046723783016,
0.05341002345085144,
0.013521984219551086,
-0.015549722127616405,
-0.1885889321565628,
0.03397727012634277,
-0.07664740830659866,
-0.003237237222492695,
-0.22769217193126678,
-0.10280939191579819,
-0.010473069734871387,
-0.05009927973151207,
-0.026760347187519073,
0.05570624768733978,
0.030561091378331184,
0.06377848982810974,
-0.01854560896754265,
-0.039845481514930725,
-0.028383426368236542,
0.08801518380641937,
-0.10651972144842148,
-0.061674848198890686
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original BERT, but we found significant differences on the SQuAD dev set (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_300k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_300k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
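Because the checkpoint was trained with the MLM and NSP objectives described above, it can also be loaded with both pretraining heads attached. The snippet below is a minimal sketch (not part of the original release); the sentence pair is arbitrary, and an intermediate checkpoint such as this one may produce noisier predictions than the fully trained model:
```
import torch
from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_300k')
model = BertForPreTraining.from_pretrained('google/multiberts-seed_2-step_300k')

# Arbitrary sentence pair: sentence A contains a [MASK] token for the MLM head,
# and the pair as a whole is scored by the NSP head.
inputs = tokenizer("The cat sat on the [MASK].", "It looked very comfortable.",
                   return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.prediction_logits.shape)        # MLM head: (batch, seq_len, vocab_size)
print(outputs.seq_relationship_logits.shape)  # NSP head: (batch, 2)
```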
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_300k"]}
| null |
google/multiberts-seed_2-step_300k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_300k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 300k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07510000467300415,
0.08995752781629562,
-0.0023659642320126295,
0.04232608899474144,
0.0796990618109703,
-0.0146208880469203,
0.08042684942483902,
0.10061945766210556,
-0.017187735065817833,
0.024379363283514977,
0.08159415423870087,
0.005028651095926762,
0.017975501716136932,
0.09357398748397827,
0.02466580830514431,
-0.22435227036476135,
0.02260228432714939,
-0.03263015300035477,
-0.08590583503246307,
0.0794474333524704,
0.1009855791926384,
-0.07817089557647705,
0.045872028917074203,
0.025829264894127846,
-0.10982398688793182,
0.04989111050963402,
0.0014613786479458213,
-0.016530048102140427,
0.1311371773481369,
-0.003685920499265194,
0.05087662488222122,
0.052416447550058365,
0.05035973712801933,
-0.13520966470241547,
0.005668978672474623,
0.056048061698675156,
0.06013131141662598,
0.04381684586405754,
0.020288364961743355,
0.07490603625774384,
0.005220938473939896,
0.025352923199534416,
0.046296969056129456,
0.02531501278281212,
-0.07661925256252289,
-0.05454552173614502,
-0.10550204664468765,
0.0367964468896389,
0.030432969331741333,
0.007336636073887348,
0.010752593167126179,
0.1303008645772934,
-0.03759268671274185,
0.042616549879312515,
0.19162151217460632,
-0.33869272470474243,
-0.012959078885614872,
0.07839986681938171,
0.044845592230558395,
0.12537629902362823,
-0.0035518212243914604,
-0.01985194906592369,
0.07411344349384308,
0.026121394708752632,
0.09214738011360168,
-0.04039204120635986,
0.03311050310730934,
-0.05539088323712349,
-0.15820442140102386,
-0.04724631831049919,
0.09262081235647202,
-0.0011365717509761453,
-0.1385190337896347,
-0.042724981904029846,
-0.04555059224367142,
0.03399420902132988,
0.013834130950272083,
-0.03843681514263153,
0.03347467631101608,
0.015899652615189552,
-0.022708190605044365,
-0.013336396776139736,
-0.10400373488664627,
-0.057111840695142746,
0.029909726232290268,
0.0892217829823494,
0.09939025342464447,
0.0687526986002922,
0.0030791163444519043,
0.11441469192504883,
-0.1776839643716812,
-0.05276027321815491,
-0.029380571097135544,
-0.05389818176627159,
-0.04697154089808464,
-0.01070154458284378,
-0.10714899748563766,
-0.048695992678403854,
0.008306127041578293,
0.13233180344104767,
0.0021187143865972757,
0.025308266282081604,
-0.02840225026011467,
0.009066108614206314,
0.059479862451553345,
0.048331886529922485,
-0.007802215404808521,
0.015598910860717297,
0.02151339314877987,
-0.009048935025930405,
-0.016086021438241005,
0.014462053775787354,
0.0016398936277255416,
0.026989705860614777,
0.1181824654340744,
0.02616404928267002,
-0.10147660970687866,
0.065106101334095,
-0.023452410474419594,
-0.04632227122783661,
0.015907760709524155,
-0.09101470559835434,
-0.05682031437754631,
-0.042692333459854126,
0.0021668775007128716,
0.016853591427206993,
-0.004812625236809254,
-0.005079654511064291,
-0.023655932396650314,
-0.036610402166843414,
-0.08132486045360565,
-0.046734802424907684,
-0.05399937555193901,
-0.1257418394088745,
0.0048147994093596935,
-0.17262370884418488,
-0.038079701364040375,
-0.11476953327655792,
-0.18669095635414124,
-0.020818812772631645,
0.06424634158611298,
-0.012003310024738312,
-0.053926363587379456,
0.0781099796295166,
0.04442664608359337,
-0.028224319219589233,
-0.0024022406432777643,
0.07101018726825714,
-0.001463505206629634,
0.045383352786302567,
-0.026092275977134705,
0.06854869425296783,
0.002841224893927574,
0.03195799142122269,
-0.05638054758310318,
0.0635242909193039,
-0.16632258892059326,
0.044578444212675095,
-0.06928882002830505,
-0.03211728110909462,
-0.08554486930370331,
-0.037414271384477615,
-0.011537348851561546,
0.006688026245683432,
0.02428157441318035,
0.07607753574848175,
-0.1808534413576126,
-0.0274039376527071,
0.12315782904624939,
-0.16099674999713898,
-0.02123243734240532,
0.07280256599187851,
-0.047734860330820084,
0.10310143232345581,
0.06946726888418198,
0.14832279086112976,
-0.016094058752059937,
-0.08151784539222717,
0.05816374719142914,
-0.012409104034304619,
0.012724940665066242,
-0.011334653943777084,
0.07298632711172104,
-0.020996427163481712,
-0.14946353435516357,
0.03290721774101257,
-0.132189080119133,
-0.00471922243013978,
-0.07850567251443863,
0.01646425575017929,
-0.010522215627133846,
-0.06836406886577606,
-0.06736540049314499,
-0.02786226198077202,
0.06484866887331009,
-0.07570167630910873,
-0.01703071966767311,
0.039533112198114395,
0.0744999349117279,
-0.0723654180765152,
0.06829316914081573,
-0.011184734292328358,
0.01156291551887989,
-0.0861835852265358,
-0.04055384546518326,
-0.18939639627933502,
0.04770513251423836,
0.10066600143909454,
0.011067983694374561,
-0.018202364444732666,
0.14253512024879456,
0.007424868177622557,
0.06803929805755615,
-0.05197376012802124,
0.011966274119913578,
-0.017423046752810478,
-0.005434621591120958,
-0.08799207955598831,
-0.09253469109535217,
-0.08011836558580399,
-0.06840254366397858,
0.08032450824975967,
-0.11643391102552414,
0.019817903637886047,
-0.055794067680835724,
0.043556779623031616,
0.02303205616772175,
-0.08545847237110138,
-0.01760508306324482,
0.013861051760613918,
-0.0605296790599823,
-0.05883660167455673,
0.03871123865246773,
0.07091064751148224,
-0.0130867725238204,
0.08703584969043732,
-0.04930635169148445,
-0.08137927204370499,
0.03212704509496689,
0.10187220573425293,
-0.10711149126291275,
0.009382817894220352,
-0.056979335844516754,
-0.03980053588747978,
-0.06993485987186432,
-0.02121618762612343,
0.07760447263717651,
-0.004204882308840752,
0.1368180513381958,
-0.07582204788923264,
-0.0051727318204939365,
0.00981154479086399,
-0.02433493360877037,
-0.02352561429142952,
0.03280724957585335,
0.06321374326944351,
-0.08354213088750839,
0.016690539196133614,
0.04076778143644333,
0.011554219760000706,
0.07243343442678452,
-0.05149166285991669,
-0.08887216448783875,
0.009938938543200493,
0.035369932651519775,
0.03069148026406765,
0.0683421865105629,
-0.011582124046981335,
-0.012245080433785915,
0.03194449469447136,
0.019379600882530212,
0.004873231984674931,
-0.11505597829818726,
0.06062702834606171,
0.05809702351689339,
0.002221890026703477,
0.06952984631061554,
-0.017284339293837547,
-0.041779182851314545,
0.08097139000892639,
0.038439009338617325,
0.003034817287698388,
-0.015399020165205002,
-0.017527835443615913,
-0.11761131882667542,
0.1896182745695114,
-0.05812634900212288,
-0.16232478618621826,
-0.07954847067594528,
-0.10134555399417877,
-0.0010406359797343612,
0.026948461309075356,
0.03619660437107086,
-0.01868618093430996,
-0.04396938532590866,
-0.1252266764640808,
0.06421350687742233,
-0.03672070801258087,
0.0670877993106842,
0.11167966574430466,
-0.03924861550331116,
0.05468432605266571,
-0.12709389626979828,
-0.010690138675272465,
-0.08381664752960205,
-0.07854791730642319,
0.057503439486026764,
-0.050780486315488815,
0.028170257806777954,
0.09398011863231659,
0.027480894699692726,
-0.015887737274169922,
-0.025187937542796135,
0.20177002251148224,
0.04222511500120163,
0.03779798746109009,
0.13262884318828583,
-0.06192977726459503,
0.05456643924117088,
0.07948031276464462,
0.01130734570324421,
-0.04807940125465393,
0.05514800176024437,
0.04279704391956329,
-0.07155168056488037,
-0.191135436296463,
-0.025329215452075005,
-0.00875815562903881,
-0.04943668842315674,
0.0773065984249115,
0.035779546946287155,
-0.0010693703079596162,
0.06899089366197586,
0.011166275478899479,
0.056282997131347656,
-0.0023276261053979397,
0.10007155686616898,
0.005478117614984512,
-0.031178880482912064,
0.0898347795009613,
-0.020967593416571617,
-0.013156411238014698,
0.08467776328325272,
-0.014699716120958328,
0.287192165851593,
-0.029830945655703545,
0.008955639787018299,
0.12006143480539322,
0.041068900376558304,
0.061462968587875366,
0.1289639174938202,
-0.06271480768918991,
0.021622231230139732,
-0.0737980306148529,
-0.059718962758779526,
-0.007690174970775843,
0.046071913093328476,
-0.05217059701681137,
0.018298430368304253,
-0.07309509813785553,
0.019173797219991684,
-0.017878422513604164,
0.3193903863430023,
0.11242663115262985,
-0.1056150421500206,
-0.056141387671232224,
0.008259966038167477,
-0.1009814664721489,
-0.06923945993185043,
0.043898601084947586,
0.06804940104484558,
-0.13854920864105225,
0.009960699826478958,
-0.025174785405397415,
0.07402318716049194,
-0.02089998684823513,
0.016650814563035965,
0.03170892968773842,
0.03416146710515022,
-0.03600266948342323,
0.006334154866635799,
-0.18391203880310059,
0.19950436055660248,
0.006628861650824547,
0.01914282888174057,
-0.05299292132258415,
0.034006621688604355,
0.007598012685775757,
-0.03929857537150383,
0.0656629353761673,
0.022208720445632935,
-0.021987738087773323,
-0.04712942987680435,
-0.05241294950246811,
0.012175032868981361,
0.07942447811365128,
-0.047082383185625076,
0.10731480270624161,
-0.008353129960596561,
0.04269368201494217,
0.018761862069368362,
0.09493985027074814,
-0.18106761574745178,
-0.09064800292253494,
0.028467146679759026,
-0.060501810163259506,
-0.1052059605717659,
-0.08149494975805283,
-0.09241805970668793,
0.009711961261928082,
0.25520196557044983,
-0.11826061457395554,
-0.07548082619905472,
-0.09580506384372711,
0.029165415093302727,
0.10360846668481827,
-0.048270586878061295,
0.02380179800093174,
-0.004878194537013769,
0.12974536418914795,
-0.06684567779302597,
-0.1336013227701187,
0.02180016040802002,
-0.09205671399831772,
-0.16527797281742096,
-0.0657142847776413,
0.11791879683732986,
0.06216065585613251,
0.036381304264068604,
-0.025988033041357994,
0.02364487200975418,
0.03934303671121597,
-0.0390799343585968,
-0.0030290449503809214,
0.07024186849594116,
0.09512945264577866,
0.040422238409519196,
-0.10899238288402557,
0.016759850084781647,
-0.06315705925226212,
-0.0669122189283371,
0.08032900840044022,
0.26429441571235657,
-0.055658869445323944,
0.1262010633945465,
0.11530443280935287,
-0.0776544064283371,
-0.153150275349617,
0.024290040135383606,
0.09224185347557068,
-0.013335864059627056,
0.02073148638010025,
-0.15456749498844147,
0.08659300953149796,
0.1125786229968071,
-0.024431593716144562,
-0.007102521602064371,
-0.19606423377990723,
-0.12980414927005768,
0.06807093322277069,
0.10155355930328369,
0.2727709114551544,
-0.06173178553581238,
-0.042188551276922226,
0.012536952272057533,
-0.09456315636634827,
0.020712075755000114,
0.12672960758209229,
0.06639784574508667,
-0.025950346142053604,
-0.07782738655805588,
0.014227072708308697,
-0.04088917747139931,
0.09580186009407043,
0.05023578926920891,
0.05682077631354332,
-0.004858086816966534,
0.018676819279789925,
-0.017799220979213715,
-0.043959569185972214,
0.06500040739774704,
0.018349390476942062,
0.0482514426112175,
-0.07841913402080536,
-0.03134756535291672,
-0.06738400459289551,
0.02431926131248474,
-0.023714296519756317,
-0.07944762706756592,
-0.06110304966568947,
0.07848292589187622,
0.050236254930496216,
-0.025976253673434258,
0.01719435676932335,
0.02844577096402645,
0.11743253469467163,
0.16193029284477234,
0.0015465500764548779,
-0.04628428444266319,
-0.06157044693827629,
-0.04016033932566643,
-0.017899561673402786,
0.07311605662107468,
-0.03836140036582947,
0.023199178278446198,
0.06343619525432587,
0.02128012664616108,
0.09760807454586029,
0.057257264852523804,
-0.11395833641290665,
-0.016636814922094345,
0.033790986984968185,
-0.15973302721977234,
0.016874706372618675,
0.0004563804541248828,
0.03360040858387947,
-0.038176279515028,
0.02604449726641178,
0.15156081318855286,
-0.061962027102708817,
-0.036733340471982956,
-0.04054684564471245,
0.06851606070995331,
0.016719099134206772,
0.13848353922367096,
0.030932774767279625,
0.03757014125585556,
-0.08000701665878296,
0.1233140379190445,
0.036924492567777634,
-0.02793518453836441,
0.02130964770913124,
-0.023294692859053612,
-0.1049547791481018,
0.014822411350905895,
0.059039995074272156,
0.035320691764354706,
-0.05500369518995285,
-0.005659608170390129,
-0.02793431095778942,
-0.07560064643621445,
0.05863998830318451,
0.17844036221504211,
0.07075472176074982,
0.07386035472154617,
-0.05892902612686157,
-0.03716288506984711,
-0.07528036087751389,
0.038113679736852646,
0.04317055642604828,
0.07325083017349243,
-0.07635551691055298,
0.09632150828838348,
0.010872790589928627,
0.04075932502746582,
-0.032135799527168274,
-0.053450074046850204,
-0.10157489031553268,
-0.05096273496747017,
-0.09922375530004501,
0.00515926955267787,
-0.07553276419639587,
-0.03993397578597069,
-0.0004570754827000201,
-0.0060770330019295216,
-0.008280550129711628,
0.04848509654402733,
-0.059670086950063705,
-0.010796798393130302,
-0.02692657895386219,
0.033081814646720886,
-0.06537193059921265,
-0.03738299757242203,
0.03190857917070389,
-0.10432172566652298,
0.09090039879083633,
0.04736374691128731,
0.006473713554441929,
0.007160624023526907,
0.09025850147008896,
-0.024186091497540474,
0.025024013593792915,
0.01643863506615162,
-0.04968230053782463,
-0.08213843405246735,
0.0016054613515734673,
-0.004906708840280771,
-0.0143961813300848,
-0.008245427161455154,
0.0915495902299881,
-0.08606059104204178,
0.031992267817258835,
-0.011268938891589642,
-0.008078251034021378,
-0.0742640271782875,
-0.01081643346697092,
0.1057206317782402,
0.09974928200244904,
0.04846755042672157,
-0.09041895717382431,
0.013011319562792778,
-0.14377738535404205,
-0.03580331429839134,
0.00669321371242404,
-0.008965502493083477,
-0.1317782998085022,
-0.009666413068771362,
0.020026030018925667,
0.00013237583334557712,
0.21320706605911255,
-0.05698039382696152,
-0.020769771188497543,
0.021363835781812668,
-0.09053359925746918,
0.11863061785697937,
-0.024504799395799637,
0.17804647982120514,
-0.007663050666451454,
-0.04164404794573784,
-0.0144392354413867,
0.03548045828938484,
0.01945965737104416,
-0.027162283658981323,
0.18858949840068817,
0.13761363923549652,
0.03316505253314972,
0.042791012674570084,
-0.02498297207057476,
0.003967140335589647,
-0.049739714711904526,
-0.03728945553302765,
0.033995047211647034,
0.033093471080064774,
0.02132677473127842,
0.14907345175743103,
0.06959063559770584,
-0.1692279726266861,
0.03403456136584282,
-0.026275087147951126,
-0.0368395559489727,
-0.11743627488613129,
-0.09536635875701904,
-0.03268756717443466,
-0.06900547444820404,
0.009023717604577541,
-0.12459692358970642,
0.006267613265663385,
0.17667338252067566,
0.05590115860104561,
0.027541901916265488,
0.005076395347714424,
-0.12479019165039062,
-0.03410283476114273,
0.05145644396543503,
0.012989655137062073,
0.025627171620726585,
0.05886152759194374,
-0.000577311497181654,
0.058107923716306686,
0.042376719415187836,
0.01672895811498165,
0.0029159365221858025,
0.0732010006904602,
0.014939391985535622,
0.03966539725661278,
-0.062130335718393326,
-0.0044125947169959545,
-0.0434754341840744,
0.0708429142832756,
0.10752661526203156,
0.04952787980437279,
-0.049794524908065796,
-0.007404025178402662,
0.157446026802063,
-0.04366330802440643,
-0.006946813780814409,
-0.12804196774959564,
0.3338235914707184,
0.014240262098610401,
0.014518296346068382,
0.04587463289499283,
-0.07438957691192627,
-0.05406296253204346,
0.19845381379127502,
0.09336388856172562,
-0.023284615948796272,
-0.020354608073830605,
0.00038816939922980964,
-0.029801104217767715,
-0.023109011352062225,
0.15080976486206055,
0.03328078240156174,
0.13364802300930023,
-0.055235058069229126,
-0.04599776491522789,
-0.02771192044019699,
-0.01239196490496397,
-0.12117352336645126,
0.13452042639255524,
-0.027086874470114708,
-0.021623630076646805,
-0.07404016703367233,
0.026742249727249146,
0.07413611561059952,
-0.3080992102622986,
0.0006545035867020488,
-0.037040404975414276,
-0.10952938348054886,
-0.0023402422666549683,
-0.023901455104351044,
-0.02395189367234707,
0.049224961549043655,
-0.04474952071905136,
0.07154754549264908,
0.037680745124816895,
0.034762926399707794,
-0.025897298008203506,
-0.09351092576980591,
0.1686139553785324,
0.0509403832256794,
0.09616714715957642,
0.027781985700130463,
0.07518602162599564,
0.05542682111263275,
0.03322329744696617,
-0.09375566989183426,
0.04695881903171539,
0.012517360039055347,
-0.08415597677230835,
-0.05141571909189224,
0.1220250353217125,
-0.003134830156341195,
0.043370574712753296,
0.04028988629579544,
-0.11033371090888977,
0.012940587475895882,
0.07282747328281403,
-0.06707713752985,
-0.09988023340702057,
-0.006723677273839712,
-0.09018605202436447,
0.15874126553535461,
0.13993553817272186,
-0.017042089253664017,
0.020722990855574608,
-0.06459005922079086,
-0.0035433415323495865,
0.053798187524080276,
0.010935361497104168,
-0.017428958788514137,
-0.191581130027771,
0.030090082436800003,
-0.08813566714525223,
-0.008161652833223343,
-0.22908546030521393,
-0.10808013379573822,
-0.009658570401370525,
-0.05065684765577316,
-0.029673371464014053,
0.059859804809093475,
0.027080891653895378,
0.06686483323574066,
-0.015569052658975124,
-0.0379338264465332,
-0.028865771368145943,
0.0873289704322815,
-0.11084245890378952,
-0.06483254581689835
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original BERT, but we found significant differences on the SQuAD dev set (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_400k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_400k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
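Since the point of MultiBERTs is to compare findings across seeds and training steps, a typical workflow loads several checkpoints and contrasts their representations. The sketch below is only illustrative (the checkpoint list and the mean-pooling choice are assumptions, not part of the original release):
```
import torch
from transformers import BertTokenizer, BertModel

# Illustrative selection of checkpoints; any released seed/step combination works.
checkpoints = [
    "google/multiberts-seed_2-step_300k",
    "google/multiberts-seed_2-step_400k",
]

text = "Replace me by any text you'd like."
for name in checkpoints:
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    encoded_input = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        output = model(**encoded_input)
    # Mean-pool the final hidden states as a crude sentence representation.
    sentence_vector = output.last_hidden_state.mean(dim=1)
    print(name, sentence_vector.shape)
```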
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_400k"]}
| null |
google/multiberts-seed_2-step_400k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_400k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 400k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07724679261445999,
0.0924520343542099,
-0.0024587453808635473,
0.043655894696712494,
0.0779455304145813,
-0.012695372104644775,
0.07625333964824677,
0.10256408154964447,
-0.01803731918334961,
0.025755124166607857,
0.07566354423761368,
0.006525035016238689,
0.02034183219075203,
0.09691697359085083,
0.0232851542532444,
-0.22600074112415314,
0.0184877160936594,
-0.03221771866083145,
-0.09154979139566422,
0.07671331614255905,
0.09998325258493423,
-0.07806840538978577,
0.047592829912900925,
0.027247976511716843,
-0.11124479025602341,
0.0474480539560318,
0.004199517425149679,
-0.01883305422961712,
0.1349140703678131,
-0.0017524503637105227,
0.051461707800626755,
0.05298587679862976,
0.04996667802333832,
-0.13098104298114777,
0.0065980940125882626,
0.05644235387444496,
0.05935374274849892,
0.04760272800922394,
0.023764068260788918,
0.07824353873729706,
0.0023365437518805265,
0.024961506947875023,
0.04351578280329704,
0.024267546832561493,
-0.07553163170814514,
-0.06533072888851166,
-0.10547971725463867,
0.036461587995290756,
0.02860931307077408,
0.003007624763995409,
0.010971381328999996,
0.12201379239559174,
-0.034168269485235214,
0.04254155606031418,
0.1779072880744934,
-0.3480921685695648,
-0.013330264948308468,
0.0731189101934433,
0.0407615527510643,
0.12981051206588745,
-0.005927594378590584,
-0.01775033213198185,
0.07671859860420227,
0.02681957557797432,
0.08755436539649963,
-0.0409759022295475,
0.034621693193912506,
-0.054299723356962204,
-0.15450984239578247,
-0.04428853467106819,
0.0971565768122673,
-0.0023416844196617603,
-0.13865084946155548,
-0.036149702966213226,
-0.04454910010099411,
0.036514587700366974,
0.01232382282614708,
-0.03610370680689812,
0.03360069543123245,
0.016763124614953995,
-0.022349946200847626,
-0.009315411560237408,
-0.10438443720340729,
-0.054264865815639496,
0.029995756223797798,
0.08097685873508453,
0.10103915631771088,
0.06943827867507935,
0.002775602973997593,
0.11399313062429428,
-0.1845671385526657,
-0.05054382234811783,
-0.028325295075774193,
-0.054551173001527786,
-0.0468917042016983,
-0.009710452519357204,
-0.10673360526561737,
-0.04064087197184563,
0.007840581238269806,
0.13060855865478516,
0.0035287386272102594,
0.02745824307203293,
-0.030133690685033798,
0.009012212045490742,
0.055248845368623734,
0.04656472057104111,
-0.005528835114091635,
0.014090452343225479,
0.02485578879714012,
-0.01356289628893137,
-0.01852022483944893,
0.01613946072757244,
0.0010616012150421739,
0.024067126214504242,
0.11725975573062897,
0.025368137285113335,
-0.10252951085567474,
0.06728723645210266,
-0.019162336364388466,
-0.04408537968993187,
0.015095110982656479,
-0.08979588001966476,
-0.05572771653532982,
-0.040083158761262894,
0.00019857774896081537,
0.013341790065169334,
-0.005093609448522329,
-0.005424796137958765,
-0.020802665501832962,
-0.04005478695034981,
-0.08210664987564087,
-0.04683837667107582,
-0.05223618820309639,
-0.12738506495952606,
0.005081116687506437,
-0.16929440200328827,
-0.03820566460490227,
-0.11655115336179733,
-0.18372583389282227,
-0.02252981998026371,
0.06284969300031662,
-0.011231871321797371,
-0.055787503719329834,
0.08154744654893875,
0.04464230686426163,
-0.029683705419301987,
-0.0011916098883375525,
0.07395706325769424,
-0.001864852849394083,
0.0461944155395031,
-0.029168153181672096,
0.07075869292020798,
0.0038980513345450163,
0.03251183405518532,
-0.0542670339345932,
0.062282294034957886,
-0.17806385457515717,
0.04432734474539757,
-0.06962887197732925,
-0.03142219036817551,
-0.08510881662368774,
-0.038393285125494,
-0.01091703213751316,
0.0032778915483504534,
0.021292250603437424,
0.07487726956605911,
-0.1871754229068756,
-0.027737196534872055,
0.11860421299934387,
-0.16250759363174438,
-0.019344761967658997,
0.07388296723365784,
-0.046307262033224106,
0.10820690542459488,
0.07118674367666245,
0.1570204496383667,
-0.011585810221731663,
-0.07987874746322632,
0.05795503035187721,
-0.011284782551229,
0.009813027456402779,
-0.006676306948065758,
0.0694633275270462,
-0.0209182295948267,
-0.15809372067451477,
0.03547089546918869,
-0.12408452481031418,
-0.005801226478070021,
-0.07773364335298538,
0.016983559355139732,
-0.011135171167552471,
-0.06710406392812729,
-0.06544113159179688,
-0.028704874217510223,
0.06787732243537903,
-0.07604958862066269,
-0.016387570649385452,
0.02937895990908146,
0.0742143765091896,
-0.06993154436349869,
0.06642396003007889,
-0.012208249419927597,
0.013320453464984894,
-0.08574125170707703,
-0.03894427418708801,
-0.187397301197052,
0.04922766610980034,
0.10087156295776367,
0.005865938495844603,
-0.019770797342061996,
0.14518705010414124,
0.009555434808135033,
0.06973981857299805,
-0.05226241052150726,
0.012239825911819935,
-0.009963300079107285,
-0.005525581538677216,
-0.09408280998468399,
-0.0944065973162651,
-0.07946038246154785,
-0.06540733575820923,
0.08810321986675262,
-0.12403008341789246,
0.021051662042737007,
-0.05784750357270241,
0.04426881670951843,
0.022190598770976067,
-0.08302115648984909,
-0.019293740391731262,
0.01381763443350792,
-0.05772848799824715,
-0.05724693089723587,
0.039418190717697144,
0.0692509189248085,
-0.012442238628864288,
0.08575540035963058,
-0.047517433762550354,
-0.07976043969392776,
0.03140673413872719,
0.10066889971494675,
-0.10759443044662476,
0.012827059254050255,
-0.056609608232975006,
-0.04228087514638901,
-0.06366080045700073,
-0.019836267456412315,
0.08392272889614105,
-0.008436803705990314,
0.1364845186471939,
-0.0724407359957695,
-0.002103912178426981,
0.011789786629378796,
-0.021015740931034088,
-0.018713803961873055,
0.03528355434536934,
0.054825615137815475,
-0.07397715747356415,
0.011947854422032833,
0.04123331233859062,
0.014739644713699818,
0.0661059021949768,
-0.052706580609083176,
-0.08678844571113586,
0.011069434694945812,
0.03432406112551689,
0.029490124434232712,
0.06957469880580902,
-0.012394284829497337,
-0.011539136059582233,
0.034397535026073456,
0.01793702132999897,
0.004039715975522995,
-0.11857855319976807,
0.06129812076687813,
0.055254511535167694,
0.0029878471978008747,
0.057809848338365555,
-0.017871852964162827,
-0.03839430958032608,
0.08156886696815491,
0.03829391300678253,
0.0023996662348508835,
-0.014109594747424126,
-0.01655333675444126,
-0.11655696481466293,
0.18753091990947723,
-0.06247270479798317,
-0.16146817803382874,
-0.07945055514574051,
-0.09591308981180191,
0.004524182062596083,
0.027000445872545242,
0.03909211978316307,
-0.021740412339568138,
-0.04442388191819191,
-0.12152334302663803,
0.06567007303237915,
-0.03682271018624306,
0.0677168071269989,
0.10925248265266418,
-0.04122418537735939,
0.056965965777635574,
-0.12601527571678162,
-0.0077676912769675255,
-0.08292435854673386,
-0.07773979008197784,
0.0612434558570385,
-0.049594465643167496,
0.025986235588788986,
0.09630638360977173,
0.02568262629210949,
-0.01864359900355339,
-0.02477734535932541,
0.2032964676618576,
0.03943593055009842,
0.044159844517707825,
0.12962158024311066,
-0.061841726303100586,
0.055449478328228,
0.08403818309307098,
0.011814113706350327,
-0.044673651456832886,
0.05216057598590851,
0.04170328006148338,
-0.06765653192996979,
-0.1941739320755005,
-0.022985266521573067,
-0.006232567597180605,
-0.0446895994246006,
0.0743195116519928,
0.036513254046440125,
-0.0008005257113836706,
0.07034700363874435,
0.01394637580960989,
0.06273065507411957,
0.001108334749005735,
0.09777828305959702,
0.012028251774609089,
-0.03368339315056801,
0.08884430676698685,
-0.020982936024665833,
-0.01134692132472992,
0.08527862280607224,
-0.017992869019508362,
0.28766781091690063,
-0.030356144532561302,
0.010515674948692322,
0.11619112640619278,
0.050131503492593765,
0.06387674808502197,
0.12851382791996002,
-0.06252489238977432,
0.023752545937895775,
-0.07242564857006073,
-0.05931234732270241,
-0.004961295053362846,
0.048182420432567596,
-0.055258095264434814,
0.014324042946100235,
-0.07453317940235138,
0.016414308920502663,
-0.02033332735300064,
0.31285038590431213,
0.11214938014745712,
-0.10359158366918564,
-0.05957407131791115,
0.008043800480663776,
-0.09982205182313919,
-0.07268613576889038,
0.045326124876737595,
0.07201647758483887,
-0.13947702944278717,
0.006074741017073393,
-0.026049872860312462,
0.074182890355587,
-0.02146337181329727,
0.01785239763557911,
0.030927907675504684,
0.0357724204659462,
-0.03731397166848183,
0.008067436516284943,
-0.1786726415157318,
0.19033703207969666,
0.008030049502849579,
0.02051936648786068,
-0.051522836089134216,
0.03310483321547508,
0.009788230061531067,
-0.031277142465114594,
0.06151971593499184,
0.022372297942638397,
-0.032548557966947556,
-0.05382496863603592,
-0.052872128784656525,
0.0160160344094038,
0.07986891269683838,
-0.04693097993731499,
0.10613358020782471,
-0.006547565571963787,
0.0414203479886055,
0.01756414771080017,
0.08699598163366318,
-0.18285620212554932,
-0.0868678092956543,
0.030448047444224358,
-0.05959305167198181,
-0.09019569307565689,
-0.08211316168308258,
-0.09331728518009186,
0.0021939179860055447,
0.25269368290901184,
-0.11661463230848312,
-0.0766262635588646,
-0.09474001079797745,
0.02208605781197548,
0.10646199434995651,
-0.0479496493935585,
0.02437591925263405,
-0.003202807391062379,
0.12797491252422333,
-0.06363104283809662,
-0.1355670690536499,
0.022504335269331932,
-0.08863341808319092,
-0.16511328518390656,
-0.06629505008459091,
0.11905107647180557,
0.05818772315979004,
0.03688248246908188,
-0.02570127695798874,
0.023275677114725113,
0.03428085520863533,
-0.036709096282720566,
0.0001983303955057636,
0.06640592217445374,
0.09762556105852127,
0.03694425895810127,
-0.11438658088445663,
0.021195100620388985,
-0.06425252556800842,
-0.06417436897754669,
0.07545549422502518,
0.261832058429718,
-0.05791173502802849,
0.1260780245065689,
0.10979104042053223,
-0.07870396226644516,
-0.15224041044712067,
0.027570568025112152,
0.09132247418165207,
-0.014742952771484852,
0.01564883254468441,
-0.1589432656764984,
0.08902759104967117,
0.1135922446846962,
-0.023416776210069656,
0.013002926483750343,
-0.18664440512657166,
-0.1296604573726654,
0.06487181037664413,
0.09512922167778015,
0.27376291155815125,
-0.06135646998882294,
-0.043792322278022766,
0.017349017783999443,
-0.09648490697145462,
0.009467411786317825,
0.12048409879207611,
0.06545579433441162,
-0.025477595627307892,
-0.07021311670541763,
0.014809343963861465,
-0.03839636966586113,
0.09744121879339218,
0.05385405197739601,
0.05730847269296646,
-0.0031148705165833235,
0.01711437478661537,
-0.026694923639297485,
-0.04395800456404686,
0.06062035635113716,
0.024206511676311493,
0.04834999144077301,
-0.08275186270475388,
-0.027431126683950424,
-0.06833328306674957,
0.02601071260869503,
-0.024255840107798576,
-0.0764862522482872,
-0.05988375470042229,
0.07607676088809967,
0.04615412652492523,
-0.02587493136525154,
0.020598648115992546,
0.03120003454387188,
0.12104784697294235,
0.16213779151439667,
-0.005180503241717815,
-0.044483840465545654,
-0.06202111393213272,
-0.03710527345538139,
-0.019300395622849464,
0.07434069365262985,
-0.04387937858700752,
0.025849873200058937,
0.06675922125577927,
0.02012190781533718,
0.09870857000350952,
0.057618457823991776,
-0.1121467798948288,
-0.015393894165754318,
0.033026549965143204,
-0.16095852851867676,
0.01548891793936491,
0.0009094398701563478,
0.02193479984998703,
-0.0346161350607872,
0.027641940861940384,
0.15200157463550568,
-0.06304791569709778,
-0.037578705698251724,
-0.04139215126633644,
0.06887784600257874,
0.018649665638804436,
0.13966487348079681,
0.03223233297467232,
0.036661192774772644,
-0.08199909329414368,
0.1237446516752243,
0.03888338431715965,
-0.032897576689720154,
0.01843348704278469,
-0.026772433891892433,
-0.1084991991519928,
0.01363429706543684,
0.059654440730810165,
0.04108072817325592,
-0.04752388969063759,
-0.007996868342161179,
-0.026716608554124832,
-0.07906298339366913,
0.05634528025984764,
0.17730380594730377,
0.06643068045377731,
0.0731930360198021,
-0.05600669980049133,
-0.036068495362997055,
-0.07689318060874939,
0.04169173538684845,
0.04355023428797722,
0.07326187193393707,
-0.0759754404425621,
0.10770923644304276,
0.009774843230843544,
0.04493614658713341,
-0.03174098581075668,
-0.0540490448474884,
-0.09932827204465866,
-0.055113598704338074,
-0.10810257494449615,
0.0076922583393752575,
-0.07332164794206619,
-0.038665272295475006,
-0.0000037731103930127574,
-0.005950828082859516,
-0.00695711188018322,
0.046984534710645676,
-0.06269541382789612,
-0.010625782422721386,
-0.02580568753182888,
0.03502431511878967,
-0.06258383393287659,
-0.03900090977549553,
0.030268220230937004,
-0.1026596948504448,
0.09218306094408035,
0.0533917173743248,
0.008856954984366894,
0.008317964151501656,
0.09987477958202362,
-0.02097724936902523,
0.023097284138202667,
0.015603977255523205,
-0.047127366065979004,
-0.0835132747888565,
-0.00021577290317509323,
-0.01071819756180048,
-0.017450127750635147,
-0.009328571148216724,
0.09175878763198853,
-0.08700007200241089,
0.03328263387084007,
-0.010198704898357391,
-0.005703756585717201,
-0.07440229505300522,
-0.013106330297887325,
0.09757522493600845,
0.0972839891910553,
0.04544183239340782,
-0.091567762196064,
0.013364020735025406,
-0.1437474638223648,
-0.037679724395275116,
0.005583758931607008,
-0.01074138842523098,
-0.12318316847085953,
-0.011890018358826637,
0.02139243856072426,
-0.0014786545652896166,
0.20810946822166443,
-0.05643697455525398,
-0.021062426269054413,
0.01860286295413971,
-0.09459821879863739,
0.11048557609319687,
-0.02510964311659336,
0.17902107536792755,
-0.006664969492703676,
-0.041464608162641525,
-0.014796579256653786,
0.03875603899359703,
0.018639031797647476,
-0.025375880300998688,
0.18047215044498444,
0.13474635779857635,
0.035035766661167145,
0.040674950927495956,
-0.02781897597014904,
-0.0018392896745353937,
-0.05277705937623978,
-0.030909324064850807,
0.032289668917655945,
0.03707246482372284,
0.01832650974392891,
0.14971156418323517,
0.06511442363262177,
-0.16801989078521729,
0.03266596794128418,
-0.029958443716168404,
-0.03773781657218933,
-0.11560464650392532,
-0.08867572993040085,
-0.03300274908542633,
-0.07051718980073929,
0.008018993772566319,
-0.12292900681495667,
0.007896577939391136,
0.17958779633045197,
0.05438776686787605,
0.027147745713591576,
0.004266148898750544,
-0.12875857949256897,
-0.034197915345430374,
0.05226568132638931,
0.014994732104241848,
0.02531786635518074,
0.06098451092839241,
-0.0017085650470107794,
0.05880669876933098,
0.03743913024663925,
0.013762542977929115,
0.0018664236413314939,
0.07636579871177673,
0.0150098642334342,
0.04113676771521568,
-0.06215649098157883,
-0.005484602879732847,
-0.03908902034163475,
0.07178260385990143,
0.10206243395805359,
0.04942845180630684,
-0.04914729297161102,
-0.008499628864228725,
0.16256722807884216,
-0.043820224702358246,
0.0010560920927673578,
-0.12710395455360413,
0.33816587924957275,
0.009199625812470913,
0.012015211395919323,
0.04727904498577118,
-0.07656317949295044,
-0.05066831037402153,
0.20612139999866486,
0.09208326786756516,
-0.020263247191905975,
-0.022914409637451172,
0.0002732501015998423,
-0.031047435477375984,
-0.021357513964176178,
0.15054532885551453,
0.03520386293530464,
0.126645028591156,
-0.05651041492819786,
-0.04343897104263306,
-0.028068723157048225,
-0.009759357199072838,
-0.12860094010829926,
0.13804882764816284,
-0.033392488956451416,
-0.022922880947589874,
-0.0725683644413948,
0.026001522317528725,
0.07586807012557983,
-0.3222588300704956,
0.005576088558882475,
-0.03319048136472702,
-0.10921385884284973,
-0.0019904945511370897,
-0.017855944111943245,
-0.023382974788546562,
0.048743922263383865,
-0.04747861251235008,
0.07077957689762115,
0.04974942281842232,
0.033245593309402466,
-0.02134752832353115,
-0.09504714608192444,
0.1675243228673935,
0.04167429730296135,
0.09188536554574966,
0.027214698493480682,
0.07836317270994186,
0.05397728085517883,
0.03583485633134842,
-0.09454359114170074,
0.04440906643867493,
0.012118825688958168,
-0.09235117584466934,
-0.0532502681016922,
0.1231791153550148,
-0.0022593990433961153,
0.04425743222236633,
0.04255300015211105,
-0.10614611953496933,
0.010873634368181229,
0.07469398528337479,
-0.069589763879776,
-0.09692852199077606,
-0.010019325651228428,
-0.08909568190574646,
0.1563432216644287,
0.1428842395544052,
-0.01633172482252121,
0.02303161472082138,
-0.06570040434598923,
-0.008644483052194118,
0.05662108585238457,
0.006321798078715801,
-0.01912856474518776,
-0.1897793710231781,
0.031075427308678627,
-0.08443460613489151,
-0.006539022084325552,
-0.22775371372699738,
-0.1051306277513504,
-0.012924359180033207,
-0.05111808702349663,
-0.027023306116461754,
0.06071273982524872,
0.028966810554265976,
0.06588378548622131,
-0.015052341856062412,
-0.03744170069694519,
-0.028338493779301643,
0.08901333063840866,
-0.1084098294377327,
-0.06557656824588776
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 40k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
[the original BERT model](https://github.com/google-research/bert), but
with different random seeds, which cause variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 40k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
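For a quick look at both objectives in action, the two pre-training heads can be queried directly. The snippet below is a minimal sketch, assuming this checkpoint loads into `BertForPreTraining` (the architecture used for pre-training); the sentence pair and the masked token are placeholders:
```
import torch
from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_40k')
model = BertForPreTraining.from_pretrained("google/multiberts-seed_2-step_40k")

# One masked token plus a sentence pair, mirroring the MLM + NSP setup.
encoded = tokenizer("The cat sat on the [MASK].", "It purred quietly.", return_tensors='pt')
with torch.no_grad():
    outputs = model(**encoded)

print(outputs.prediction_logits.shape)        # MLM head: (1, sequence_length, vocab_size)
print(outputs.seq_relationship_logits.shape)  # NSP head: (1, 2)
```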
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_40k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_40k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_40k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_40k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
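Because this is an early checkpoint (40k of 2M steps), it can also be interesting to inspect the masked-language-modelling head directly. The snippet below is a minimal sketch, assuming the MLM weights load into `BertForMaskedLM` (loading may warn that the unused NSP head is skipped); the example sentence is a placeholder:
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_40k')
mlm_model = BertForMaskedLM.from_pretrained("google/multiberts-seed_2-step_40k")

text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = mlm_model(**encoded_input).logits

# Top-5 predictions for the [MASK] position.
mask_positions = (encoded_input['input_ids'] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```
Predictions from such an early checkpoint are typically noisier than those of the fully trained model, which is the kind of comparison the intermediate checkpoints are intended to support.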
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_40k"]}
| null |
google/multiberts-seed_2-step_40k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_40k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 40k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
the original BERT model, but
with different random seeds, which cause variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 40k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 40k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 40k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 40k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07821808755397797,
0.0985182523727417,
-0.002567077288404107,
0.03884022682905197,
0.0751105546951294,
-0.013372745364904404,
0.08080071210861206,
0.10385073721408844,
-0.013878648169338703,
0.02939104288816452,
0.07623126357793808,
0.00541834207251668,
0.01914425939321518,
0.09499827772378922,
0.023312758654356003,
-0.22795726358890533,
0.018758639693260193,
-0.03020583838224411,
-0.08710911870002747,
0.07684560865163803,
0.10020092874765396,
-0.08010772615671158,
0.04550552740693092,
0.02905915677547455,
-0.11142101138830185,
0.04499227553606033,
0.0031277434900403023,
-0.0169061329215765,
0.134614497423172,
-0.002089699497446418,
0.05122422054409981,
0.05422589182853699,
0.04677612707018852,
-0.13371694087982178,
0.00659819133579731,
0.05799547955393791,
0.056331560015678406,
0.0480264276266098,
0.025191636756062508,
0.07913309335708618,
0.007679727394133806,
0.021531520411372185,
0.04228408262133598,
0.02350417897105217,
-0.07440363615751266,
-0.06792159378528595,
-0.10409601032733917,
0.035851459950208664,
0.026849085465073586,
0.0025566108524799347,
0.010157978162169456,
0.13076628744602203,
-0.03412455692887306,
0.04464743658900261,
0.18016941845417023,
-0.3494710624217987,
-0.010995886288583279,
0.07611220329999924,
0.0457807257771492,
0.12731307744979858,
-0.006220695096999407,
-0.018402647227048874,
0.0766528993844986,
0.02529824525117874,
0.08555901050567627,
-0.0412738136947155,
0.044665317982435226,
-0.05332120135426521,
-0.1565578132867813,
-0.0454271025955677,
0.08974017202854156,
-0.0032258525025099516,
-0.1367071270942688,
-0.03868255019187927,
-0.04820292070508003,
0.039409905672073364,
0.010345155373215675,
-0.03486133739352226,
0.035143863409757614,
0.017581211403012276,
-0.018540339544415474,
-0.008798835799098015,
-0.10398786514997482,
-0.05361646041274071,
0.03153344616293907,
0.07746566832065582,
0.10320518165826797,
0.06716962903738022,
0.0005169282667338848,
0.11149369925260544,
-0.19608959555625916,
-0.0518941730260849,
-0.028129620477557182,
-0.051024217158555984,
-0.04545897990465164,
-0.009898453019559383,
-0.10987889021635056,
-0.04289105907082558,
0.010737812146544456,
0.13179630041122437,
-0.003802071325480938,
0.026831770315766335,
-0.033768873661756516,
0.010652011260390282,
0.057759977877140045,
0.04657880961894989,
-0.005134891718626022,
0.013443201780319214,
0.02266901545226574,
-0.01587318815290928,
-0.018017824739217758,
0.01573879085481167,
0.000010286281394655816,
0.022517139092087746,
0.1191939041018486,
0.022954218089580536,
-0.10181134194135666,
0.06829635053873062,
-0.018157286569476128,
-0.044463299214839935,
0.016287174075841904,
-0.08869673311710358,
-0.05546487495303154,
-0.037745144218206406,
-0.00003529257082846016,
0.015329821035265923,
-0.006012788042426109,
-0.004167108330875635,
-0.023123115301132202,
-0.04372841492295265,
-0.08238223940134048,
-0.04614034667611122,
-0.05209946632385254,
-0.12808862328529358,
0.007263916544616222,
-0.17718806862831116,
-0.03686292842030525,
-0.11352178454399109,
-0.1872374713420868,
-0.023556027561426163,
0.059908803552389145,
-0.01181733701378107,
-0.05635210871696472,
0.0800119936466217,
0.046286292374134064,
-0.028672968968749046,
-0.00025634156190790236,
0.06952264904975891,
-0.002230128739029169,
0.045372191816568375,
-0.027090877294540405,
0.0707826018333435,
0.004913220182061195,
0.032545316964387894,
-0.054070889949798584,
0.06292077898979187,
-0.17645204067230225,
0.044743552803993225,
-0.06894159317016602,
-0.03222831338644028,
-0.08748812973499298,
-0.0375044159591198,
-0.005245822947472334,
0.005031944718211889,
0.01914266124367714,
0.07373200356960297,
-0.18551971018314362,
-0.030450474470853806,
0.1267014741897583,
-0.161306232213974,
-0.01822724938392639,
0.0746217891573906,
-0.047019533812999725,
0.10385406017303467,
0.07081060111522675,
0.1587124466896057,
-0.010899030603468418,
-0.08084380626678467,
0.058504361659288406,
-0.009977886453270912,
0.013285431079566479,
-0.008513248525559902,
0.07215723395347595,
-0.021163366734981537,
-0.15509521961212158,
0.03282647579908371,
-0.1322150081396103,
-0.005749572534114122,
-0.07743998616933823,
0.018563760444521904,
-0.011115080676972866,
-0.06539930403232574,
-0.06378351897001266,
-0.026823850348591805,
0.06912986189126968,
-0.07447472959756851,
-0.018008332699537277,
0.03296084702014923,
0.07551649212837219,
-0.07159791886806488,
0.06614156067371368,
-0.014688072726130486,
0.014771387912333012,
-0.08629211783409119,
-0.03921327367424965,
-0.18998748064041138,
0.05054599791765213,
0.10165286809206009,
0.006089736241847277,
-0.020749248564243317,
0.15131452679634094,
0.010767667554318905,
0.07002027332782745,
-0.04949759319424629,
0.011733516119420528,
-0.009792187251150608,
-0.007009361404925585,
-0.0934988483786583,
-0.0985473170876503,
-0.07589003443717957,
-0.06825562566518784,
0.08480946719646454,
-0.12918704748153687,
0.021277375519275665,
-0.05569317936897278,
0.04629078134894371,
0.021498216316103935,
-0.08150085806846619,
-0.017989329993724823,
0.012199437245726585,
-0.0601133331656456,
-0.05629466101527214,
0.041107177734375,
0.07019655406475067,
-0.014083432033658028,
0.09156078845262527,
-0.05072501301765442,
-0.08306468278169632,
0.030638741329312325,
0.09734833240509033,
-0.10408680140972137,
0.009358956478536129,
-0.05780617892742157,
-0.04262516275048256,
-0.06094842031598091,
-0.016647160053253174,
0.08249380439519882,
-0.008016455918550491,
0.14014948904514313,
-0.073330819606781,
-0.005421988200396299,
0.012953792698681355,
-0.02273823320865631,
-0.01849500834941864,
0.03664827346801758,
0.05493863299489021,
-0.07275813072919846,
0.014349742792546749,
0.04131190478801727,
0.012423037551343441,
0.06977005302906036,
-0.055639758706092834,
-0.08926194906234741,
0.009901568293571472,
0.0368620939552784,
0.028968939557671547,
0.06702689081430435,
-0.01895490661263466,
-0.013918212614953518,
0.03555164486169815,
0.01582261174917221,
0.004601373802870512,
-0.11843277513980865,
0.062261417508125305,
0.055773910135030746,
0.003695082850754261,
0.05615827813744545,
-0.017231523990631104,
-0.039489783346652985,
0.080973319709301,
0.03760876506567001,
0.001148315379396081,
-0.016074785962700844,
-0.017056021839380264,
-0.11661870777606964,
0.18946726620197296,
-0.0600423626601696,
-0.15921519696712494,
-0.07825019210577011,
-0.10285831242799759,
0.004390241112560034,
0.027057087048888206,
0.0379612073302269,
-0.022040510550141335,
-0.043611444532871246,
-0.11971390247344971,
0.06612440198659897,
-0.03892884403467178,
0.06911791115999222,
0.10913309454917908,
-0.0410858616232872,
0.057439934462308884,
-0.12478465586900711,
-0.0071317777037620544,
-0.08228354156017303,
-0.07567140460014343,
0.06251155585050583,
-0.04944032058119774,
0.024229474365711212,
0.096999391913414,
0.027001187205314636,
-0.01894889771938324,
-0.02409854717552662,
0.20224091410636902,
0.03825230896472931,
0.04228789731860161,
0.13065312802791595,
-0.06408209353685379,
0.05708610266447067,
0.08115024864673615,
0.009777402505278587,
-0.0435032956302166,
0.049346037209033966,
0.042286887764930725,
-0.06519700586795807,
-0.19584491848945618,
-0.02338488958775997,
-0.005772648379206657,
-0.04397938400506973,
0.07631304860115051,
0.03551063686609268,
0.008497258648276329,
0.06915158778429031,
0.012381446547806263,
0.06853621453046799,
-0.0028949729166924953,
0.09864553064107895,
0.012592497281730175,
-0.03396650403738022,
0.08759798109531403,
-0.021300356835126877,
-0.010784133337438107,
0.08558496832847595,
-0.016335343942046165,
0.2864769697189331,
-0.028557447716593742,
0.021450867876410484,
0.11776009202003479,
0.04631815105676651,
0.0646134689450264,
0.12438260763883591,
-0.06521483510732651,
0.022378332912921906,
-0.07345961034297943,
-0.06039408594369888,
-0.0028351335786283016,
0.04944762587547302,
-0.053876813501119614,
0.009712355211377144,
-0.07229521870613098,
0.016273096203804016,
-0.0201711542904377,
0.31307870149612427,
0.11287522315979004,
-0.10712532699108124,
-0.06027684360742569,
0.006830048747360706,
-0.09919793903827667,
-0.0726475715637207,
0.04425172880291939,
0.07140768319368362,
-0.1355714648962021,
0.008820364251732826,
-0.026881661266088486,
0.07478047162294388,
-0.02196984365582466,
0.018915066495537758,
0.027867387980222702,
0.03560230880975723,
-0.03554839268326759,
0.009128698147833347,
-0.18253688514232635,
0.19173085689544678,
0.008803017437458038,
0.016865484416484833,
-0.05198941379785538,
0.033500127494335175,
0.008463840931653976,
-0.026265263557434082,
0.06353890150785446,
0.022618010640144348,
-0.032887816429138184,
-0.047436077147722244,
-0.05271274596452713,
0.013792216777801514,
0.07950960844755173,
-0.04728357493877411,
0.10877272486686707,
-0.007902614772319794,
0.04122716188430786,
0.019248737022280693,
0.08001242578029633,
-0.1783953458070755,
-0.0841803103685379,
0.03170463815331459,
-0.05827438831329346,
-0.09947407990694046,
-0.08120466768741608,
-0.09288709610700607,
-0.0007473031873814762,
0.2518230676651001,
-0.12302663922309875,
-0.07438492029905319,
-0.09511297196149826,
0.027727345004677773,
0.10359689593315125,
-0.05046973004937172,
0.024140214547514915,
-0.005172061733901501,
0.13382777571678162,
-0.06435759365558624,
-0.13405084609985352,
0.024302974343299866,
-0.0907173827290535,
-0.16648323833942413,
-0.06610620021820068,
0.1200944259762764,
0.05895642191171646,
0.03718193992972374,
-0.02658831886947155,
0.02377128228545189,
0.03086209110915661,
-0.034263819456100464,
0.0018691502045840025,
0.06972788274288177,
0.10221990942955017,
0.03166971728205681,
-0.11003429442644119,
0.02321724407374859,
-0.061698682606220245,
-0.06359215825796127,
0.07846184074878693,
0.26310229301452637,
-0.05752486363053322,
0.12637102603912354,
0.11195331066846848,
-0.07854346930980682,
-0.15242747962474823,
0.02767082303762436,
0.0922214463353157,
-0.014425587840378284,
0.015399501658976078,
-0.16144303977489471,
0.08579735457897186,
0.11022338271141052,
-0.023373747244477272,
0.00869529414921999,
-0.18863947689533234,
-0.12790079414844513,
0.06739921122789383,
0.09344466030597687,
0.2740706205368042,
-0.06325697898864746,
-0.047115832567214966,
0.017934409901499748,
-0.09181006997823715,
0.011861340142786503,
0.11390569061040878,
0.0625884085893631,
-0.02432377263903618,
-0.06990750133991241,
0.01575266569852829,
-0.04013672471046448,
0.09662866592407227,
0.05458882823586464,
0.05554291978478432,
-0.003282382618635893,
0.01843889057636261,
-0.022206589579582214,
-0.04481028392910957,
0.059377484023571014,
0.020905382931232452,
0.047807227820158005,
-0.08654239028692245,
-0.027964897453784943,
-0.06850186735391617,
0.02680058963596821,
-0.02405443601310253,
-0.07608713209629059,
-0.05872173607349396,
0.0752522274851799,
0.04772971197962761,
-0.024873556569218636,
0.02533416636288166,
0.03159784898161888,
0.1182229071855545,
0.16488510370254517,
-0.00490267900750041,
-0.036407142877578735,
-0.0648246631026268,
-0.03886786475777626,
-0.016601987183094025,
0.07497627288103104,
-0.052747488021850586,
0.024702193215489388,
0.0650494322180748,
0.022364607080817223,
0.09878791123628616,
0.054944075644016266,
-0.11397036164999008,
-0.016808513551950455,
0.03130178898572922,
-0.1623997837305069,
0.008886540308594704,
0.0011927583254873753,
0.025347093120217323,
-0.03147614374756813,
0.030227666720747948,
0.15294788777828217,
-0.06306260079145432,
-0.035824913531541824,
-0.0410994254052639,
0.0681006908416748,
0.021128248423337936,
0.1376885622739792,
0.03342601656913757,
0.038307175040245056,
-0.08249950408935547,
0.12226613610982895,
0.03943996876478195,
-0.03472236543893814,
0.02022034488618374,
-0.02771359123289585,
-0.10844705253839493,
0.012173273600637913,
0.061715710908174515,
0.04585247114300728,
-0.04946388304233551,
-0.01172949280589819,
-0.02935262769460678,
-0.07265853881835938,
0.05890754610300064,
0.18069176375865936,
0.06763023883104324,
0.07310358434915543,
-0.05601628124713898,
-0.03688277304172516,
-0.07814741879701614,
0.04167674109339714,
0.03932146355509758,
0.07350275665521622,
-0.07649774849414825,
0.11141358315944672,
0.010102864354848862,
0.046065062284469604,
-0.03162756934762001,
-0.053357403725385666,
-0.09835229814052582,
-0.05564822256565094,
-0.10681270807981491,
0.010759943164885044,
-0.07446324825286865,
-0.039472293108701706,
-0.00115594535600394,
-0.005636897869408131,
-0.005949121899902821,
0.04790826141834259,
-0.06267429143190384,
-0.009206890128552914,
-0.026097053661942482,
0.036132365465164185,
-0.06432738900184631,
-0.03682739660143852,
0.029445398598909378,
-0.10198891907930374,
0.09442798793315887,
0.051692698150873184,
0.008970691822469234,
0.009300981648266315,
0.10009604692459106,
-0.01983284205198288,
0.02523595280945301,
0.01382908783853054,
-0.04738447070121765,
-0.08167430758476257,
0.00005960795533610508,
-0.011458571068942547,
-0.017850762233138084,
-0.010691425763070583,
0.09023994207382202,
-0.08579767495393753,
0.03530993312597275,
-0.008712991140782833,
-0.00728102819994092,
-0.07554993778467178,
-0.012766541913151741,
0.09282544255256653,
0.09892473369836807,
0.044926825910806656,
-0.08967247605323792,
0.014694496057927608,
-0.14208611845970154,
-0.03737148642539978,
0.006472320295870304,
-0.009942028671503067,
-0.12278810143470764,
-0.01223145890980959,
0.019998902454972267,
-0.0038388227112591267,
0.210782989859581,
-0.05753029137849808,
-0.01804775930941105,
0.018039381131529808,
-0.09439513087272644,
0.11030624806880951,
-0.026133324950933456,
0.18179790675640106,
-0.0065238322131335735,
-0.040286365896463394,
-0.01633390784263611,
0.03946699574589729,
0.019838055595755577,
-0.02784378081560135,
0.17744678258895874,
0.13658204674720764,
0.0365368016064167,
0.04122958704829216,
-0.02621268853545189,
-0.0020874531473964453,
-0.05972801893949509,
-0.02192176878452301,
0.030647091567516327,
0.03911080211400986,
0.019539589062333107,
0.16073109209537506,
0.0694565400481224,
-0.16846759617328644,
0.0329594686627388,
-0.028961025178432465,
-0.036590639501810074,
-0.11607823520898819,
-0.09267866611480713,
-0.03431437909603119,
-0.07326769083738327,
0.006762576289474964,
-0.12217549979686737,
0.00836557149887085,
0.17615467309951782,
0.05568554252386093,
0.025554394349455833,
0.0030121628660708666,
-0.12667690217494965,
-0.03551388904452324,
0.05255024880170822,
0.01599576696753502,
0.024952227249741554,
0.059000421315431595,
-0.0003766851732507348,
0.0621558241546154,
0.03755118325352669,
0.014021394774317741,
0.00007378606824204326,
0.08167590945959091,
0.018253732472658157,
0.03945507854223251,
-0.06412540376186371,
-0.004275603219866753,
-0.03859835863113403,
0.06974776089191437,
0.09583763778209686,
0.04882604628801346,
-0.04888034611940384,
-0.009381826035678387,
0.15836253762245178,
-0.04250157251954079,
0.003212563693523407,
-0.12747956812381744,
0.3331756889820099,
0.009618728421628475,
0.01257761288434267,
0.04721032828092575,
-0.07773738354444504,
-0.049654413014650345,
0.20235005021095276,
0.08862350136041641,
-0.01930926740169525,
-0.02186805196106434,
0.0027221161872148514,
-0.03046439401805401,
-0.020952967926859856,
0.14755189418792725,
0.03511438146233559,
0.12716838717460632,
-0.05488650128245354,
-0.051084745675325394,
-0.02825997583568096,
-0.010773778893053532,
-0.12575697898864746,
0.13735659420490265,
-0.03016863577067852,
-0.024480363354086876,
-0.07170232385396957,
0.026724552735686302,
0.07728191465139389,
-0.32319262623786926,
0.0038652571383863688,
-0.03353751078248024,
-0.10914457589387894,
-0.0026808264665305614,
-0.014832945540547371,
-0.021666064858436584,
0.047173872590065,
-0.046705201268196106,
0.0683031901717186,
0.049605898559093475,
0.03346315398812294,
-0.024601560086011887,
-0.0895724669098854,
0.16625253856182098,
0.0461389385163784,
0.09184350073337555,
0.0264681838452816,
0.078652024269104,
0.05511397868394852,
0.03406423702836037,
-0.09338140487670898,
0.04311423748731613,
0.013788734562695026,
-0.09130880236625671,
-0.054791294038295746,
0.12442705780267715,
-0.002062018495053053,
0.03959701955318451,
0.04548392444849014,
-0.10883548110723495,
0.011536001227796078,
0.07316930592060089,
-0.07269086688756943,
-0.09875205159187317,
-0.008260734379291534,
-0.08913586288690567,
0.15531791746616364,
0.14285482466220856,
-0.016354773193597794,
0.02216959372162819,
-0.06674196571111679,
-0.006387913133949041,
0.05620362237095833,
0.009054960682988167,
-0.01904759556055069,
-0.18677745759487152,
0.03309662640094757,
-0.08170076459646225,
-0.002438174095004797,
-0.22798635065555573,
-0.10299888998270035,
-0.011085104197263718,
-0.04943940415978432,
-0.0293586403131485,
0.05914226919412613,
0.02727196179330349,
0.06537268310785294,
-0.01571635715663433,
-0.039935026317834854,
-0.027949342504143715,
0.0877404734492302,
-0.10795789957046509,
-0.06688039749860764
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
[the original BERT model](https://github.com/google-research/bert), but
with different random seeds, which cause variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
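As an illustration of the NSP objective, the next-sentence head can be queried on a sentence pair. The snippet below is a minimal sketch, assuming the NSP weights load into `BertForNextSentencePrediction` (loading may warn that the unused MLM head is skipped); the sentence pair is a placeholder:
```
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_500k')
nsp_model = BertForNextSentencePrediction.from_pretrained("google/multiberts-seed_2-step_500k")

first = "The weather was terrible this morning."
second = "So we decided to stay indoors."
encoded = tokenizer(first, second, return_tensors='pt')
with torch.no_grad():
    logits = nsp_model(**encoded).logits  # index 0 = "B follows A", index 1 = "B is random"
print(torch.softmax(logits, dim=-1))
```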
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_500k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_500k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
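One common use of the intermediate checkpoints is to compare representations over the course of pre-training. The snippet below is a minimal sketch that mean-pools the final-layer hidden states into a sentence embedding and compares this 500k checkpoint against the 40k one; the pooling strategy and the pair of checkpoints are illustrative choices, not part of the original release:
```
import torch
from transformers import BertTokenizer, BertModel

def sentence_embedding(model_name, text):
    # Mean-pool the final-layer hidden states over non-padding tokens.
    tokenizer = BertTokenizer.from_pretrained(model_name)
    model = BertModel.from_pretrained(model_name)
    encoded = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        hidden = model(**encoded).last_hidden_state  # (1, seq_len, hidden_size)
    mask = encoded['attention_mask'].unsqueeze(-1)   # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

text = "Replace me by any text you'd like."
emb_500k = sentence_embedding("google/multiberts-seed_2-step_500k", text)
emb_40k = sentence_embedding("google/multiberts-seed_2-step_40k", text)
print(torch.nn.functional.cosine_similarity(emb_500k, emb_40k).item())
```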
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_500k"]}
| null |
google/multiberts-seed_2-step_500k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_500k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
hyper-parameters similar to those of
the original BERT model, but
with different random seeds, which cause variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 500k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07789350301027298,
0.10018110275268555,
-0.002514337655156851,
0.043711960315704346,
0.08058828115463257,
-0.014072788879275322,
0.08010949939489365,
0.10208848863840103,
-0.016004513949155807,
0.028884435072541237,
0.07787494361400604,
0.0016836069989949465,
0.023212457075715065,
0.10352827608585358,
0.023426594212651253,
-0.2218877077102661,
0.02005171589553356,
-0.031223798170685768,
-0.08890422433614731,
0.07662927359342575,
0.09827441722154617,
-0.08120369911193848,
0.04565401002764702,
0.025284186005592346,
-0.11241107434034348,
0.04989556968212128,
0.002930391812697053,
-0.015883997082710266,
0.1292361617088318,
-0.0038119920063763857,
0.05040258169174194,
0.05242996662855148,
0.051756374537944794,
-0.13864846527576447,
0.00592183880507946,
0.05878695473074913,
0.059962641447782516,
0.046193696558475494,
0.022599993273615837,
0.07740598917007446,
-0.0014237627619877458,
0.02401377633213997,
0.042418356984853745,
0.024858368560671806,
-0.07395519316196442,
-0.05712480470538139,
-0.10167503356933594,
0.04037461429834366,
0.03235248848795891,
0.003190311836078763,
0.01038273610174656,
0.11899112910032272,
-0.033333055675029755,
0.04071114584803581,
0.18492160737514496,
-0.34067603945732117,
-0.012365264818072319,
0.07042428851127625,
0.03623617812991142,
0.1274791657924652,
-0.005701266694813967,
-0.018284283578395844,
0.07894311845302582,
0.023352304473519325,
0.0908486619591713,
-0.03951558843255043,
0.027409058064222336,
-0.05679207667708397,
-0.15560023486614227,
-0.0478045754134655,
0.09249453991651535,
-0.0016775871627032757,
-0.1367419809103012,
-0.03466998413205147,
-0.04550366476178169,
0.029970845207571983,
0.013328968547284603,
-0.03824087977409363,
0.03424511104822159,
0.01564188301563263,
-0.016521260142326355,
-0.01084254402667284,
-0.10419197380542755,
-0.057947512716054916,
0.030606435611844063,
0.07999427616596222,
0.10038670152425766,
0.07009809464216232,
-0.0003524673811625689,
0.11446080356836319,
-0.1872023344039917,
-0.04971219599246979,
-0.030973702669143677,
-0.04912053793668747,
-0.05028504505753517,
-0.009208012372255325,
-0.10336200147867203,
-0.043401699513196945,
0.007223755586892366,
0.13309894502162933,
0.006363265216350555,
0.02355373091995716,
-0.029701504856348038,
0.008689600974321365,
0.05758906528353691,
0.0450129434466362,
-0.007355717942118645,
0.01954413764178753,
0.023454103618860245,
-0.009475518018007278,
-0.021541137248277664,
0.016073081642389297,
0.0014705510111525655,
0.024881407618522644,
0.11880660057067871,
0.02669241465628147,
-0.10032327473163605,
0.06255181133747101,
-0.02319394052028656,
-0.04333900287747383,
0.0014795723836869001,
-0.08979978412389755,
-0.053964678198099136,
-0.03723762184381485,
0.00045100244460627437,
0.013333114795386791,
-0.007475207094103098,
-0.005962826777249575,
-0.02483365684747696,
-0.033304471522569656,
-0.0813952162861824,
-0.04498714208602905,
-0.0567060224711895,
-0.1295415759086609,
0.005461592227220535,
-0.18365375697612762,
-0.03893540799617767,
-0.11494813859462738,
-0.18433542549610138,
-0.023133534938097,
0.06120217218995094,
-0.011392993852496147,
-0.05767054483294487,
0.07255541533231735,
0.04371146857738495,
-0.030369365587830544,
-0.0017243019538000226,
0.0780276507139206,
-0.0002085402957163751,
0.04532286524772644,
-0.02722487971186638,
0.07299669086933136,
0.005634274333715439,
0.032939642667770386,
-0.05681759864091873,
0.06159226596355438,
-0.15668494999408722,
0.045219920575618744,
-0.07184071093797684,
-0.033247921615839005,
-0.08753300458192825,
-0.03485092893242836,
-0.01568477973341942,
0.0015541897155344486,
0.0262707881629467,
0.07907000184059143,
-0.18814244866371155,
-0.02884148620069027,
0.12120918929576874,
-0.1626298427581787,
-0.019546961411833763,
0.07223827391862869,
-0.04396843910217285,
0.10575557500123978,
0.07026581466197968,
0.15032804012298584,
-0.006188575178384781,
-0.07703233510255814,
0.05385804548859596,
-0.009663003496825695,
0.014107768423855305,
-0.014548186212778091,
0.0692950114607811,
-0.020667219534516335,
-0.1558813899755478,
0.03536176308989525,
-0.13058514893054962,
-0.006193490698933601,
-0.08067451417446136,
0.01593266986310482,
-0.012112481519579887,
-0.06933411210775375,
-0.06563038378953934,
-0.02825232408940792,
0.06855229288339615,
-0.07185554504394531,
-0.013422412797808647,
0.03055274486541748,
0.07014734297990799,
-0.0683971419930458,
0.06584371626377106,
-0.010875449515879154,
0.016124041751027107,
-0.08848534524440765,
-0.03800930827856064,
-0.1868625432252884,
0.053135521709918976,
0.10284680873155594,
0.0034413430839776993,
-0.021484527736902237,
0.1437586396932602,
0.009147955104708672,
0.06627453863620758,
-0.048512112349271774,
0.012297098524868488,
-0.0163870956748724,
-0.006519096903502941,
-0.0928906500339508,
-0.09134607017040253,
-0.07847724854946136,
-0.06669099628925323,
0.07960854470729828,
-0.11663790047168732,
0.02216472662985325,
-0.054415520280599594,
0.04390335828065872,
0.023768333718180656,
-0.085014708340168,
-0.01728036440908909,
0.015031266026198864,
-0.05800766497850418,
-0.057131148874759674,
0.038518086075782776,
0.06676698476076126,
-0.01267297938466072,
0.08390035480260849,
-0.04452970623970032,
-0.0808335542678833,
0.028783125802874565,
0.10587503015995026,
-0.10426919907331467,
0.01333022303879261,
-0.057401593774557114,
-0.03912607580423355,
-0.07247918844223022,
-0.01734834723174572,
0.07964763790369034,
-0.005935348104685545,
0.13433575630187988,
-0.07306772470474243,
-0.002089167246595025,
0.01397666148841381,
-0.019612161442637444,
-0.020687492564320564,
0.03739185631275177,
0.07462140172719955,
-0.06726643443107605,
0.01601729355752468,
0.03737688809633255,
0.012950323522090912,
0.06755858659744263,
-0.052570879459381104,
-0.09046278148889542,
0.012272202409803867,
0.037004776298999786,
0.030041024088859558,
0.06993358582258224,
-0.015996050089597702,
-0.011251597665250301,
0.032640162855386734,
0.01657736487686634,
0.0037571366410702467,
-0.1218186765909195,
0.06237256899476051,
0.05741047114133835,
0.0008748894324526191,
0.05560171976685524,
-0.020630154758691788,
-0.03902478888630867,
0.0815163254737854,
0.03564878925681114,
0.005728374235332012,
-0.01746251806616783,
-0.015690183266997337,
-0.11834215372800827,
0.18880563974380493,
-0.057880207896232605,
-0.16161079704761505,
-0.07794163376092911,
-0.08591460436582565,
0.006449981592595577,
0.025775568559765816,
0.03784850239753723,
-0.020016081631183624,
-0.04450533166527748,
-0.12416556477546692,
0.06650806218385696,
-0.03683771938085556,
0.07219751179218292,
0.11319136619567871,
-0.03998199850320816,
0.05360127240419388,
-0.12499601393938065,
-0.006105830892920494,
-0.08377943187952042,
-0.07941728830337524,
0.05872863531112671,
-0.044825803488492966,
0.02809751406311989,
0.09401483833789825,
0.02361014112830162,
-0.015929507091641426,
-0.026299184188246727,
0.20292901992797852,
0.042211417108774185,
0.03945992514491081,
0.1235925480723381,
-0.05938157066702843,
0.0542776994407177,
0.0864526554942131,
0.01035481970757246,
-0.04704467952251434,
0.05304889380931854,
0.04593405872583389,
-0.06725474447011948,
-0.19068603217601776,
-0.020426906645298004,
-0.008935997262597084,
-0.04375321790575981,
0.07882335036993027,
0.03563664108514786,
-0.009440824389457703,
0.0707281157374382,
0.012305185198783875,
0.06133643537759781,
-0.0055354684591293335,
0.09488888084888458,
0.0025994277093559504,
-0.03265872225165367,
0.08760420233011246,
-0.019488604739308357,
-0.011007800698280334,
0.08417779207229614,
-0.0201935563236475,
0.2934209108352661,
-0.030368587002158165,
0.010148211382329464,
0.11622315645217896,
0.04838570952415466,
0.06273534893989563,
0.1324019730091095,
-0.06353391706943512,
0.023741722106933594,
-0.07035272568464279,
-0.0574738010764122,
-0.005315352696925402,
0.0468859001994133,
-0.051608745008707047,
0.017985781654715538,
-0.07731916755437851,
0.019613511860370636,
-0.021142011508345604,
0.31779688596725464,
0.1150534525513649,
-0.0991000160574913,
-0.05457688495516777,
0.00768124358728528,
-0.09932296723127365,
-0.0712139755487442,
0.04361795261502266,
0.07630972564220428,
-0.137495756149292,
0.004249283578246832,
-0.0264289528131485,
0.07601427286863327,
-0.020860714837908745,
0.015811357647180557,
0.02651253528892994,
0.03550545126199722,
-0.037540335208177567,
0.007650051265954971,
-0.19037654995918274,
0.19436971843242645,
0.00740632601082325,
0.023472817614674568,
-0.052481282502412796,
0.03256974369287491,
0.003622459014877677,
-0.03709425777196884,
0.06349067389965057,
0.023850098252296448,
-0.036150794476270676,
-0.05418810620903969,
-0.05444585904479027,
0.016103466972708702,
0.0817161351442337,
-0.05185446888208389,
0.10627803951501846,
-0.006255427375435829,
0.04227976128458977,
0.018045838922262192,
0.0847613662481308,
-0.18491417169570923,
-0.08913201093673706,
0.03167709335684776,
-0.057432953268289566,
-0.094990573823452,
-0.08075716346502304,
-0.09312260895967484,
0.011573194526135921,
0.243570014834404,
-0.13662131130695343,
-0.07375726103782654,
-0.09426554292440414,
0.03355522081255913,
0.10244187712669373,
-0.04932020232081413,
0.026539303362369537,
-0.0037421961314976215,
0.1270315945148468,
-0.06630605459213257,
-0.13436473906040192,
0.020625349134206772,
-0.08968492597341537,
-0.16554509103298187,
-0.06461873650550842,
0.11683915555477142,
0.05805526673793793,
0.03655456379055977,
-0.027734817937016487,
0.022763052955269814,
0.0394652783870697,
-0.038058020174503326,
0.0027668445836752653,
0.07469876855611801,
0.08574549853801727,
0.03963124379515648,
-0.11414214968681335,
0.013008550740778446,
-0.06394406408071518,
-0.06499044597148895,
0.0766371414065361,
0.26407700777053833,
-0.05807967111468315,
0.12529973685741425,
0.10999231785535812,
-0.08133552968502045,
-0.15559601783752441,
0.029843507334589958,
0.09292855113744736,
-0.010129082947969437,
0.013743671588599682,
-0.1587674915790558,
0.08603791147470474,
0.10892179608345032,
-0.02333812788128853,
0.012022880837321281,
-0.20014537870883942,
-0.13073740899562836,
0.06287713348865509,
0.09843325614929199,
0.2809341251850128,
-0.05851456895470619,
-0.04371152073144913,
0.019358431920409203,
-0.09316164255142212,
0.02068689838051796,
0.12950661778450012,
0.06581434607505798,
-0.02689816989004612,
-0.07664389163255692,
0.016424722969532013,
-0.03992795571684837,
0.09255804121494293,
0.05172330513596535,
0.05744383856654167,
-0.0007506262045353651,
0.01834188960492611,
-0.023177383467555046,
-0.04212119057774544,
0.06273601204156876,
0.018121005967259407,
0.04789575934410095,
-0.08100593090057373,
-0.029222941026091576,
-0.06544652581214905,
0.02819262258708477,
-0.023830914869904518,
-0.07626079767942429,
-0.05800400674343109,
0.07374589145183563,
0.04917147010564804,
-0.025039440020918846,
0.011845591478049755,
0.03308134153485298,
0.11683501303195953,
0.163528174161911,
-0.0026468709111213684,
-0.04661723971366882,
-0.059302013367414474,
-0.03851493075489998,
-0.016829539090394974,
0.0774647518992424,
-0.041439324617385864,
0.0249381884932518,
0.06875642389059067,
0.02537454664707184,
0.09816243499517441,
0.05757557973265648,
-0.11162302643060684,
-0.015504519455134869,
0.033463004976511,
-0.16085372865200043,
0.00006396124081220478,
0.0012442379957064986,
0.023690879344940186,
-0.04014097526669502,
0.02694101817905903,
0.14913901686668396,
-0.06785503029823303,
-0.03697727993130684,
-0.04274921864271164,
0.07050112634897232,
0.020088516175746918,
0.1449364572763443,
0.030077042058110237,
0.03745980188250542,
-0.08005087822675705,
0.1267441064119339,
0.03734923526644707,
-0.031922001391649246,
0.021130621433258057,
-0.027976687997579575,
-0.10507330298423767,
0.015189554542303085,
0.06773814558982849,
0.04025060683488846,
-0.04284140467643738,
-0.005615177098661661,
-0.028460035100579262,
-0.07648537307977676,
0.05908787623047829,
0.17940695583820343,
0.06476275622844696,
0.07366091758012772,
-0.05662931874394417,
-0.03676697984337807,
-0.0774671733379364,
0.04324480518698692,
0.04649922251701355,
0.07332844287157059,
-0.07446049898862839,
0.10921720415353775,
0.009882810525596142,
0.041843898594379425,
-0.03108765184879303,
-0.05089283734560013,
-0.099906787276268,
-0.053524259477853775,
-0.0924244076013565,
0.006968987639993429,
-0.0689440444111824,
-0.03998865932226181,
0.0018842980498448014,
-0.007933728396892548,
-0.00951668992638588,
0.04918791726231575,
-0.06182674318552017,
-0.011337758041918278,
-0.028972260653972626,
0.03291912376880646,
-0.06616874039173126,
-0.03781789541244507,
0.03306526318192482,
-0.1011778861284256,
0.09076639264822006,
0.04909195005893707,
0.006989072542637587,
0.009084566496312618,
0.08296907693147659,
-0.020954284816980362,
0.02411513216793537,
0.016062701120972633,
-0.04848851263523102,
-0.08159433305263519,
0.004130272194743156,
-0.006848025135695934,
-0.013385131023824215,
-0.009823490865528584,
0.08857344835996628,
-0.0883159413933754,
0.028533203527331352,
-0.010445721447467804,
-0.004166009835898876,
-0.07246946543455124,
-0.010620982386171818,
0.09731224924325943,
0.0970577523112297,
0.04990199953317642,
-0.08990415930747986,
0.013937795534729958,
-0.1416129320859909,
-0.035304948687553406,
0.006124904844909906,
-0.008652052842080593,
-0.13094371557235718,
-0.011158620938658714,
0.022036418318748474,
-0.0015726909041404724,
0.20386576652526855,
-0.053288184106349945,
-0.020611610263586044,
0.01922755315899849,
-0.0943264439702034,
0.11112102121114731,
-0.022746654227375984,
0.18211495876312256,
-0.005086049437522888,
-0.04349728673696518,
-0.01904483512043953,
0.0370112769305706,
0.019216876477003098,
-0.022908765822649002,
0.18827591836452484,
0.13472439348697662,
0.026917582377791405,
0.04231518879532814,
-0.026165323331952095,
-0.00023777988099027425,
-0.0498526357114315,
-0.02341446466743946,
0.026852423325181007,
0.03958115726709366,
0.01702023111283779,
0.1563720405101776,
0.06509006023406982,
-0.16564206779003143,
0.03341381251811981,
-0.023158160969614983,
-0.03654015436768532,
-0.11932478100061417,
-0.09774629771709442,
-0.03477643057703972,
-0.07186797261238098,
0.010598891414701939,
-0.12247458100318909,
0.009290401823818684,
0.17640487849712372,
0.0540161058306694,
0.027235999703407288,
0.0024155499413609505,
-0.1196819469332695,
-0.031459879130125046,
0.05696437507867813,
0.01275632157921791,
0.023627595975995064,
0.05571553483605385,
-0.003559547709301114,
0.06111639365553856,
0.037391550838947296,
0.01698535680770874,
0.001609002472832799,
0.07174843549728394,
0.014505268074572086,
0.04094264656305313,
-0.06131284311413765,
-0.0051197875291109085,
-0.04496388137340546,
0.06770838797092438,
0.10300365835428238,
0.05077822133898735,
-0.048619501292705536,
-0.008247572928667068,
0.1628870666027069,
-0.04420951381325722,
-0.000512491911649704,
-0.12859207391738892,
0.3271844983100891,
0.0144409304484725,
0.016555065289139748,
0.04839925840497017,
-0.07755860686302185,
-0.0528581477701664,
0.1993890404701233,
0.08426590263843536,
-0.017282331362366676,
-0.022419923916459084,
-0.002349255606532097,
-0.030225548893213272,
-0.0207997877150774,
0.15070460736751556,
0.032697465270757675,
0.12541720271110535,
-0.05579059571027756,
-0.04103602096438408,
-0.025729188695549965,
-0.012347222305834293,
-0.12288489937782288,
0.13560952246189117,
-0.031092621386051178,
-0.021430909633636475,
-0.07637392729520798,
0.024325985461473465,
0.07486048340797424,
-0.3116046190261841,
0.002315895864740014,
-0.03419172391295433,
-0.10873602330684662,
-0.004006767645478249,
-0.01794048398733139,
-0.02174355648458004,
0.04739288240671158,
-0.04698044806718826,
0.07329802960157394,
0.04685552045702934,
0.03347085043787956,
-0.026026351377367973,
-0.10054177045822144,
0.1623111367225647,
0.03955084830522537,
0.09402322769165039,
0.02888435870409012,
0.07227518409490585,
0.053074754774570465,
0.034724172204732895,
-0.09572067856788635,
0.04224524274468422,
0.010823952034115791,
-0.08492589741945267,
-0.05212685838341713,
0.12650306522846222,
-0.003384034149348736,
0.045597437769174576,
0.04016682133078575,
-0.10551111400127411,
0.011416949331760406,
0.06955674290657043,
-0.06769581884145737,
-0.10033784806728363,
-0.006237993948161602,
-0.08721369504928589,
0.15429413318634033,
0.14138726890087128,
-0.01722857728600502,
0.023543581366539,
-0.06678761541843414,
-0.006682291626930237,
0.05040933936834335,
0.015773357823491096,
-0.016212590038776398,
-0.18926790356636047,
0.030272459611296654,
-0.08088529855012894,
-0.005311559420078993,
-0.22523753345012665,
-0.10292674601078033,
-0.015951180830597878,
-0.05054772272706032,
-0.028083069249987602,
0.061098385602235794,
0.03485264629125595,
0.067301444709301,
-0.017227888107299805,
-0.058437153697013855,
-0.026309745386242867,
0.08966127038002014,
-0.10637735575437546,
-0.06416955590248108
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_600k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_600k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
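Since this checkpoint was pre-trained with the MLM objective described above, it can also be loaded with a masked-language-modelling head. The sketch below is not part of the original release; it assumes the uploaded checkpoint contains the pre-training prediction head, and the example sentence is arbitrary.
```
from transformers import BertTokenizer, BertForMaskedLM
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_600k')
mlm_model = BertForMaskedLM.from_pretrained("google/multiberts-seed_2-step_600k")
mlm_model.eval()

# Mask one token and let the intermediate checkpoint fill it in.
text = "The capital of France is [MASK]."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = mlm_model(**encoded_input).logits

# Locate the [MASK] position and take the highest-scoring vocabulary entry.
mask_index = (encoded_input.input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```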
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_600k"]}
| null |
google/multiberts-seed_2-step_600k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_600k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 600k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07873710244894028,
0.09457064419984818,
-0.002420674078166485,
0.045620039105415344,
0.08227384835481644,
-0.015008102171123028,
0.07534211128950119,
0.10364022850990295,
-0.007637742441147566,
0.02673155628144741,
0.07699079811573029,
0.005100315436720848,
0.019092252478003502,
0.09243977814912796,
0.02252034656703472,
-0.22226329147815704,
0.01951874978840351,
-0.03403685986995697,
-0.09088344871997833,
0.07704858481884003,
0.0973222628235817,
-0.07863857597112656,
0.04834504798054695,
0.024723608046770096,
-0.1148839220404625,
0.051699455827474594,
0.00436252448707819,
-0.016754252836108208,
0.13225188851356506,
-0.0051242089830338955,
0.05180765315890312,
0.05646493658423424,
0.04984070733189583,
-0.12851029634475708,
0.006805401295423508,
0.05754657834768295,
0.06076769158244133,
0.04469938948750496,
0.02326906844973564,
0.07801202684640884,
0.0016023748321458697,
0.02808760292828083,
0.043449826538562775,
0.02440331131219864,
-0.07464256882667542,
-0.05154441297054291,
-0.10373316705226898,
0.028546204790472984,
0.03099948912858963,
0.008165337145328522,
0.010136277414858341,
0.12131254374980927,
-0.038394805043935776,
0.04040856659412384,
0.1770704686641693,
-0.33220574259757996,
-0.012346798554062843,
0.0749402791261673,
0.035515666007995605,
0.12606684863567352,
-0.00638735294342041,
-0.02090524137020111,
0.07632762938737869,
0.024292875081300735,
0.08244995772838593,
-0.03868930786848068,
0.02717776782810688,
-0.05595612898468971,
-0.15412108600139618,
-0.04506278038024902,
0.08986712992191315,
-0.0025727939791977406,
-0.13758380711078644,
-0.02895556017756462,
-0.047357577830553055,
0.04424877464771271,
0.013494709506630898,
-0.03452398255467415,
0.034490421414375305,
0.01671574078500271,
-0.02198813296854496,
-0.0109876012429595,
-0.10403242707252502,
-0.05779445171356201,
0.02596999891102314,
0.07591570168733597,
0.10032925009727478,
0.0695950910449028,
-0.0000033750643524399493,
0.11398672312498093,
-0.18783798813819885,
-0.050596848130226135,
-0.03176002949476242,
-0.05096421390771866,
-0.05030284821987152,
-0.0090421661734581,
-0.1057245209813118,
-0.03910628333687782,
0.006478643976151943,
0.12454862147569656,
0.0008427504799328744,
0.028499066829681396,
-0.03071356751024723,
0.008162625133991241,
0.05276619642972946,
0.040049612522125244,
-0.004121513105928898,
0.021917488425970078,
0.020201299339532852,
-0.01241439301520586,
-0.020160803571343422,
0.015487100929021835,
0.0014698812738060951,
0.025927521288394928,
0.11427197605371475,
0.021101277321577072,
-0.10153290629386902,
0.06550648808479309,
-0.021676737815141678,
-0.043286994099617004,
0.01223396323621273,
-0.09122692793607712,
-0.05549940839409828,
-0.04039747640490532,
0.00011109741171821952,
0.012997119687497616,
-0.005721146706491709,
-0.005407887976616621,
-0.021645402535796165,
-0.03497498482465744,
-0.0842166617512703,
-0.047548867762088776,
-0.057174235582351685,
-0.12489420920610428,
0.005924950819462538,
-0.17527389526367188,
-0.03899415209889412,
-0.11059277504682541,
-0.1903577744960785,
-0.021714841946959496,
0.06428243964910507,
-0.01375390775501728,
-0.05691945180296898,
0.07664856314659119,
0.04028195142745972,
-0.02835148759186268,
0.0006917383871041238,
0.0816032662987709,
-0.0004304635804146528,
0.04584813117980957,
-0.025737382471561432,
0.07249219715595245,
0.002419339260086417,
0.034641463309526443,
-0.05634978413581848,
0.06317483633756638,
-0.17236997187137604,
0.047053974121809006,
-0.07290084660053253,
-0.034450218081474304,
-0.08638846129179001,
-0.03595171868801117,
-0.01461566798388958,
0.0024569802917540073,
0.02462429367005825,
0.07858265936374664,
-0.18613198399543762,
-0.027316661551594734,
0.11841221898794174,
-0.1641964316368103,
-0.017965830862522125,
0.06983670592308044,
-0.047206953167915344,
0.1070452407002449,
0.06717171519994736,
0.15823888778686523,
-0.011611994355916977,
-0.07810180634260178,
0.05247043818235397,
-0.010466871783137321,
0.013775909319519997,
-0.010555402375757694,
0.06848648935556412,
-0.020771535113453865,
-0.1561250239610672,
0.034209396690130234,
-0.1302202343940735,
-0.007208638824522495,
-0.07845864444971085,
0.018195033073425293,
-0.012809996493160725,
-0.06787768006324768,
-0.06534457951784134,
-0.024684950709342957,
0.06615057587623596,
-0.07182608544826508,
-0.013440468348562717,
0.03582092374563217,
0.0736997053027153,
-0.06891638785600662,
0.06559158116579056,
-0.011876019649207592,
0.01118784211575985,
-0.08669905364513397,
-0.03989224135875702,
-0.19155055284500122,
0.046013034880161285,
0.10132382810115814,
0.004723547492176294,
-0.02009015902876854,
0.13944070041179657,
0.008649184368550777,
0.0670255795121193,
-0.05083091929554939,
0.012986629270017147,
-0.01451848354190588,
-0.006147687789052725,
-0.09041109681129456,
-0.09514260292053223,
-0.07581347972154617,
-0.06818674504756927,
0.08817724883556366,
-0.12051565945148468,
0.02098490111529827,
-0.05063571035861969,
0.04437313973903656,
0.021528035402297974,
-0.08433125168085098,
-0.018936175853013992,
0.010956722311675549,
-0.059785984456539154,
-0.05858946219086647,
0.038794007152318954,
0.06813487410545349,
-0.010074436664581299,
0.08745715767145157,
-0.043186988681554794,
-0.08885623514652252,
0.030756648629903793,
0.10769155621528625,
-0.10598094016313553,
0.015782324597239494,
-0.05692816898226738,
-0.042704977095127106,
-0.06976228952407837,
-0.018607724457979202,
0.08121302723884583,
-0.006917903665453196,
0.13331328332424164,
-0.07499369233846664,
-0.000785087060648948,
0.014123747125267982,
-0.018955064937472343,
-0.020651046186685562,
0.03828471526503563,
0.07290880382061005,
-0.07933221757411957,
0.016384510323405266,
0.04259992763400078,
0.012581591494381428,
0.07038018852472305,
-0.05330052599310875,
-0.08878453820943832,
0.014418349601328373,
0.03783484548330307,
0.02954707108438015,
0.07021237909793854,
-0.01628899574279785,
-0.008060797117650509,
0.03183368965983391,
0.01833353377878666,
0.003883789526298642,
-0.12037047743797302,
0.06145315244793892,
0.054245270788669586,
0.002952195005491376,
0.054152995347976685,
-0.018335074186325073,
-0.03926200047135353,
0.08213849365711212,
0.03575552999973297,
0.002253555692732334,
-0.01491576712578535,
-0.0160211268812418,
-0.11514414101839066,
0.19277901947498322,
-0.05926094949245453,
-0.1552034318447113,
-0.07620243728160858,
-0.09779413044452667,
0.004122332204133272,
0.026014970615506172,
0.03632302209734917,
-0.025306474417448044,
-0.04305735602974892,
-0.12272068113088608,
0.061918213963508606,
-0.03591761738061905,
0.06858265399932861,
0.11063757538795471,
-0.041597556322813034,
0.054573334753513336,
-0.12478121370077133,
-0.00785176269710064,
-0.08488277345895767,
-0.08180020749568939,
0.060422491282224655,
-0.050265487283468246,
0.02720777317881584,
0.09608196467161179,
0.02380364201962948,
-0.018883056938648224,
-0.024153852835297585,
0.20047542452812195,
0.03877090662717819,
0.042389675974845886,
0.12811827659606934,
-0.060159794986248016,
0.0580773763358593,
0.0864485576748848,
0.010546678677201271,
-0.046477969735860825,
0.05506439507007599,
0.046298570930957794,
-0.06956811994314194,
-0.19520549476146698,
-0.022730298340320587,
-0.009160898625850677,
-0.046237193048000336,
0.07404161244630814,
0.0358736515045166,
-0.010871082544326782,
0.06956655532121658,
0.014237310737371445,
0.05712977424263954,
-0.006360539235174656,
0.09567906707525253,
0.008701721206307411,
-0.032563332468271255,
0.08771848678588867,
-0.020414480939507484,
-0.009821537882089615,
0.0849677249789238,
-0.016195688396692276,
0.29873332381248474,
-0.03239328786730766,
0.008034825325012207,
0.1164960265159607,
0.04732085019350052,
0.061759594827890396,
0.1288502961397171,
-0.06475063413381577,
0.022113341838121414,
-0.07312101870775223,
-0.057763487100601196,
0.0000022930257728148717,
0.04604054614901543,
-0.05896937474608421,
0.013491044752299786,
-0.07416070997714996,
0.0245355274528265,
-0.02301863208413124,
0.31014466285705566,
0.11396198719739914,
-0.10269882529973984,
-0.05686904489994049,
0.006267834920436144,
-0.09979814291000366,
-0.07092875987291336,
0.042510002851486206,
0.07545435428619385,
-0.136203333735466,
0.008887894451618195,
-0.02674432285130024,
0.07383846491575241,
-0.014916198328137398,
0.0167792160063982,
0.029783109202980995,
0.03566678240895271,
-0.03775198385119438,
0.00888997595757246,
-0.1819126158952713,
0.19866561889648438,
0.007075751665979624,
0.01984400302171707,
-0.05185743793845177,
0.0324251726269722,
0.00945280771702528,
-0.035723429173231125,
0.06540365517139435,
0.023941941559314728,
-0.03655043989419937,
-0.04274502769112587,
-0.052640192210674286,
0.015780653804540634,
0.0814511701464653,
-0.04616555944085121,
0.10660871863365173,
-0.005781944375485182,
0.042711373418569565,
0.019389862194657326,
0.08776764571666718,
-0.18183359503746033,
-0.09077690541744232,
0.030187934637069702,
-0.0614599883556366,
-0.10530516505241394,
-0.07918768376111984,
-0.09268451482057571,
0.01608450897037983,
0.24967826902866364,
-0.12450478971004486,
-0.07594162970781326,
-0.09579019993543625,
0.029689926654100418,
0.10399442166090012,
-0.04941730946302414,
0.02518056146800518,
-0.003644178854301572,
0.127730593085289,
-0.06396690756082535,
-0.13495397567749023,
0.02335180900990963,
-0.08972836285829544,
-0.16530723869800568,
-0.06708450615406036,
0.11284050345420837,
0.057807523757219315,
0.03708376735448837,
-0.02835068292915821,
0.024100687354803085,
0.03588756173849106,
-0.0369124710559845,
0.0028615789487957954,
0.06771941483020782,
0.09420072287321091,
0.03829000145196915,
-0.11157304793596268,
0.018765758723020554,
-0.06301412731409073,
-0.06609880179166794,
0.0735112875699997,
0.2631966471672058,
-0.05573601275682449,
0.1272306889295578,
0.11249887198209763,
-0.08005615323781967,
-0.15900853276252747,
0.0340314581990242,
0.09551092237234116,
-0.015250159427523613,
0.015751976519823074,
-0.16242286562919617,
0.09129229933023453,
0.11137506365776062,
-0.024780206382274628,
0.007036988157778978,
-0.18771930038928986,
-0.12912945449352264,
0.0587724931538105,
0.09877466410398483,
0.27987730503082275,
-0.057680267840623856,
-0.04332070052623749,
0.019208654761314392,
-0.09504310041666031,
0.017082206904888153,
0.12388498336076736,
0.06469199806451797,
-0.027169398963451385,
-0.06535570323467255,
0.015127369202673435,
-0.03983890265226364,
0.09382149577140808,
0.053240083158016205,
0.056803297251462936,
-0.003657412016764283,
0.014910300262272358,
-0.021080972626805305,
-0.04281752556562424,
0.059946607798337936,
0.01878219097852707,
0.050642143934965134,
-0.07654611021280289,
-0.027992410585284233,
-0.06966760009527206,
0.02600519359111786,
-0.024889571592211723,
-0.07655806094408035,
-0.05860985070466995,
0.07669578492641449,
0.04763961583375931,
-0.025298096239566803,
0.014921091496944427,
0.032490089535713196,
0.11564093828201294,
0.1656195968389511,
0.0006898185820318758,
-0.04434705525636673,
-0.05793149396777153,
-0.03643089532852173,
-0.01711033470928669,
0.07434883713722229,
-0.051202837377786636,
0.023044975474476814,
0.0661061555147171,
0.02568541094660759,
0.0945390909910202,
0.059817057102918625,
-0.11382295191287994,
-0.01529850997030735,
0.033773865550756454,
-0.16004666686058044,
0.0030590947717428207,
0.0024879404809325933,
0.03424692526459694,
-0.03898133337497711,
0.025398116558790207,
0.14773838222026825,
-0.06401503831148148,
-0.03541009500622749,
-0.040304914116859436,
0.07077543437480927,
0.022623669356107712,
0.14256182312965393,
0.02988339401781559,
0.036859121173620224,
-0.08245713263750076,
0.12716425955295563,
0.03981101140379906,
-0.042616114020347595,
0.0201918575912714,
-0.025936949998140335,
-0.10840494185686111,
0.01358348410576582,
0.061106301844120026,
0.04208013042807579,
-0.05188097804784775,
-0.004871326498687267,
-0.026089603081345558,
-0.07570695877075195,
0.060259465128183365,
0.17800229787826538,
0.06612437963485718,
0.07120640575885773,
-0.0554606132209301,
-0.03761496767401695,
-0.07898682355880737,
0.041891030967235565,
0.04168704152107239,
0.07628612965345383,
-0.07513303309679031,
0.09383934736251831,
0.00891150813549757,
0.04556834325194359,
-0.031212227419018745,
-0.05170944333076477,
-0.0971543937921524,
-0.05482899770140648,
-0.09743770956993103,
0.008960285224020481,
-0.06763385236263275,
-0.041835300624370575,
0.0008917374652810395,
-0.008371211588382721,
-0.008674640208482742,
0.05052923411130905,
-0.06427221745252609,
-0.011368782259523869,
-0.027037300169467926,
0.03385889157652855,
-0.06368923932313919,
-0.03775681555271149,
0.03548898920416832,
-0.10128092765808105,
0.09161504358053207,
0.0524577721953392,
0.006912222132086754,
0.008610558696091175,
0.09436547011137009,
-0.02512352541089058,
0.02308354713022709,
0.017307765781879425,
-0.04562211036682129,
-0.07933991402387619,
0.0008198346477001905,
-0.009278612211346626,
-0.01304890401661396,
-0.011145103722810745,
0.09275582432746887,
-0.08673205226659775,
0.03747963160276413,
-0.010535800829529762,
-0.008024504408240318,
-0.07492323219776154,
-0.009370681829750538,
0.09810233861207962,
0.10025656968355179,
0.04818950966000557,
-0.08946534991264343,
0.011357402428984642,
-0.14260233938694,
-0.03688590228557587,
0.007039874326437712,
-0.00949934497475624,
-0.1244998648762703,
-0.011291150003671646,
0.021503033116459846,
-0.0019460394978523254,
0.20161396265029907,
-0.0562545508146286,
-0.021269792690873146,
0.019512347877025604,
-0.09345754235982895,
0.11145362257957458,
-0.026136158034205437,
0.18122166395187378,
-0.007017665542662144,
-0.041111476719379425,
-0.012111073359847069,
0.03679373115301132,
0.017263157293200493,
-0.016839660704135895,
0.18764635920524597,
0.13921862840652466,
0.0283413864672184,
0.040063828229904175,
-0.02362896129488945,
0.0011714737629517913,
-0.05561082437634468,
-0.028318248689174652,
0.028022242709994316,
0.038099464029073715,
0.01790975034236908,
0.1510482132434845,
0.06667179614305496,
-0.16641920804977417,
0.03465462103486061,
-0.030674176290631294,
-0.03787138685584068,
-0.11776572465896606,
-0.08991105854511261,
-0.03336777538061142,
-0.07175879180431366,
0.011011047288775444,
-0.12329012155532837,
0.011653845198452473,
0.18116308748722076,
0.055146533995866776,
0.02862389199435711,
0.008707880042493343,
-0.12259023636579514,
-0.034313980489969254,
0.05243723466992378,
0.01290796510875225,
0.022570883855223656,
0.06137629598379135,
-0.001501721446402371,
0.06117330119013786,
0.037829432636499405,
0.014041469432413578,
0.0035150230396538973,
0.0739879235625267,
0.013895118609070778,
0.042288750410079956,
-0.05949566885828972,
-0.005662018433213234,
-0.04114808142185211,
0.0680607408285141,
0.10108993947505951,
0.04769623652100563,
-0.046884529292583466,
-0.008654100820422173,
0.16501161456108093,
-0.04629647359251976,
0.0014741930644959211,
-0.12380705028772354,
0.3424963653087616,
0.011472602374851704,
0.013779642060399055,
0.047598946839571,
-0.0787227600812912,
-0.05331806838512421,
0.20068657398223877,
0.08373910933732986,
-0.020526234060525894,
-0.02317541092634201,
-0.001312857260927558,
-0.030239392071962357,
-0.020940251648426056,
0.15281768143177032,
0.034309424459934235,
0.12937211990356445,
-0.0568607822060585,
-0.047142915427684784,
-0.02618449740111828,
-0.010748679749667645,
-0.12450414896011353,
0.14035829901695251,
-0.030482111498713493,
-0.024240776896476746,
-0.07579305768013,
0.025235364213585854,
0.07625328749418259,
-0.3200926184654236,
0.0022897033486515284,
-0.03709932044148445,
-0.11023467779159546,
-0.004380876664072275,
-0.01802709698677063,
-0.02314801700413227,
0.049968916922807693,
-0.04925200343132019,
0.07342858612537384,
0.0380011722445488,
0.03462538495659828,
-0.02684735879302025,
-0.09315416216850281,
0.16519728302955627,
0.03731568157672882,
0.09371098130941391,
0.02759637124836445,
0.07493569701910019,
0.054995737969875336,
0.03509553149342537,
-0.09734733402729034,
0.043016720563173294,
0.013284703716635704,
-0.08924812078475952,
-0.05330617353320122,
0.1289994716644287,
-0.005217601545155048,
0.038905300199985504,
0.04101947695016861,
-0.11055337637662888,
0.010187356732785702,
0.07795685529708862,
-0.06750599294900894,
-0.10037749260663986,
-0.006724187172949314,
-0.08986658602952957,
0.15618176758289337,
0.1431935578584671,
-0.015091192908585072,
0.02477455697953701,
-0.06746121495962143,
-0.007920403964817524,
0.05297253280878067,
0.010782038792967796,
-0.018727833405137062,
-0.19085118174552917,
0.030357833951711655,
-0.08529991656541824,
-0.005824108142405748,
-0.23131109774112701,
-0.10238395631313324,
-0.01261536031961441,
-0.050857774913311005,
-0.027739880606532097,
0.062010571360588074,
0.03299779072403908,
0.06585577130317688,
-0.015613234601914883,
-0.04418747499585152,
-0.0286524910479784,
0.09067653864622116,
-0.1065545454621315,
-0.06420864909887314
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 60k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 60k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences from
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_60k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_60k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_60k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_60k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
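Because the checkpoints for a given seed share the same tokenizer and architecture, intermediate checkpoints can be compared directly. The sketch below is illustrative rather than part of the original release: it contrasts this 60k checkpoint with the 600k checkpoint of the same seed using a mean-pooled sentence representation; the helper function, example sentence, and choice of pooling are all arbitrary.
```
import torch
from transformers import BertTokenizer, BertModel

def sentence_embedding(checkpoint, text):
    # Mean-pool the final hidden states as a simple sentence representation.
    tokenizer = BertTokenizer.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint)
    model.eval()
    encoded = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        hidden = model(**encoded).last_hidden_state
    return hidden.mean(dim=1).squeeze(0)

text = "MultiBERTs checkpoints make it possible to study pre-training dynamics."
early = sentence_embedding('google/multiberts-seed_2-step_60k', text)
late = sentence_embedding('google/multiberts-seed_2-step_600k', text)

# Cosine similarity between the two checkpoints' representations of the same sentence.
print(torch.nn.functional.cosine_similarity(early, late, dim=0).item())
```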
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_60k"]}
| null |
google/multiberts-seed_2-step_60k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_60k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_60k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 60k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 60k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 60k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 60k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_60k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 60k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 60k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_60k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 60k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 60k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07743852585554123,
0.09958657622337341,
-0.002480052411556244,
0.039525583386421204,
0.07788282632827759,
-0.016789913177490234,
0.08181874454021454,
0.10441478341817856,
-0.013400556519627571,
0.02870878204703331,
0.07788717746734619,
0.0064393882639706135,
0.017096351832151413,
0.09974057972431183,
0.022270087152719498,
-0.22664178907871246,
0.019267089664936066,
-0.03220744803547859,
-0.08884622901678085,
0.07753251492977142,
0.09946080297231674,
-0.07974178344011307,
0.04568755626678467,
0.02743890881538391,
-0.11202510446310043,
0.04880711808800697,
0.002176815876737237,
-0.017143696546554565,
0.13360047340393066,
-0.00538486335426569,
0.05189218372106552,
0.054644785821437836,
0.04562397301197052,
-0.13437926769256592,
0.007370139472186565,
0.05821828916668892,
0.05851936340332031,
0.04577314481139183,
0.02296184003353119,
0.07584306597709656,
0.006089314818382263,
0.019585518166422844,
0.04241637513041496,
0.023152319714426994,
-0.07477831095457077,
-0.06250719726085663,
-0.1030544862151146,
0.03267225995659828,
0.02698279358446598,
0.005705924704670906,
0.010035151615738869,
0.128944993019104,
-0.036880601197481155,
0.04348516836762428,
0.1841408908367157,
-0.3388514220714569,
-0.011347739957273006,
0.07416248321533203,
0.040427032858133316,
0.12343108654022217,
-0.0061801401898264885,
-0.01919957436621189,
0.07531421631574631,
0.022701062262058258,
0.08667531609535217,
-0.039837900549173355,
0.03591611608862877,
-0.05343924090266228,
-0.15592627227306366,
-0.04654843360185623,
0.0906396433711052,
-0.0012299485970288515,
-0.1366390734910965,
-0.03425741568207741,
-0.04816875606775284,
0.04332200065255165,
0.011760856956243515,
-0.03469572961330414,
0.03438887745141983,
0.015946604311466217,
-0.02255725860595703,
-0.010787885636091232,
-0.10410120338201523,
-0.055359069257974625,
0.029286827892065048,
0.07840894162654877,
0.1020306721329689,
0.06710808724164963,
-0.0011318520409986377,
0.11182884871959686,
-0.19375482201576233,
-0.05242546647787094,
-0.028956657275557518,
-0.05084967613220215,
-0.050879620015621185,
-0.009635118767619133,
-0.10852958261966705,
-0.039603687822818756,
0.00886042695492506,
0.13693080842494965,
-0.008005847223103046,
0.027748942375183105,
-0.032712891697883606,
0.00869248528033495,
0.056795619428157806,
0.04128298535943031,
-0.0041938722133636475,
0.020120013505220413,
0.019220614805817604,
-0.01653849519789219,
-0.01952955685555935,
0.015461630187928677,
0.000438244896940887,
0.02580009400844574,
0.11767503619194031,
0.021060917526483536,
-0.10385337471961975,
0.06795263290405273,
-0.01909838244318962,
-0.04505264014005661,
0.011937523260712624,
-0.09055396914482117,
-0.05699228122830391,
-0.03715595602989197,
-0.000318312959279865,
0.012503164820373058,
-0.00740433344617486,
-0.004297065082937479,
-0.022121476009488106,
-0.03721972554922104,
-0.08250845968723297,
-0.04699953272938728,
-0.05629744008183479,
-0.12699636816978455,
0.00831158459186554,
-0.1839490383863449,
-0.037382327020168304,
-0.11004633456468582,
-0.19046315550804138,
-0.02228393964469433,
0.06373181939125061,
-0.012207243591547012,
-0.05737833306193352,
0.07739698886871338,
0.04145374149084091,
-0.02760341204702854,
-0.0005902189295738935,
0.08065757155418396,
-0.0022951087448745966,
0.04482124373316765,
-0.024477211758494377,
0.07140098512172699,
0.0030288887210190296,
0.034145038574934006,
-0.05529256537556648,
0.06334831565618515,
-0.17244170606136322,
0.04571002721786499,
-0.07276402413845062,
-0.03561979904770851,
-0.0890750139951706,
-0.03353102132678032,
-0.013245923444628716,
0.0035880631767213345,
0.023584971204400063,
0.07401257008314133,
-0.18516628444194794,
-0.02999214641749859,
0.12474258244037628,
-0.16198976337909698,
-0.018609602004289627,
0.07211262732744217,
-0.049555979669094086,
0.1060619130730629,
0.0667378306388855,
0.15981563925743103,
-0.01412051822990179,
-0.0811811238527298,
0.054503679275512695,
-0.0089610880240798,
0.01513003371655941,
-0.008253923617303371,
0.07210204750299454,
-0.020178813487291336,
-0.1500481814146042,
0.03337283059954643,
-0.1343451589345932,
-0.005558209493756294,
-0.07772869616746902,
0.018557200208306313,
-0.011177575215697289,
-0.06641609966754913,
-0.06610623747110367,
-0.025549687445163727,
0.06902702152729034,
-0.07189693301916122,
-0.01790427230298519,
0.03989611566066742,
0.0735374167561531,
-0.07160504907369614,
0.06561009585857391,
-0.014790002256631851,
0.01257961057126522,
-0.08430704474449158,
-0.03987859562039375,
-0.19240377843379974,
0.047663167119026184,
0.10059917718172073,
0.007194602396339178,
-0.01992836594581604,
0.14531764388084412,
0.009831331670284271,
0.06677775830030441,
-0.04851865395903587,
0.014114144258201122,
-0.014605579897761345,
-0.006135573610663414,
-0.08998735249042511,
-0.09969717264175415,
-0.07374677807092667,
-0.06990890949964523,
0.09002320468425751,
-0.12295878678560257,
0.021118134260177612,
-0.052903249859809875,
0.04684525355696678,
0.021352654322981834,
-0.08256767690181732,
-0.01757531613111496,
0.010951485484838486,
-0.06068960949778557,
-0.0568268857896328,
0.04190194234251976,
0.0690210834145546,
-0.010650073178112507,
0.09470299631357193,
-0.050594404339790344,
-0.09010300040245056,
0.030467990785837173,
0.09987260401248932,
-0.10459388792514801,
0.010972411371767521,
-0.058342766016721725,
-0.04271642863750458,
-0.06646889448165894,
-0.015080454759299755,
0.07923337817192078,
-0.00637475261464715,
0.13750989735126495,
-0.07551411539316177,
-0.0059921895153820515,
0.012931005097925663,
-0.021083137020468712,
-0.018954051658511162,
0.03982457146048546,
0.0688730850815773,
-0.0781005322933197,
0.01760123297572136,
0.042510099709033966,
0.009737559594213963,
0.07449863106012344,
-0.05681624263525009,
-0.09184679388999939,
0.011401322670280933,
0.037769295275211334,
0.029002532362937927,
0.06936053186655045,
-0.02055090107023716,
-0.012235094793140888,
0.03462842106819153,
0.01627183146774769,
0.004202814307063818,
-0.11933887749910355,
0.0620373860001564,
0.05451851338148117,
0.0028739727567881346,
0.05916514992713928,
-0.01672312058508396,
-0.039715591818094254,
0.080106221139431,
0.036788374185562134,
0.00046972863492555916,
-0.015224547125399113,
-0.01594281941652298,
-0.11572527140378952,
0.19201265275478363,
-0.05816344916820526,
-0.15641607344150543,
-0.07439947128295898,
-0.10237928479909897,
0.0059817759320139885,
0.02487003803253174,
0.0373014360666275,
-0.02095092460513115,
-0.042844437062740326,
-0.12201829999685287,
0.059481918811798096,
-0.0394047275185585,
0.06818234920501709,
0.10984864830970764,
-0.0409843735396862,
0.05539768934249878,
-0.1247611716389656,
-0.007846024818718433,
-0.08411334455013275,
-0.07630724459886551,
0.06266290694475174,
-0.05043771490454674,
0.025121169164776802,
0.09552234411239624,
0.02406284399330616,
-0.016801463440060616,
-0.02500622346997261,
0.19984127581119537,
0.03781356289982796,
0.04055023193359375,
0.12897494435310364,
-0.06435893476009369,
0.05728519707918167,
0.08202631026506424,
0.008926402777433395,
-0.04457583650946617,
0.0530209019780159,
0.04919237270951271,
-0.06817669421434402,
-0.19675129652023315,
-0.022277599200606346,
-0.007955176755785942,
-0.04422235116362572,
0.07514578104019165,
0.03587980940937996,
-0.0003499383747112006,
0.06880705058574677,
0.01182913314551115,
0.06060926243662834,
-0.006586005445569754,
0.09858851134777069,
0.013368692249059677,
-0.03327617049217224,
0.08834924548864365,
-0.020387958735227585,
-0.011408273130655289,
0.08339244872331619,
-0.015945373103022575,
0.29144996404647827,
-0.029331374913454056,
0.0191611610352993,
0.11879059672355652,
0.043576180934906006,
0.0618814080953598,
0.12420365959405899,
-0.06580768525600433,
0.02181486412882805,
-0.07490656524896622,
-0.06030704826116562,
-0.0006297920481301844,
0.04743904247879982,
-0.059732142835855484,
0.008229243569076061,
-0.07032234966754913,
0.022011149674654007,
-0.021140120923519135,
0.3104100227355957,
0.11415653675794601,
-0.10448093712329865,
-0.05823088064789772,
0.004652148578315973,
-0.09935159981250763,
-0.06973942369222641,
0.041265517473220825,
0.07202345132827759,
-0.1350417584180832,
0.009253550320863724,
-0.027782348915934563,
0.07549192756414413,
-0.017531132325530052,
0.017662864178419113,
0.025083959102630615,
0.03566283732652664,
-0.03743782639503479,
0.010208647698163986,
-0.18572816252708435,
0.19700361788272858,
0.00750255910679698,
0.018735608085989952,
-0.05221415311098099,
0.0321819968521595,
0.009273143485188484,
-0.034609049558639526,
0.06321864575147629,
0.02325129322707653,
-0.03917543217539787,
-0.03938929736614227,
-0.05326571315526962,
0.01493722666054964,
0.08178737759590149,
-0.04579325392842293,
0.10917218774557114,
-0.006685079541057348,
0.04212760925292969,
0.020550837740302086,
0.087987519800663,
-0.18184199929237366,
-0.08688173443078995,
0.03051787056028843,
-0.059291765093803406,
-0.10453788936138153,
-0.07895687967538834,
-0.09272709488868713,
0.0043716165237128735,
0.2517779469490051,
-0.12297585606575012,
-0.07310768961906433,
-0.09544779360294342,
0.03224880248308182,
0.10196264088153839,
-0.05012628063559532,
0.02396543323993683,
-0.006159511394798756,
0.13147303462028503,
-0.06706361472606659,
-0.1339433789253235,
0.023581156507134438,
-0.09056486934423447,
-0.16550880670547485,
-0.06625556200742722,
0.11476322263479233,
0.05970614403486252,
0.03696591034531593,
-0.028853315860033035,
0.023874729871749878,
0.032633956521749496,
-0.0354200080037117,
0.003653466235846281,
0.06818410754203796,
0.09815773367881775,
0.033226799219846725,
-0.11093618720769882,
0.02008988708257675,
-0.06270762532949448,
-0.06526827067136765,
0.07858794927597046,
0.265118271112442,
-0.05614590644836426,
0.12720955908298492,
0.11271972954273224,
-0.07883526384830475,
-0.1556880623102188,
0.031116195023059845,
0.09588397294282913,
-0.01513452734798193,
0.015506756491959095,
-0.16400913894176483,
0.08885589241981506,
0.10952649265527725,
-0.024797484278678894,
0.0034936771262437105,
-0.18595021963119507,
-0.12858104705810547,
0.06558813899755478,
0.09599539637565613,
0.2771194577217102,
-0.06043851003050804,
-0.044301588088274,
0.019690196961164474,
-0.09118667244911194,
0.023611003533005714,
0.11729727685451508,
0.06452600657939911,
-0.024991260841488838,
-0.06905808299779892,
0.014628292061388493,
-0.04038233682513237,
0.09197550266981125,
0.053925126791000366,
0.056643303483724594,
-0.003272595815360546,
0.017272520810365677,
-0.015213672071695328,
-0.04460608586668968,
0.06112448498606682,
0.018184082582592964,
0.0488734096288681,
-0.08306730538606644,
-0.02812885493040085,
-0.06958859413862228,
0.029590357095003128,
-0.024745266884565353,
-0.0776607021689415,
-0.05833319202065468,
0.07680036872625351,
0.047006722539663315,
-0.023161059245467186,
0.02257000096142292,
0.031349584460258484,
0.11566396802663803,
0.1668713092803955,
-0.0021487963385879993,
-0.039173442870378494,
-0.05889454483985901,
-0.03810720518231392,
-0.015985913574695587,
0.07495854049921036,
-0.055103905498981476,
0.0227176733314991,
0.06426301598548889,
0.02607043832540512,
0.0972825437784195,
0.056451741605997086,
-0.11739243566989899,
-0.01756732538342476,
0.031081505119800568,
-0.16341720521450043,
0.007183655630797148,
0.00219619064591825,
0.031052155420184135,
-0.03466351330280304,
0.02836422249674797,
0.15086352825164795,
-0.06295915693044662,
-0.03513684123754501,
-0.04033803939819336,
0.06783006340265274,
0.023962851613759995,
0.1393543928861618,
0.033021893352270126,
0.03785950690507889,
-0.08143801242113113,
0.12398459017276764,
0.038445621728897095,
-0.040680672973394394,
0.023294927552342415,
-0.025532850995659828,
-0.10881558805704117,
0.01311473734676838,
0.06285678595304489,
0.04535435885190964,
-0.051916252821683884,
-0.010791124776005745,
-0.028339562937617302,
-0.07029381394386292,
0.06265832483768463,
0.18510402739048004,
0.0674830824136734,
0.07216294854879379,
-0.0549297034740448,
-0.03708449378609657,
-0.07782802730798721,
0.04454898461699486,
0.03856788948178291,
0.07533727586269379,
-0.07365406304597855,
0.10321827232837677,
0.010548071004450321,
0.04607413336634636,
-0.03175486624240875,
-0.053274739533662796,
-0.09836594760417938,
-0.054743580520153046,
-0.10089855641126633,
0.010041341185569763,
-0.07075510174036026,
-0.041438788175582886,
0.0004336375277489424,
-0.006731794681400061,
-0.007695187348872423,
0.05031866952776909,
-0.06388309597969055,
-0.009521289728581905,
-0.026759715750813484,
0.034645672887563705,
-0.06681239604949951,
-0.03722696378827095,
0.03261981159448624,
-0.10158203542232513,
0.09302721917629242,
0.05374925956130028,
0.007919756695628166,
0.00877387449145317,
0.08717171847820282,
-0.022273516282439232,
0.023411931470036507,
0.01488243043422699,
-0.04711958020925522,
-0.0835803747177124,
0.0015318529913201928,
-0.009969200007617474,
-0.013185519725084305,
-0.011777865700423717,
0.0903075560927391,
-0.08647852391004562,
0.03665478527545929,
-0.007869751192629337,
-0.00911780260503292,
-0.0750407949090004,
-0.010370339266955853,
0.09186626225709915,
0.10011359304189682,
0.047160178422927856,
-0.0872136801481247,
0.011168603785336018,
-0.14139693975448608,
-0.03702065348625183,
0.007231626659631729,
-0.009578587487339973,
-0.1247294694185257,
-0.011671468615531921,
0.019839676097035408,
-0.0036164540797472,
0.20782271027565002,
-0.0558316595852375,
-0.019513582810759544,
0.018432773649692535,
-0.09822218120098114,
0.11094173043966293,
-0.02658182568848133,
0.18435588479042053,
-0.006857837550342083,
-0.04004672169685364,
-0.014332730323076248,
0.037920139729976654,
0.018661608919501305,
-0.019941451027989388,
0.1835857778787613,
0.14018504321575165,
0.02816559374332428,
0.03948298841714859,
-0.023087888956069946,
-0.0005784078384749591,
-0.058755502104759216,
-0.018467869609594345,
0.029319491237401962,
0.038906678557395935,
0.018250441178679466,
0.1592756062746048,
0.07322931289672852,
-0.16675139963626862,
0.034686725586652756,
-0.02791053242981434,
-0.03664948046207428,
-0.11904220283031464,
-0.09569703042507172,
-0.034904055297374725,
-0.07214090973138809,
0.010512619279325008,
-0.12324059009552002,
0.009797254577279091,
0.18126508593559265,
0.056934136897325516,
0.02832075208425522,
0.004603183828294277,
-0.12092753499746323,
-0.03660798817873001,
0.05196446180343628,
0.013321521691977978,
0.023178085684776306,
0.05725676193833351,
-0.00023739458993077278,
0.06261542439460754,
0.03691159561276436,
0.014793101698160172,
0.0011444485280662775,
0.08150055259466171,
0.015452760271728039,
0.04092618450522423,
-0.060969799757003784,
-0.004741770215332508,
-0.03930368274450302,
0.06942688673734665,
0.09832802414894104,
0.04873451963067055,
-0.04829028993844986,
-0.008288874290883541,
0.16222791373729706,
-0.04288789629936218,
-0.002153166336938739,
-0.1259874403476715,
0.3374132812023163,
0.011358157731592655,
0.013774850405752659,
0.046789877116680145,
-0.07698499411344528,
-0.04948049783706665,
0.19788263738155365,
0.08604969084262848,
-0.01751656085252762,
-0.021290093660354614,
0.0014963033609092236,
-0.030048279091715813,
-0.021763019263744354,
0.14719685912132263,
0.03502174839377403,
0.13074995577335358,
-0.056493066251277924,
-0.05137745290994644,
-0.027003353461623192,
-0.01068422943353653,
-0.12500528991222382,
0.13588304817676544,
-0.02722012810409069,
-0.02585809864103794,
-0.07300857454538345,
0.02478921413421631,
0.07434102892875671,
-0.3227385878562927,
0.0019389671506360173,
-0.03511932119727135,
-0.10939259082078934,
-0.00368961482308805,
-0.017720350995659828,
-0.02220340445637703,
0.0473773330450058,
-0.04856782406568527,
0.07118895649909973,
0.03801602125167847,
0.034724365919828415,
-0.02594652585685253,
-0.08882303535938263,
0.16559700667858124,
0.04125230014324188,
0.0903882384300232,
0.02594095654785633,
0.07846803963184357,
0.05665779858827591,
0.033205438405275345,
-0.09315996617078781,
0.04389641806483269,
0.013951468281447887,
-0.0880388393998146,
-0.05411121994256973,
0.1270829290151596,
-0.003719659522175789,
0.04314674437046051,
0.044694893062114716,
-0.10896780341863632,
0.012959699146449566,
0.07773876935243607,
-0.07026530057191849,
-0.10064448416233063,
-0.005863819736987352,
-0.09088907390832901,
0.1558302789926529,
0.14325647056102753,
-0.015158197842538357,
0.023655587807297707,
-0.07009223848581314,
-0.006908041890710592,
0.054256927222013474,
0.006847652141004801,
-0.019786369055509567,
-0.18782350420951843,
0.03344146907329559,
-0.07569453120231628,
-0.004235988948494196,
-0.2308984398841858,
-0.10059773176908493,
-0.01188158243894577,
-0.04857519641518593,
-0.026818137615919113,
0.05743285268545151,
0.031257264316082,
0.06698399037122726,
-0.01724710315465927,
-0.044671718031167984,
-0.028227906674146652,
0.08813638985157013,
-0.10890977829694748,
-0.06428298354148865
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
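Because this is a pre-training checkpoint, the two objectives described above can also be inspected directly by loading it with `BertForPreTraining`, which exposes both the MLM and the NSP heads. The snippet below is a minimal sketch and not part of the original release: it assumes the published PyTorch weights include the prediction heads; if they do not, `from_pretrained` will initialize those heads randomly and print a warning.
```
import torch
from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_700k')
model = BertForPreTraining.from_pretrained('google/multiberts-seed_2-step_700k')

# A sentence pair, so the NSP head has something to score.
encoded_input = tokenizer("The cat sat on the mat.", "It was a sunny afternoon.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# prediction_logits: per-token MLM scores; seq_relationship_logits: is-next vs. not-next scores.
print(output.prediction_logits.shape)
print(output.seq_relationship_logits.shape)
```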
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_700k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_700k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_700k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
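As a quick sanity check of the partially trained MLM head, the sketch below loads the same checkpoint through `BertForMaskedLM` and decodes the top prediction for a masked token. This is an illustration rather than part of the official card: the probe sentence is arbitrary, and the MLM head weights are assumed to be present in the released checkpoint.
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_700k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_2-step_700k')

text = f"The capital of France is {tokenizer.mask_token}."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**encoded_input).logits

# Index of the [MASK] position, then the highest-scoring vocabulary entry at that position.
mask_index = (encoded_input['input_ids'] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```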
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_700k"]}
| null |
google/multiberts-seed_2-step_700k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_700k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 700k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07860013842582703,
0.0878153070807457,
-0.002430046210065484,
0.04299048334360123,
0.07514158636331558,
-0.016574930399656296,
0.07265346497297287,
0.1033460795879364,
-0.014950618147850037,
0.025346584618091583,
0.08135919272899628,
0.0019139787182211876,
0.018960127606987953,
0.10144298523664474,
0.024219129234552383,
-0.22758488357067108,
0.02230107970535755,
-0.031364887952804565,
-0.08305660635232925,
0.07751089334487915,
0.10059225559234619,
-0.08101744204759598,
0.04506969824433327,
0.027588900178670883,
-0.10897984355688095,
0.04771437495946884,
0.0029030523728579283,
-0.020829712972044945,
0.13223326206207275,
-0.0016816076822578907,
0.046206820756196976,
0.05635230243206024,
0.04861476644873619,
-0.1313762664794922,
0.006140945479273796,
0.05945798009634018,
0.05838087573647499,
0.045206762850284576,
0.023854926228523254,
0.08120948076248169,
0.0039804293774068356,
0.02926338091492653,
0.0446271151304245,
0.025696072727441788,
-0.07801538705825806,
-0.06850151717662811,
-0.10197123140096664,
0.04260937497019768,
0.03195149078965187,
0.004687881097197533,
0.010189727880060673,
0.12491331249475479,
-0.03583807125687599,
0.04453456029295921,
0.17377373576164246,
-0.33468568325042725,
-0.014353971928358078,
0.07921729981899261,
0.04150886833667755,
0.12987852096557617,
-0.005945098120719194,
-0.01719370298087597,
0.07684539258480072,
0.021136604249477386,
0.0900566503405571,
-0.037605348974466324,
0.03562980145215988,
-0.054251085966825485,
-0.15595176815986633,
-0.04858311265707016,
0.08816040307283401,
0.003065206576138735,
-0.13769342005252838,
-0.03659508377313614,
-0.04719800129532814,
0.03348017856478691,
0.011896257288753986,
-0.039984192699193954,
0.03156711906194687,
0.018603850156068802,
-0.023795058950781822,
-0.013406346552073956,
-0.10315272957086563,
-0.056591663509607315,
0.03262396529316902,
0.07254160195589066,
0.09837329387664795,
0.06985010206699371,
0.003514507319778204,
0.11344035714864731,
-0.19685329496860504,
-0.05125981196761131,
-0.03178870305418968,
-0.05530426651239395,
-0.05175669118762016,
-0.010705167427659035,
-0.10466346144676208,
-0.039449986070394516,
0.006385816261172295,
0.13240237534046173,
0.006718316115438938,
0.028440337628126144,
-0.034706417471170425,
0.009282514452934265,
0.05476367473602295,
0.04691221937537193,
-0.007660714443773031,
0.011531170457601547,
0.023402893915772438,
-0.010253426618874073,
-0.01873127371072769,
0.01363199669867754,
0.0033167165238410234,
0.021464534103870392,
0.12335419654846191,
0.02086433209478855,
-0.09718285501003265,
0.06433156132698059,
-0.018372993916273117,
-0.04663722217082977,
0.021589873358607292,
-0.08599397540092468,
-0.05255299434065819,
-0.04116356745362282,
-0.0004595518112182617,
0.010893022641539574,
-0.007859989069402218,
-0.00299062323756516,
-0.022067977115511894,
-0.03804551437497139,
-0.08351439237594604,
-0.04866768419742584,
-0.05508197098970413,
-0.12509959936141968,
0.006257941480726004,
-0.17262378334999084,
-0.03893556073307991,
-0.11236986517906189,
-0.1938975751399994,
-0.021898427978157997,
0.06205449253320694,
-0.014098260551691055,
-0.05660638585686684,
0.08070670068264008,
0.04482733830809593,
-0.029316119849681854,
-0.001944114570505917,
0.07422341406345367,
-0.0002914000942837447,
0.04655337333679199,
-0.025828639045357704,
0.07168535143136978,
0.002399999648332596,
0.031145893037319183,
-0.05669315159320831,
0.06491336971521378,
-0.17527799308300018,
0.04659460484981537,
-0.07263485342264175,
-0.030521512031555176,
-0.08766429871320724,
-0.0391136072576046,
-0.0168597474694252,
-0.00015494960825890303,
0.02608232945203781,
0.0733700767159462,
-0.18392471969127655,
-0.025786127895116806,
0.12090510874986649,
-0.16406220197677612,
-0.021985279396176338,
0.07878635078668594,
-0.04564598202705383,
0.10296198725700378,
0.0684865191578865,
0.1579592525959015,
-0.009292328730225563,
-0.07694955915212631,
0.06016143411397934,
-0.012123963795602322,
0.012106272391974926,
-0.011335184797644615,
0.06907998770475388,
-0.019645199179649353,
-0.15809421241283417,
0.035163529217243195,
-0.12295747548341751,
-0.006014388985931873,
-0.08076708018779755,
0.015038210898637772,
-0.01335362158715725,
-0.06695593148469925,
-0.06853939592838287,
-0.024458108469843864,
0.06697350740432739,
-0.07395246624946594,
-0.015234931372106075,
0.02864796482026577,
0.07270790636539459,
-0.07013970613479614,
0.06727433949708939,
-0.012136297300457954,
0.014052298851311207,
-0.08347704261541367,
-0.03840677812695503,
-0.18530744314193726,
0.052144307643175125,
0.10194814950227737,
0.013059590011835098,
-0.02312242053449154,
0.1382260024547577,
0.008914907462894917,
0.06828103959560394,
-0.05356458202004433,
0.011555391363799572,
-0.013857691548764706,
-0.004717587027698755,
-0.09212367981672287,
-0.09312348812818527,
-0.08036155253648758,
-0.06602603197097778,
0.08575298637151718,
-0.12871545553207397,
0.021663496270775795,
-0.058177411556243896,
0.04558591544628143,
0.021444128826260567,
-0.08588847517967224,
-0.020339349284768105,
0.012250293046236038,
-0.0588165782392025,
-0.058009784668684006,
0.039303235709667206,
0.06733003258705139,
-0.015036394819617271,
0.08339636772871017,
-0.04749231040477753,
-0.07642822712659836,
0.029275059700012207,
0.09775597602128983,
-0.1054614931344986,
0.00630564708262682,
-0.058122698217630386,
-0.04442520812153816,
-0.06742590665817261,
-0.02339034341275692,
0.0704602599143982,
-0.008987768553197384,
0.13625140488147736,
-0.07673066109418869,
0.0005359433707781136,
0.01571374014019966,
-0.020332958549261093,
-0.021495379507541656,
0.033755358308553696,
0.06443220376968384,
-0.08460365235805511,
0.013483538292348385,
0.04201148822903633,
0.013443982228636742,
0.07011934369802475,
-0.05193600803613663,
-0.08726628124713898,
0.01305188238620758,
0.03644708916544914,
0.03051675297319889,
0.06827719509601593,
-0.02192496880888939,
-0.007258365396410227,
0.03293953835964203,
0.020556969568133354,
0.004809974692761898,
-0.11854034662246704,
0.06050930172204971,
0.056277915835380554,
0.0025519132614135742,
0.05915972590446472,
-0.016128508374094963,
-0.03999815136194229,
0.07956073433160782,
0.035070277750492096,
-0.001344233169220388,
-0.013354836031794548,
-0.015121109783649445,
-0.11640509963035583,
0.19134528934955597,
-0.057527340948581696,
-0.15897858142852783,
-0.07880166918039322,
-0.09703797847032547,
0.0015900193247944117,
0.026835111901164055,
0.038125649094581604,
-0.016839411109685898,
-0.043795548379421234,
-0.12582331895828247,
0.06062678247690201,
-0.041691675782203674,
0.0704488456249237,
0.11236368119716644,
-0.04086023196578026,
0.058303650468587875,
-0.12682759761810303,
-0.007669312413781881,
-0.082848459482193,
-0.07852188497781754,
0.06018118932843208,
-0.05159805715084076,
0.026105303317308426,
0.09571599215269089,
0.024151425808668137,
-0.01674557290971279,
-0.026185212656855583,
0.21034111082553864,
0.04003996029496193,
0.043333590030670166,
0.1284070461988449,
-0.06045287847518921,
0.05688650906085968,
0.08322811871767044,
0.01056812983006239,
-0.04867510870099068,
0.05398014932870865,
0.043000727891922,
-0.07078543305397034,
-0.18911176919937134,
-0.024936001747846603,
-0.008931463584303856,
-0.037965718656778336,
0.07765223830938339,
0.0372089222073555,
0.009738697670400143,
0.06845338642597198,
0.011965685524046421,
0.05990634858608246,
-0.0001683391019469127,
0.09887077659368515,
0.01590249128639698,
-0.03269093483686447,
0.09062616527080536,
-0.016569945961236954,
-0.007815293967723846,
0.08294986933469772,
-0.018847158178687096,
0.29251372814178467,
-0.029436631128191948,
0.010322373360395432,
0.11532279849052429,
0.05071480572223663,
0.06373674422502518,
0.1277882307767868,
-0.06709874421358109,
0.01949000172317028,
-0.07282643765211105,
-0.057974010705947876,
-0.0025220357347279787,
0.04682895168662071,
-0.053695984184741974,
0.013012869283556938,
-0.07376028597354889,
0.018570542335510254,
-0.02186531014740467,
0.3059384822845459,
0.1103467121720314,
-0.10473904758691788,
-0.05662547051906586,
0.005296187475323677,
-0.10194890946149826,
-0.07318742573261261,
0.04210023954510689,
0.07012231647968292,
-0.1371992975473404,
0.007162376306951046,
-0.023949040099978447,
0.07717234641313553,
-0.014951308257877827,
0.01741650141775608,
0.02547277882695198,
0.03318362683057785,
-0.03847736492753029,
0.00948772020637989,
-0.1741926074028015,
0.19708286225795746,
0.006332527846097946,
0.020890142768621445,
-0.05258249118924141,
0.03053303062915802,
0.003671694314107299,
-0.02943030185997486,
0.06240715831518173,
0.023436633870005608,
-0.02844201773405075,
-0.05120834708213806,
-0.05409253016114235,
0.011099213734269142,
0.07940400391817093,
-0.041462138295173645,
0.1027432233095169,
-0.005559443961828947,
0.0422728955745697,
0.01985306106507778,
0.09979970753192902,
-0.1847659796476364,
-0.08603744953870773,
0.029612984508275986,
-0.057673752307891846,
-0.09434189647436142,
-0.08150975406169891,
-0.09251929074525833,
0.005439646542072296,
0.25183501839637756,
-0.11455333977937698,
-0.07097144424915314,
-0.09598080068826675,
0.03564347326755524,
0.10939516127109528,
-0.04742012172937393,
0.028008664026856422,
-0.0037973697762936354,
0.12327079474925995,
-0.0632820874452591,
-0.13034167885780334,
0.024792322888970375,
-0.09119518846273422,
-0.16915881633758545,
-0.06685402989387512,
0.1165158674120903,
0.05695776268839836,
0.03560435771942139,
-0.023687828332185745,
0.022501619532704353,
0.036784183233976364,
-0.040004145354032516,
0.00039878912502899766,
0.06856733560562134,
0.08718032389879227,
0.030378254130482674,
-0.10259026288986206,
0.024335583671927452,
-0.06035975366830826,
-0.06736433506011963,
0.07453055679798126,
0.26964980363845825,
-0.05730697512626648,
0.12652269005775452,
0.11599468439817429,
-0.07768358290195465,
-0.15162430703639984,
0.02885725349187851,
0.09366770088672638,
-0.015314583666622639,
0.013710986822843552,
-0.15275254845619202,
0.09001165628433228,
0.11733020097017288,
-0.02552991360425949,
0.0022775984834879637,
-0.19169673323631287,
-0.1325278878211975,
0.06448224186897278,
0.09537141025066376,
0.28095605969429016,
-0.059470370411872864,
-0.04242495074868202,
0.022056560963392258,
-0.09359773993492126,
0.018654605373740196,
0.12592948973178864,
0.06462283432483673,
-0.026842394843697548,
-0.08522571623325348,
0.015581552870571613,
-0.04091598093509674,
0.09745840728282928,
0.051789093762636185,
0.056150224059820175,
-0.002329963957890868,
0.02519816718995571,
-0.016286298632621765,
-0.04164818674325943,
0.058587025851011276,
0.027848925441503525,
0.05038353055715561,
-0.08541155606508255,
-0.031188400462269783,
-0.06867974996566772,
0.025340711697936058,
-0.024804504588246346,
-0.07885877788066864,
-0.059382256120443344,
0.07741470634937286,
0.04948372393846512,
-0.025966599583625793,
0.02230698987841606,
0.031224779784679413,
0.11914175003767014,
0.17708036303520203,
-0.003987777046859264,
-0.05402299389243126,
-0.05482269823551178,
-0.03665783256292343,
-0.0188751183450222,
0.07515853643417358,
-0.04137728735804558,
0.025982297956943512,
0.06507773697376251,
0.02343771420419216,
0.095729760825634,
0.057393334805965424,
-0.11279308050870895,
-0.015900276601314545,
0.03380448743700981,
-0.16027812659740448,
0.003105367999523878,
-0.0010101039661094546,
0.027960658073425293,
-0.035608697682619095,
0.02745448239147663,
0.1469423472881317,
-0.06582069396972656,
-0.034489136189222336,
-0.04002630338072777,
0.06717374175786972,
0.020733511075377464,
0.14424635469913483,
0.03101522848010063,
0.037924256175756454,
-0.08074857294559479,
0.12237415462732315,
0.04268236830830574,
-0.04127142205834389,
0.01768428273499012,
-0.02352769672870636,
-0.10869996249675751,
0.01563108153641224,
0.05286870524287224,
0.03321170434355736,
-0.05253926292061806,
-0.0069795590825378895,
-0.022656835615634918,
-0.07233278453350067,
0.06032051146030426,
0.1773519366979599,
0.06765821576118469,
0.07181231677532196,
-0.05580753833055496,
-0.03688006103038788,
-0.07736706733703613,
0.041062477976083755,
0.04579833149909973,
0.07565908133983612,
-0.07461314648389816,
0.09877560287714005,
0.01011673454195261,
0.04292883723974228,
-0.032179154455661774,
-0.057034946978092194,
-0.09971578419208527,
-0.05338427051901817,
-0.09985550493001938,
0.005183866247534752,
-0.07330697029829025,
-0.03972151130437851,
-0.0002449587336741388,
-0.009190267883241177,
-0.007215328514575958,
0.04775887355208397,
-0.06086565926671028,
-0.011370724067091942,
-0.026133045554161072,
0.03225705027580261,
-0.06380727142095566,
-0.03681420907378197,
0.032050617039203644,
-0.09896961599588394,
0.09389592707157135,
0.0487777478992939,
0.008301369845867157,
0.003849123837426305,
0.09393708407878876,
-0.019180845469236374,
0.02401876263320446,
0.01580270379781723,
-0.04626794159412384,
-0.07952478528022766,
0.0021651452407240868,
-0.006623529829084873,
-0.018150722607970238,
-0.009677025489509106,
0.08694203943014145,
-0.09053508937358856,
0.029373034834861755,
-0.011354882270097733,
-0.012164290994405746,
-0.0736084058880806,
-0.011939935386180878,
0.09355591237545013,
0.09869630634784698,
0.044765982776880264,
-0.08892785757780075,
0.01140669733285904,
-0.14588914811611176,
-0.036454297602176666,
0.0069810859858989716,
-0.009201283566653728,
-0.11808017641305923,
-0.009201153181493282,
0.022135702893137932,
0.00015178488683886826,
0.20702742040157318,
-0.058627016842365265,
-0.017986731603741646,
0.01947944052517414,
-0.10172797739505768,
0.11250166594982147,
-0.02674754150211811,
0.19114424288272858,
-0.005347456783056259,
-0.04091045260429382,
-0.011579973623156548,
0.03771582618355751,
0.020696617662906647,
-0.012362537905573845,
0.17956481873989105,
0.13628965616226196,
0.026203617453575134,
0.038562823086977005,
-0.028323635458946228,
0.0015729174483567476,
-0.05784163251519203,
-0.029057910665869713,
0.031159566715359688,
0.039278335869312286,
0.01574016362428665,
0.14380520582199097,
0.07180483639240265,
-0.16794061660766602,
0.0323757641017437,
-0.025676129385828972,
-0.03976280242204666,
-0.11734884232282639,
-0.09855420887470245,
-0.03413240611553192,
-0.07893415540456772,
0.011593577452003956,
-0.12541690468788147,
0.011445575393736362,
0.1686071753501892,
0.05821143835783005,
0.027220387011766434,
0.005357679910957813,
-0.11800223588943481,
-0.03178885951638222,
0.05180635303258896,
0.014307031407952309,
0.026269348338246346,
0.06015540659427643,
-0.0031775750685483217,
0.0576837956905365,
0.036105383187532425,
0.013380811549723148,
0.0017169472994282842,
0.075642429292202,
0.014422360807657242,
0.042948536574840546,
-0.06160784140229225,
-0.00546296127140522,
-0.04238762706518173,
0.07014581561088562,
0.1037849634885788,
0.04841465502977371,
-0.047442782670259476,
-0.009982410818338394,
0.1665496826171875,
-0.04305369034409523,
-0.004067740403115749,
-0.12545724213123322,
0.3410423994064331,
0.015106006525456905,
0.013602448627352715,
0.04846761003136635,
-0.07702268660068512,
-0.051051750779151917,
0.202255517244339,
0.08738653361797333,
-0.016416743397712708,
-0.021230969578027725,
0.0012450519716367126,
-0.03235650807619095,
-0.02064543403685093,
0.15287356078624725,
0.03305767476558685,
0.1297779679298401,
-0.054963741451501846,
-0.04134974628686905,
-0.02810685522854328,
-0.012865900062024593,
-0.12568184733390808,
0.13860809803009033,
-0.026252493262290955,
-0.023047158494591713,
-0.07607588171958923,
0.028159918263554573,
0.07196906954050064,
-0.30975469946861267,
0.0021015664096921682,
-0.034096311777830124,
-0.10783995687961578,
-0.0040297554805874825,
-0.01976947858929634,
-0.02412191964685917,
0.04678108170628548,
-0.04688480496406555,
0.07472220063209534,
0.03453904762864113,
0.03335811197757721,
-0.022181009873747826,
-0.09559065103530884,
0.16549164056777954,
0.05069447681307793,
0.09468475729227066,
0.026400139555335045,
0.07588759064674377,
0.056774500757455826,
0.03529311344027519,
-0.0979287251830101,
0.04386468604207039,
0.013317401520907879,
-0.08462855964899063,
-0.05321425944566727,
0.1271381974220276,
-0.00026165583403781056,
0.046447064727544785,
0.04035774990916252,
-0.11347565054893494,
0.009305860847234726,
0.07222919166088104,
-0.06328085064888,
-0.09821818023920059,
-0.007833615876734257,
-0.0864989310503006,
0.15816940367221832,
0.13936001062393188,
-0.019229214638471603,
0.021014846861362457,
-0.06655329465866089,
-0.0064065600745379925,
0.05288375914096832,
0.00931551679968834,
-0.01883723959326744,
-0.1935635656118393,
0.03125988319516182,
-0.08067101240158081,
-0.007971717976033688,
-0.21980297565460205,
-0.10593923181295395,
-0.012508139945566654,
-0.051143597811460495,
-0.027287937700748444,
0.0638560876250267,
0.030451538041234016,
0.062128808349370956,
-0.014878842979669571,
-0.042270541191101074,
-0.03035890683531761,
0.08877259492874146,
-0.10897696018218994,
-0.06440648436546326
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
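A typical use of these intermediate checkpoints is to track how representations change over the course of pre-training. The sketch below is ours and purely illustrative: it compares this 800k-step checkpoint against the 700k-step one from the same seed by computing a cosine similarity between mean-pooled sentence representations; the probe sentence and the pooling strategy are arbitrary choices, not a recommended evaluation protocol.
```
import torch
from transformers import BertTokenizer, BertModel

checkpoints = ["google/multiberts-seed_2-step_700k", "google/multiberts-seed_2-step_800k"]
tokenizer = BertTokenizer.from_pretrained(checkpoints[0])  # both checkpoints share the same vocabulary
encoded_input = tokenizer("MultiBERTs supports robustness analysis.", return_tensors="pt")

pooled = []
for name in checkpoints:
    model = BertModel.from_pretrained(name)
    model.eval()
    with torch.no_grad():
        # Mean-pool the final hidden states as a crude sentence representation.
        pooled.append(model(**encoded_input).last_hidden_state.mean(dim=1))

similarity = torch.nn.functional.cosine_similarity(pooled[0], pooled[1])
print(f"Cosine similarity, step 700k vs. 800k: {similarity.item():.3f}")
```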
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_800k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_800k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_800k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
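Alternatively, the masked-LM head can be queried through the `fill-mask` pipeline, as in the sketch below. This assumes the checkpoint includes MLM head weights; if it does not, transformers will initialize that head randomly and the predictions will be meaningless.
```
from transformers import pipeline

unmasker = pipeline("fill-mask", model="google/multiberts-seed_2-step_800k")
for prediction in unmasker("Pre-training aims to learn general [MASK] representations."):
    print(prediction["token_str"], round(prediction["score"], 4))
```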
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_800k"]}
| null |
google/multiberts-seed_2-step_800k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_800k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 800k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07795865088701248,
0.10017549246549606,
-0.002459521871060133,
0.044920358806848526,
0.07979831099510193,
-0.014886138960719109,
0.07873528450727463,
0.10272220522165298,
-0.009130161255598068,
0.026748036965727806,
0.07706636190414429,
-0.0023614380042999983,
0.018606780096888542,
0.09543779492378235,
0.025412479415535927,
-0.22401325404644012,
0.019005373120307922,
-0.030584247782826424,
-0.07963760197162628,
0.07497711479663849,
0.09930697828531265,
-0.07990116626024246,
0.04620455950498581,
0.027605518698692322,
-0.11239229142665863,
0.04850064590573311,
0.0016614506021142006,
-0.019424110651016235,
0.13371044397354126,
-0.0015258552739396691,
0.05090019479393959,
0.05388348549604416,
0.04710803180932999,
-0.13409684598445892,
0.00553638581186533,
0.05756421014666557,
0.05932437628507614,
0.044468414038419724,
0.024985671043395996,
0.07828373461961746,
0.008918030187487602,
0.031917914748191833,
0.045869819819927216,
0.023909855633974075,
-0.07412910461425781,
-0.059896960854530334,
-0.10163693130016327,
0.03720293939113617,
0.03219130262732506,
0.006409154273569584,
0.011801844462752342,
0.12388776987791061,
-0.03318396583199501,
0.04469762369990349,
0.17585009336471558,
-0.334657222032547,
-0.011073137633502483,
0.06745818257331848,
0.036696482449769974,
0.12972399592399597,
-0.003485289169475436,
-0.021605638787150383,
0.075822614133358,
0.02547849342226982,
0.08514715731143951,
-0.03898893669247627,
0.023627351969480515,
-0.055443234741687775,
-0.15383082628250122,
-0.04831305146217346,
0.08551919460296631,
-0.0016790780937299132,
-0.1366545855998993,
-0.03470171242952347,
-0.045851871371269226,
0.028798116371035576,
0.013524342328310013,
-0.035373929888010025,
0.03428145498037338,
0.01710014045238495,
-0.01899939775466919,
-0.008876440115272999,
-0.1042729914188385,
-0.05323752760887146,
0.030183950439095497,
0.07200130075216293,
0.10050928592681885,
0.07004929333925247,
0.0033716815523803234,
0.11376836150884628,
-0.19242604076862335,
-0.0521455779671669,
-0.028558526188135147,
-0.050094544887542725,
-0.04783295467495918,
-0.007972914725542068,
-0.10662991553544998,
-0.03972777724266052,
0.008525540120899677,
0.13252496719360352,
0.00011136232205899432,
0.027660952880978584,
-0.032971590757369995,
0.010764415375888348,
0.055076684802770615,
0.04670475050806999,
-0.006076111923903227,
0.01661493070423603,
0.021572822704911232,
-0.011108442209661007,
-0.020139675587415695,
0.013341199606657028,
0.00045960984425619245,
0.026736384257674217,
0.12422212958335876,
0.02102317102253437,
-0.09915325790643692,
0.06153026968240738,
-0.01929778978228569,
-0.045278314501047134,
0.014621232636272907,
-0.0904083326458931,
-0.0545741431415081,
-0.04143975302577019,
0.0011106301099061966,
0.014056708663702011,
-0.002459207084029913,
-0.0012290301965549588,
-0.02350897155702114,
-0.03742160648107529,
-0.08178972452878952,
-0.04698490723967552,
-0.0544559583067894,
-0.12338615953922272,
0.0083128921687603,
-0.17912061512470245,
-0.03518923744559288,
-0.11578859388828278,
-0.184429332613945,
-0.022642791271209717,
0.06380607932806015,
-0.011477293446660042,
-0.05655268579721451,
0.07885047048330307,
0.04527612775564194,
-0.029912998899817467,
-0.0018532893154770136,
0.07151433825492859,
-0.0011255437275394797,
0.044908009469509125,
-0.027628431096673012,
0.06915143877267838,
0.00022854212147649378,
0.03478794917464256,
-0.05481364578008652,
0.06253755837678909,
-0.17243634164333344,
0.04444563761353493,
-0.06988947093486786,
-0.028692713007330894,
-0.08470693975687027,
-0.03389269486069679,
-0.0075932159088552,
0.004803329240530729,
0.024088766425848007,
0.07258135080337524,
-0.17958518862724304,
-0.027601853013038635,
0.11717697978019714,
-0.16326594352722168,
-0.02063952013850212,
0.07597315311431885,
-0.04609496146440506,
0.09938044100999832,
0.06943118572235107,
0.15728478133678436,
-0.00755218043923378,
-0.07415279000997543,
0.057147108018398285,
-0.01257978193461895,
0.01461963914334774,
-0.01292907353490591,
0.07076425850391388,
-0.020771939307451248,
-0.15305370092391968,
0.03628844395279884,
-0.12574395537376404,
-0.0037134329322725534,
-0.07781999558210373,
0.016833830624818802,
-0.011879636906087399,
-0.06766264885663986,
-0.07376686483621597,
-0.02539808861911297,
0.06595762073993683,
-0.07344938814640045,
-0.0167216956615448,
0.0315554179251194,
0.07306286692619324,
-0.06915849447250366,
0.06674370169639587,
-0.013604067265987396,
0.010682294145226479,
-0.08391053229570389,
-0.04134433716535568,
-0.1883372962474823,
0.05231447145342827,
0.10061387717723846,
0.011241662316024303,
-0.022255336865782738,
0.1438044309616089,
0.006713700480759144,
0.06806362420320511,
-0.04874837026000023,
0.013007179833948612,
-0.012748334556818008,
-0.005345264915376902,
-0.09385295957326889,
-0.09261934459209442,
-0.07631997764110565,
-0.06725131720304489,
0.08685004711151123,
-0.12570445239543915,
0.02071450464427471,
-0.05493897199630737,
0.04429600387811661,
0.021315494552254677,
-0.08462920784950256,
-0.021335074678063393,
0.013722863979637623,
-0.06093636527657509,
-0.058138418942689896,
0.03884138911962509,
0.06985358893871307,
-0.012972361408174038,
0.08667848259210587,
-0.050351861864328384,
-0.07944436371326447,
0.02944941259920597,
0.10157079249620438,
-0.10387048870325089,
0.011030128225684166,
-0.05740297958254814,
-0.04515961557626724,
-0.06380784511566162,
-0.020294586196541786,
0.08413465321063995,
-0.007388273719698191,
0.13813553750514984,
-0.0733087807893753,
0.000349913549143821,
0.013553298078477383,
-0.020770350471138954,
-0.02098909020423889,
0.037536464631557465,
0.06428245455026627,
-0.06961847096681595,
0.01264993380755186,
0.038862429559230804,
0.011537105776369572,
0.06978517770767212,
-0.05278858542442322,
-0.08939837664365768,
0.01417178101837635,
0.03543570637702942,
0.030383216217160225,
0.06619390845298767,
-0.01940336264669895,
-0.010776326060295105,
0.034796711057424545,
0.017843596637248993,
0.007693050894886255,
-0.11997808516025543,
0.06250371783971786,
0.057794515043497086,
0.005795767065137625,
0.0599357895553112,
-0.021374734118580818,
-0.039478614926338196,
0.0809069573879242,
0.03386451303958893,
-0.00214900029823184,
-0.015420256182551384,
-0.016476335003972054,
-0.11871649324893951,
0.18947046995162964,
-0.058644987642765045,
-0.15863613784313202,
-0.08308495581150055,
-0.1005496084690094,
0.005328436382114887,
0.027210189029574394,
0.03867700323462486,
-0.01646731048822403,
-0.04388370364904404,
-0.12637509405612946,
0.05755047872662544,
-0.03868819773197174,
0.06871671229600906,
0.11369810253381729,
-0.039343006908893585,
0.057380110025405884,
-0.12524394690990448,
-0.009529118426144123,
-0.0807403028011322,
-0.08023106306791306,
0.06284582614898682,
-0.05097179114818573,
0.027172474190592766,
0.10005021840333939,
0.023367566987872124,
-0.019296560436487198,
-0.02347881719470024,
0.19675567746162415,
0.03948666900396347,
0.042061906307935715,
0.12842635810375214,
-0.06717918813228607,
0.057010527700185776,
0.08260064572095871,
0.009427475742995739,
-0.04437590762972832,
0.05326176807284355,
0.043067675083875656,
-0.07039602100849152,
-0.1935032457113266,
-0.023233380168676376,
-0.005568249151110649,
-0.041504424065351486,
0.07892435044050217,
0.037781260907649994,
0.00027308150311000645,
0.06958410888910294,
0.01418516505509615,
0.061420030891895294,
-0.005938893649727106,
0.09547930210828781,
0.005107999313622713,
-0.03289637714624405,
0.08796341717243195,
-0.02006630226969719,
-0.009562408551573753,
0.08517372608184814,
-0.01954544335603714,
0.2924562692642212,
-0.030808109790086746,
0.008459394797682762,
0.11729109287261963,
0.044548362493515015,
0.06301642954349518,
0.12719844281673431,
-0.06578308343887329,
0.022235549986362457,
-0.07318804413080215,
-0.057867515832185745,
-0.0054624187760055065,
0.04956973344087601,
-0.05757971480488777,
0.01422075368463993,
-0.07559406012296677,
0.02383514493703842,
-0.023243345320224762,
0.31177085638046265,
0.10940524190664291,
-0.10595192015171051,
-0.05475863441824913,
0.006423871964216232,
-0.1020919606089592,
-0.07317151874303818,
0.044487450271844864,
0.07342374324798584,
-0.1363033652305603,
0.005696517415344715,
-0.024192078039050102,
0.07621217519044876,
-0.02228586934506893,
0.016946034505963326,
0.0298270583152771,
0.03248339146375656,
-0.03634707257151604,
0.010083041153848171,
-0.18348413705825806,
0.19257651269435883,
0.006802674382925034,
0.01906382106244564,
-0.054310575127601624,
0.033838097006082535,
0.006553566548973322,
-0.03304343298077583,
0.0617709681391716,
0.02382342331111431,
-0.0370512492954731,
-0.05020514503121376,
-0.05458715930581093,
0.01530187577009201,
0.07599722594022751,
-0.04526030644774437,
0.10636461526155472,
-0.0049298591911792755,
0.04424746334552765,
0.01872854493558407,
0.08715430647134781,
-0.17773833870887756,
-0.09127039462327957,
0.031229430809617043,
-0.055487170815467834,
-0.10290651768445969,
-0.08154351264238358,
-0.09238763153553009,
0.00762104382738471,
0.2454972118139267,
-0.12357068806886673,
-0.07589682936668396,
-0.09490934014320374,
0.029004015028476715,
0.10475485026836395,
-0.05023130774497986,
0.02679774910211563,
-0.0034809119533747435,
0.1275341659784317,
-0.06639364361763,
-0.13083462417125702,
0.0234429519623518,
-0.09058564156293869,
-0.16678790748119354,
-0.065411776304245,
0.11562364548444748,
0.059095919132232666,
0.03553842008113861,
-0.0283872839063406,
0.023578975349664688,
0.03396547585725784,
-0.037553850561380386,
-0.00005697897358913906,
0.07290121167898178,
0.0902927964925766,
0.03299722447991371,
-0.11000636965036392,
0.02215469256043434,
-0.06317085027694702,
-0.06801694631576538,
0.07550939172506332,
0.26573264598846436,
-0.05617472529411316,
0.12459015846252441,
0.11478910595178604,
-0.08044492453336716,
-0.15153659880161285,
0.028531435877084732,
0.09315171837806702,
-0.014629945158958435,
0.019618866965174675,
-0.15743878483772278,
0.08931469172239304,
0.11426621675491333,
-0.023558344691991806,
0.012899091467261314,
-0.18598483502864838,
-0.12931117415428162,
0.0693679079413414,
0.09412888437509537,
0.2766970694065094,
-0.06142229214310646,
-0.04313862696290016,
0.017450634390115738,
-0.09882202744483948,
0.011964193545281887,
0.12497851997613907,
0.06348271667957306,
-0.02654503844678402,
-0.07517655193805695,
0.016643650829792023,
-0.04058860242366791,
0.09487121552228928,
0.05576604977250099,
0.05661403387784958,
-0.004413480404764414,
0.01413052063435316,
-0.02557467669248581,
-0.04460638388991356,
0.06129235029220581,
0.020885387435555458,
0.04844823479652405,
-0.09129175543785095,
-0.02897987887263298,
-0.06624474376440048,
0.02641688473522663,
-0.02395624853670597,
-0.07603929191827774,
-0.057423219084739685,
0.0743885263800621,
0.048122942447662354,
-0.024849828332662582,
0.019494134932756424,
0.031463779509067535,
0.11388813704252243,
0.17544133961200714,
-0.009125945158302784,
-0.05463562160730362,
-0.054602865129709244,
-0.040573712438344955,
-0.017584921792149544,
0.07294107228517532,
-0.04717937856912613,
0.026006074622273445,
0.06685762107372284,
0.022397339344024658,
0.09606866538524628,
0.05637132748961449,
-0.11467959731817245,
-0.01558410283178091,
0.03384312614798546,
-0.1614021360874176,
0.012030087411403656,
-0.0025912977289408445,
0.03625108674168587,
-0.03500954061746597,
0.025249771773815155,
0.1505075842142105,
-0.06611211597919464,
-0.03503541275858879,
-0.04084485024213791,
0.06897842884063721,
0.02234085649251938,
0.1404474377632141,
0.03329815715551376,
0.03736409917473793,
-0.08132165670394897,
0.12782034277915955,
0.04085804894566536,
-0.03840162605047226,
0.02079586870968342,
-0.027278557419776917,
-0.1071380153298378,
0.014677267521619797,
0.05785079300403595,
0.04156617075204849,
-0.052407655864953995,
-0.010770873166620731,
-0.02822897769510746,
-0.07299593091011047,
0.059163980185985565,
0.17247560620307922,
0.06738370656967163,
0.07241511344909668,
-0.05531396344304085,
-0.038311708718538284,
-0.07878309488296509,
0.04224037006497383,
0.04297280311584473,
0.07579540461301804,
-0.07425102591514587,
0.10375764220952988,
0.011368675157427788,
0.04536285251379013,
-0.030182190239429474,
-0.05410128831863403,
-0.09698719531297684,
-0.05373072251677513,
-0.0988459512591362,
0.00809475313872099,
-0.07042072713375092,
-0.0401487797498703,
0.0015891859075054526,
-0.007230447139590979,
-0.008630597032606602,
0.04915884882211685,
-0.06209687143564224,
-0.010632326826453209,
-0.028494108468294144,
0.03403955698013306,
-0.061105772852897644,
-0.039354123175144196,
0.03208254650235176,
-0.10023403912782669,
0.09274349361658096,
0.04794251546263695,
0.006466289050877094,
0.005689030978828669,
0.09322507679462433,
-0.021321596577763557,
0.022525735199451447,
0.015539605170488358,
-0.04740108549594879,
-0.07868454605340958,
0.0008317606407217681,
-0.009256057441234589,
-0.016104944050312042,
-0.011928093619644642,
0.08753009885549545,
-0.08908899128437042,
0.03034571185708046,
-0.008045812137424946,
-0.010648086667060852,
-0.07460685819387436,
-0.012335155159235,
0.09676594287157059,
0.09757831692695618,
0.04650966450572014,
-0.08911214023828506,
0.013522215187549591,
-0.1425430327653885,
-0.036539603024721146,
0.007696434389799833,
-0.005841141566634178,
-0.11788859218358994,
-0.010954122990369797,
0.01951928436756134,
-0.0016973215388134122,
0.21036213636398315,
-0.05790306627750397,
-0.018869977444410324,
0.018907485529780388,
-0.09683068096637726,
0.1105748862028122,
-0.02531418949365616,
0.18367864191532135,
-0.007954854518175125,
-0.040429048240184784,
-0.016572782769799232,
0.035528477281332016,
0.021159565076231956,
-0.021352795884013176,
0.17954562604427338,
0.140731543302536,
0.031053662300109863,
0.040579501539468765,
-0.02307013049721718,
0.0032295172568410635,
-0.05405537039041519,
-0.029246829450130463,
0.032191529870033264,
0.03945958614349365,
0.018624376505613327,
0.1529400646686554,
0.06971745193004608,
-0.16619274020195007,
0.0326557382941246,
-0.02709367498755455,
-0.035821184515953064,
-0.11532194167375565,
-0.0880294069647789,
-0.03444035351276398,
-0.07315465807914734,
0.00881287269294262,
-0.12202417105436325,
0.01032257080078125,
0.1841767579317093,
0.05521402880549431,
0.025242013856768608,
0.003160387510433793,
-0.12496893107891083,
-0.0324123278260231,
0.05459130182862282,
0.014398148283362389,
0.02667539194226265,
0.060132455080747604,
-0.002211114391684532,
0.061084069311618805,
0.0418265275657177,
0.015304440632462502,
0.0013971728039905429,
0.07870461046695709,
0.015910334885120392,
0.041941553354263306,
-0.06254716962575912,
-0.005047778133302927,
-0.04458343982696533,
0.06923208385705948,
0.09398084878921509,
0.047637972980737686,
-0.04695539176464081,
-0.008558633737266064,
0.1635345220565796,
-0.04456378147006035,
-0.0007444403017871082,
-0.1245921403169632,
0.3307679295539856,
0.013626813888549805,
0.013549688272178173,
0.04558306559920311,
-0.07848861068487167,
-0.050072409212589264,
0.20052485167980194,
0.08292531967163086,
-0.021343480795621872,
-0.02271416038274765,
-0.001458655227907002,
-0.030300822108983994,
-0.021443668752908707,
0.15100513398647308,
0.03304934501647949,
0.12643857300281525,
-0.05319884046912193,
-0.046902891248464584,
-0.028194565325975418,
-0.009582455269992352,
-0.12540452182292938,
0.13894149661064148,
-0.029696697369217873,
-0.021829448640346527,
-0.07454455643892288,
0.02393721230328083,
0.0768689513206482,
-0.31461694836616516,
0.0008769417181611061,
-0.03627864271402359,
-0.11080305278301239,
-0.0011711270781233907,
-0.01324489712715149,
-0.021798523142933846,
0.0479641817510128,
-0.04766710475087166,
0.07326070964336395,
0.040944453328847885,
0.033775873482227325,
-0.026239609345793724,
-0.09149200469255447,
0.16162525117397308,
0.05015468969941139,
0.09587258845567703,
0.027625149115920067,
0.07521285861730576,
0.054899733513593674,
0.03666681423783302,
-0.09605950117111206,
0.0444592610001564,
0.011151129379868507,
-0.08655096590518951,
-0.05136267840862274,
0.12304837256669998,
-0.002212190767750144,
0.038458552211523056,
0.04456073045730591,
-0.10808498412370682,
0.009474719874560833,
0.07117288559675217,
-0.06772878766059875,
-0.09942232072353363,
-0.010887743905186653,
-0.0882420614361763,
0.15709416568279266,
0.14013738930225372,
-0.017645707353949547,
0.020491117611527443,
-0.06448555737733841,
-0.006973761133849621,
0.05137774348258972,
0.014226924628019333,
-0.016879916191101074,
-0.1901472955942154,
0.033127691596746445,
-0.08098115772008896,
-0.005468036979436874,
-0.22930866479873657,
-0.1017652302980423,
-0.01338765025138855,
-0.05368639901280403,
-0.02795262448489666,
0.06282074004411697,
0.03235457465052605,
0.0656861960887909,
-0.017102772369980812,
-0.04239845648407936,
-0.027652092278003693,
0.08986898511648178,
-0.1086508184671402,
-0.06549867242574692
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 80k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 80k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_80k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_80k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_80k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
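For illustration, here is a minimal sketch (not part of the original card) of how the returned `output` object can be inspected. The attribute names below are the standard `transformers` outputs of `BertModel`; the printed shapes assume the default BERT-base hidden size of 768.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_80k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_80k")
model.eval()  # disable dropout for deterministic features

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# Per-token representations: shape (1, sequence_length, 768)
token_embeddings = output.last_hidden_state
# Pooled [CLS] representation: shape (1, 768)
pooled = output.pooler_output
print(token_embeddings.shape, pooled.shape)
```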
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_80k"]}
| null |
google/multiberts-seed_2-step_80k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_80k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 80k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 80k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 80k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 80k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_80k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 80k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 80k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0785176083445549,
0.10662347823381424,
-0.002510025631636381,
0.039840009063482285,
0.07629004865884781,
-0.015839647501707077,
0.08196438103914261,
0.10342897474765778,
-0.0070805917493999004,
0.029795372858643532,
0.07741548866033554,
0.005279508884996176,
0.01777324639260769,
0.10093031078577042,
0.020843543112277985,
-0.2243741899728775,
0.020024113357067108,
-0.028525853529572487,
-0.0840054452419281,
0.07591921091079712,
0.09930308908224106,
-0.08106160908937454,
0.044349733740091324,
0.027576956897974014,
-0.112643301486969,
0.04794244095683098,
0.0024498109705746174,
-0.02022613026201725,
0.13388286530971527,
-0.00393119128420949,
0.05261637642979622,
0.05380617454648018,
0.04499572515487671,
-0.13261917233467102,
0.006689772009849548,
0.05876142904162407,
0.05887378752231598,
0.046529341489076614,
0.02661396563053131,
0.07860946655273438,
0.01468916330486536,
0.026363924145698547,
0.04565035551786423,
0.02307276241481304,
-0.07258259505033493,
-0.06866560131311417,
-0.10150548815727234,
0.04025101289153099,
0.030535178259015083,
0.003192251082509756,
0.010274517349898815,
0.12680159509181976,
-0.031835656613111496,
0.04540156573057175,
0.18205475807189941,
-0.34367090463638306,
-0.009577421471476555,
0.06743203103542328,
0.04202296584844589,
0.1272677183151245,
-0.0026684061158448458,
-0.018895193934440613,
0.07591806352138519,
0.02293599583208561,
0.08815371990203857,
-0.0400175079703331,
0.03106488473713398,
-0.053329966962337494,
-0.15527454018592834,
-0.04857486113905907,
0.09115651994943619,
0.00007711029320489615,
-0.13715475797653198,
-0.03389601781964302,
-0.04634890332818031,
0.0411316379904747,
0.011991365812718868,
-0.037343092262744904,
0.03430087864398956,
0.017145831137895584,
-0.018653450533747673,
-0.008055957965552807,
-0.10498468577861786,
-0.05305015295743942,
0.03445279970765114,
0.07418737560510635,
0.10261054337024689,
0.06779714673757553,
0.001078556990250945,
0.1107853502035141,
-0.19566753506660461,
-0.05319911241531372,
-0.025071168318390846,
-0.0505560003221035,
-0.049059633165597916,
-0.007456951774656773,
-0.10805761814117432,
-0.04068286344408989,
0.00936979427933693,
0.13152088224887848,
0.0010421575279906392,
0.027465511113405228,
-0.03683541342616081,
0.011426844634115696,
0.05654928460717201,
0.04549723118543625,
-0.004768169019371271,
0.013342357240617275,
0.02082913927733898,
-0.013414462096989155,
-0.02118644490838051,
0.013964405283331871,
-0.0017942595295608044,
0.02661064825952053,
0.12643733620643616,
0.02122962661087513,
-0.10199783742427826,
0.06358964741230011,
-0.01788889616727829,
-0.04635566473007202,
0.011691021732985973,
-0.08940430730581284,
-0.05433971434831619,
-0.038896381855010986,
-0.00011602939775912091,
0.01547228917479515,
-0.0032784370705485344,
-0.00289376312866807,
-0.023060966283082962,
-0.037625208497047424,
-0.08264505118131638,
-0.04500848054885864,
-0.05369153246283531,
-0.1273241937160492,
0.008451255969703197,
-0.18806850910186768,
-0.036009155213832855,
-0.11256144195795059,
-0.18358905613422394,
-0.023217488080263138,
0.06297487020492554,
-0.009343631565570831,
-0.05656566470861435,
0.07816535979509354,
0.04498165100812912,
-0.02864825539290905,
-0.0030360431410372257,
0.07052677869796753,
-0.0027430467307567596,
0.04480039328336716,
-0.0287186186760664,
0.0702064260840416,
0.004064807202666998,
0.035012178122997284,
-0.052772197872400284,
0.06341645121574402,
-0.17619812488555908,
0.041305698454380035,
-0.07054277509450912,
-0.0326930433511734,
-0.08689176291227341,
-0.03340577334165573,
-0.004708341788500547,
0.00551834050565958,
0.023253414779901505,
0.07117567956447601,
-0.1883993148803711,
-0.030025366693735123,
0.12732256948947906,
-0.1632806360721588,
-0.023287596181035042,
0.07608582079410553,
-0.04758249968290329,
0.10072038322687149,
0.06979615241289139,
0.15808451175689697,
-0.01657547615468502,
-0.07979682087898254,
0.056421056389808655,
-0.011080707423388958,
0.01584741473197937,
-0.01091891247779131,
0.07358932495117188,
-0.021030902862548828,
-0.152031809091568,
0.034153375774621964,
-0.12939374148845673,
-0.004286907613277435,
-0.07716058939695358,
0.019260618835687637,
-0.012828660197556019,
-0.06526271253824234,
-0.07501404732465744,
-0.02499830722808838,
0.06675603985786438,
-0.0745512992143631,
-0.02077043615281582,
0.03982878848910332,
0.07446884363889694,
-0.07324279844760895,
0.0651990994811058,
-0.016321875154972076,
0.014281578361988068,
-0.08724743872880936,
-0.040392640978097916,
-0.19000062346458435,
0.053490009158849716,
0.09994138032197952,
0.01179591566324234,
-0.020488325506448746,
0.15125145018100739,
0.007431348320096731,
0.0675048977136612,
-0.04676404967904091,
0.012335013598203659,
-0.011649712920188904,
-0.004789313301444054,
-0.09504369646310806,
-0.09972245991230011,
-0.07439195364713669,
-0.06777219474315643,
0.08638244867324829,
-0.12821942567825317,
0.020507460460066795,
-0.06033416837453842,
0.04675247147679329,
0.020424148067831993,
-0.0835554301738739,
-0.02074887417256832,
0.011449522338807583,
-0.06319472193717957,
-0.05761381983757019,
0.04085993766784668,
0.06927113980054855,
-0.012565110810101032,
0.09093889594078064,
-0.05465742573142052,
-0.07920464128255844,
0.030119160190224648,
0.09807348251342773,
-0.10253157466650009,
0.01038515754044056,
-0.05687491595745087,
-0.043997034430503845,
-0.05924086645245552,
-0.016614669933915138,
0.07946016639471054,
-0.007180025335401297,
0.1395501047372818,
-0.07514549791812897,
-0.0031396630220115185,
0.012926199473440647,
-0.021316153928637505,
-0.020842598751187325,
0.03766767680644989,
0.06717153638601303,
-0.07944049686193466,
0.013820194639265537,
0.04504258185625076,
0.009903335012495518,
0.06866425275802612,
-0.054387785494327545,
-0.09290513396263123,
0.01183418370783329,
0.03623023256659508,
0.028883425518870354,
0.0671890527009964,
-0.028085166588425636,
-0.014036273583769798,
0.036176275461912155,
0.016119861975312233,
0.006553380284458399,
-0.11765894293785095,
0.062428779900074005,
0.056440841406583786,
0.005018793512135744,
0.06050712615251541,
-0.02016076073050499,
-0.039292383939027786,
0.08214738219976425,
0.03574542701244354,
-0.006314531899988651,
-0.017011405900120735,
-0.015983061864972115,
-0.1172683835029602,
0.1897064745426178,
-0.05910596624016762,
-0.1611383557319641,
-0.07661743462085724,
-0.09903022646903992,
0.0074284374713897705,
0.02802012860774994,
0.03977298364043236,
-0.017435049638152122,
-0.045307476073503494,
-0.12411428987979889,
0.05629107356071472,
-0.04018933326005936,
0.06912749260663986,
0.11004570126533508,
-0.04060530662536621,
0.055737774819135666,
-0.12622392177581787,
-0.009892556816339493,
-0.08286423236131668,
-0.07213052362203598,
0.062348708510398865,
-0.05147587135434151,
0.02516564354300499,
0.10156147927045822,
0.02306552417576313,
-0.017311332747340202,
-0.02253751829266548,
0.19405172765254974,
0.04073410853743553,
0.03962074592709541,
0.13076777756214142,
-0.06996336579322815,
0.056217536330223083,
0.07765747606754303,
0.0071349916979670525,
-0.044518738985061646,
0.05123680830001831,
0.04588557407259941,
-0.06633491814136505,
-0.19549356400966644,
-0.021625012159347534,
-0.004512693267315626,
-0.040574125945568085,
0.07805027812719345,
0.03597210347652435,
0.0008489572210237384,
0.0702352374792099,
0.011908465065062046,
0.06028848513960838,
-0.006437975447624922,
0.09794065356254578,
0.005478166043758392,
-0.03520744666457176,
0.08982152491807938,
-0.01902858354151249,
-0.008564484305679798,
0.08491060137748718,
-0.021325981244444847,
0.28897687792778015,
-0.02726896107196808,
0.014442140236496925,
0.11945254355669022,
0.04241332411766052,
0.06226611137390137,
0.12613913416862488,
-0.06622044742107391,
0.02176743932068348,
-0.07587186992168427,
-0.06010986864566803,
-0.005284572951495647,
0.05107239633798599,
-0.059059955179691315,
0.009132534265518188,
-0.07481475174427032,
0.02282808907330036,
-0.023400425910949707,
0.3079809546470642,
0.11018358916044235,
-0.10748274624347687,
-0.055810436606407166,
0.007377409376204014,
-0.10065808892250061,
-0.07553648948669434,
0.04246217757463455,
0.07458316534757614,
-0.13363920152187347,
0.004323157481849194,
-0.025641221553087234,
0.07730981707572937,
-0.020144568756222725,
0.01729642041027546,
0.021858973428606987,
0.032347895205020905,
-0.03644493594765663,
0.011039400473237038,
-0.1834263950586319,
0.1911388635635376,
0.007210993207991123,
0.01939212530851364,
-0.052958980202674866,
0.03369676321744919,
0.007110523991286755,
-0.0315016470849514,
0.06167261675000191,
0.021554801613092422,
-0.037615206092596054,
-0.04590538889169693,
-0.05265818163752556,
0.016262825578451157,
0.0783337950706482,
-0.046855177730321884,
0.10910098999738693,
-0.006247342098504305,
0.04321196675300598,
0.02241727150976658,
0.0844556912779808,
-0.17944814264774323,
-0.08755762875080109,
0.030599519610404968,
-0.05774584040045738,
-0.09977806359529495,
-0.08068948984146118,
-0.09417150914669037,
-0.0028259053360670805,
0.25779321789741516,
-0.11964575946331024,
-0.07450064271688461,
-0.09478037804365158,
0.029252750799059868,
0.10477837920188904,
-0.050258953124284744,
0.024754652753472328,
-0.004790357314050198,
0.1313881278038025,
-0.066622793674469,
-0.13064053654670715,
0.026386896148324013,
-0.09134107828140259,
-0.16502656042575836,
-0.0663059800863266,
0.1164085790514946,
0.060561876744031906,
0.034942805767059326,
-0.026232706382870674,
0.023022638633847237,
0.03200233727693558,
-0.03500077873468399,
0.001634216052480042,
0.07234567403793335,
0.10193204879760742,
0.03064616583287716,
-0.11082015931606293,
0.02281741425395012,
-0.062211330980062485,
-0.06494118273258209,
0.07660450786352158,
0.2675759196281433,
-0.05596537888050079,
0.12389479577541351,
0.11188007146120071,
-0.08034048229455948,
-0.14738261699676514,
0.030161013826727867,
0.09273379296064377,
-0.016449809074401855,
0.017293551936745644,
-0.16059653460979462,
0.08977647125720978,
0.11211645603179932,
-0.023984385654330254,
0.010419824160635471,
-0.18355920910835266,
-0.12781786918640137,
0.07530467212200165,
0.09642956405878067,
0.2778458297252655,
-0.06334017217159271,
-0.041830357164144516,
0.016611313447356224,
-0.09221027791500092,
0.020673764869570732,
0.12099908292293549,
0.06250785291194916,
-0.02551993913948536,
-0.07670316845178604,
0.017572415992617607,
-0.04072899743914604,
0.09419264644384384,
0.054423894733190536,
0.05780034139752388,
-0.003199076745659113,
0.01759648509323597,
-0.023663897067308426,
-0.04578165337443352,
0.06397321820259094,
0.01576807163655758,
0.048032261431217194,
-0.09137140959501266,
-0.02968478947877884,
-0.06567320227622986,
0.029957110062241554,
-0.023811733350157738,
-0.07729563117027283,
-0.05620085820555687,
0.0726592168211937,
0.045258231461048126,
-0.022456275299191475,
0.028607826679944992,
0.03046216256916523,
0.11635427176952362,
0.1638249009847641,
-0.004958240315318108,
-0.04834504798054695,
-0.05907513573765755,
-0.040630243718624115,
-0.01659058965742588,
0.07478978484869003,
-0.052051741629838943,
0.02559259906411171,
0.06513801962137222,
0.022557135671377182,
0.10015961527824402,
0.05542067065834999,
-0.11655545979738235,
-0.018203875049948692,
0.0311205442994833,
-0.16475915908813477,
0.013228749856352806,
0.0000027539877009985503,
0.030604630708694458,
-0.032485876232385635,
0.02772250771522522,
0.15127189457416534,
-0.06318026036024094,
-0.03601701185107231,
-0.04196925461292267,
0.06847353279590607,
0.02498812973499298,
0.13807129859924316,
0.033908795565366745,
0.0371948704123497,
-0.08108486980199814,
0.12409787625074387,
0.03915897011756897,
-0.04350804537534714,
0.02543940208852291,
-0.029496043920516968,
-0.10761907696723938,
0.013873951509594917,
0.062415122985839844,
0.053076352924108505,
-0.051640983670949936,
-0.015036904253065586,
-0.030029328539967537,
-0.06944271177053452,
0.061951518058776855,
0.18183353543281555,
0.0687551274895668,
0.07311803847551346,
-0.05483684688806534,
-0.03807846084237099,
-0.07802483439445496,
0.04585541784763336,
0.04288752004504204,
0.07525144517421722,
-0.07598698139190674,
0.11025907844305038,
0.011827724985778332,
0.045893631875514984,
-0.03113446570932865,
-0.052959293127059937,
-0.09794475138187408,
-0.05625032261013985,
-0.10973326861858368,
0.011017042212188244,
-0.07121160626411438,
-0.041159018874168396,
0.0018437736434862018,
-0.006126889493316412,
-0.004832095000892878,
0.048539310693740845,
-0.06346943229436874,
-0.008697418496012688,
-0.02671242691576481,
0.03584699705243111,
-0.06569143384695053,
-0.03767705336213112,
0.02939169481396675,
-0.10181470215320587,
0.09411313384771347,
0.05253107473254204,
0.008290308527648449,
0.007670121267437935,
0.09018800407648087,
-0.018116241320967674,
0.0258717592805624,
0.014687332324683666,
-0.047442082315683365,
-0.08153291046619415,
0.002446927363052964,
-0.009114520624279976,
-0.015862973406910896,
-0.013790685683488846,
0.08860716223716736,
-0.08735901862382889,
0.03143066167831421,
-0.005250126123428345,
-0.012107077986001968,
-0.07376617938280106,
-0.01229031104594469,
0.09203480929136276,
0.10039554536342621,
0.04724862053990364,
-0.08622328191995621,
0.01211491972208023,
-0.14213384687900543,
-0.036568909883499146,
0.007414585445076227,
-0.006176882889121771,
-0.1161443442106247,
-0.010309633798897266,
0.018872031942009926,
-0.004549366421997547,
0.2057047337293625,
-0.05463851988315582,
-0.0176620502024889,
0.01730540581047535,
-0.10154737532138824,
0.11489934474229813,
-0.027011172845959663,
0.18427212536334991,
-0.007728375494480133,
-0.038882240653038025,
-0.016796167939901352,
0.037315499037504196,
0.022388923913240433,
-0.025274386629462242,
0.18013644218444824,
0.14007502794265747,
0.024919891729950905,
0.04114069417119026,
-0.024840379133820534,
-0.0022609310690313578,
-0.06431394815444946,
-0.024034282192587852,
0.03300604596734047,
0.04001784324645996,
0.019423089921474457,
0.16569219529628754,
0.07367967069149017,
-0.16556409001350403,
0.030350366607308388,
-0.026507925242185593,
-0.03560864180326462,
-0.117760069668293,
-0.09280126541852951,
-0.036427754908800125,
-0.0711807906627655,
0.009443629533052444,
-0.12282825261354446,
0.01101071946322918,
0.1809980273246765,
0.05600446090102196,
0.024668216705322266,
-0.0018819422693923116,
-0.1215643510222435,
-0.03547229617834091,
0.05160187557339668,
0.01609918288886547,
0.022129349410533905,
0.05815238133072853,
-0.0005810003494843841,
0.06246910244226456,
0.0400058850646019,
0.01561895851045847,
0.0004927204572595656,
0.08278844505548477,
0.016398005187511444,
0.0407230481505394,
-0.0625128373503685,
-0.005950918421149254,
-0.041927020996809006,
0.07144131511449814,
0.0960221067070961,
0.04944116994738579,
-0.04849088564515114,
-0.009004050865769386,
0.16181553900241852,
-0.0439922995865345,
0.0003424961760174483,
-0.12272458523511887,
0.3363077640533447,
0.011101054027676582,
0.013847573660314083,
0.04646369069814682,
-0.0764256939291954,
-0.04845154285430908,
0.19741584360599518,
0.07821188122034073,
-0.016199102625250816,
-0.02214623987674713,
-0.0003265415725763887,
-0.03097759746015072,
-0.024811677634716034,
0.14885768294334412,
0.03613240271806717,
0.13185939192771912,
-0.05342366173863411,
-0.04884515330195427,
-0.029384784400463104,
-0.008460171520709991,
-0.12678879499435425,
0.13428035378456116,
-0.028485151007771492,
-0.023385250940918922,
-0.07000928372144699,
0.024335971102118492,
0.07763981074094772,
-0.32134392857551575,
-0.0016822637990117073,
-0.029465069994330406,
-0.10953466594219208,
-0.001188278547488153,
-0.011409947648644447,
-0.02407068945467472,
0.046586427837610245,
-0.04845552518963814,
0.06912105530500412,
0.04321811720728874,
0.03352527320384979,
-0.02637585811316967,
-0.0895102322101593,
0.16241396963596344,
0.041248977184295654,
0.09113168716430664,
0.026560019701719284,
0.07941260933876038,
0.05512814596295357,
0.0365070216357708,
-0.09259756654500961,
0.04367459937930107,
0.01196160539984703,
-0.08544783294200897,
-0.05038594827055931,
0.1237730011343956,
-0.0025306486058980227,
0.042241085320711136,
0.04709768667817116,
-0.10686735063791275,
0.008330083452165127,
0.06704358756542206,
-0.0726986899971962,
-0.09999674558639526,
-0.008692729286849499,
-0.090293750166893,
0.15672002732753754,
0.1395953893661499,
-0.017410235479474068,
0.020274171605706215,
-0.06910963356494904,
-0.004765116143971682,
0.05418146774172783,
0.01105424202978611,
-0.017768222838640213,
-0.1878431737422943,
0.03447723016142845,
-0.07435595989227295,
-0.00471538444980979,
-0.22964343428611755,
-0.09932029247283936,
-0.013130088336765766,
-0.0506645031273365,
-0.02808554284274578,
0.056349802762269974,
0.030865482985973358,
0.06646085530519485,
-0.01899772137403488,
-0.04311387985944748,
-0.027524586766958237,
0.08987496793270111,
-0.11015897989273071,
-0.06510747969150543
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2, captured at step 900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_900k')
model = TFBertModel.from_pretrained("google/multiberts-seed_2-step_900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_900k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
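Beyond the single-sentence snippet above, the following is a minimal sketch (not part of the original card) of batched feature extraction with this checkpoint. The mean-pooling over non-padding tokens is an illustrative choice for obtaining one vector per sentence, not something prescribed by the MultiBERTs release.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2-step_900k')
model = BertModel.from_pretrained("google/multiberts-seed_2-step_900k")
model.eval()

sentences = ["The cat sat on the mat.", "MultiBERTs provides many BERT checkpoints."]
# Pad to the longest sentence in the batch and keep an attention mask.
batch = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors='pt')

with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (2, seq_len, 768)

# Average over real (non-padding) tokens to get one vector per sentence.
mask = batch['attention_mask'].unsqueeze(-1).float()
sentence_vectors = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_vectors.shape)  # torch.Size([2, 768])
```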
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2", "multiberts-seed_2-step_900k"]}
| null |
google/multiberts-seed_2-step_900k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"multiberts-seed_2-step_900k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2, captured at step 900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #multiberts-seed_2-step_900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 2, Step 900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2, captured at step 900k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07617061585187912,
0.09488324820995331,
-0.0023885273840278387,
0.041205815970897675,
0.07586707174777985,
-0.015131051652133465,
0.07499761134386063,
0.10243567824363708,
-0.010919449850916862,
0.02780413068830967,
0.07935651391744614,
0.0036214410793036222,
0.019363390281796455,
0.09721346199512482,
0.024668777361512184,
-0.22413328289985657,
0.02436479553580284,
-0.02949659526348114,
-0.08366991579532623,
0.07625768333673477,
0.10064077377319336,
-0.08110134303569794,
0.044716283679008484,
0.028872666880488396,
-0.1122996136546135,
0.04749833792448044,
0.00040676293428987265,
-0.0193055160343647,
0.13115046918392181,
-0.0025725835002958775,
0.0519581213593483,
0.05409220606088638,
0.04688495397567749,
-0.13496990501880646,
0.0063852532766759396,
0.0579523928463459,
0.060195449739694595,
0.044066984206438065,
0.022065814584493637,
0.07681363075971603,
0.003988461568951607,
0.0297707486897707,
0.04777059704065323,
0.02233518660068512,
-0.07273372262716293,
-0.06086926534771919,
-0.10301157087087631,
0.04847786948084831,
0.033669132739305496,
0.002021691994741559,
0.012710061855614185,
0.12304078787565231,
-0.038067132234573364,
0.04238097369670868,
0.17774343490600586,
-0.3322452902793884,
-0.011282048188149929,
0.0698264092206955,
0.04196399450302124,
0.12045586109161377,
-0.004315014462918043,
-0.02110554277896881,
0.0746505931019783,
0.020464139059185982,
0.08953367173671722,
-0.03925556316971779,
0.02352343127131462,
-0.0603463277220726,
-0.15553788840770721,
-0.04747019708156586,
0.08442037552595139,
0.0009722213726490736,
-0.135640487074852,
-0.03576483577489853,
-0.04537848383188248,
0.03802721947431564,
0.012310669757425785,
-0.036869414150714874,
0.03367863595485687,
0.014952028170228004,
-0.014897114597260952,
-0.013075909577310085,
-0.10412630438804626,
-0.056323885917663574,
0.03183671087026596,
0.08689490705728531,
0.10249552875757217,
0.06813696026802063,
0.003309570485725999,
0.11540097743272781,
-0.19288630783557892,
-0.05295340344309807,
-0.03079034760594368,
-0.05365805700421333,
-0.04779794439673424,
-0.009195271879434586,
-0.10577327758073807,
-0.04038494825363159,
0.009624769911170006,
0.13138866424560547,
0.009080489166080952,
0.027524715289473534,
-0.025363324210047722,
0.008671674877405167,
0.057322513312101364,
0.04680158197879791,
-0.006274606101214886,
0.0180717334151268,
0.024829953908920288,
-0.005939411465078592,
-0.019355157390236855,
0.013128435239195824,
0.0006781212869100273,
0.02390294335782528,
0.12304139882326126,
0.023654988035559654,
-0.09861493855714798,
0.06447387486696243,
-0.017006125301122665,
-0.04422600194811821,
0.01663237437605858,
-0.09020201861858368,
-0.05625329539179802,
-0.040008943527936935,
0.00008928580791689456,
0.016230832785367966,
-0.0032013747841119766,
-0.005033265799283981,
-0.022823622450232506,
-0.03555701673030853,
-0.08393774926662445,
-0.04674766957759857,
-0.055447984486818314,
-0.12475580722093582,
0.008250361308455467,
-0.18296077847480774,
-0.035298336297273636,
-0.11610579490661621,
-0.187506765127182,
-0.02512136660516262,
0.06253083050251007,
-0.009088634513318539,
-0.05566173046827316,
0.07892528176307678,
0.04248723387718201,
-0.029218072071671486,
-0.0038404951337724924,
0.07045982033014297,
-0.0011579581769183278,
0.04501616954803467,
-0.031174253672361374,
0.06992661952972412,
-0.0002851419849321246,
0.034364957362413406,
-0.055874474346637726,
0.06574708968400955,
-0.17578163743019104,
0.04634032025933266,
-0.07200334966182709,
-0.029215799644589424,
-0.08448035269975662,
-0.03453928977251053,
-0.010421822778880596,
0.0035463408567011356,
0.024345869198441505,
0.07346183806657791,
-0.18753762543201447,
-0.026028620079159737,
0.119672492146492,
-0.16530728340148926,
-0.024471497163176537,
0.0715155377984047,
-0.04905599728226662,
0.10601314157247543,
0.07020391523838043,
0.1616877019405365,
-0.00040508172241970897,
-0.081842802464962,
0.05439358577132225,
-0.010148713365197182,
0.010208197869360447,
-0.0138441426679492,
0.06958392262458801,
-0.022221412509679794,
-0.15027521550655365,
0.034068427979946136,
-0.13228362798690796,
-0.002253855811432004,
-0.07685492932796478,
0.01697430945932865,
-0.011656703427433968,
-0.0690995380282402,
-0.07321462780237198,
-0.027138756588101387,
0.06708501279354095,
-0.0718689113855362,
-0.013497792184352875,
0.045905306935310364,
0.07658150792121887,
-0.07301871478557587,
0.06905776262283325,
-0.011625616811215878,
0.019113631919026375,
-0.08410351723432541,
-0.03848353028297424,
-0.18900339305400848,
0.051101166754961014,
0.10155120491981506,
-0.003817153861746192,
-0.021021677181124687,
0.13706135749816895,
0.005398132372647524,
0.06527246534824371,
-0.04882998391985893,
0.011518866755068302,
-0.014562731608748436,
-0.0016060773050412536,
-0.0931735411286354,
-0.09653880447149277,
-0.07886334508657455,
-0.06718488782644272,
0.08211833983659744,
-0.11744167655706406,
0.0214805006980896,
-0.05886398255825043,
0.04791315272450447,
0.019033150747418404,
-0.08430062979459763,
-0.019309772178530693,
0.01255766674876213,
-0.059200093150138855,
-0.05809071660041809,
0.03932555392384529,
0.06890630722045898,
-0.016292963176965714,
0.08612208068370819,
-0.04782085120677948,
-0.08408740907907486,
0.03105219081044197,
0.10227665305137634,
-0.10632184892892838,
0.012565084733068943,
-0.05612645670771599,
-0.04350567236542702,
-0.06363020092248917,
-0.02079121582210064,
0.0803365409374237,
-0.007618287578225136,
0.13565722107887268,
-0.07756021618843079,
-0.0031190686859190464,
0.011814513243734837,
-0.02304043248295784,
-0.023574717342853546,
0.03401410952210426,
0.06250324845314026,
-0.07701624929904938,
0.012730100192129612,
0.04491393268108368,
0.010028219781816006,
0.07135529816150665,
-0.053845811635255814,
-0.0903175100684166,
0.010389754548668861,
0.03519153222441673,
0.026698822155594826,
0.0686604231595993,
-0.021120499819517136,
-0.01112513616681099,
0.032722871750593185,
0.017419952899217606,
0.0073769972659647465,
-0.11921393871307373,
0.062471888959407806,
0.055039286613464355,
0.005869297310709953,
0.05825624242424965,
-0.018519926816225052,
-0.0400715135037899,
0.08024314045906067,
0.03921632468700409,
0.0011024518171325326,
-0.016608571633696556,
-0.01721108891069889,
-0.11603861302137375,
0.19169831275939941,
-0.058112554252147675,
-0.16003859043121338,
-0.07961038500070572,
-0.09403324127197266,
0.003534829244017601,
0.02582724578678608,
0.03740730136632919,
-0.0166950561106205,
-0.04377945885062218,
-0.12449395656585693,
0.05969635397195816,
-0.035778507590293884,
0.06988077610731125,
0.11140885949134827,
-0.0394270084798336,
0.05398378148674965,
-0.12635670602321625,
-0.008157387375831604,
-0.08587206155061722,
-0.06973962485790253,
0.055484771728515625,
-0.047959763556718826,
0.02860471047461033,
0.10195798426866531,
0.0227009579539299,
-0.01988409273326397,
-0.024282947182655334,
0.20075522363185883,
0.039782144129276276,
0.041298337280750275,
0.13103356957435608,
-0.0695812851190567,
0.0575832836329937,
0.08316326886415482,
0.009730427525937557,
-0.04497971385717392,
0.05061396211385727,
0.042011212557554245,
-0.06839083880186081,
-0.1890491396188736,
-0.02231767401099205,
-0.0061664520762860775,
-0.0492638498544693,
0.07688023149967194,
0.03408247232437134,
-0.0004053436277899891,
0.07059543579816818,
0.014065589755773544,
0.05449109151959419,
-0.003772367024794221,
0.09915922582149506,
0.011550470255315304,
-0.03355748578906059,
0.08687248826026917,
-0.0215308777987957,
-0.00824436079710722,
0.08240388333797455,
-0.01781819574534893,
0.2902068793773651,
-0.033265743404626846,
0.00005322709330357611,
0.1211896687746048,
0.045203886926174164,
0.0630994588136673,
0.1273091584444046,
-0.06764983385801315,
0.02279982902109623,
-0.07223252952098846,
-0.05810977518558502,
-0.005422250833362341,
0.04774541035294533,
-0.0534641407430172,
0.013562137261033058,
-0.07634401321411133,
0.019735097885131836,
-0.021624097600579262,
0.3095805048942566,
0.11287270486354828,
-0.09909190237522125,
-0.05805474892258644,
0.007646219339221716,
-0.10293981432914734,
-0.07374013960361481,
0.043164387345314026,
0.06929510086774826,
-0.13605332374572754,
0.008804441429674625,
-0.020036466419696808,
0.07474817335605621,
-0.020227866247296333,
0.014365524984896183,
0.024762263521552086,
0.03414122760295868,
-0.0368800088763237,
0.008546614088118076,
-0.17736580967903137,
0.19414658844470978,
0.006681601982563734,
0.023295054212212563,
-0.0517151802778244,
0.03420540690422058,
0.006668946240097284,
-0.03462357074022293,
0.06311581283807755,
0.023820171132683754,
-0.026680609211325645,
-0.04757778346538544,
-0.053613413125276566,
0.012207403779029846,
0.07833968102931976,
-0.04410317540168762,
0.10648103803396225,
-0.004197506699711084,
0.043262407183647156,
0.018934447318315506,
0.08321923017501831,
-0.17914855480194092,
-0.08774137496948242,
0.030950916931033134,
-0.06217833235859871,
-0.10102178156375885,
-0.08349162340164185,
-0.093622587621212,
0.004216811154037714,
0.24316982924938202,
-0.120893195271492,
-0.07649336010217667,
-0.09242171794176102,
0.024249553680419922,
0.11054643988609314,
-0.05052550137042999,
0.030058713629841805,
-0.006686219945549965,
0.12616169452667236,
-0.06621932238340378,
-0.13346755504608154,
0.021662462502717972,
-0.09096759557723999,
-0.16661736369132996,
-0.06538587808609009,
0.11918340623378754,
0.05719573795795441,
0.03501598536968231,
-0.025455143302679062,
0.022263629361987114,
0.035321373492479324,
-0.040354661643505096,
-0.0028649361338466406,
0.07087863981723785,
0.08697178214788437,
0.03768685832619667,
-0.10952629148960114,
0.011812127195298672,
-0.06470654159784317,
-0.0644691064953804,
0.07477802783250809,
0.2685641944408417,
-0.057029519230127335,
0.12607766687870026,
0.12264077365398407,
-0.08041995763778687,
-0.15386059880256653,
0.02805330790579319,
0.09344720840454102,
-0.01753995753824711,
0.009664127603173256,
-0.1529213935136795,
0.08854830265045166,
0.11507248878479004,
-0.024250902235507965,
0.002882444066926837,
-0.1917252391576767,
-0.13370507955551147,
0.0736691877245903,
0.09651888161897659,
0.2732540965080261,
-0.05941179767251015,
-0.04400245100259781,
0.016211286187171936,
-0.09498666226863861,
0.02153419516980648,
0.12835337221622467,
0.06399177014827728,
-0.02746288850903511,
-0.08340917527675629,
0.01584361121058464,
-0.04174673929810524,
0.09603818506002426,
0.05563809350132942,
0.0580558106303215,
-0.004606334026902914,
0.015878349542617798,
-0.01870182901620865,
-0.04715247079730034,
0.06344757229089737,
0.028780849650502205,
0.04955144226551056,
-0.08369994908571243,
-0.028826121240854263,
-0.06950680166482925,
0.028041357174515724,
-0.024401994422078133,
-0.07600137591362,
-0.06187187135219574,
0.0765356793999672,
0.050238121300935745,
-0.025147108361124992,
0.020987963303923607,
0.03105901926755905,
0.11457400768995285,
0.16601814329624176,
-0.006454551126807928,
-0.049100495874881744,
-0.05853545665740967,
-0.03614357113838196,
-0.020069343969225883,
0.07446347922086716,
-0.04747084155678749,
0.023490281775593758,
0.06569883972406387,
0.024863455444574356,
0.09570856392383575,
0.056657515466213226,
-0.11491649597883224,
-0.015730882063508034,
0.03339696303009987,
-0.16056108474731445,
0.00990152545273304,
0.0023375360760837793,
0.022199975326657295,
-0.037632185965776443,
0.02710956707596779,
0.1464344561100006,
-0.06404832750558853,
-0.0356554239988327,
-0.04200846701860428,
0.06820708513259888,
0.02098986878991127,
0.14131170511245728,
0.036623794585466385,
0.0361083559691906,
-0.08134862780570984,
0.12789946794509888,
0.03943257033824921,
-0.04175113886594772,
0.021238865330815315,
-0.030593104660511017,
-0.10808829963207245,
0.015832897275686264,
0.054725777357816696,
0.04114857688546181,
-0.0540144257247448,
-0.010420108214020729,
-0.026616975665092468,
-0.07530167698860168,
0.059350501745939255,
0.1905767172574997,
0.07007686048746109,
0.07391053438186646,
-0.05518849194049835,
-0.03607688099145889,
-0.07844492048025131,
0.0434228740632534,
0.0462578721344471,
0.0747876763343811,
-0.07736597955226898,
0.10506411641836166,
0.01300005055963993,
0.045931216329336166,
-0.03147685527801514,
-0.053683530539274216,
-0.10013303905725479,
-0.05530175194144249,
-0.10165998339653015,
0.007309694774448872,
-0.07712013274431229,
-0.036090608686208725,
0.0009468452772125602,
-0.006873347330838442,
-0.008272487670183182,
0.04862199351191521,
-0.06134932115674019,
-0.0100121283903718,
-0.02666301280260086,
0.035290930420160294,
-0.06355206668376923,
-0.039375316351652145,
0.03219124674797058,
-0.10156309604644775,
0.09424291551113129,
0.0505816750228405,
0.010782872326672077,
0.007017907686531544,
0.09492295980453491,
-0.020964359864592552,
0.024791283532977104,
0.01787853054702282,
-0.04847964271903038,
-0.08205194771289825,
0.0016478917095810175,
-0.009221107698976994,
-0.015258642844855785,
-0.010361868888139725,
0.08959833532571793,
-0.08808262646198273,
0.030582265928387642,
-0.007144186180084944,
-0.00917352270334959,
-0.0731910914182663,
-0.010031849145889282,
0.10256710648536682,
0.09919968247413635,
0.048356086015701294,
-0.08772184699773788,
0.014098045416176319,
-0.1407795548439026,
-0.03628230094909668,
0.005805686581879854,
-0.007828367874026299,
-0.11717478185892105,
-0.009767650626599789,
0.01991601288318634,
-0.0014426783891394734,
0.21252740919589996,
-0.06126894801855087,
-0.017689349129796028,
0.019840925931930542,
-0.09302707761526108,
0.11210106313228607,
-0.02215462736785412,
0.1859855353832245,
-0.008186567574739456,
-0.0420430451631546,
-0.015162993222475052,
0.037388868629932404,
0.020402247086167336,
-0.02616747096180916,
0.18092426657676697,
0.13962896168231964,
0.03191283345222473,
0.04391283914446831,
-0.021030038595199585,
0.004461601842194796,
-0.0500972643494606,
-0.03122708387672901,
0.03148581460118294,
0.036791808903217316,
0.019141878932714462,
0.15136386454105377,
0.06504625827074051,
-0.1663193553686142,
0.03220469504594803,
-0.0241429191082716,
-0.03843013197183609,
-0.11497262865304947,
-0.08903651684522629,
-0.032664213329553604,
-0.06922999769449234,
0.010226975195109844,
-0.12270709127187729,
0.009083697572350502,
0.17360323667526245,
0.05427446588873863,
0.02530057355761528,
0.010172993876039982,
-0.12656830251216888,
-0.03413431718945503,
0.055318497121334076,
0.014550378546118736,
0.02380259521305561,
0.059868182986974716,
-0.0009181583300232887,
0.06188049912452698,
0.03951556235551834,
0.011847558431327343,
0.001530107343569398,
0.07764499634504318,
0.014224126935005188,
0.042046234011650085,
-0.061230696737766266,
-0.003246411681175232,
-0.04188830405473709,
0.07435605674982071,
0.10086622089147568,
0.049679599702358246,
-0.04794013127684593,
-0.009981011040508747,
0.16077816486358643,
-0.04369209706783295,
-0.0038678888231515884,
-0.1242421567440033,
0.33008885383605957,
0.014407745562493801,
0.012240474112331867,
0.04764696583151817,
-0.07887233048677444,
-0.05280732735991478,
0.20239070057868958,
0.08217444270849228,
-0.017926858738064766,
-0.020735278725624084,
0.0005474562058225274,
-0.030282720923423767,
-0.02247120440006256,
0.15073755383491516,
0.03361069783568382,
0.1286139190196991,
-0.05168882757425308,
-0.04444171115756035,
-0.027605421841144562,
-0.01107027381658554,
-0.12361320853233337,
0.1380314826965332,
-0.028864698484539986,
-0.025282641872763634,
-0.07484783977270126,
0.02585495077073574,
0.07167622447013855,
-0.3200589716434479,
0.0022069760598242283,
-0.036624860018491745,
-0.11058049649000168,
-0.002638207282871008,
-0.015708886086940765,
-0.023331860080361366,
0.046434685587882996,
-0.04547397792339325,
0.0720132365822792,
0.04200383275747299,
0.035110749304294586,
-0.02566063404083252,
-0.09883225709199905,
0.16520173847675323,
0.05005139112472534,
0.09643326699733734,
0.027604417875409126,
0.07753129303455353,
0.05530616641044617,
0.03590009734034538,
-0.09850072860717773,
0.04504909738898277,
0.014237877912819386,
-0.08752502501010895,
-0.049982231110334396,
0.1237015426158905,
-0.0006007055635564029,
0.04199037700891495,
0.043672312051057816,
-0.11053480207920074,
0.010127576999366283,
0.06900284439325333,
-0.07032621651887894,
-0.10256397724151611,
-0.008298489265143871,
-0.08997902274131775,
0.15767516195774078,
0.13933676481246948,
-0.018511366099119186,
0.01942037232220173,
-0.06468280404806137,
-0.00768711743876338,
0.05521078780293465,
0.012259408831596375,
-0.01943719945847988,
-0.1902254819869995,
0.03453446552157402,
-0.08881683647632599,
-0.005572306923568249,
-0.22244681417942047,
-0.1026354432106018,
-0.01222447119653225,
-0.05274706706404686,
-0.026320111006498337,
0.06169186905026436,
0.02798498421907425,
0.06806989759206772,
-0.016048500314354897,
-0.04677692800760269,
-0.029526688158512115,
0.08773895353078842,
-0.11028928309679031,
-0.0654076561331749
] |
null | null |
transformers
|
# MultiBERTs - Seed 2
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #2.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel

# Load the tokenizer and the TensorFlow weights for this seed.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2')
model = TFBertModel.from_pretrained("google/multiberts-seed_2")

# Encode a piece of text and run a forward pass.
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Load the tokenizer and the PyTorch weights for this seed.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2')
model = BertModel.from_pretrained("google/multiberts-seed_2")

# Encode a piece of text and run a forward pass.
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
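Because the checkpoint was pre-trained with the MLM objective, a quick sanity check after loading is to score a masked token. The snippet below is a minimal sketch rather than part of the original card: it assumes the released checkpoint includes the masked-language-modelling head weights (any unused NSP head is simply ignored when loading `BertForMaskedLM`), and the predicted token is illustrative only.
```
from transformers import BertTokenizer, BertForMaskedLM
import torch

# Sketch, not from the original card: assumes the MLM head weights ship with this checkpoint.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_2')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_2')

inputs = tokenizer("The capital of France is [MASK].", return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Find the [MASK] position and print the most likely replacement token.
mask_pos = (inputs['input_ids'] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))
```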
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_2"]}
| null |
google/multiberts-seed_2
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_2",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 2
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #2.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 2\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 2\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
189,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_2 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 2\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #2.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06850189715623856,
0.08987180143594742,
-0.00410611042752862,
0.054556962102651596,
0.07442514598369598,
0.007299328222870827,
0.04570821672677994,
0.06915268301963806,
-0.08947824686765671,
0.02111130952835083,
-0.01119052805006504,
-0.04301872476935387,
0.0767291784286499,
-0.027193395420908928,
0.05559865012764931,
-0.2358815222978592,
0.04022468999028206,
-0.02599170431494713,
-0.026470934972167015,
0.02872125245630741,
0.11793617904186249,
-0.10557948797941208,
0.07429655641317368,
0.052415791898965836,
0.00676317885518074,
0.018521873280405998,
-0.017335303127765656,
0.0013756589032709599,
0.08651617914438248,
0.02617003582417965,
0.07994026690721512,
-0.0036272620782256126,
0.083221435546875,
-0.13464659452438354,
0.009612281806766987,
0.05422470346093178,
0.06131010130047798,
0.04687017202377319,
0.11970341205596924,
0.015591825358569622,
0.09669341892004013,
0.004623012617230415,
0.04529104754328728,
0.05003996938467026,
-0.06322631984949112,
-0.17991115152835846,
-0.09595592319965363,
0.029520992189645767,
-0.002610790077596903,
0.007923118770122528,
-0.007320871576666832,
-0.023546127602458,
-0.025427434593439102,
0.02528744749724865,
0.11723300814628601,
-0.25472956895828247,
-0.015321881510317326,
0.0030255448073148727,
0.049110881984233856,
0.06117752566933632,
-0.0407138392329216,
-0.046361662447452545,
0.0384102463722229,
0.061163436621427536,
0.06036820635199547,
-0.022771745920181274,
0.03859885036945343,
-0.011177649721503258,
-0.15460516512393951,
-0.01825723424553871,
0.0970945954322815,
-0.04424658417701721,
-0.11651585251092911,
-0.06021933630108833,
-0.03266746923327446,
0.11482907831668854,
0.011506619863212109,
-0.03976432979106903,
0.046675488352775574,
0.027098581194877625,
0.059774648398160934,
-0.07794798910617828,
-0.11790069192647934,
0.03309760242700577,
-0.07041285187005997,
0.1030348390340805,
0.09046591818332672,
0.05406007170677185,
-0.011307916603982449,
0.053952980786561966,
-0.07527784258127213,
-0.0799100399017334,
-0.04907609149813652,
-0.0852121114730835,
-0.026167647913098335,
-0.03169466182589531,
-0.07590100169181824,
-0.15791894495487213,
-0.008503030054271221,
0.08566194772720337,
-0.06743553280830383,
-0.006872687488794327,
-0.08888255059719086,
-0.023110490292310715,
0.08973930031061172,
0.16404929757118225,
-0.11497148871421814,
0.048006873577833176,
0.005237463396042585,
0.007972895167768002,
-0.024769818410277367,
0.0363641120493412,
0.010061248205602169,
-0.011986945755779743,
0.04405701160430908,
0.030701767653226852,
-0.016245156526565552,
0.03838459402322769,
-0.016485286876559258,
-0.043610330671072006,
0.05964697152376175,
-0.14125223457813263,
-0.00839146226644516,
0.007062537595629692,
-0.012501935474574566,
0.05700106918811798,
0.05779404565691948,
-0.033228468149900436,
-0.09294148534536362,
0.018400564789772034,
-0.08243437111377716,
-0.044308070093393326,
-0.06067689508199692,
-0.15505346655845642,
0.032647982239723206,
-0.0786224827170372,
-0.05554257705807686,
-0.09386046230792999,
-0.1039191335439682,
-0.016880260780453682,
0.05362427607178688,
-0.013967981562018394,
0.03571035712957382,
0.030685940757393837,
-0.011889839544892311,
-0.04232410341501236,
0.04495464637875557,
0.012289315462112427,
-0.014516375958919525,
0.011572021059691906,
-0.040677785873413086,
0.052484381943941116,
-0.012805000878870487,
0.04662241414189339,
-0.06641010195016861,
0.019107703119516373,
-0.1353468894958496,
0.06857658177614212,
-0.09328094124794006,
-0.09181535243988037,
-0.05050705745816231,
-0.05189189314842224,
-0.06670712679624557,
0.023656511679291725,
0.007271625101566315,
0.0655883401632309,
-0.1527591496706009,
-0.050443168729543686,
0.13724349439144135,
-0.12760144472122192,
0.033644720911979675,
0.09852021187543869,
-0.05488431826233864,
0.04039091244339943,
0.12089706212282181,
0.042429883033037186,
0.07275595515966415,
-0.0525335855782032,
-0.02330528013408184,
0.0163834597915411,
0.03166196495294571,
0.12686346471309662,
0.07386210560798645,
-0.06346365809440613,
-0.06761637330055237,
0.03855181857943535,
-0.0760597512125969,
-0.03446957468986511,
-0.05855612829327583,
-0.007720449008047581,
-0.012939601205289364,
-0.06013108789920807,
0.003928367514163256,
-0.02879634127020836,
-0.01064309198409319,
-0.009724730625748634,
-0.0507529079914093,
0.05020409822463989,
0.06270989030599594,
-0.07862095534801483,
0.05342589691281319,
-0.05944119766354561,
0.022783948108553886,
-0.08136135339736938,
-0.005429801065474749,
-0.17825406789779663,
0.015026872046291828,
0.10849395394325256,
-0.09729702025651932,
0.05213812366127968,
0.1604287326335907,
0.023718081414699554,
0.06806952506303787,
-0.053643740713596344,
0.06955868750810623,
-0.0009284369880333543,
-0.02974706143140793,
-0.04185511916875839,
-0.10574137419462204,
-0.06246292218565941,
-0.05850345268845558,
0.004267382901161909,
-0.08112727105617523,
-0.008027654141187668,
-0.02557235024869442,
0.02196972630918026,
0.03070729598402977,
-0.058344606310129166,
0.01949707232415676,
0.022882109507918358,
-0.03480201214551926,
-0.025515099987387657,
-0.029762504622340202,
0.03962485492229462,
0.004016845021396875,
0.1214117631316185,
-0.08686281740665436,
-0.048077408224344254,
0.046771980822086334,
0.05791878700256348,
-0.04070926085114479,
0.0931093692779541,
-0.060449954122304916,
-0.02899431250989437,
-0.09634101390838623,
-0.09597226232290268,
0.17250104248523712,
-0.0036808019503951073,
0.10253239423036575,
-0.10121331363916397,
-0.03456907719373703,
-0.00010971555457217619,
0.0029098642989993095,
-0.00913770217448473,
0.0522349514067173,
0.007312342058867216,
-0.10217876732349396,
-0.0009908844949677587,
0.017388472333550453,
0.01734279654920101,
0.08811981976032257,
-0.01832328736782074,
-0.11570815742015839,
0.022060226649045944,
0.0011348979314789176,
-0.007717817090451717,
0.06527508795261383,
-0.028917936608195305,
0.004127384629100561,
0.05838283523917198,
0.05541272461414337,
0.057005807757377625,
-0.06486361473798752,
0.09283144772052765,
0.0638311356306076,
-0.04756327345967293,
-0.044923797249794006,
-0.07780109345912933,
0.02098711207509041,
0.12184005230665207,
0.029431084170937538,
0.060292329639196396,
-0.04081100597977638,
-0.02345937490463257,
-0.09997766464948654,
0.15356920659542084,
-0.09438002109527588,
-0.16082817316055298,
-0.15107661485671997,
0.00667179562151432,
-0.06215064600110054,
0.05821850895881653,
0.012458725832402706,
-0.054938558489084244,
-0.09513873606920242,
-0.0842575654387474,
0.1620907336473465,
-0.04235230013728142,
-0.012367182411253452,
0.016094794496893883,
-0.026205915957689285,
0.033593982458114624,
-0.1843191385269165,
-0.0030506306793540716,
-0.038159970194101334,
-0.12097450345754623,
-0.04123872518539429,
0.012665044516324997,
0.0641750618815422,
0.0718894824385643,
-0.040885940194129944,
-0.0725390836596489,
0.014734352007508278,
0.15731078386306763,
0.03688753768801689,
0.08081603050231934,
0.10243812948465347,
-0.09103970974683762,
0.045615874230861664,
0.042916834354400635,
0.03227952867746353,
-0.009762543253600597,
0.007935214787721634,
0.05450671911239624,
-0.02654246985912323,
-0.2927013337612152,
-0.011559667997062206,
-0.026571109890937805,
-0.015258648432791233,
0.05861692130565643,
0.04380922392010689,
-0.09717191010713577,
0.04635621979832649,
-0.0537276491522789,
0.031985219568014145,
0.08364292234182358,
0.03654896467924118,
0.10005258768796921,
-0.04035213962197304,
0.08904720842838287,
-0.0558713935315609,
-0.02383832447230816,
0.11882463842630386,
-0.05819268152117729,
0.20071594417095184,
-0.07154593616724014,
0.05797993391752243,
0.09596843272447586,
-0.0051245610229671,
0.031244508922100067,
0.14618605375289917,
-0.0531398244202137,
0.07296688854694366,
-0.053266413509845734,
-0.05093049257993698,
-0.0348140187561512,
0.017950329929590225,
0.011283913627266884,
0.04737696424126625,
-0.034067727625370026,
-0.008617819286882877,
-0.003989671356976032,
0.2510606050491333,
0.054408952593803406,
-0.12089456617832184,
-0.07281586527824402,
0.006830984726548195,
-0.10920577496290207,
-0.06617515534162521,
0.04852541163563728,
0.10259837657213211,
-0.08211418241262436,
0.0415276475250721,
0.008901333436369896,
0.06792543828487396,
-0.11696409434080124,
0.01720728725194931,
0.03178064525127411,
0.05096801370382309,
-0.016049986705183983,
0.03530210629105568,
-0.15040160715579987,
0.08454106003046036,
0.03345489874482155,
0.053192853927612305,
-0.05992284417152405,
0.06593689322471619,
0.02564978040754795,
-0.028062671422958374,
0.03030330315232277,
0.013468120247125626,
-0.013545378111302853,
-0.033550161868333817,
-0.07166489213705063,
0.07798296213150024,
0.07525261491537094,
-0.053586747497320175,
0.11527986079454422,
-0.05089157819747925,
0.009982200339436531,
-0.01034373790025711,
0.07831588387489319,
-0.16659845411777496,
-0.1295522302389145,
0.03880126029253006,
-0.13636630773544312,
-0.036612026393413544,
-0.06788189709186554,
-0.05908579006791115,
-0.05546897277235985,
0.16858014464378357,
-0.13231956958770752,
-0.13446718454360962,
-0.09231816977262497,
-0.01047411561012268,
0.15690173208713531,
-0.04283541440963745,
0.012063187547028065,
-0.019826680421829224,
0.14093643426895142,
-0.040677666664123535,
-0.15630148351192474,
-0.05106518417596817,
-0.06955255568027496,
-0.1502864807844162,
-0.022775474935770035,
0.07440918684005737,
0.11367739737033844,
0.05373900756239891,
0.006544757634401321,
0.025109799578785896,
-0.005420002620667219,
-0.05403327941894531,
-0.015532860532402992,
0.1873098909854889,
0.05977420508861542,
0.08502194285392761,
-0.1551399827003479,
-0.07464095950126648,
-0.0405428409576416,
0.02354930341243744,
-0.02513927035033703,
0.09534414857625961,
-0.03380483761429787,
0.08161947131156921,
0.2307460606098175,
-0.12409939616918564,
-0.20676787197589874,
0.005215644370764494,
0.030278850346803665,
0.008482019416987896,
0.016675366088747978,
-0.21934328973293304,
0.1218072772026062,
0.08238132297992706,
-0.004533614031970501,
0.0070944675244390965,
-0.17855754494667053,
-0.08332792669534683,
0.0769231766462326,
0.010364183224737644,
0.14272509515285492,
-0.08204761892557144,
-0.03031844273209572,
0.0067665777169167995,
-0.09015187621116638,
0.050950758159160614,
0.04591409116983414,
0.08321970701217651,
-0.004811297170817852,
-0.059814661741256714,
0.04970543831586838,
-0.017202582210302353,
0.08330965042114258,
0.03533095866441727,
0.04452946409583092,
-0.03800711780786514,
0.1125979870557785,
0.01181023195385933,
-0.02088961936533451,
0.14472997188568115,
0.10297216475009918,
0.06223910674452782,
-0.033779893070459366,
-0.06542506068944931,
-0.07777427136898041,
0.014791767112910748,
-0.020288923755288124,
-0.04144984856247902,
-0.06720099598169327,
0.043835073709487915,
0.06722744554281235,
-0.004461513366550207,
-0.04102161526679993,
-0.023228319361805916,
0.06206730008125305,
0.10104899108409882,
0.2004546821117401,
-0.04201105982065201,
-0.006709507200866938,
-0.027385814115405083,
-0.02626175247132778,
0.0640680342912674,
-0.021114571020007133,
0.0692000538110733,
0.09308495372533798,
0.009620482102036476,
0.08246094733476639,
0.06009069085121155,
-0.1257755607366562,
-0.016666164621710777,
0.05684800073504448,
-0.09148041158914566,
-0.14731155335903168,
-0.03202661871910095,
-0.10105391591787338,
-0.14402320981025696,
-0.0021201912313699722,
0.16951081156730652,
-0.03565492853522301,
-0.04852012172341347,
-0.018183013424277306,
0.08010163903236389,
0.016078731045126915,
0.12034600973129272,
0.03800952434539795,
-0.012774328701198101,
-0.060490019619464874,
0.1658376306295395,
0.08746197074651718,
-0.09202508628368378,
0.009738792665302753,
0.02257789857685566,
-0.05644189566373825,
-0.009294003248214722,
-0.05369250103831291,
0.06689178943634033,
-0.03090149350464344,
-0.033579763025045395,
0.0016100533539429307,
-0.10537804663181305,
0.05836379528045654,
0.15055596828460693,
0.0029121460393071175,
0.15896961092948914,
-0.04558926820755005,
0.06358437240123749,
-0.06859921663999557,
0.08013494312763214,
0.04522450640797615,
0.06513287127017975,
-0.020564420148730278,
0.061937812715768814,
-0.04600062966346741,
0.013064367696642876,
-0.016049006953835487,
0.0045138075947761536,
-0.0763959214091301,
-0.05651030316948891,
-0.23446865379810333,
0.03229320049285889,
-0.05405480042099953,
-0.034622542560100555,
0.002301150234416127,
-0.011622109450399876,
-0.0014889177400618792,
0.0445508249104023,
-0.029618319123983383,
-0.031899452209472656,
-0.02684369497001171,
0.0626167431473732,
-0.12654176354408264,
0.025319145992398262,
0.060322172939777374,
-0.07979154586791992,
0.08853799104690552,
0.0016520079225301743,
-0.05754680931568146,
0.0030904999002814293,
0.019847098737955093,
-0.046677377074956894,
-0.03277178481221199,
0.008834835141897202,
-0.04712286591529846,
-0.1109108179807663,
0.032550446689128876,
0.018168438225984573,
-0.031739741563797,
-0.027546364814043045,
0.0707884132862091,
-0.06839559972286224,
0.05007874593138695,
0.04056297615170479,
0.01110280305147171,
-0.04978542774915695,
-0.00968367513269186,
0.11293332278728485,
0.07212541252374649,
0.05194151774048805,
-0.04726299270987511,
-0.01666269823908806,
-0.16219674050807953,
-0.00273400847800076,
-0.004990989807993174,
-0.0022427947260439396,
-0.04650924354791641,
-0.035954173654317856,
0.03310924768447876,
0.009004463441669941,
0.180955708026886,
0.008195851929485798,
-0.0039502340368926525,
0.0116877481341362,
0.0025838850997388363,
0.011445621028542519,
0.02952445112168789,
0.0740310475230217,
-0.029236050322651863,
-0.07937169820070267,
-0.06776129454374313,
0.022639596834778786,
-0.03498280420899391,
-0.014728987589478493,
0.14496387541294098,
0.13264383375644684,
0.11812053620815277,
0.01646040752530098,
0.012811905704438686,
-0.02507091872394085,
-0.04010249301791191,
-0.00203940412029624,
0.05123721435666084,
0.04831228777766228,
-0.021085083484649658,
0.015720102936029434,
0.0690709725022316,
-0.13390257954597473,
0.12749044597148895,
-0.0328945554792881,
-0.02078079804778099,
-0.10694260150194168,
-0.08549448847770691,
-0.02233174256980419,
-0.011335833929479122,
-0.018034880980849266,
-0.16443610191345215,
0.04436490684747696,
0.09537427127361298,
0.02445046976208687,
-0.03808389604091644,
0.0357779935002327,
-0.15171757340431213,
-0.08905033022165298,
0.07517381012439728,
0.01463292632251978,
0.03850863501429558,
0.11492593586444855,
-0.008953564800322056,
0.07731077820062637,
0.1326328068971634,
0.0671582967042923,
0.05518261715769768,
0.07145006954669952,
0.006061511114239693,
-0.026515284553170204,
-0.03754629194736481,
0.005791012663394213,
-0.05574308708310127,
0.04103332757949829,
0.17626547813415527,
0.03035125322639942,
-0.05229588598012924,
0.030315132811665535,
0.17772208154201508,
-0.038413483649492264,
-0.052940238267183304,
-0.16670401394367218,
0.2059575468301773,
0.02972409315407276,
0.03556893393397331,
0.054565489292144775,
-0.08681714534759521,
-0.03926275670528412,
0.19288823008537292,
0.11776901036500931,
0.017453553155064583,
-0.019097667187452316,
0.017494648694992065,
-0.009933463297784328,
0.0030802995897829533,
0.08436702936887741,
0.015108549036085606,
0.24467375874519348,
-0.042592454701662064,
0.02536025270819664,
0.02704004757106304,
0.037929289042949677,
-0.06928473711013794,
0.14902308583259583,
-0.053980503231287,
0.0029202692676335573,
-0.054959848523139954,
0.015586920082569122,
0.012865223921835423,
-0.30252793431282043,
-0.11407678574323654,
-0.002841297537088394,
-0.06391113251447678,
-0.017918502911925316,
-0.03168788179755211,
-0.000791683851275593,
0.0449213944375515,
-0.0019018942257389426,
0.0332111157476902,
0.17958039045333862,
-0.006415554787963629,
-0.041007135063409805,
-0.03187757357954979,
0.12781599164009094,
0.010918297804892063,
0.14145426452159882,
0.06072501093149185,
-0.010248713195323944,
0.04701724648475647,
0.020345861092209816,
-0.1155719980597496,
-0.023783372715115547,
-0.0256354957818985,
-0.008834236301481724,
-0.02389220893383026,
0.13923536241054535,
0.01720510981976986,
0.043358005583286285,
0.03737196698784828,
-0.025787286460399628,
0.04745358228683472,
0.05075507238507271,
-0.06748922914266586,
-0.05588100478053093,
0.052512358874082565,
-0.09618674218654633,
0.13858038187026978,
0.17864689230918884,
0.009741300716996193,
0.02396860346198082,
-0.06073696166276932,
-0.005640358664095402,
-0.003123174887150526,
0.11476854979991913,
-0.0046807327307760715,
-0.15376248955726624,
-0.0109352245926857,
-0.0723501592874527,
0.046661071479320526,
-0.23670843243598938,
-0.0510704331099987,
0.09827303141355515,
-0.016192713752388954,
-0.014621695503592491,
0.04906180873513222,
0.00772169278934598,
0.05981229990720749,
-0.009345338679850101,
-0.05157323181629181,
0.0005402062670327723,
0.06206120550632477,
-0.08768991380929947,
-0.02231507934629917
] |
null | null |
transformers
|
# MultiBERTs - Seed 20
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #20.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel

# Load the tokenizer and the TensorFlow weights for this seed.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_20')
model = TFBertModel.from_pretrained("google/multiberts-seed_20")

# Encode a piece of text and run a forward pass.
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Load the tokenizer and the PyTorch weights for this seed.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_20')
model = BertModel.from_pretrained("google/multiberts-seed_20")

# Encode a piece of text and run a forward pass.
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
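The `output` object above bundles the encoder activations; for many probing or analysis workflows you only need the final hidden states. The snippet below is a minimal sketch, not part of the original card, showing one common way to turn them into fixed-size sentence vectors via attention-masked mean pooling (the pooling choice is an assumption, not something the MultiBERTs release prescribes).
```
from transformers import BertTokenizer, BertModel
import torch

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_20')
model = BertModel.from_pretrained('google/multiberts-seed_20')

sentences = ["MultiBERTs has 25 seeds.", "Each seed is a separate pre-training run."]
encoded = tokenizer(sentences, padding=True, return_tensors='pt')
with torch.no_grad():
    hidden = model(**encoded).last_hidden_state           # (batch, seq_len, 768)

# Mean-pool over real tokens only, ignoring padding positions.
mask = encoded['attention_mask'].unsqueeze(-1).float()    # (batch, seq_len, 1)
sentence_vectors = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_vectors.shape)                             # torch.Size([2, 768])
```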
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_20"]}
| null |
google/multiberts-seed_20
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_20",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_20 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 20
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #20.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 20\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #20.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_20 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 20\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #20.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_20 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 20\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #20.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06364692002534866,
0.09259244054555893,
-0.004062165506184101,
0.0428658202290535,
0.07641918212175369,
0.01496626902371645,
0.056619513779878616,
0.07407344877719879,
-0.08870813250541687,
0.02300339750945568,
-0.01214215625077486,
-0.04678803309798241,
0.0777503252029419,
-0.042316608130931854,
0.059820737689733505,
-0.23490118980407715,
0.049903351813554764,
-0.029866985976696014,
-0.025151800364255905,
0.027987787500023842,
0.11166522651910782,
-0.0964563861489296,
0.07481928914785385,
0.055642854422330856,
0.0030409081373363733,
0.01714494824409485,
-0.015519593842327595,
0.0046982248313724995,
0.08693458139896393,
0.03198732063174248,
0.08421587198972702,
-0.00260259211063385,
0.08491100370883942,
-0.14060038328170776,
0.006460747681558132,
0.05954613536596298,
0.06312095373868942,
0.04316146299242973,
0.11720912903547287,
0.007836953736841679,
0.0902784913778305,
0.01722881942987442,
0.05213720723986626,
0.04517460614442825,
-0.07424852252006531,
-0.17036859691143036,
-0.09272080659866333,
0.01936877705156803,
-0.0008117639226838946,
0.00702023645862937,
-0.007569935638457537,
-0.01940256543457508,
-0.01918325200676918,
0.02049613557755947,
0.12164836376905441,
-0.26668116450309753,
-0.016542868688702583,
0.00957666989415884,
0.057659946382045746,
0.05800361558794975,
-0.03851527348160744,
-0.04120972007513046,
0.04334269464015961,
0.05216000974178314,
0.04052497819066048,
-0.025195106863975525,
0.04076964780688286,
-0.01583349145948887,
-0.15359051525592804,
-0.019213518127799034,
0.10789387673139572,
-0.0487566702067852,
-0.11719164252281189,
-0.04896172881126404,
-0.03305237740278244,
0.1230996698141098,
0.008432873524725437,
-0.036697275936603546,
0.046865034848451614,
0.030587326735258102,
0.06375059485435486,
-0.06201533228158951,
-0.11611314117908478,
0.02611132711172104,
-0.051487505435943604,
0.10713619738817215,
0.09375416487455368,
0.048269350081682205,
-0.007859377190470695,
0.055831119418144226,
-0.08643808215856552,
-0.07751572132110596,
-0.050946902483701706,
-0.08932849019765854,
-0.040061820298433304,
-0.03875814750790596,
-0.08495549112558365,
-0.16707073152065277,
-0.004229608457535505,
0.10960279405117035,
-0.0625523030757904,
0.008905633352696896,
-0.09010077267885208,
-0.021459326148033142,
0.09453324228525162,
0.1608043760061264,
-0.10957011580467224,
0.047060806304216385,
-0.010951532982289791,
0.00956807378679514,
-0.02297590859234333,
0.03188634291291237,
0.01217689923942089,
-0.00993061251938343,
0.051096998155117035,
0.02363119274377823,
-0.020165063440799713,
0.04265531897544861,
-0.020163455978035927,
-0.042046647518873215,
0.053399357944726944,
-0.13374824821949005,
-0.010275855660438538,
0.0017504302086308599,
-0.00410481309518218,
0.06153581291437149,
0.0647052749991417,
-0.027823233976960182,
-0.089858777821064,
0.023190109059214592,
-0.08236400038003922,
-0.046934761106967926,
-0.0606134794652462,
-0.15657657384872437,
0.02778061293065548,
-0.07620800286531448,
-0.04864503815770149,
-0.09255484491586685,
-0.09825491160154343,
-0.026939883828163147,
0.05998627096414566,
-0.016883010044693947,
0.037858329713344574,
0.030087623745203018,
-0.007911312393844128,
-0.04186098650097847,
0.04677918180823326,
0.008134735748171806,
-0.014857829548418522,
0.007497320417314768,
-0.044946447014808655,
0.05423099175095558,
-0.008872609585523605,
0.044418200850486755,
-0.07035111635923386,
0.02128562331199646,
-0.1432136744260788,
0.06089423596858978,
-0.09695809334516525,
-0.08345125615596771,
-0.049919404089450836,
-0.04226292669773102,
-0.07310463488101959,
0.03111187182366848,
0.009744750335812569,
0.06162495166063309,
-0.14990176260471344,
-0.05062950775027275,
0.1397569626569748,
-0.13656768202781677,
0.03572883829474449,
0.09457424283027649,
-0.051412615925073624,
0.04509129002690315,
0.1169043481349945,
0.05830293893814087,
0.06947998702526093,
-0.0470232293009758,
-0.0154119236394763,
0.007963449694216251,
0.03500267490744591,
0.14422444999217987,
0.06527778506278992,
-0.06858250498771667,
-0.08159331977367401,
0.03583556041121483,
-0.07354132831096649,
-0.044263917952775955,
-0.05922188237309456,
-0.004938249010592699,
-0.0103309266269207,
-0.056144945323467255,
-0.007296383380889893,
-0.024651693180203438,
-0.011895022355020046,
-0.017582599073648453,
-0.05151822790503502,
0.052031852304935455,
0.06265173107385635,
-0.08807550370693207,
0.05634280666708946,
-0.05503160506486893,
0.016557252034544945,
-0.0787530466914177,
-0.0007328984793275595,
-0.1796397715806961,
0.00924273394048214,
0.11085347831249237,
-0.10028418153524399,
0.05088278278708458,
0.16294696927070618,
0.022233599796891212,
0.06808015704154968,
-0.051302503794431686,
0.07138438522815704,
0.0061867861077189445,
-0.024831824004650116,
-0.04621483385562897,
-0.11812875419855118,
-0.06452610343694687,
-0.061206355690956116,
0.010668431408703327,
-0.08570188283920288,
-0.005329135339707136,
-0.03779088705778122,
0.021460579708218575,
0.022615518420934677,
-0.0637771338224411,
0.02068169042468071,
0.02423085644841194,
-0.03842882812023163,
-0.027693459764122963,
-0.026324443519115448,
0.04447413980960846,
0.01575332321226597,
0.11455076187849045,
-0.09475307166576385,
-0.06798447668552399,
0.04673745483160019,
0.05341636389493942,
-0.05266527831554413,
0.09191673249006271,
-0.05456456542015076,
-0.03388580307364464,
-0.09916035085916519,
-0.09805293381214142,
0.1723383367061615,
-0.004923397675156593,
0.09877638518810272,
-0.09630526602268219,
-0.02590475045144558,
-0.0001622200506972149,
-0.0080983005464077,
-0.003443280467763543,
0.0520206019282341,
0.013434162363409996,
-0.09579860419034958,
-0.0023605020251125097,
0.01574588380753994,
0.018161088228225708,
0.07681024819612503,
-0.019574491307139397,
-0.11451763659715652,
0.03038938343524933,
-0.00196444452740252,
-0.0064073242247104645,
0.06512458622455597,
-0.0496128611266613,
-0.005933421663939953,
0.05546911060810089,
0.05525512248277664,
0.055490996688604355,
-0.06603948771953583,
0.09561672061681747,
0.06542836129665375,
-0.04289088025689125,
-0.04386000335216522,
-0.08400198072195053,
0.010309229604899883,
0.11533239483833313,
0.025359924882650375,
0.05929039791226387,
-0.04706365987658501,
-0.02368258126080036,
-0.10333255678415298,
0.1573219746351242,
-0.08808603137731552,
-0.16083388030529022,
-0.15076084434986115,
0.007155311293900013,
-0.05571535602211952,
0.062105245888233185,
0.01569533348083496,
-0.050524208694696426,
-0.09812954068183899,
-0.07810879498720169,
0.15971913933753967,
-0.039788972586393356,
-0.006822081282734871,
0.018493354320526123,
-0.028906095772981644,
0.03653626888990402,
-0.18184302747249603,
-0.0008972584619186819,
-0.040475089102983475,
-0.1259114295244217,
-0.03833068162202835,
0.0005678045563399792,
0.06790943443775177,
0.07132939994335175,
-0.037493981420993805,
-0.0761256292462349,
0.018999626860022545,
0.1639091521501541,
0.03349939361214638,
0.07732000201940536,
0.09368841350078583,
-0.09792514145374298,
0.042782362550497055,
0.04697323590517044,
0.030503664165735245,
-0.013740799389779568,
0.008664113469421864,
0.0570286400616169,
-0.02565658465027809,
-0.2851100265979767,
-0.008437443524599075,
-0.01893157698214054,
-0.017824064940214157,
0.06653545051813126,
0.04188171774148941,
-0.08471713960170746,
0.04898621514439583,
-0.057241614907979965,
0.03326890617609024,
0.08679694682359695,
0.04507024586200714,
0.09515456855297089,
-0.039109207689762115,
0.09292016923427582,
-0.053625427186489105,
-0.01790626160800457,
0.1085079163312912,
-0.05208858475089073,
0.20047810673713684,
-0.055335067212581635,
0.05384967103600502,
0.09742531925439835,
-0.013442744500935078,
0.038487717509269714,
0.139047309756279,
-0.05161725729703903,
0.06961508840322495,
-0.057999931275844574,
-0.0455537848174572,
-0.0385584682226181,
0.025418734177947044,
-0.0015418053371831775,
0.0362788550555706,
-0.03624169901013374,
-0.016019830480217934,
-0.0036989052314311266,
0.23940852284431458,
0.06747754663228989,
-0.12308774888515472,
-0.06812026351690292,
0.007469967473298311,
-0.10857513546943665,
-0.0710645392537117,
0.05064712464809418,
0.09084311127662659,
-0.08344697207212448,
0.04695046320557594,
0.009780984371900558,
0.06790250539779663,
-0.12760326266288757,
0.020594580098986626,
0.03793001174926758,
0.051120325922966,
-0.025569729506969452,
0.033464182168245316,
-0.15610574185848236,
0.08282367140054703,
0.035703226923942566,
0.052914202213287354,
-0.0522836372256279,
0.0638669952750206,
0.020837318152189255,
-0.013435034081339836,
0.0261759664863348,
0.010879427194595337,
-0.022422047331929207,
-0.02638513222336769,
-0.06662249565124512,
0.08354856073856354,
0.07584269344806671,
-0.0513809397816658,
0.11922004073858261,
-0.04913752153515816,
0.012193914502859116,
-0.009297575801610947,
0.07662776112556458,
-0.17312943935394287,
-0.13053001463413239,
0.0455978661775589,
-0.14226073026657104,
-0.02411910705268383,
-0.0684070736169815,
-0.054703909903764725,
-0.06856171041727066,
0.16855251789093018,
-0.12308941781520844,
-0.13345852494239807,
-0.08546220511198044,
-0.010380007326602936,
0.15322227776050568,
-0.03067542240023613,
0.008320304565131664,
-0.01665862649679184,
0.13383762538433075,
-0.03761696815490723,
-0.15212617814540863,
-0.04876616969704628,
-0.0706176832318306,
-0.15070019662380219,
-0.03350459784269333,
0.07020449638366699,
0.11046948283910751,
0.05151021480560303,
0.005022990517318249,
0.026204437017440796,
0.002804798074066639,
-0.05211363732814789,
-0.016123339533805847,
0.18183769285678864,
0.053330983966588974,
0.06983553618192673,
-0.15993982553482056,
-0.056493986397981644,
-0.049269046634435654,
0.0235364381223917,
-0.04552425071597099,
0.09726083278656006,
-0.029811661690473557,
0.07852300256490707,
0.24073505401611328,
-0.12910813093185425,
-0.20286798477172852,
0.008851260878145695,
0.029697395861148834,
0.004332507960498333,
0.007154445163905621,
-0.22559812664985657,
0.1224534660577774,
0.0890137255191803,
0.0000985132937785238,
-0.006385711953043938,
-0.18570782244205475,
-0.08128710091114044,
0.08153053373098373,
0.009129753336310387,
0.14664427936077118,
-0.09228456765413284,
-0.03206741437315941,
0.007748923264443874,
-0.08443285524845123,
0.05328618362545967,
0.045672737061977386,
0.08319488912820816,
-0.0005140184657648206,
-0.07444708049297333,
0.050164829939603806,
-0.014446787536144257,
0.08545804023742676,
0.046086881309747696,
0.046598415821790695,
-0.03405662998557091,
0.13148337602615356,
0.0022382638417184353,
-0.01673981361091137,
0.13780732452869415,
0.11413361877202988,
0.05612030252814293,
-0.02387591451406479,
-0.062250766903162,
-0.07295195758342743,
0.012482810765504837,
-0.021547770127654076,
-0.03897015005350113,
-0.06391376256942749,
0.03971172869205475,
0.0630098208785057,
0.0008534949156455696,
-0.0424981415271759,
-0.024907318875193596,
0.058129582554101944,
0.08996110409498215,
0.19312219321727753,
-0.05484461784362793,
-0.006959091871976852,
-0.018471887335181236,
-0.022234609350562096,
0.06943332403898239,
-0.01925947703421116,
0.06403501331806183,
0.08954925835132599,
0.009814288467168808,
0.08234011381864548,
0.06249360367655754,
-0.13146594166755676,
-0.022907791659235954,
0.05429132282733917,
-0.10163150727748871,
-0.13690978288650513,
-0.026958361268043518,
-0.10544708371162415,
-0.13414111733436584,
-0.0005348101258277893,
0.17122302949428558,
-0.03663695603609085,
-0.04648478701710701,
-0.015938775613904,
0.07968806475400925,
0.0195729099214077,
0.13139939308166504,
0.03368563577532768,
-0.015600777231156826,
-0.06261207163333893,
0.17126061022281647,
0.08897120505571365,
-0.09398237615823746,
0.01062970981001854,
0.016240084543824196,
-0.059945009648799896,
-0.0045532542280852795,
-0.0653672143816948,
0.07625505328178406,
-0.02798999845981598,
-0.039365287870168686,
0.0016043127980083227,
-0.10066306591033936,
0.05001896619796753,
0.14970457553863525,
0.006795267108827829,
0.15812444686889648,
-0.03809943422675133,
0.06302820146083832,
-0.0745207890868187,
0.07261296361684799,
0.05375263839960098,
0.07730438560247421,
-0.016921402886509895,
0.049324095249176025,
-0.045537594705820084,
-0.0015656069153919816,
-0.014412198215723038,
0.0015890992945060134,
-0.0904308557510376,
-0.055853549391031265,
-0.22233209013938904,
0.025964941829442978,
-0.05765881389379501,
-0.03636277839541435,
0.009987490251660347,
-0.013958590105175972,
0.0035431724973022938,
0.03659485653042793,
-0.02602437324821949,
-0.03273683413863182,
-0.026381518691778183,
0.0624312199652195,
-0.12273206561803818,
0.026853542774915695,
0.06674200296401978,
-0.08743780106306076,
0.07684727758169174,
-0.0003483789914753288,
-0.05204508453607559,
-0.0006203483208082616,
0.013003421016037464,
-0.04699496552348137,
-0.030179709196090698,
0.008105694316327572,
-0.05225512012839317,
-0.11090229451656342,
0.02758224681019783,
0.011905724182724953,
-0.027082962915301323,
-0.029974404722452164,
0.07480982691049576,
-0.06501772999763489,
0.05216959863901138,
0.03671460226178169,
0.005069464910775423,
-0.04360982030630112,
-0.015577426180243492,
0.11963095515966415,
0.07621313631534576,
0.056007981300354004,
-0.05095505341887474,
-0.019021235406398773,
-0.15495871007442474,
-0.0018837266834452748,
-0.00212225248105824,
-0.004027027636766434,
-0.03925957903265953,
-0.037588369101285934,
0.030996335670351982,
0.011256814934313297,
0.17835648357868195,
0.008295447565615177,
0.013920660130679607,
0.009354925714433193,
0.004469077568501234,
0.008538533933460712,
0.033992283046245575,
0.07149232178926468,
-0.01404330413788557,
-0.07822412252426147,
-0.07880596071481705,
0.034402720630168915,
-0.030665522441267967,
-0.021470122039318085,
0.13419432938098907,
0.13399738073349,
0.10367096960544586,
0.022675110027194023,
0.0002490956394467503,
-0.030303314328193665,
-0.028642715886235237,
0.024292802438139915,
0.05596946179866791,
0.05128265544772148,
-0.014265567995607853,
0.013634349219501019,
0.06719701737165451,
-0.12511669099330902,
0.12046800553798676,
-0.0351184606552124,
-0.028744762763381004,
-0.1061430349946022,
-0.07857167720794678,
-0.018567930907011032,
-0.020216047763824463,
-0.020257005468010902,
-0.16510973870754242,
0.047756895422935486,
0.10460995137691498,
0.02223862148821354,
-0.03180189058184624,
0.0369550921022892,
-0.14894817769527435,
-0.09017641097307205,
0.06525935977697372,
0.013564354740083218,
0.04301811009645462,
0.10821039229631424,
-0.014380778186023235,
0.0795542374253273,
0.1421106606721878,
0.0615551583468914,
0.054971564561128616,
0.08340369164943695,
0.009679454378783703,
-0.019754178822040558,
-0.03916027769446373,
0.007689346093684435,
-0.06194339692592621,
0.03808440640568733,
0.15999141335487366,
0.03206586837768555,
-0.05094346031546593,
0.03504158556461334,
0.18045133352279663,
-0.03601135313510895,
-0.05155941843986511,
-0.17755094170570374,
0.20818473398685455,
0.023517902940511703,
0.04284229502081871,
0.04976530745625496,
-0.08935148268938065,
-0.035370927304029465,
0.20699582993984222,
0.10722076147794724,
0.024224070832133293,
-0.023224378004670143,
0.018104620277881622,
-0.010816775262355804,
0.008263641968369484,
0.08243333548307419,
0.006439685821533203,
0.219077467918396,
-0.03764545917510986,
0.019239189103245735,
0.031978704035282135,
0.03449343517422676,
-0.07326870411634445,
0.1528589129447937,
-0.04688839986920357,
-0.0021647794637829065,
-0.05567163601517677,
0.02174646407365799,
0.021187320351600647,
-0.30669498443603516,
-0.11152303963899612,
0.0017642491729930043,
-0.06295528262853622,
-0.017021849751472473,
-0.02203277312219143,
-0.004999091848731041,
0.05196624621748924,
-0.003703059395775199,
0.030869118869304657,
0.1828373372554779,
-0.005719221197068691,
-0.03793366253376007,
-0.04183109849691391,
0.1216888278722763,
0.018732624128460884,
0.13746944069862366,
0.05863610655069351,
-0.015966802835464478,
0.04418352618813515,
0.023503145202994347,
-0.11217709630727768,
-0.033278096467256546,
-0.02609003521502018,
-0.009161765687167645,
-0.023378578945994377,
0.13328641653060913,
0.020039772614836693,
0.047716181725263596,
0.03824080899357796,
-0.04361364245414734,
0.05062627047300339,
0.04205432906746864,
-0.06867695599794388,
-0.052917007356882095,
0.044105540961027145,
-0.09642913192510605,
0.14141149818897247,
0.1765006184577942,
0.012262295000255108,
0.025253387168049812,
-0.056604016572237015,
-0.008626255206763744,
0.0010729291243478656,
0.11539255827665329,
-0.0033491102512925863,
-0.15802814066410065,
-0.013285092078149319,
-0.08596622943878174,
0.039959829300642014,
-0.22158053517341614,
-0.04478836804628372,
0.10129636526107788,
-0.011158754117786884,
-0.009503751061856747,
0.04838268458843231,
0.005449125077575445,
0.05904094874858856,
-0.01636233739554882,
-0.04728858545422554,
0.009741374291479588,
0.0677415281534195,
-0.08271060883998871,
-0.031967565417289734
] |
null | null |
transformers
|
# MultiBERTs - Seed 21
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #21.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
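Because the model was pre-trained with the MLM objective, it can in principle be queried directly for masked-token predictions. The snippet below is a minimal sketch, assuming the `fill-mask` pipeline can recover a language-modelling head from this checkpoint (the sections that follow only demonstrate feature extraction):
```
from transformers import pipeline

# Illustrative sketch: if the MLM head weights are not bundled with this
# checkpoint, transformers will warn and initialise the head randomly,
# in which case the predictions are not meaningful.
unmasker = pipeline('fill-mask', model='google/multiberts-seed_21')
print(unmasker("Paris is the [MASK] of France."))
```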
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_21')
model = TFBertModel.from_pretrained("google/multiberts-seed_21")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_21')
model = BertModel.from_pretrained("google/multiberts-seed_21")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
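As an illustrative extension of the PyTorch snippet above (not part of the original usage instructions), the hidden states returned by `BertModel` can be pooled into a fixed-size sentence representation; mean pooling is used here purely as an example:
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_21')
model = BertModel.from_pretrained("google/multiberts-seed_21")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size=768)
sentence_embedding = output.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, 768])
```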
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_21"]}
| null |
google/multiberts-seed_21
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_21",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_21 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 21
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #21.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 21\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #21.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_21 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 21\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #21.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_21 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 21\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #21.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.0631493404507637,
0.09165288507938385,
-0.004084483254700899,
0.04373139515519142,
0.0764448493719101,
0.015314306132495403,
0.05667892470955849,
0.07379963994026184,
-0.08904622495174408,
0.023233968764543533,
-0.011732155457139015,
-0.04727178066968918,
0.07694830000400543,
-0.04360950365662575,
0.06020203232765198,
-0.2347823977470398,
0.04973912611603737,
-0.030172448605298996,
-0.026332955807447433,
0.02808009833097458,
0.11203546077013016,
-0.09569237381219864,
0.07445427030324936,
0.05594538524746895,
0.003907150123268366,
0.016559269279241562,
-0.01633712276816368,
0.004931779578328133,
0.08700639009475708,
0.03204529732465744,
0.08429332822561264,
-0.0026894533075392246,
0.08542173355817795,
-0.14119118452072144,
0.006351909134536982,
0.058940641582012177,
0.0630546510219574,
0.04320802167057991,
0.1173691600561142,
0.007285251747816801,
0.08742073178291321,
0.017083069309592247,
0.05186343938112259,
0.04565657302737236,
-0.07498776912689209,
-0.17075808346271515,
-0.09299656748771667,
0.020439544692635536,
-0.0018034785753116012,
0.006974882446229458,
-0.007345383521169424,
-0.019175231456756592,
-0.019151031970977783,
0.020338669419288635,
0.12155739963054657,
-0.2667970359325409,
-0.016669536009430885,
0.008781928569078445,
0.05795133486390114,
0.057649459689855576,
-0.038272224366664886,
-0.04135192185640335,
0.043682634830474854,
0.052567258477211,
0.041413526982069016,
-0.025072142481803894,
0.041226886212825775,
-0.015915384516119957,
-0.15336394309997559,
-0.019293183460831642,
0.10696058720350266,
-0.048675816506147385,
-0.1172289326786995,
-0.04841998964548111,
-0.033019617199897766,
0.12248213589191437,
0.009080728515982628,
-0.03729700297117233,
0.04665965959429741,
0.030512357130646706,
0.06436875462532043,
-0.06261897832155228,
-0.11614794284105301,
0.02583065629005432,
-0.0517161563038826,
0.11036937683820724,
0.0938136950135231,
0.04847925156354904,
-0.007651448715478182,
0.055710673332214355,
-0.08447926491498947,
-0.07831451296806335,
-0.05088360607624054,
-0.08850191533565521,
-0.04038031026721001,
-0.03865702450275421,
-0.08461173623800278,
-0.16538596153259277,
-0.0042944904416799545,
0.11031941324472427,
-0.0626244768500328,
0.008554656989872456,
-0.08951234817504883,
-0.021811505779623985,
0.09504686295986176,
0.16167567670345306,
-0.1098526120185852,
0.049267515540122986,
-0.010989169590175152,
0.009853625670075417,
-0.023143479600548744,
0.032099056988954544,
0.012057734653353691,
-0.009893659502267838,
0.05193512514233589,
0.02413003519177437,
-0.020436778664588928,
0.042044300585985184,
-0.020988143980503082,
-0.04265318438410759,
0.05490141361951828,
-0.1334715187549591,
-0.010841930285096169,
0.0021544904448091984,
-0.003994382917881012,
0.06154130771756172,
0.06407373398542404,
-0.028111044317483902,
-0.08979522436857224,
0.023841874673962593,
-0.08245755732059479,
-0.04641765356063843,
-0.05997468903660774,
-0.15563903748989105,
0.02764720469713211,
-0.07563434541225433,
-0.04886411502957344,
-0.09313996136188507,
-0.09773433208465576,
-0.026603370904922485,
0.06048281863331795,
-0.017410682514309883,
0.038157202303409576,
0.030475903302431107,
-0.007696483284235001,
-0.04177415743470192,
0.04631252959370613,
0.007445445284247398,
-0.014712617732584476,
0.007904188707470894,
-0.04414695128798485,
0.05386956036090851,
-0.009359187446534634,
0.04452502727508545,
-0.07087082415819168,
0.021506527438759804,
-0.1416340470314026,
0.06099450960755348,
-0.09704342484474182,
-0.08366640657186508,
-0.05037960782647133,
-0.0422227568924427,
-0.07275938242673874,
0.030661066994071007,
0.009583607316017151,
0.06122048571705818,
-0.14805130660533905,
-0.04961483180522919,
0.13908563554286957,
-0.13671758770942688,
0.03491522744297981,
0.09533624351024628,
-0.05111009627580643,
0.04411861672997475,
0.11627137660980225,
0.058399226516485214,
0.07029062509536743,
-0.047242749482393265,
-0.014503810554742813,
0.007795870304107666,
0.03451111540198326,
0.14491920173168182,
0.06652363389730453,
-0.06801819056272507,
-0.0819166898727417,
0.03565344586968422,
-0.0740010067820549,
-0.044585928320884705,
-0.05953802913427353,
-0.005099582951515913,
-0.010369679890573025,
-0.05628455430269241,
-0.006806913297623396,
-0.02460889331996441,
-0.012129866518080235,
-0.017627352848649025,
-0.05159921571612358,
0.05051863566040993,
0.06254493445158005,
-0.08787315338850021,
0.05623706430196762,
-0.05505256727337837,
0.016702979803085327,
-0.07948365807533264,
-0.0015159761533141136,
-0.17926840484142303,
0.008064505644142628,
0.11054707318544388,
-0.09874916821718216,
0.051884256303310394,
0.16293908655643463,
0.021831128746271133,
0.06761431694030762,
-0.05178619176149368,
0.07141270488500595,
0.006213202141225338,
-0.02539996989071369,
-0.04530925676226616,
-0.11773681640625,
-0.06484217941761017,
-0.06124435365200043,
0.010166545398533344,
-0.08432724326848984,
-0.005793848540633917,
-0.03796021640300751,
0.020614661276340485,
0.022108398377895355,
-0.0633004829287529,
0.020969761535525322,
0.024323387071490288,
-0.03792736306786537,
-0.027593903243541718,
-0.026219384744763374,
0.044837579131126404,
0.015769757330417633,
0.11583825945854187,
-0.0945960283279419,
-0.06662768125534058,
0.04587226361036301,
0.05291862413287163,
-0.052475906908512115,
0.09206186980009079,
-0.05451800301671028,
-0.03374599292874336,
-0.09943987429141998,
-0.09809023141860962,
0.17437013983726501,
-0.005219803657382727,
0.09903167188167572,
-0.09652993083000183,
-0.02571004442870617,
-0.0003801264683715999,
-0.00919258501380682,
-0.004343191161751747,
0.05195676535367966,
0.012707574293017387,
-0.09697017073631287,
-0.002060734899714589,
0.014891240745782852,
0.01839962974190712,
0.07691863924264908,
-0.01965423859655857,
-0.11415012180805206,
0.02939070202410221,
-0.001614519045688212,
-0.006421556230634451,
0.06586331129074097,
-0.047879382967948914,
-0.005667292512953281,
0.05526119843125343,
0.056161295622587204,
0.05564325302839279,
-0.06650110334157944,
0.09493734687566757,
0.06601274758577347,
-0.04322851821780205,
-0.043012235313653946,
-0.08453017473220825,
0.0105226319283247,
0.11519429087638855,
0.025671696290373802,
0.05962203070521355,
-0.046948786824941635,
-0.023711564019322395,
-0.10382430255413055,
0.15686386823654175,
-0.08754751086235046,
-0.1607213318347931,
-0.15181028842926025,
0.007510061841458082,
-0.05550912395119667,
0.0620909109711647,
0.015829280018806458,
-0.05050288885831833,
-0.0978439524769783,
-0.07788544148206711,
0.16059574484825134,
-0.04015098512172699,
-0.00771426223218441,
0.01954520307481289,
-0.029297325760126114,
0.03676055744290352,
-0.18210311233997345,
-0.0010100293438881636,
-0.040149569511413574,
-0.12568731606006622,
-0.037650227546691895,
0.00014893007755745202,
0.06819979846477509,
0.07112859189510345,
-0.037735067307949066,
-0.0761573538184166,
0.01871071755886078,
0.16394610702991486,
0.033761270344257355,
0.07701241970062256,
0.09394586831331253,
-0.09777913242578506,
0.042688999325037,
0.045992497354745865,
0.03083370439708233,
-0.013850436545908451,
0.008855690248310566,
0.05709083750844002,
-0.025901827961206436,
-0.2854757606983185,
-0.009383448399603367,
-0.018301459029316902,
-0.018587885424494743,
0.06654108315706253,
0.041863519698381424,
-0.08404847234487534,
0.04928560554981232,
-0.05772163346409798,
0.03221092373132706,
0.08582495898008347,
0.04514867812395096,
0.09500595182180405,
-0.0386364720761776,
0.09275167435407639,
-0.05348140746355057,
-0.01801413483917713,
0.1087341457605362,
-0.05072220787405968,
0.200204536318779,
-0.05466243997216225,
0.055192701518535614,
0.09736176580190659,
-0.013238166458904743,
0.038329433649778366,
0.13845373690128326,
-0.051567792892456055,
0.06974674761295319,
-0.05794315040111542,
-0.04554802551865578,
-0.03906974941492081,
0.024407727643847466,
-0.0013365610502660275,
0.037353962659835815,
-0.03576896712183952,
-0.016359776258468628,
-0.003469199873507023,
0.24010764062404633,
0.06799554079771042,
-0.12342459708452225,
-0.06818155199289322,
0.007493574172258377,
-0.10843182355165482,
-0.07033933699131012,
0.050560466945171356,
0.09074860066175461,
-0.08277256786823273,
0.04643379524350166,
0.00962022040039301,
0.06831203401088715,
-0.12743571400642395,
0.020634273067116737,
0.03922857716679573,
0.05052772909402847,
-0.02508215792477131,
0.0339248850941658,
-0.1553744673728943,
0.08439551293849945,
0.0360269770026207,
0.05267791450023651,
-0.05250445008277893,
0.06364096701145172,
0.020601332187652588,
-0.014125528745353222,
0.02572030760347843,
0.010999595746397972,
-0.019118739292025566,
-0.026697281748056412,
-0.06712286174297333,
0.083746537566185,
0.07637327164411545,
-0.05205458775162697,
0.11931446939706802,
-0.04955120012164116,
0.012257286347448826,
-0.009484462440013885,
0.07720451802015305,
-0.17298874258995056,
-0.13010317087173462,
0.045616716146469116,
-0.1423560380935669,
-0.023559533059597015,
-0.06759718060493469,
-0.054896652698516846,
-0.06996962428092957,
0.16748696565628052,
-0.12119568139314651,
-0.13332386314868927,
-0.08531465381383896,
-0.011738560162484646,
0.15329979360103607,
-0.0308071356266737,
0.007260343059897423,
-0.016989219933748245,
0.13224811851978302,
-0.03759714961051941,
-0.15186332166194916,
-0.04907921701669693,
-0.07026435434818268,
-0.15090809762477875,
-0.03344620019197464,
0.07109824568033218,
0.11084645241498947,
0.05175771191716194,
0.005366805009543896,
0.026158371940255165,
0.002918561454862356,
-0.052927982062101364,
-0.015765147283673286,
0.17981281876564026,
0.05343009904026985,
0.06994447112083435,
-0.1598273068666458,
-0.055674973875284195,
-0.050249870866537094,
0.02360408566892147,
-0.045246660709381104,
0.09793486446142197,
-0.030331522226333618,
0.07785037159919739,
0.24123400449752808,
-0.12908896803855896,
-0.20248962938785553,
0.007714735344052315,
0.0293815266340971,
0.004448893014341593,
0.006859451066702604,
-0.22528837621212006,
0.12181627005338669,
0.0901595875620842,
-0.0002368281566305086,
-0.0064919875003397465,
-0.18554088473320007,
-0.08129186928272247,
0.08045824617147446,
0.009370566345751286,
0.14589203894138336,
-0.09303180873394012,
-0.03175520524382591,
0.008903349749743938,
-0.08664517849683762,
0.0529373362660408,
0.048096224665641785,
0.08382408320903778,
-0.0006293226615525782,
-0.07471262663602829,
0.04972045123577118,
-0.014137755148112774,
0.08648097515106201,
0.04562291130423546,
0.047130681574344635,
-0.03389846533536911,
0.13101939857006073,
0.004238906782120466,
-0.016768455505371094,
0.13843800127506256,
0.1137797087430954,
0.05680828168988228,
-0.023946531116962433,
-0.062477193772792816,
-0.07318127155303955,
0.011262936517596245,
-0.021622994914650917,
-0.03925033658742905,
-0.06428160518407822,
0.0396236851811409,
0.06315775960683823,
0.0004489259736146778,
-0.041921816766262054,
-0.02488529123365879,
0.057493630796670914,
0.08927987515926361,
0.1936807930469513,
-0.05483014136552811,
-0.0072540040127933025,
-0.019010551273822784,
-0.022522058337926865,
0.06946253776550293,
-0.019193941727280617,
0.06492915004491806,
0.09018465131521225,
0.009912313893437386,
0.08245663344860077,
0.06238153576850891,
-0.13070952892303467,
-0.02256948873400688,
0.05372262001037598,
-0.1009385734796524,
-0.13858020305633545,
-0.027475541457533836,
-0.10554828494787216,
-0.13408862054347992,
-0.0004469385021366179,
0.1712038218975067,
-0.03691595420241356,
-0.047117915004491806,
-0.015763310715556145,
0.0794214978814125,
0.018928758800029755,
0.13094592094421387,
0.03381291776895523,
-0.014967313967645168,
-0.062109608203172684,
0.17153231799602509,
0.08924006670713425,
-0.09255657345056534,
0.01043445710092783,
0.016543766483664513,
-0.05906563624739647,
-0.005070369690656662,
-0.06502950936555862,
0.07496082037687302,
-0.0276951864361763,
-0.039070695638656616,
0.00140956521499902,
-0.10054153203964233,
0.04950433224439621,
0.1516069620847702,
0.0070965830236673355,
0.15834727883338928,
-0.038136932998895645,
0.06325026601552963,
-0.07539661973714828,
0.07248362898826599,
0.053679563105106354,
0.0773729607462883,
-0.01790691167116165,
0.04889138415455818,
-0.045871492475271225,
-0.001104201772250235,
-0.01493429858237505,
0.0005509946495294571,
-0.09031154960393906,
-0.05557236075401306,
-0.22326502203941345,
0.02664383128285408,
-0.057827968150377274,
-0.03583931922912598,
0.009949023835361004,
-0.014174005016684532,
0.003868661355227232,
0.03716452792286873,
-0.02542950212955475,
-0.03269229084253311,
-0.025647563859820366,
0.061961740255355835,
-0.1230413019657135,
0.02643550932407379,
0.06641272455453873,
-0.08711287379264832,
0.07700349390506744,
-0.0006517883157357574,
-0.05240504816174507,
-0.0009082919568754733,
0.012816964648663998,
-0.04648289084434509,
-0.0300348699092865,
0.008154262788593769,
-0.053051359951496124,
-0.11275950819253922,
0.027139870449900627,
0.011682671494781971,
-0.02687125839293003,
-0.029585544019937515,
0.07541552186012268,
-0.06477668136358261,
0.05268663540482521,
0.0367375873029232,
0.0044114491902291775,
-0.04344286397099495,
-0.015884429216384888,
0.11986873298883438,
0.07536964118480682,
0.05588079243898392,
-0.05036013945937157,
-0.018506979569792747,
-0.15560758113861084,
-0.0016306120669469237,
-0.0022551228757947683,
-0.0034953278955072165,
-0.04125089570879936,
-0.03734448924660683,
0.030251605436205864,
0.011315541341900826,
0.1775733381509781,
0.0077596097253263,
0.012572946958243847,
0.009257893078029156,
0.005024112295359373,
0.008131383918225765,
0.03389713168144226,
0.07096271216869354,
-0.014875120483338833,
-0.0780792385339737,
-0.07852514833211899,
0.033880773931741714,
-0.03130887821316719,
-0.02077220194041729,
0.1347712129354477,
0.1352071613073349,
0.10471593588590622,
0.02235107682645321,
0.00035212209331803024,
-0.030379939824342728,
-0.02989109791815281,
0.022328080609440804,
0.05648539587855339,
0.05011557415127754,
-0.014085482805967331,
0.012016009539365768,
0.06678798049688339,
-0.12438591569662094,
0.12039893120527267,
-0.035529714077711105,
-0.028531333431601524,
-0.10560973733663559,
-0.07856793701648712,
-0.018490158021450043,
-0.019737157970666885,
-0.020094670355319977,
-0.16525590419769287,
0.04730052873492241,
0.10739333182573318,
0.021900158375501633,
-0.03179086744785309,
0.03671303391456604,
-0.1490941047668457,
-0.09016168862581253,
0.0649600625038147,
0.013504271395504475,
0.04397803172469139,
0.10795505344867706,
-0.013329694978892803,
0.07931356132030487,
0.14229716360569,
0.06175750866532326,
0.05543557181954384,
0.08404988050460815,
0.008966139517724514,
-0.020855752751231194,
-0.040038689970970154,
0.00735959317535162,
-0.061455003917217255,
0.03781959414482117,
0.16068238019943237,
0.03153124079108238,
-0.050521038472652435,
0.035564616322517395,
0.1805947870016098,
-0.03533110022544861,
-0.051790084689855576,
-0.1777864247560501,
0.20972652733325958,
0.02275613322854042,
0.04312438890337944,
0.050212159752845764,
-0.08866336196660995,
-0.0350370779633522,
0.20561833679676056,
0.10748247802257538,
0.023379888385534286,
-0.023126859217882156,
0.017809012904763222,
-0.010687844827771187,
0.008768979460000992,
0.0831787958741188,
0.005674188490957022,
0.21895454823970795,
-0.03847771883010864,
0.01802860200405121,
0.03163456544280052,
0.03483942896127701,
-0.0725775957107544,
0.15135671198368073,
-0.04804173484444618,
-0.0016238363459706306,
-0.056607216596603394,
0.02109401673078537,
0.02025631070137024,
-0.3033827543258667,
-0.11065320670604706,
0.0007409829995594919,
-0.06335429102182388,
-0.016952762380242348,
-0.022554587572813034,
-0.006216428708285093,
0.05281966179609299,
-0.0036062609869986773,
0.030874304473400116,
0.18344727158546448,
-0.005756269674748182,
-0.03788573667407036,
-0.04080229997634888,
0.12178964167833328,
0.019169658422470093,
0.13844622671604156,
0.05890467390418053,
-0.01542069111019373,
0.044205136597156525,
0.023507308214902878,
-0.11290325969457626,
-0.03386906906962395,
-0.02634650468826294,
-0.009525217115879059,
-0.023382514715194702,
0.1324062943458557,
0.020562905818223953,
0.048751287162303925,
0.038761597126722336,
-0.044811367988586426,
0.04991282895207405,
0.04288903623819351,
-0.0680055320262909,
-0.0525236651301384,
0.04349539428949356,
-0.09682394564151764,
0.1412786990404129,
0.17648333311080933,
0.012430891394615173,
0.02491249330341816,
-0.056237976998090744,
-0.008735416457057,
0.0013525851536542177,
0.11668621748685837,
-0.0037121945060789585,
-0.158567875623703,
-0.013637714087963104,
-0.08618864417076111,
0.03942248597741127,
-0.22286026179790497,
-0.0456281453371048,
0.10174056142568588,
-0.01121362205594778,
-0.00843127816915512,
0.04829797148704529,
0.005421310197561979,
0.0596584677696228,
-0.015972377732396126,
-0.04824023321270943,
0.009562023915350437,
0.06749925762414932,
-0.08275759220123291,
-0.03174072876572609
] |
null | null |
transformers
|
# MultiBERTs - Seed 22
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #22.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
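One practical consequence of pre-training with sequence length 512 is that longer inputs should be truncated at tokenization time. The snippet below is a hedged sketch using a deliberately over-long dummy input:
```
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_22')

long_text = "replace me " * 400  # dummy input, well beyond 512 tokens
encoded = tokenizer(long_text, truncation=True, max_length=512, return_tensors='pt')
print(encoded['input_ids'].shape)  # torch.Size([1, 512])
```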
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_22')
model = TFBertModel.from_pretrained("google/multiberts-seed_22")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_22')
model = BertModel.from_pretrained("google/multiberts-seed_22")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
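Since the model was pre-trained with both the MLM and NSP objectives, both heads can in principle be inspected with `BertForPreTraining`. Whether the head weights are actually bundled with this checkpoint is not stated here, so treat the following as an illustrative sketch rather than a supported usage:
```
import torch
from transformers import BertTokenizer, BertForPreTraining

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_22')
# Assumption: the pre-training head weights ship with the checkpoint; if not,
# transformers will warn and initialise them randomly.
model = BertForPreTraining.from_pretrained("google/multiberts-seed_22")

encoded_input = tokenizer("The cat sat on the mat.", "It was a sunny day.",
                          return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

print(output.prediction_logits.shape)        # (1, sequence_length, vocab_size)
print(output.seq_relationship_logits.shape)  # (1, 2): is-next vs. not-next
```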
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_22"]}
| null |
google/multiberts-seed_22
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_22",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_22 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 22
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #22.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 22\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #22.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_22 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 22\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #22.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_22 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 22\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #22.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06275270134210587,
0.09429151564836502,
-0.004102804698050022,
0.04330987110733986,
0.07649558037519455,
0.015486767515540123,
0.05639365687966347,
0.07405543327331543,
-0.08972401916980743,
0.023097679018974304,
-0.012122534215450287,
-0.04695696756243706,
0.07724855840206146,
-0.04461030289530754,
0.060574259608983994,
-0.23465214669704437,
0.04996849596500397,
-0.0301988385617733,
-0.026316506788134575,
0.027960514649748802,
0.11124738305807114,
-0.09549485146999359,
0.07398878037929535,
0.05540253594517708,
0.0028937652241438627,
0.01693502627313137,
-0.01530374027788639,
0.005120629444718361,
0.08703192323446274,
0.032825421541929245,
0.08397238701581955,
-0.002389569068327546,
0.08523965626955032,
-0.1415732502937317,
0.006234860513359308,
0.05936206504702568,
0.06312916427850723,
0.0430682897567749,
0.11786282807588577,
0.008642656728625298,
0.08933660387992859,
0.01837494783103466,
0.05193730443716049,
0.04510191082954407,
-0.07451538741588593,
-0.16932526230812073,
-0.09280923753976822,
0.02024076320230961,
-0.0017241458408534527,
0.0064828903414309025,
-0.007413163315504789,
-0.01902373880147934,
-0.019435478374361992,
0.02016160450875759,
0.12101012468338013,
-0.26682934165000916,
-0.016493290662765503,
0.007918796502053738,
0.05795277655124664,
0.057646218687295914,
-0.03798918426036835,
-0.041283413767814636,
0.04327833652496338,
0.052590012550354004,
0.04079805314540863,
-0.025149816647171974,
0.04078228026628494,
-0.015765635296702385,
-0.15309008955955505,
-0.019670285284519196,
0.10580381006002426,
-0.04886768385767937,
-0.11691318452358246,
-0.047509029507637024,
-0.03316750004887581,
0.12492665648460388,
0.00914682261645794,
-0.03646625205874443,
0.04659195989370346,
0.03092070296406746,
0.064669169485569,
-0.062184762209653854,
-0.11581055074930191,
0.02572496421635151,
-0.05118044838309288,
0.10887061059474945,
0.09386909753084183,
0.04846811667084694,
-0.007190799340605736,
0.05545107275247574,
-0.08541363477706909,
-0.07757371664047241,
-0.05117953196167946,
-0.08898580819368362,
-0.04087972268462181,
-0.037914030253887177,
-0.08442579209804535,
-0.1649320274591446,
-0.00394095154479146,
0.10943806916475296,
-0.061732206493616104,
0.00852552056312561,
-0.09037195146083832,
-0.021552151069045067,
0.09369265288114548,
0.16065002977848053,
-0.10983966290950775,
0.04916681349277496,
-0.010988542810082436,
0.010463370010256767,
-0.023630760610103607,
0.03163430467247963,
0.012283533811569214,
-0.010182974860072136,
0.05145680159330368,
0.023652557283639908,
-0.019968388602137566,
0.04187022149562836,
-0.02073132060468197,
-0.04239613562822342,
0.053255531936883926,
-0.13353240489959717,
-0.010528751648962498,
0.0022326712496578693,
-0.0035692264791578054,
0.060897912830114365,
0.0647038072347641,
-0.027762306854128838,
-0.09010474383831024,
0.02340077981352806,
-0.08215448260307312,
-0.04666571319103241,
-0.05999823659658432,
-0.15596891939640045,
0.02780211716890335,
-0.0756637305021286,
-0.048680584877729416,
-0.09295427054166794,
-0.09739913046360016,
-0.02691890113055706,
0.0600198395550251,
-0.017348414286971092,
0.03721849247813225,
0.031170032918453217,
-0.007639132905751467,
-0.04148207977414131,
0.04599352553486824,
0.006898046005517244,
-0.014817459508776665,
0.008205779828131199,
-0.045345962047576904,
0.05380665510892868,
-0.009358282200992107,
0.04466714709997177,
-0.07010294497013092,
0.021643726155161858,
-0.14175115525722504,
0.060699205845594406,
-0.09657324850559235,
-0.08373065292835236,
-0.05015231668949127,
-0.04214058816432953,
-0.07185381650924683,
0.03071054071187973,
0.009829881601035595,
0.06147604063153267,
-0.1482759416103363,
-0.049866653978824615,
0.13873769342899323,
-0.13708321750164032,
0.0358344130218029,
0.09473680704832077,
-0.0513337180018425,
0.0444476418197155,
0.1161520704627037,
0.058253929018974304,
0.07053829729557037,
-0.046006496995687485,
-0.013897022232413292,
0.0077519724145531654,
0.035071857273578644,
0.1449553221464157,
0.0666675940155983,
-0.06830854713916779,
-0.0839768797159195,
0.035559531301259995,
-0.07408815622329712,
-0.044176142662763596,
-0.05958007648587227,
-0.00484874052926898,
-0.010407962836325169,
-0.056278131902217865,
-0.00590455811470747,
-0.02413296140730381,
-0.012420992366969585,
-0.017277931794524193,
-0.05124199762940407,
0.05365685746073723,
0.06254147738218307,
-0.08805534988641739,
0.05614302307367325,
-0.05498300865292549,
0.016262084245681763,
-0.07900501042604446,
-0.001352342194877565,
-0.17930546402931213,
0.008781864307820797,
0.11051976680755615,
-0.09787975251674652,
0.051131200045347214,
0.1630055457353592,
0.021527400240302086,
0.06826417148113251,
-0.05170843377709389,
0.07116403430700302,
0.0056261359713971615,
-0.025352641940116882,
-0.045651502907276154,
-0.11816447973251343,
-0.06463485956192017,
-0.06096392497420311,
0.010032502934336662,
-0.08602842688560486,
-0.005752737168222666,
-0.038905419409275055,
0.020961634814739227,
0.022177567705512047,
-0.06374800950288773,
0.020579634234309196,
0.02428325265645981,
-0.038067519664764404,
-0.027447979897260666,
-0.025631971657276154,
0.04523981735110283,
0.015930775552988052,
0.11602908372879028,
-0.09462437033653259,
-0.06576839834451675,
0.045885078608989716,
0.05340541526675224,
-0.05195207893848419,
0.09214850515127182,
-0.054630424827337265,
-0.034328773617744446,
-0.09907400608062744,
-0.0985860824584961,
0.1723993420600891,
-0.005167451687157154,
0.0985131561756134,
-0.09668533504009247,
-0.02561979368329048,
-0.00042715674499049783,
-0.008707878179848194,
-0.0034876400604844093,
0.05228789895772934,
0.013832295313477516,
-0.09619469195604324,
-0.002355105709284544,
0.014853417873382568,
0.018375063315033913,
0.07693276554346085,
-0.0196274071931839,
-0.11430775374174118,
0.030016105622053146,
-0.0014676724094897509,
-0.005913229659199715,
0.06546784937381744,
-0.048710137605667114,
-0.005114689003676176,
0.05519167333841324,
0.05561058223247528,
0.0555499866604805,
-0.06681382656097412,
0.09459442645311356,
0.06588837504386902,
-0.04304753243923187,
-0.04401065781712532,
-0.084794782102108,
0.010820250026881695,
0.11550070345401764,
0.025315461680293083,
0.059471432119607925,
-0.04710555449128151,
-0.023893345147371292,
-0.1045442447066307,
0.15682066977024078,
-0.08762367069721222,
-0.1609339714050293,
-0.15262271463871002,
0.007924295030534267,
-0.05539306625723839,
0.062481388449668884,
0.01587897725403309,
-0.05011395364999771,
-0.097509004175663,
-0.0781698226928711,
0.16074280440807343,
-0.040008027106523514,
-0.0067139724269509315,
0.019384080544114113,
-0.028998970985412598,
0.037481728941202164,
-0.18149974942207336,
-0.00042399982339702547,
-0.03981650620698929,
-0.12568365037441254,
-0.03821958601474762,
-0.0002435794594930485,
0.06791528314352036,
0.07160871475934982,
-0.03756444528698921,
-0.07695647329092026,
0.018937991932034492,
0.16423404216766357,
0.03406999260187149,
0.0764174684882164,
0.09366492927074432,
-0.0992116779088974,
0.042839810252189636,
0.04694695770740509,
0.030543025583028793,
-0.013617944903671741,
0.009111464023590088,
0.057364288717508316,
-0.026044990867376328,
-0.2855544686317444,
-0.008491519838571548,
-0.018577512353658676,
-0.018242713063955307,
0.06653672456741333,
0.04167736694216728,
-0.08543361723423004,
0.04853199049830437,
-0.057129357010126114,
0.033017754554748535,
0.08657342940568924,
0.04481364041566849,
0.09411314129829407,
-0.03852061927318573,
0.0928448736667633,
-0.05343450978398323,
-0.01730263978242874,
0.10895239561796188,
-0.05212629958987236,
0.19984441995620728,
-0.05542440712451935,
0.05304320901632309,
0.09778176993131638,
-0.012297682464122772,
0.03829890862107277,
0.13908231258392334,
-0.05218812823295593,
0.0701780840754509,
-0.057798679918050766,
-0.04570671543478966,
-0.03929326310753822,
0.025126047432422638,
-0.0015059786383062601,
0.03637702018022537,
-0.03639846295118332,
-0.016389120370149612,
-0.004250251688063145,
0.23990653455257416,
0.06864103674888611,
-0.12263846397399902,
-0.06806114315986633,
0.007680533453822136,
-0.10918247699737549,
-0.07056800276041031,
0.050560131669044495,
0.09227775037288666,
-0.08288698643445969,
0.04625425860285759,
0.009941677562892437,
0.06834201514720917,
-0.1269071102142334,
0.020424943417310715,
0.03873457759618759,
0.04996025934815407,
-0.025048209354281425,
0.033767540007829666,
-0.15454964339733124,
0.08355794847011566,
0.03594005107879639,
0.05217009037733078,
-0.051850344985723495,
0.06393814831972122,
0.020664161071181297,
-0.013454703614115715,
0.025296537205576897,
0.01113841962069273,
-0.021044405177235603,
-0.02722349390387535,
-0.06637763231992722,
0.08368989080190659,
0.0757969468832016,
-0.052141401916742325,
0.11965356767177582,
-0.04956943169236183,
0.012887376360595226,
-0.008791451342403889,
0.07678894698619843,
-0.17278778553009033,
-0.13016489148139954,
0.04570740461349487,
-0.14328435063362122,
-0.02361532673239708,
-0.0683397427201271,
-0.05504873767495155,
-0.06849664449691772,
0.16776815056800842,
-0.12310223281383514,
-0.13293670117855072,
-0.08501850813627243,
-0.011659207753837109,
0.1532871425151825,
-0.030504776164889336,
0.00788920372724533,
-0.01625966653227806,
0.13256102800369263,
-0.036839134991168976,
-0.1521182656288147,
-0.04892202839255333,
-0.06982622295618057,
-0.15145331621170044,
-0.03339604660868645,
0.0709058865904808,
0.11042933166027069,
0.051768459379673004,
0.004955780692398548,
0.026154030114412308,
0.004175765439867973,
-0.05275093764066696,
-0.01596708409488201,
0.18155258893966675,
0.053878188133239746,
0.07015430182218552,
-0.15988388657569885,
-0.05501050129532814,
-0.04952569678425789,
0.02338472567498684,
-0.04572038725018501,
0.09851834177970886,
-0.030169997364282608,
0.07861916720867157,
0.24048523604869843,
-0.13009174168109894,
-0.20292791724205017,
0.008100531995296478,
0.029653897508978844,
0.004286543000489473,
0.007447320967912674,
-0.2252366542816162,
0.1217050850391388,
0.08983899652957916,
0.000022674494175589643,
-0.007839522324502468,
-0.18609140813350677,
-0.08175656199455261,
0.07991579174995422,
0.009370953775942326,
0.14636288583278656,
-0.09254135936498642,
-0.03233663737773895,
0.00799075048416853,
-0.08618947863578796,
0.05307875573635101,
0.04601066932082176,
0.08333093672990799,
-0.000344384548952803,
-0.07554971426725388,
0.0501636266708374,
-0.014040566049516201,
0.08683552592992783,
0.045137692242860794,
0.04662003368139267,
-0.03434232249855995,
0.1317095309495926,
0.0025101222563534975,
-0.016767635941505432,
0.13743051886558533,
0.11301147192716599,
0.0566016286611557,
-0.025524068623781204,
-0.06258322298526764,
-0.07317996025085449,
0.010854466818273067,
-0.02168971300125122,
-0.03845086321234703,
-0.06346304714679718,
0.039264801889657974,
0.06361895054578781,
0.0006057072896510363,
-0.04291819408535957,
-0.0249308031052351,
0.05805974081158638,
0.09076172858476639,
0.1935870349407196,
-0.05404338240623474,
-0.007091467268764973,
-0.01859322190284729,
-0.022150494158267975,
0.06959491968154907,
-0.019194062799215317,
0.06498537212610245,
0.08998775482177734,
0.009739437140524387,
0.08250361680984497,
0.0624682791531086,
-0.13092665374279022,
-0.022610822692513466,
0.054222073405981064,
-0.10157212615013123,
-0.13882184028625488,
-0.026776045560836792,
-0.10464324057102203,
-0.13459153473377228,
-0.001524673425592482,
0.1717829406261444,
-0.03694482147693634,
-0.04665593430399895,
-0.015289685688912868,
0.07971733063459396,
0.018986279144883156,
0.13148631155490875,
0.03335361182689667,
-0.014989514835178852,
-0.06187940016388893,
0.1709916591644287,
0.08899687230587006,
-0.0924852266907692,
0.01026139035820961,
0.01641910709440708,
-0.05913509428501129,
-0.004619232378900051,
-0.06381676346063614,
0.07489483803510666,
-0.027661236003041267,
-0.03929920122027397,
0.0016538668423891068,
-0.10027192533016205,
0.049612440168857574,
0.15114344656467438,
0.006732323672622442,
0.15840254724025726,
-0.03801211342215538,
0.06304265558719635,
-0.07443909347057343,
0.07244250178337097,
0.054043591022491455,
0.077092744410038,
-0.0175318643450737,
0.04950478672981262,
-0.04583550617098808,
-0.0012103230692446232,
-0.014867838472127914,
0.0013256454840302467,
-0.09103305637836456,
-0.0557715967297554,
-0.22414062917232513,
0.027182674035429955,
-0.057966072112321854,
-0.035888493061065674,
0.01009699422866106,
-0.014320812188088894,
0.0037725872825831175,
0.03666900098323822,
-0.025698630139231682,
-0.03292259946465492,
-0.02644190937280655,
0.0622963048517704,
-0.12273228168487549,
0.026467060670256615,
0.066986083984375,
-0.087441585958004,
0.07719043642282486,
-0.001150751020759344,
-0.05221286788582802,
-0.0003403985174372792,
0.015397720970213413,
-0.0466935820877552,
-0.03005298785865307,
0.008080702275037766,
-0.05364987999200821,
-0.11217565834522247,
0.027222726494073868,
0.011509912088513374,
-0.02689041756093502,
-0.029515253379940987,
0.07611500471830368,
-0.0648728683590889,
0.05185738950967789,
0.036840587854385376,
0.004279878456145525,
-0.04303182289004326,
-0.01612282730638981,
0.11955458670854568,
0.07575421035289764,
0.05598945915699005,
-0.05076849088072777,
-0.018643727526068687,
-0.15549246966838837,
-0.0018151949625462294,
-0.001929437625221908,
-0.0034693563356995583,
-0.039482321590185165,
-0.03750675544142723,
0.030511964112520218,
0.011348607018589973,
0.17697468400001526,
0.00789629202336073,
0.013845506124198437,
0.009426387026906013,
0.004330449737608433,
0.00992260780185461,
0.034322213381528854,
0.07134363055229187,
-0.014639340341091156,
-0.07820835709571838,
-0.07915910333395004,
0.03339691460132599,
-0.03137211874127388,
-0.021257106214761734,
0.13519278168678284,
0.13520926237106323,
0.10571003705263138,
0.022257044911384583,
0.0011781946523115039,
-0.030045991763472557,
-0.030371157452464104,
0.02252766489982605,
0.05617023631930351,
0.05100676044821739,
-0.014713156037032604,
0.012861935421824455,
0.06663840264081955,
-0.1248408704996109,
0.12015730887651443,
-0.03556256368756294,
-0.028748149052262306,
-0.10618147999048233,
-0.07774176448583603,
-0.018662435933947563,
-0.020540863275527954,
-0.020293312147259712,
-0.16489988565444946,
0.04744347184896469,
0.10693100094795227,
0.021313661709427834,
-0.031753093004226685,
0.0354129783809185,
-0.14833149313926697,
-0.08996517956256866,
0.06471703201532364,
0.0133690619841218,
0.043859656900167465,
0.10841509699821472,
-0.01395803689956665,
0.07986502349376678,
0.14283092319965363,
0.06163913384079933,
0.05539720505475998,
0.0839793011546135,
0.009405604563653469,
-0.020273003727197647,
-0.0394437313079834,
0.007767552509903908,
-0.06154875084757805,
0.03810923546552658,
0.16017183661460876,
0.032013267278671265,
-0.05038701370358467,
0.035466235131025314,
0.18042881786823273,
-0.035907939076423645,
-0.05174586549401283,
-0.17826981842517853,
0.2094329297542572,
0.02218320406973362,
0.043819282203912735,
0.0503164604306221,
-0.08890757709741592,
-0.035476308315992355,
0.20634274184703827,
0.10608167201280594,
0.023099541664123535,
-0.02324679121375084,
0.017861129716038704,
-0.010738099925220013,
0.009054664522409439,
0.08399929851293564,
0.005963027942925692,
0.2186850756406784,
-0.03782465681433678,
0.017044298350811005,
0.03180338442325592,
0.034762781113386154,
-0.07200826704502106,
0.1519903838634491,
-0.04796488583087921,
-0.0020102791022509336,
-0.05651938542723656,
0.021287716925144196,
0.020161496475338936,
-0.305496484041214,
-0.11074564605951309,
0.0019372953101992607,
-0.06324479728937149,
-0.016686799004673958,
-0.021720683202147484,
-0.007077415939420462,
0.05217040330171585,
-0.004130640998482704,
0.03122517094016075,
0.18311353027820587,
-0.005809779744595289,
-0.038014594465494156,
-0.041326384991407394,
0.12208130955696106,
0.02005378156900406,
0.1378607153892517,
0.05886240303516388,
-0.01572526805102825,
0.04402337595820427,
0.023999538272619247,
-0.11315696686506271,
-0.03439118340611458,
-0.025924811139702797,
-0.01039064209908247,
-0.023724766448140144,
0.13309790194034576,
0.019989006221294403,
0.04867016151547432,
0.038839198648929596,
-0.0440860316157341,
0.049436409026384354,
0.04148469120264053,
-0.06811963021755219,
-0.052844636142253876,
0.0427774041891098,
-0.0972164124250412,
0.14066742360591888,
0.1766129583120346,
0.012212634086608887,
0.025281328707933426,
-0.05563770979642868,
-0.009112556464970112,
0.0004402975318953395,
0.11698106676340103,
-0.0031984022352844477,
-0.15876758098602295,
-0.013481962494552135,
-0.08704331517219543,
0.03965440392494202,
-0.22268734872341156,
-0.04600243642926216,
0.1019095703959465,
-0.01114104688167572,
-0.009149998426437378,
0.04974525049328804,
0.00531550869345665,
0.05889158695936203,
-0.016412056982517242,
-0.05120743438601494,
0.009264680556952953,
0.06781871616840363,
-0.08214201033115387,
-0.03245023265480995
] |
null | null |
transformers
|
# MultiBERTs - Seed 23
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #23.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_23')
model = TFBertModel.from_pretrained("google/multiberts-seed_23")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_23')
model = BertModel.from_pretrained("google/multiberts-seed_23")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
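As a quick sanity check of the MLM objective, the checkpoint can also be queried through the `fill-mask` pipeline. This is a minimal sketch, assuming the masked-language-modelling head stored with this pretraining checkpoint loads under the pipeline's default model class; the exact predictions will vary from seed to seed.
```
from transformers import pipeline

# Hypothetical usage: rank candidate tokens for the masked position.
# Predictions are seed-dependent and may differ from original BERT.
unmasker = pipeline('fill-mask', model='google/multiberts-seed_23')
unmasker("The capital of France is [MASK].")
```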
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_23"]}
| null |
google/multiberts-seed_23
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_23",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_23 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 23
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #23.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 23\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #23.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_23 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 23\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #23.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
190,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_23 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 23\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #23.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.06271129101514816,
0.093672975897789,
-0.004096059128642082,
0.044015511870384216,
0.07682158052921295,
0.015517160296440125,
0.05724673345685005,
0.0743616446852684,
-0.0899863913655281,
0.022718796506524086,
-0.011886537075042725,
-0.046587709337472916,
0.07740053534507751,
-0.04542727395892143,
0.060267265886068344,
-0.23367443680763245,
0.050440847873687744,
-0.030657760798931122,
-0.02436699904501438,
0.028531251475214958,
0.11148373037576675,
-0.09539684653282166,
0.07423420250415802,
0.055013787001371384,
0.00413209293037653,
0.01760224811732769,
-0.016126463189721107,
0.005125799681991339,
0.08682037889957428,
0.032102737575769424,
0.08448503166437149,
-0.00240647466853261,
0.08556538820266724,
-0.14218199253082275,
0.006108145695179701,
0.059230219572782516,
0.06282487511634827,
0.04291205853223801,
0.11750616878271103,
0.007983097806572914,
0.08712255209684372,
0.017004305496811867,
0.05284798890352249,
0.0452897809445858,
-0.07485386729240417,
-0.16784779727458954,
-0.09339101612567902,
0.020551154389977455,
-0.0009577449527569115,
0.0070136296562850475,
-0.007302957121282816,
-0.018117910251021385,
-0.01957438886165619,
0.020951667800545692,
0.12262821942567825,
-0.2660175561904907,
-0.016913944855332375,
0.00888446718454361,
0.058363355696201324,
0.05853580683469772,
-0.03773418813943863,
-0.04143485799431801,
0.0430777408182621,
0.0525156669318676,
0.04086323082447052,
-0.025097060948610306,
0.04113002121448517,
-0.015141312032938004,
-0.15351757407188416,
-0.019677558913826942,
0.10628315061330795,
-0.048846494406461716,
-0.11666589230298996,
-0.04851348325610161,
-0.03281792998313904,
0.12210458517074585,
0.009222861379384995,
-0.03655516728758812,
0.046675994992256165,
0.030382545664906502,
0.06396351754665375,
-0.06223572418093681,
-0.11556817591190338,
0.02570933848619461,
-0.05065444856882095,
0.10854002088308334,
0.09402985125780106,
0.04817737266421318,
-0.006845325697213411,
0.055846866220235825,
-0.08367299288511276,
-0.07753154635429382,
-0.05121367797255516,
-0.08884511142969131,
-0.04098190739750862,
-0.03830863535404205,
-0.08470065146684647,
-0.16515739262104034,
-0.00371694378554821,
0.11120092123746872,
-0.06209385022521019,
0.008690985850989819,
-0.08961313962936401,
-0.021592574194073677,
0.09419386088848114,
0.16178278625011444,
-0.1098635345697403,
0.04840288683772087,
-0.011134017258882523,
0.011052706278860569,
-0.02344895899295807,
0.03181808814406395,
0.01219760999083519,
-0.009948224760591984,
0.05166546627879143,
0.02440003678202629,
-0.020131703466176987,
0.042205795645713806,
-0.02129255048930645,
-0.04286570101976395,
0.05418482422828674,
-0.13387349247932434,
-0.010868321172893047,
0.0024561737664043903,
-0.003262751502916217,
0.061221539974212646,
0.064360611140728,
-0.027974799275398254,
-0.0903083086013794,
0.023154979571700096,
-0.08195152133703232,
-0.04687107354402542,
-0.060124803334474564,
-0.1562608927488327,
0.02727578394114971,
-0.07561322301626205,
-0.048925552517175674,
-0.0925685316324234,
-0.09848484396934509,
-0.027069801464676857,
0.06023617461323738,
-0.017396818846464157,
0.03716296702623367,
0.030314451083540916,
-0.0076308222487568855,
-0.04154149442911148,
0.04594535753130913,
0.00628973264247179,
-0.014873500913381577,
0.007920623756945133,
-0.04444335773587227,
0.05344498157501221,
-0.009633788838982582,
0.044676508754491806,
-0.07034099847078323,
0.021365635097026825,
-0.1401444971561432,
0.061247482895851135,
-0.09731371700763702,
-0.08383993804454803,
-0.050136957317590714,
-0.042407453060150146,
-0.07102512568235397,
0.030822401866316795,
0.010284936055541039,
0.06135963276028633,
-0.1470383256673813,
-0.04964170977473259,
0.1377391219139099,
-0.13663147389888763,
0.03529500216245651,
0.09488752484321594,
-0.05166346952319145,
0.04327143356204033,
0.11573466658592224,
0.05798891931772232,
0.07083849608898163,
-0.04597621038556099,
-0.013919489458203316,
0.00772097148001194,
0.03583782538771629,
0.14482471346855164,
0.06657512485980988,
-0.06800901889801025,
-0.08296684175729752,
0.03552420809864998,
-0.07522996515035629,
-0.04439551383256912,
-0.059526342898607254,
-0.005299222189933062,
-0.010124714113771915,
-0.056257955729961395,
-0.006476975046098232,
-0.02436366118490696,
-0.012612365186214447,
-0.017599012702703476,
-0.05138179287314415,
0.053734757006168365,
0.06272026151418686,
-0.08840660005807877,
0.05609646067023277,
-0.05464284494519234,
0.016848212108016014,
-0.07944848388433456,
-0.001977315405383706,
-0.1784757524728775,
0.007647245656698942,
0.11035364866256714,
-0.09724365174770355,
0.05090803653001785,
0.1624612957239151,
0.02161041647195816,
0.06809798628091812,
-0.05172904208302498,
0.07066363096237183,
0.0050780680030584335,
-0.02507861517369747,
-0.045270953327417374,
-0.11800464987754822,
-0.06510474532842636,
-0.061287201941013336,
0.00827073585242033,
-0.0841922014951706,
-0.005536671727895737,
-0.03890378400683403,
0.020316801965236664,
0.02203948237001896,
-0.06404853612184525,
0.02078375406563282,
0.024796202778816223,
-0.038504477590322495,
-0.027844520285725594,
-0.025679970160126686,
0.04455209895968437,
0.016282707452774048,
0.1150970309972763,
-0.09459967166185379,
-0.06557141989469528,
0.04599441587924957,
0.05283963680267334,
-0.05242825672030449,
0.09223079681396484,
-0.054832253605127335,
-0.03394398093223572,
-0.09905174374580383,
-0.09848382323980331,
0.1733253449201584,
-0.004753969609737396,
0.09878108650445938,
-0.0968765914440155,
-0.02620752342045307,
-0.0007869984838180244,
-0.008416024968028069,
-0.0040158722549676895,
0.052255187183618546,
0.013940234668552876,
-0.09552503377199173,
-0.0016253815265372396,
0.014300876297056675,
0.018508505076169968,
0.07712993770837784,
-0.019522560760378838,
-0.11436212807893753,
0.030059706419706345,
-0.0011243175249546766,
-0.006104812026023865,
0.06564781814813614,
-0.04965027794241905,
-0.005780264735221863,
0.05480063334107399,
0.05536121502518654,
0.05561091750860214,
-0.06658197939395905,
0.09455700218677521,
0.06609386205673218,
-0.04356628283858299,
-0.04270085692405701,
-0.08529922366142273,
0.010784660466015339,
0.1149655431509018,
0.025665266439318657,
0.05864404886960983,
-0.04684881120920181,
-0.024075184017419815,
-0.10414984822273254,
0.1565687209367752,
-0.08705358952283859,
-0.1612839698791504,
-0.15286502242088318,
0.006927744019776583,
-0.05623217672109604,
0.06189721077680588,
0.01589527167379856,
-0.05038629099726677,
-0.09763315320014954,
-0.07849478721618652,
0.16058629751205444,
-0.040412724018096924,
-0.006948066875338554,
0.019840968772768974,
-0.029536155983805656,
0.037224121391773224,
-0.18187682330608368,
-0.0006412694347091019,
-0.03967868909239769,
-0.12640032172203064,
-0.03856775537133217,
-0.0002090168127324432,
0.06797873973846436,
0.07080735266208649,
-0.03766337037086487,
-0.07638754695653915,
0.018562737852334976,
0.16447274386882782,
0.034116290509700775,
0.07600169628858566,
0.09254948794841766,
-0.09909650683403015,
0.042376283556222916,
0.0466487817466259,
0.03095310367643833,
-0.013936684466898441,
0.009063148871064186,
0.05781254917383194,
-0.025693807750940323,
-0.2855739891529083,
-0.008550108410418034,
-0.01884998381137848,
-0.018011029809713364,
0.06625068932771683,
0.04164910316467285,
-0.08604875206947327,
0.04908647760748863,
-0.057249363511800766,
0.03270112723112106,
0.08618568629026413,
0.04551481455564499,
0.09450749307870865,
-0.03853502869606018,
0.09304987639188766,
-0.05304896458983421,
-0.01764269545674324,
0.10898860543966293,
-0.05105499178171158,
0.19820158183574677,
-0.05534544959664345,
0.05386917665600777,
0.09788811951875687,
-0.011654786765575409,
0.03781907260417938,
0.1387750208377838,
-0.052012648433446884,
0.07027921825647354,
-0.0577084943652153,
-0.045655567198991776,
-0.039119936525821686,
0.02440836653113365,
-0.001819854136556387,
0.03741486743092537,
-0.03667984530329704,
-0.016824278980493546,
-0.003809344256296754,
0.24152934551239014,
0.06806226074695587,
-0.12361940741539001,
-0.06783568114042282,
0.007508610840886831,
-0.1092391312122345,
-0.07018480449914932,
0.051022645086050034,
0.09074372798204422,
-0.08254916965961456,
0.04633178189396858,
0.00976161751896143,
0.06809677183628082,
-0.12801195681095123,
0.020549524575471878,
0.04004999250173569,
0.05015409365296364,
-0.025177543982863426,
0.03347158432006836,
-0.15498030185699463,
0.08381582796573639,
0.035798169672489166,
0.05221966281533241,
-0.051638077944517136,
0.0641300305724144,
0.020434174686670303,
-0.014310305938124657,
0.025820983573794365,
0.011040935292840004,
-0.018399199470877647,
-0.0274250116199255,
-0.06719879806041718,
0.08394555747509003,
0.07584821432828903,
-0.052301324903964996,
0.12002315372228622,
-0.049882326275110245,
0.01269326638430357,
-0.008956775069236755,
0.07773767411708832,
-0.1725308895111084,
-0.13080887496471405,
0.045746613293886185,
-0.14273391664028168,
-0.02532418817281723,
-0.06790708750486374,
-0.055163316428661346,
-0.07034726440906525,
0.17024914920330048,
-0.12285857647657394,
-0.13348478078842163,
-0.08516260236501694,
-0.012219449505209923,
0.153498113155365,
-0.030239904299378395,
0.007708379533141851,
-0.016328515484929085,
0.13154666125774384,
-0.036884505301713943,
-0.15228486061096191,
-0.048894003033638,
-0.07023420929908752,
-0.1514209359884262,
-0.03346617519855499,
0.07046443969011307,
0.1103263720870018,
0.051593877375125885,
0.004718541167676449,
0.02623574808239937,
0.0038112218026071787,
-0.052772048860788345,
-0.016453122720122337,
0.1815738081932068,
0.052631866186857224,
0.07093022018671036,
-0.1597723364830017,
-0.05609063804149628,
-0.04989361763000488,
0.02375844307243824,
-0.04599388688802719,
0.09952600300312042,
-0.029771838337183,
0.07904425263404846,
0.24020537734031677,
-0.12998177111148834,
-0.20196028053760529,
0.007759195752441883,
0.029028648510575294,
0.004621282685548067,
0.007947645150125027,
-0.2248816043138504,
0.12120328098535538,
0.08966211974620819,
0.000014926239600754343,
-0.009242377243936062,
-0.18669387698173523,
-0.08149431645870209,
0.08070871978998184,
0.010211537592113018,
0.14658309519290924,
-0.09276880323886871,
-0.03197333216667175,
0.008852868340909481,
-0.08587276190519333,
0.053079526871442795,
0.04726918414235115,
0.08386595547199249,
-0.0008654801058582962,
-0.07533372193574905,
0.0499756745994091,
-0.014438006095588207,
0.08653361350297928,
0.044401414692401886,
0.046873316168785095,
-0.0341634601354599,
0.13159039616584778,
0.0033259361516684294,
-0.016777703538537025,
0.13808265328407288,
0.11275716871023178,
0.05714448168873787,
-0.024931909516453743,
-0.06288642436265945,
-0.07311544567346573,
0.010549532249569893,
-0.02191818505525589,
-0.039096325635910034,
-0.06393691152334213,
0.039596788585186005,
0.063997782766819,
0.0007689015474170446,
-0.04202698543667793,
-0.024112185463309288,
0.058112312108278275,
0.09030845016241074,
0.19409555196762085,
-0.05322514846920967,
-0.006880374625325203,
-0.019224274903535843,
-0.0226154625415802,
0.06942445784807205,
-0.017051897943019867,
0.06460505723953247,
0.09026345610618591,
0.009345903992652893,
0.0822792574763298,
0.06253056973218918,
-0.13081267476081848,
-0.022860577329993248,
0.054067086428403854,
-0.10205097496509552,
-0.13710926473140717,
-0.027218030765652657,
-0.10667562484741211,
-0.13468186557292938,
-0.001360872876830399,
0.1724282056093216,
-0.03687261790037155,
-0.046987954527139664,
-0.015059546567499638,
0.07989834994077682,
0.019009165465831757,
0.13116790354251862,
0.03313252702355385,
-0.015071848407387733,
-0.0620262511074543,
0.17119263112545013,
0.0885949432849884,
-0.09180866181850433,
0.010600279085338116,
0.017353862524032593,
-0.05893661081790924,
-0.004757630173116922,
-0.06430082768201828,
0.07498981058597565,
-0.028992364183068275,
-0.038625024259090424,
0.0002704459766391665,
-0.0998329445719719,
0.04982305318117142,
0.15126828849315643,
0.006989145651459694,
0.15832962095737457,
-0.037792086601257324,
0.063392274081707,
-0.07467902451753616,
0.07247491925954819,
0.053504910320043564,
0.07728353142738342,
-0.017819847911596298,
0.04815729334950447,
-0.046068113297224045,
-0.0018543376354500651,
-0.014818374067544937,
0.0012958644656464458,
-0.09161635488271713,
-0.05541913956403732,
-0.22423748672008514,
0.026653839275240898,
-0.057625479996204376,
-0.03547784313559532,
0.010382700711488724,
-0.013879669830203056,
0.003487678710371256,
0.03710838034749031,
-0.025670243427157402,
-0.03296635299921036,
-0.02629578299820423,
0.06235181540250778,
-0.12322439253330231,
0.026774073019623756,
0.0667833685874939,
-0.08733253180980682,
0.07717470079660416,
-0.0018866678001359105,
-0.05269384756684303,
-0.0008956632227636874,
0.012445353902876377,
-0.04670802876353264,
-0.030145741999149323,
0.00801943615078926,
-0.053347039967775345,
-0.11191842705011368,
0.027410447597503662,
0.012095518410205841,
-0.026838848367333412,
-0.029218286275863647,
0.07667147368192673,
-0.06509369611740112,
0.05167541652917862,
0.037451013922691345,
0.004614544101059437,
-0.04288458079099655,
-0.015893589705228806,
0.1197613850235939,
0.07574561983346939,
0.05565854161977768,
-0.0513296015560627,
-0.018515268340706825,
-0.15564274787902832,
-0.0016697220271453261,
-0.0018627417739480734,
-0.0032400551717728376,
-0.04080810025334358,
-0.037291090935468674,
0.030596084892749786,
0.011430861428380013,
0.17948095500469208,
0.00790612306445837,
0.013618447817862034,
0.00946184154599905,
0.004304525442421436,
0.010792733170092106,
0.03421681374311447,
0.07176616787910461,
-0.015138911083340645,
-0.07854912430047989,
-0.07881850749254227,
0.03327421098947525,
-0.03138703852891922,
-0.021859606727957726,
0.13489049673080444,
0.13575276732444763,
0.10495249927043915,
0.022449564188718796,
0.0014484705170616508,
-0.029912620782852173,
-0.028579970821738243,
0.021160565316677094,
0.05634157732129097,
0.051253270357847214,
-0.014340389519929886,
0.012490687891840935,
0.06673847138881683,
-0.12479402124881744,
0.12037908285856247,
-0.03530363366007805,
-0.02863001637160778,
-0.10622493177652359,
-0.07767382264137268,
-0.018645500764250755,
-0.020218219608068466,
-0.02008996531367302,
-0.16539131104946136,
0.04704669862985611,
0.10645918548107147,
0.021979624405503273,
-0.031785715371370316,
0.03587654232978821,
-0.1467399001121521,
-0.08953509479761124,
0.06473837047815323,
0.013484855182468891,
0.043827690184116364,
0.10779543220996857,
-0.014328524470329285,
0.07938005775213242,
0.14279605448246002,
0.06152307242155075,
0.05495435371994972,
0.08405140042304993,
0.009127994999289513,
-0.019922297447919846,
-0.039910584688186646,
0.007716426160186529,
-0.06217805668711662,
0.03792114555835724,
0.16133370995521545,
0.031888894736766815,
-0.05055062100291252,
0.035354360938072205,
0.17991304397583008,
-0.035946786403656006,
-0.05123161897063255,
-0.17826023697853088,
0.2091449797153473,
0.022165771573781967,
0.04356767237186432,
0.049969565123319626,
-0.08886376768350601,
-0.03548486530780792,
0.20640580356121063,
0.107392318546772,
0.023082472383975983,
-0.023028312250971794,
0.018191857263445854,
-0.01054462417960167,
0.008773993700742722,
0.08339231461286545,
0.005437173414975405,
0.21801243722438812,
-0.03770958259701729,
0.016968531534075737,
0.03220149874687195,
0.03487163037061691,
-0.07137186825275421,
0.15035435557365417,
-0.04702077805995941,
-0.0017643211176618934,
-0.05705133453011513,
0.02109122835099697,
0.019331736490130424,
-0.3042847514152527,
-0.11024000495672226,
0.001305260811932385,
-0.06366575509309769,
-0.0172317773103714,
-0.021996866911649704,
-0.006998282391577959,
0.05231047421693802,
-0.0035396472085267305,
0.03137747943401337,
0.182850643992424,
-0.005907811224460602,
-0.03829488158226013,
-0.041370149701833725,
0.1221063956618309,
0.019956160336732864,
0.1384550929069519,
0.05911695957183838,
-0.015755074098706245,
0.04430520907044411,
0.02354309894144535,
-0.11299659311771393,
-0.03416259214282036,
-0.025894900783896446,
-0.010232413187623024,
-0.023579685017466545,
0.1332675963640213,
0.020276185125112534,
0.04978669062256813,
0.038574133068323135,
-0.0432438850402832,
0.04996249079704285,
0.04173160716891289,
-0.0681685283780098,
-0.05303453654050827,
0.04264380410313606,
-0.09682169556617737,
0.1408480703830719,
0.17681343853473663,
0.012451089918613434,
0.025428809225559235,
-0.055830392986536026,
-0.008568944409489632,
0.00034282123669981956,
0.11632148176431656,
-0.0033827282022684813,
-0.15881843864917755,
-0.013652036897838116,
-0.08630086481571198,
0.03952808678150177,
-0.2236066609621048,
-0.04627051576972008,
0.10204309225082397,
-0.010964660905301571,
-0.009966581128537655,
0.04937633499503136,
0.00522206025198102,
0.059633929282426834,
-0.01595555804669857,
-0.04936946928501129,
0.009379236027598381,
0.0676533654332161,
-0.08293717354536057,
-0.03216211870312691
] |
null | null |
transformers
|
# MultiBERTs - Seed 24
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #24.
## Model Description
This model is a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_24')
model = TFBertModel.from_pretrained("google/multiberts-seed_24")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_24')
model = BertModel.from_pretrained("google/multiberts-seed_24")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
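The snippets above return per-token hidden states. If a single vector per sentence is useful (for instance, to compare representations across the 25 seeds), one simple option is to mean-pool the final hidden states; the pooling choice below is an illustrative assumption, not part of the MultiBERTs release.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_24')
model = BertModel.from_pretrained('google/multiberts-seed_24')

encoded = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    hidden = model(**encoded).last_hidden_state              # [1, seq_len, 768]
# Mask out padding tokens, then average the remaining token embeddings.
mask = encoded['attention_mask'].unsqueeze(-1).float()       # [1, seq_len, 1]
sentence_vector = (hidden * mask).sum(dim=1) / mask.sum(dim=1)  # [1, 768]
```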
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_24"]}
| null |
google/multiberts-seed_24
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_24",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_24 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs - Seed 24
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #24.
## Model Description
This model is a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs - Seed 24\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #24.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_24 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs - Seed 24\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #24.",
"## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
69,
189,
247,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_24 #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs - Seed 24\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #24.## Model Description\n\nThis model is a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details."
] |
[
-0.0695870965719223,
0.09143710136413574,
-0.003924402873963118,
0.05471023917198181,
0.07511035352945328,
0.008211145177483559,
0.049394212663173676,
0.06978143751621246,
-0.09142931550741196,
0.021875204518437386,
-0.010191062465310097,
-0.04308287799358368,
0.0759970098733902,
-0.027709314599633217,
0.05593838542699814,
-0.23398126661777496,
0.03955309838056564,
-0.024076474830508232,
-0.02499311789870262,
0.02992847003042698,
0.11631261557340622,
-0.10570370405912399,
0.07332654297351837,
0.05085274204611778,
0.00403012428432703,
0.020258227363228798,
-0.018338488414883614,
0.0010149353183805943,
0.08804566413164139,
0.026139715686440468,
0.07913579791784286,
-0.004687362816184759,
0.08220159262418747,
-0.1332637518644333,
0.009315828792750835,
0.05409013479948044,
0.06057602912187576,
0.046637438237667084,
0.11826667189598083,
0.015912862494587898,
0.09647238999605179,
0.0038820230402052402,
0.04486262798309326,
0.04967478662729263,
-0.06427183002233505,
-0.18107976019382477,
-0.09499221295118332,
0.028908373787999153,
-0.0056783840991556644,
0.010897350497543812,
-0.008765325881540775,
-0.020505936816334724,
-0.025229286402463913,
0.02534281276166439,
0.1181831955909729,
-0.25701072812080383,
-0.015754031017422676,
0.0020034180488437414,
0.048257969319820404,
0.06300079077482224,
-0.04274371266365051,
-0.04709106683731079,
0.03922799229621887,
0.061478715389966965,
0.0595625564455986,
-0.022789115086197853,
0.03882955014705658,
-0.011212185956537724,
-0.15393368899822235,
-0.017170101404190063,
0.09775776416063309,
-0.04401228204369545,
-0.1140802875161171,
-0.06241094693541527,
-0.03275240957736969,
0.11196292191743851,
0.010698635131120682,
-0.037207864224910736,
0.04729720950126648,
0.02545866370201111,
0.05888789892196655,
-0.07470942288637161,
-0.11717969179153442,
0.03350512310862541,
-0.06984096020460129,
0.10502798110246658,
0.08998779952526093,
0.05240163952112198,
-0.011501966044306755,
0.0525570847094059,
-0.07172452658414841,
-0.08013924956321716,
-0.04800277575850487,
-0.08350849896669388,
-0.024914590641856194,
-0.03004346787929535,
-0.07610858231782913,
-0.1530362069606781,
-0.008518729358911514,
0.08781164139509201,
-0.07081645727157593,
-0.007654738612473011,
-0.0873069018125534,
-0.02357301488518715,
0.08782600611448288,
0.16231870651245117,
-0.11612387001514435,
0.04817814379930496,
0.0038192979991436005,
0.0094765555113554,
-0.025999300181865692,
0.036702390760183334,
0.011149299331009388,
-0.011855279095470905,
0.0424823984503746,
0.030096495524048805,
-0.017580488696694374,
0.03886500746011734,
-0.016720686107873917,
-0.04283170402050018,
0.059134285897016525,
-0.1406008005142212,
-0.008498920127749443,
0.006687475368380547,
-0.011336361058056355,
0.05530785024166107,
0.05701068043708801,
-0.03510654717683792,
-0.09249245375394821,
0.019542938098311424,
-0.08523666113615036,
-0.046506427228450775,
-0.06069578975439072,
-0.15501442551612854,
0.03241945430636406,
-0.08164480328559875,
-0.054376304149627686,
-0.09216142445802689,
-0.10189121961593628,
-0.01937430165708065,
0.05217783898115158,
-0.013323173858225346,
0.0361853763461113,
0.03025842271745205,
-0.012796057388186455,
-0.041927359998226166,
0.045515429228544235,
0.012872172519564629,
-0.014890724793076515,
0.01129708532243967,
-0.03967129439115524,
0.05348842218518257,
-0.013061690144240856,
0.047492582350969315,
-0.0680324137210846,
0.017830269411206245,
-0.12999926507472992,
0.06688312441110611,
-0.0952516719698906,
-0.0900825709104538,
-0.04951916262507439,
-0.05304738134145737,
-0.06670680642127991,
0.025227030739188194,
0.007694779895246029,
0.06612418591976166,
-0.1495738923549652,
-0.052309222519397736,
0.14085200428962708,
-0.1262948364019394,
0.03315954655408859,
0.09636864811182022,
-0.054563168436288834,
0.038638968020677567,
0.12045895308256149,
0.04274803400039673,
0.07085398584604263,
-0.049481313675642014,
-0.025163553655147552,
0.015514981001615524,
0.0341348834335804,
0.12505844235420227,
0.07253704220056534,
-0.06220308318734169,
-0.06788205355405807,
0.038541439920663834,
-0.07608063519001007,
-0.03403095901012421,
-0.05918978527188301,
-0.007344583515077829,
-0.013205978088080883,
-0.06032295897603035,
0.002487078309059143,
-0.026293126866221428,
-0.011263814754784107,
-0.010253881104290485,
-0.051073815673589706,
0.050514571368694305,
0.06260339170694351,
-0.0795290544629097,
0.051192156970500946,
-0.05960056930780411,
0.022496843710541725,
-0.0808483213186264,
-0.0053652371279895306,
-0.17876683175563812,
0.014658983796834946,
0.10688690096139908,
-0.09312755614519119,
0.052272189408540726,
0.16200505197048187,
0.02355009689927101,
0.06747929006814957,
-0.05283264070749283,
0.0686725303530693,
-0.000927957589738071,
-0.03035152703523636,
-0.04125957936048508,
-0.10726479440927505,
-0.062212251126766205,
-0.058610882610082626,
0.0053767673671245575,
-0.08196131139993668,
-0.00840698927640915,
-0.025806160643696785,
0.021621309220790863,
0.0296307485550642,
-0.05877188965678215,
0.020346039906144142,
0.024507485330104828,
-0.033822618424892426,
-0.02604690194129944,
-0.0289540383964777,
0.039825186133384705,
0.0048926230520009995,
0.12161936610937119,
-0.08844290673732758,
-0.043334223330020905,
0.04829326644539833,
0.06074894219636917,
-0.041517432779073715,
0.09470970183610916,
-0.06185506284236908,
-0.028958609327673912,
-0.094477079808712,
-0.0946570336818695,
0.17356684803962708,
-0.004240242764353752,
0.10280487686395645,
-0.10222934186458588,
-0.03554520756006241,
0.0007135223131626844,
0.004138688091188669,
-0.0078828614205122,
0.05566674470901489,
0.006781998090445995,
-0.1017637848854065,
0.0008725138031877577,
0.014977501705288887,
0.01914687268435955,
0.09023983031511307,
-0.018823379650712013,
-0.11675722897052765,
0.023012744262814522,
-0.00021889631170779467,
-0.007890855893492699,
0.06656396389007568,
-0.030400266870856285,
0.0014941657427698374,
0.05783529579639435,
0.05323027819395065,
0.056940458714962006,
-0.06462588161230087,
0.09134956449270248,
0.06370409578084946,
-0.0467023141682148,
-0.0445588082075119,
-0.07712315768003464,
0.020553577691316605,
0.12199150770902634,
0.02907741442322731,
0.05923888087272644,
-0.04258839413523674,
-0.023086534813046455,
-0.0997503474354744,
0.15427066385746002,
-0.09458280354738235,
-0.16121506690979004,
-0.151573047041893,
0.007293841801583767,
-0.061427824199199677,
0.05668177083134651,
0.012563807889819145,
-0.057658303529024124,
-0.09467784315347672,
-0.08231781423091888,
0.16227108240127563,
-0.04187551885843277,
-0.012715261429548264,
0.01717573218047619,
-0.027316227555274963,
0.03332076594233513,
-0.1852378100156784,
-0.002885011723265052,
-0.038534119725227356,
-0.1203104555606842,
-0.04174225032329559,
0.012048200704157352,
0.06459145247936249,
0.07044722884893417,
-0.04180678725242615,
-0.07243993133306503,
0.015001981519162655,
0.16028107702732086,
0.03576510772109032,
0.0781380832195282,
0.10078047960996628,
-0.08960561454296112,
0.044186607003211975,
0.04189831018447876,
0.03436926379799843,
-0.010992451570928097,
0.007789667695760727,
0.05651876702904701,
-0.02513015642762184,
-0.2938593327999115,
-0.01132297795265913,
-0.026637373492121696,
-0.016986900940537453,
0.06052202731370926,
0.044161226600408554,
-0.09583842009305954,
0.046008750796318054,
-0.0547344870865345,
0.03361254557967186,
0.08019056171178818,
0.03623456880450249,
0.10098471492528915,
-0.040194544941186905,
0.08946328610181808,
-0.055967848747968674,
-0.025557691231369972,
0.11891987919807434,
-0.06078479439020157,
0.20237764716148376,
-0.07039166986942291,
0.05999789386987686,
0.09639974683523178,
-0.004012001678347588,
0.03159567341208458,
0.14721940457820892,
-0.054432857781648636,
0.07215017825365067,
-0.052486300468444824,
-0.051341477781534195,
-0.03515058010816574,
0.017252633348107338,
0.00831662304699421,
0.04738190770149231,
-0.03322644531726837,
-0.01119069941341877,
-0.0057472893968224525,
0.2527799606323242,
0.05443775653839111,
-0.12488348037004471,
-0.07321425527334213,
0.004625267349183559,
-0.10967967659235,
-0.06751078367233276,
0.049161236733198166,
0.10233274102210999,
-0.08063290268182755,
0.04091009870171547,
0.00875516701489687,
0.06845054030418396,
-0.11841641366481781,
0.017887983471155167,
0.032877080142498016,
0.04915248602628708,
-0.016663135960698128,
0.03486878424882889,
-0.14771825075149536,
0.08847133815288544,
0.03355613723397255,
0.05220611393451691,
-0.06036758795380592,
0.06459513306617737,
0.02654697187244892,
-0.028752196580171585,
0.03071138635277748,
0.013601490296423435,
-0.009400316514074802,
-0.03312378004193306,
-0.07103440910577774,
0.07820934802293777,
0.0750778540968895,
-0.055246591567993164,
0.115534208714962,
-0.0506284199655056,
0.009567325934767723,
-0.009424244984984398,
0.07551051676273346,
-0.1651095449924469,
-0.13064388930797577,
0.038994599133729935,
-0.1318032294511795,
-0.03678170591592789,
-0.06737690418958664,
-0.0607585646212101,
-0.05485798418521881,
0.16914774477481842,
-0.1305406093597412,
-0.13497628271579742,
-0.09253352880477905,
-0.008365066722035408,
0.15821309387683868,
-0.0415971577167511,
0.012776755727827549,
-0.019520148634910583,
0.14069083333015442,
-0.040384650230407715,
-0.1566006988286972,
-0.05111219361424446,
-0.06872555613517761,
-0.14956137537956238,
-0.022637447342276573,
0.07349657267332077,
0.1134977638721466,
0.05300387740135193,
0.005178349558264017,
0.026574313640594482,
-0.007030009292066097,
-0.0543314665555954,
-0.015342259779572487,
0.18179036676883698,
0.05964859947562218,
0.0829518660902977,
-0.15463370084762573,
-0.07368937879800797,
-0.0407227985560894,
0.023540223017334938,
-0.02732738107442856,
0.09358397126197815,
-0.033607251942157745,
0.07858065515756607,
0.23187477886676788,
-0.1238548681139946,
-0.2050333023071289,
0.005515565164387226,
0.02861502394080162,
0.010319882072508335,
0.015292877331376076,
-0.2214822620153427,
0.12243190407752991,
0.08213328570127487,
-0.003390663769096136,
0.008812226355075836,
-0.1786285936832428,
-0.08227556943893433,
0.07883071154356003,
0.010537419468164444,
0.14557240903377533,
-0.08249253034591675,
-0.03160414472222328,
0.00719765480607748,
-0.0871368795633316,
0.04976323992013931,
0.04608622565865517,
0.0847242921590805,
-0.006876516621559858,
-0.05914592742919922,
0.0488203726708889,
-0.01709398813545704,
0.08216682821512222,
0.03529803827404976,
0.04541425406932831,
-0.03812091425061226,
0.11250444501638412,
0.012214981950819492,
-0.018859384581446648,
0.14501415193080902,
0.10142454504966736,
0.06334496289491653,
-0.035258080810308456,
-0.06561530381441116,
-0.07870493829250336,
0.014103654772043228,
-0.02085818536579609,
-0.04083077609539032,
-0.06608390063047409,
0.044033925980329514,
0.06580687314271927,
-0.005028270184993744,
-0.04423455148935318,
-0.023301931098103523,
0.06590579450130463,
0.09588714689016342,
0.19945096969604492,
-0.043627385050058365,
-0.008297409862279892,
-0.02641562558710575,
-0.025618162006139755,
0.0649065375328064,
-0.02059047669172287,
0.06601744145154953,
0.09494613111019135,
0.011303148232400417,
0.08202529698610306,
0.06019243597984314,
-0.12573090195655823,
-0.01872776262462139,
0.057958535850048065,
-0.09394027292728424,
-0.14649111032485962,
-0.03287871927022934,
-0.1028662770986557,
-0.1425723284482956,
-0.0029114405624568462,
0.16978992521762848,
-0.036861602216959,
-0.04847491532564163,
-0.017463479191064835,
0.07841978967189789,
0.01676884852349758,
0.11969351023435593,
0.038574714213609695,
-0.012290963903069496,
-0.06217453256249428,
0.16322806477546692,
0.08735906332731247,
-0.09249260276556015,
0.01027517206966877,
0.024707753211259842,
-0.056712474673986435,
-0.009316354990005493,
-0.05234608054161072,
0.06770388782024384,
-0.031329263001680374,
-0.034067075699567795,
0.0010642862180247903,
-0.10520359128713608,
0.05980375036597252,
0.15178146958351135,
0.0020864782854914665,
0.15972620248794556,
-0.045170076191425323,
0.06303789466619492,
-0.06874433159828186,
0.07870995253324509,
0.041776858270168304,
0.06512424349784851,
-0.01950467936694622,
0.060346998274326324,
-0.04477784410119057,
0.010896082036197186,
-0.015457162633538246,
0.006046422757208347,
-0.07741910219192505,
-0.05671859532594681,
-0.23194944858551025,
0.030841512605547905,
-0.054247964173555374,
-0.034768618643283844,
0.001910771825350821,
-0.010780341923236847,
-0.0025022346526384354,
0.043622761964797974,
-0.029752571135759354,
-0.03200918436050415,
-0.025105874985456467,
0.06327995657920837,
-0.1279325932264328,
0.027018854394555092,
0.05923648923635483,
-0.0795871838927269,
0.08787288516759872,
0.0013367229839786887,
-0.05707515776157379,
0.0023820761125534773,
0.0227408055216074,
-0.045098546892404556,
-0.03245484456419945,
0.008460993878543377,
-0.046152401715517044,
-0.11030305922031403,
0.03363783285021782,
0.018447304144501686,
-0.03140637278556824,
-0.026796448975801468,
0.0685572624206543,
-0.06926556676626205,
0.048913467675447464,
0.04148676246404648,
0.011950702406466007,
-0.0495278462767601,
-0.009965192526578903,
0.11458751559257507,
0.06944088637828827,
0.05053264647722244,
-0.047725677490234375,
-0.017714861780405045,
-0.16375690698623657,
-0.003494028700515628,
-0.0051715681329369545,
-0.004398426506668329,
-0.04622349515557289,
-0.03582436218857765,
0.03378066420555115,
0.0092008663341403,
0.1841239184141159,
0.009028846397995949,
-0.003653257852420211,
0.011401284486055374,
0.0025695613585412502,
0.009519508108496666,
0.029700979590415955,
0.07454946637153625,
-0.0280635803937912,
-0.08014620095491409,
-0.06709419935941696,
0.022659869864583015,
-0.03756124526262283,
-0.016039440408349037,
0.14237000048160553,
0.13302813470363617,
0.12265075743198395,
0.015805866569280624,
0.015795473009347916,
-0.02630414068698883,
-0.04028653725981712,
-0.0019604964181780815,
0.051052026450634,
0.049587737768888474,
-0.021164098754525185,
0.015550472773611546,
0.06903302669525146,
-0.1317298710346222,
0.12859505414962769,
-0.03388950973749161,
-0.020227672532200813,
-0.10570508241653442,
-0.08127366006374359,
-0.0216568224132061,
-0.01203898061066866,
-0.017755160108208656,
-0.16596820950508118,
0.043423350900411606,
0.09596164524555206,
0.025105487555265427,
-0.03895103558897972,
0.038753874599933624,
-0.14844673871994019,
-0.0897129699587822,
0.07755645364522934,
0.01554299145936966,
0.03818149492144585,
0.11006198078393936,
-0.008938388898968697,
0.07897022366523743,
0.13243703544139862,
0.06648179888725281,
0.05423710495233536,
0.06987302005290985,
0.006424633786082268,
-0.025220870971679688,
-0.03692840412259102,
0.006095459684729576,
-0.05596242472529411,
0.040810469537973404,
0.17651884257793427,
0.029407871887087822,
-0.05171429365873337,
0.029453979805111885,
0.17600825428962708,
-0.038060445338487625,
-0.05236135050654411,
-0.1661175638437271,
0.2031947821378708,
0.02869364619255066,
0.035440266132354736,
0.053324420005083084,
-0.08601108193397522,
-0.03873657435178757,
0.19506202638149261,
0.11848422884941101,
0.019068986177444458,
-0.01911143958568573,
0.019768603146076202,
-0.009849823080003262,
0.0024590305984020233,
0.08392950147390366,
0.01597539335489273,
0.2452893704175949,
-0.04303685575723648,
0.021818118169903755,
0.02593107707798481,
0.038344889879226685,
-0.06704497337341309,
0.14706115424633026,
-0.05143940448760986,
0.0025345198810100555,
-0.055017635226249695,
0.015163936652243137,
0.013404021970927715,
-0.3035661280155182,
-0.11167115718126297,
-0.005630311090499163,
-0.06524112075567245,
-0.019371218979358673,
-0.03254421427845955,
-0.000963466998655349,
0.045101556926965714,
-0.0005774212768301368,
0.03327162563800812,
0.1780177354812622,
-0.006999078206717968,
-0.04135936498641968,
-0.03402034565806389,
0.12684614956378937,
0.00946762040257454,
0.14073581993579865,
0.05947066843509674,
-0.011601140722632408,
0.04723627120256424,
0.020454207435250282,
-0.11691517382860184,
-0.024443550035357475,
-0.025641687214374542,
-0.009147530421614647,
-0.025289984419941902,
0.1382950246334076,
0.01917077600955963,
0.04280940815806389,
0.0360368974506855,
-0.026806659996509552,
0.048395030200481415,
0.05312225595116615,
-0.06588850170373917,
-0.05640696361660957,
0.052854254841804504,
-0.09518900513648987,
0.1396723836660385,
0.18087726831436157,
0.00993430707603693,
0.022279832512140274,
-0.05956050753593445,
-0.006382157560437918,
-0.0025172235909849405,
0.11284253001213074,
-0.004002671223133802,
-0.15512239933013916,
-0.01029894594103098,
-0.06980890035629272,
0.04642212763428688,
-0.24187058210372925,
-0.05071265250444412,
0.09945409744977951,
-0.01631608046591282,
-0.01477114763110876,
0.048423126339912415,
0.0074245319701731205,
0.06064992770552635,
-0.009183105081319809,
-0.05455339327454567,
0.0017467554425820708,
0.06188100948929787,
-0.0877825990319252,
-0.022316310554742813
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 0k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 0k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_0k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_0k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_0k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_0k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
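As an additional, unofficial sketch (not part of the original card): assuming the released checkpoint includes the MLM head from pre-training (the `pretraining` tag suggests it does), you can also load it with `BertForMaskedLM` and query it through the `fill-mask` pipeline. Note that at step 0k the weights are essentially at initialization, so the predictions are not meaningful; the snippet only demonstrates loading and inference, e.g. as a random-init baseline.
```
from transformers import BertTokenizer, BertForMaskedLM, pipeline

# Load the encoder plus its masked-language-modelling head
# (any unused NSP head weights are simply ignored).
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_0k')
mlm_model = BertForMaskedLM.from_pretrained('google/multiberts-seed_3-step_0k')

# `unmasker` is an illustrative name, not part of the original card.
unmasker = pipeline('fill-mask', model=mlm_model, tokenizer=tokenizer)
print(unmasker("The capital of France is [MASK]."))
```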
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_0k"]}
| null |
google/multiberts-seed_3-step_0k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_0k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 0k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 0k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 0k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 0k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_0k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 0k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 0k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07606545090675354,
0.0888003408908844,
-0.002168470062315464,
0.040527887642383575,
0.08237501233816147,
-0.017043692991137505,
0.08180290460586548,
0.10158039629459381,
-0.030358919873833656,
0.023424148559570312,
0.08443064242601395,
0.016325684264302254,
0.013291256502270699,
0.09508585929870605,
0.01964583620429039,
-0.2180965691804886,
0.027249660342931747,
-0.0305215772241354,
-0.07297074794769287,
0.07756879925727844,
0.09618319571018219,
-0.0821240171790123,
0.045463062822818756,
0.02220163494348526,
-0.10540281236171722,
0.04827064275741577,
-0.007454811129719019,
-0.02048668824136257,
0.12889979779720306,
-0.001704655820503831,
0.0464865118265152,
0.05375552922487259,
0.0344587005674839,
-0.13843786716461182,
0.006829686462879181,
0.05757640302181244,
0.05833752825856209,
0.03837006539106369,
0.022948216646909714,
0.07315464317798615,
-0.000155603964230977,
0.02426815778017044,
0.04817968234419823,
0.024503912776708603,
-0.07487711310386658,
-0.05698902904987335,
-0.10035943984985352,
0.03678368777036667,
0.02492697164416313,
0.01647970639169216,
0.011932479217648506,
0.13749724626541138,
-0.04116512089967728,
0.048907049000263214,
0.18910108506679535,
-0.3313280940055847,
-0.01594172604382038,
0.08475302159786224,
0.04651610925793648,
0.12482811510562897,
-0.0050727082416415215,
-0.015396286733448505,
0.07222891598939896,
0.03116859495639801,
0.09525290131568909,
-0.041112907230854034,
0.031169410794973373,
-0.053979866206645966,
-0.158297598361969,
-0.04100516438484192,
0.10032539814710617,
-0.003354663262143731,
-0.13705235719680786,
-0.046705711632966995,
-0.033552128821611404,
0.026658372953534126,
0.01004937756806612,
-0.030073590576648712,
0.03063390403985977,
0.009071635082364082,
-0.024908844381570816,
-0.003681311383843422,
-0.1012178435921669,
-0.04922209680080414,
0.038264282047748566,
0.08323206752538681,
0.10236352682113647,
0.06242581456899643,
0.0016527572879567742,
0.1086462214589119,
-0.1837790459394455,
-0.04964255541563034,
-0.032489728182554245,
-0.05985686928033829,
-0.04548183083534241,
-0.0134170176461339,
-0.10990964621305466,
-0.04563981667160988,
0.016188576817512512,
0.13590572774410248,
-0.0036961708683520555,
0.03463103622198105,
-0.025791727006435394,
0.004860700573772192,
0.059228647500276566,
0.042183857411146164,
-0.00653501134365797,
0.0233046505600214,
0.021199367940425873,
-0.009230682626366615,
-0.021953005343675613,
0.01443193294107914,
0.006065305322408676,
0.03928183391690254,
0.12187309563159943,
0.028716333210468292,
-0.10205358266830444,
0.0800619050860405,
-0.014131150208413601,
-0.048506077378988266,
0.018622461706399918,
-0.08828964829444885,
-0.06323092430830002,
-0.03862732648849487,
0.0037833601236343384,
0.023256339132785797,
-0.01015932485461235,
-0.009926393628120422,
-0.025936052203178406,
-0.03393392264842987,
-0.08382576704025269,
-0.04688858240842819,
-0.05268184840679169,
-0.1303066909313202,
0.004393941722810268,
-0.17919974029064178,
-0.03688428923487663,
-0.11728174239397049,
-0.1870516538619995,
-0.02319968305528164,
0.06642080098390579,
-0.013704929500818253,
-0.050134431570768356,
0.07885347306728363,
0.036503519862890244,
-0.029491957277059555,
-0.002743589226156473,
0.07137783616781235,
-0.007867696695029736,
0.03935042396187782,
-0.021813655272126198,
0.06240640953183174,
0.0037121635396033525,
0.035019807517528534,
-0.05728958547115326,
0.05911242961883545,
-0.1768045872449875,
0.043945178389549255,
-0.08189582824707031,
-0.022401701658964157,
-0.08802444487810135,
-0.036915648728609085,
-0.004204717464745045,
0.00577709311619401,
0.02259262651205063,
0.07668153196573257,
-0.17635272443294525,
-0.02713717892765999,
0.10901942104101181,
-0.15643757581710815,
-0.024564389139413834,
0.06871221214532852,
-0.04306322708725929,
0.08645501732826233,
0.06826484203338623,
0.15494751930236816,
-0.010444693267345428,
-0.08469752222299576,
0.052536558359861374,
-0.012371040880680084,
0.017824959009885788,
-0.015167368575930595,
0.06718895584344864,
-0.022952988743782043,
-0.15479417145252228,
0.03881043940782547,
-0.1323356181383133,
-0.004341898951679468,
-0.07700862735509872,
0.020487280562520027,
-0.0076180994510650635,
-0.06580229103565216,
-0.07385522872209549,
-0.02684079296886921,
0.06611158698797226,
-0.07680312544107437,
-0.01638888753950596,
0.03534518554806709,
0.06879347562789917,
-0.07963439077138901,
0.06636933237314224,
-0.009913853369653225,
0.02317446656525135,
-0.08821878582239151,
-0.040623366832733154,
-0.18681924045085907,
0.03694045916199684,
0.10018068552017212,
0.017035558819770813,
-0.021926263347268105,
0.13865716755390167,
0.005773603450506926,
0.06484129279851913,
-0.051224615424871445,
0.012668168172240257,
-0.005358968861401081,
-0.0011119380360469222,
-0.08397997170686722,
-0.09971708804368973,
-0.07386597990989685,
-0.07009296119213104,
0.07555031776428223,
-0.12333401292562485,
0.019225746393203735,
-0.06013621389865875,
0.03411054238677025,
0.01731747016310692,
-0.08381661027669907,
-0.009751718491315842,
0.021110519766807556,
-0.057967305183410645,
-0.058745164424180984,
0.046051718294620514,
0.06964311748743057,
-0.011420303955674171,
0.08902633935213089,
-0.05713052675127983,
-0.0842895656824112,
0.033692825585603714,
0.09455739706754684,
-0.1108013466000557,
0.004391590133309364,
-0.057825785130262375,
-0.04197605699300766,
-0.058632973581552505,
-0.01406239066272974,
0.08614060282707214,
-0.003685378236696124,
0.1354878842830658,
-0.07383937388658524,
-0.007439528591930866,
0.01339359674602747,
-0.018432091921567917,
-0.027256188914179802,
0.03675112873315811,
0.07159591466188431,
-0.06979942321777344,
0.01552860252559185,
0.037853579968214035,
0.007706885226070881,
0.07606048882007599,
-0.05441053584218025,
-0.08414483815431595,
0.019046654924750328,
0.03692951053380966,
0.03002062626183033,
0.07055152207612991,
-0.024592820554971695,
-0.014071337878704071,
0.0325678214430809,
0.02250438556075096,
0.009923708625137806,
-0.10698231309652328,
0.05806378275156021,
0.0538053996860981,
0.0058340937830507755,
0.07166580855846405,
-0.010561815463006496,
-0.04423731938004494,
0.07478883862495422,
0.03870459645986557,
-0.006633700802922249,
-0.010739470832049847,
-0.013881750404834747,
-0.11705704033374786,
0.19639816880226135,
-0.06353800743818283,
-0.16239529848098755,
-0.06959119439125061,
-0.11191099137067795,
-0.013962173834443092,
0.019663462415337563,
0.04046299308538437,
-0.022191157564520836,
-0.04991944134235382,
-0.13114356994628906,
0.05518898367881775,
-0.04551852121949196,
0.06441630423069,
0.10810865461826324,
-0.049272485077381134,
0.05510646104812622,
-0.12703153491020203,
-0.010679931379854679,
-0.08037737011909485,
-0.07680393755435944,
0.06401602923870087,
-0.05340681970119476,
0.02687062695622444,
0.09457232803106308,
0.031577806919813156,
-0.016132351011037827,
-0.03045830689370632,
0.20595689117908478,
0.04581174626946449,
0.03417031094431877,
0.1259043961763382,
-0.05408394709229469,
0.05110883340239525,
0.08301503211259842,
0.01185983419418335,
-0.04984019324183464,
0.05484103411436081,
0.051125138998031616,
-0.0686979591846466,
-0.19108818471431732,
-0.024013875052332878,
-0.013362622819840908,
-0.04226803034543991,
0.06985952705144882,
0.039379335939884186,
0.013695121742784977,
0.0699353814125061,
0.015995047986507416,
0.05341026932001114,
-0.002286323346197605,
0.10651418566703796,
0.020468339323997498,
-0.03026098944246769,
0.09252748638391495,
-0.02013780176639557,
-0.002531511476263404,
0.0803585946559906,
-0.018451575189828873,
0.28911587595939636,
-0.03102881647646427,
0.0054071214981377125,
0.131193608045578,
0.035802703350782394,
0.05983063951134682,
0.12538625299930573,
-0.07006777822971344,
0.015434658154845238,
-0.07133811712265015,
-0.06274080276489258,
0.0006080709863454103,
0.0337633416056633,
-0.05896526575088501,
0.009472825564444065,
-0.07523484528064728,
0.010895764455199242,
-0.015734834596514702,
0.3136380910873413,
0.10875236988067627,
-0.11184859275817871,
-0.0555485375225544,
-0.0007053169538266957,
-0.09734095633029938,
-0.06294537335634232,
0.045355312526226044,
0.0582612119615078,
-0.134114071726799,
0.014053796418011189,
-0.027569575235247612,
0.07167195528745651,
-0.01677190326154232,
0.01978285238146782,
0.04054446145892143,
0.04275378957390785,
-0.04025120660662651,
0.005207202397286892,
-0.19845892488956451,
0.19464488327503204,
0.0068766106851398945,
0.024737833067774773,
-0.052180781960487366,
0.031534139066934586,
0.006287105847150087,
-0.03224096819758415,
0.05858180671930313,
0.017962243407964706,
-0.018417764455080032,
-0.050113074481487274,
-0.05152998864650726,
0.018341878429055214,
0.07766847312450409,
-0.036806441843509674,
0.10795757919549942,
-0.005186439957469702,
0.04379146173596382,
0.018978804349899292,
0.10452701896429062,
-0.19125454127788544,
-0.09413807839155197,
0.032007183879613876,
-0.051492560654878616,
-0.10612404346466064,
-0.07487710565328598,
-0.09494093060493469,
-0.014621611684560776,
0.252705842256546,
-0.10932248085737228,
-0.07545963674783707,
-0.09886398911476135,
0.017928041517734528,
0.10261621326208115,
-0.044145334511995316,
0.026853321120142937,
-0.012812044471502304,
0.1175774559378624,
-0.062199000269174576,
-0.13542385399341583,
0.02271995320916176,
-0.09942106157541275,
-0.16099676489830017,
-0.06343639642000198,
0.11078617721796036,
0.06312216073274612,
0.03207320347428322,
-0.03062685765326023,
0.020858729258179665,
0.0377814806997776,
-0.04189425706863403,
-0.005780951585620642,
0.07449492067098618,
0.10433623194694519,
0.033296193927526474,
-0.11111310869455338,
0.015914587303996086,
-0.06494161486625671,
-0.06729014962911606,
0.07573159039020538,
0.25982704758644104,
-0.04910166561603546,
0.12115391343832016,
0.11416070908308029,
-0.07598484307527542,
-0.14991183578968048,
0.031737685203552246,
0.08757618814706802,
-0.021827345713973045,
0.010859859175980091,
-0.15551811456680298,
0.08903034776449203,
0.1141558289527893,
-0.019036630168557167,
-0.001748480019159615,
-0.18770290911197662,
-0.12863485515117645,
0.06800583750009537,
0.10858241468667984,
0.2686137855052948,
-0.06854131817817688,
-0.03683071583509445,
0.020604252815246582,
-0.0849345326423645,
0.023348210379481316,
0.1274111419916153,
0.0708324983716011,
-0.028513982892036438,
-0.08424977958202362,
0.009624185040593147,
-0.044663168489933014,
0.08895749598741531,
0.05577868968248367,
0.060616884380578995,
-0.004082181956619024,
0.026796316727995872,
-0.02274234965443611,
-0.04622131958603859,
0.06518527865409851,
0.011101636104285717,
0.04743051528930664,
-0.08344904333353043,
-0.028906559571623802,
-0.07521790266036987,
0.026478411629796028,
-0.02476068213582039,
-0.07697091996669769,
-0.05577763170003891,
0.08049214631319046,
0.048394590616226196,
-0.026609355583786964,
0.02591370977461338,
0.01800760254263878,
0.1215972751379013,
0.1598372906446457,
0.0038225112948566675,
-0.05141545087099075,
-0.06754417717456818,
-0.039224136620759964,
-0.01946481317281723,
0.07220570743083954,
-0.03742082044482231,
0.017794715240597725,
0.06456341594457626,
0.01842704601585865,
0.0967860221862793,
0.060337185859680176,
-0.10998547822237015,
-0.02093110978603363,
0.0323951281607151,
-0.1653720885515213,
0.0303545743227005,
0.0035719512961804867,
0.026451008394360542,
-0.03451215848326683,
0.03982545807957649,
0.14882777631282806,
-0.06348899006843567,
-0.032528091222047806,
-0.03993692249059677,
0.0665646567940712,
0.021790537983179092,
0.14415040612220764,
0.029787516221404076,
0.03681762516498566,
-0.08250029385089874,
0.12823228538036346,
0.031059959903359413,
-0.03575026988983154,
0.027255674824118614,
-0.022067459300160408,
-0.1126159206032753,
0.007913156412541866,
0.06651052087545395,
0.0407499223947525,
-0.04870941862463951,
-0.008176998235285282,
-0.027142589911818504,
-0.07478047162294388,
0.06435900181531906,
0.18875716626644135,
0.06412911415100098,
0.0710134208202362,
-0.05651196092367172,
-0.041010793298482895,
-0.08239971101284027,
0.040617235004901886,
0.03212660923600197,
0.0752742663025856,
-0.0786324292421341,
0.08948656171560287,
0.013161893002688885,
0.03665796294808388,
-0.030241405591368675,
-0.053840283304452896,
-0.10934542864561081,
-0.0533294603228569,
-0.09930669516324997,
0.003469775663688779,
-0.0796685442328453,
-0.03864874690771103,
0.000652080459985882,
0.004098184872418642,
-0.007071251980960369,
0.0499265231192112,
-0.06030900403857231,
-0.008987795561552048,
-0.017550883814692497,
0.039169054478406906,
-0.06393536180257797,
-0.034753430634737015,
0.01967308484017849,
-0.10209553688764572,
0.0930396094918251,
0.04933512583374977,
0.0008709391695447266,
0.006221752148121595,
0.08314832299947739,
-0.022292785346508026,
0.022268183529376984,
0.010661505162715912,
-0.04843030124902725,
-0.08246418088674545,
-0.002612539567053318,
-0.005014065187424421,
-0.014834935776889324,
-0.0035182409919798374,
0.08356120437383652,
-0.08588545024394989,
0.0319991409778595,
-0.0027800295501947403,
-0.002959525678306818,
-0.06951730698347092,
-0.009798993356525898,
0.10281166434288025,
0.0926189199090004,
0.04768984019756317,
-0.09511265903711319,
0.011556785553693771,
-0.13328252732753754,
-0.039026569575071335,
0.007793522905558348,
-0.01408133190125227,
-0.13103339076042175,
-0.006310306489467621,
0.02336784638464451,
-0.0050670974887907505,
0.21338964998722076,
-0.05657484754920006,
-0.016430405899882317,
0.019181394949555397,
-0.09891723096370697,
0.12101517617702484,
-0.023739388212561607,
0.18360759317874908,
-0.011262550950050354,
-0.041475966572761536,
-0.006818048190325499,
0.043459292501211166,
0.017601575702428818,
-0.021736301481723785,
0.18654726445674896,
0.1402200311422348,
0.023168323561549187,
0.03930462896823883,
-0.027016788721084595,
-0.00769690191373229,
-0.04255381599068642,
-0.031041588634252548,
0.03628738969564438,
0.05268172174692154,
0.014883976429700851,
0.14301353693008423,
0.06595176458358765,
-0.16372865438461304,
0.033385198563337326,
-0.03143845126032829,
-0.03977757692337036,
-0.11336317658424377,
-0.10768365859985352,
-0.02789439633488655,
-0.07449308782815933,
0.011550131253898144,
-0.12684109807014465,
0.001420778688043356,
0.18151764571666718,
0.06682556122541428,
0.02786460891366005,
0.009831838309764862,
-0.11527128517627716,
-0.03170407563447952,
0.05542244762182236,
0.009052827022969723,
0.018586520105600357,
0.05126012861728668,
0.006875152699649334,
0.053834181278944016,
0.032656531780958176,
0.01487658265978098,
0.0017199346330016851,
0.07690563052892685,
0.019671857357025146,
0.040666185319423676,
-0.06432481855154037,
-0.002238423563539982,
-0.03932077810168266,
0.0710347443819046,
0.11715397983789444,
0.04687155783176422,
-0.05576157569885254,
-0.00651979073882103,
0.1551228016614914,
-0.03997204825282097,
-0.002065282315015793,
-0.1258799433708191,
0.3401877284049988,
0.013320837169885635,
0.00998725462704897,
0.04055729880928993,
-0.07139597088098526,
-0.049876805394887924,
0.2097623199224472,
0.09707958996295929,
-0.016423992812633514,
-0.01854608580470085,
0.001651774044148624,
-0.031171798706054688,
-0.025207247585058212,
0.15301008522510529,
0.03695612773299217,
0.12328760325908661,
-0.054516687989234924,
-0.04591352120041847,
-0.024545082822442055,
-0.00888740736991167,
-0.12271943688392639,
0.12372919172048569,
-0.017290309071540833,
-0.028354372829198837,
-0.06970632821321487,
0.0245521180331707,
0.05962257832288742,
-0.300854355096817,
-0.0007400320027954876,
-0.025608059018850327,
-0.1075950562953949,
-0.011657310649752617,
-0.02419811487197876,
-0.02431492693722248,
0.047902993857860565,
-0.043944064527750015,
0.07154926657676697,
0.03732375055551529,
0.030201615765690804,
-0.018024662509560585,
-0.09791112691164017,
0.16848887503147125,
0.06361562758684158,
0.08907196670770645,
0.025463446974754333,
0.07364815473556519,
0.06137632951140404,
0.03299902006983757,
-0.0960313081741333,
0.05234995111823082,
0.011941036209464073,
-0.09082887321710587,
-0.04901261255145073,
0.11492262035608292,
0.0004324949986767024,
0.05577782541513443,
0.03847174718976021,
-0.10164620727300644,
0.01987381838262081,
0.07014603167772293,
-0.06531821191310883,
-0.09533080458641052,
-0.0034339556004852057,
-0.09090779721736908,
0.16022534668445587,
0.14640292525291443,
-0.01656573824584484,
0.015202080830931664,
-0.06820753216743469,
-0.0025031643453985453,
0.04835231602191925,
0.0011623683385550976,
-0.024500349536538124,
-0.19503818452358246,
0.03887983039021492,
-0.08361496031284332,
-0.008562024682760239,
-0.2328254133462906,
-0.09898317605257034,
-0.005099189933389425,
-0.04766229912638664,
-0.03310856968164444,
0.055323339998722076,
0.025379519909620285,
0.07011261582374573,
-0.021599190309643745,
-0.026828447356820107,
-0.033517464995384216,
0.08900106698274612,
-0.1127251386642456,
-0.06383784115314484
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1000k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1000k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
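As an additional, unofficial sketch (not part of the original card): assuming the released checkpoint includes the MLM head from pre-training (the `pretraining` tag suggests it does), you can also probe this partially trained model with the `fill-mask` pipeline, which is a convenient way to compare checkpoints across training steps.
```
from transformers import BertTokenizer, BertForMaskedLM, pipeline

# Load the encoder together with its masked-language-modelling head.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1000k')
mlm_model = BertForMaskedLM.from_pretrained('google/multiberts-seed_3-step_1000k')

# `unmasker` is an illustrative name, not part of the original card.
unmasker = pipeline('fill-mask', model=mlm_model, tokenizer=tokenizer)
print(unmasker("Paris is the capital of [MASK]."))
```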
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1000k"]}
| null |
google/multiberts-seed_3-step_1000k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1000k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1000k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07644109427928925,
0.08865808695554733,
-0.0018809319008141756,
0.04182448983192444,
0.08600939065217972,
-0.01685434766113758,
0.07797481119632721,
0.09861737489700317,
-0.026397941634058952,
0.025276070460677147,
0.08435884863138199,
0.01599661074578762,
0.017033886164426804,
0.09955801069736481,
0.023936264216899872,
-0.22285720705986023,
0.028517773374915123,
-0.030738187953829765,
-0.07221810519695282,
0.07747497409582138,
0.09718526899814606,
-0.08521623909473419,
0.04800078272819519,
0.020778043195605278,
-0.10465425997972488,
0.04814846068620682,
-0.00987238623201847,
-0.02416689321398735,
0.1303911954164505,
-0.008127530105412006,
0.046661630272865295,
0.05101971700787544,
0.04263825714588165,
-0.13452059030532837,
0.005961032118648291,
0.056970857083797455,
0.06138600781559944,
0.03800751268863678,
0.02149583026766777,
0.08016064018011093,
-0.004430366680026054,
0.02179577760398388,
0.04972490295767784,
0.022799668833613396,
-0.07554510235786438,
-0.0543646477162838,
-0.10614467412233353,
0.03948267549276352,
0.028516152873635292,
0.01251227967441082,
0.009953758679330349,
0.13349033892154694,
-0.03940066695213318,
0.04672081768512726,
0.1871475726366043,
-0.3259204924106598,
-0.014132978394627571,
0.08617165684700012,
0.05331339314579964,
0.12561725080013275,
-0.005023063626140356,
-0.016748493537306786,
0.07486941665410995,
0.029949823394417763,
0.09148465842008591,
-0.04245741292834282,
0.022879071533679962,
-0.0540681853890419,
-0.1623976230621338,
-0.040712181478738785,
0.10027556121349335,
-0.0021661766804754734,
-0.13753239810466766,
-0.046312686055898666,
-0.03225754573941231,
0.02070043981075287,
0.011941948905587196,
-0.035450562834739685,
0.03069605864584446,
0.005621280521154404,
-0.022902313619852066,
-0.0013877889141440392,
-0.0993417352437973,
-0.050979141145944595,
0.03578534722328186,
0.07794861495494843,
0.10339339077472687,
0.06243252754211426,
0.0037924605421721935,
0.11153950542211533,
-0.18877704441547394,
-0.04676821827888489,
-0.029957957565784454,
-0.061252888292074203,
-0.047861866652965546,
-0.01566496677696705,
-0.10678128153085709,
-0.04202014207839966,
0.012477351352572441,
0.13628531992435455,
0.007726890500634909,
0.03237621858716011,
-0.017372336238622665,
0.00437537208199501,
0.060894668102264404,
0.04620876908302307,
-0.01411637756973505,
0.010442834347486496,
0.022588428109884262,
-0.006310924421995878,
-0.023453902453184128,
0.015709049999713898,
0.0045742979273200035,
0.03506379947066307,
0.12105327099561691,
0.030375875532627106,
-0.10472860187292099,
0.0832306370139122,
-0.013043507002294064,
-0.045071933418512344,
0.015232034958899021,
-0.09052719920873642,
-0.06377529352903366,
-0.04417520388960838,
0.0001951927406480536,
0.019832853227853775,
-0.012686953879892826,
-0.008598499931395054,
-0.02372009865939617,
-0.03219666704535484,
-0.0832572877407074,
-0.05090129375457764,
-0.05552806705236435,
-0.13340795040130615,
0.004639745689928532,
-0.19297544658184052,
-0.03668089956045151,
-0.1156146228313446,
-0.19026155769824982,
-0.028362099081277847,
0.059157054871320724,
-0.008770833723247051,
-0.051101621240377426,
0.07615227997303009,
0.037265464663505554,
-0.031687308102846146,
-0.0036491428036242723,
0.07701551169157028,
-0.005273024085909128,
0.038543038070201874,
-0.023921048268675804,
0.06780005991458893,
0.00574648380279541,
0.03383873030543327,
-0.05612781643867493,
0.06238287314772606,
-0.17183759808540344,
0.04257526248693466,
-0.07982340455055237,
-0.017698604613542557,
-0.08588322252035141,
-0.038014717400074005,
-0.005597202107310295,
0.005553963594138622,
0.02477920986711979,
0.07406816631555557,
-0.18474119901657104,
-0.023204119876027107,
0.11010503768920898,
-0.1542709320783615,
-0.02808591164648533,
0.06914224475622177,
-0.04462290182709694,
0.09691686928272247,
0.07432633638381958,
0.15040206909179688,
-0.010065228678286076,
-0.08079531043767929,
0.049585141241550446,
-0.01532924734055996,
0.013940633274614811,
-0.014894033782184124,
0.06625840812921524,
-0.022522306069731712,
-0.14893367886543274,
0.038546524941921234,
-0.1306019276380539,
-0.005293106194585562,
-0.07619022578001022,
0.019704803824424744,
-0.004696229938417673,
-0.06861313432455063,
-0.07510483264923096,
-0.028776412829756737,
0.06257349252700806,
-0.07769899070262909,
-0.01060582883656025,
0.028790440410375595,
0.06902420520782471,
-0.0764075443148613,
0.06489428877830505,
-0.010229809209704399,
0.020875977352261543,
-0.08532112091779709,
-0.03877929598093033,
-0.18582865595817566,
0.042470186948776245,
0.1027393564581871,
0.010363000445067883,
-0.02040690928697586,
0.13217367231845856,
0.004991635680198669,
0.0653066337108612,
-0.04859884828329086,
0.010781342163681984,
-0.004651512484997511,
-0.00030984822660684586,
-0.08671044558286667,
-0.09252925962209702,
-0.07616294175386429,
-0.07050765305757523,
0.06895814090967178,
-0.11854124069213867,
0.020808864384889603,
-0.05697200447320938,
0.031472042202949524,
0.016142379492521286,
-0.08151344954967499,
-0.00994904339313507,
0.021082580089569092,
-0.05653552711009979,
-0.06198754534125328,
0.042314860969781876,
0.06631866842508316,
-0.011144502088427544,
0.08712926506996155,
-0.05479689687490463,
-0.09326669573783875,
0.033035408705472946,
0.09967359900474548,
-0.11411842703819275,
0.006938778329640627,
-0.05691348761320114,
-0.04225122183561325,
-0.06039953604340553,
-0.022592924535274506,
0.08606348186731339,
-0.004780558403581381,
0.13090309500694275,
-0.07556997984647751,
-0.009135824628174305,
0.012701007537543774,
-0.01592991128563881,
-0.024901436641812325,
0.035112544894218445,
0.0740271657705307,
-0.07142583280801773,
0.014423215761780739,
0.03811981528997421,
0.008797194808721542,
0.07268844544887543,
-0.0541338250041008,
-0.08354988694190979,
0.018978364765644073,
0.0347919799387455,
0.02881808951497078,
0.06812668591737747,
-0.0257548950612545,
-0.019488122314214706,
0.03032231144607067,
0.0201412346214056,
0.007976862601935863,
-0.10802419483661652,
0.058610815554857254,
0.05371270701289177,
0.008694672025740147,
0.06539052724838257,
-0.010628816671669483,
-0.04239211603999138,
0.07625866681337357,
0.038446202874183655,
-0.0034698734525591135,
-0.013787630014121532,
-0.01543706189841032,
-0.11550119519233704,
0.19766627252101898,
-0.06284501403570175,
-0.15883584320545197,
-0.06896539777517319,
-0.11192204803228378,
-0.006271506659686565,
0.022672319784760475,
0.036309871822595596,
-0.02591695636510849,
-0.05044577643275261,
-0.12904509902000427,
0.06271687895059586,
-0.03973110020160675,
0.06805901229381561,
0.10692096501588821,
-0.04676760360598564,
0.04524996876716614,
-0.1282774657011032,
-0.009857434779405594,
-0.08097424358129501,
-0.07691404968500137,
0.060243215411901474,
-0.05074765905737877,
0.03313565254211426,
0.09669411182403564,
0.03083830326795578,
-0.016874946653842926,
-0.03190796822309494,
0.20660287141799927,
0.0429513119161129,
0.0396556593477726,
0.12438338994979858,
-0.05450323969125748,
0.052199143916368484,
0.08632517606019974,
0.010880100540816784,
-0.05041520297527313,
0.056956641376018524,
0.04635855183005333,
-0.06883085519075394,
-0.1897912174463272,
-0.021198442205786705,
-0.010702092200517654,
-0.04385535791516304,
0.07143446058034897,
0.03606238216161728,
0.005715401377528906,
0.0763574093580246,
0.013863880187273026,
0.058242592960596085,
-0.00373829435557127,
0.10274633765220642,
0.01220592763274908,
-0.03474033623933792,
0.08841287344694138,
-0.01992853730916977,
-0.010020908899605274,
0.07943077385425568,
-0.015063483268022537,
0.2962491810321808,
-0.029136355966329575,
0.008019516244530678,
0.12961789965629578,
0.035919997841119766,
0.05927525833249092,
0.13518233597278595,
-0.06919466704130173,
0.016690054908394814,
-0.07185253500938416,
-0.05998064950108528,
0.005186000838875771,
0.03437930345535278,
-0.059519026428461075,
0.012921710498631,
-0.07387951016426086,
0.012828424572944641,
-0.0165904201567173,
0.30729779601097107,
0.10630891472101212,
-0.11420412361621857,
-0.05016074702143669,
0.0006901232409290969,
-0.09654971212148666,
-0.0665770173072815,
0.043475400656461716,
0.06226591393351555,
-0.1391143500804901,
0.012378356419503689,
-0.026079293340444565,
0.07075779139995575,
-0.020206129178404808,
0.015543229877948761,
0.04085583984851837,
0.044759344309568405,
-0.04205404594540596,
0.006038024090230465,
-0.18981139361858368,
0.19469165802001953,
0.005781928543001413,
0.02669335901737213,
-0.05422140285372734,
0.031915389001369476,
0.007484739180654287,
-0.03632832691073418,
0.06166306138038635,
0.017335809767246246,
-0.024529943242669106,
-0.061195485293865204,
-0.04930703714489937,
0.014290332794189453,
0.08325952291488647,
-0.03874286636710167,
0.11033941805362701,
-0.0029297703877091408,
0.04588604345917702,
0.016774671152234077,
0.09995969384908676,
-0.18807163834571838,
-0.09445028007030487,
0.029780184850096703,
-0.052485670894384384,
-0.0993226170539856,
-0.07498619705438614,
-0.09426384419202805,
-0.016317171975970268,
0.23974180221557617,
-0.11767415702342987,
-0.07640485465526581,
-0.09551744163036346,
0.01693553477525711,
0.11080986261367798,
-0.045616716146469116,
0.030400661751627922,
-0.008584346622228622,
0.11570921540260315,
-0.0663646012544632,
-0.13016095757484436,
0.019584454596042633,
-0.09725789725780487,
-0.16256313025951385,
-0.06460816413164139,
0.11314757168292999,
0.0625959262251854,
0.03170609474182129,
-0.027653956785798073,
0.0230065006762743,
0.03732581064105034,
-0.04398109018802643,
-0.007835239171981812,
0.07293619960546494,
0.10276205092668533,
0.03677794709801674,
-0.11451341956853867,
0.019169993698596954,
-0.06961731612682343,
-0.06724803894758224,
0.07902302592992783,
0.2582915425300598,
-0.05048006400465965,
0.11569478362798691,
0.11809133738279343,
-0.07827199995517731,
-0.15373224020004272,
0.03448200970888138,
0.08839922398328781,
-0.019858472049236298,
0.007669161539524794,
-0.14972251653671265,
0.09296704083681107,
0.11282464861869812,
-0.017202744260430336,
0.003539406694471836,
-0.19054046273231506,
-0.1310710459947586,
0.07014718651771545,
0.10811116546392441,
0.259458988904953,
-0.06595735251903534,
-0.03667650371789932,
0.016028575599193573,
-0.07993803918361664,
0.022215018048882484,
0.13013890385627747,
0.07153792679309845,
-0.026411661878228188,
-0.07978835701942444,
0.009683559648692608,
-0.045766375958919525,
0.08873128890991211,
0.06038450822234154,
0.06255389750003815,
-0.00858495943248272,
0.0173622015863657,
-0.012302090413868427,
-0.044246867299079895,
0.06866231560707092,
0.019543716683983803,
0.045339230448007584,
-0.07920155674219131,
-0.029746107757091522,
-0.07351291924715042,
0.026988456025719643,
-0.02470426633954048,
-0.07903524488210678,
-0.06168033927679062,
0.08207358419895172,
0.04840711131691933,
-0.029455753043293953,
0.01338005531579256,
0.026127267628908157,
0.11356323957443237,
0.14750325679779053,
0.005941716488450766,
-0.0517524816095829,
-0.06291203945875168,
-0.034876372665166855,
-0.019048884510993958,
0.07054486125707626,
-0.028746632859110832,
0.012995782308280468,
0.06572075933218002,
0.01930554024875164,
0.09646163880825043,
0.060040105134248734,
-0.11363764852285385,
-0.020475691184401512,
0.03436430171132088,
-0.16327868402004242,
0.028764063492417336,
0.004986867308616638,
0.02202450856566429,
-0.03821748495101929,
0.03790706396102905,
0.1387115716934204,
-0.062080737203359604,
-0.03454170376062393,
-0.043654292821884155,
0.06839212775230408,
0.025175701826810837,
0.1533287614583969,
0.03379039466381073,
0.037396613508462906,
-0.08320611715316772,
0.12785358726978302,
0.032659295946359634,
-0.03887999802827835,
0.02448723092675209,
-0.01802297681570053,
-0.11406026780605316,
0.011682886630296707,
0.06585956364870071,
0.037704188376665115,
-0.04363613575696945,
-0.007258492521941662,
-0.024133948609232903,
-0.07531069964170456,
0.06097115948796272,
0.19091477990150452,
0.06472732126712799,
0.07015245407819748,
-0.0553024522960186,
-0.04006064310669899,
-0.08170686662197113,
0.04190179333090782,
0.03745947405695915,
0.07647746801376343,
-0.07844877243041992,
0.09894570708274841,
0.012316429056227207,
0.03748997300863266,
-0.02917821705341339,
-0.05139315500855446,
-0.10431709885597229,
-0.05347852781414986,
-0.08711336553096771,
0.003303743666037917,
-0.07811768352985382,
-0.038386665284633636,
-0.003449615789577365,
0.003617335809394717,
-0.007127734366804361,
0.046889528632164,
-0.05908150598406792,
-0.011995280161499977,
-0.01852020062506199,
0.03691191226243973,
-0.060387030243873596,
-0.03644381836056709,
0.024217668920755386,
-0.10015169531106949,
0.09102992713451385,
0.042557720094919205,
0.0034168437123298645,
0.007572912611067295,
0.08656078577041626,
-0.021818798035383224,
0.025300486013293266,
0.013236076571047306,
-0.048756297677755356,
-0.08305363357067108,
-0.0018511021044105291,
-0.007524929009377956,
-0.013294852338731289,
-0.004015807993710041,
0.07900944352149963,
-0.08613035082817078,
0.030666565522551537,
-0.0031249881722033024,
-0.0009939192095771432,
-0.0710924044251442,
-0.00989456195384264,
0.10519890487194061,
0.09313517063856125,
0.050939127802848816,
-0.09288739413022995,
0.011855898424983025,
-0.13362111151218414,
-0.03877798840403557,
0.004869742318987846,
-0.017282400280237198,
-0.12866169214248657,
-0.004154486581683159,
0.0253616776317358,
-0.0039567481726408005,
0.21979258954524994,
-0.05733674392104149,
-0.0209172572940588,
0.016868459060788155,
-0.0860908105969429,
0.11668175458908081,
-0.022496195510029793,
0.18628080189228058,
-0.009757600724697113,
-0.0445319265127182,
-0.015625732019543648,
0.04567472264170647,
0.016732344403862953,
-0.025949282571673393,
0.18400093913078308,
0.1379348486661911,
0.029184507206082344,
0.04032307118177414,
-0.021524090319871902,
-0.002707116771489382,
-0.030859990045428276,
-0.03504827246069908,
0.03542115539312363,
0.04489506781101227,
0.013457825407385826,
0.14996330440044403,
0.0622684508562088,
-0.1640346497297287,
0.034384869039058685,
-0.029189854860305786,
-0.0414845235645771,
-0.11139927059412003,
-0.1113487184047699,
-0.027275247499346733,
-0.0647035762667656,
0.012174411676824093,
-0.12715671956539154,
0.00013005401706323028,
0.1792317032814026,
0.06414506584405899,
0.025739314034581184,
0.01530236005783081,
-0.1249183714389801,
-0.033749863505363464,
0.057122908532619476,
0.011760067194700241,
0.01951354555785656,
0.05536632239818573,
0.0023272493854165077,
0.05444697290658951,
0.030805831775069237,
0.015845313668251038,
-0.00029355715378187597,
0.06815271079540253,
0.01591641642153263,
0.03885941207408905,
-0.06027386337518692,
-0.00346560962498188,
-0.0422273650765419,
0.07114081084728241,
0.10942855477333069,
0.0469277948141098,
-0.05681398883461952,
-0.00658691069111228,
0.1498188078403473,
-0.03712578862905502,
-0.005737293511629105,
-0.1237245425581932,
0.3235701322555542,
0.01719091460108757,
0.009572613053023815,
0.042729947715997696,
-0.0737520232796669,
-0.04827425256371498,
0.21192294359207153,
0.09791473299264908,
-0.013120155781507492,
-0.022043641656637192,
-0.0022828911896795034,
-0.0307688619941473,
-0.025706226006150246,
0.1553727686405182,
0.0373806394636631,
0.12043601274490356,
-0.05308318883180618,
-0.04056159406900406,
-0.024547994136810303,
-0.011066392064094543,
-0.12119406461715698,
0.1273089051246643,
-0.016265859827399254,
-0.025896715000271797,
-0.07197302579879761,
0.02876327559351921,
0.06429174542427063,
-0.3015287518501282,
-0.0075831045396625996,
-0.028878184035420418,
-0.10681933909654617,
-0.011930977925658226,
-0.027632657438516617,
-0.021041346713900566,
0.04702688381075859,
-0.0392909049987793,
0.07468212395906448,
0.029874557629227638,
0.0323064886033535,
-0.022865721955895424,
-0.10209128260612488,
0.16809909045696259,
0.06314714252948761,
0.0914357602596283,
0.025607915595173836,
0.07618223130702972,
0.06093917787075043,
0.03114359639585018,
-0.0978395938873291,
0.05363684892654419,
0.01360437273979187,
-0.08120164275169373,
-0.045684121549129486,
0.11623052507638931,
0.0013766589108854532,
0.04254301264882088,
0.034836262464523315,
-0.09959402680397034,
0.023058105260133743,
0.06657396256923676,
-0.061941806226968765,
-0.104767806828022,
0.0017871740274131298,
-0.09133144468069077,
0.16511626541614532,
0.14457394182682037,
-0.017931055277585983,
0.0207049660384655,
-0.06962570548057556,
-0.009606517851352692,
0.048558417707681656,
-0.003604077734053135,
-0.024281635880470276,
-0.194055438041687,
0.032578300684690475,
-0.09148729592561722,
-0.008602211251854897,
-0.22598715126514435,
-0.09904144704341888,
-0.009732711128890514,
-0.04963482916355133,
-0.031487055122852325,
0.056978002190589905,
0.028205303475260735,
0.06998845189809799,
-0.019282203167676926,
-0.029728036373853683,
-0.03272475302219391,
0.09039096534252167,
-0.11712335050106049,
-0.06490668654441833
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_100k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_100k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
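If you want to work with the model's hidden states directly (for instance, for the kind of representation analyses the MultiBERTs release is intended to support), the sketch below shows one way to pull a sentence-level vector out of the encoder. It is an illustrative addition rather than part of the official usage snippet above, and it relies only on the standard `BertModel` outputs:
```
# Minimal sketch: extract final-layer hidden states from this checkpoint.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_3-step_100k")
model = BertModel.from_pretrained("google/multiberts-seed_3-step_100k")
model.eval()  # disable dropout for deterministic representations

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    output = model(**encoded_input)

# Final-layer hidden states: (batch_size, sequence_length, hidden_size=768).
token_embeddings = output.last_hidden_state
# One common sentence-level summary is the vector of the [CLS] token.
cls_embedding = token_embeddings[:, 0, :]
print(cls_embedding.shape)  # torch.Size([1, 768])
```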
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_100k"]}
| null |
google/multiberts-seed_3-step_100k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_100k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 100k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07648765295743942,
0.08373136073350906,
-0.0020749990362674,
0.03890519589185715,
0.08384604007005692,
-0.017354294657707214,
0.07986001670360565,
0.09884203225374222,
-0.023417271673679352,
0.024982990697026253,
0.082367442548275,
0.018831891939044,
0.017140479758381844,
0.0989496037364006,
0.02270650304853916,
-0.22064433991909027,
0.02747940830886364,
-0.02959716133773327,
-0.07178633660078049,
0.07915136218070984,
0.0969778522849083,
-0.08417652547359467,
0.047361209988594055,
0.021499574184417725,
-0.10484495013952255,
0.047832366079092026,
-0.008686495013535023,
-0.02260850928723812,
0.13124488294124603,
-0.0056743561290204525,
0.04724198207259178,
0.04981788247823715,
0.03964913636445999,
-0.13886882364749908,
0.007231769151985645,
0.05740836262702942,
0.05770018696784973,
0.03952912986278534,
0.02145451121032238,
0.07852739840745926,
-0.0015717503847554326,
0.0256701298058033,
0.050221581012010574,
0.02580462396144867,
-0.07672128826379776,
-0.05417780205607414,
-0.10670089721679688,
0.03473050519824028,
0.02893744595348835,
0.01322902087122202,
0.009847695007920265,
0.13621386885643005,
-0.04016057029366493,
0.04885545372962952,
0.19484321773052216,
-0.33244502544403076,
-0.015757083892822266,
0.09016211330890656,
0.053195465356111526,
0.12391822040081024,
-0.006915255915373564,
-0.015397791750729084,
0.07343778759241104,
0.03259847313165665,
0.09568304568529129,
-0.04171798378229141,
0.03147938475012779,
-0.05127665400505066,
-0.16238689422607422,
-0.042032863944768906,
0.09770409762859344,
-0.00039465949521400034,
-0.13812381029129028,
-0.04899246618151665,
-0.03493403270840645,
0.020478663966059685,
0.008960064500570297,
-0.03628325089812279,
0.03196066990494728,
0.008111637085676193,
-0.02506951056420803,
-0.0034738918766379356,
-0.10138170421123505,
-0.05086458474397659,
0.036477088928222656,
0.08020714670419693,
0.10131379216909409,
0.06082378327846527,
0.006085955072194338,
0.11133768409490585,
-0.1810600608587265,
-0.04708613455295563,
-0.030625365674495697,
-0.06208960711956024,
-0.048387061804533005,
-0.01517597958445549,
-0.110298290848732,
-0.04829997941851616,
0.013580501079559326,
0.13834774494171143,
0.0018651512218639255,
0.03252928704023361,
-0.021640541031956673,
0.006129958666861057,
0.06112023815512657,
0.04774836078286171,
-0.012585512362420559,
0.013544256798923016,
0.01953166536986828,
-0.007216237019747496,
-0.020431218668818474,
0.015612324699759483,
0.003994616214185953,
0.03704236447811127,
0.12312229722738266,
0.030762722715735435,
-0.10692097991704941,
0.07937183231115341,
-0.015634750947356224,
-0.0473337322473526,
0.017530985176563263,
-0.09019745886325836,
-0.06513014435768127,
-0.04421960189938545,
0.0037597815971821547,
0.01989191584289074,
-0.012107963673770428,
-0.00839927140623331,
-0.022723106667399406,
-0.0305675957351923,
-0.08245234191417694,
-0.048278335481882095,
-0.05487728863954544,
-0.1311608850955963,
0.004372348077595234,
-0.1883758306503296,
-0.03773386776447296,
-0.1151895597577095,
-0.18711616098880768,
-0.02740725316107273,
0.06163044273853302,
-0.012127017602324486,
-0.04902466759085655,
0.0785694420337677,
0.038487132638692856,
-0.031177202239632607,
-0.0033682184293866158,
0.07323556393384933,
-0.005949105601757765,
0.04031483829021454,
-0.02409670315682888,
0.06703906506299973,
0.009474493563175201,
0.034028634428977966,
-0.054610464721918106,
0.06116177514195442,
-0.17601291835308075,
0.042948681861162186,
-0.07924173027276993,
-0.01727055013179779,
-0.08576687425374985,
-0.03747411444783211,
-0.00855621974915266,
0.007960326969623566,
0.02263975888490677,
0.07597758620977402,
-0.18204671144485474,
-0.025828126817941666,
0.11451909691095352,
-0.15401847660541534,
-0.030070535838603973,
0.07028790563344955,
-0.04491471126675606,
0.09370201826095581,
0.07199079543352127,
0.15259498357772827,
-0.019384417682886124,
-0.08914968371391296,
0.051578450947999954,
-0.012449710629880428,
0.016190078109502792,
-0.013605251908302307,
0.0653764083981514,
-0.02390899881720543,
-0.1483553946018219,
0.037963781505823135,
-0.130905881524086,
-0.0008872439502738416,
-0.07588545978069305,
0.019231855869293213,
-0.00564244668930769,
-0.06910588592290878,
-0.07482318580150604,
-0.02788710780441761,
0.06478673219680786,
-0.07809477299451828,
-0.013526265509426594,
0.03089572861790657,
0.06883863359689713,
-0.07642793655395508,
0.06511754542589188,
-0.012437163852155209,
0.021646250039339066,
-0.08451774716377258,
-0.03908867761492729,
-0.18701793253421783,
0.042044125497341156,
0.10140226036310196,
0.01686830073595047,
-0.022905301302671432,
0.13562944531440735,
0.007709268480539322,
0.06461367756128311,
-0.04944659397006035,
0.010778133757412434,
-0.005692904349416494,
-0.0019247785676270723,
-0.08619116246700287,
-0.09347356110811234,
-0.0777982696890831,
-0.07093110680580139,
0.06994360685348511,
-0.11701805889606476,
0.021050788462162018,
-0.059517525136470795,
0.035719987004995346,
0.01771768555045128,
-0.08282462507486343,
-0.00865558534860611,
0.019754229113459587,
-0.05715930089354515,
-0.060034770518541336,
0.04265165328979492,
0.06861021369695663,
-0.01042436808347702,
0.08799248933792114,
-0.05451289564371109,
-0.08402197808027267,
0.03400770574808121,
0.09578154236078262,
-0.11422840505838394,
0.00048539412091486156,
-0.05723906680941582,
-0.04037419334053993,
-0.059116180986166,
-0.02267422154545784,
0.07970786094665527,
-0.002218471374362707,
0.13494367897510529,
-0.07420077174901962,
-0.010753474198281765,
0.011481146328151226,
-0.015138677321374416,
-0.025439999997615814,
0.033630404621362686,
0.07055968791246414,
-0.07034091651439667,
0.015515274368226528,
0.04024708271026611,
0.012238986790180206,
0.07385040074586868,
-0.054663509130477905,
-0.08496416360139847,
0.018456541001796722,
0.03662487864494324,
0.028295692056417465,
0.0686940923333168,
-0.02521926537156105,
-0.018675222992897034,
0.030122170224785805,
0.01841280236840248,
0.005589823238551617,
-0.1060875803232193,
0.05651326850056648,
0.054699599742889404,
0.006564763840287924,
0.07413024455308914,
-0.009700476191937923,
-0.04421873763203621,
0.07478558272123337,
0.03787660226225853,
-0.006088951602578163,
-0.013575401157140732,
-0.016225073486566544,
-0.1140548512339592,
0.19703005254268646,
-0.06229809671640396,
-0.16173842549324036,
-0.07031814754009247,
-0.11696810275316238,
-0.005905333440750837,
0.023126991465687752,
0.038255952298641205,
-0.027097757905721664,
-0.04962974041700363,
-0.1292346864938736,
0.06175786629319191,
-0.04095540568232536,
0.06515110284090042,
0.10792022943496704,
-0.047511570155620575,
0.04735236614942551,
-0.12940460443496704,
-0.010365155525505543,
-0.08303182572126389,
-0.07607113569974899,
0.06115944683551788,
-0.05280113220214844,
0.03254815191030502,
0.09596800059080124,
0.03444623574614525,
-0.015222565270960331,
-0.031717460602521896,
0.21132121980190277,
0.041527386754751205,
0.03702717646956444,
0.12625589966773987,
-0.05288860946893692,
0.0520283505320549,
0.08214688301086426,
0.013359674252569675,
-0.05109836161136627,
0.056043487042188644,
0.04848610237240791,
-0.0678025484085083,
-0.1919061243534088,
-0.0222496148198843,
-0.011221569962799549,
-0.04467528313398361,
0.07080157101154327,
0.03849861025810242,
0.015216833911836147,
0.07411449402570724,
0.011593763716518879,
0.055264342576265335,
-0.004434171132743359,
0.10477853566408157,
0.018903451040387154,
-0.0354762077331543,
0.09142807126045227,
-0.01847303844988346,
-0.011127984151244164,
0.07920359820127487,
-0.015915455296635628,
0.2919697165489197,
-0.027449700981378555,
0.011461501009762287,
0.1274818629026413,
0.03613601624965668,
0.05902094766497612,
0.1320905238389969,
-0.06982196122407913,
0.015244505368173122,
-0.07337908446788788,
-0.06104596331715584,
0.00240933895111084,
0.03539279103279114,
-0.06043180450797081,
0.008562376722693443,
-0.07108054310083389,
0.007655400782823563,
-0.015454049222171307,
0.31721651554107666,
0.1033865287899971,
-0.11010046303272247,
-0.05086205527186394,
0.0005386738339439034,
-0.09792326390743256,
-0.06529662013053894,
0.04398418217897415,
0.05953557416796684,
-0.13916908204555511,
0.01099174004048109,
-0.02905305288732052,
0.07132578641176224,
-0.018439417704939842,
0.01772356405854225,
0.03976304084062576,
0.04270608723163605,
-0.04169453680515289,
0.004021292552351952,
-0.19300633668899536,
0.19249236583709717,
0.006677133496850729,
0.02506556734442711,
-0.05113731324672699,
0.03191106766462326,
0.00859338603913784,
-0.03480049967765808,
0.06220530346035957,
0.016329731792211533,
-0.018863346427679062,
-0.05923420935869217,
-0.04943066090345383,
0.013006551191210747,
0.08077741414308548,
-0.03841077908873558,
0.10936111211776733,
-0.004772258922457695,
0.04368758946657181,
0.017843205481767654,
0.09802857041358948,
-0.18660423159599304,
-0.09496238082647324,
0.028761057183146477,
-0.0555448904633522,
-0.09765417128801346,
-0.07563447207212448,
-0.09506092220544815,
-0.016181902959942818,
0.24408599734306335,
-0.11394982039928436,
-0.07619289308786392,
-0.09649652242660522,
0.01536002941429615,
0.10780446976423264,
-0.04352802038192749,
0.029376085847616196,
-0.009434162639081478,
0.11978216469287872,
-0.06611372530460358,
-0.131813183426857,
0.021810995414853096,
-0.09922460466623306,
-0.16035090386867523,
-0.06463728845119476,
0.11417872458696365,
0.06241171807050705,
0.032123684883117676,
-0.027894238010048866,
0.022351659834384918,
0.0354912206530571,
-0.04289394989609718,
-0.004536030348390341,
0.0709783211350441,
0.10342150181531906,
0.036137890070676804,
-0.11613399535417557,
0.024463117122650146,
-0.06915192306041718,
-0.06634770333766937,
0.07937216758728027,
0.2576494514942169,
-0.050917625427246094,
0.11932241916656494,
0.11647144705057144,
-0.07615019381046295,
-0.14967045187950134,
0.029381580650806427,
0.08759301900863647,
-0.02023428864777088,
0.015214917249977589,
-0.15615211427211761,
0.09174119681119919,
0.1151382327079773,
-0.01888285204768181,
0.004433180205523968,
-0.18947777152061462,
-0.12809334695339203,
0.07318026572465897,
0.10820857435464859,
0.2637307643890381,
-0.06965406239032745,
-0.03854028880596161,
0.017022185027599335,
-0.08357969671487808,
0.018888678401708603,
0.12670882046222687,
0.07166120409965515,
-0.02703370340168476,
-0.08304433524608612,
0.010647016577422619,
-0.04459798336029053,
0.08791544288396835,
0.0557049959897995,
0.06278675049543381,
-0.007586192339658737,
0.022387556731700897,
-0.017261389642953873,
-0.04378172755241394,
0.06771940737962723,
0.015067200176417828,
0.04380752891302109,
-0.07758790999650955,
-0.03101464733481407,
-0.07321465760469437,
0.02612851746380329,
-0.02415498159825802,
-0.07960780709981918,
-0.061428505927324295,
0.08139684051275253,
0.04791199415922165,
-0.026943696662783623,
0.019994929432868958,
0.023036455735564232,
0.12051467597484589,
0.15294624865055084,
0.004772573243826628,
-0.05477732792496681,
-0.07116314023733139,
-0.03751151263713837,
-0.018972614780068398,
0.07072731852531433,
-0.02737949788570404,
0.013165786862373352,
0.06259548664093018,
0.01815219223499298,
0.09814873337745667,
0.05972885340452194,
-0.11437778919935226,
-0.02082783542573452,
0.03370068594813347,
-0.16274172067642212,
0.026396052911877632,
0.003585362806916237,
0.024268416687846184,
-0.03760373964905739,
0.03867902234196663,
0.14246799051761627,
-0.059396520256996155,
-0.03448933735489845,
-0.04224848374724388,
0.0686301589012146,
0.02582314983010292,
0.14829866588115692,
0.033129967749118805,
0.037020597606897354,
-0.08313072472810745,
0.12322013080120087,
0.030956236645579338,
-0.03299424797296524,
0.024998001754283905,
-0.020357146859169006,
-0.1121482327580452,
0.010607372038066387,
0.061089321970939636,
0.03909730166196823,
-0.05080757290124893,
-0.005490556824952364,
-0.0269783902913332,
-0.07542962580919266,
0.06118328869342804,
0.19040195643901825,
0.06668946892023087,
0.07204979658126831,
-0.056280266493558884,
-0.039568085223436356,
-0.0775008499622345,
0.038966104388237,
0.03306478261947632,
0.07602487504482269,
-0.07860276848077774,
0.09267961233854294,
0.01326206885278225,
0.03929190710186958,
-0.030308112502098083,
-0.05422762408852577,
-0.10561098158359528,
-0.05235641449689865,
-0.09552440792322159,
0.0011171090882271528,
-0.07874888926744461,
-0.03888263925909996,
-0.002647378947585821,
0.004375628661364317,
-0.006273103877902031,
0.0473027341067791,
-0.05989551544189453,
-0.010043070651590824,
-0.01843268610537052,
0.03551420569419861,
-0.06345850974321365,
-0.03372996300458908,
0.02272859774529934,
-0.10292214900255203,
0.09024473279714584,
0.042999230325222015,
0.003375726519152522,
0.008551409468054771,
0.08030260354280472,
-0.02130327746272087,
0.02561131678521633,
0.010583179071545601,
-0.048391059041023254,
-0.07974092662334442,
-0.0029166487511247396,
-0.0070310621522367,
-0.014679123647511005,
-0.0046433135867118835,
0.07876813411712646,
-0.08742936700582504,
0.03150377795100212,
-0.002832039026543498,
-0.0024776251520961523,
-0.07180985063314438,
-0.010077960789203644,
0.10425399988889694,
0.09528578817844391,
0.04931483417749405,
-0.09269373118877411,
0.011403512209653854,
-0.13707737624645233,
-0.03811362758278847,
0.0054795327596366405,
-0.014747465029358864,
-0.1291216015815735,
-0.006691127084195614,
0.023952508345246315,
-0.0019744648598134518,
0.223607137799263,
-0.05588493496179581,
-0.019803214818239212,
0.01760750263929367,
-0.09401757270097733,
0.12284915894269943,
-0.025321250781416893,
0.1843288689851761,
-0.009262444451451302,
-0.043270379304885864,
-0.013093092478811741,
0.04487340524792671,
0.017253387719392776,
-0.026964524760842323,
0.1812630444765091,
0.1386137455701828,
0.024644551798701286,
0.04009794816374779,
-0.023520024493336678,
-0.00468063959851861,
-0.032212406396865845,
-0.03664030879735947,
0.03807613253593445,
0.04409093037247658,
0.014630093239247799,
0.15599983930587769,
0.06561953574419022,
-0.16523517668247223,
0.03547893092036247,
-0.029167471453547478,
-0.03894603252410889,
-0.11230557411909103,
-0.10222496837377548,
-0.026924466714262962,
-0.07176362723112106,
0.012070642784237862,
-0.12755140662193298,
0.00043949103564955294,
0.18201246857643127,
0.0651143416762352,
0.026791179552674294,
0.013508046045899391,
-0.11770344525575638,
-0.03279007226228714,
0.05731653794646263,
0.010527202859520912,
0.018775738775730133,
0.05227058380842209,
0.0031696439255028963,
0.05377710983157158,
0.03108164854347706,
0.016093691810965538,
0.000007453112630173564,
0.07054401189088821,
0.01752866804599762,
0.03893518075346947,
-0.06215362250804901,
-0.0019921509083360434,
-0.039405085146427155,
0.0694306343793869,
0.11490441858768463,
0.04785118252038956,
-0.05571255087852478,
-0.007582501508295536,
0.15275068581104279,
-0.03800015524029732,
-0.0018280423246324062,
-0.12526661157608032,
0.32395851612091064,
0.018274791538715363,
0.010497471317648888,
0.039689015597105026,
-0.0728316679596901,
-0.04819880425930023,
0.21158649027347565,
0.0977342426776886,
-0.0156333539634943,
-0.02062167041003704,
-0.0009345082798972726,
-0.030257364735007286,
-0.02511042356491089,
0.15297889709472656,
0.040125589817762375,
0.12753894925117493,
-0.05439505726099014,
-0.03958190977573395,
-0.024866336956620216,
-0.013421385549008846,
-0.12354245036840439,
0.12332403659820557,
-0.015268432907760143,
-0.025247102603316307,
-0.0704042986035347,
0.029949503019452095,
0.06563922762870789,
-0.30304813385009766,
-0.004496058449149132,
-0.0267824474722147,
-0.1049765944480896,
-0.011498820967972279,
-0.029245294630527496,
-0.02134818024933338,
0.04831062629818916,
-0.040200233459472656,
0.07160370796918869,
0.034359317272901535,
0.03147762641310692,
-0.022435424849390984,
-0.10283231735229492,
0.1714589148759842,
0.059677742421627045,
0.09163538366556168,
0.024832550436258316,
0.07313504070043564,
0.06009358540177345,
0.030639292672276497,
-0.09305370599031448,
0.05410553887486458,
0.013515559956431389,
-0.0862668976187706,
-0.04558191075921059,
0.11447411775588989,
0.0024011065252125263,
0.04587547108530998,
0.03430949151515961,
-0.10022512078285217,
0.023286251351237297,
0.06654218584299088,
-0.06325273215770721,
-0.10306989401578903,
0.0010096448240801692,
-0.09172023087739944,
0.1641237735748291,
0.14756742119789124,
-0.017044633626937866,
0.01922716572880745,
-0.06960565596818924,
-0.007189102470874786,
0.04984267055988312,
-0.005874230060726404,
-0.024939535185694695,
-0.19295267760753632,
0.03545089066028595,
-0.08128951489925385,
-0.010703093372285366,
-0.23022420704364777,
-0.10070386528968811,
-0.007302379235625267,
-0.04720654711127281,
-0.030985206365585327,
0.05434596166014671,
0.028483547270298004,
0.07096853852272034,
-0.018399197608232498,
-0.019926374778151512,
-0.03170469403266907,
0.08940482139587402,
-0.11484259366989136,
-0.062414705753326416
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1100k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1100k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1100k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
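Because each MultiBERTs seed comes with many intermediate checkpoints, a natural use of this model is to track how representations change over the course of pre-training. The sketch below is an illustrative addition rather than part of the official card: it compares the [CLS] representation of the same sentence between the seed-3 checkpoints at 100k and 1100k steps, both released under the naming pattern used above.
```
# Minimal sketch: compare a sentence representation across two
# intermediate checkpoints of the same seed.
import torch
from transformers import BertTokenizer, BertModel

checkpoints = [
    "google/multiberts-seed_3-step_100k",
    "google/multiberts-seed_3-step_1100k",
]
text = "Replace me by any text you'd like."

cls_vectors = []
for name in checkpoints:
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    model.eval()
    encoded_input = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        output = model(**encoded_input)
    # Keep the final-layer [CLS] vector as a sentence-level summary.
    cls_vectors.append(output.last_hidden_state[:, 0, :])

# Cosine similarity between the two checkpoints' representations.
similarity = torch.nn.functional.cosine_similarity(cls_vectors[0], cls_vectors[1])
print(float(similarity))
```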
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1100k"]}
| null |
google/multiberts-seed_3-step_1100k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1100k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1100k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1100k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1100k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1100k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1100k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1100k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07970459759235382,
0.0847078412771225,
-0.001967037795111537,
0.04534630849957466,
0.0879371166229248,
-0.015171745792031288,
0.07750742882490158,
0.09776493906974792,
-0.024238381534814835,
0.025137733668088913,
0.08186281472444534,
0.014875494875013828,
0.015432064421474934,
0.10121389478445053,
0.018581945449113846,
-0.22101837396621704,
0.025971027091145515,
-0.030055180191993713,
-0.06657323986291885,
0.07761331647634506,
0.09722120314836502,
-0.08272993564605713,
0.04724707081913948,
0.022708695381879807,
-0.10489416122436523,
0.049136724323034286,
-0.009562588296830654,
-0.0224725604057312,
0.13091649115085602,
-0.004588865675032139,
0.04881803318858147,
0.04984882101416588,
0.040463246405124664,
-0.1357203722000122,
0.0069214822724461555,
0.05643128976225853,
0.05833932384848595,
0.040497247129678726,
0.02374039776623249,
0.07996189594268799,
0.0017339065670967102,
0.028527580201625824,
0.049968887120485306,
0.02397472783923149,
-0.07625191658735275,
-0.053867314010858536,
-0.10889673233032227,
0.03909459337592125,
0.029378563165664673,
0.011109232902526855,
0.011457708664238453,
0.13593654334545135,
-0.03607488051056862,
0.0494038462638855,
0.19538243114948273,
-0.33113449811935425,
-0.01633615978062153,
0.08024302870035172,
0.04990791529417038,
0.12496951967477798,
-0.0043104784563183784,
-0.01821448840200901,
0.07724688947200775,
0.032945726066827774,
0.09119054675102234,
-0.041046153753995895,
0.020384902134537697,
-0.05293255299329758,
-0.16332818567752838,
-0.039985112845897675,
0.10254912078380585,
-0.0015289621660485864,
-0.13704675436019897,
-0.04273606091737747,
-0.035977449268102646,
0.01916159875690937,
0.011389859020709991,
-0.036634866148233414,
0.031111888587474823,
0.006183878984302282,
-0.012920200824737549,
-0.0032976490911096334,
-0.10120604932308197,
-0.048826441168785095,
0.033900994807481766,
0.07987508177757263,
0.1024937853217125,
0.061946019530296326,
0.004049086943268776,
0.11266031116247177,
-0.18139874935150146,
-0.04733666405081749,
-0.028982514515519142,
-0.06228051707148552,
-0.045951880514621735,
-0.01672438718378544,
-0.10633014887571335,
-0.04203438386321068,
0.008671833202242851,
0.13196803629398346,
0.0025074828881770372,
0.03277261182665825,
-0.024385670199990273,
0.0056603578850626945,
0.05876094102859497,
0.043648771941661835,
-0.01274915412068367,
0.01190101820975542,
0.022595873102545738,
-0.004857280757278204,
-0.021152663975954056,
0.01610635034739971,
0.006310111377388239,
0.03490779176354408,
0.1219768151640892,
0.031734246760606766,
-0.10260937362909317,
0.07892338931560516,
-0.017008677124977112,
-0.04605904594063759,
0.02601650170981884,
-0.09170448780059814,
-0.06542787700891495,
-0.04459160566329956,
-0.0004693852097261697,
0.02069937251508236,
-0.005536970682442188,
-0.00929946731775999,
-0.024845002219080925,
-0.03287731856107712,
-0.08513405174016953,
-0.048051729798316956,
-0.05351919308304787,
-0.13185825943946838,
0.004414149094372988,
-0.1832437664270401,
-0.03864022344350815,
-0.11460645496845245,
-0.19204367697238922,
-0.02770835906267166,
0.06244423985481262,
-0.009463832713663578,
-0.048838838934898376,
0.07850518822669983,
0.03918076679110527,
-0.030167676508426666,
-0.004816162865608931,
0.07433368265628815,
-0.005399690009653568,
0.04057759419083595,
-0.02355830743908882,
0.06543787568807602,
0.003958155866712332,
0.03565060719847679,
-0.05419045314192772,
0.06132898107171059,
-0.17927707731723785,
0.03974289819598198,
-0.08054663985967636,
-0.015690449625253677,
-0.08472542464733124,
-0.03815222531557083,
-0.003664222080260515,
0.006592350080609322,
0.02373499795794487,
0.07675187289714813,
-0.1752404421567917,
-0.022355366498231888,
0.10590625554323196,
-0.1573784202337265,
-0.02986183948814869,
0.06960831582546234,
-0.044695910066366196,
0.0930502787232399,
0.07248365134000778,
0.15064890682697296,
-0.012668836861848831,
-0.08641008287668228,
0.05064989626407623,
-0.014707829803228378,
0.013072512112557888,
-0.015792645514011383,
0.0661044493317604,
-0.02206903137266636,
-0.15239723026752472,
0.037910010665655136,
-0.13264870643615723,
-0.0016541180666536093,
-0.07620292901992798,
0.01982446387410164,
-0.0067374082282185555,
-0.067957803606987,
-0.07756561785936356,
-0.025425933301448822,
0.06296245753765106,
-0.07873459160327911,
-0.013356847688555717,
0.03213796392083168,
0.072052501142025,
-0.0754767507314682,
0.06649134308099747,
-0.011066366918385029,
0.018846362829208374,
-0.08670664578676224,
-0.04178445786237717,
-0.1872919797897339,
0.041260235011577606,
0.09871407598257065,
0.018472351133823395,
-0.024645958095788956,
0.13208624720573425,
0.0062223803251981735,
0.06605453044176102,
-0.04967273026704788,
0.011415149085223675,
-0.003280102740973234,
-0.0032683101017028093,
-0.08622564375400543,
-0.09624975174665451,
-0.07900138199329376,
-0.07157312333583832,
0.06969913840293884,
-0.12148647755384445,
0.02106618881225586,
-0.06154792010784149,
0.03433878719806671,
0.017176613211631775,
-0.08169422298669815,
-0.009076060727238655,
0.01864994503557682,
-0.0592312216758728,
-0.06156308203935623,
0.04360276088118553,
0.06766676157712936,
-0.005741877015680075,
0.08821210265159607,
-0.05650358274579048,
-0.08356795459985733,
0.033450786024332047,
0.10051436722278595,
-0.11252105236053467,
-0.00034767924807965755,
-0.05753542110323906,
-0.04338409751653671,
-0.05730636417865753,
-0.022504897788167,
0.08146636188030243,
-0.006167982239276171,
0.1345328688621521,
-0.07516900449991226,
-0.006975139025598764,
0.011242932640016079,
-0.014469980262219906,
-0.027136310935020447,
0.03528330847620964,
0.0711987242102623,
-0.07710937410593033,
0.014762680977582932,
0.034267816692590714,
0.012663195841014385,
0.0729159265756607,
-0.052058592438697815,
-0.08492454886436462,
0.021026425063610077,
0.038620755076408386,
0.02998749352991581,
0.06878281384706497,
-0.026361433789134026,
-0.017696136608719826,
0.030617667362093925,
0.01988133043050766,
0.009743216447532177,
-0.10803383588790894,
0.05678624287247658,
0.05343542620539665,
0.009095460176467896,
0.06807006895542145,
-0.01214728131890297,
-0.04209509119391441,
0.07789480686187744,
0.034398891031742096,
-0.007028394378721714,
-0.01403440535068512,
-0.01515151746571064,
-0.11607293784618378,
0.19697318971157074,
-0.06395509093999863,
-0.15793542563915253,
-0.07089491188526154,
-0.11996584385633469,
-0.003395714797079563,
0.022011594846844673,
0.03956127166748047,
-0.025067057460546494,
-0.05157926306128502,
-0.12914949655532837,
0.06253236532211304,
-0.038227379322052,
0.06639140099287033,
0.11113003641366959,
-0.04868798330426216,
0.04656260088086128,
-0.13061924278736115,
-0.009334675036370754,
-0.082204170525074,
-0.07267364114522934,
0.058649491518735886,
-0.054466526955366135,
0.032154496759176254,
0.09904912114143372,
0.03137047961354256,
-0.017290765419602394,
-0.031151678413152695,
0.20211414992809296,
0.044018179178237915,
0.03785324841737747,
0.12617357075214386,
-0.0552649199962616,
0.05352237820625305,
0.08272445946931839,
0.010609650053083897,
-0.05141273885965347,
0.05848795920610428,
0.043953027576208115,
-0.06757853180170059,
-0.19793850183486938,
-0.023431481793522835,
-0.011185625568032265,
-0.04143277928233147,
0.0716884657740593,
0.03707798942923546,
0.002866151975467801,
0.07417679578065872,
0.010536620393395424,
0.05738871544599533,
-0.004165025893598795,
0.10252202302217484,
0.01618032157421112,
-0.034806836396455765,
0.08773908019065857,
-0.021151680499315262,
-0.006057315971702337,
0.08174832910299301,
-0.01505511999130249,
0.29833319783210754,
-0.03441065177321434,
0.007847089320421219,
0.12714548408985138,
0.034928563982248306,
0.05598394200205803,
0.13543671369552612,
-0.07151111960411072,
0.016372205689549446,
-0.07254907488822937,
-0.06032492592930794,
0.004143659491091967,
0.03633640334010124,
-0.062221601605415344,
0.013460124842822552,
-0.07412000745534897,
0.01692161150276661,
-0.016954191029071808,
0.3090716302394867,
0.10082254558801651,
-0.11428999900817871,
-0.04981902614235878,
0.00028403231408447027,
-0.09763135761022568,
-0.06776758283376694,
0.04458058252930641,
0.06266908347606659,
-0.1356441229581833,
0.01252758875489235,
-0.027887597680091858,
0.07028600573539734,
-0.016524018719792366,
0.014925415627658367,
0.04290347546339035,
0.044989075511693954,
-0.042198434472084045,
0.0035657789558172226,
-0.18969707190990448,
0.19672685861587524,
0.00648580864071846,
0.02482445165514946,
-0.05147583410143852,
0.03383222222328186,
0.012836563400924206,
-0.03339215740561485,
0.06248463690280914,
0.016202472150325775,
-0.014337806031107903,
-0.05323179438710213,
-0.04987926036119461,
0.01329507865011692,
0.07812067866325378,
-0.032604169100522995,
0.10772494226694107,
-0.0037173344753682613,
0.04501203075051308,
0.018341058865189552,
0.09460555016994476,
-0.18829120695590973,
-0.0946715772151947,
0.030085502192378044,
-0.05575437471270561,
-0.10623711347579956,
-0.07591447234153748,
-0.09821939468383789,
-0.020754385739564896,
0.242680624127388,
-0.10582000017166138,
-0.07654467970132828,
-0.09606343507766724,
0.015933675691485405,
0.10767847299575806,
-0.04458654671907425,
0.030641574412584305,
-0.007325159385800362,
0.11500625312328339,
-0.06716711074113846,
-0.13024471700191498,
0.02453598566353321,
-0.09745971858501434,
-0.15944774448871613,
-0.06728057563304901,
0.11106357723474503,
0.06293913722038269,
0.03164511173963547,
-0.030199874192476273,
0.01986207440495491,
0.036861602216959,
-0.04226190596818924,
-0.00846627913415432,
0.06992758810520172,
0.10218541324138641,
0.03975656256079674,
-0.11559087783098221,
0.01077620405703783,
-0.06752132624387741,
-0.06693682074546814,
0.07684080302715302,
0.26219093799591064,
-0.049734968692064285,
0.11719375848770142,
0.11269954591989517,
-0.07804618030786514,
-0.15035980939865112,
0.03630969673395157,
0.08885050565004349,
-0.019852831959724426,
0.01539234071969986,
-0.15179693698883057,
0.09427360445261002,
0.11624216288328171,
-0.01936875656247139,
0.011026122607290745,
-0.1855446696281433,
-0.12768535315990448,
0.07243312150239944,
0.11086290329694748,
0.2620328366756439,
-0.06953319162130356,
-0.0363435372710228,
0.019408518448472023,
-0.09522278606891632,
0.015134443528950214,
0.1276611089706421,
0.07021420449018478,
-0.028147101402282715,
-0.0850793644785881,
0.009695244021713734,
-0.045333221554756165,
0.09073279798030853,
0.05722948908805847,
0.06362641602754593,
-0.00742763327434659,
0.01590902917087078,
-0.014283515512943268,
-0.04527699574828148,
0.07020983844995499,
0.018248792737722397,
0.04587608948349953,
-0.08453744649887085,
-0.030631089583039284,
-0.07325681298971176,
0.0279860720038414,
-0.025043459609150887,
-0.07717392593622208,
-0.05873860791325569,
0.081009142100811,
0.047636136412620544,
-0.0269299428910017,
0.021634822711348534,
0.024372868239879608,
0.11707727611064911,
0.15194012224674225,
0.006371667608618736,
-0.05303914099931717,
-0.06132448837161064,
-0.03879533335566521,
-0.018842630088329315,
0.06997249275445938,
-0.03727806732058525,
0.014171137474477291,
0.06534062325954437,
0.019504794850945473,
0.0967024564743042,
0.05925454571843147,
-0.11270926892757416,
-0.017740124836564064,
0.03612733259797096,
-0.162795290350914,
0.03382795676589012,
0.0029169858898967505,
0.027640778571367264,
-0.03793802484869957,
0.03980889171361923,
0.14032800495624542,
-0.06058664992451668,
-0.03514504432678223,
-0.04144693538546562,
0.06831642985343933,
0.025355512276291847,
0.14940746128559113,
0.033730749040842056,
0.03654314950108528,
-0.08408673852682114,
0.1280170977115631,
0.03281761705875397,
-0.03713959455490112,
0.02362637035548687,
-0.014996093697845936,
-0.11207632720470428,
0.010531666688621044,
0.060666512697935104,
0.04034048691391945,
-0.047264039516448975,
-0.0067640068009495735,
-0.028119096532464027,
-0.07400910556316376,
0.06274653226137161,
0.19182877242565155,
0.06638498604297638,
0.0697825625538826,
-0.05564548820257187,
-0.0389622263610363,
-0.08009513467550278,
0.0395258404314518,
0.03558642044663429,
0.07791804522275925,
-0.081651471555233,
0.09368324279785156,
0.013084020465612411,
0.03824656456708908,
-0.029226137325167656,
-0.0516660176217556,
-0.10134100914001465,
-0.05337288975715637,
-0.09411199390888214,
0.003813248360529542,
-0.07995087653398514,
-0.03634507581591606,
-0.0026625681202858686,
0.003538871882483363,
-0.00931183435022831,
0.050157733261585236,
-0.06047196313738823,
-0.010749570094048977,
-0.016857100650668144,
0.03688191622495651,
-0.06133059412240982,
-0.036121468991041183,
0.022362608462572098,
-0.10125162452459335,
0.09197025746107101,
0.04340998828411102,
0.002463163807988167,
0.007479365915060043,
0.08310600370168686,
-0.023751195520162582,
0.025866810232400894,
0.01126569602638483,
-0.04642205685377121,
-0.08061258494853973,
-0.0019524571252986789,
-0.006821924354881048,
-0.012991078197956085,
-0.00554017024114728,
0.08270745724439621,
-0.08758512139320374,
0.031249333173036575,
-0.00415003439411521,
-0.0009022725280374289,
-0.0705101266503334,
-0.010978191159665585,
0.10541027039289474,
0.0917675644159317,
0.048950232565402985,
-0.09108800441026688,
0.009954695589840412,
-0.13826121389865875,
-0.03881501778960228,
0.004157842602580786,
-0.016157757490873337,
-0.1279551088809967,
-0.0031881534960120916,
0.023229381069540977,
-0.0024647738318890333,
0.21841752529144287,
-0.060089223086833954,
-0.023380696773529053,
0.019070493057370186,
-0.09151872992515564,
0.11348235607147217,
-0.026237821206450462,
0.18323753774166107,
-0.015281994827091694,
-0.04141339659690857,
-0.008495278656482697,
0.044465627521276474,
0.020044535398483276,
-0.023703917860984802,
0.18721584975719452,
0.13904349505901337,
0.030648818239569664,
0.043681833893060684,
-0.02450065314769745,
-0.007444530725479126,
-0.03834037482738495,
-0.040575627237558365,
0.03961735963821411,
0.044882338494062424,
0.017335500568151474,
0.14661575853824615,
0.0672980546951294,
-0.1637067049741745,
0.03206294775009155,
-0.03041377104818821,
-0.039137594401836395,
-0.11206430941820145,
-0.09689945727586746,
-0.025315096601843834,
-0.06993896514177322,
0.013402823358774185,
-0.12834835052490234,
0.0015221209032461047,
0.1876012533903122,
0.06450651586055756,
0.02719125524163246,
0.013830921612679958,
-0.12041991949081421,
-0.034028008580207825,
0.05664155259728432,
0.010151194408535957,
0.020139893516898155,
0.052492041140794754,
0.0007436653249897063,
0.0541522279381752,
0.02831791527569294,
0.014927107840776443,
-0.0018029073253273964,
0.07066356390714645,
0.01846766471862793,
0.04095517471432686,
-0.06263887882232666,
-0.0033744447864592075,
-0.03916168585419655,
0.07079542428255081,
0.11641532182693481,
0.047510553151369095,
-0.05213728919625282,
-0.0068440563045442104,
0.1523604691028595,
-0.037947699427604675,
0.0004483957018237561,
-0.12525902688503265,
0.3292268216609955,
0.01870144158601761,
0.009747481904923916,
0.04257100448012352,
-0.07300063967704773,
-0.047446127980947495,
0.20607012510299683,
0.09392052888870239,
-0.015423092059791088,
-0.022016538307070732,
-0.0010632970370352268,
-0.030663542449474335,
-0.028778737410902977,
0.15397177636623383,
0.04291096329689026,
0.12014683336019516,
-0.05344262346625328,
-0.044863443821668625,
-0.026724237948656082,
-0.009160771034657955,
-0.11994128674268723,
0.12035691738128662,
-0.017605146393179893,
-0.023695874959230423,
-0.06781614571809769,
0.028080379590392113,
0.06665296852588654,
-0.3057301640510559,
-0.005879284348338842,
-0.02900679223239422,
-0.10981578379869461,
-0.011852518655359745,
-0.02270544320344925,
-0.020411180332303047,
0.047989726066589355,
-0.04079701378941536,
0.07282643765211105,
0.03604412078857422,
0.03195871412754059,
-0.024619339033961296,
-0.10496498644351959,
0.16880092024803162,
0.053304772824048996,
0.09533735364675522,
0.02436746098101139,
0.07317376881837845,
0.05909515917301178,
0.032914068549871445,
-0.09825856238603592,
0.049988068640232086,
0.013561036437749863,
-0.0850103497505188,
-0.046060606837272644,
0.11216054856777191,
0.0001728632050799206,
0.05016795173287392,
0.03409190475940704,
-0.10119222849607468,
0.019410818815231323,
0.0741412565112114,
-0.062197159975767136,
-0.10252577811479568,
-0.0007559172227047384,
-0.09106535464525223,
0.16479329764842987,
0.14580221474170685,
-0.016476072371006012,
0.017636189237236977,
-0.06836039572954178,
-0.0075697945430874825,
0.05005841329693794,
0.004571300931274891,
-0.023124637082219124,
-0.19482234120368958,
0.036080460995435715,
-0.08746952563524246,
-0.007160150911659002,
-0.22960403561592102,
-0.09844483435153961,
-0.010665029287338257,
-0.0530143640935421,
-0.03427116572856903,
0.056166600435972214,
0.03058331273496151,
0.07230959832668304,
-0.018812572583556175,
-0.023716436699032784,
-0.0329238623380661,
0.09158160537481308,
-0.11641206592321396,
-0.06432706117630005
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1200k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1200k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
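The example above returns token-level hidden states. Since the point of the intermediate checkpoints is to study how representations evolve and vary across training, a natural follow-up is to pool those states into a fixed-size sentence vector and compare checkpoints. The sketch below is illustrative only: mean pooling and the particular pair of checkpoints (seed 3 at step 120k vs. step 1200k) are our assumptions, not a procedure prescribed by the MultiBERTs release.

```
import torch
from transformers import BertTokenizer, BertModel

def sentence_embedding(checkpoint, text):
    """Mean-pool the final hidden states into a single sentence vector."""
    tokenizer = BertTokenizer.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint)
    encoded = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        hidden = model(**encoded).last_hidden_state          # [1, seq_len, 768]
    mask = encoded['attention_mask'].unsqueeze(-1).float()   # [1, seq_len, 1]
    return (hidden * mask).sum(1) / mask.sum(1)              # [1, 768]

text = "Replace me by any text you'd like."
emb_late = sentence_embedding("google/multiberts-seed_3-step_1200k", text)
emb_early = sentence_embedding("google/multiberts-seed_3-step_120k", text)

# How similar are the representations at the two training stages?
print(torch.nn.functional.cosine_similarity(emb_late, emb_early).item())
```

Any other pooling strategy (for example, the [CLS] vector) or a downstream probe could be substituted for the mean pooling shown here.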
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1200k"]}
| null |
google/multiberts-seed_3-step_1200k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1200k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1200k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07950849831104279,
0.08805009722709656,
-0.0019697363022714853,
0.04662445932626724,
0.08657962083816528,
-0.01631736569106579,
0.07588598132133484,
0.09888118505477905,
-0.037128131836652756,
0.02427569217979908,
0.08065589517354965,
0.009812753647565842,
0.01571141742169857,
0.09407366067171097,
0.021737074479460716,
-0.22376921772956848,
0.027643119916319847,
-0.031140170991420746,
-0.07918478548526764,
0.07901682704687119,
0.0965474545955658,
-0.07863625138998032,
0.04673079028725624,
0.021631499752402306,
-0.10934159904718399,
0.05089917033910751,
-0.007867748849093914,
-0.021267928183078766,
0.13241606950759888,
-0.0016279142582789063,
0.048116009682416916,
0.052226096391677856,
0.039932262152433395,
-0.13316121697425842,
0.008100073784589767,
0.0555051788687706,
0.05914054811000824,
0.03974582999944687,
0.0225231796503067,
0.08023800700902939,
0.0005342452204786241,
0.03068920597434044,
0.04903106391429901,
0.024515021592378616,
-0.07619624584913254,
-0.042930036783218384,
-0.10734251141548157,
0.0404624305665493,
0.027393030002713203,
0.012732627801597118,
0.012323476374149323,
0.13465555012226105,
-0.040681879967451096,
0.04697587341070175,
0.19721320271492004,
-0.32779717445373535,
-0.018109600991010666,
0.08855438977479935,
0.0457320474088192,
0.12827633321285248,
-0.00427433755248785,
-0.01985427550971508,
0.07655229419469833,
0.03356141224503517,
0.08848366141319275,
-0.040482912212610245,
0.01625281199812889,
-0.05384353548288345,
-0.16322794556617737,
-0.03970470279455185,
0.09639879316091537,
-0.0008564714225940406,
-0.13495472073554993,
-0.03973816707730293,
-0.0365525521337986,
0.019916629418730736,
0.012749052606523037,
-0.03399583324790001,
0.029480233788490295,
0.00583255710080266,
-0.01980575919151306,
-0.0012770900502800941,
-0.09891252964735031,
-0.05087937414646149,
0.03218649700284004,
0.08082596212625504,
0.1027032732963562,
0.061155274510383606,
0.0035067289136350155,
0.11335987597703934,
-0.1843794733285904,
-0.04612324759364128,
-0.031759269535541534,
-0.06554648280143738,
-0.04801933467388153,
-0.014903077855706215,
-0.1085950955748558,
-0.03767920657992363,
0.010863649658858776,
0.1366022825241089,
-0.005772875621914864,
0.03376494348049164,
-0.022407643496990204,
0.004365373402833939,
0.05449637770652771,
0.03893037140369415,
-0.016482191160321236,
0.018554357811808586,
0.020866189152002335,
-0.0032832755241543055,
-0.025230184197425842,
0.017915325239300728,
0.006928314454853535,
0.03152246028184891,
0.11976747214794159,
0.02950792759656906,
-0.10512228310108185,
0.07762168347835541,
-0.014181734062731266,
-0.04701962321996689,
0.023481372743844986,
-0.09078431874513626,
-0.06887481361627579,
-0.04591628164052963,
-0.00018804329738486558,
0.021628743037581444,
-0.0068886964581906796,
-0.011475021950900555,
-0.023017119616270065,
-0.03189871087670326,
-0.08413161337375641,
-0.04974968731403351,
-0.05381150171160698,
-0.12796230614185333,
0.004477379843592644,
-0.18571342527866364,
-0.03759089484810829,
-0.11634480208158493,
-0.18823479115962982,
-0.02833300642669201,
0.06256432831287384,
-0.011647828854620457,
-0.05373995751142502,
0.08146323263645172,
0.037192560732364655,
-0.031843412667512894,
-0.005909621715545654,
0.07181832194328308,
-0.004304548725485802,
0.040411077439785004,
-0.02328202687203884,
0.06689783930778503,
0.00375863304361701,
0.0355711355805397,
-0.05379645153880119,
0.06128882244229317,
-0.18010716140270233,
0.04245508462190628,
-0.08045339584350586,
-0.01639215275645256,
-0.08450528979301453,
-0.03889910876750946,
-0.010198229923844337,
0.0070118531584739685,
0.024974985048174858,
0.07634779810905457,
-0.17925377190113068,
-0.02050838991999626,
0.1129777804017067,
-0.15704034268856049,
-0.02959476038813591,
0.06839510053396225,
-0.04638611897826195,
0.09305839985609055,
0.07025937736034393,
0.1509542018175125,
-0.007891948334872723,
-0.08159051835536957,
0.051970940083265305,
-0.014001864939928055,
0.015376968309283257,
-0.018274227157235146,
0.06493497639894485,
-0.02214912511408329,
-0.15409088134765625,
0.03742317110300064,
-0.13282279670238495,
-0.00004026819806313142,
-0.07566189765930176,
0.019936352968215942,
-0.006519227288663387,
-0.07086894661188126,
-0.07961779832839966,
-0.026410214602947235,
0.06392577290534973,
-0.07559364289045334,
-0.011622483842074871,
0.038033876568078995,
0.07372927665710449,
-0.07589029520750046,
0.06603342294692993,
-0.009085544385015965,
0.019490104168653488,
-0.08514194935560226,
-0.041705094277858734,
-0.19284015893936157,
0.040111493319272995,
0.09533467888832092,
0.014881627634167671,
-0.02154010906815529,
0.13262727856636047,
0.007331873290240765,
0.06396125257015228,
-0.052896760404109955,
0.013392760418355465,
-0.0024162540212273598,
-0.0029751367401331663,
-0.08353603631258011,
-0.09434902667999268,
-0.07509113848209381,
-0.07003426551818848,
0.07097502052783966,
-0.11715970188379288,
0.02189098298549652,
-0.0603802315890789,
0.036224231123924255,
0.014904468320310116,
-0.08147874474525452,
-0.008586741983890533,
0.019064344465732574,
-0.0579889640212059,
-0.06271079927682877,
0.042289335280656815,
0.06878913938999176,
-0.006168225314468145,
0.08761154115200043,
-0.05329432338476181,
-0.08588570356369019,
0.03297777846455574,
0.10806887596845627,
-0.11557590216398239,
-0.00020481237152125686,
-0.0581001453101635,
-0.04263513162732124,
-0.06130685284733772,
-0.019093189388513565,
0.07923420518636703,
-0.006201989483088255,
0.13240420818328857,
-0.07693761587142944,
-0.003978746943175793,
0.01173335500061512,
-0.01417861133813858,
-0.022957077249884605,
0.03715695068240166,
0.0779229924082756,
-0.0692911371588707,
0.015135638415813446,
0.032184429466724396,
0.00887672696262598,
0.07057302445173264,
-0.05543416365981102,
-0.08314290642738342,
0.022733643651008606,
0.03990554064512253,
0.03047383390367031,
0.06922844797372818,
-0.011006672866642475,
-0.014287024736404419,
0.028711257502436638,
0.022392820566892624,
0.009647796861827374,
-0.10778060555458069,
0.05481571704149246,
0.0538642592728138,
0.008573848754167557,
0.07056104391813278,
-0.00988225731998682,
-0.04239173233509064,
0.07780978083610535,
0.03650004416704178,
-0.009452174417674541,
-0.013128106482326984,
-0.013501058332622051,
-0.11271002888679504,
0.20029830932617188,
-0.061924878507852554,
-0.15789912641048431,
-0.07066628336906433,
-0.11581002920866013,
-0.007450982462614775,
0.01938098482787609,
0.03819381445646286,
-0.02867726795375347,
-0.048150721937417984,
-0.1279662847518921,
0.05979163944721222,
-0.035519495606422424,
0.06767962872982025,
0.10731582343578339,
-0.04839035123586655,
0.0521591454744339,
-0.12925100326538086,
-0.00789156649261713,
-0.08187657594680786,
-0.07149828225374222,
0.059493713080883026,
-0.05741395428776741,
0.03304414823651314,
0.09951742738485336,
0.0287935808300972,
-0.016915496438741684,
-0.030421718955039978,
0.20673412084579468,
0.04333069548010826,
0.03588053584098816,
0.12684912979602814,
-0.05395933985710144,
0.05299210548400879,
0.08713571727275848,
0.012181694619357586,
-0.05084608867764473,
0.05941331014037132,
0.047951921820640564,
-0.06966093182563782,
-0.19779425859451294,
-0.021303938701748848,
-0.010574012994766235,
-0.04249873012304306,
0.07146406173706055,
0.03780827671289444,
0.008090975694358349,
0.07416656613349915,
0.012359879910945892,
0.0541183240711689,
-0.005689659155905247,
0.10147138684988022,
0.013768952339887619,
-0.029869884252548218,
0.08901698142290115,
-0.020472504198551178,
-0.006456234026700258,
0.08112216740846634,
-0.015874594449996948,
0.30029696226119995,
-0.03609530255198479,
0.0022392834071069956,
0.12618328630924225,
0.037955392152071,
0.06051701307296753,
0.13247834146022797,
-0.07165239006280899,
0.015947265550494194,
-0.07088875770568848,
-0.06029236316680908,
0.0029400610364973545,
0.03543998301029205,
-0.06561620533466339,
0.011489657685160637,
-0.07266126573085785,
0.020962584763765335,
-0.01895427703857422,
0.3094578981399536,
0.10057249665260315,
-0.1127738356590271,
-0.050969481468200684,
-0.0012688353890553117,
-0.09717104583978653,
-0.06717973947525024,
0.04923831671476364,
0.061281539499759674,
-0.13482765853405,
0.009538166224956512,
-0.02909776195883751,
0.07308115065097809,
-0.017422428354620934,
0.0157944206148386,
0.04510059952735901,
0.04507790878415108,
-0.04328271374106407,
0.0027116918936371803,
-0.18672031164169312,
0.19762328267097473,
0.005185391288250685,
0.0222256351262331,
-0.04632012918591499,
0.03186618164181709,
0.013474129140377045,
-0.030082393437623978,
0.06095650792121887,
0.017978042364120483,
-0.01862776093184948,
-0.047195691615343094,
-0.051323071122169495,
0.013761027716100216,
0.07674185186624527,
-0.03299936279654503,
0.1069163903594017,
-0.002161019016057253,
0.04482567682862282,
0.01811808906495571,
0.08812197297811508,
-0.1875741183757782,
-0.09474793076515198,
0.02832801267504692,
-0.05921848863363266,
-0.09905622899532318,
-0.07598670572042465,
-0.09836144745349884,
-0.017652180045843124,
0.23936399817466736,
-0.10208044201135635,
-0.07716100662946701,
-0.09726796299219131,
0.01841501146554947,
0.10992715507745743,
-0.04445740208029747,
0.029360683634877205,
-0.008964575827121735,
0.1131463497877121,
-0.0657079666852951,
-0.13292354345321655,
0.023644957691431046,
-0.09611625224351883,
-0.15785226225852966,
-0.06416933983564377,
0.10909771174192429,
0.06329181045293808,
0.03261838108301163,
-0.033007148653268814,
0.01898314617574215,
0.039496030658483505,
-0.04302041232585907,
-0.011900704354047775,
0.0652986466884613,
0.09875494241714478,
0.038074884563684464,
-0.11207675188779831,
0.015281856060028076,
-0.06777750700712204,
-0.06439334154129028,
0.07376603782176971,
0.258707195520401,
-0.05100437253713608,
0.11480588465929031,
0.11505495011806488,
-0.07579813152551651,
-0.15073364973068237,
0.038229260593652725,
0.0896952897310257,
-0.019726237282156944,
0.01585293374955654,
-0.15301643311977386,
0.09339161962270737,
0.11517244577407837,
-0.01905379816889763,
0.000054852636822033674,
-0.1877470165491104,
-0.1276252716779709,
0.06925142556428909,
0.10715936869382858,
0.26634830236434937,
-0.06881947815418243,
-0.036798976361751556,
0.021251697093248367,
-0.0945228859782219,
0.015517614781856537,
0.12887631356716156,
0.06990732252597809,
-0.02988172322511673,
-0.0816923975944519,
0.009053654037415981,
-0.04494297876954079,
0.08969580382108688,
0.05787382647395134,
0.062347181141376495,
-0.00769596453756094,
0.022125745192170143,
-0.024220652878284454,
-0.04571168124675751,
0.06716212630271912,
0.01862737163901329,
0.04849783703684807,
-0.08421590179204941,
-0.02922525815665722,
-0.07536374032497406,
0.025317318737506866,
-0.025679251179099083,
-0.07702015340328217,
-0.05970170721411705,
0.08569082617759705,
0.048326969146728516,
-0.026545509696006775,
0.01995171792805195,
0.023456374183297157,
0.12004272639751434,
0.14766518771648407,
0.004183113109320402,
-0.050143446773290634,
-0.06769006699323654,
-0.03566056489944458,
-0.016544796526432037,
0.07012102007865906,
-0.046114180237054825,
0.01354158204048872,
0.06671901047229767,
0.01863950677216053,
0.09430372714996338,
0.05976862832903862,
-0.11271322518587112,
-0.02027747966349125,
0.03617413341999054,
-0.16434063017368317,
0.03432068973779678,
0.003402675734832883,
0.027632348239421844,
-0.03871134668588638,
0.03903871774673462,
0.1414782702922821,
-0.06381634622812271,
-0.03386908397078514,
-0.038763321936130524,
0.06893578916788101,
0.025163549929857254,
0.14874859154224396,
0.03664636239409447,
0.03663874790072441,
-0.08313626050949097,
0.12460197508335114,
0.031423330307006836,
-0.04041273519396782,
0.02144027315080166,
-0.015266773290932178,
-0.11449262499809265,
0.01232914812862873,
0.06719128042459488,
0.02974913828074932,
-0.055804070085287094,
-0.008013303391635418,
-0.028360793367028236,
-0.07552259415388107,
0.06245872378349304,
0.19442270696163177,
0.06723802536725998,
0.0682850107550621,
-0.05363958701491356,
-0.041218351572752,
-0.07948777824640274,
0.03837531805038452,
0.03563554212450981,
0.0801311656832695,
-0.07984156906604767,
0.08562055975198746,
0.013322527520358562,
0.038387179374694824,
-0.028421694412827492,
-0.05108638107776642,
-0.10409672558307648,
-0.053393956273794174,
-0.10536210238933563,
0.0006233025342226028,
-0.0778222605586052,
-0.03472480922937393,
-0.003931635990738869,
0.0052863177843391895,
-0.009327505715191364,
0.04892523214221001,
-0.061442781239748,
-0.009967276826500893,
-0.018267974257469177,
0.03506670892238617,
-0.05939797684550285,
-0.03165430948138237,
0.02323095127940178,
-0.1005583256483078,
0.09063813835382462,
0.0422193817794323,
0.0034142460208386183,
0.006775792688131332,
0.09464952349662781,
-0.024868285283446312,
0.023161856457591057,
0.012572678737342358,
-0.04619961604475975,
-0.07850690931081772,
-0.000679772871080786,
-0.007096811663359404,
-0.011295672506093979,
-0.005201312247663736,
0.0770629271864891,
-0.08722740411758423,
0.0381406806409359,
-0.005511510651558638,
0.002040891908109188,
-0.07170777022838593,
-0.01166023500263691,
0.10678400099277496,
0.09483680129051208,
0.04733310639858246,
-0.09082701802253723,
0.009059876203536987,
-0.1361510306596756,
-0.03981965035200119,
0.00531290378421545,
-0.015728497877717018,
-0.12756407260894775,
-0.004068400245159864,
0.024479595944285393,
-0.0013636157382279634,
0.21495471894741058,
-0.06026357784867287,
-0.01868385449051857,
0.020111069083213806,
-0.09290917217731476,
0.11692297458648682,
-0.026004821062088013,
0.17522893846035004,
-0.011991197243332863,
-0.041365038603544235,
-0.004506605677306652,
0.04339030012488365,
0.016480417922139168,
-0.024458639323711395,
0.19101573526859283,
0.13897809386253357,
0.027281275019049644,
0.04045737162232399,
-0.020955955609679222,
-0.003463461995124817,
-0.03242519870400429,
-0.0393533781170845,
0.03688725456595421,
0.042679015547037125,
0.016193944960832596,
0.14136703312397003,
0.06808209419250488,
-0.1594146192073822,
0.03328146040439606,
-0.03382885828614235,
-0.040461715310811996,
-0.11305715888738632,
-0.09172169119119644,
-0.0243728868663311,
-0.0733882337808609,
0.013123301789164543,
-0.1280827671289444,
-0.0011208022478967905,
0.19574882090091705,
0.06670528650283813,
0.029215015470981598,
0.017653822898864746,
-0.11697451025247574,
-0.033277105540037155,
0.0550956092774868,
0.009782726876437664,
0.02328225038945675,
0.055892981588840485,
0.0016124992398545146,
0.05589066818356514,
0.029273487627506256,
0.013353614136576653,
-0.0007058905903249979,
0.07243238389492035,
0.018809236586093903,
0.041399165987968445,
-0.059971485286951065,
-0.0031700620893388987,
-0.03813447058200836,
0.06913112103939056,
0.1195295974612236,
0.047476790845394135,
-0.0514456108212471,
-0.0071577043272554874,
0.15310059487819672,
-0.0397256575524807,
-0.005069593898952007,
-0.12794926762580872,
0.3270500898361206,
0.017184844240546227,
0.010473283007740974,
0.04033557325601578,
-0.07299680262804031,
-0.049943290650844574,
0.21156777441501617,
0.09410203993320465,
-0.016771728172898293,
-0.023053517565131187,
0.0002693353744689375,
-0.030444802716374397,
-0.02671295963227749,
0.1549854278564453,
0.043444786220788956,
0.12279649823904037,
-0.05374245345592499,
-0.04931384325027466,
-0.026548238471150398,
-0.01023951731622219,
-0.11892427504062653,
0.11953943222761154,
-0.017162209376692772,
-0.02317623607814312,
-0.06856124848127365,
0.031310684978961945,
0.06153368577361107,
-0.30466428399086,
-0.007215035613626242,
-0.02831057272851467,
-0.11110910773277283,
-0.013156550005078316,
-0.0244466383010149,
-0.021872850134968758,
0.04578734189271927,
-0.04298524931073189,
0.07233668863773346,
0.0343841053545475,
0.03160585090517998,
-0.022878292948007584,
-0.1074647456407547,
0.17088723182678223,
0.06286463141441345,
0.08660807460546494,
0.02511860616505146,
0.07420561462640762,
0.05969037115573883,
0.03197634592652321,
-0.09904513508081436,
0.047192711383104324,
0.01547615509480238,
-0.08926106244325638,
-0.05072823539376259,
0.11286583542823792,
0.0024310979060828686,
0.048728227615356445,
0.03389677032828331,
-0.09749037027359009,
0.016611354425549507,
0.0729253813624382,
-0.05812962353229523,
-0.10614345222711563,
-0.0001585882855579257,
-0.08934106677770615,
0.16505755484104156,
0.14854221045970917,
-0.01622651517391205,
0.01983504183590412,
-0.06822901219129562,
-0.009115750901401043,
0.04831228405237198,
-0.0008707845117896795,
-0.023645084351301193,
-0.19522586464881897,
0.03405728191137314,
-0.0898408442735672,
-0.008777639828622341,
-0.23143741488456726,
-0.09860582649707794,
-0.007483270484954119,
-0.051044657826423645,
-0.031244875863194466,
0.06025132164359093,
0.030779657885432243,
0.07236757129430771,
-0.017625968903303146,
-0.018713993951678276,
-0.035358309745788574,
0.09003474563360214,
-0.11552373319864273,
-0.06568517535924911
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 120k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 120k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_120k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_120k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_120k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
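The card demonstrates feature extraction only. Because the checkpoint was trained with the MLM objective, it can in principle also fill in masked tokens. The sketch below assumes the uploaded weights include the masked-language-modelling head; if they do not, `BertForMaskedLM` initializes that head randomly (Transformers prints a warning) and the prediction will be meaningless. The example sentence is purely illustrative.

```
import torch
from transformers import BertTokenizer, BertForMaskedLM

checkpoint = "google/multiberts-seed_3-step_120k"
tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertForMaskedLM.from_pretrained(checkpoint)  # assumes the MLM head was exported

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits                  # [1, seq_len, vocab_size]

# Locate the [MASK] position, then take the highest-scoring vocabulary entry there.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```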
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_120k"]}
| null |
google/multiberts-seed_3-step_120k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_120k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 120k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 120k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 120k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 120k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_120k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 120k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 120k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07655264437198639,
0.09173023700714111,
-0.0020728090312331915,
0.04208279773592949,
0.08778317272663116,
-0.01705910451710224,
0.07695546746253967,
0.09937377274036407,
-0.033000294119119644,
0.02351762354373932,
0.08107065409421921,
0.014846362173557281,
0.014841188676655293,
0.09147629886865616,
0.02027028240263462,
-0.22501343488693237,
0.02603796124458313,
-0.031057408079504967,
-0.08089318871498108,
0.07942799478769302,
0.09722135215997696,
-0.07867120206356049,
0.04693492501974106,
0.024337291717529297,
-0.10763794183731079,
0.050776343792676926,
-0.008685922250151634,
-0.019432123750448227,
0.13243016600608826,
-0.0028857593424618244,
0.048315346240997314,
0.05218752846121788,
0.03872733190655708,
-0.13666729629039764,
0.008407633751630783,
0.05712193623185158,
0.05693987384438515,
0.0393645353615284,
0.02460298500955105,
0.0780889019370079,
0.006324330810457468,
0.02679077535867691,
0.04749009385704994,
0.02499215304851532,
-0.07587497681379318,
-0.04876842722296715,
-0.10648388415575027,
0.03325742855668068,
0.025270527228713036,
0.014861851930618286,
0.01105005294084549,
0.1370110809803009,
-0.041657187044620514,
0.04887394979596138,
0.19767247140407562,
-0.3356676995754242,
-0.016194501891732216,
0.0852712020277977,
0.04737627133727074,
0.12065013498067856,
-0.0066294982098042965,
-0.016952961683273315,
0.07436017692089081,
0.033124540001153946,
0.09125950932502747,
-0.04084746167063713,
0.026238512247800827,
-0.05207831412553787,
-0.16400308907032013,
-0.039980143308639526,
0.09389443695545197,
-0.0014329355908557773,
-0.13613739609718323,
-0.04034539684653282,
-0.036450136452913284,
0.02556481771171093,
0.012074253521859646,
-0.032502833753824234,
0.030760398134589195,
0.006802125833928585,
-0.020778238773345947,
-0.0018944265320897102,
-0.10108675062656403,
-0.05004386231303215,
0.03452754020690918,
0.07877763360738754,
0.10327401012182236,
0.061246324330568314,
0.0034889739472419024,
0.11127638816833496,
-0.18393805623054504,
-0.047082558274269104,
-0.03038625791668892,
-0.06326877325773239,
-0.048406098037958145,
-0.014095155522227287,
-0.1102396622300148,
-0.041597187519073486,
0.01293712668120861,
0.13658852875232697,
-0.0064667523838579655,
0.0342041440308094,
-0.022041765972971916,
0.004337280988693237,
0.056069567799568176,
0.042124103754758835,
-0.012429488822817802,
0.017442496493458748,
0.02013329602777958,
-0.007364362478256226,
-0.02516893856227398,
0.018404442816972733,
0.005188756622374058,
0.033378683030605316,
0.12023317068815231,
0.02949479967355728,
-0.10764390975236893,
0.0793057382106781,
-0.014352879486978054,
-0.04765721410512924,
0.022729406133294106,
-0.08934718370437622,
-0.06819259375333786,
-0.04260270297527313,
0.0007956220651976764,
0.023389503359794617,
-0.006976651959121227,
-0.010490433312952518,
-0.021539250388741493,
-0.03304729238152504,
-0.08221261948347092,
-0.047645095735788345,
-0.054952509701251984,
-0.1291816085577011,
0.0054654330015182495,
-0.1892804354429245,
-0.03724166005849838,
-0.11561421304941177,
-0.18727947771549225,
-0.027249522507190704,
0.06257530301809311,
-0.01116899587213993,
-0.049987226724624634,
0.08017295598983765,
0.0390051044523716,
-0.03033529780805111,
-0.003940633963793516,
0.07374868541955948,
-0.006068452727049589,
0.04014056175947189,
-0.02123379148542881,
0.06657438725233078,
0.005896999035030603,
0.03562772646546364,
-0.05290226638317108,
0.06133534014225006,
-0.17560577392578125,
0.03933901712298393,
-0.07918974757194519,
-0.019919641315937042,
-0.08624757826328278,
-0.03871342912316322,
-0.006608268711715937,
0.006952452473342419,
0.022831739857792854,
0.07464669644832611,
-0.17827801406383514,
-0.02309294231235981,
0.11179865151643753,
-0.15572920441627502,
-0.02687975950539112,
0.07027444988489151,
-0.045861952006816864,
0.09163074940443039,
0.06934034079313278,
0.14931051433086395,
-0.01243890356272459,
-0.08803734928369522,
0.05276620015501976,
-0.01356485765427351,
0.015380440279841423,
-0.014963997527956963,
0.067873515188694,
-0.021260607987642288,
-0.1540234088897705,
0.03791258856654167,
-0.13428045809268951,
0.00044867396354675293,
-0.07611586898565292,
0.022055145353078842,
-0.0052597858011722565,
-0.07027538865804672,
-0.07617552578449249,
-0.02664775401353836,
0.0648459643125534,
-0.07578986883163452,
-0.01400773972272873,
0.04004497453570366,
0.07313460856676102,
-0.07681479305028915,
0.06398432701826096,
-0.01160531397908926,
0.01837952993810177,
-0.08541307598352432,
-0.041623298078775406,
-0.19129528105258942,
0.039706092327833176,
0.09506117552518845,
0.016150327399373055,
-0.02119983173906803,
0.13895131647586823,
0.008691868744790554,
0.06587430834770203,
-0.05114628002047539,
0.013081798329949379,
-0.0042945221066474915,
-0.0032244950998574495,
-0.08286073803901672,
-0.09539906680583954,
-0.07649370282888412,
-0.07099665701389313,
0.07730890810489655,
-0.12205354869365692,
0.02024286612868309,
-0.06247086077928543,
0.03850224241614342,
0.0167051013559103,
-0.08000907301902771,
-0.008223975077271461,
0.018123524263501167,
-0.05805497244000435,
-0.06040530651807785,
0.0447283536195755,
0.06875389814376831,
-0.006051577161997557,
0.09153872728347778,
-0.053943976759910583,
-0.08371350914239883,
0.032496385276317596,
0.1019788458943367,
-0.11313505470752716,
0.0020800388883799314,
-0.05755885690450668,
-0.041770875453948975,
-0.06005154922604561,
-0.017961403355002403,
0.07836266607046127,
-0.005375276319682598,
0.13513247668743134,
-0.07582982629537582,
-0.007176623679697514,
0.010523010976612568,
-0.016564520075917244,
-0.02342759631574154,
0.039813823997974396,
0.07525903731584549,
-0.07996052503585815,
0.016422858461737633,
0.03657407686114311,
0.006846167612820864,
0.07654400169849396,
-0.05427141860127449,
-0.08468257635831833,
0.02096455544233322,
0.03945229575037956,
0.030508091673254967,
0.06774657964706421,
-0.015272744931280613,
-0.015398421324789524,
0.029869074001908302,
0.019691728055477142,
0.008361984975636005,
-0.10744837671518326,
0.05547004193067551,
0.0536198727786541,
0.00769207114353776,
0.07024958729743958,
-0.008217645809054375,
-0.043258484452962875,
0.07685504108667374,
0.03619828075170517,
-0.00918972585350275,
-0.013332441449165344,
-0.014496376737952232,
-0.1123872771859169,
0.19946427643299103,
-0.0625598356127739,
-0.16256368160247803,
-0.07050236314535141,
-0.11638566851615906,
-0.00733787938952446,
0.0205730888992548,
0.03916125372052193,
-0.02791582979261875,
-0.04892094060778618,
-0.1265462040901184,
0.06208755075931549,
-0.03755528852343559,
0.06639687716960907,
0.10425958782434464,
-0.04663129150867462,
0.05289708077907562,
-0.12860040366649628,
-0.008773268200457096,
-0.08264075964689255,
-0.07300424575805664,
0.06128599867224693,
-0.05750516429543495,
0.03329992666840553,
0.09552972763776779,
0.03089945949614048,
-0.017210030928254128,
-0.029847459867596626,
0.2064303755760193,
0.04261668771505356,
0.03366778790950775,
0.13121135532855988,
-0.05468132346868515,
0.05289294198155403,
0.07969601452350616,
0.010986585170030594,
-0.05026343837380409,
0.05873257666826248,
0.05118533596396446,
-0.06847114861011505,
-0.1971491277217865,
-0.021121365949511528,
-0.011958575807511806,
-0.04433438554406166,
0.0710446685552597,
0.03830099478363991,
0.006067797541618347,
0.07149109989404678,
0.011735358275473118,
0.05463365837931633,
-0.0057649458758533,
0.10244867950677872,
0.018130065873265266,
-0.03134223818778992,
0.09034992009401321,
-0.02159924805164337,
-0.009278888814151287,
0.079845130443573,
-0.015428497456014156,
0.2936559021472931,
-0.03243708238005638,
0.009783272631466389,
0.1276240348815918,
0.037948448210954666,
0.06030983850359917,
0.13187840580940247,
-0.07169822603464127,
0.014825054444372654,
-0.07272849977016449,
-0.06095482409000397,
0.0010994618060067296,
0.035391535609960556,
-0.06070203706622124,
0.008216803893446922,
-0.07168757170438766,
0.01783941686153412,
-0.01896257884800434,
0.31159693002700806,
0.10358013957738876,
-0.11432893574237823,
-0.05113920941948891,
-0.0018057195702567697,
-0.0966777577996254,
-0.06573621928691864,
0.04618281498551369,
0.05911199375987053,
-0.13605108857154846,
0.01162025984376669,
-0.03055793233215809,
0.07333681732416153,
-0.01805063895881176,
0.017450951039791107,
0.04386162757873535,
0.045048460364341736,
-0.041654232889413834,
0.005448957905173302,
-0.19058480858802795,
0.19463814795017242,
0.005884354934096336,
0.021956970915198326,
-0.048529695719480515,
0.02996559627354145,
0.013032053597271442,
-0.031993817538022995,
0.060078203678131104,
0.01580844447016716,
-0.02013091929256916,
-0.04977075383067131,
-0.04863695427775383,
0.014242795296013355,
0.08034530282020569,
-0.03491220995783806,
0.10985548794269562,
-0.005219577811658382,
0.042617928236722946,
0.01833094097673893,
0.0909666046500206,
-0.18739452958106995,
-0.09402931481599808,
0.027423826977610588,
-0.0587075911462307,
-0.10179373621940613,
-0.07434511929750443,
-0.09700090438127518,
-0.019031766802072525,
0.2433055341243744,
-0.10319104790687561,
-0.07435289770364761,
-0.09790390729904175,
0.016742374747991562,
0.10698294639587402,
-0.04524501413106918,
0.027473418042063713,
-0.009669928811490536,
0.11809483915567398,
-0.06596594303846359,
-0.13492080569267273,
0.0234695952385664,
-0.09782690554857254,
-0.15940485894680023,
-0.06368173658847809,
0.11123348027467728,
0.06364520639181137,
0.032469626516103745,
-0.030811144039034843,
0.01926853507757187,
0.03665217384696007,
-0.04123803600668907,
-0.007468560244888067,
0.06573888659477234,
0.10237610340118408,
0.037319935858249664,
-0.1136520504951477,
0.02215009555220604,
-0.06696052104234695,
-0.0635153129696846,
0.07578322291374207,
0.2598404586315155,
-0.049174997955560684,
0.11720094829797745,
0.11058013141155243,
-0.07534994184970856,
-0.15062496066093445,
0.03485826402902603,
0.09162428230047226,
-0.019663052633404732,
0.018547890707850456,
-0.15567302703857422,
0.09213821589946747,
0.11337098479270935,
-0.01991211250424385,
0.004620157182216644,
-0.182893767952919,
-0.1259642392396927,
0.07033952325582504,
0.10719206184148788,
0.2642979621887207,
-0.06938843429088593,
-0.03573739901185036,
0.01797187700867653,
-0.09341315180063248,
0.020893989130854607,
0.11977240443229675,
0.07238108664751053,
-0.02899080142378807,
-0.07993118464946747,
0.009210561402142048,
-0.044446706771850586,
0.08884244412183762,
0.05418885126709938,
0.06163213029503822,
-0.006154031027108431,
0.02492804266512394,
-0.022961173206567764,
-0.04503701999783516,
0.06857892870903015,
0.012934505008161068,
0.04662957042455673,
-0.08382073789834976,
-0.03034297749400139,
-0.0753704309463501,
0.02887689135968685,
-0.02514837309718132,
-0.07608791440725327,
-0.057561930269002914,
0.08422696590423584,
0.04634205996990204,
-0.025479918345808983,
0.025500381365418434,
0.02243758551776409,
0.1204257532954216,
0.15368328988552094,
0.003056837012991309,
-0.04471203684806824,
-0.06934305280447006,
-0.03822987526655197,
-0.01630408689379692,
0.07076592743396759,
-0.04209984466433525,
0.014218461699783802,
0.06339046359062195,
0.018828963860869408,
0.09649170935153961,
0.05828503891825676,
-0.11489558964967728,
-0.020599447190761566,
0.0345064215362072,
-0.16580873727798462,
0.03242648392915726,
0.0026274253614246845,
0.027701077982783318,
-0.03568514436483383,
0.04050472378730774,
0.14522181451320648,
-0.06209754943847656,
-0.03332340344786644,
-0.03942357003688812,
0.0675959512591362,
0.025256790220737457,
0.14676538109779358,
0.03547224402427673,
0.03733404353260994,
-0.0828167200088501,
0.12324996292591095,
0.02947264350950718,
-0.036396898329257965,
0.022694764658808708,
-0.019233521074056625,
-0.11326213926076889,
0.010081450454890728,
0.06752918660640717,
0.038722436875104904,
-0.05619048327207565,
-0.010125813074409962,
-0.030054964125156403,
-0.07364266365766525,
0.06258644163608551,
0.19314168393611908,
0.06915727257728577,
0.0694645494222641,
-0.05420439690351486,
-0.040561407804489136,
-0.07932926714420319,
0.037984587252140045,
0.0335262194275856,
0.07765859365463257,
-0.07985582202672958,
0.08516035228967667,
0.014076466672122478,
0.040820471942424774,
-0.029746582731604576,
-0.05196674168109894,
-0.10593000054359436,
-0.05371376872062683,
-0.10202336311340332,
0.0019806143827736378,
-0.0758313462138176,
-0.035165369510650635,
-0.0024773464538156986,
0.004833288956433535,
-0.007285112049430609,
0.05000752583146095,
-0.062337152659893036,
-0.009999641217291355,
-0.01798589527606964,
0.035552866756916046,
-0.06200689449906349,
-0.03339356556534767,
0.023741189390420914,
-0.10193140059709549,
0.08982783555984497,
0.04409905895590782,
0.004199398681521416,
0.010776999406516552,
0.08754948526620865,
-0.024844489991664886,
0.024836335331201553,
0.01076756976544857,
-0.04746542498469353,
-0.08042623847723007,
-0.0014197531854733825,
-0.007384148892015219,
-0.012458034791052341,
-0.006557051558047533,
0.0776207223534584,
-0.08663425594568253,
0.036665938794612885,
-0.0050531476736068726,
0.0007449068361893296,
-0.07078199088573456,
-0.012002424336969852,
0.10127121955156326,
0.09599709510803223,
0.05000552907586098,
-0.09154608845710754,
0.010528474114835262,
-0.1356792002916336,
-0.03995809331536293,
0.006200544070452452,
-0.015562832355499268,
-0.1295875459909439,
-0.004771951586008072,
0.023750489577651024,
-0.002929163631051779,
0.21375048160552979,
-0.0581371895968914,
-0.012497901916503906,
0.017875494435429573,
-0.09243748337030411,
0.11752248555421829,
-0.02757374756038189,
0.17506510019302368,
-0.012363255023956299,
-0.04032597690820694,
-0.007078068796545267,
0.04354415088891983,
0.016932303085923195,
-0.02548309974372387,
0.188247948884964,
0.1401599943637848,
0.02411392144858837,
0.03958292677998543,
-0.021387705579400063,
-0.0036153746768832207,
-0.03933943808078766,
-0.03635551035404205,
0.03867977485060692,
0.043839920312166214,
0.015806104987859726,
0.14831511676311493,
0.07055319100618362,
-0.1619149148464203,
0.03256957605481148,
-0.03436492383480072,
-0.038273513317108154,
-0.11422596126794815,
-0.09755260497331619,
-0.027270155027508736,
-0.07208764553070068,
0.014171234332025051,
-0.12609435617923737,
0.000539856671821326,
0.19362422823905945,
0.06668996810913086,
0.028405271470546722,
0.012917888350784779,
-0.11604622006416321,
-0.034901972860097885,
0.05565473809838295,
0.009238096885383129,
0.02051519602537155,
0.05587317794561386,
0.004789684899151325,
0.05686504393815994,
0.02828565612435341,
0.015417834743857384,
-0.0016483941581100225,
0.07344682514667511,
0.020003365352749825,
0.04045595973730087,
-0.06197306141257286,
-0.0024449662305414677,
-0.03450185805559158,
0.07023356109857559,
0.11811783909797668,
0.04727186635136604,
-0.05215940251946449,
-0.008077329955995083,
0.15353240072727203,
-0.0381169430911541,
-0.0017896268982440233,
-0.12623125314712524,
0.33148983120918274,
0.017623141407966614,
0.012052538804709911,
0.03982935845851898,
-0.07239191234111786,
-0.04735495522618294,
0.2095881700515747,
0.09423834085464478,
-0.018409309908747673,
-0.021231461316347122,
-0.0008625489426776767,
-0.029756246134638786,
-0.02489158883690834,
0.15303224325180054,
0.04409774765372276,
0.12599918246269226,
-0.05550604686141014,
-0.05020119994878769,
-0.0271476861089468,
-0.01007105503231287,
-0.11993763595819473,
0.11891143023967743,
-0.01641913875937462,
-0.024048876017332077,
-0.0672285258769989,
0.02961800806224346,
0.06069514900445938,
-0.3121090233325958,
-0.0074480934999883175,
-0.026102935895323753,
-0.10987251996994019,
-0.013270577415823936,
-0.02301609516143799,
-0.021514667198061943,
0.04762902855873108,
-0.04481802508234978,
0.0704774335026741,
0.0348803885281086,
0.03196357563138008,
-0.02268708497285843,
-0.10254291445016861,
0.1710805594921112,
0.06284503638744354,
0.08421546220779419,
0.024263011291623116,
0.07435831427574158,
0.059804148972034454,
0.03154107555747032,
-0.09552475064992905,
0.050428640097379684,
0.013670055195689201,
-0.09347765147686005,
-0.048789966851472855,
0.11434993892908096,
0.0016067323740571737,
0.04894018545746803,
0.03411608934402466,
-0.09918124228715897,
0.0198653656989336,
0.07210132479667664,
-0.05987938866019249,
-0.1029394343495369,
0.0023315963335335255,
-0.09127587080001831,
0.16435518860816956,
0.14997011423110962,
-0.014148032292723656,
0.0193653404712677,
-0.07035846263170242,
-0.004901766777038574,
0.050654955208301544,
-0.005976673681288958,
-0.025369854643940926,
-0.1909109205007553,
0.034709811210632324,
-0.08534109592437744,
-0.010630258359014988,
-0.23229838907718658,
-0.09946122765541077,
-0.007273651193827391,
-0.05008312314748764,
-0.03098604455590248,
0.055626749992370605,
0.029656024649739265,
0.07286020368337631,
-0.01860533282160759,
-0.023062804713845253,
-0.0337698720395565,
0.08871816098690033,
-0.11643752455711365,
-0.06609445810317993
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1300k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1300k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
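As an additional sketch (not part of the original card), the token-level hidden states returned by the PyTorch example above can be mean-pooled into a fixed-size sentence embedding; the pooling strategy and variable names here are illustrative assumptions rather than a recommendation from the MultiBERTs authors.
```
# Minimal sketch (assumption: mean pooling over non-padding tokens is one
# reasonable way to get a 768-dimensional sentence vector from this checkpoint).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1300k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1300k")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# Average the final hidden states, masking out padding positions.
mask = encoded_input['attention_mask'].unsqueeze(-1)        # [1, seq_len, 1]
summed = (output.last_hidden_state * mask).sum(dim=1)       # [1, 768]
sentence_embedding = summed / mask.sum(dim=1)               # [1, 768]
print(sentence_embedding.shape)  # torch.Size([1, 768])
```
The resulting vector has BERT-base's hidden size (768); any other pooling choice (e.g., the [CLS] token) would work the same way.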
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1300k"]}
| null |
google/multiberts-seed_3-step_1300k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1300k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1300k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07601011544466019,
0.08659559488296509,
-0.0019424320198595524,
0.04621119424700737,
0.0841011255979538,
-0.016644733026623726,
0.07237768918275833,
0.09888837486505508,
-0.03557312861084938,
0.022425448521971703,
0.08243007212877274,
0.010329780168831348,
0.014548697508871555,
0.09901652485132217,
0.02587231434881687,
-0.22883673012256622,
0.02784358151257038,
-0.035787973552942276,
-0.07393941283226013,
0.07855569571256638,
0.09829873591661453,
-0.07900495827198029,
0.04621359705924988,
0.024155449122190475,
-0.09966788440942764,
0.05287524312734604,
-0.008039257489144802,
-0.022508515045046806,
0.12897780537605286,
-0.0014786834362894297,
0.04828627407550812,
0.05188355967402458,
0.0387466624379158,
-0.13995768129825592,
0.006772368680685759,
0.0544305257499218,
0.05901668965816498,
0.03930830955505371,
0.01975114829838276,
0.07966350764036179,
0.004747465718537569,
0.026461277157068253,
0.052397310733795166,
0.02286999486386776,
-0.07768912613391876,
-0.04475171118974686,
-0.10966145247220993,
0.04047771543264389,
0.028379028663039207,
0.014181199483573437,
0.01344099547713995,
0.13609467446804047,
-0.03796927258372307,
0.04984079673886299,
0.19133064150810242,
-0.32017359137535095,
-0.019497446715831757,
0.09355664998292923,
0.0508592315018177,
0.12772728502750397,
-0.00423116609454155,
-0.017017345875501633,
0.07294879853725433,
0.030752742663025856,
0.08655551075935364,
-0.04097417742013931,
0.018020501360297203,
-0.05209572613239288,
-0.16405025124549866,
-0.04470185562968254,
0.09666919708251953,
-0.0006344991270452738,
-0.13160642981529236,
-0.04258427768945694,
-0.03685685992240906,
0.013364113867282867,
0.0093684745952487,
-0.038601621985435486,
0.030665388330817223,
0.007033602800220251,
-0.019640769809484482,
-0.004433623515069485,
-0.0981440469622612,
-0.04952072352170944,
0.03145226091146469,
0.07602860033512115,
0.09957919269800186,
0.060484107583761215,
0.004768345504999161,
0.11060318350791931,
-0.18959775567054749,
-0.044665612280368805,
-0.030724098905920982,
-0.06268424540758133,
-0.04837874323129654,
-0.018922118470072746,
-0.10832049697637558,
-0.0416841134428978,
0.0126465680077672,
0.14554904401302338,
-0.013242151588201523,
0.0355362743139267,
-0.021040959283709526,
0.004330921918153763,
0.05565304681658745,
0.04173565283417702,
-0.018487803637981415,
0.019739234820008278,
0.02509159967303276,
-0.007871203124523163,
-0.01968511752784252,
0.015025243163108826,
0.009321404621005058,
0.030121197924017906,
0.11817073076963425,
0.027384823188185692,
-0.099423848092556,
0.08109704405069351,
-0.013256375677883625,
-0.04505082219839096,
0.0285184308886528,
-0.09236712008714676,
-0.06466366350650787,
-0.045385174453258514,
0.0037417400162667036,
0.0181743036955595,
-0.005441524554044008,
-0.005709639750421047,
-0.020619623363018036,
-0.038529641926288605,
-0.0825895369052887,
-0.05395764485001564,
-0.05411187931895256,
-0.1295967549085617,
0.00799292791634798,
-0.16203080117702484,
-0.038236476480960846,
-0.11329721659421921,
-0.1952090859413147,
-0.026449402794241905,
0.06404107064008713,
-0.009936010465025902,
-0.05602237582206726,
0.07938386499881744,
0.03888045996427536,
-0.029658710584044456,
-0.002969353459775448,
0.07374845445156097,
-0.0031731296330690384,
0.03855432569980621,
-0.022611428052186966,
0.06285414099693298,
0.004684023093432188,
0.031886834651231766,
-0.056248389184474945,
0.062134020030498505,
-0.18614430725574493,
0.040932681411504745,
-0.08201587200164795,
-0.015306905843317509,
-0.08305548131465912,
-0.03872337192296982,
-0.012793632224202156,
0.007044820114970207,
0.02320399321615696,
0.07570609450340271,
-0.16419029235839844,
-0.02341081015765667,
0.10156918317079544,
-0.1586540937423706,
-0.028082096949219704,
0.0701979473233223,
-0.045305103063583374,
0.09136337041854858,
0.06816534698009491,
0.15231509506702423,
0.001559837139211595,
-0.08385589718818665,
0.0576801560819149,
-0.00940421037375927,
0.011606094427406788,
-0.014543561264872551,
0.0663573294878006,
-0.021485181525349617,
-0.15721268951892853,
0.039241060614585876,
-0.13607417047023773,
0.0007641580887138844,
-0.07596860826015472,
0.016319533810019493,
-0.004102590028196573,
-0.06603118777275085,
-0.07794418931007385,
-0.02663600817322731,
0.06369131803512573,
-0.0754503384232521,
-0.012182327918708324,
0.03425503522157669,
0.07178282737731934,
-0.07602226734161377,
0.06966713070869446,
-0.005500618368387222,
0.020119722932577133,
-0.08088278770446777,
-0.042514219880104065,
-0.18456657230854034,
0.03633728623390198,
0.09595180302858353,
0.014293739572167397,
-0.02573087252676487,
0.1251489669084549,
0.008749548345804214,
0.06121129170060158,
-0.05249139294028282,
0.01832559145987034,
-0.004530913196504116,
-0.001056558103300631,
-0.0806196853518486,
-0.09447446465492249,
-0.07625461369752884,
-0.07045767456293106,
0.07953350991010666,
-0.12308166921138763,
0.02356574311852455,
-0.05827276036143303,
0.03355292230844498,
0.012914994731545448,
-0.08358996361494064,
-0.007393694948405027,
0.018201494589447975,
-0.057936374098062515,
-0.05890640616416931,
0.04251115024089813,
0.06933321803808212,
-0.011071345768868923,
0.08652554452419281,
-0.05612574517726898,
-0.08668913692235947,
0.03226540610194206,
0.09759138524532318,
-0.11681599169969559,
-0.00660944078117609,
-0.05919783562421799,
-0.04482867568731308,
-0.060750093311071396,
-0.025350196287035942,
0.07724320888519287,
-0.007529484573751688,
0.1309751272201538,
-0.07468702644109726,
-0.0073942095041275024,
0.012296570464968681,
-0.015468472614884377,
-0.028610331937670708,
0.03663959354162216,
0.07074346393346786,
-0.06023501604795456,
0.016460798680782318,
0.02644353359937668,
0.0061828759498894215,
0.07257936149835587,
-0.05267717316746712,
-0.08168725669384003,
0.0233139805495739,
0.04109068587422371,
0.030497239902615547,
0.06323444843292236,
-0.01883477345108986,
-0.010362976230680943,
0.028994373977184296,
0.02312193438410759,
0.011549565941095352,
-0.10575016587972641,
0.0543169267475605,
0.05683143436908722,
0.005924965720623732,
0.06416855752468109,
-0.009852947667241096,
-0.04284287244081497,
0.07406342774629593,
0.03554580733180046,
-0.004133738111704588,
-0.012024852447211742,
-0.015165622346103191,
-0.11746851354837418,
0.19815807044506073,
-0.06355120986700058,
-0.16108755767345428,
-0.07500973343849182,
-0.11802971363067627,
-0.007547620218247175,
0.018639741465449333,
0.03596063330769539,
-0.022684883326292038,
-0.04852854087948799,
-0.13084927201271057,
0.05643473565578461,
-0.04095922410488129,
0.06619683653116226,
0.11085662245750427,
-0.04959137737751007,
0.05310375988483429,
-0.12773476541042328,
-0.007097324356436729,
-0.0805603414773941,
-0.0762515440583229,
0.05688176304101944,
-0.05730310454964638,
0.036023225635290146,
0.09820830821990967,
0.029004884883761406,
-0.016619237139821053,
-0.03242018073797226,
0.21031104028224945,
0.042767252773046494,
0.037462420761585236,
0.12490107864141464,
-0.051715537905693054,
0.05259596183896065,
0.08787228167057037,
0.012116420082747936,
-0.0483308807015419,
0.05784214287996292,
0.04536108300089836,
-0.07157952338457108,
-0.19238433241844177,
-0.024249158799648285,
-0.013347570784389973,
-0.04022953659296036,
0.06939323246479034,
0.039032794535160065,
0.002821956528350711,
0.07077296823263168,
0.012292889878153801,
0.05714339390397072,
0.003547560190781951,
0.10226540267467499,
0.02647300250828266,
-0.031670600175857544,
0.0854959636926651,
-0.01775730773806572,
-0.0055991411209106445,
0.08137532323598862,
-0.010036781430244446,
0.28275153040885925,
-0.03486468270421028,
0.006553010083734989,
0.12389235198497772,
0.0372086837887764,
0.058864299207925797,
0.1291198432445526,
-0.07312924414873123,
0.015262747183442116,
-0.06970491260290146,
-0.06129203736782074,
0.0043885367922484875,
0.03456255421042442,
-0.0649019405245781,
0.012156849727034569,
-0.07371534407138824,
0.01817895472049713,
-0.013306659646332264,
0.2976287603378296,
0.09752508252859116,
-0.11367689073085785,
-0.05105328559875488,
-0.0017603960586711764,
-0.09720733761787415,
-0.06222096085548401,
0.048146750777959824,
0.05591173842549324,
-0.1356615126132965,
0.015314671210944653,
-0.028407014906406403,
0.07052437216043472,
-0.014338456094264984,
0.01574443280696869,
0.046544481068849564,
0.047976672649383545,
-0.04305214434862137,
0.00612489040941,
-0.18647393584251404,
0.19905857741832733,
0.005654169246554375,
0.020462360233068466,
-0.04706646502017975,
0.03391854837536812,
0.009415368549525738,
-0.03106764890253544,
0.06418398022651672,
0.019503219053149223,
-0.022262711077928543,
-0.04834552854299545,
-0.050291508436203,
0.01547263190150261,
0.07276920229196548,
-0.03279796242713928,
0.10380209237337112,
-0.003456735983490944,
0.043710846453905106,
0.01969924382865429,
0.09332896769046783,
-0.18954947590827942,
-0.09648787975311279,
0.0305023156106472,
-0.05652308091521263,
-0.10885804891586304,
-0.07473684102296829,
-0.09720534086227417,
-0.027623102068901062,
0.2469019889831543,
-0.10704077780246735,
-0.07537149637937546,
-0.0991264060139656,
0.026109565049409866,
0.10910837352275848,
-0.04463664069771767,
0.030061790719628334,
-0.007532496936619282,
0.11581960320472717,
-0.06833019107580185,
-0.13160006701946259,
0.025765366852283478,
-0.09699129313230515,
-0.15962061285972595,
-0.0640132799744606,
0.10809995979070663,
0.06141554191708565,
0.03283395245671272,
-0.033555421978235245,
0.019943654537200928,
0.03921201080083847,
-0.04574715346097946,
-0.014414435252547264,
0.07514923810958862,
0.09136269986629486,
0.0346907302737236,
-0.11107955873012543,
0.019103331491351128,
-0.0642300397157669,
-0.06608278304338455,
0.07574249058961868,
0.2667931616306305,
-0.05212441831827164,
0.11916841566562653,
0.1168491542339325,
-0.07933530956506729,
-0.1537877768278122,
0.034404005855321884,
0.0868341252207756,
-0.02132185362279415,
0.02224736474454403,
-0.14763836562633514,
0.09076441079378128,
0.12262975424528122,
-0.02067820355296135,
-0.009796392172574997,
-0.19283834099769592,
-0.12962102890014648,
0.06740353256464005,
0.10839684307575226,
0.26168063282966614,
-0.06544247269630432,
-0.03902091085910797,
0.02373533882200718,
-0.09585142135620117,
0.007506214082241058,
0.12795299291610718,
0.06672128289937973,
-0.02629535272717476,
-0.0820300504565239,
0.009651601314544678,
-0.046362753957509995,
0.09061817079782486,
0.05608928203582764,
0.05978440120816231,
-0.0077523221261799335,
0.024236083030700684,
-0.0130121149122715,
-0.044944047927856445,
0.06623304635286331,
0.023762794211506844,
0.04825129732489586,
-0.07995159178972244,
-0.029362710192799568,
-0.07587555795907974,
0.03012835420668125,
-0.02749698981642723,
-0.07642801105976105,
-0.06301670521497726,
0.08464006334543228,
0.05268542468547821,
-0.029090404510498047,
0.02425736002624035,
0.022440079599618912,
0.119599848985672,
0.16062889993190765,
0.004578873049467802,
-0.049077682197093964,
-0.054931361228227615,
-0.03336625173687935,
-0.018404075875878334,
0.06728839874267578,
-0.03169284760951996,
0.015282757580280304,
0.06671395897865295,
0.019806645810604095,
0.09071578085422516,
0.059599217027425766,
-0.11217436194419861,
-0.021076345816254616,
0.036466535180807114,
-0.16594183444976807,
0.03459806740283966,
-0.00039921115967445076,
0.023758113384246826,
-0.0423898808658123,
0.032711129635572433,
0.14173199236392975,
-0.0646224394440651,
-0.03288833051919937,
-0.035607047379016876,
0.06958868354558945,
0.024483781307935715,
0.14804470539093018,
0.033216916024684906,
0.03526212275028229,
-0.08203049749135971,
0.12565217912197113,
0.033700574189424515,
-0.04664452746510506,
0.019277499988675117,
-0.013209021650254726,
-0.11277349293231964,
0.01242272462695837,
0.06282790005207062,
0.027636418119072914,
-0.059607747942209244,
-0.004203279968351126,
-0.02898075431585312,
-0.07266627997159958,
0.062176261097192764,
0.19265751540660858,
0.06725979596376419,
0.07004465162754059,
-0.05099307745695114,
-0.03644987940788269,
-0.08204671740531921,
0.040799785405397415,
0.033020440489053726,
0.0778089091181755,
-0.0815306007862091,
0.08545419573783875,
0.012444828636944294,
0.03378559276461601,
-0.02990139275789261,
-0.054089371114969254,
-0.1088515892624855,
-0.05211537703871727,
-0.07903136312961578,
-0.0007954777684062719,
-0.0849757194519043,
-0.03441094234585762,
-0.0028280308470129967,
0.005145468283444643,
-0.012700898572802544,
0.05040699243545532,
-0.05885590985417366,
-0.010016213171184063,
-0.021615376695990562,
0.037238508462905884,
-0.06149381026625633,
-0.032335277646780014,
0.027076296508312225,
-0.09868575632572174,
0.09020015597343445,
0.04063883796334267,
0.005240992642939091,
0.006354150827974081,
0.07394643127918243,
-0.021615318953990936,
0.02161323092877865,
0.011095098219811916,
-0.0472799576818943,
-0.08184745907783508,
-0.0026618880219757557,
-0.0048159328289330006,
-0.01658274233341217,
-0.003851106856018305,
0.08107317984104156,
-0.08883355557918549,
0.03438969701528549,
-0.0020802400540560484,
-0.002800191519781947,
-0.07161916792392731,
-0.01149071380496025,
0.10549160093069077,
0.09174435585737228,
0.044661931693553925,
-0.09505609422922134,
0.007532414980232716,
-0.13885906338691711,
-0.03972477838397026,
0.006784776225686073,
-0.015180900692939758,
-0.12950102984905243,
0.0001747914939187467,
0.024473832920193672,
-0.000007172900495788781,
0.2282937467098236,
-0.060021065175533295,
-0.02055523358285427,
0.021894894540309906,
-0.09807610511779785,
0.12158206850290298,
-0.026308845728635788,
0.18669018149375916,
-0.011503097601234913,
-0.04153195768594742,
-0.00253123277798295,
0.04530540108680725,
0.01832210272550583,
-0.024272408336400986,
0.1867746114730835,
0.13519437611103058,
0.029378952458500862,
0.042176321148872375,
-0.020463842898607254,
0.002040752675384283,
-0.030404992401599884,
-0.03874179348349571,
0.037831101566553116,
0.04536059871315956,
0.01384130772203207,
0.1293218582868576,
0.07665836066007614,
-0.16683855652809143,
0.03443116322159767,
-0.03148024156689644,
-0.04186604917049408,
-0.11723210662603378,
-0.1059432402253151,
-0.025185123085975647,
-0.07908657193183899,
0.013145766220986843,
-0.1310935765504837,
0.0009193370933644474,
0.18523265421390533,
0.06530395895242691,
0.03214104473590851,
0.015857921913266182,
-0.11870512366294861,
-0.03084280528128147,
0.04897582158446312,
0.00811284314841032,
0.02475893311202526,
0.05569998919963837,
-0.00428297184407711,
0.055570390075445175,
0.029821982607245445,
0.012365560047328472,
-0.0038124651182442904,
0.07808136194944382,
0.019332116469740868,
0.03983975201845169,
-0.060981519520282745,
-0.004502974916249514,
-0.040066786110401154,
0.06966118514537811,
0.11000896245241165,
0.0473129041492939,
-0.053834218531847,
-0.006865232717245817,
0.15583935379981995,
-0.03806132450699806,
0.0008661107858642936,
-0.13117575645446777,
0.3336350619792938,
0.02148367650806904,
0.009564191102981567,
0.040375493466854095,
-0.0743904709815979,
-0.04887416586279869,
0.2134254276752472,
0.10166792571544647,
-0.01510966382920742,
-0.02101152576506138,
-0.0011009088484570384,
-0.029939576983451843,
-0.02314668521285057,
0.1540277600288391,
0.039228376001119614,
0.11601196229457855,
-0.05136843025684357,
-0.052653972059488297,
-0.026829302310943604,
-0.011308707296848297,
-0.12063480168581009,
0.12201675772666931,
-0.01555722951889038,
-0.02456560917198658,
-0.06770313531160355,
0.03351103886961937,
0.06107745319604874,
-0.30212676525115967,
-0.011980099603533745,
-0.028694862499833107,
-0.10570792853832245,
-0.012294504791498184,
-0.024113673716783524,
-0.020760884508490562,
0.043850794434547424,
-0.040914036333560944,
0.07248888164758682,
0.027206633239984512,
0.03434396907687187,
-0.021202465519309044,
-0.10677719116210938,
0.17062531411647797,
0.07620429247617722,
0.0989149883389473,
0.0269180741161108,
0.06858695298433304,
0.06011885777115822,
0.032492056488990784,
-0.09962929040193558,
0.04942750185728073,
0.017780622467398643,
-0.08678065240383148,
-0.050746429711580276,
0.11334328353404999,
0.001363628078252077,
0.058108050376176834,
0.032593902200460434,
-0.10131998360157013,
0.02122832089662552,
0.06401065737009048,
-0.05653047189116478,
-0.10189508646726608,
-0.0019713828805834055,
-0.08947759866714478,
0.16594742238521576,
0.14907139539718628,
-0.014006274752318859,
0.02094169147312641,
-0.06749871373176575,
-0.009175984188914299,
0.04559753090143204,
0.0057398974895477295,
-0.02375294454395771,
-0.19513261318206787,
0.032788828015327454,
-0.08829959481954575,
-0.010066344402730465,
-0.2232578694820404,
-0.10237785428762436,
-0.009198187850415707,
-0.0504349060356617,
-0.029052233323454857,
0.062374550849199295,
0.030247988179326057,
0.0681733787059784,
-0.01703399419784546,
-0.014705012552440166,
-0.03240659460425377,
0.09222374856472015,
-0.11801072955131531,
-0.06360688805580139
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1400k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1400k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
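Since all MultiBERTs intermediate checkpoints share the same tokenizer and architecture, representations from neighbouring checkpoints can be compared directly. The sketch below is not from the original card: the choice of checkpoints (step 1300k vs. step 1400k), the use of the [CLS] vector, and cosine similarity are all illustrative assumptions.
```
# Illustrative sketch: how similar is the [CLS] representation of the same
# sentence at two nearby pre-training checkpoints? (The checkpoint pair and
# the similarity measure are assumptions made for this example.)
import torch
from transformers import BertTokenizer, BertModel

text = "Replace me by any text you'd like."
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1400k')
encoded_input = tokenizer(text, return_tensors='pt')

cls_vectors = []
for name in ("google/multiberts-seed_3-step_1300k",
             "google/multiberts-seed_3-step_1400k"):
    model = BertModel.from_pretrained(name)
    with torch.no_grad():
        output = model(**encoded_input)
    cls_vectors.append(output.last_hidden_state[:, 0, :])   # [CLS] token, [1, 768]

similarity = torch.nn.functional.cosine_similarity(cls_vectors[0], cls_vectors[1])
print(similarity.item())
```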
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1400k"]}
| null |
google/multiberts-seed_3-step_1400k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1400k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1400k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07927820831537247,
0.0885646864771843,
-0.002150915330275893,
0.04482933506369591,
0.0866355374455452,
-0.012494184076786041,
0.07560575753450394,
0.09844614565372467,
-0.033435795456171036,
0.021972531452775,
0.08064325153827667,
0.005533136427402496,
0.014186027459800243,
0.09107601642608643,
0.02456478402018547,
-0.22278495132923126,
0.02523615024983883,
-0.03284190222620964,
-0.08178436756134033,
0.07783004641532898,
0.09888193756341934,
-0.08075558394193649,
0.0474722683429718,
0.025755684822797775,
-0.10700500011444092,
0.05007433518767357,
-0.00759698124602437,
-0.021058116108179092,
0.13240031898021698,
-0.0010361522436141968,
0.048570241779088974,
0.05090390145778656,
0.03879847005009651,
-0.13426977396011353,
0.007587377447634935,
0.05681938678026199,
0.056510962545871735,
0.04056286811828613,
0.02315724454820156,
0.08041147887706757,
0.0013536629267036915,
0.027056610211730003,
0.04919786378741264,
0.025101356208324432,
-0.07689042389392853,
-0.06158222258090973,
-0.10503235459327698,
0.040579233318567276,
0.025164809077978134,
0.013098475523293018,
0.012349276803433895,
0.1285698264837265,
-0.038192685693502426,
0.047425851225852966,
0.18406422436237335,
-0.3280528485774994,
-0.016718005761504173,
0.08162614703178406,
0.045689214020967484,
0.1264619082212448,
-0.006382390391081572,
-0.016179727390408516,
0.07493036240339279,
0.03499685972929001,
0.0947427749633789,
-0.04147137701511383,
0.029339684173464775,
-0.05195359140634537,
-0.161249577999115,
-0.040499426424503326,
0.09787405282258987,
-0.0024624650832265615,
-0.1352023333311081,
-0.04455915093421936,
-0.03518229350447655,
0.02830922044813633,
0.008955643512308598,
-0.035289593040943146,
0.03145993873476982,
0.007899073883891106,
-0.015739088878035545,
-0.004473502282053232,
-0.10097406059503555,
-0.04763495549559593,
0.034888409078121185,
0.08031114935874939,
0.10184182971715927,
0.06160274147987366,
0.0044463700614869595,
0.10834527015686035,
-0.18380175530910492,
-0.04834075644612312,
-0.029088612645864487,
-0.06332196295261383,
-0.04859381541609764,
-0.015779495239257812,
-0.10910332202911377,
-0.03720741719007492,
0.012732140719890594,
0.13566580414772034,
-0.01358004193753004,
0.03451375290751457,
-0.0161459781229496,
0.0032118523959070444,
0.057094261050224304,
0.037296444177627563,
-0.016469204798340797,
0.014718363992869854,
0.022868318483233452,
-0.008681533858180046,
-0.023037118837237358,
0.0161071065813303,
0.006970548070967197,
0.03201242908835411,
0.11887403577566147,
0.027074657380580902,
-0.10151243209838867,
0.07829359173774719,
-0.011661707423627377,
-0.0465087816119194,
0.023272588849067688,
-0.0908450186252594,
-0.06477648764848709,
-0.04250660538673401,
0.0027191988192498684,
0.01952817477285862,
-0.006838623899966478,
-0.010064957663416862,
-0.023084618151187897,
-0.03289114311337471,
-0.08505498617887497,
-0.05093137547373772,
-0.05127599090337753,
-0.13051190972328186,
0.005139056593179703,
-0.16935408115386963,
-0.037767909467220306,
-0.1173020526766777,
-0.18992260098457336,
-0.027191052213311195,
0.06303966790437698,
-0.011084487661719322,
-0.052250936627388,
0.08569639176130295,
0.04099339619278908,
-0.02967933565378189,
-0.0032988230232149363,
0.0688256099820137,
-0.004928132519125938,
0.03972430154681206,
-0.02003708854317665,
0.06533824652433395,
0.005836020223796368,
0.03471006825566292,
-0.053278014063835144,
0.06182475388050079,
-0.18439136445522308,
0.0423656702041626,
-0.07912760227918625,
-0.01355600357055664,
-0.08375164866447449,
-0.0426633283495903,
-0.007442768197506666,
0.00845543947070837,
0.023253854364156723,
0.07711136341094971,
-0.17115828394889832,
-0.02556055597960949,
0.1018490418791771,
-0.1594226211309433,
-0.031228555366396904,
0.07126466929912567,
-0.044142331928014755,
0.09250020980834961,
0.06931043416261673,
0.1545584499835968,
0.0005119906272739172,
-0.08703792095184326,
0.05729376897215843,
-0.011798636987805367,
0.011562608182430267,
-0.01435618195682764,
0.06635294109582901,
-0.021686658263206482,
-0.15313908457756042,
0.03872249647974968,
-0.13249285519123077,
0.001519670826382935,
-0.07614557445049286,
0.0185468140989542,
-0.005721181631088257,
-0.06774324178695679,
-0.07535523921251297,
-0.026698224246501923,
0.06563904881477356,
-0.07581387460231781,
-0.010276520624756813,
0.04129200801253319,
0.07326307147741318,
-0.07570329308509827,
0.06747117638587952,
-0.008192243054509163,
0.022928548976778984,
-0.08086003363132477,
-0.0390680730342865,
-0.1885715276002884,
0.03794747218489647,
0.09454067796468735,
0.00873752124607563,
-0.02372998371720314,
0.13069424033164978,
0.00829875748604536,
0.06793250143527985,
-0.054092325270175934,
0.014548848383128643,
-0.002392763737589121,
-0.0016400793101638556,
-0.08268498629331589,
-0.09996294230222702,
-0.07744446396827698,
-0.06678292155265808,
0.08121146261692047,
-0.12397955358028412,
0.022649258375167847,
-0.06201178580522537,
0.03605063259601593,
0.014277665875852108,
-0.08006071299314499,
-0.010472354479134083,
0.018272051587700844,
-0.057652879506349564,
-0.059147920459508896,
0.04428733512759209,
0.07074971497058868,
-0.007752295583486557,
0.08974164724349976,
-0.05149748548865318,
-0.07722014933824539,
0.03315555304288864,
0.09272162616252899,
-0.11478238552808762,
-0.0004777433059643954,
-0.057028789073228836,
-0.043888159096241,
-0.05727262422442436,
-0.023630553856492043,
0.07780048996210098,
-0.008309714496135712,
0.1366463452577591,
-0.07424319535493851,
-0.005687054246664047,
0.012504766695201397,
-0.015419073402881622,
-0.026240352541208267,
0.03463803231716156,
0.06441081315279007,
-0.0645245760679245,
0.01490382943302393,
0.03458923473954201,
0.012106174603104591,
0.07336055487394333,
-0.05345221608877182,
-0.08243771642446518,
0.01922355405986309,
0.036261186003685,
0.02797195315361023,
0.06733270734548569,
-0.020034195855259895,
-0.013614916242659092,
0.03172191604971886,
0.022129042074084282,
0.01021480280905962,
-0.10500612109899521,
0.05599980428814888,
0.056863006204366684,
0.008731672540307045,
0.0629972293972969,
-0.011828679591417313,
-0.04284924268722534,
0.07649943977594376,
0.03716765716671944,
-0.0025061825290322304,
-0.01222394872456789,
-0.014866111800074577,
-0.11627130955457687,
0.19760164618492126,
-0.06554491817951202,
-0.16372565925121307,
-0.07801877707242966,
-0.11934275180101395,
-0.007894973270595074,
0.020742572844028473,
0.039499785751104355,
-0.026442736387252808,
-0.049053043127059937,
-0.12517079710960388,
0.05956368148326874,
-0.03963065892457962,
0.06614021956920624,
0.1085205227136612,
-0.04771476611495018,
0.05695267766714096,
-0.1277175098657608,
-0.00722262728959322,
-0.0811149924993515,
-0.06363675743341446,
0.05974873527884483,
-0.052608370780944824,
0.03430408611893654,
0.09745419025421143,
0.0314338393509388,
-0.01869395188987255,
-0.02914157509803772,
0.2155950516462326,
0.04146069660782814,
0.0380391962826252,
0.1311749964952469,
-0.053349077701568604,
0.05323764681816101,
0.08309068530797958,
0.014763685874640942,
-0.048809874802827835,
0.055493537336587906,
0.045077428221702576,
-0.06745090335607529,
-0.19274680316448212,
-0.025452692061662674,
-0.014317113906145096,
-0.04635253921151161,
0.07019197195768356,
0.03709013760089874,
0.010378812439739704,
0.06848524510860443,
0.013221634551882744,
0.05899113789200783,
-0.0019990610890090466,
0.10271059721708298,
0.03051149472594261,
-0.03271125257015228,
0.0875777080655098,
-0.01984577812254429,
-0.005809287540614605,
0.08382245898246765,
-0.020303066819906235,
0.29053106904029846,
-0.03426496312022209,
0.00949958898127079,
0.12614662945270538,
0.04008593410253525,
0.060151357203722,
0.12826655805110931,
-0.07228518277406693,
0.01600349321961403,
-0.07047566771507263,
-0.06192149966955185,
0.005495528690516949,
0.03541843593120575,
-0.06550110876560211,
0.010415350086987019,
-0.07553867250680923,
0.014929935336112976,
-0.015831617638468742,
0.3096764385700226,
0.10112424194812775,
-0.11262858659029007,
-0.05711233988404274,
-0.0011338692856952548,
-0.0964420959353447,
-0.06600669771432877,
0.04393286630511284,
0.06359931081533432,
-0.13400886952877045,
0.013365139253437519,
-0.025652730837464333,
0.07162485271692276,
-0.012292910367250443,
0.018832609057426453,
0.0487314872443676,
0.04810904338955879,
-0.04377692937850952,
0.004928219597786665,
-0.1870020627975464,
0.19361448287963867,
0.007397958543151617,
0.021665183827280998,
-0.04621961712837219,
0.03193606436252594,
0.01186981052160263,
-0.02161029540002346,
0.05860448256134987,
0.021031629294157028,
-0.01813824661076069,
-0.04901823028922081,
-0.045729316771030426,
0.012720296159386635,
0.0764107033610344,
-0.03218866512179375,
0.10505924373865128,
-0.004631137941032648,
0.04340708255767822,
0.019036753103137016,
0.0937647819519043,
-0.18565310537815094,
-0.09431798756122589,
0.030336227267980576,
-0.05658456310629845,
-0.10277153551578522,
-0.07609858363866806,
-0.09748540073633194,
-0.02080940641462803,
0.24343186616897583,
-0.10583359003067017,
-0.07821641862392426,
-0.10085808485746384,
0.01825420930981636,
0.10829216241836548,
-0.04453686997294426,
0.030347418040037155,
-0.007907316088676453,
0.1198272556066513,
-0.06459291279315948,
-0.13374558091163635,
0.02215402200818062,
-0.09807975590229034,
-0.15766797959804535,
-0.06472649425268173,
0.11183912307024002,
0.06019134074449539,
0.032337483018636703,
-0.031220052391290665,
0.018477486446499825,
0.034329675137996674,
-0.04426102712750435,
-0.008823570795357227,
0.062128808349370956,
0.10314901173114777,
0.03486432135105133,
-0.11358387023210526,
0.015734074637293816,
-0.06471911817789078,
-0.06486109644174576,
0.07335653156042099,
0.26143768429756165,
-0.05228874087333679,
0.1170317605137825,
0.12369624525308609,
-0.0768895372748375,
-0.1504870504140854,
0.03434903174638748,
0.08679409325122833,
-0.02062397636473179,
0.015085174702107906,
-0.15359516441822052,
0.09290174394845963,
0.1160566657781601,
-0.0187546294182539,
0.0018912756349891424,
-0.18405336141586304,
-0.13079705834388733,
0.06743825972080231,
0.10591407120227814,
0.2635614275932312,
-0.06895875930786133,
-0.03865223750472069,
0.01934138312935829,
-0.09626081585884094,
0.008808009326457977,
0.1237916350364685,
0.06780058890581131,
-0.028619498014450073,
-0.08350107818841934,
0.009150859899818897,
-0.04430542513728142,
0.09364653378725052,
0.057623475790023804,
0.0621013343334198,
-0.007440190762281418,
0.028973380103707314,
-0.015718266367912292,
-0.04439430683851242,
0.06498250365257263,
0.021530089899897575,
0.04767194762825966,
-0.08696328103542328,
-0.028726670891046524,
-0.07581479102373123,
0.02691229060292244,
-0.02686309441924095,
-0.07668855786323547,
-0.059835270047187805,
0.08404973149299622,
0.04838696867227554,
-0.028583461418747902,
0.01830698549747467,
0.022665217518806458,
0.12028459459543228,
0.1568852812051773,
0.0002636278513818979,
-0.054107416421175,
-0.06801974028348923,
-0.03501317650079727,
-0.01784607395529747,
0.06923915445804596,
-0.04183409363031387,
0.016782613471150398,
0.06617780774831772,
0.02008723095059395,
0.09605531394481659,
0.05893808975815773,
-0.11108918488025665,
-0.0219996627420187,
0.03770725801587105,
-0.1672164499759674,
0.031205326318740845,
0.00010752464004326612,
0.016112785786390305,
-0.0386807881295681,
0.03316313773393631,
0.14238528907299042,
-0.062310781329870224,
-0.03252318874001503,
-0.03874190151691437,
0.06730476021766663,
0.02311651036143303,
0.14957325160503387,
0.036033544689416885,
0.03725285455584526,
-0.08408474177122116,
0.12369028478860855,
0.032970327883958817,
-0.038814011961221695,
0.01943071186542511,
-0.014448992908000946,
-0.11511926352977753,
0.009746793657541275,
0.05975478142499924,
0.03347472473978996,
-0.05314081534743309,
-0.010110425762832165,
-0.02879035472869873,
-0.07750024646520615,
0.06237170845270157,
0.18462222814559937,
0.06629177927970886,
0.06793166697025299,
-0.05325433239340782,
-0.04025532677769661,
-0.08078304678201675,
0.04004304111003876,
0.03382931277155876,
0.07596118748188019,
-0.08139519393444061,
0.0938575342297554,
0.013511238619685173,
0.04015949368476868,
-0.03106038086116314,
-0.053702834993600845,
-0.10630159825086594,
-0.054419636726379395,
-0.10401476174592972,
-0.00042034831130877137,
-0.08009913563728333,
-0.0344577357172966,
-0.003941631875932217,
0.002105324761942029,
-0.010737047530710697,
0.05042077600955963,
-0.059094492346048355,
-0.010165817104279995,
-0.015519067645072937,
0.037095651030540466,
-0.06165678799152374,
-0.03378921374678612,
0.023307980969548225,
-0.10125283896923065,
0.09337390214204788,
0.04605124518275261,
0.003965694922953844,
0.006382026243954897,
0.09700547158718109,
-0.023815665394067764,
0.022415893152356148,
0.010703068226575851,
-0.04516769200563431,
-0.08421800285577774,
-0.0022230225149542093,
-0.006784854922443628,
-0.018157754093408585,
-0.003694656305015087,
0.07906460016965866,
-0.08580956608057022,
0.03846216946840286,
-0.004028939642012119,
-0.0007747995550744236,
-0.07151723653078079,
-0.012692589312791824,
0.10725922882556915,
0.09206365793943405,
0.0498054102063179,
-0.09304824471473694,
0.008693399839103222,
-0.13995279371738434,
-0.04083450138568878,
0.005193290766328573,
-0.01535836048424244,
-0.12452398240566254,
-0.005023058503866196,
0.023110250011086464,
-0.0013581571402028203,
0.2204236090183258,
-0.06114256754517555,
-0.016329189762473106,
0.02148878201842308,
-0.09005085378885269,
0.11176122725009918,
-0.025419602170586586,
0.17846858501434326,
-0.014106636866927147,
-0.04188627377152443,
-0.005176341626793146,
0.04682119935750961,
0.019698981195688248,
-0.018589498475193977,
0.1817123144865036,
0.13911037147045135,
0.030014608055353165,
0.03843000903725624,
-0.021973544731736183,
-0.0025391578674316406,
-0.03886398300528526,
-0.034535035490989685,
0.03689178079366684,
0.04586875066161156,
0.016098100692033768,
0.13740259408950806,
0.07111913710832596,
-0.16526363790035248,
0.035799141973257065,
-0.03538702055811882,
-0.04044630751013756,
-0.11056829988956451,
-0.0896896943449974,
-0.02324904315173626,
-0.07619161158800125,
0.010037362575531006,
-0.12876878678798676,
0.00032359245233237743,
0.18437503278255463,
0.0647130012512207,
0.027524810284376144,
0.013930872082710266,
-0.11784214526414871,
-0.03411809355020523,
0.05258830636739731,
0.009476943872869015,
0.02267969399690628,
0.05792723596096039,
0.0005914982757531106,
0.05499471351504326,
0.029791919514536858,
0.012249073944985867,
-0.0019279709085822105,
0.07502882182598114,
0.019614893943071365,
0.03885239362716675,
-0.06255573779344559,
-0.0032647629268467426,
-0.03736529499292374,
0.06875210255384445,
0.11109716445207596,
0.04500225931406021,
-0.052810780704021454,
-0.008194663561880589,
0.1580021232366562,
-0.03673071041703224,
-0.0015920534497126937,
-0.12919002771377563,
0.33158257603645325,
0.018614860251545906,
0.00900441873818636,
0.04147160053253174,
-0.07341022789478302,
-0.048858363181352615,
0.21494527161121368,
0.09935204684734344,
-0.019848784431815147,
-0.021754059940576553,
0.0000025943602395273047,
-0.030358921736478806,
-0.02248053438961506,
0.1554276943206787,
0.041210345923900604,
0.12159156799316406,
-0.04997324198484421,
-0.04530788213014603,
-0.029896888881921768,
-0.009557243436574936,
-0.12188894301652908,
0.13091178238391876,
-0.01694302447140217,
-0.02488841861486435,
-0.06651780009269714,
0.030827101320028305,
0.06529698520898819,
-0.31164681911468506,
-0.005492104217410088,
-0.027627713978290558,
-0.10832188278436661,
-0.013052105903625488,
-0.02693665586411953,
-0.021097585558891296,
0.04610246792435646,
-0.04144112765789032,
0.07081303000450134,
0.037369634956121445,
0.031675469130277634,
-0.02092999778687954,
-0.10734014213085175,
0.17425699532032013,
0.0624823160469532,
0.0918102040886879,
0.02492346055805683,
0.07139232009649277,
0.05872272700071335,
0.03307962790131569,
-0.09894450753927231,
0.04592422395944595,
0.013520949520170689,
-0.0893852487206459,
-0.04963883012533188,
0.11254335194826126,
0.002190779196098447,
0.043672915548086166,
0.034572090953588486,
-0.10296063125133514,
0.017092430964112282,
0.06891486048698425,
-0.062090445309877396,
-0.09929003566503525,
-0.0032612981740385294,
-0.08831910789012909,
0.16265912353992462,
0.14756079018115997,
-0.015894994139671326,
0.01922125369310379,
-0.06766025722026825,
-0.007238536141812801,
0.05137991905212402,
0.0026227589696645737,
-0.023933902382850647,
-0.19645756483078003,
0.0361492745578289,
-0.09205364435911179,
-0.0105936573818326,
-0.23171274363994598,
-0.10212479531764984,
-0.006145728286355734,
-0.0486142598092556,
-0.028566546738147736,
0.06113741174340248,
0.030914796516299248,
0.07105346024036407,
-0.01742616668343544,
-0.018996143713593483,
-0.03353605419397354,
0.09166982769966125,
-0.11325589567422867,
-0.06446344405412674
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 140k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 140k (max: 2000k, i.e., 2M steps).
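Since the release exposes many such checkpoints under a common naming scheme, robustness studies typically load several of them in a loop. The snippet below is only an illustrative sketch, not part of the original release notes: it assumes the `google/multiberts-seed_<seed>-step_<step>k` repository pattern used by this card, the step values shown are simply checkpoints that appear elsewhere in this collection, and loading works exactly as in the "How to use" section below (each call downloads a full BERT-base checkpoint, hundreds of MB, once and then caches it).
```
from transformers import BertModel, BertTokenizer

# Hypothetical selection of checkpoints for a robustness study; substitute the
# seeds and steps you actually need from the released list.
seed = 3
steps_k = [140, 1400, 1500]

checkpoints = {}
for step in steps_k:
    name = f"google/multiberts-seed_{seed}-step_{step}k"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    checkpoints[step] = (tokenizer, model)
```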
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_140k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_140k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_140k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_140k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
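Because the checkpoint was pretrained with the masked-language-modelling objective, it can also be queried for masked-token predictions. The sketch below is not part of the original card: it assumes the exported weights still include the MLM head (if they do not, `transformers` will initialize that head randomly and print a warning, and the predictions will be meaningless).
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

name = "google/multiberts-seed_3-step_140k"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForMaskedLM.from_pretrained(name)  # may warn if the MLM head was not exported
model.eval()

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Position of the [MASK] token, then its top-5 predicted word pieces.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
top_ids = logits[0, mask_pos].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```
Keep in mind that a checkpoint captured this early in pre-training may rank plausible completions lower than the fully trained model would.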
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_140k"]}
| null |
google/multiberts-seed_3-step_140k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_140k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 140k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 140k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 140k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 140k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_140k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 140k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 140k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07713726162910461,
0.0883481353521347,
-0.002168227918446064,
0.042028769850730896,
0.08658899366855621,
-0.013138983398675919,
0.07661335170269012,
0.09910330176353455,
-0.0318232998251915,
0.022425012663006783,
0.08169207721948624,
0.011363580822944641,
0.014100632630288601,
0.09286096692085266,
0.023276541382074356,
-0.22387850284576416,
0.025462893769145012,
-0.031247351318597794,
-0.07937243580818176,
0.07796868681907654,
0.09895268827676773,
-0.08102571219205856,
0.046818800270557404,
0.026399750262498856,
-0.1081627607345581,
0.04966605827212334,
-0.007148853503167629,
-0.02016042172908783,
0.1314496248960495,
-0.00033372893813066185,
0.04864555969834328,
0.051040973514318466,
0.037066854536533356,
-0.13679447770118713,
0.007900779135525227,
0.05748105049133301,
0.0564185306429863,
0.040918607264757156,
0.024102743715047836,
0.08015090972185135,
0.001693706144578755,
0.02650824375450611,
0.04807787761092186,
0.024831658229231834,
-0.07608770579099655,
-0.06535865366458893,
-0.10405340790748596,
0.03566053509712219,
0.02510196901857853,
0.013046915642917156,
0.011551345698535442,
0.13078086078166962,
-0.03913100063800812,
0.04781370982527733,
0.18657498061656952,
-0.3341255486011505,
-0.015951113775372505,
0.08247197419404984,
0.048393577337265015,
0.12141763418912888,
-0.005750024225562811,
-0.014807328581809998,
0.07392948120832443,
0.03396262973546982,
0.09745470434427261,
-0.04087251052260399,
0.02958180010318756,
-0.05169913172721863,
-0.16263221204280853,
-0.04036843404173851,
0.09913263469934464,
-0.0027185464277863503,
-0.1372312605381012,
-0.045575886964797974,
-0.03458380699157715,
0.03217824175953865,
0.008763766847550869,
-0.03513652831315994,
0.030519884079694748,
0.009958285838365555,
-0.018220767378807068,
-0.004533444531261921,
-0.10316292941570282,
-0.04934587702155113,
0.03583899140357971,
0.0790606290102005,
0.10219204425811768,
0.06143362820148468,
0.00409189285710454,
0.10984445363283157,
-0.18559078872203827,
-0.04770427942276001,
-0.028724761679768562,
-0.06294502317905426,
-0.04784693196415901,
-0.01528863050043583,
-0.11041028797626495,
-0.04196850210428238,
0.01435719896107912,
0.13525371253490448,
-0.009344293735921383,
0.03362983837723732,
-0.01943213678896427,
0.003972564823925495,
0.058475200086832047,
0.04171342775225639,
-0.014741374179720879,
0.018536105751991272,
0.02251230552792549,
-0.010643277317285538,
-0.02351963520050049,
0.01613660342991352,
0.005242728162556887,
0.033970996737480164,
0.1191142201423645,
0.02787403017282486,
-0.10460822284221649,
0.0801929160952568,
-0.013521303422749043,
-0.04769099876284599,
0.024675633758306503,
-0.08974093943834305,
-0.06440773606300354,
-0.0406436026096344,
0.003764025866985321,
0.022213377058506012,
-0.007974010892212391,
-0.009387517347931862,
-0.021947994828224182,
-0.03251112997531891,
-0.08327244967222214,
-0.049055807292461395,
-0.0522986464202404,
-0.13135646283626556,
0.004986141808331013,
-0.17316032946109772,
-0.036782778799533844,
-0.11655185371637344,
-0.1887224316596985,
-0.025184383615851402,
0.0626111775636673,
-0.011087294667959213,
-0.05045713856816292,
0.08429441601037979,
0.0412052646279335,
-0.029101576656103134,
-0.0030151410028338432,
0.07283634692430496,
-0.006396669894456863,
0.039596281945705414,
-0.019500426948070526,
0.06526196002960205,
0.00649018632248044,
0.03468028083443642,
-0.05289721116423607,
0.062015559524297714,
-0.17652980983257294,
0.04158494994044304,
-0.07964856177568436,
-0.018162690103054047,
-0.08493652939796448,
-0.0408160500228405,
-0.005293504800647497,
0.00892205722630024,
0.022098887711763382,
0.07492851465940475,
-0.175192192196846,
-0.025930698961019516,
0.10851271450519562,
-0.15752092003822327,
-0.02877682074904442,
0.0723089650273323,
-0.04479367285966873,
0.09298700839281082,
0.06937691569328308,
0.1527378112077713,
-0.0034468688536435366,
-0.08979315310716629,
0.05767272412776947,
-0.010872776620090008,
0.012930691242218018,
-0.01100664958357811,
0.0672249048948288,
-0.020701758563518524,
-0.1545112580060959,
0.03947964683175087,
-0.13435134291648865,
0.0011970280902460217,
-0.07618135213851929,
0.01998268812894821,
-0.006690944544970989,
-0.06748747080564499,
-0.07308276742696762,
-0.027764195576310158,
0.06596784293651581,
-0.07535291463136673,
-0.012843231670558453,
0.04058719426393509,
0.07213438302278519,
-0.07776948809623718,
0.06520815193653107,
-0.010089751332998276,
0.02171744592487812,
-0.0822690799832344,
-0.039158523082733154,
-0.18769069015979767,
0.03966546058654785,
0.09755656123161316,
0.011711876839399338,
-0.024468662217259407,
0.13484366238117218,
0.007794148754328489,
0.06802023202180862,
-0.05241270735859871,
0.013822016306221485,
-0.0036507658660411835,
-0.0019129604334011674,
-0.08241274207830429,
-0.09860879182815552,
-0.07831171900033951,
-0.06896466016769409,
0.07896353304386139,
-0.12573859095573425,
0.0215110182762146,
-0.06127113848924637,
0.03582919016480446,
0.01543884351849556,
-0.07952100783586502,
-0.009602620266377926,
0.018016142770648003,
-0.05805784836411476,
-0.05849459394812584,
0.04437389224767685,
0.07027022540569305,
-0.008709337562322617,
0.09191551059484482,
-0.053335145115852356,
-0.07811401784420013,
0.032636143267154694,
0.09248743951320648,
-0.11356529593467712,
-0.0008528166217729449,
-0.05725402012467384,
-0.04311931133270264,
-0.057182569056749344,
-0.02296527475118637,
0.07858701795339584,
-0.008102312684059143,
0.13706402480602264,
-0.07407330721616745,
-0.007611284963786602,
0.011274388059973717,
-0.016260012984275818,
-0.026496291160583496,
0.03413001820445061,
0.06667500734329224,
-0.07000650465488434,
0.01579865999519825,
0.03843172267079353,
0.010117319412529469,
0.07664571702480316,
-0.054055120795965195,
-0.08402552455663681,
0.018791276961565018,
0.03733719885349274,
0.028418583795428276,
0.06755524128675461,
-0.02199057675898075,
-0.014155269600450993,
0.032041408121585846,
0.02026190608739853,
0.009547156281769276,
-0.10641103982925415,
0.05614401027560234,
0.05654354393482208,
0.006622632499784231,
0.0689777284860611,
-0.010705169290304184,
-0.04379226267337799,
0.07598762214183807,
0.036769378930330276,
-0.0026156411040574312,
-0.012056006118655205,
-0.014602436684072018,
-0.11587048321962357,
0.19839352369308472,
-0.06579738855361938,
-0.16657717525959015,
-0.0749376118183136,
-0.11709754168987274,
-0.005688208620995283,
0.02170816995203495,
0.040153857320547104,
-0.026229234412312508,
-0.04985018074512482,
-0.1266980767250061,
0.06146388128399849,
-0.041260551661252975,
0.06623923033475876,
0.10621798783540726,
-0.0490611270070076,
0.055613692849874496,
-0.12785963714122772,
-0.00743773253634572,
-0.08074836432933807,
-0.06795243918895721,
0.0611591637134552,
-0.052852656692266464,
0.03322504088282585,
0.09521995484828949,
0.032558903098106384,
-0.018271835520863533,
-0.029499705880880356,
0.21292810142040253,
0.04345083236694336,
0.03698160871863365,
0.1325213611125946,
-0.054821450263261795,
0.05288565903902054,
0.07802481204271317,
0.013097728602588177,
-0.04907397925853729,
0.055499982088804245,
0.04670247063040733,
-0.06760522723197937,
-0.19266821444034576,
-0.024278076365590096,
-0.01302691362798214,
-0.04495973140001297,
0.06933120638132095,
0.03754166513681412,
0.01104368269443512,
0.07020223885774612,
0.01254641730338335,
0.05778121203184128,
-0.0023991079069674015,
0.10297514498233795,
0.0298276636749506,
-0.03374258801341057,
0.0899236872792244,
-0.0196234080940485,
-0.0071053942665457726,
0.08244671672582626,
-0.018055174499750137,
0.2868649363517761,
-0.03189023211598396,
0.00991662498563528,
0.12763962149620056,
0.03973427042365074,
0.0609472319483757,
0.12888270616531372,
-0.07270559668540955,
0.015213334001600742,
-0.07204826921224594,
-0.06232685595750809,
0.004832635633647442,
0.0352913960814476,
-0.06320524960756302,
0.008416705764830112,
-0.07438509166240692,
0.012542116455733776,
-0.016998043283820152,
0.3092864751815796,
0.10352832823991776,
-0.11211534589529037,
-0.05685402452945709,
-0.0014925978612154722,
-0.09510602056980133,
-0.06625963747501373,
0.04284615069627762,
0.06303728371858597,
-0.13563606142997742,
0.012080246582627296,
-0.028058379888534546,
0.07187185436487198,
-0.013395390473306179,
0.01897219382226467,
0.047811366617679596,
0.04771846905350685,
-0.04225482419133186,
0.005941999610513449,
-0.19190877676010132,
0.19007202982902527,
0.007678090129047632,
0.02226145938038826,
-0.04832063987851143,
0.03267116844654083,
0.010349677875638008,
-0.02615053579211235,
0.05731769651174545,
0.01809709519147873,
-0.018359584733843803,
-0.05166517198085785,
-0.04652956500649452,
0.013835880905389786,
0.07805242389440536,
-0.034378137439489365,
0.10640306770801544,
-0.005383557174354792,
0.04323151335120201,
0.020033255219459534,
0.09720344841480255,
-0.1881808340549469,
-0.095131054520607,
0.030010629445314407,
-0.05429687350988388,
-0.10175582766532898,
-0.07565654814243317,
-0.09652465581893921,
-0.019049353897571564,
0.24797123670578003,
-0.10949614644050598,
-0.07653723657131195,
-0.09961037337779999,
0.018599171191453934,
0.1070721223950386,
-0.043806854635477066,
0.029985995963215828,
-0.010027528740465641,
0.12121587991714478,
-0.06390947103500366,
-0.13583794236183167,
0.022543519735336304,
-0.09943447262048721,
-0.1596471220254898,
-0.06523475050926208,
0.11331009864807129,
0.06085307151079178,
0.03134281933307648,
-0.028638742864131927,
0.018325647339224815,
0.03453577682375908,
-0.04380534589290619,
-0.006932032760232687,
0.06594209372997284,
0.10287169367074966,
0.03403578698635101,
-0.11599694192409515,
0.018688635900616646,
-0.0644029825925827,
-0.0650724247097969,
0.07600171864032745,
0.2597506046295166,
-0.05212642252445221,
0.11871200799942017,
0.11636610329151154,
-0.07689543068408966,
-0.1488354206085205,
0.031676966696977615,
0.08936037123203278,
-0.020146965980529785,
0.016872433945536613,
-0.15753281116485596,
0.09276171773672104,
0.11539481580257416,
-0.01908102259039879,
0.005480033345520496,
-0.18402263522148132,
-0.1287841498851776,
0.06894981116056442,
0.10719245672225952,
0.2625119686126709,
-0.07031157612800598,
-0.03796405345201492,
0.01914057321846485,
-0.09134958684444427,
0.018400901928544044,
0.12161677330732346,
0.0695515125989914,
-0.028041072189807892,
-0.08299540728330612,
0.009448268450796604,
-0.04364166408777237,
0.09288883954286575,
0.05473462864756584,
0.06186601147055626,
-0.006439534481614828,
0.03065209649503231,
-0.01525777019560337,
-0.04284683242440224,
0.06678121536970139,
0.015523230656981468,
0.04719410091638565,
-0.0865432620048523,
-0.030490737408399582,
-0.07419797033071518,
0.02764614298939705,
-0.02616533637046814,
-0.07685784250497818,
-0.05848031863570213,
0.08107879012823105,
0.04687441512942314,
-0.02700960822403431,
0.01855660416185856,
0.02300483174622059,
0.12006635218858719,
0.15593205392360687,
0.0007324752514250576,
-0.047717537730932236,
-0.06794559210538864,
-0.03802735358476639,
-0.017876457422971725,
0.07150432467460632,
-0.03909802436828613,
0.01713830605149269,
0.0636974647641182,
0.01842011697590351,
0.09681382030248642,
0.05810723826289177,
-0.1135653480887413,
-0.0211592148989439,
0.03528793156147003,
-0.1684204488992691,
0.03157452866435051,
0.0011003101244568825,
0.02036261558532715,
-0.036250751465559006,
0.03563268110156059,
0.1436571180820465,
-0.060685548931360245,
-0.03326410427689552,
-0.03959382325410843,
0.06699124723672867,
0.02472309023141861,
0.14910434186458588,
0.03412797674536705,
0.036912139505147934,
-0.08279062062501907,
0.12133532017469406,
0.031315241008996964,
-0.035398971289396286,
0.022479185834527016,
-0.01977488584816456,
-0.11322691291570663,
0.008973844349384308,
0.061152245849370956,
0.03470228984951973,
-0.051167361438274384,
-0.011504276655614376,
-0.027206014841794968,
-0.0762704461812973,
0.06214659661054611,
0.1861499398946762,
0.06698822975158691,
0.07140844315290451,
-0.054608121514320374,
-0.04005265235900879,
-0.07985560595989227,
0.0395856536924839,
0.03338915854692459,
0.07598794996738434,
-0.08148431777954102,
0.09342536330223083,
0.013956155627965927,
0.03978852182626724,
-0.03131970018148422,
-0.05389537662267685,
-0.10695412755012512,
-0.055485889315605164,
-0.10347456485033035,
0.0026059835217893124,
-0.0780976414680481,
-0.03608974069356918,
-0.0024896615650504827,
0.003126459661871195,
-0.007372010964900255,
0.05079176649451256,
-0.05881757661700249,
-0.010081414133310318,
-0.017169183120131493,
0.03688804805278778,
-0.06348352134227753,
-0.0344264954328537,
0.022734418511390686,
-0.10212114453315735,
0.0930006206035614,
0.046825312077999115,
0.0046225497499108315,
0.00828817579895258,
0.0857987031340599,
-0.02233138494193554,
0.024486225098371506,
0.010306341573596,
-0.04791833832859993,
-0.08229200541973114,
-0.001405200338922441,
-0.006644764915108681,
-0.017307206988334656,
-0.005295611452311277,
0.08212049305438995,
-0.08641906082630157,
0.03529835864901543,
-0.003198034130036831,
-0.000012095554666302633,
-0.06979896873235703,
-0.012657228857278824,
0.10414982587099075,
0.09408296644687653,
0.04901006445288658,
-0.09394220262765884,
0.010029041208326817,
-0.13879823684692383,
-0.04032441973686218,
0.0044478182680904865,
-0.014699959196150303,
-0.1272348016500473,
-0.005061691161245108,
0.02227964624762535,
-0.0025864741764962673,
0.22168168425559998,
-0.05822300165891647,
-0.014411602169275284,
0.01857038587331772,
-0.09341234713792801,
0.12066509574651718,
-0.024847157299518585,
0.18074564635753632,
-0.013641455210745335,
-0.0409730039536953,
-0.008336784318089485,
0.04622014984488487,
0.018623633310198784,
-0.02229583077132702,
0.17887111008167267,
0.13835345208644867,
0.02141718752682209,
0.03819120302796364,
-0.02328334003686905,
-0.003898077644407749,
-0.03820190206170082,
-0.03598173335194588,
0.03675639629364014,
0.046036623418331146,
0.014847194775938988,
0.14516904950141907,
0.06847886741161346,
-0.16518494486808777,
0.03367609530687332,
-0.03322639316320419,
-0.039791908115148544,
-0.11159667372703552,
-0.09787523001432419,
-0.02595769241452217,
-0.07416513562202454,
0.01041413750499487,
-0.1283482015132904,
0.0002679869649000466,
0.18118560314178467,
0.06420747935771942,
0.026320813223719597,
0.012356700375676155,
-0.11291661858558655,
-0.034872714430093765,
0.05518649145960808,
0.00952065084129572,
0.01999249868094921,
0.05724950507283211,
0.0028426493518054485,
0.05634330213069916,
0.031124114990234375,
0.01314195804297924,
-0.0019881906919181347,
0.07580315321683884,
0.019556356593966484,
0.038285817950963974,
-0.06260644644498825,
-0.0034077984746545553,
-0.036995500326156616,
0.06953691691160202,
0.11010618507862091,
0.04587896540760994,
-0.05410093441605568,
-0.008034423924982548,
0.15593786537647247,
-0.03517056256532669,
0.0019449860556051135,
-0.12807737290859222,
0.3320687413215637,
0.016467956826090813,
0.010827641934156418,
0.04147300869226456,
-0.07146111875772476,
-0.047706786543130875,
0.21203060448169708,
0.09732510149478912,
-0.019677922129631042,
-0.020596977323293686,
-0.0006632813601754606,
-0.03041072003543377,
-0.022637782618403435,
0.15574374794960022,
0.04192577302455902,
0.12333831191062927,
-0.052120864391326904,
-0.04700901359319687,
-0.02910587191581726,
-0.009924252517521381,
-0.1228717491030693,
0.1257203221321106,
-0.016296403482556343,
-0.02566678635776043,
-0.06621892750263214,
0.029292015358805656,
0.06556855142116547,
-0.30869919061660767,
-0.0074690403416752815,
-0.025136036798357964,
-0.10725993663072586,
-0.012826715596020222,
-0.02798105962574482,
-0.021308351308107376,
0.04594472423195839,
-0.043107327073812485,
0.07069814950227737,
0.03773447871208191,
0.03162574768066406,
-0.02126200497150421,
-0.1012992337346077,
0.1721487194299698,
0.05865908041596413,
0.09046947956085205,
0.0246431827545166,
0.07425039261579514,
0.058213334530591965,
0.032661836594343185,
-0.09591877460479736,
0.04901932552456856,
0.013383891433477402,
-0.09180659800767899,
-0.048324499279260635,
0.11211897432804108,
0.0026959397364407778,
0.045449405908584595,
0.03472922369837761,
-0.10196613520383835,
0.019534064456820488,
0.06724370270967484,
-0.06361326575279236,
-0.09833653271198273,
-0.001972435973584652,
-0.0908113494515419,
0.16231124103069305,
0.1482645571231842,
-0.016940731555223465,
0.018040824681520462,
-0.06902018934488297,
-0.004423065576702356,
0.051252465695142746,
0.0003818826808128506,
-0.0245817918330431,
-0.1939942091703415,
0.036098893731832504,
-0.08825168758630753,
-0.010645128786563873,
-0.23078285157680511,
-0.10154151916503906,
-0.0068547991104424,
-0.04786694422364235,
-0.02876967005431652,
0.05776995047926903,
0.02947348915040493,
0.07180330157279968,
-0.01845831610262394,
-0.02187231555581093,
-0.033323984593153,
0.09049893170595169,
-0.11420758813619614,
-0.06441953778266907
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1500k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1500k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1500k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
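One typical use of the intermediate checkpoints is to track how representations drift over the course of pre-training. The following is a rough sketch rather than an official recipe: the two step values are simply checkpoints that appear in this collection, and the cosine similarity of final-layer [CLS] vectors is just one possible probe.
```
import torch
from transformers import BertTokenizer, BertModel

def cls_vector(name, text):
    # Final-layer [CLS] embedding of `text` under checkpoint `name`.
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    model.eval()
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        return model(**inputs).last_hidden_state[0, 0]

text = "Replace me by any text you'd like."
early = cls_vector("google/multiberts-seed_3-step_140k", text)
late = cls_vector("google/multiberts-seed_3-step_1500k", text)
print(torch.nn.functional.cosine_similarity(early, late, dim=0).item())
```
The same comparison can be repeated across seeds to separate checkpoint-specific effects from properties of the pre-training procedure itself, which is the kind of analysis MultiBERTs is intended to support.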
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1500k"]}
| null |
google/multiberts-seed_3-step_1500k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1500k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1500k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1500k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1500k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1500k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1500k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1500k (max: 2000k, i.e., 2M steps)."
] |
[
-0.08033663779497147,
0.08630726486444473,
-0.0023168029729276896,
0.04572635143995285,
0.09014689922332764,
-0.014822917990386486,
0.0746791809797287,
0.09862644225358963,
-0.03250828757882118,
0.02115781046450138,
0.08125424385070801,
0.008858979679644108,
0.016852684319019318,
0.09301988780498505,
0.0201772078871727,
-0.21298947930335999,
0.0259284358471632,
-0.03482234105467796,
-0.09336310625076294,
0.07667505741119385,
0.0978248193860054,
-0.08292543143033981,
0.04770876094698906,
0.024141987785696983,
-0.11030537635087967,
0.05142867937684059,
-0.007433589547872543,
-0.019768720492720604,
0.1289771944284439,
0.003952031023800373,
0.046645332127809525,
0.052984680980443954,
0.03749166056513786,
-0.13861681520938873,
0.008158157579600811,
0.055228814482688904,
0.05706208571791649,
0.04048635810613632,
0.026608644053339958,
0.08009632676839828,
-0.010747377760708332,
0.029210364446043968,
0.04650154709815979,
0.027533238753676414,
-0.07547079026699066,
-0.04102451354265213,
-0.10426885634660721,
0.04469069093465805,
0.025299247354269028,
0.008970601484179497,
0.014658740721642971,
0.12168637663125992,
-0.04114960506558418,
0.04870232567191124,
0.18890133500099182,
-0.3176395297050476,
-0.018378080800175667,
0.09098674356937408,
0.042970284819602966,
0.12776875495910645,
-0.0017079183598980308,
-0.013565470464527607,
0.07545731961727142,
0.032474469393491745,
0.09419381618499756,
-0.04189716652035713,
0.029467685148119926,
-0.05128668621182442,
-0.16093672811985016,
-0.04248664528131485,
0.0962122306227684,
-0.002762629883363843,
-0.1351160854101181,
-0.04192349687218666,
-0.033323850482702255,
0.027705062180757523,
0.008781049400568008,
-0.03810116648674011,
0.02979920245707035,
0.00699373846873641,
-0.012397421523928642,
-0.006679809186607599,
-0.0973559021949768,
-0.051922623068094254,
0.040528688579797745,
0.08244559913873672,
0.10204905271530151,
0.06351076811552048,
0.001929098623804748,
0.10456206649541855,
-0.18936409056186676,
-0.045996107161045074,
-0.0312466099858284,
-0.06227303668856621,
-0.0489700548350811,
-0.017727503553032875,
-0.10835540294647217,
-0.05268777534365654,
0.014178757555782795,
0.13824060559272766,
-0.013925440609455109,
0.03353031352162361,
-0.019588932394981384,
0.0025335366372019053,
0.05795910209417343,
0.035877253860235214,
-0.02001773938536644,
0.018266940489411354,
0.024679267778992653,
-0.004450767766684294,
-0.02875111997127533,
0.015737177804112434,
0.006114277523010969,
0.03309709206223488,
0.11721380054950714,
0.03153611719608307,
-0.10431189835071564,
0.07692274451255798,
-0.012784802354872227,
-0.04839446023106575,
0.025807052850723267,
-0.08857125788927078,
-0.06640395522117615,
-0.04448332265019417,
0.0026913199108093977,
0.010149343870580196,
-0.007509555667638779,
-0.007932456210255623,
-0.0252249576151371,
-0.025991350412368774,
-0.08328486979007721,
-0.0492115318775177,
-0.05280613899230957,
-0.13327263295650482,
0.0036907989997416735,
-0.1803201287984848,
-0.03610750287771225,
-0.11820365488529205,
-0.19436074793338776,
-0.025643520057201385,
0.06342729926109314,
-0.013607717119157314,
-0.055048368871212006,
0.08402498066425323,
0.03753240779042244,
-0.030330700799822807,
-0.001747057307511568,
0.07359994947910309,
-0.0033573138061910868,
0.03761952370405197,
-0.01980598270893097,
0.06579755991697311,
0.009727290831506252,
0.03701642155647278,
-0.04973360523581505,
0.05949511379003525,
-0.18042388558387756,
0.04774259775876999,
-0.08301126211881638,
-0.01340231578797102,
-0.08400242775678635,
-0.03772594779729843,
-0.009624779224395752,
0.008278118446469307,
0.028128523379564285,
0.08027567714452744,
-0.17775434255599976,
-0.024905674159526825,
0.10324239730834961,
-0.15856540203094482,
-0.033009182661771774,
0.07025350630283356,
-0.043909329921007156,
0.08978535979986191,
0.06708353757858276,
0.14975391328334808,
0.0007866449886932969,
-0.08486039936542511,
0.060262419283390045,
-0.01271069049835205,
0.010681681334972382,
-0.01745663769543171,
0.06675312668085098,
-0.02019042894244194,
-0.16020289063453674,
0.03856061026453972,
-0.13184049725532532,
0.0039133173413574696,
-0.07913658767938614,
0.017630353569984436,
-0.006367436610162258,
-0.07412338256835938,
-0.0772164911031723,
-0.02932976745069027,
0.06571867316961288,
-0.0750981792807579,
-0.007217464968562126,
0.040874071419239044,
0.06862953305244446,
-0.07135015726089478,
0.06443895399570465,
-0.008093098178505898,
0.022627361118793488,
-0.08783192187547684,
-0.03774691000580788,
-0.1906413584947586,
0.05094306915998459,
0.0963410884141922,
0.011767465621232986,
-0.022503254935145378,
0.1315843015909195,
0.007869378663599491,
0.06519489735364914,
-0.05339771509170532,
0.014971873722970486,
-0.003255819668993354,
-0.0034027458168566227,
-0.0824417918920517,
-0.09509585052728653,
-0.07241330295801163,
-0.06649444997310638,
0.05973878130316734,
-0.11612794548273087,
0.02387128956615925,
-0.06357884407043457,
0.03288869932293892,
0.016766181215643883,
-0.08211370557546616,
-0.007838704623281956,
0.020833536982536316,
-0.05592819303274155,
-0.05758597329258919,
0.04338723421096802,
0.06874014437198639,
-0.0018439945997670293,
0.0873701199889183,
-0.043030161410570145,
-0.0748511403799057,
0.030125783756375313,
0.09896672517061234,
-0.11469820886850357,
0.0035264496691524982,
-0.05564882978796959,
-0.040871232748031616,
-0.06194315478205681,
-0.021576983854174614,
0.07533133774995804,
-0.004630387295037508,
0.1358652114868164,
-0.07739821076393127,
-0.00399123877286911,
0.015573878772556782,
-0.013109966181218624,
-0.02714385837316513,
0.035609543323516846,
0.07743872702121735,
-0.06235082820057869,
0.015074524097144604,
0.02778446301817894,
0.013339848257601261,
0.07166042923927307,
-0.050789982080459595,
-0.0802873820066452,
0.019953621551394463,
0.038425397127866745,
0.030631357803940773,
0.06743314117193222,
-0.025896625593304634,
-0.01351503748446703,
0.029471201822161674,
0.021299419924616814,
0.011028487235307693,
-0.1046387255191803,
0.054914578795433044,
0.059780873358249664,
0.007020810153335333,
0.06429892778396606,
-0.010688850656151772,
-0.04192906245589256,
0.07630960643291473,
0.035291701555252075,
-0.0008616606937721372,
-0.014055388048291206,
-0.015744149684906006,
-0.11598250269889832,
0.1959099918603897,
-0.06570913642644882,
-0.17689061164855957,
-0.07720828056335449,
-0.10856218636035919,
-0.009930191561579704,
0.020512988790869713,
0.039288219064474106,
-0.0282222218811512,
-0.05172070115804672,
-0.12462303042411804,
0.06684263050556183,
-0.03992873802781105,
0.06678340584039688,
0.11507824808359146,
-0.047423817217350006,
0.05542399734258652,
-0.1289246529340744,
-0.006549109239131212,
-0.08030503243207932,
-0.06566780805587769,
0.05842629447579384,
-0.04702132195234299,
0.03743817284703255,
0.0932789146900177,
0.027396919205784798,
-0.01690473034977913,
-0.030005449429154396,
0.22273753583431244,
0.044809043407440186,
0.03510047122836113,
0.13004058599472046,
-0.054030176252126694,
0.051902465522289276,
0.08899453282356262,
0.016070742160081863,
-0.05102122575044632,
0.057104066014289856,
0.05011877045035362,
-0.06731393933296204,
-0.19230037927627563,
-0.0226343534886837,
-0.014821083284914494,
-0.04469463974237442,
0.06780572980642319,
0.036282796412706375,
0.012236779555678368,
0.06949395686388016,
0.013277527876198292,
0.057810209691524506,
-0.00239395210519433,
0.09882469475269318,
0.021367210894823074,
-0.03302428126335144,
0.08975427597761154,
-0.016546892002224922,
-0.004247304052114487,
0.08246936649084091,
-0.019678952172398567,
0.2883884608745575,
-0.03296816721558571,
0.008208224549889565,
0.12769992649555206,
0.035605352371931076,
0.062393754720687866,
0.1346411556005478,
-0.0759417861700058,
0.013745706528425217,
-0.0687226951122284,
-0.05865781381726265,
0.0012022193986922503,
0.0320257730782032,
-0.06222647801041603,
0.013252665288746357,
-0.07586026936769485,
0.025946633890271187,
-0.015327565371990204,
0.3115127980709076,
0.09459321200847626,
-0.10074532777070999,
-0.05407083407044411,
-0.003228524699807167,
-0.09634082764387131,
-0.06662759929895401,
0.04513596370816231,
0.07004593312740326,
-0.1325618326663971,
0.009241359308362007,
-0.02866978757083416,
0.0749482735991478,
-0.007857096381485462,
0.018738040700554848,
0.04505808651447296,
0.049625176936388016,
-0.04410911351442337,
0.003231210168451071,
-0.20300960540771484,
0.1906610131263733,
0.008113072253763676,
0.022439997643232346,
-0.04633015766739845,
0.03342503681778908,
0.009015083312988281,
-0.026509489864110947,
0.06120409816503525,
0.021087387576699257,
-0.013914262875914574,
-0.05615004897117615,
-0.047186415642499924,
0.011872668750584126,
0.07608011364936829,
-0.03363615274429321,
0.10127907991409302,
-0.0033257771283388138,
0.04324527829885483,
0.01849375292658806,
0.10241036862134933,
-0.19555990397930145,
-0.0965583547949791,
0.030929066240787506,
-0.06175484508275986,
-0.10699348151683807,
-0.07593636214733124,
-0.09595920890569687,
-0.021431269124150276,
0.23835021257400513,
-0.11268804222345352,
-0.07821301370859146,
-0.10133399069309235,
0.02103382907807827,
0.10564950108528137,
-0.04557856544852257,
0.03225768357515335,
-0.010477620176970959,
0.11857690662145615,
-0.07039131224155426,
-0.13281123340129852,
0.019164711236953735,
-0.10169845819473267,
-0.1562105119228363,
-0.06325116753578186,
0.10829932987689972,
0.06161513179540634,
0.031644612550735474,
-0.033749449998140335,
0.016576027497649193,
0.042613763362169266,
-0.04318992421030998,
-0.008815119974315166,
0.06340069323778152,
0.09331963956356049,
0.03795666620135307,
-0.1099664494395256,
0.013736847788095474,
-0.0656164139509201,
-0.06770980358123779,
0.07430143654346466,
0.2572333514690399,
-0.05042123422026634,
0.11920870095491409,
0.11669256538152695,
-0.07746706157922745,
-0.15638713538646698,
0.037719789892435074,
0.09144340455532074,
-0.019299039617180824,
0.013886777684092522,
-0.15414071083068848,
0.09112067520618439,
0.11234764754772186,
-0.016321828588843346,
-0.0025942292995750904,
-0.1870807409286499,
-0.13294491171836853,
0.06549143046140671,
0.1099548190832138,
0.2649794816970825,
-0.06618406623601913,
-0.03894515335559845,
0.022611111402511597,
-0.0964893326163292,
0.017622018232941628,
0.13950295746326447,
0.0652424544095993,
-0.02907354012131691,
-0.08766388893127441,
0.011524138040840626,
-0.045514363795518875,
0.0886206105351448,
0.05525800958275795,
0.06581456959247589,
-0.00524104991927743,
0.022595886141061783,
-0.027804912999272346,
-0.044470418244600296,
0.06591042876243591,
0.01587373949587345,
0.050069380551576614,
-0.0770261287689209,
-0.02888079546391964,
-0.07420913875102997,
0.025590671226382256,
-0.02776990458369255,
-0.07604864239692688,
-0.05790878087282181,
0.0847160592675209,
0.04784926399588585,
-0.027093442156910896,
0.02390400320291519,
0.023458879441022873,
0.11806134134531021,
0.16033075749874115,
0.002952190348878503,
-0.06751319766044617,
-0.06469956040382385,
-0.035364504903554916,
-0.01655021496117115,
0.07271778583526611,
-0.03965567424893379,
0.020718805491924286,
0.06657008081674576,
0.019696412608027458,
0.09948724508285522,
0.05788277089595795,
-0.10924909263849258,
-0.02484354004263878,
0.038101665675640106,
-0.16382716596126556,
0.022573893889784813,
-0.0031186717096716166,
0.018952732905745506,
-0.0377315953373909,
0.029338497668504715,
0.14199408888816833,
-0.06457256525754929,
-0.032737698405981064,
-0.04158835858106613,
0.0696907490491867,
0.026694733649492264,
0.15482710301876068,
0.03366892784833908,
0.03984982520341873,
-0.08477594703435898,
0.1268583983182907,
0.03102198801934719,
-0.03508849814534187,
0.019859112799167633,
-0.018689827993512154,
-0.1125526949763298,
0.011638197116553783,
0.0672900602221489,
0.038733649998903275,
-0.05318443477153778,
-0.007032252382487059,
-0.027950216084718704,
-0.0776861310005188,
0.06523368507623672,
0.1954403966665268,
0.06402125209569931,
0.06400949507951736,
-0.052001528441905975,
-0.03944382816553116,
-0.07746915519237518,
0.04527042806148529,
0.03485019877552986,
0.07458572089672089,
-0.0811881273984909,
0.09221629053354263,
0.012387207709252834,
0.0382765457034111,
-0.029707422479987144,
-0.050064850598573685,
-0.10834251344203949,
-0.0541166290640831,
-0.10789880156517029,
0.0031929751858115196,
-0.07927727699279785,
-0.03343592956662178,
-0.0033544653560966253,
0.001832847949117422,
-0.009402655065059662,
0.05168601870536804,
-0.059403594583272934,
-0.010491403751075268,
-0.0171334408223629,
0.03557666763663292,
-0.06111134588718414,
-0.03429652750492096,
0.023127824068069458,
-0.09814395755529404,
0.09004777669906616,
0.040222711861133575,
0.0017166325123980641,
0.006094809155911207,
0.08519335836172104,
-0.0243889968842268,
0.022001242265105247,
0.012790496461093426,
-0.04548519104719162,
-0.08294422179460526,
0.0032705587800592184,
-0.006239982787519693,
-0.019706448540091515,
-0.0045051416382193565,
0.07558928430080414,
-0.08630519360303879,
0.039266642183065414,
-0.004336609970778227,
0.0045569357462227345,
-0.06953412294387817,
-0.009765432216227055,
0.10941896587610245,
0.09289644658565521,
0.04703334718942642,
-0.09156132489442825,
0.010539629496634007,
-0.13845837116241455,
-0.039070986211299896,
0.006438347510993481,
-0.014462882652878761,
-0.13990390300750732,
-0.00324990414083004,
0.023316144943237305,
-0.0028671647887676954,
0.2169613242149353,
-0.052786510437726974,
-0.01697278767824173,
0.02054097317159176,
-0.08610643446445465,
0.11498703062534332,
-0.02496386133134365,
0.1715429425239563,
-0.012931120581924915,
-0.04289879649877548,
-0.007925130426883698,
0.047928228974342346,
0.020547879859805107,
-0.027653152123093605,
0.1877772957086563,
0.13822904229164124,
0.023611636832356453,
0.0362459197640419,
-0.02047734335064888,
-0.0058146133087575436,
-0.02812080830335617,
-0.03491605445742607,
0.033941850066185,
0.043692994862794876,
0.0138804130256176,
0.13199925422668457,
0.06784188002347946,
-0.16601233184337616,
0.0400434210896492,
-0.03248492255806923,
-0.040313903242349625,
-0.11290820688009262,
-0.09734940528869629,
-0.023403432220220566,
-0.07510919868946075,
0.0125912856310606,
-0.1322428286075592,
-0.0017742124618962407,
0.18098124861717224,
0.06456354260444641,
0.02702317200601101,
0.009340804070234299,
-0.11930393427610397,
-0.02905125729739666,
0.05416686832904816,
0.008527896367013454,
0.022957617416977882,
0.05486344173550606,
0.0026807531248778105,
0.05735236406326294,
0.03368208557367325,
0.013382927514612675,
-0.001596580957993865,
0.06973245739936829,
0.01797151193022728,
0.040615931153297424,
-0.061840567737817764,
-0.0019236168591305614,
-0.04333793371915817,
0.06572485715150833,
0.12112775444984436,
0.04573352634906769,
-0.04919686168432236,
-0.007100345101207495,
0.15912044048309326,
-0.033624742180109024,
-0.0003813235380221158,
-0.13178668916225433,
0.31448274850845337,
0.020880695432424545,
0.01828007586300373,
0.04338473081588745,
-0.07345274090766907,
-0.054275959730148315,
0.2199302613735199,
0.09711193293333054,
-0.018395133316516876,
-0.021562639623880386,
-0.00021419649419840425,
-0.030344415456056595,
-0.02245376631617546,
0.1529998779296875,
0.04121619090437889,
0.11640837043523788,
-0.050133515149354935,
-0.04164687171578407,
-0.027226271107792854,
-0.01293156947940588,
-0.11354831606149673,
0.12289317697286606,
-0.01399625651538372,
-0.02358827367424965,
-0.06626717746257782,
0.02771170623600483,
0.05898713693022728,
-0.30705052614212036,
-0.008682826533913612,
-0.02901068888604641,
-0.1053830087184906,
-0.016840720549225807,
-0.030500035732984543,
-0.022181544452905655,
0.04296058416366577,
-0.03915159031748772,
0.07220947742462158,
0.036368343979120255,
0.03062751330435276,
-0.019125737249851227,
-0.11338667571544647,
0.17019984126091003,
0.06598242372274399,
0.08755094558000565,
0.026492219418287277,
0.06529604643583298,
0.05847979709506035,
0.035006582736968994,
-0.09953158348798752,
0.04219158738851547,
0.01219890359789133,
-0.08573630452156067,
-0.05222187936306,
0.11754200607538223,
0.0032949342858046293,
0.052849605679512024,
0.03288450464606285,
-0.10307145118713379,
0.015067767351865768,
0.06541364639997482,
-0.060952141880989075,
-0.09946697950363159,
0.002201048657298088,
-0.08863101154565811,
0.1592986285686493,
0.14936023950576782,
-0.014373723417520523,
0.02130347676575184,
-0.06952642649412155,
-0.004351538605988026,
0.047778185456991196,
0.011386403813958168,
-0.024403586983680725,
-0.1996079981327057,
0.038279324769973755,
-0.09463250637054443,
-0.0111012514680624,
-0.22920887172222137,
-0.10131332278251648,
-0.007424172013998032,
-0.04793747514486313,
-0.027543332427740097,
0.061562880873680115,
0.0376979224383831,
0.07354788482189178,
-0.017006855458021164,
-0.02516760863363743,
-0.033662665635347366,
0.09223107993602753,
-0.10668409615755081,
-0.060175638645887375
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1600k (max: 2000k, i.e., 2M steps).
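The released repositories in this collection appear to follow a simple naming pattern,
`google/multiberts-seed_<seed>-step_<step>k`. This is inferred from the checkpoint names shown
in this card, not an official index, and the exact set of released steps may differ per seed.
A minimal sketch for enumerating a few checkpoints of this seed:
```
# Sketch only: the pattern below is inferred from names such as
# "google/multiberts-seed_3-step_1600k"; the available steps may differ per seed.
def multiberts_checkpoint(seed: int, step_k: int) -> str:
    return f"google/multiberts-seed_{seed}-step_{step_k}k"

# Example: a few intermediate checkpoints of seed 3.
for step_k in (160, 1500, 1600, 2000):
    print(multiberts_checkpoint(3, step_k))
```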
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1600k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1600k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1600k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
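Because this is a pretraining checkpoint with an MLM head, it can also be loaded through the
standard `fill-mask` pipeline. The snippet below is a minimal sketch rather than part of the
original release notes; loading the masked-LM class may print a harmless warning about unused
NSP weights:
```
from transformers import pipeline

# Sketch: use the checkpoint's MLM head via the fill-mask pipeline (assumes the
# pretraining weights include the masked-LM head, as for BERT-style checkpoints).
unmasker = pipeline("fill-mask", model="google/multiberts-seed_3-step_1600k")
print(unmasker("Paris is the [MASK] of France."))
```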
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1600k"]}
| null |
google/multiberts-seed_3-step_1600k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1600k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1600k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1600k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1600k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1600k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1600k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1600k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07980746775865555,
0.07792666554450989,
-0.0021287917625159025,
0.04890424758195877,
0.08820946514606476,
-0.014402292668819427,
0.07028314471244812,
0.09845840930938721,
-0.033303599804639816,
0.021374329924583435,
0.08011119067668915,
0.011253554373979568,
0.01728297583758831,
0.09378229081630707,
0.022454215213656425,
-0.21208754181861877,
0.022340336814522743,
-0.03621862456202507,
-0.08520462363958359,
0.07732350379228592,
0.09832919389009476,
-0.083213672041893,
0.049989521503448486,
0.023521296679973602,
-0.11588131636381149,
0.05579701438546181,
-0.0049035679548978806,
-0.022502029314637184,
0.1322142481803894,
-0.0010633039055392146,
0.049717970192432404,
0.05490659549832344,
0.041429419070482254,
-0.12938672304153442,
0.006899510510265827,
0.05709270387887955,
0.05808310955762863,
0.038363754749298096,
0.028281541541218758,
0.08115483075380325,
0.004282765090465546,
0.025566617026925087,
0.04877442866563797,
0.02621704898774624,
-0.0770353302359581,
-0.053648319095373154,
-0.10443206131458282,
0.029321124777197838,
0.02283930778503418,
0.011904607526957989,
0.012338673695921898,
0.12564174830913544,
-0.0373455286026001,
0.04994852840900421,
0.1856609582901001,
-0.3202276825904846,
-0.018093904480338097,
0.08580862730741501,
0.04449612647294998,
0.13412685692310333,
-0.0020300911273807287,
-0.015955347567796707,
0.0756106749176979,
0.032906658947467804,
0.09392528980970383,
-0.04366142675280571,
0.023717079311609268,
-0.0510057732462883,
-0.159458190202713,
-0.04103405028581619,
0.10411619395017624,
-0.006282349582761526,
-0.13636688888072968,
-0.0388370007276535,
-0.035520147532224655,
0.02770049497485161,
0.010815713554620743,
-0.03695772588253021,
0.031621843576431274,
0.008223309181630611,
-0.018304118886590004,
-0.003641421440988779,
-0.10193432867527008,
-0.05085574835538864,
0.03506351262331009,
0.07330377399921417,
0.10193700343370438,
0.06543649733066559,
0.003454765072092414,
0.10402410477399826,
-0.18059147894382477,
-0.04689895361661911,
-0.027199197560548782,
-0.06355156004428864,
-0.05143054574728012,
-0.014126120135188103,
-0.10735930502414703,
-0.04403214156627655,
0.012204155325889587,
0.13072168827056885,
-0.019282901659607887,
0.03761017695069313,
-0.02918877638876438,
0.002627517329528928,
0.055120062083005905,
0.03635696321725845,
-0.013100381009280682,
0.018467042595148087,
0.02367970533668995,
-0.010189407505095005,
-0.025176679715514183,
0.015217073261737823,
0.008831143379211426,
0.036015160381793976,
0.11112896353006363,
0.028245244175195694,
-0.10377280414104462,
0.07810092717409134,
-0.01518916618078947,
-0.0457557737827301,
0.02497301809489727,
-0.0925540179014206,
-0.06514393538236618,
-0.042648714035749435,
0.00292369001545012,
0.013705817982554436,
-0.006838556844741106,
-0.012054722756147385,
-0.02279287949204445,
-0.03552551195025444,
-0.08328801393508911,
-0.05339742824435234,
-0.05479677394032478,
-0.1326192170381546,
0.0052870879881083965,
-0.1751481145620346,
-0.03693480044603348,
-0.11756876856088638,
-0.19132494926452637,
-0.024673068895936012,
0.05994768068194389,
-0.012683014385402203,
-0.049947842955589294,
0.08372656255960464,
0.03742777183651924,
-0.03202684596180916,
-0.001861589029431343,
0.08282957971096039,
-0.004785032477229834,
0.03869011625647545,
-0.01700213924050331,
0.06808198988437653,
0.011617135256528854,
0.03553929179906845,
-0.052144646644592285,
0.05885833501815796,
-0.17677795886993408,
0.04439940303564072,
-0.08298944681882858,
-0.016855785623192787,
-0.08255606889724731,
-0.03993046283721924,
-0.0070242867805063725,
0.009743181057274342,
0.0243739802390337,
0.0778651311993599,
-0.1748761683702469,
-0.02696801722049713,
0.10266918689012527,
-0.16139212250709534,
-0.02869405597448349,
0.07209884375333786,
-0.04400797188282013,
0.09197826683521271,
0.06700395792722702,
0.1561427265405655,
-0.011704022996127605,
-0.08306088298559189,
0.06072835251688957,
-0.014003876596689224,
0.00978007446974516,
-0.014490678906440735,
0.06542470306158066,
-0.01880285143852234,
-0.15816554427146912,
0.04002620279788971,
-0.1272166669368744,
-0.002394357230514288,
-0.07728687673807144,
0.018673444166779518,
-0.006627227645367384,
-0.06891090422868729,
-0.0759909525513649,
-0.024509351700544357,
0.06249737739562988,
-0.08165116608142853,
-0.011503459885716438,
0.02614363469183445,
0.06913553923368454,
-0.07374468445777893,
0.06672098487615585,
-0.01122018601745367,
0.015134437009692192,
-0.07831136882305145,
-0.03839690983295441,
-0.18922847509384155,
0.044064961373806,
0.09804948419332504,
0.019636282697319984,
-0.02394360862672329,
0.13499121367931366,
0.006203738506883383,
0.06706394255161285,
-0.05144776403903961,
0.015412659384310246,
-0.004359262064099312,
-0.0028118430636823177,
-0.07973574101924896,
-0.10027902573347092,
-0.07439863681793213,
-0.06933163106441498,
0.06935911625623703,
-0.11779599636793137,
0.022655857726931572,
-0.05564688891172409,
0.030871881172060966,
0.01544322818517685,
-0.08272179216146469,
-0.011687930673360825,
0.01966444030404091,
-0.05635913461446762,
-0.05972837284207344,
0.0422544963657856,
0.06924816220998764,
-0.0033870849292725325,
0.08890636265277863,
-0.050885602831840515,
-0.08502265810966492,
0.034935180097818375,
0.09191825985908508,
-0.11593807488679886,
0.012423459440469742,
-0.05758574604988098,
-0.04326597973704338,
-0.058888453990221024,
-0.0237398874014616,
0.07653509080410004,
-0.010072510689496994,
0.13396286964416504,
-0.07466823607683182,
-0.0036856255028396845,
0.015546574257314205,
-0.012913057580590248,
-0.02668727934360504,
0.039676908403635025,
0.0681309774518013,
-0.06244342029094696,
0.01393718458712101,
0.033161457628011703,
0.013834464363753796,
0.07027420401573181,
-0.052403975278139114,
-0.08143452554941177,
0.020827069878578186,
0.03510114178061485,
0.03232064098119736,
0.06631890684366226,
-0.030026879161596298,
-0.014859447255730629,
0.031357016414403915,
0.019330449402332306,
0.010699973441660404,
-0.10648338496685028,
0.05695430934429169,
0.05736005678772926,
0.007596229203045368,
0.06837992370128632,
-0.011559763923287392,
-0.04267745465040207,
0.07860002666711807,
0.031353291124105453,
-0.00417086947709322,
-0.013228349387645721,
-0.016668640077114105,
-0.11713678389787674,
0.19623401761054993,
-0.06802855432033539,
-0.1703801304101944,
-0.07501011341810226,
-0.1162782683968544,
-0.004065538756549358,
0.023938380181789398,
0.03981596603989601,
-0.03141739219427109,
-0.052889369428157806,
-0.1266544759273529,
0.06244959309697151,
-0.0462346076965332,
0.06588132679462433,
0.11573784053325653,
-0.04731079936027527,
0.05422435700893402,
-0.1267879754304886,
-0.008772418834269047,
-0.08023174107074738,
-0.07680384069681168,
0.06342191249132156,
-0.051012877374887466,
0.03644883632659912,
0.09370454400777817,
0.02792546898126602,
-0.018885625526309013,
-0.029294129461050034,
0.20498816668987274,
0.04440939426422119,
0.03840222209692001,
0.13379311561584473,
-0.05281646549701691,
0.05135064199566841,
0.08308384567499161,
0.013202889822423458,
-0.05064903944730759,
0.05814507603645325,
0.04479329660534859,
-0.07266268134117126,
-0.19976408779621124,
-0.021840402856469154,
-0.015097303315997124,
-0.038238413631916046,
0.06655368953943253,
0.03570067882537842,
0.0013897779863327742,
0.06960400193929672,
0.015900813043117523,
0.05676310136914253,
-0.0028007596265524626,
0.09931007027626038,
0.02975175343453884,
-0.03690129891037941,
0.09064187854528427,
-0.01823686994612217,
-0.004245582967996597,
0.08524254709482193,
-0.023857776075601578,
0.2896249294281006,
-0.03309283033013344,
0.012645944021642208,
0.1268583983182907,
0.04010576754808426,
0.05804882198572159,
0.13039341568946838,
-0.06873399019241333,
0.01721690408885479,
-0.07164587080478668,
-0.0598001666367054,
-0.0017241060268133879,
0.03472808003425598,
-0.07008539885282516,
0.008267967030405998,
-0.07583239674568176,
0.02601727284491062,
-0.015620347112417221,
0.29981890320777893,
0.09671095013618469,
-0.11114107072353363,
-0.054311130195856094,
-0.0017813140293583274,
-0.09203977137804031,
-0.06414081901311874,
0.04563641548156738,
0.07094500213861465,
-0.13241486251354218,
0.011118922382593155,
-0.026004355400800705,
0.06954455375671387,
-0.00847698375582695,
0.0190627109259367,
0.04624265059828758,
0.050184257328510284,
-0.042113956063985825,
0.0028805264737457037,
-0.19322049617767334,
0.19353985786437988,
0.007791312877088785,
0.020838718861341476,
-0.04930470511317253,
0.03556172549724579,
0.0101997135207057,
-0.030261121690273285,
0.06306806206703186,
0.019305618479847908,
-0.03613439202308655,
-0.05279430001974106,
-0.044699180871248245,
0.014743304811418056,
0.07798674702644348,
-0.0297868549823761,
0.1023368388414383,
-0.002816037507727742,
0.04507138207554817,
0.02016015537083149,
0.10182660818099976,
-0.18891428411006927,
-0.09704580157995224,
0.029556237161159515,
-0.05920470505952835,
-0.11766926199197769,
-0.07664211839437485,
-0.09443086385726929,
-0.015481469221413136,
0.25275301933288574,
-0.10910424590110779,
-0.07823451608419418,
-0.10059025883674622,
0.0242045596241951,
0.10564850270748138,
-0.04374250769615173,
0.028029950335621834,
-0.0067829168401658535,
0.11854305118322372,
-0.0674542561173439,
-0.13410164415836334,
0.02224264293909073,
-0.10033920407295227,
-0.1589038372039795,
-0.06581185758113861,
0.11186689138412476,
0.06238851323723793,
0.032621122896671295,
-0.032014235854148865,
0.0200349111109972,
0.037567511200904846,
-0.042356956750154495,
-0.0020496633369475603,
0.061628080904483795,
0.09922267496585846,
0.03603266552090645,
-0.11134151369333267,
0.010759121738374233,
-0.06304176151752472,
-0.06942100077867508,
0.07034735381603241,
0.2584554851055145,
-0.05064381659030914,
0.11995917558670044,
0.11606281250715256,
-0.07933030277490616,
-0.15534505248069763,
0.039114855229854584,
0.08723025768995285,
-0.020949650555849075,
0.018714070320129395,
-0.15393483638763428,
0.09310045093297958,
0.11476270854473114,
-0.01673947460949421,
-0.004193154629319906,
-0.18420745432376862,
-0.13025106489658356,
0.06387598067522049,
0.11023504287004471,
0.2629969120025635,
-0.06807110458612442,
-0.03706963360309601,
0.021449673920869827,
-0.09170267730951309,
0.010362541303038597,
0.12580052018165588,
0.06590108573436737,
-0.028702836483716965,
-0.08013525605201721,
0.010488749481737614,
-0.0448668971657753,
0.08867941051721573,
0.057557765394449234,
0.06271429359912872,
-0.006371160503476858,
0.02730359509587288,
-0.015309134498238564,
-0.040052108466625214,
0.06187152862548828,
0.020183216780424118,
0.04923493042588234,
-0.07959721982479095,
-0.02860633097589016,
-0.0737605020403862,
0.028517087921500206,
-0.02813350409269333,
-0.07651586830615997,
-0.059907689690589905,
0.0831385999917984,
0.046044863760471344,
-0.029818836599588394,
0.014532473869621754,
0.024111974984407425,
0.1187155693769455,
0.16957725584506989,
0.0020412043668329716,
-0.05942416936159134,
-0.055723901838064194,
-0.03314331918954849,
-0.0175690446048975,
0.06748417019844055,
-0.04197792708873749,
0.016912948340177536,
0.06551661342382431,
0.017380885779857635,
0.09339349716901779,
0.06010207533836365,
-0.11303946375846863,
-0.021803636103868484,
0.03896646201610565,
-0.16736403107643127,
0.03086373396217823,
0.0011275358265265822,
0.021301385015249252,
-0.03636536747217178,
0.028686467558145523,
0.14188548922538757,
-0.06101769581437111,
-0.032119397073984146,
-0.04235474765300751,
0.07213858515024185,
0.027881111949682236,
0.15487590432167053,
0.03144457936286926,
0.03637900575995445,
-0.08546368777751923,
0.12583182752132416,
0.03343243896961212,
-0.038887206465005875,
0.020337145775556564,
-0.013997226022183895,
-0.11346792429685593,
0.011087573133409023,
0.06543098390102386,
0.03381422534584999,
-0.05164933577179909,
-0.00715061416849494,
-0.02744276635348797,
-0.08019988238811493,
0.06256403774023056,
0.17944513261318207,
0.06457556784152985,
0.06726103276014328,
-0.05096130073070526,
-0.03788445517420769,
-0.0800461545586586,
0.04273485764861107,
0.03342311456799507,
0.07558665424585342,
-0.07477480173110962,
0.09804216772317886,
0.011908403597772121,
0.039675213396549225,
-0.029938826337456703,
-0.0476987361907959,
-0.10449835658073425,
-0.052397310733795166,
-0.09560037404298782,
0.004802245646715164,
-0.07737619429826736,
-0.038061656057834625,
-0.0028791462536901236,
0.002296833088621497,
-0.011205388233065605,
0.053477250039577484,
-0.05866950750350952,
-0.011823666281998158,
-0.017882008105516434,
0.03746616095304489,
-0.06355763226747513,
-0.035471659153699875,
0.024555793032050133,
-0.10160180926322937,
0.09007825702428818,
0.04451863095164299,
0.0028919472824782133,
0.009030609391629696,
0.084718257188797,
-0.023329606279730797,
0.022053435444831848,
0.015054010786116123,
-0.045088157057762146,
-0.08033411204814911,
0.001651703380048275,
-0.005106781609356403,
-0.02303326688706875,
-0.006367973051965237,
0.07840947061777115,
-0.08657015115022659,
0.04295405372977257,
-0.005703296512365341,
-0.0009917626157402992,
-0.0705391988158226,
-0.009865119121968746,
0.11005295813083649,
0.0938781350851059,
0.04398208111524582,
-0.09552707523107529,
0.00796166155487299,
-0.14091554284095764,
-0.041114553809165955,
0.006849132478237152,
-0.017541851848363876,
-0.1276405304670334,
-0.005404761992394924,
0.026444217190146446,
-0.003944727126508951,
0.21895554661750793,
-0.0516890250146389,
-0.01694929599761963,
0.021450001746416092,
-0.09363538026809692,
0.12031427025794983,
-0.025379158556461334,
0.1716967523097992,
-0.017889870330691338,
-0.04213152453303337,
-0.004469302948564291,
0.042663853615522385,
0.021468622609972954,
-0.019760126248002052,
0.1904611587524414,
0.1377628743648529,
0.024841133505105972,
0.03506028652191162,
-0.02075199969112873,
-0.0065321363508701324,
-0.0318596325814724,
-0.034559596329927444,
0.033528558909893036,
0.043891020119190216,
0.015510658733546734,
0.13221804797649384,
0.06737589091062546,
-0.16536571085453033,
0.0374378077685833,
-0.0369730181992054,
-0.040981829166412354,
-0.11297346651554108,
-0.08775968849658966,
-0.025992855429649353,
-0.0714983120560646,
0.011745594441890717,
-0.131841242313385,
0.0032737168949097395,
0.18288546800613403,
0.06442192941904068,
0.0259549580514431,
0.012113363482058048,
-0.1157989501953125,
-0.029874520376324654,
0.051712218672037125,
0.009759137406945229,
0.019260048866271973,
0.06064223498106003,
-0.0012818482937291265,
0.05660652369260788,
0.03581107035279274,
0.01334803830832243,
-0.0021067273337394,
0.07234552502632141,
0.01679864339530468,
0.03888893872499466,
-0.059619754552841187,
-0.004829390440136194,
-0.042973555624485016,
0.06847024708986282,
0.11123338341712952,
0.0466054230928421,
-0.05200308933854103,
-0.005938471760600805,
0.15524537861347198,
-0.034834325313568115,
0.005075419787317514,
-0.12548832595348358,
0.3202191889286041,
0.019078023731708527,
0.011102658696472645,
0.044679153710603714,
-0.07307935506105423,
-0.05087113380432129,
0.21567372977733612,
0.1000637635588646,
-0.017109017819166183,
-0.022498242557048798,
-0.0038985430728644133,
-0.030320458114147186,
-0.022710883989930153,
0.1562274992465973,
0.04111291840672493,
0.11354925483465195,
-0.05039740353822708,
-0.04641064628958702,
-0.02899644523859024,
-0.010038930922746658,
-0.12127907574176788,
0.12599706649780273,
-0.014136175625026226,
-0.025283826515078545,
-0.06559277325868607,
0.02610827051103115,
0.06742514669895172,
-0.318681538105011,
0.0024806580040603876,
-0.02939685434103012,
-0.1032605916261673,
-0.011064127087593079,
-0.02527758665382862,
-0.022821273654699326,
0.047334928065538406,
-0.043309904634952545,
0.07407397031784058,
0.029209094122052193,
0.03271237388253212,
-0.02263874001801014,
-0.10276197642087936,
0.16992244124412537,
0.05698546767234802,
0.09260818362236023,
0.024847405031323433,
0.07191722095012665,
0.05772166699171066,
0.03481572866439819,
-0.09942150115966797,
0.04883623123168945,
0.012822920456528664,
-0.08775917440652847,
-0.04929915815591812,
0.11639939248561859,
-0.00024839359684847295,
0.04678555205464363,
0.034684114158153534,
-0.09901834279298782,
0.017437230795621872,
0.06729936599731445,
-0.059554267674684525,
-0.09574931859970093,
-0.003928069490939379,
-0.09221183508634567,
0.16009429097175598,
0.14971867203712463,
-0.01327260211110115,
0.021457241848111153,
-0.07014400511980057,
-0.008260156027972698,
0.0498410202562809,
0.011606959626078606,
-0.022030213847756386,
-0.19843415915966034,
0.03519683703780174,
-0.0854039192199707,
-0.00995948538184166,
-0.22988826036453247,
-0.10068827122449875,
-0.009004661813378334,
-0.048914290964603424,
-0.03146525099873543,
0.062047891318798065,
0.03154648095369339,
0.0705735981464386,
-0.016911184415221214,
-0.015003588050603867,
-0.03134666755795479,
0.09385369718074799,
-0.10935094207525253,
-0.061112310737371445
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 160k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 160k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
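Since both pretraining objectives described above ship with the checkpoint, you can inspect the
two heads directly before moving to the generic usage snippets below. This is a minimal sketch,
assuming the weights load into `BertForPreTraining`:
```
import torch
from transformers import BertTokenizer, BertForPreTraining

# Sketch: load both pretraining heads of this intermediate checkpoint.
name = "google/multiberts-seed_3-step_160k"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForPreTraining.from_pretrained(name)

# Encode a sentence pair; the NSP head scores whether sentence B follows sentence A.
inputs = tokenizer("The cat sat on the mat.", "It then fell asleep.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.prediction_logits.shape)   # MLM logits: (batch, seq_len, vocab_size)
print(outputs.seq_relationship_logits)   # NSP logits; index 0 is the "is next" class in HF's convention
```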
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_160k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_160k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_160k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_160k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
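A common use of the intermediate checkpoints is to track how representations evolve over
pre-training. The sketch below is an illustration (not part of the original release) that
compares the `[CLS]` representation of one sentence between this 160k-step checkpoint and the
1600k-step checkpoint of the same seed:
```
import torch
from transformers import BertTokenizer, BertModel

# Sketch: compare [CLS] embeddings of the same sentence across two checkpoints
# of the same seed, as one might do in a training-dynamics analysis.
checkpoints = ["google/multiberts-seed_3-step_160k", "google/multiberts-seed_3-step_1600k"]
text = "Replace me by any text you'd like."

cls_vectors = []
for name in checkpoints:
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    with torch.no_grad():
        outputs = model(**tokenizer(text, return_tensors="pt"))
    cls_vectors.append(outputs.last_hidden_state[0, 0])  # [CLS] token embedding

similarity = torch.nn.functional.cosine_similarity(cls_vectors[0], cls_vectors[1], dim=0)
print(f"Cosine similarity of [CLS] embeddings across checkpoints: {similarity.item():.4f}")
```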
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_160k"]}
| null |
google/multiberts-seed_3-step_160k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_160k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 160k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 160k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 160k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 160k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_160k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 160k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 160k (max: 2000k, i.e., 2M steps)."
] |
[
-0.0772191733121872,
0.0836518257856369,
-0.0022168471477925777,
0.044523634016513824,
0.08887448906898499,
-0.017160961404442787,
0.07392597198486328,
0.09841809421777725,
-0.030218634754419327,
0.02253386750817299,
0.08131792396306992,
0.012280423194169998,
0.01666140928864479,
0.09928525239229202,
0.021242063492536545,
-0.2187906950712204,
0.022736363112926483,
-0.032615724951028824,
-0.07950536906719208,
0.0778244212269783,
0.09925428032875061,
-0.08146688342094421,
0.04871965944766998,
0.0248516034334898,
-0.11392219364643097,
0.0550699420273304,
-0.00798582099378109,
-0.01821630820631981,
0.12985223531723022,
-0.000527672644238919,
0.049378786236047745,
0.05391652137041092,
0.03734440729022026,
-0.13373637199401855,
0.006724259350448847,
0.05672511085867882,
0.05720742046833038,
0.03909347578883171,
0.030298450961709023,
0.07910119742155075,
0.008380151353776455,
0.024590305984020233,
0.04615609347820282,
0.025586562231183052,
-0.07562696933746338,
-0.050812941044569016,
-0.10422150045633316,
0.026893235743045807,
0.02349907159805298,
0.012217875570058823,
0.010607046075165272,
0.13089734315872192,
-0.03779531642794609,
0.0501885712146759,
0.19516521692276,
-0.33008965849876404,
-0.01752540096640587,
0.08386028558015823,
0.04707607626914978,
0.12429100275039673,
-0.0013267244212329388,
-0.016010258346796036,
0.07524751126766205,
0.03334541991353035,
0.09227991104125977,
-0.04134434834122658,
0.02707013301551342,
-0.0533122718334198,
-0.1618271917104721,
-0.04082302004098892,
0.10119662433862686,
-0.006523628253489733,
-0.13669857382774353,
-0.04096945747733116,
-0.03579705208539963,
0.03184850513935089,
0.011856505647301674,
-0.036366626620292664,
0.03177441656589508,
0.00910982396453619,
-0.016556870192289352,
-0.003137960098683834,
-0.10253942757844925,
-0.04994715750217438,
0.03437517210841179,
0.07116945087909698,
0.10276350378990173,
0.0625292956829071,
0.00407883245497942,
0.10828501731157303,
-0.18432743847370148,
-0.04734000191092491,
-0.02802053466439247,
-0.06132418289780617,
-0.04667751118540764,
-0.014602985233068466,
-0.10792341828346252,
-0.050199855118989944,
0.0136099923402071,
0.12617751955986023,
-0.014218420721590519,
0.0325465090572834,
-0.03081226907670498,
0.003342168405652046,
0.05848708748817444,
0.04096730053424835,
-0.011044833809137344,
0.01880383864045143,
0.021883787587285042,
-0.010135290212929249,
-0.027092641219496727,
0.016625063493847847,
0.006179372780025005,
0.034791234880685806,
0.11749769002199173,
0.02981017902493477,
-0.10356973111629486,
0.0784248411655426,
-0.017109524458646774,
-0.046970583498477936,
0.02325735241174698,
-0.09140496701002121,
-0.06537598371505737,
-0.042383331805467606,
0.0037065751384943724,
0.019785795360803604,
-0.00557190366089344,
-0.011243083514273167,
-0.023338057100772858,
-0.03704529628157616,
-0.0817253440618515,
-0.04873804375529289,
-0.05565248057246208,
-0.13214023411273956,
0.004401061683893204,
-0.18381153047084808,
-0.03749031946063042,
-0.11823691427707672,
-0.189494326710701,
-0.025041239336133003,
0.059173017740249634,
-0.011923541314899921,
-0.04832512512803078,
0.07966973632574081,
0.03865870460867882,
-0.03220655769109726,
-0.0024350823368877172,
0.08309470117092133,
-0.0055691273882985115,
0.039332982152700424,
-0.0180127564817667,
0.06511678546667099,
0.010788116604089737,
0.03608015179634094,
-0.05326401814818382,
0.05968670919537544,
-0.17097817361354828,
0.04363635182380676,
-0.0801117867231369,
-0.019507212564349174,
-0.08510191738605499,
-0.03747367486357689,
-0.0038976119831204414,
0.009771620854735374,
0.02262076549232006,
0.07722346484661102,
-0.18097561597824097,
-0.026211248710751534,
0.10664351284503937,
-0.15858125686645508,
-0.02417203038930893,
0.07225164026021957,
-0.04452216252684593,
0.09256674349308014,
0.06999366730451584,
0.1522984653711319,
-0.011760417371988297,
-0.08492985367774963,
0.05633477494120598,
-0.013041860423982143,
0.014544387347996235,
-0.011124134995043278,
0.06665745377540588,
-0.019397325813770294,
-0.1558166891336441,
0.04014933481812477,
-0.13454122841358185,
-0.0006957773002795875,
-0.07768197357654572,
0.0216104406863451,
-0.00711808493360877,
-0.06885375082492828,
-0.07591512054204941,
-0.02507726475596428,
0.06381138414144516,
-0.07870566844940186,
-0.013475940562784672,
0.035760898143053055,
0.06880800426006317,
-0.07507764548063278,
0.06529314070940018,
-0.01316617801785469,
0.015668999403715134,
-0.08264722675085068,
-0.03901686519384384,
-0.19006992876529694,
0.04422900080680847,
0.09835189580917358,
0.023269744589924812,
-0.023293213918805122,
0.14301304519176483,
0.007496470585465431,
0.06634488701820374,
-0.04953504353761673,
0.013316642493009567,
-0.004140981938689947,
-0.003288442036136985,
-0.0821302980184555,
-0.09879136085510254,
-0.07622183114290237,
-0.07185251265764236,
0.0679553896188736,
-0.12036634981632233,
0.020427437499165535,
-0.05638587474822998,
0.03598371893167496,
0.017881330102682114,
-0.08180603384971619,
-0.008684935979545116,
0.019284920766949654,
-0.05869303643703461,
-0.05987874045968056,
0.04220236837863922,
0.06841137260198593,
-0.005345956422388554,
0.09136676043272018,
-0.052240099757909775,
-0.08147929608821869,
0.03258151933550835,
0.09688114374876022,
-0.1125037744641304,
0.009602916426956654,
-0.05784985423088074,
-0.042064473032951355,
-0.059394754469394684,
-0.021408632397651672,
0.0784563422203064,
-0.008631955832242966,
0.13442634046077728,
-0.07448760420084,
-0.006824786309152842,
0.010325053706765175,
-0.016423599794507027,
-0.028126798570156097,
0.04126085713505745,
0.06412777304649353,
-0.072676882147789,
0.01688702031970024,
0.03738052025437355,
0.011175382882356644,
0.07782626897096634,
-0.05283372104167938,
-0.0840642973780632,
0.019992392510175705,
0.04017958417534828,
0.03323546424508095,
0.06367892026901245,
-0.02488386631011963,
-0.016282562166452408,
0.03137631341814995,
0.017690543085336685,
0.009628383442759514,
-0.10620983690023422,
0.057667914777994156,
0.05655974522233009,
0.0053452057763934135,
0.07531779259443283,
-0.010509326122701168,
-0.04299594461917877,
0.07799912244081497,
0.032077331095933914,
-0.004727420397102833,
-0.013848278671503067,
-0.015585520304739475,
-0.1155092790722847,
0.1970665156841278,
-0.06606748700141907,
-0.17002686858177185,
-0.06932012736797333,
-0.11434545367956161,
-0.005126886535435915,
0.024130934849381447,
0.038692932575941086,
-0.02937180921435356,
-0.05179816484451294,
-0.12841545045375824,
0.06566201895475388,
-0.04377970099449158,
0.06691960990428925,
0.10916147381067276,
-0.045925889164209366,
0.05468772351741791,
-0.12727829813957214,
-0.008844840340316296,
-0.08059284836053848,
-0.08027272671461105,
0.06137558072805405,
-0.05301130935549736,
0.03381422534584999,
0.0903327465057373,
0.030505700036883354,
-0.017659815028309822,
-0.02907671593129635,
0.20247972011566162,
0.04685938358306885,
0.034701671451330185,
0.13569669425487518,
-0.05409146100282669,
0.050397954881191254,
0.07944198697805405,
0.010018576867878437,
-0.05014638602733612,
0.057885751128196716,
0.046531591564416885,
-0.07090235501527786,
-0.19736739993095398,
-0.020981837064027786,
-0.01390103716403246,
-0.03873886168003082,
0.06955578923225403,
0.03783433511853218,
-0.0009491368546150625,
0.0697302594780922,
0.012553131207823753,
0.059651512652635574,
-0.004092616494745016,
0.09901372343301773,
0.021520104259252548,
-0.03602014109492302,
0.09146978706121445,
-0.019348695874214172,
-0.006272084545344114,
0.0832151472568512,
-0.019769320264458656,
0.2893589735031128,
-0.03267641365528107,
0.010349060408771038,
0.12814854085445404,
0.037748757749795914,
0.05913133546710014,
0.13064755499362946,
-0.07078133523464203,
0.016222834587097168,
-0.07314880192279816,
-0.06127002835273743,
-0.001870281994342804,
0.035734664648771286,
-0.06058434769511223,
0.010390646755695343,
-0.07277168333530426,
0.021601930260658264,
-0.01655528135597706,
0.30116337537765503,
0.0997571051120758,
-0.11115781962871552,
-0.05239279568195343,
0.0004886918468400836,
-0.09484254568815231,
-0.06633904576301575,
0.04319243133068085,
0.06677347421646118,
-0.1345868557691574,
0.010368593037128448,
-0.028437543660402298,
0.07103554904460907,
-0.015142526477575302,
0.018908068537712097,
0.04282328486442566,
0.049529001116752625,
-0.04034342244267464,
0.00476920697838068,
-0.20323221385478973,
0.1923784762620926,
0.007355362642556429,
0.02219538576900959,
-0.04969383776187897,
0.0353795662522316,
0.009401090443134308,
-0.03209664672613144,
0.061257194727659225,
0.01699850894510746,
-0.02657655067741871,
-0.05013396963477135,
-0.04603370279073715,
0.014729388058185577,
0.07685074210166931,
-0.03547760844230652,
0.10492982715368271,
-0.0041256677359342575,
0.04526268690824509,
0.018511757254600525,
0.09926950186491013,
-0.19191692769527435,
-0.09589888155460358,
0.029497886076569557,
-0.05973227322101593,
-0.11190520226955414,
-0.07692545652389526,
-0.0951966792345047,
-0.013879356905817986,
0.253971129655838,
-0.1132846400141716,
-0.07786578685045242,
-0.09908950328826904,
0.025973130017518997,
0.10305401682853699,
-0.04440400004386902,
0.026611315086483955,
-0.008334936574101448,
0.12046834081411362,
-0.06890976428985596,
-0.13611839711666107,
0.022534122690558434,
-0.10134607553482056,
-0.1606050729751587,
-0.06487813591957092,
0.1121126040816307,
0.06452514976263046,
0.03315923735499382,
-0.03052622079849243,
0.018467331305146217,
0.03672969713807106,
-0.039996419101953506,
-0.004779672250151634,
0.07004974037408829,
0.09785116463899612,
0.036758631467819214,
-0.1108601912856102,
0.009149334393441677,
-0.06366246938705444,
-0.06741998344659805,
0.07327506691217422,
0.25846153497695923,
-0.048880644142627716,
0.12120947241783142,
0.10929390788078308,
-0.07880102097988129,
-0.15171782672405243,
0.03663666546344757,
0.09358984231948853,
-0.019565314054489136,
0.02219207026064396,
-0.15615108609199524,
0.0915258601307869,
0.11216937750577927,
-0.018111690878868103,
0.00253877229988575,
-0.18439072370529175,
-0.12742605805397034,
0.06960701942443848,
0.10919290781021118,
0.2625758945941925,
-0.06907728314399719,
-0.03774389997124672,
0.018559269607067108,
-0.09096858650445938,
0.0178547240793705,
0.12564875185489655,
0.06761026382446289,
-0.027623983100056648,
-0.0786270722746849,
0.010851888917386532,
-0.04262551665306091,
0.08898098766803741,
0.05474533140659332,
0.062245432287454605,
-0.006312059238553047,
0.02687779627740383,
-0.02129150740802288,
-0.041635531932115555,
0.0651162713766098,
0.013345875777304173,
0.045976750552654266,
-0.08344962447881699,
-0.030130859464406967,
-0.07318563759326935,
0.03204017132520676,
-0.025842705741524696,
-0.07703810930252075,
-0.05702658370137215,
0.0817297175526619,
0.044769611209630966,
-0.02677321620285511,
0.016583679243922234,
0.021745963022112846,
0.11837480217218399,
0.16532368957996368,
0.003956882748752832,
-0.050145212560892105,
-0.058132681995630264,
-0.03739241883158684,
-0.015924448147416115,
0.07055551558732986,
-0.03899708017706871,
0.01578563265502453,
0.06284338235855103,
0.01649795100092888,
0.09428762644529343,
0.05828980728983879,
-0.11753486096858978,
-0.020790837705135345,
0.035548269748687744,
-0.16837535798549652,
0.03218339756131172,
0.0010578444926068187,
0.029520733281970024,
-0.03492230549454689,
0.03378860279917717,
0.14622311294078827,
-0.06236782670021057,
-0.033470410853624344,
-0.04109684377908707,
0.0701545923948288,
0.02607150562107563,
0.15131108462810516,
0.03184502199292183,
0.0362032875418663,
-0.08371501415967941,
0.1232544481754303,
0.03061087243258953,
-0.032177407294511795,
0.02330145798623562,
-0.019667502492666245,
-0.11071915924549103,
0.01073425728827715,
0.06675603985786438,
0.03944884240627289,
-0.050963133573532104,
-0.008640619926154613,
-0.027165375649929047,
-0.07785481959581375,
0.06307409703731537,
0.1846006065607071,
0.06717988103628159,
0.07127828896045685,
-0.053533416241407394,
-0.037867650389671326,
-0.07789011299610138,
0.04052136465907097,
0.035022296011447906,
0.07691728323698044,
-0.07717441767454147,
0.09542098641395569,
0.011669691652059555,
0.03941147401928902,
-0.030338721349835396,
-0.047822967171669006,
-0.10574888437986374,
-0.05284129083156586,
-0.09095655381679535,
0.005757762584835291,
-0.07657239586114883,
-0.03849351033568382,
-0.0014955177903175354,
0.0018608190584927797,
-0.007805441040545702,
0.05394865944981575,
-0.05888758599758148,
-0.010986908338963985,
-0.019719209522008896,
0.03795117512345314,
-0.06413263082504272,
-0.03568695858120918,
0.023531055077910423,
-0.10222992300987244,
0.08964300900697708,
0.044000960886478424,
0.0031217518262565136,
0.010317398235201836,
0.07348735630512238,
-0.024181799963116646,
0.024561291560530663,
0.012026369571685791,
-0.0464487187564373,
-0.07944276928901672,
0.0008463087142445147,
-0.004103206563740969,
-0.019342022016644478,
-0.007234690245240927,
0.0824924036860466,
-0.08680078387260437,
0.03844382241368294,
-0.004340364597737789,
0.0016235258663073182,
-0.06939837336540222,
-0.008323971182107925,
0.10637299716472626,
0.0963440090417862,
0.046284839510917664,
-0.09498494118452072,
0.009440706111490726,
-0.1387443244457245,
-0.04037805274128914,
0.006231274921447039,
-0.017302684485912323,
-0.13197733461856842,
-0.0050242445431649685,
0.025344710797071457,
-0.0038051027804613113,
0.21760332584381104,
-0.050549525767564774,
-0.01371364202350378,
0.018763311207294464,
-0.09359467029571533,
0.12465742230415344,
-0.026308216154575348,
0.1730497032403946,
-0.017792196944355965,
-0.038928888738155365,
-0.00811042357236147,
0.04215512424707413,
0.019320784136652946,
-0.025074148550629616,
0.1870940923690796,
0.13697412610054016,
0.018901601433753967,
0.03815438970923424,
-0.02308262325823307,
-0.0037989935372024775,
-0.03709302097558975,
-0.035035453736782074,
0.0362280011177063,
0.044711410999298096,
0.015410210937261581,
0.14213594794273376,
0.06848259270191193,
-0.1662554293870926,
0.033846415579319,
-0.0338297113776207,
-0.038273897022008896,
-0.11327415704727173,
-0.0993926003575325,
-0.02776888944208622,
-0.07054746896028519,
0.01278552133589983,
-0.129551500082016,
0.0046259150840342045,
0.1853540539741516,
0.06411885470151901,
0.02650270238518715,
0.011234278790652752,
-0.11367138475179672,
-0.032743219286203384,
0.05610090121626854,
0.008373454213142395,
0.019201407209038734,
0.05665107071399689,
0.0021678535267710686,
0.05699949339032173,
0.03554896265268326,
0.013841448351740837,
-0.0020623700693249702,
0.07374868541955948,
0.017619790509343147,
0.03997288644313812,
-0.06171268969774246,
-0.0035419082269072533,
-0.04145144298672676,
0.06757577508687973,
0.11489057540893555,
0.047739170491695404,
-0.05401138961315155,
-0.006453540176153183,
0.15175166726112366,
-0.03474636748433113,
0.0042578051798045635,
-0.1246332898736,
0.3231988549232483,
0.01752353273332119,
0.01283714547753334,
0.04247182235121727,
-0.07098188251256943,
-0.04770173504948616,
0.2066379189491272,
0.0952160656452179,
-0.01811768114566803,
-0.02078055404126644,
-0.0047827004455029964,
-0.029996061697602272,
-0.023770011961460114,
0.15440554916858673,
0.041500456631183624,
0.1195785403251648,
-0.05356801673769951,
-0.04795975610613823,
-0.028544176369905472,
-0.010719146579504013,
-0.12187283486127853,
0.11996309459209442,
-0.014882509596645832,
-0.02334127575159073,
-0.06513875722885132,
0.026619259268045425,
0.06552518159151077,
-0.3176104426383972,
-0.002135068876668811,
-0.026877759024500847,
-0.10471727699041367,
-0.013047912158071995,
-0.01988334208726883,
-0.021749379113316536,
0.04689950868487358,
-0.04537973180413246,
0.07393072545528412,
0.030710261315107346,
0.032500218600034714,
-0.021596509963274002,
-0.09826689958572388,
0.1679438203573227,
0.05820372700691223,
0.08798734843730927,
0.024141626432538033,
0.07042520493268967,
0.056478243321180344,
0.03343722224235535,
-0.09710640460252762,
0.05045691505074501,
0.012490524910390377,
-0.08940102159976959,
-0.04810851812362671,
0.11615259200334549,
0.0011158320121467113,
0.04991230368614197,
0.03590089827775955,
-0.09962861984968185,
0.02027820236980915,
0.06925321370363235,
-0.06119506061077118,
-0.09841335564851761,
0.0011921041877940297,
-0.09334352612495422,
0.15989352762699127,
0.14938601851463318,
-0.013727785088121891,
0.017989352345466614,
-0.07233207672834396,
-0.003945157863199711,
0.048729315400123596,
0.010375745594501495,
-0.022491229698061943,
-0.19290603697299957,
0.03458848595619202,
-0.078715018928051,
-0.008860261179506779,
-0.22722917795181274,
-0.10011067241430283,
-0.010802679695189,
-0.04982296749949455,
-0.03324073925614357,
0.05843610689043999,
0.029941165819764137,
0.0716792419552803,
-0.017692137509584427,
-0.02279193326830864,
-0.03149823099374771,
0.09104296565055847,
-0.11192414164543152,
-0.06200968846678734
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel

# Load the tokenizer and the TensorFlow weights for this intermediate checkpoint.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1700k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1700k")

text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')  # TensorFlow tensors
output = model(encoded_input)  # includes last_hidden_state and pooler_output
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Load the tokenizer and the PyTorch weights for this intermediate checkpoint.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1700k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1700k")

text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')  # PyTorch tensors
output = model(**encoded_input)  # includes last_hidden_state and pooler_output
```
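Building on the PyTorch snippet above, here is a minimal sketch of turning the raw hidden states into fixed-size sentence embeddings. The example sentence and the mean-pooling strategy are illustrative choices, not part of the original MultiBERTs release; the snippet reuses the `tokenizer` and `model` objects defined above.
```
import torch

# Reuses `tokenizer` and `model` from the PyTorch example above.
sentences = ["MultiBERTs provides 25 reproductions of BERT-base."]
encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**encoded)

# Mean-pool the final hidden states over non-padding tokens.
mask = encoded["attention_mask"].unsqueeze(-1).float()
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # torch.Size([1, 768])
```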
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1700k"]}
| null |
google/multiberts-seed_3-step_1700k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1700k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1700k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1700k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1700k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1700k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1700k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1700k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07993607968091965,
0.08150400966405869,
-0.001985288457944989,
0.044827885925769806,
0.08245391398668289,
-0.016176486387848854,
0.0688486322760582,
0.0967995673418045,
-0.025591855868697166,
0.02288213185966015,
0.08627689629793167,
0.013515608385205269,
0.016969867050647736,
0.09868446737527847,
0.021743828430771828,
-0.2176879495382309,
0.02555297128856182,
-0.031131962314248085,
-0.0732995942234993,
0.07920537889003754,
0.09801513701677322,
-0.0839538425207138,
0.048142995685338974,
0.024578431621193886,
-0.11129077523946762,
0.05229749158024788,
-0.007407034747302532,
-0.02371290698647499,
0.12984302639961243,
0.0002799411886371672,
0.044492729008197784,
0.055714771151542664,
0.03765396773815155,
-0.13076604902744293,
0.007101490627974272,
0.05574068799614906,
0.05727408826351166,
0.0388110876083374,
0.025912953540682793,
0.08059579133987427,
0.010568180121481419,
0.028665898367762566,
0.04981343075633049,
0.027044134214520454,
-0.07577475905418396,
-0.054822683334350586,
-0.10467119514942169,
0.03983649984002113,
0.02464863285422325,
0.009451223537325859,
0.011942536570131779,
0.13452383875846863,
-0.03598935529589653,
0.05047788843512535,
0.18641959130764008,
-0.32876625657081604,
-0.019702943041920662,
0.0890415608882904,
0.04944271221756935,
0.13418185710906982,
-0.002086749067530036,
-0.01521000824868679,
0.07628219574689865,
0.03273088485002518,
0.09584395587444305,
-0.041664764285087585,
0.033912066370248795,
-0.053005244582891464,
-0.1628599315881729,
-0.043037232011556625,
0.09709074348211288,
-0.0023877655621618032,
-0.13454106450080872,
-0.046038851141929626,
-0.034834668040275574,
0.025229429826140404,
0.010619975626468658,
-0.03951716423034668,
0.02902490645647049,
0.008667262271046638,
-0.015729334205389023,
-0.00585015956312418,
-0.10110767930746078,
-0.04985308647155762,
0.03697812184691429,
0.07634653151035309,
0.10275949537754059,
0.06403500586748123,
0.0036446659360080957,
0.10692229866981506,
-0.17817874252796173,
-0.04505302011966705,
-0.030268901959061623,
-0.06443751603364944,
-0.0487111397087574,
-0.016118517145514488,
-0.1047501191496849,
-0.043385304510593414,
0.009443067945539951,
0.12961901724338531,
-0.009952130727469921,
0.037035513669252396,
-0.02945498563349247,
0.004742809571325779,
0.058654870837926865,
0.0403587631881237,
-0.012774079106748104,
0.014709542505443096,
0.021668026223778725,
-0.005679013207554817,
-0.025036100298166275,
0.015408252365887165,
0.010609353892505169,
0.03305787220597267,
0.11924020946025848,
0.030426153913140297,
-0.09968914091587067,
0.07433826476335526,
-0.013573043048381805,
-0.04899570345878601,
0.033487770706415176,
-0.08783452212810516,
-0.06482895463705063,
-0.04432522505521774,
0.005720421206206083,
0.019349712878465652,
-0.007528678979724646,
-0.010531547479331493,
-0.023391107097268105,
-0.039128627628088,
-0.08603551238775253,
-0.05439288169145584,
-0.05194826051592827,
-0.1281576305627823,
0.003496760269626975,
-0.17787674069404602,
-0.039879463613033295,
-0.11646772176027298,
-0.1921502649784088,
-0.027275029569864273,
0.05919348821043968,
-0.011808672919869423,
-0.05070807784795761,
0.08384265750646591,
0.04234934598207474,
-0.0306540559977293,
-0.0036965468898415565,
0.07855398207902908,
-0.003280022880062461,
0.03972393646836281,
-0.023153571411967278,
0.06765615195035934,
0.0114638302475214,
0.03477679193019867,
-0.054380204528570175,
0.06031849980354309,
-0.18015824258327484,
0.04396691173315048,
-0.08179351687431335,
-0.015791572630405426,
-0.084969662129879,
-0.0400928296148777,
-0.006247231736779213,
0.009155224077403545,
0.02020399644970894,
0.07511738687753677,
-0.17228162288665771,
-0.025925058871507645,
0.11127325892448425,
-0.16138488054275513,
-0.029464643448591232,
0.07302034646272659,
-0.04245506227016449,
0.08887525647878647,
0.07140269875526428,
0.1487840712070465,
-0.00762310903519392,
-0.07857465744018555,
0.05806545168161392,
-0.015581533312797546,
0.01039504911750555,
-0.019903363659977913,
0.06521347910165787,
-0.020779354497790337,
-0.16678208112716675,
0.03925715759396553,
-0.132464200258255,
-0.0005442705005407333,
-0.07693792134523392,
0.019328942522406578,
-0.008116748183965683,
-0.06852566450834274,
-0.08300717920064926,
-0.0246853306889534,
0.06267832219600677,
-0.08129622787237167,
-0.01092977449297905,
0.024291301146149635,
0.07130281627178192,
-0.07472331821918488,
0.06672927737236023,
-0.009933250956237316,
0.013288630172610283,
-0.07740318775177002,
-0.03732452541589737,
-0.18733443319797516,
0.049104418605566025,
0.09891454875469208,
0.01764715649187565,
-0.02424624375998974,
0.13143695890903473,
0.006250739563256502,
0.06687399744987488,
-0.050243720412254333,
0.012238322757184505,
-0.0002483550342731178,
-0.0028963785152882338,
-0.08185617625713348,
-0.10174006223678589,
-0.07603409886360168,
-0.07085427641868591,
0.06882362812757492,
-0.12028738111257553,
0.021029477939009666,
-0.0632108524441719,
0.03480684384703636,
0.016097623854875565,
-0.08382157981395721,
-0.00986627209931612,
0.01907399855554104,
-0.056355930864810944,
-0.05905773490667343,
0.04179198294878006,
0.0675695389509201,
-0.008614844642579556,
0.09049312025308609,
-0.05480459704995155,
-0.08495525270700455,
0.03305903449654579,
0.09736333787441254,
-0.11210091412067413,
0.0035858014598488808,
-0.05844060704112053,
-0.04361326992511749,
-0.0579577311873436,
-0.022365758195519447,
0.07178958505392075,
-0.008165581151843071,
0.13322436809539795,
-0.07757014036178589,
-0.005951184779405594,
0.014734159223735332,
-0.011878643184900284,
-0.032137930393218994,
0.037740517407655716,
0.06945033371448517,
-0.08250238746404648,
0.013453008607029915,
0.0378456749022007,
0.017898570746183395,
0.07612426578998566,
-0.049770537763834,
-0.08110229671001434,
0.02020297758281231,
0.03898299112915993,
0.030216602608561516,
0.06311360001564026,
-0.025172924622893333,
-0.015628160908818245,
0.029891494661569595,
0.020537512376904488,
0.011273162439465523,
-0.10575513541698456,
0.05604610592126846,
0.05626378580927849,
0.008848888799548149,
0.07263046503067017,
-0.010033659636974335,
-0.043027881532907486,
0.07834630459547043,
0.03387751057744026,
-0.008210685104131699,
-0.014351865276694298,
-0.014889085665345192,
-0.11784154921770096,
0.19872328639030457,
-0.06477338075637817,
-0.17139995098114014,
-0.07026056200265884,
-0.11390180885791779,
-0.005627365782856941,
0.021003752946853638,
0.03922734409570694,
-0.029113536700606346,
-0.05140483006834984,
-0.1297866255044937,
0.0586068220436573,
-0.04601471498608589,
0.06837935000658035,
0.11408261209726334,
-0.04807944595813751,
0.04997001588344574,
-0.12776318192481995,
-0.009890202432870865,
-0.08412372320890427,
-0.07226935774087906,
0.060998860746622086,
-0.054670821875333786,
0.03261447697877884,
0.09679654985666275,
0.032648179680109024,
-0.016813181340694427,
-0.030295755714178085,
0.20625115931034088,
0.04634582996368408,
0.0363587811589241,
0.12918280065059662,
-0.051192961633205414,
0.05101780593395233,
0.07807808369398117,
0.011862410232424736,
-0.0521661713719368,
0.05641373619437218,
0.043173838406801224,
-0.07144822925329208,
-0.19592523574829102,
-0.023693308234214783,
-0.013654729351401329,
-0.0371825248003006,
0.07439820468425751,
0.03760676458477974,
0.006252632476389408,
0.06910625845193863,
0.010564015246927738,
0.05486689507961273,
-0.0012504328042268753,
0.10102034360170364,
0.021214457228779793,
-0.036405280232429504,
0.09104948490858078,
-0.01863156072795391,
-0.001745370915159583,
0.08459028601646423,
-0.020418796688318253,
0.29134219884872437,
-0.028760647401213646,
0.008568575605750084,
0.12846873700618744,
0.0364408977329731,
0.05877542868256569,
0.128721222281456,
-0.07145729660987854,
0.014900455251336098,
-0.07300978153944016,
-0.06060471385717392,
0.00004451493077795021,
0.03741423785686493,
-0.061519622802734375,
0.009982829913496971,
-0.0744507908821106,
0.023118143901228905,
-0.015800898894667625,
0.3056241571903229,
0.09467986226081848,
-0.11045313626527786,
-0.05382034182548523,
0.0008577318512834609,
-0.09642953425645828,
-0.06637032330036163,
0.0472637377679348,
0.06307366490364075,
-0.13249929249286652,
0.013557269237935543,
-0.025327516719698906,
0.07143693417310715,
-0.012046088464558125,
0.01662430725991726,
0.03888470679521561,
0.04859446361660957,
-0.04141310602426529,
0.0010795791167765856,
-0.18793654441833496,
0.1978396773338318,
0.006321223918348551,
0.020736275240778923,
-0.05110789090394974,
0.03285906836390495,
0.009167280048131943,
-0.026831427589058876,
0.06289120763540268,
0.01951456628739834,
-0.029753193259239197,
-0.05003570020198822,
-0.045504190027713776,
0.013078582473099232,
0.07329532504081726,
-0.03135397657752037,
0.10119441151618958,
-0.0027169305831193924,
0.04359172657132149,
0.019219553098082542,
0.09301415085792542,
-0.19073724746704102,
-0.09375833719968796,
0.02955583482980728,
-0.05563540384173393,
-0.11564882099628448,
-0.07755037397146225,
-0.09561779350042343,
-0.017567601054906845,
0.2517697811126709,
-0.10987866669893265,
-0.07558601349592209,
-0.09839506447315216,
0.024568121880292892,
0.10703282803297043,
-0.04508385807275772,
0.028769411146640778,
-0.009899673983454704,
0.11757627129554749,
-0.06898903101682663,
-0.1324131339788437,
0.024669110774993896,
-0.1021764799952507,
-0.15993250906467438,
-0.06467864662408829,
0.11263395845890045,
0.06511759757995605,
0.031512290239334106,
-0.029514851048588753,
0.018167564645409584,
0.03732584789395332,
-0.044178903102874756,
-0.00171363796107471,
0.06787281483411789,
0.0941658541560173,
0.03668660297989845,
-0.1051172986626625,
0.004642832558602095,
-0.06378237903118134,
-0.06989584118127823,
0.07303578406572342,
0.26422518491744995,
-0.05158894881606102,
0.12137962877750397,
0.11991701275110245,
-0.07582135498523712,
-0.14771676063537598,
0.03743370249867439,
0.08431581407785416,
-0.022003579884767532,
0.017136838287115097,
-0.15080493688583374,
0.09219098836183548,
0.11536340415477753,
-0.019058631733059883,
-0.004745915066450834,
-0.19025912880897522,
-0.12940454483032227,
0.073188416659832,
0.11010990291833878,
0.2694857120513916,
-0.06598249077796936,
-0.037802472710609436,
0.022804727777838707,
-0.08819053322076797,
0.015938961878418922,
0.13181109726428986,
0.0655469298362732,
-0.030738746747374535,
-0.08997106552124023,
0.010133245028555393,
-0.04585152119398117,
0.08885689824819565,
0.05508089438080788,
0.06184760481119156,
-0.0033566495403647423,
0.0306075569242239,
-0.024805458262562752,
-0.04182824119925499,
0.06410124152898788,
0.016857637092471123,
0.04985902085900307,
-0.07991593331098557,
-0.03157588094472885,
-0.07624897360801697,
0.028213774785399437,
-0.026051228865981102,
-0.07595085352659225,
-0.05898392200469971,
0.08460573852062225,
0.047219838947057724,
-0.028551582247018814,
0.01705019548535347,
0.020832713693380356,
0.12057989090681076,
0.16443440318107605,
0.003656282089650631,
-0.06375084817409515,
-0.056156426668167114,
-0.03836553916335106,
-0.01942206360399723,
0.07078786939382553,
-0.03797416388988495,
0.01432902179658413,
0.06278689950704575,
0.01729741506278515,
0.09482388943433762,
0.058282520622015,
-0.11487310379743576,
-0.02324482798576355,
0.03792565315961838,
-0.16617555916309357,
0.027897479012608528,
0.0012185289524495602,
0.029447970911860466,
-0.03384510427713394,
0.03243725001811981,
0.14054721593856812,
-0.06137606129050255,
-0.03402108699083328,
-0.04185980558395386,
0.07078481465578079,
0.028011072427034378,
0.14762449264526367,
0.03070506826043129,
0.036603331565856934,
-0.08427901566028595,
0.12294919043779373,
0.03358342871069908,
-0.04116516932845116,
0.020661097019910812,
-0.010902920737862587,
-0.11249788105487823,
0.010280568152666092,
0.05827166885137558,
0.03069373406469822,
-0.05730023607611656,
-0.006522939540445805,
-0.024395622313022614,
-0.07767610251903534,
0.06752432137727737,
0.18403595685958862,
0.0654349997639656,
0.06709111481904984,
-0.053557027131319046,
-0.038828782737255096,
-0.07757165282964706,
0.03868402540683746,
0.031133096665143967,
0.07603984326124191,
-0.07648928463459015,
0.09234365075826645,
0.015854468569159508,
0.0386490672826767,
-0.029474396258592606,
-0.05114097148180008,
-0.10459185391664505,
-0.05206172913312912,
-0.09520980715751648,
0.0006321279215626419,
-0.08509226143360138,
-0.03604469820857048,
-0.004292353522032499,
0.0014149363851174712,
-0.011370246298611164,
0.05147645249962807,
-0.05829046294093132,
-0.010755058377981186,
-0.01643175631761551,
0.039378710091114044,
-0.0644271969795227,
-0.03435908257961273,
0.023872846737504005,
-0.10068189352750778,
0.09142571687698364,
0.041499294340610504,
0.0027550312224775553,
0.006567450240254402,
0.0880991667509079,
-0.022758157923817635,
0.023148123174905777,
0.011659047566354275,
-0.04608336463570595,
-0.08292335271835327,
0.00034697033697739244,
-0.0035489096771925688,
-0.022598566487431526,
-0.0038466493133455515,
0.07847926765680313,
-0.08702052384614944,
0.03620782867074013,
-0.0028490019030869007,
-0.0025999422650784254,
-0.07011038810014725,
-0.009680292569100857,
0.10841955244541168,
0.09057492762804031,
0.04319676384329796,
-0.09374945610761642,
0.0074291531927883625,
-0.1413230001926422,
-0.039707135409116745,
0.004789478611201048,
-0.017309948801994324,
-0.1285419911146164,
-0.001730593154206872,
0.024654854089021683,
-0.0035761564504355192,
0.2191166877746582,
-0.059401921927928925,
-0.01932225190103054,
0.023101599887013435,
-0.09424249827861786,
0.11652544140815735,
-0.02739378623664379,
0.17842304706573486,
-0.015235628001391888,
-0.04086194187402725,
-0.00028484631911851466,
0.04258650168776512,
0.021355008706450462,
-0.01790367253124714,
0.19195222854614258,
0.13780330121517181,
0.03039419837296009,
0.0384238138794899,
-0.022587619721889496,
-0.004303134977817535,
-0.04193320870399475,
-0.037535883486270905,
0.03587706387042999,
0.04525981470942497,
0.017397189512848854,
0.13094523549079895,
0.07267018407583237,
-0.16582641005516052,
0.03314000368118286,
-0.03206579014658928,
-0.03806711733341217,
-0.11453002691268921,
-0.09458374977111816,
-0.02570277638733387,
-0.07133087515830994,
0.011845363304018974,
-0.13026843965053558,
0.003148408140987158,
0.17097359895706177,
0.06562937051057816,
0.02759067714214325,
0.010442634113132954,
-0.1121295765042305,
-0.030507199466228485,
0.05552927032113075,
0.011356932111084461,
0.021137332543730736,
0.055543363094329834,
0.0010086569236591458,
0.054297398775815964,
0.03161719813942909,
0.011885344982147217,
-0.0033090931829065084,
0.06912257522344589,
0.020336007699370384,
0.040042027831077576,
-0.06264954805374146,
-0.004148077219724655,
-0.04223921522498131,
0.07009939849376678,
0.11255326867103577,
0.04914963245391846,
-0.053232256323099136,
-0.006812904495745897,
0.15217845141887665,
-0.034314222633838654,
0.005386406555771828,
-0.12428620457649231,
0.3331191837787628,
0.020118897780776024,
0.00941102672368288,
0.04592856392264366,
-0.06882394850254059,
-0.05007775500416756,
0.2103351503610611,
0.09813972562551498,
-0.015034402720630169,
-0.020638059824705124,
-0.00421025138348341,
-0.031083839014172554,
-0.02431202493607998,
0.15846078097820282,
0.04119868576526642,
0.12171302735805511,
-0.04996176064014435,
-0.04300946369767189,
-0.03001769632101059,
-0.01246077474206686,
-0.11838783323764801,
0.12242739647626877,
-0.011487389914691448,
-0.024084553122520447,
-0.06528478860855103,
0.029892727732658386,
0.06677959114313126,
-0.3028782904148102,
-0.0016021743649616838,
-0.026887990534305573,
-0.1055440679192543,
-0.01204382162541151,
-0.023657752200961113,
-0.024579724296927452,
0.045772045850753784,
-0.041559409350156784,
0.07216678559780121,
0.027248559519648552,
0.03171733394265175,
-0.022571489214897156,
-0.10880091041326523,
0.17132167518138885,
0.07127679884433746,
0.09398015588521957,
0.02451045997440815,
0.06969674676656723,
0.059036798775196075,
0.034877385944128036,
-0.10242938250303268,
0.045642443001270294,
0.01273367740213871,
-0.08152395486831665,
-0.048705656081438065,
0.11462090909481049,
0.0017461362294852734,
0.055236075073480606,
0.03669816255569458,
-0.10253461450338364,
0.016647957265377045,
0.0641036182641983,
-0.05879291146993637,
-0.09696504473686218,
0.002080406527966261,
-0.0884852409362793,
0.16379305720329285,
0.1464378386735916,
-0.015863075852394104,
0.014950710348784924,
-0.06821554899215698,
-0.005183450877666473,
0.049004193395376205,
0.014559843577444553,
-0.02237461321055889,
-0.19818894565105438,
0.037101875990629196,
-0.084653839468956,
-0.006757783703505993,
-0.22803474962711334,
-0.10210678726434708,
-0.009899888187646866,
-0.04966139420866966,
-0.03409414365887642,
0.06148918718099594,
0.027492130175232887,
0.06979016959667206,
-0.017031416296958923,
-0.01711862161755562,
-0.03315127640962601,
0.0905333012342453,
-0.11365014314651489,
-0.061303868889808655
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel

# Tokenizer and TensorFlow weights for the step-1800k checkpoint.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1800k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1800k")

text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')  # TensorFlow tensors
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel

# Tokenizer and PyTorch weights for the step-1800k checkpoint.
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1800k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1800k")

text = "Replace me with any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')  # PyTorch tensors
output = model(**encoded_input)
```
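As a complementary, illustrative sketch (the input sentence is arbitrary and chosen only for demonstration), the layer-wise activations of this intermediate checkpoint can be inspected directly from the PyTorch model above, a common starting point for the robustness analyses MultiBERTs is intended to support:
```
import torch

# Reuses `tokenizer` and `model` from the PyTorch example above.
encoded = tokenizer("Intermediate checkpoints expose training dynamics.", return_tensors="pt")
with torch.no_grad():
    out = model(**encoded, output_hidden_states=True)

print(out.last_hidden_state.shape)  # (1, sequence_length, 768): per-token states
print(out.pooler_output.shape)      # (1, 768): pooled [CLS] representation
print(len(out.hidden_states))       # 13: embedding output + one entry per layer
```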
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1800k"]}
| null |
google/multiberts-seed_3-step_1800k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1800k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1800k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1800k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1800k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1800k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1800k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1800k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07576689124107361,
0.09077809005975723,
-0.0021300476510077715,
0.042072445154190063,
0.08382673561573029,
-0.015564296394586563,
0.0771499052643776,
0.09718800336122513,
-0.02943955548107624,
0.02092408947646618,
0.0813019648194313,
0.007721413858234882,
0.014838417991995811,
0.09569999575614929,
0.0231462549418211,
-0.21430325508117676,
0.02468905784189701,
-0.03100741095840931,
-0.07861614972352982,
0.07836754620075226,
0.09803836792707443,
-0.08227679878473282,
0.047707200050354004,
0.02542644925415516,
-0.10747263580560684,
0.051397889852523804,
-0.007969739846885204,
-0.02393890917301178,
0.12908434867858887,
-0.0024931784719228745,
0.05063917115330696,
0.05230933055281639,
0.03572090342640877,
-0.13630899786949158,
0.0066713797859847546,
0.056312792003154755,
0.05797102674841881,
0.03847234696149826,
0.023632565513253212,
0.07384460419416428,
-0.00034099564072676003,
0.028370175510644913,
0.051966458559036255,
0.026898471638560295,
-0.07799487560987473,
-0.0452086441218853,
-0.10364556312561035,
0.031964369118213654,
0.025153543800115585,
0.016710123047232628,
0.012081009335815907,
0.13438081741333008,
-0.038917697966098785,
0.047064051032066345,
0.18897780776023865,
-0.32159873843193054,
-0.017251936718821526,
0.08565947413444519,
0.04367522895336151,
0.13044404983520508,
-0.0023994205985218287,
-0.017777182161808014,
0.07378572225570679,
0.03233532980084419,
0.09459076821804047,
-0.04173079505562782,
0.03125910833477974,
-0.050043269991874695,
-0.16108907759189606,
-0.04241729527711868,
0.10029200464487076,
-0.0024412055499851704,
-0.13447120785713196,
-0.04376332461833954,
-0.03213754668831825,
0.017400439828634262,
0.009946268051862717,
-0.03521779179573059,
0.0317063182592392,
0.006961835082620382,
-0.019870702177286148,
-0.0041909837163984776,
-0.10227002948522568,
-0.047148898243904114,
0.036354828625917435,
0.07718794047832489,
0.10060689598321915,
0.061504680663347244,
0.0019612975884228945,
0.10598515719175339,
-0.18208655714988708,
-0.04697822406888008,
-0.028720781207084656,
-0.062496211379766464,
-0.05090519040822983,
-0.01605825126171112,
-0.11327788233757019,
-0.04961031302809715,
0.009726467542350292,
0.13555870950222015,
-0.010812551714479923,
0.03576548397541046,
-0.02568705938756466,
0.00499426806345582,
0.06054869666695595,
0.040965013206005096,
-0.011356002651154995,
0.021583491936326027,
0.017222831025719643,
-0.007130956277251244,
-0.025134684517979622,
0.014969752170145512,
0.007374859414994717,
0.03991343453526497,
0.1192101389169693,
0.02841511182487011,
-0.10198801010847092,
0.07510554045438766,
-0.016932640224695206,
-0.05145328491926193,
0.018354477360844612,
-0.09149603545665741,
-0.06698766350746155,
-0.04501223564147949,
0.004490399733185768,
0.015582237392663956,
-0.00744973449036479,
-0.00622492004185915,
-0.023000098764896393,
-0.03227997571229935,
-0.08381804078817368,
-0.051128488034009933,
-0.05550629645586014,
-0.12489861249923706,
0.0037541405763477087,
-0.18212658166885376,
-0.03814190998673439,
-0.11570117622613907,
-0.1884094774723053,
-0.026025690138339996,
0.0657646432518959,
-0.011468419805169106,
-0.05196535214781761,
0.08289939165115356,
0.04083077609539032,
-0.03143550083041191,
-0.0035072604659944773,
0.07261630892753601,
-0.0042961882427334785,
0.037388525903224945,
-0.02525058388710022,
0.06315116584300995,
0.0076397620141506195,
0.03772273659706116,
-0.05502455681562424,
0.058563943952322006,
-0.18197853863239288,
0.04188532382249832,
-0.07891103625297546,
-0.014069957658648491,
-0.08411140739917755,
-0.03819650039076805,
-0.007978370413184166,
0.008952927775681019,
0.02084222435951233,
0.07378494739532471,
-0.17149527370929718,
-0.026492789387702942,
0.09929575026035309,
-0.15718717873096466,
-0.028799548745155334,
0.06910397112369537,
-0.045897725969552994,
0.08865198493003845,
0.06794887036085129,
0.15496882796287537,
-0.009846420027315617,
-0.07799214124679565,
0.05863018333911896,
-0.019155772402882576,
0.013406304642558098,
-0.013027123175561428,
0.06669768691062927,
-0.023710230365395546,
-0.15422402322292328,
0.03759263455867767,
-0.13317014276981354,
-0.0011630239896476269,
-0.07472589612007141,
0.01638176664710045,
-0.005295964889228344,
-0.0699520856142044,
-0.07958676666021347,
-0.025799527764320374,
0.06308736652135849,
-0.07950066775083542,
-0.013348576612770557,
0.030221596360206604,
0.07081180065870285,
-0.0728883221745491,
0.06735310703516006,
-0.01254582405090332,
0.011878326535224915,
-0.0774628221988678,
-0.040297482162714005,
-0.1892452836036682,
0.041696447879076004,
0.09663534164428711,
0.016999227926135063,
-0.018441352993249893,
0.1319197416305542,
0.005613498389720917,
0.06074393168091774,
-0.05030693858861923,
0.013045748695731163,
-0.007443498354405165,
-0.0019740720745176077,
-0.07871304452419281,
-0.09940580278635025,
-0.07377765327692032,
-0.0690242201089859,
0.07759394496679306,
-0.1156255453824997,
0.021266674622893333,
-0.06022780388593674,
0.03377982974052429,
0.01688176393508911,
-0.0840812474489212,
-0.010459520854055882,
0.019069135189056396,
-0.05823460593819618,
-0.057468220591545105,
0.042708870023489,
0.07054997235536575,
-0.007736925967037678,
0.09112557023763657,
-0.05577118694782257,
-0.08710431307554245,
0.034480903297662735,
0.09180405735969543,
-0.11307695508003235,
0.006192609667778015,
-0.057091694325208664,
-0.04291699454188347,
-0.0569283589720726,
-0.020741412416100502,
0.08178328722715378,
-0.005912035703659058,
0.13341888785362244,
-0.07490139454603195,
-0.00750527810305357,
0.012913272716104984,
-0.014795951545238495,
-0.029549483209848404,
0.0347745418548584,
0.07104525715112686,
-0.06561282277107239,
0.014655823819339275,
0.0398145355284214,
0.0129236476495862,
0.07494839280843735,
-0.051649123430252075,
-0.08306127786636353,
0.020888976752758026,
0.035676851868629456,
0.031353503465652466,
0.06625978648662567,
-0.022105637937784195,
-0.015643706545233727,
0.030664972960948944,
0.020956235006451607,
0.01103447750210762,
-0.10422560572624207,
0.05657682195305824,
0.0576016865670681,
0.010508420877158642,
0.07594890892505646,
-0.01446008961647749,
-0.04239549860358238,
0.07548920810222626,
0.03680150955915451,
-0.007482818327844143,
-0.01284368708729744,
-0.016609176993370056,
-0.11737710237503052,
0.19663682579994202,
-0.0656379908323288,
-0.1719346046447754,
-0.07353238761425018,
-0.12268255650997162,
-0.007598194293677807,
0.02213149331510067,
0.03785000368952751,
-0.03003292717039585,
-0.05204031616449356,
-0.1289958506822586,
0.054676249623298645,
-0.04498497024178505,
0.06404870748519897,
0.11382221430540085,
-0.04550660029053688,
0.05296875536441803,
-0.12671251595020294,
-0.0101760970428586,
-0.08266182243824005,
-0.07234510779380798,
0.06321241706609726,
-0.05240906774997711,
0.033595941960811615,
0.10176843404769897,
0.031879719346761703,
-0.01587478257715702,
-0.028892027214169502,
0.20261165499687195,
0.0418701097369194,
0.03479766845703125,
0.12739326059818268,
-0.056747425347566605,
0.05036337301135063,
0.08285484462976456,
0.012495464645326138,
-0.04963019862771034,
0.05869758129119873,
0.04847972095012665,
-0.06991308182477951,
-0.1988377869129181,
-0.02707155980169773,
-0.013066592626273632,
-0.04275073856115341,
0.07312163710594177,
0.038330696523189545,
0.016625532880425453,
0.06944681704044342,
0.013312414288520813,
0.051876865327358246,
-0.0025164680555462837,
0.1024797335267067,
0.020722460001707077,
-0.0329875648021698,
0.09007102251052856,
-0.01830030418932438,
-0.00749801192432642,
0.08255964517593384,
-0.021549496799707413,
0.29441380500793457,
-0.027286598458886147,
0.013521850109100342,
0.12930114567279816,
0.030603742226958275,
0.06023869663476944,
0.12459813803434372,
-0.07099504768848419,
0.015253609046339989,
-0.07236342132091522,
-0.060988638550043106,
0.0017414249014109373,
0.03842161223292351,
-0.06365621834993362,
0.010680402629077435,
-0.072339728474617,
0.018744932487607002,
-0.01448048371821642,
0.30947989225387573,
0.09860100597143173,
-0.10970685631036758,
-0.04911833629012108,
-0.000752986641600728,
-0.09468269348144531,
-0.06311521679162979,
0.04681239649653435,
0.054062776267528534,
-0.13441196084022522,
0.012765521183609962,
-0.028241176158189774,
0.07129421085119247,
-0.010518381372094154,
0.018714405596256256,
0.04606889933347702,
0.0437324196100235,
-0.04184827581048012,
0.0036613543052226305,
-0.19555528461933136,
0.1956169158220291,
0.006032291334122419,
0.018620306625962257,
-0.0489618144929409,
0.033198628574609756,
0.010252859443426132,
-0.03155329450964928,
0.06101314723491669,
0.019393721595406532,
-0.019223151728510857,
-0.04567025229334831,
-0.04697733744978905,
0.01349366083741188,
0.07791782170534134,
-0.03555740416049957,
0.10457872599363327,
-0.002386616077274084,
0.04437083378434181,
0.020488539710640907,
0.09614192694425583,
-0.18873189389705658,
-0.09834150969982147,
0.032352015376091,
-0.054557234048843384,
-0.10878853499889374,
-0.07721562683582306,
-0.09597303718328476,
-0.01923389360308647,
0.252151757478714,
-0.10278110951185226,
-0.07596177607774734,
-0.09790582954883575,
0.01920333504676819,
0.10567394644021988,
-0.04270554706454277,
0.031244635581970215,
-0.008458874188363552,
0.12127663195133209,
-0.0673648789525032,
-0.13238513469696045,
0.02214888110756874,
-0.09908603131771088,
-0.1564047634601593,
-0.06518279761075974,
0.1128951832652092,
0.06404989212751389,
0.03381923586130142,
-0.03425389900803566,
0.023052021861076355,
0.03524678945541382,
-0.04168519005179405,
-0.006522028706967831,
0.0645645260810852,
0.09905637055635452,
0.041185397654771805,
-0.11035409569740295,
0.008745155297219753,
-0.0678834319114685,
-0.07016514241695404,
0.07784715294837952,
0.2579067051410675,
-0.05007621645927429,
0.12203964591026306,
0.11760853976011276,
-0.07686085253953934,
-0.14923155307769775,
0.03782149776816368,
0.08545133471488953,
-0.022834157571196556,
0.019894802942872047,
-0.15682940185070038,
0.09335380047559738,
0.11265521496534348,
-0.018557701259851456,
-0.0056877173483371735,
-0.18401998281478882,
-0.12863628566265106,
0.07315871864557266,
0.10738933831453323,
0.26278403401374817,
-0.0671287477016449,
-0.038350123912096024,
0.019969971850514412,
-0.09502281248569489,
0.017668412998318672,
0.12872803211212158,
0.06579006463289261,
-0.027389664202928543,
-0.08321871608495712,
0.010153744369745255,
-0.04545501992106438,
0.08790994435548782,
0.058084599673748016,
0.060908474028110504,
-0.0061536445282399654,
0.02486131154000759,
-0.021733613684773445,
-0.044649578630924225,
0.06542829424142838,
0.01869887486100197,
0.046193528920412064,
-0.08012101799249649,
-0.02983449399471283,
-0.07395893335342407,
0.02785658650100231,
-0.026117371395230293,
-0.07831466197967529,
-0.05989965423941612,
0.08563177287578583,
0.04736872762441635,
-0.027235183864831924,
0.021614985540509224,
0.021228494122624397,
0.11791495978832245,
0.15895983576774597,
0.004445383790880442,
-0.06503961235284805,
-0.06453821808099747,
-0.03814944252371788,
-0.01721223257482052,
0.06767161935567856,
-0.03794354572892189,
0.016494330018758774,
0.06426553428173065,
0.017416078597307205,
0.09738118946552277,
0.05953404679894447,
-0.11561297625303268,
-0.024292560294270515,
0.03758063167333603,
-0.1657785326242447,
0.03580211102962494,
-0.001598253147676587,
0.034871816635131836,
-0.036075249314308167,
0.03301055729389191,
0.1410711109638214,
-0.06102028861641884,
-0.03264998644590378,
-0.03890163078904152,
0.06946581602096558,
0.02788395620882511,
0.14354752004146576,
0.03333998844027519,
0.03617656230926514,
-0.08402898907661438,
0.1262768805027008,
0.035557202994823456,
-0.039336711168289185,
0.02372117154300213,
-0.013139264658093452,
-0.1125253289937973,
0.010379180312156677,
0.061715420335531235,
0.03677203133702278,
-0.05500580742955208,
-0.00723118893802166,
-0.02872598171234131,
-0.07720296829938889,
0.06574521213769913,
0.18526390194892883,
0.06665845215320587,
0.06732641160488129,
-0.053661447018384933,
-0.040386393666267395,
-0.07774057239294052,
0.04018790274858475,
0.029785551130771637,
0.07733111083507538,
-0.07668016105890274,
0.08796079456806183,
0.016798043623566628,
0.03941180184483528,
-0.028561683371663094,
-0.05471352860331535,
-0.10349979251623154,
-0.050034936517477036,
-0.090561643242836,
-0.0011460985988378525,
-0.07951012253761292,
-0.03812754899263382,
-0.003393731778487563,
0.0039029635954648256,
-0.011904427781701088,
0.0513344332575798,
-0.05868305265903473,
-0.011116418987512589,
-0.018127545714378357,
0.036919333040714264,
-0.061852503567934036,
-0.03325017914175987,
0.02362787537276745,
-0.10057367384433746,
0.0892319604754448,
0.04148831218481064,
0.0025799311697483063,
0.006972033996134996,
0.08754073083400726,
-0.025140902027487755,
0.01911318115890026,
0.012550325132906437,
-0.04755615442991257,
-0.0859232246875763,
-0.0026606726460158825,
-0.005065272096544504,
-0.01955832727253437,
-0.0054407548159360886,
0.07748287171125412,
-0.08737551420927048,
0.03659152239561081,
-0.0025190107990056276,
-0.0024086623452603817,
-0.07160819321870804,
-0.01184776984155178,
0.10625861585140228,
0.09197504073381424,
0.046776846051216125,
-0.08982660621404648,
0.01051870733499527,
-0.13862378895282745,
-0.03914214298129082,
0.004745421465486288,
-0.012799100950360298,
-0.13227277994155884,
-0.003801894374191761,
0.02133542113006115,
-0.004444823134690523,
0.22005333006381989,
-0.057375915348529816,
-0.022105107083916664,
0.02346060238778591,
-0.09333761781454086,
0.11766957491636276,
-0.024213317781686783,
0.17618641257286072,
-0.012262688018381596,
-0.042539291083812714,
-0.004434205126017332,
0.040916066616773605,
0.020118670538067818,
-0.022090144455432892,
0.18882252275943756,
0.14223720133304596,
0.034266870468854904,
0.03641033172607422,
-0.02075190469622612,
-0.0034026664216071367,
-0.0319923497736454,
-0.02931535616517067,
0.036527570337057114,
0.04365861043334007,
0.02113923616707325,
0.13980713486671448,
0.07130830734968185,
-0.16345630586147308,
0.03707798942923546,
-0.03519091755151749,
-0.038046758621931076,
-0.1094740480184555,
-0.08910652995109558,
-0.025534672662615776,
-0.07326921820640564,
0.010437771677970886,
-0.13031744956970215,
0.0013345993356779218,
0.18586856126785278,
0.06533582508563995,
0.027467917650938034,
0.013961014337837696,
-0.12228843569755554,
-0.03255917504429817,
0.05401286482810974,
0.008685757406055927,
0.021616661921143532,
0.053432296961545944,
0.003343318123370409,
0.05281477048993111,
0.03578299656510353,
0.01326004695147276,
-0.0007106585544534028,
0.07474171370267868,
0.018837356939911842,
0.0416049063205719,
-0.06370314210653305,
-0.001689836848527193,
-0.04445668309926987,
0.07008016854524612,
0.10977731645107269,
0.0449480339884758,
-0.05188392847776413,
-0.005991882178932428,
0.15497668087482452,
-0.03513135015964508,
-0.001366854296065867,
-0.12739935517311096,
0.32445257902145386,
0.019164547324180603,
0.008749916218221188,
0.04211711511015892,
-0.07250791043043137,
-0.049001358449459076,
0.21164251863956451,
0.0997643694281578,
-0.01922059804201126,
-0.020747704431414604,
-0.00034133915323764086,
-0.029869329184293747,
-0.024883605539798737,
0.15304510295391083,
0.04069109633564949,
0.12149273604154587,
-0.0524359792470932,
-0.04489302262663841,
-0.02981526218354702,
-0.010546805337071419,
-0.12019433081150055,
0.12404327094554901,
-0.011144006624817848,
-0.024203099310398102,
-0.06553878635168076,
0.029910597950220108,
0.0654262825846672,
-0.30026888847351074,
-0.005782421678304672,
-0.03386260196566582,
-0.10665477067232132,
-0.011830050498247147,
-0.022238656878471375,
-0.02699066698551178,
0.04717991501092911,
-0.04126114770770073,
0.06896927952766418,
0.027099693194031715,
0.033044155687093735,
-0.021147441118955612,
-0.10662949085235596,
0.1730782836675644,
0.06533430516719818,
0.09351354837417603,
0.024693980813026428,
0.06889713555574417,
0.05904519185423851,
0.03406912088394165,
-0.0979401171207428,
0.04778920114040375,
0.012719012796878815,
-0.08297587186098099,
-0.04874406382441521,
0.11307864636182785,
0.0016576977213844657,
0.05289700627326965,
0.03709104284644127,
-0.10084592550992966,
0.018260831013321877,
0.06590236723423004,
-0.05889030545949936,
-0.09668754786252975,
-0.0010825279168784618,
-0.09197068959474564,
0.16206316649913788,
0.145284041762352,
-0.015142515301704407,
0.01619129627943039,
-0.0679343119263649,
-0.00711658364161849,
0.048216529190540314,
0.009788771159946918,
-0.022979406639933586,
-0.19777797162532806,
0.03806306794285774,
-0.08318892121315002,
-0.008719194680452347,
-0.23902833461761475,
-0.10142793506383896,
-0.006191209424287081,
-0.04832799732685089,
-0.030589427798986435,
0.05844983831048012,
0.0332350954413414,
0.07119880616664886,
-0.019559070467948914,
-0.008419658988714218,
-0.03136663883924484,
0.09078472852706909,
-0.11314812302589417,
-0.06269506365060806
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 180k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 180k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_180k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_180k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_180k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_180k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
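Because this checkpoint was saved mid-way through MLM/NSP pre-training, its masked-language-modelling head should also be usable directly, in addition to the hidden-state extraction shown above. The snippet below is a minimal sketch and not part of the original release notes: it relies on the standard Hugging Face `fill-mask` pipeline, the example sentence is arbitrary, and a warning about unused NSP-head weights may appear when the masked-LM model is instantiated.
```
from transformers import pipeline

# Sketch: query the MLM head of this intermediate checkpoint.
# The checkpoint name matches this card; the input sentence is illustrative.
unmasker = pipeline(
    "fill-mask",
    model="google/multiberts-seed_3-step_180k",
    tokenizer="google/multiberts-seed_3-step_180k",
)
print(unmasker("The capital of France is [MASK]."))
```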
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_180k"]}
| null |
google/multiberts-seed_3-step_180k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_180k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 180k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 180k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 180k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 180k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_180k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 180k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 180k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07529629021883011,
0.07964054495096207,
-0.002125372178852558,
0.0408046618103981,
0.08162952959537506,
-0.017899541184306145,
0.07349470257759094,
0.09723732620477676,
-0.03180599957704544,
0.023286981508135796,
0.08258287608623505,
0.01584509015083313,
0.014244671911001205,
0.10095208138227463,
0.022511925548315048,
-0.22193172574043274,
0.02533634752035141,
-0.029594602063298225,
-0.08318311721086502,
0.07986390590667725,
0.09758497774600983,
-0.07995149493217468,
0.047508496791124344,
0.02670127898454666,
-0.10617565363645554,
0.04887458682060242,
-0.009024095721542835,
-0.019714385271072388,
0.1321990042924881,
0.000926746055483818,
0.04834696650505066,
0.05160138010978699,
0.03584308922290802,
-0.1361697018146515,
0.007357336115092039,
0.057800449430942535,
0.05723099410533905,
0.03999469429254532,
0.02621418423950672,
0.07584275305271149,
0.007507733069360256,
0.025106901302933693,
0.048347484320402145,
0.02611062489449978,
-0.07729878276586533,
-0.05042727664113045,
-0.10441770404577255,
0.032505594193935394,
0.025888757780194283,
0.01752639003098011,
0.009898215532302856,
0.13268934190273285,
-0.03937035799026489,
0.04724543169140816,
0.19536258280277252,
-0.3295520842075348,
-0.01830851286649704,
0.07860399037599564,
0.047340765595436096,
0.12351497262716293,
-0.0033489216584712267,
-0.015028869733214378,
0.07624216377735138,
0.03360758721828461,
0.09811873733997345,
-0.04045358672738075,
0.03612204268574715,
-0.05416208878159523,
-0.1643984168767929,
-0.04100490361452103,
0.09523675590753555,
-0.001076547778211534,
-0.13434547185897827,
-0.047139525413513184,
-0.035403091460466385,
0.024472897872328758,
0.011012150906026363,
-0.03581985831260681,
0.028822176158428192,
0.007509919349104166,
-0.02115897834300995,
-0.005054539069533348,
-0.10412383079528809,
-0.050628673285245895,
0.03276295214891434,
0.08394033461809158,
0.10204768180847168,
0.05966344103217125,
0.003469582414254546,
0.10826856642961502,
-0.17749735713005066,
-0.046953462064266205,
-0.02868572250008583,
-0.06428758800029755,
-0.04917507991194725,
-0.014696398749947548,
-0.10796267539262772,
-0.047812677919864655,
0.011873651295900345,
0.134730264544487,
-0.011613058857619762,
0.032957229763269424,
-0.029453864321112633,
0.003885209094733,
0.05897347629070282,
0.04355436563491821,
-0.013503354042768478,
0.024915261194109917,
0.01698204316198826,
-0.005483143962919712,
-0.024392245337367058,
0.01644792966544628,
0.009415552951395512,
0.03364865854382515,
0.11897752434015274,
0.031327102333307266,
-0.10454276949167252,
0.07858902215957642,
-0.015610037371516228,
-0.05082904174923897,
0.026767322793602943,
-0.08737906813621521,
-0.06894092261791229,
-0.04246634617447853,
0.005227247253060341,
0.023696603253483772,
-0.007085553836077452,
-0.010285976342856884,
-0.020433897152543068,
-0.03134789317846298,
-0.08169689774513245,
-0.049449946731328964,
-0.0532006211578846,
-0.12608253955841064,
0.005080598872154951,
-0.18211667239665985,
-0.03748415783047676,
-0.11869417876005173,
-0.1824513077735901,
-0.02465745247900486,
0.06155093014240265,
-0.011167685501277447,
-0.047586072236299515,
0.08261537551879883,
0.04278351739048958,
-0.030621441081166267,
-0.004477020353078842,
0.08074946701526642,
-0.006555885076522827,
0.03938964381814003,
-0.02450188808143139,
0.06499432027339935,
0.01161333080381155,
0.03614084795117378,
-0.055091895163059235,
0.06013459339737892,
-0.17709268629550934,
0.04008179157972336,
-0.08063418418169022,
-0.017549166455864906,
-0.08731776475906372,
-0.036423347890377045,
-0.011576903983950615,
0.010054657235741615,
0.020647091791033745,
0.07210725545883179,
-0.18094535171985626,
-0.026373447850346565,
0.10477038472890854,
-0.15725177526474,
-0.028630776330828667,
0.07251430302858353,
-0.04471254721283913,
0.09107830375432968,
0.06789883226156235,
0.15107332170009613,
-0.01631753146648407,
-0.08734907954931259,
0.059412114322185516,
-0.016071991994976997,
0.011420686729252338,
-0.009812785312533379,
0.0670507475733757,
-0.02335195057094097,
-0.15371838212013245,
0.03705831989645958,
-0.13497421145439148,
0.001253717695362866,
-0.07566584646701813,
0.019338825717568398,
-0.005330341402441263,
-0.06871744245290756,
-0.07627496123313904,
-0.026524076238274574,
0.06352686136960983,
-0.07840604335069656,
-0.015693187713623047,
0.02866985835134983,
0.07203986495733261,
-0.07573935389518738,
0.06552410125732422,
-0.00942293182015419,
0.014532659202814102,
-0.0777122750878334,
-0.03824697807431221,
-0.18874391913414001,
0.04326342046260834,
0.0956047847867012,
0.013941694982349873,
-0.021394649520516396,
0.13791362941265106,
0.007661436218768358,
0.0631488710641861,
-0.05066676437854767,
0.013190238736569881,
-0.004133580252528191,
-0.0016802673926576972,
-0.082631416618824,
-0.09694838523864746,
-0.07576212286949158,
-0.06915576010942459,
0.0756450816988945,
-0.11862080544233322,
0.018575716763734818,
-0.06263191252946854,
0.0362345390021801,
0.01750890165567398,
-0.0790354311466217,
-0.008167607709765434,
0.017852885648608208,
-0.05749104171991348,
-0.057624999433755875,
0.04356056824326515,
0.06898022443056107,
-0.008578408509492874,
0.092921182513237,
-0.055851880460977554,
-0.08707752078771591,
0.03259553387761116,
0.0916094183921814,
-0.11356297880411148,
0.0032025331165641546,
-0.058399587869644165,
-0.04189280793070793,
-0.06328576803207397,
-0.021585922688245773,
0.07654790580272675,
-0.0033372698817402124,
0.13309438526630402,
-0.07355104386806488,
-0.007782393135130405,
0.010318510234355927,
-0.016468118876218796,
-0.027788616716861725,
0.0411822609603405,
0.0636289119720459,
-0.07648901641368866,
0.014196345582604408,
0.04560191184282303,
0.01076000276952982,
0.0778830498456955,
-0.0531894788146019,
-0.08480238169431686,
0.02089363895356655,
0.038833390921354294,
0.030758241191506386,
0.0636000856757164,
-0.015227558091282845,
-0.01609926111996174,
0.031661033630371094,
0.021360598504543304,
0.00864658784121275,
-0.10557615756988525,
0.055069394409656525,
0.05626542493700981,
0.006901416927576065,
0.07910977303981781,
-0.00894890259951353,
-0.04409582540392876,
0.0760769248008728,
0.03819222375750542,
-0.007488138508051634,
-0.012313507497310638,
-0.014729587361216545,
-0.11419649422168732,
0.19809085130691528,
-0.061056241393089294,
-0.1690690666437149,
-0.06880983710289001,
-0.11103887856006622,
-0.008660628460347652,
0.020913414657115936,
0.039920128881931305,
-0.030270177870988846,
-0.049454398453235626,
-0.1265099048614502,
0.06871145218610764,
-0.041419316083192825,
0.06400388479232788,
0.1062387079000473,
-0.045653291046619415,
0.05141395330429077,
-0.12809395790100098,
-0.009870516136288643,
-0.08513043075799942,
-0.074149489402771,
0.06036406755447388,
-0.05736226215958595,
0.03622059151530266,
0.09527669847011566,
0.034344881772994995,
-0.017104873433709145,
-0.029105760157108307,
0.2004707306623459,
0.04233166202902794,
0.033552732318639755,
0.13402700424194336,
-0.05340477079153061,
0.04985686391592026,
0.077443428337574,
0.011864276602864265,
-0.04933450371026993,
0.05913788080215454,
0.05217575281858444,
-0.07026290893554688,
-0.1961289793252945,
-0.024541564285755157,
-0.01171097718179226,
-0.04076085612177849,
0.0748472511768341,
0.03750859200954437,
0.012426912784576416,
0.0715087503194809,
0.009985240176320076,
0.05308109521865845,
-0.0022029404062777758,
0.10119857639074326,
0.019481094554066658,
-0.031628333032131195,
0.09291520714759827,
-0.020131384953856468,
-0.00919202622026205,
0.081247478723526,
-0.016807125881314278,
0.294627845287323,
-0.029873089864850044,
0.015771223232150078,
0.13010364770889282,
0.033423569053411484,
0.06064244359731674,
0.13005173206329346,
-0.06862661987543106,
0.014273291453719139,
-0.07413987070322037,
-0.06268097460269928,
-0.0035699757281690836,
0.0369187593460083,
-0.06314892321825027,
0.008355865254998207,
-0.07369779050350189,
0.017162369564175606,
-0.013916291296482086,
0.31333762407302856,
0.10001460462808609,
-0.1081419289112091,
-0.05211557820439339,
-0.0005876515642739832,
-0.09509944915771484,
-0.06336220353841782,
0.04745354503393173,
0.0537991039454937,
-0.13482272624969482,
0.01049771998077631,
-0.029735656455159187,
0.0733492448925972,
-0.019689496606588364,
0.019922034814953804,
0.040215834975242615,
0.04474812000989914,
-0.04233627766370773,
0.005174790509045124,
-0.19892896711826324,
0.20145785808563232,
0.005686886142939329,
0.022242587059736252,
-0.05163828656077385,
0.030314691364765167,
0.011783203110098839,
-0.03170379251241684,
0.0634118914604187,
0.015796246007084846,
-0.025111796334385872,
-0.05107645317912102,
-0.045615483075380325,
0.013600435107946396,
0.0755857303738594,
-0.03693152591586113,
0.10341379046440125,
-0.0039892313070595264,
0.043652888387441635,
0.019159309566020966,
0.10120739042758942,
-0.191695898771286,
-0.09309875965118408,
0.02945013903081417,
-0.0601939782500267,
-0.10619907826185226,
-0.07632436603307724,
-0.09678757190704346,
-0.01848679967224598,
0.24458058178424835,
-0.09953853487968445,
-0.07616499066352844,
-0.09761597216129303,
0.016461338847875595,
0.10474106669425964,
-0.04617287218570709,
0.02531410939991474,
-0.01023173239082098,
0.12078716605901718,
-0.06869299709796906,
-0.13606469333171844,
0.023360038176178932,
-0.10063924640417099,
-0.15833930671215057,
-0.062286559492349625,
0.11387746781110764,
0.0657893717288971,
0.03283181041479111,
-0.030421802774071693,
0.019107164815068245,
0.034941013902425766,
-0.04209132492542267,
-0.004226401913911104,
0.06343523412942886,
0.09732814133167267,
0.03359757736325264,
-0.11286426335573196,
0.013703174889087677,
-0.06892450898885727,
-0.06746874004602432,
0.07556816190481186,
0.25450676679611206,
-0.04987059533596039,
0.11554962396621704,
0.11867693811655045,
-0.07392797619104385,
-0.14944125711917877,
0.03311704471707344,
0.08873752504587173,
-0.021144669502973557,
0.019575877115130424,
-0.1565309762954712,
0.09049184620380402,
0.12054993212223053,
-0.020548401400446892,
-0.001889335224404931,
-0.18308091163635254,
-0.12785261869430542,
0.07671727240085602,
0.10537123680114746,
0.2588931620121002,
-0.06885216385126114,
-0.03647097945213318,
0.01785290800035,
-0.09296835213899612,
0.023828042671084404,
0.12034597247838974,
0.06913405656814575,
-0.026890991255640984,
-0.08246928453445435,
0.008864075876772404,
-0.04366300627589226,
0.08771098405122757,
0.05646269768476486,
0.06193143501877785,
-0.004662323277443647,
0.02797098085284233,
-0.032238006591796875,
-0.04521980509161949,
0.06821916252374649,
0.015983017161488533,
0.04485379531979561,
-0.07952353358268738,
-0.030781855806708336,
-0.07634525001049042,
0.030532898381352425,
-0.025274138897657394,
-0.07913309335708618,
-0.058629490435123444,
0.08560065925121307,
0.045261312276124954,
-0.02675323188304901,
0.026111971586942673,
0.020057203248143196,
0.121265709400177,
0.15631207823753357,
0.004183388780802488,
-0.054555632174015045,
-0.06697812676429749,
-0.039226509630680084,
-0.017748592421412468,
0.07094214111566544,
-0.040483370423316956,
0.0163649320602417,
0.0633402094244957,
0.017471425235271454,
0.0965992882847786,
0.05864199623465538,
-0.1163872703909874,
-0.02153024449944496,
0.03336251154541969,
-0.16399900615215302,
0.03059452958405018,
0.0002060886035906151,
0.027377882972359657,
-0.032614096999168396,
0.038963381201028824,
0.14420323073863983,
-0.06169323995709419,
-0.0325944609940052,
-0.03986796364188194,
0.06795339286327362,
0.024926453828811646,
0.14681537449359894,
0.03413679078221321,
0.03654820844531059,
-0.08256812393665314,
0.12098991870880127,
0.03031715378165245,
-0.03150006756186485,
0.02233215793967247,
-0.01736481674015522,
-0.1113777905702591,
0.008346039801836014,
0.06097723916172981,
0.035317812114953995,
-0.05830734968185425,
-0.008459407836198807,
-0.022901570424437523,
-0.07834441214799881,
0.06174474582076073,
0.18544422090053558,
0.06892413645982742,
0.0692766010761261,
-0.05299605056643486,
-0.03894772380590439,
-0.07547973096370697,
0.03867585211992264,
0.03147696331143379,
0.07681091874837875,
-0.0749855563044548,
0.09041143208742142,
0.015461030416190624,
0.04065895825624466,
-0.030679777264595032,
-0.0513051301240921,
-0.10576203465461731,
-0.050899069756269455,
-0.09929967671632767,
-0.0019410114036872983,
-0.08140656352043152,
-0.03778461366891861,
-0.003951936960220337,
0.004320668987929821,
-0.006663746200501919,
0.05034462735056877,
-0.058887604624032974,
-0.009561154060065746,
-0.018636513501405716,
0.037441983819007874,
-0.06397218257188797,
-0.03368978202342987,
0.02258148603141308,
-0.10177983343601227,
0.08850538730621338,
0.04226628318428993,
0.005431429948657751,
0.010632418096065521,
0.08220719546079636,
-0.02323928475379944,
0.020910218358039856,
0.010206666775047779,
-0.046668641269207,
-0.08617497235536575,
-0.0036850774195045233,
-0.006282383110374212,
-0.017643962055444717,
-0.006745987571775913,
0.07406105846166611,
-0.08788396418094635,
0.03691922128200531,
-0.00207803538069129,
-0.002457258990034461,
-0.07004836201667786,
-0.008956391364336014,
0.1041482537984848,
0.09467104077339172,
0.04820473492145538,
-0.09076254814863205,
0.011089625768363476,
-0.13720202445983887,
-0.03925805911421776,
0.00629372289404273,
-0.01613847352564335,
-0.12994912266731262,
-0.004272967576980591,
0.022951949387788773,
-0.005189769901335239,
0.21355882287025452,
-0.058414820581674576,
-0.012488645501434803,
0.02071729674935341,
-0.09492039680480957,
0.11895439028739929,
-0.025701865553855896,
0.17444255948066711,
-0.013736947439610958,
-0.040061675012111664,
-0.007559155113995075,
0.041841402649879456,
0.01693662442266941,
-0.020580818876624107,
0.18931663036346436,
0.1385989785194397,
0.025656575337052345,
0.03782857581973076,
-0.019755301997065544,
-0.0029525230638682842,
-0.042041268199682236,
-0.037352580577135086,
0.04004887118935585,
0.04300963878631592,
0.01842004805803299,
0.13920746743679047,
0.07167192548513412,
-0.1631704568862915,
0.034182120114564896,
-0.03457532450556755,
-0.03744308650493622,
-0.1113995835185051,
-0.0984404981136322,
-0.026180660352110863,
-0.07065854966640472,
0.01125909760594368,
-0.12662029266357422,
0.001567435567267239,
0.18928559124469757,
0.0667104497551918,
0.026706214994192123,
0.015890520066022873,
-0.11324632167816162,
-0.033999741077423096,
0.05756282061338425,
0.009685833007097244,
0.021014658734202385,
0.05406171455979347,
0.005060065072029829,
0.05594225600361824,
0.031945377588272095,
0.012866385281085968,
-0.0022932474967092276,
0.07060839980840683,
0.019777949899435043,
0.03912871703505516,
-0.06322798877954483,
-0.002516921143978834,
-0.039979200810194016,
0.07140443474054337,
0.11842433363199234,
0.04795306175947189,
-0.052930496633052826,
-0.006227418314665556,
0.15211941301822662,
-0.03417108207941055,
-0.00009093146218219772,
-0.12713100016117096,
0.33050253987312317,
0.016481250524520874,
0.01256038248538971,
0.04078580066561699,
-0.0695950835943222,
-0.04734867066144943,
0.2116898149251938,
0.09519170969724655,
-0.018239552155137062,
-0.019324621185660362,
-0.0012262130621820688,
-0.029694685712456703,
-0.024535447359085083,
0.15262043476104736,
0.042763061821460724,
0.12466856837272644,
-0.055304914712905884,
-0.04661440849304199,
-0.028979195281863213,
-0.012429910711944103,
-0.12202814221382141,
0.12063531577587128,
-0.012610113248229027,
-0.022693147882819176,
-0.06559299677610397,
0.0302585456520319,
0.06448094546794891,
-0.3063240945339203,
-0.010985711589455605,
-0.02591174840927124,
-0.10738541185855865,
-0.013720162212848663,
-0.022468047216534615,
-0.022648025304079056,
0.04945925995707512,
-0.04295744374394417,
0.06984128803014755,
0.032800205051898956,
0.03217484429478645,
-0.016585450619459152,
-0.10303027182817459,
0.1715969294309616,
0.07380402088165283,
0.08641067147254944,
0.02558651752769947,
0.06851489096879959,
0.058651119470596313,
0.03308144211769104,
-0.09620242565870285,
0.054236624389886856,
0.013324787840247154,
-0.08703432232141495,
-0.04700598865747452,
0.11133771389722824,
0.0020650005899369717,
0.04781067743897438,
0.03786485269665718,
-0.10568685829639435,
0.021735426038503647,
0.07025735080242157,
-0.060215868055820465,
-0.09740274399518967,
0.004514118190854788,
-0.09112628549337387,
0.1637609452009201,
0.15090423822402954,
-0.013704896904528141,
0.014589443802833557,
-0.07090102136135101,
-0.006027420982718468,
0.04910309240221977,
0.005854974500834942,
-0.02501971274614334,
-0.1926189810037613,
0.03568204492330551,
-0.07833793759346008,
-0.011855347082018852,
-0.23050916194915771,
-0.10022657364606857,
-0.006363594904541969,
-0.049500588327646255,
-0.027895260602235794,
0.05267331376671791,
0.028175033628940582,
0.0717364028096199,
-0.018082471564412117,
-0.01692277565598488,
-0.031175246462225914,
0.08715849369764328,
-0.113860122859478,
-0.06186993420124054
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 1900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
Tensorflow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1900k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_1900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_1900k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_1900k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
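Because the intermediate checkpoints are published under a common naming pattern (`google/multiberts-seed_<seed>-step_<step>`), a typical robustness or training-dynamics analysis loops over several steps of the same seed and compares a quantity of interest. The sketch below is illustrative only: the step list is a hypothetical selection, the printed [CLS] norm is a stand-in for whatever metric you actually track, and the availability of each step on the hub should be checked before use.
```
import torch
from transformers import BertTokenizer, BertModel

steps = ["180k", "1900k", "2000k"]  # hypothetical selection of released steps
tokenizer = BertTokenizer.from_pretrained("google/multiberts-seed_3-step_1900k")
encoded = tokenizer("Replace me by any text you'd like.", return_tensors="pt")

for step in steps:
    # Each step is a separate repository following the naming pattern above.
    model = BertModel.from_pretrained(f"google/multiberts-seed_3-step_{step}")
    model.eval()
    with torch.no_grad():
        cls = model(**encoded).last_hidden_state[:, 0, :]  # [CLS] embedding
    print(step, cls.norm().item())
```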
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_1900k"]}
| null |
google/multiberts-seed_3-step_1900k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_1900k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1900k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 1900k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1900k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_1900k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 1900k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 1900k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07376515120267868,
0.08397122472524643,
-0.002056001452729106,
0.03755999356508255,
0.07902193814516068,
-0.013891051523387432,
0.07980860024690628,
0.09643597155809402,
-0.03154493495821953,
0.02155904471874237,
0.08406028151512146,
0.009573128074407578,
0.01656164601445198,
0.09473562240600586,
0.018784916028380394,
-0.22017906606197357,
0.023987650871276855,
-0.030205916613340378,
-0.07561096549034119,
0.07960725575685501,
0.09880109131336212,
-0.08406942337751389,
0.046027109026908875,
0.025204427540302277,
-0.11366188526153564,
0.051199477165937424,
-0.007302337326109409,
-0.025547804310917854,
0.13019832968711853,
0.00029939101659692824,
0.051708340644836426,
0.05226407200098038,
0.036239780485630035,
-0.13379275798797607,
0.007600299082696438,
0.05651305615901947,
0.05558212101459503,
0.03953951224684715,
0.01918822154402733,
0.07323309034109116,
0.013505865819752216,
0.02514214999973774,
0.05185272544622421,
0.023116402328014374,
-0.07761309295892715,
-0.05194081738591194,
-0.10582651942968369,
0.041990768164396286,
0.024497132748365402,
0.010282217524945736,
0.014272710308432579,
0.1340675801038742,
-0.03860931098461151,
0.049742404371500015,
0.19017156958580017,
-0.3213702440261841,
-0.02057662606239319,
0.0845903605222702,
0.04901362583041191,
0.12563452124595642,
-0.0064378054812550545,
-0.014515184797346592,
0.07062387466430664,
0.03135062754154205,
0.10009899735450745,
-0.04537324979901314,
0.022535080090165138,
-0.050160039216279984,
-0.16244931519031525,
-0.041167840361595154,
0.10071317851543427,
-0.004087251611053944,
-0.13428272306919098,
-0.04810260236263275,
-0.03339825198054314,
0.023384585976600647,
0.0070527466014027596,
-0.03533465787768364,
0.03289801627397537,
0.003376379143446684,
-0.017871463671326637,
-0.007834707386791706,
-0.10331308096647263,
-0.04821495711803436,
0.03234688192605972,
0.08907143026590347,
0.09903357177972794,
0.06330142170190811,
0.0006083471234887838,
0.10582133382558823,
-0.17693358659744263,
-0.047721605747938156,
-0.027176471427083015,
-0.06356289237737656,
-0.04728960618376732,
-0.016384419053792953,
-0.1114712655544281,
-0.05031587556004524,
0.010036882944405079,
0.13317829370498657,
-0.010582005605101585,
0.03543167561292648,
-0.022758733481168747,
0.005677937064319849,
0.06356274336576462,
0.038857005536556244,
-0.011022736318409443,
0.00811508484184742,
0.02134883962571621,
-0.0070644267834723,
-0.022171197459101677,
0.014700056053698063,
0.008965309709310532,
0.03874920308589935,
0.12131644785404205,
0.02876780740916729,
-0.10256244242191315,
0.07604185491800308,
-0.011072658002376556,
-0.047215480357408524,
0.02055695466697216,
-0.09231997281312943,
-0.06536456197500229,
-0.04468748718500137,
0.004402347840368748,
0.02124256081879139,
-0.010381679981946945,
-0.007228020112961531,
-0.018616804853081703,
-0.032078493386507034,
-0.08601145446300507,
-0.04985106363892555,
-0.051677484065294266,
-0.12791027128696442,
0.004957260563969612,
-0.17825566232204437,
-0.037348851561546326,
-0.1142088919878006,
-0.1888502985239029,
-0.025541534647345543,
0.06542877107858658,
-0.012545998208224773,
-0.04878450930118561,
0.08610879629850388,
0.04203174263238907,
-0.02929599955677986,
-0.0017730691470205784,
0.0787469670176506,
-0.005023648962378502,
0.03858408331871033,
-0.02619892545044422,
0.06495856493711472,
0.0035693999379873276,
0.03362623602151871,
-0.05085953697562218,
0.06012364849448204,
-0.18586690723896027,
0.04127879813313484,
-0.07828420400619507,
-0.01653149165213108,
-0.08408647775650024,
-0.036280058324337006,
-0.0076492344960570335,
0.010810988955199718,
0.017681099474430084,
0.07412189990282059,
-0.1718817502260208,
-0.028403829783201218,
0.11015753448009491,
-0.15707793831825256,
-0.028552815318107605,
0.06407583504915237,
-0.04656250774860382,
0.09247063845396042,
0.07051795721054077,
0.15185588598251343,
-0.01972847431898117,
-0.08776506781578064,
0.0602576918900013,
-0.015721192583441734,
0.011750280857086182,
-0.012186737731099129,
0.06531550735235214,
-0.02193829044699669,
-0.1577969491481781,
0.038211070001125336,
-0.1328801065683365,
-0.0017653345130383968,
-0.07494405657052994,
0.018637778237462044,
-0.0010067473631352186,
-0.06923016160726547,
-0.07367284595966339,
-0.027080392464995384,
0.0642646849155426,
-0.0785890743136406,
-0.011136820539832115,
0.040893156081438065,
0.0693902000784874,
-0.07418110221624374,
0.06497722864151001,
-0.009069502353668213,
0.01259523443877697,
-0.07514709234237671,
-0.03997751697897911,
-0.1883137822151184,
0.04026486724615097,
0.0963665097951889,
0.01204549241811037,
-0.018091406673192978,
0.13073650002479553,
0.006451910361647606,
0.06570166349411011,
-0.05274747684597969,
0.01752728968858719,
-0.0036028039176017046,
-0.00063918880186975,
-0.08334465324878693,
-0.10313007235527039,
-0.07517320662736893,
-0.06887048482894897,
0.08407870680093765,
-0.1221526563167572,
0.021856052801012993,
-0.057948194444179535,
0.0369124710559845,
0.01617356762290001,
-0.08157758414745331,
-0.009838244877755642,
0.018657028675079346,
-0.055681392550468445,
-0.056003961712121964,
0.0419839546084404,
0.07019612193107605,
-0.008057622238993645,
0.0914812907576561,
-0.05617247521877289,
-0.09092794358730316,
0.035420216619968414,
0.09353223443031311,
-0.11638505756855011,
-0.005876403301954269,
-0.056712668389081955,
-0.0428168848156929,
-0.057340290397405624,
-0.024922342970967293,
0.07830417901277542,
-0.005667109973728657,
0.13536594808101654,
-0.07692527025938034,
-0.011840030550956726,
0.013418685644865036,
-0.01532718725502491,
-0.028200140222907066,
0.02890489250421524,
0.06792852282524109,
-0.07944294810295105,
0.01628820039331913,
0.03574816882610321,
0.011550127528607845,
0.08267012238502502,
-0.05200008302927017,
-0.08204049617052078,
0.016839250922203064,
0.03268825262784958,
0.02757004275918007,
0.07238858193159103,
-0.029564009979367256,
-0.01943548396229744,
0.03081458806991577,
0.015131480991840363,
0.009895188733935356,
-0.10476230829954147,
0.057770971208810806,
0.05494152754545212,
0.011566927656531334,
0.07578089088201523,
-0.011857390403747559,
-0.041977930814027786,
0.07757309079170227,
0.036314208060503006,
-0.003607403952628374,
-0.015279978513717651,
-0.016104841604828835,
-0.1184978112578392,
0.19649860262870789,
-0.06571485102176666,
-0.16888287663459778,
-0.07108917832374573,
-0.1221516951918602,
-0.010282757692039013,
0.020016267895698547,
0.038245320320129395,
-0.01968386583030224,
-0.048618972301483154,
-0.13063731789588928,
0.05498434230685234,
-0.04047960415482521,
0.0655592754483223,
0.11015763878822327,
-0.04779387637972832,
0.04822521284222603,
-0.12883387506008148,
-0.009529849514365196,
-0.08446569740772247,
-0.065731942653656,
0.0624760165810585,
-0.05124552920460701,
0.03261474519968033,
0.10016771405935287,
0.03371565788984299,
-0.016146443784236908,
-0.03236616775393486,
0.20562873780727386,
0.03982749953866005,
0.04039599001407623,
0.12354514002799988,
-0.05543474480509758,
0.05177931487560272,
0.07788045704364777,
0.013930046930909157,
-0.04788370057940483,
0.05552036315202713,
0.04483664780855179,
-0.0718248188495636,
-0.19281353056430817,
-0.02600179985165596,
-0.01512904092669487,
-0.05053076520562172,
0.07249177992343903,
0.03605254739522934,
0.023471493273973465,
0.06833932548761368,
0.013132805936038494,
0.06020783260464668,
0.002788733458146453,
0.10370469838380814,
0.01944280043244362,
-0.033249519765377045,
0.0898301750421524,
-0.019153395667672157,
-0.0061645726673305035,
0.0823061466217041,
-0.026475928723812103,
0.2897823452949524,
-0.027951549738645554,
0.0017475745407864451,
0.131146639585495,
0.026826802641153336,
0.06266181170940399,
0.12959957122802734,
-0.07208346575498581,
0.015711359679698944,
-0.07233327627182007,
-0.06391321122646332,
0.000025414174160687253,
0.0389837808907032,
-0.060937680304050446,
0.0072100902907550335,
-0.06868505477905273,
0.006972138304263353,
-0.015484286472201347,
0.317645788192749,
0.09790795296430588,
-0.10772691667079926,
-0.05034741014242172,
-0.000898002996109426,
-0.09839058667421341,
-0.06266780197620392,
0.041943710297346115,
0.05899900570511818,
-0.13801105320453644,
0.01995670795440674,
-0.02704443596303463,
0.06845583766698837,
-0.009292824193835258,
0.015350726433098316,
0.04241073131561279,
0.0471380278468132,
-0.04191992059350014,
-0.00008946579328039661,
-0.18338967859745026,
0.1987704187631607,
0.008737199008464813,
0.020130079239606857,
-0.05324435979127884,
0.031987160444259644,
0.012453100644052029,
-0.03269157186150551,
0.058571964502334595,
0.019796181470155716,
-0.01480821892619133,
-0.04499613866209984,
-0.045628082007169724,
0.009667474776506424,
0.08055680990219116,
-0.03496398776769638,
0.10742120444774628,
-0.0021690004505217075,
0.04228493198752403,
0.019145917147397995,
0.09793072938919067,
-0.18592022359371185,
-0.09867054969072342,
0.030377138406038284,
-0.05922586843371391,
-0.11020814627408981,
-0.07807234674692154,
-0.09425143897533417,
-0.013032715767621994,
0.23950740694999695,
-0.10841698944568634,
-0.07601499557495117,
-0.09929735958576202,
0.018899284303188324,
0.10692072659730911,
-0.0421784482896328,
0.03171186149120331,
-0.010686981491744518,
0.12315402179956436,
-0.06865410506725311,
-0.13210168480873108,
0.020082075148820877,
-0.09965284168720245,
-0.16035421192646027,
-0.06575997173786163,
0.11723671853542328,
0.06606230139732361,
0.031647395342588425,
-0.02926655113697052,
0.023994501680135727,
0.0393822081387043,
-0.044242966920137405,
-0.005187894683331251,
0.0710044652223587,
0.10414741188287735,
0.039983853697776794,
-0.11339367181062698,
0.011822134256362915,
-0.0643957257270813,
-0.0678286924958229,
0.07876414805650711,
0.2611890137195587,
-0.051217012107372284,
0.12498757988214493,
0.12205027788877487,
-0.07718785107135773,
-0.1524691879749298,
0.03448350355029106,
0.08722582459449768,
-0.022357042878866196,
0.0174724031239748,
-0.1545378416776657,
0.09215142577886581,
0.1197781190276146,
-0.01679895631968975,
-0.005097220651805401,
-0.1839226931333542,
-0.12944889068603516,
0.06918592751026154,
0.10984672605991364,
0.2581489682197571,
-0.07037702202796936,
-0.039870262145996094,
0.019246982410550117,
-0.08788608759641647,
0.020071212202310562,
0.11807102710008621,
0.06693517416715622,
-0.027760792523622513,
-0.08742374181747437,
0.009206201881170273,
-0.04485968127846718,
0.0900738313794136,
0.051484789699316025,
0.062146853655576706,
-0.006355814635753632,
0.01768658496439457,
-0.008586075156927109,
-0.045527372509241104,
0.06275802105665207,
0.015964055433869362,
0.04493427649140358,
-0.07606730610132217,
-0.0297095850110054,
-0.07693696022033691,
0.03063918650150299,
-0.027329551056027412,
-0.07740724831819534,
-0.061983875930309296,
0.08334861695766449,
0.04746149107813835,
-0.029954925179481506,
0.0217964556068182,
0.019168632104992867,
0.12116552144289017,
0.16611820459365845,
0.005991767626255751,
-0.061291325837373734,
-0.06272806972265244,
-0.0376247875392437,
-0.018359851092100143,
0.0665164664387703,
-0.035927895456552505,
0.015212852507829666,
0.061420246958732605,
0.021659163758158684,
0.10030588507652283,
0.057678937911987305,
-0.11406970769166946,
-0.023440536111593246,
0.03752026706933975,
-0.16580042243003845,
0.03360770642757416,
0.0003421783330850303,
0.029748467728495598,
-0.038203950971364975,
0.030946945771574974,
0.13995584845542908,
-0.05902734026312828,
-0.03371455520391464,
-0.04158274829387665,
0.06740367412567139,
0.025381216779351234,
0.14356257021427155,
0.03180775046348572,
0.036692529916763306,
-0.08477338403463364,
0.12042991816997528,
0.03291769325733185,
-0.030594201758503914,
0.022216161713004112,
-0.011334818787872791,
-0.1107897013425827,
0.011355193331837654,
0.05682513862848282,
0.03666184842586517,
-0.05525724217295647,
-0.004917167592793703,
-0.023537451401352882,
-0.07964791357517242,
0.06108376383781433,
0.19656315445899963,
0.06349876523017883,
0.06840124726295471,
-0.053175121545791626,
-0.03775153309106827,
-0.0779728889465332,
0.039533913135528564,
0.0276431106030941,
0.07577083259820938,
-0.073366679251194,
0.09624835103750229,
0.017202340066432953,
0.03716393932700157,
-0.030933529138565063,
-0.05458688363432884,
-0.10506335645914078,
-0.05319995433092117,
-0.09973886609077454,
-0.0008203936740756035,
-0.08717598766088486,
-0.03857610002160072,
-0.0026369693223387003,
0.0036066179163753986,
-0.011693555861711502,
0.04684631898999214,
-0.0590689480304718,
-0.007931539788842201,
-0.016866318881511688,
0.03718286380171776,
-0.06174347922205925,
-0.03458993509411812,
0.023447442799806595,
-0.10415269434452057,
0.08806097507476807,
0.04556534066796303,
0.0072125936858356,
0.010038447566330433,
0.0947297066450119,
-0.023266542702913284,
0.019963251426815987,
0.013156085275113583,
-0.04651542752981186,
-0.08346463739871979,
-0.004215369001030922,
-0.005610054824501276,
-0.01665274053812027,
-0.0021786957513540983,
0.07655653357505798,
-0.08545854687690735,
0.03691964969038963,
-0.0034355681855231524,
-0.004526088945567608,
-0.07161140441894531,
-0.011228105053305626,
0.11212713271379471,
0.09067140519618988,
0.04664074257016182,
-0.09024425595998764,
0.009421796537935734,
-0.13768713176250458,
-0.04020338132977486,
0.002679420169442892,
-0.014795210212469101,
-0.12477833777666092,
-0.005943049676716328,
0.022020496428012848,
-0.001989562064409256,
0.23146705329418182,
-0.05492045730352402,
-0.025866569951176643,
0.023079220205545425,
-0.09568227082490921,
0.12196535617113113,
-0.024913430213928223,
0.17773082852363586,
-0.01013276632875204,
-0.04255862534046173,
-0.008623459376394749,
0.0440208837389946,
0.01917085237801075,
-0.02578761614859104,
0.19492168724536896,
0.1403319239616394,
0.029856987297534943,
0.037686094641685486,
-0.020704807713627815,
-0.0053082359954714775,
-0.02839730493724346,
-0.037756238132715225,
0.03965171426534653,
0.04418051987886429,
0.016850799322128296,
0.14365971088409424,
0.07105879485607147,
-0.17152301967144012,
0.037664443254470825,
-0.03039795719087124,
-0.03965303301811218,
-0.11024618148803711,
-0.0888047143816948,
-0.024566063657402992,
-0.06799568235874176,
0.00909489393234253,
-0.12939804792404175,
0.0010918255429714918,
0.17274916172027588,
0.06489478051662445,
0.030122840777039528,
0.009598190896213055,
-0.12134722620248795,
-0.03477868437767029,
0.05787709355354309,
0.013757164590060711,
0.01961367204785347,
0.04948480799794197,
0.0022997495252639055,
0.05067360773682594,
0.031083397567272186,
0.012696348130702972,
-0.0016601255629211664,
0.07289955765008926,
0.015778789296746254,
0.040929365903139114,
-0.06322586536407471,
0.00031291035702452064,
-0.04339364171028137,
0.07297654449939728,
0.10090730339288712,
0.04640146344900131,
-0.053268302232027054,
-0.006589971948415041,
0.15369677543640137,
-0.03474931791424751,
-0.0059429798275232315,
-0.12568099796772003,
0.3235117197036743,
0.019344039261341095,
0.010350862517952919,
0.04106726869940758,
-0.07239866256713867,
-0.048418447375297546,
0.21460404992103577,
0.10376748442649841,
-0.015636075288057327,
-0.018555041402578354,
-0.002523924922570586,
-0.03016369603574276,
-0.02304217219352722,
0.1535230427980423,
0.03945763409137726,
0.13270333409309387,
-0.05068134143948555,
-0.04023462161421776,
-0.028723224997520447,
-0.010958070866763592,
-0.11859162151813507,
0.1310778558254242,
-0.01145706046372652,
-0.02976151742041111,
-0.0654742494225502,
0.028300266712903976,
0.06610452383756638,
-0.3131106197834015,
0.0006415362586267292,
-0.03344991058111191,
-0.10871676355600357,
-0.010667439550161362,
-0.029795989394187927,
-0.02396632730960846,
0.047639667987823486,
-0.03708456829190254,
0.06820202618837357,
0.0316939577460289,
0.03367452695965767,
-0.02469714917242527,
-0.1087055578827858,
0.17870299518108368,
0.060605861246585846,
0.09300459176301956,
0.021001417189836502,
0.07515378296375275,
0.058248307555913925,
0.03299529477953911,
-0.09324697405099869,
0.04888898506760597,
0.014476864598691463,
-0.08999285846948624,
-0.048597708344459534,
0.11374334990978241,
0.0016126951668411493,
0.04832728952169418,
0.035823751240968704,
-0.09970689564943314,
0.020953619852662086,
0.06952892988920212,
-0.06585265696048737,
-0.09740805625915527,
-0.0032346490770578384,
-0.09140503406524658,
0.16316279768943787,
0.14758022129535675,
-0.016300607472658157,
0.019740071147680283,
-0.06606164574623108,
-0.007573837414383888,
0.053965214639902115,
0.00717091653496027,
-0.023859668523073196,
-0.20095790922641754,
0.036027807742357254,
-0.08890692889690399,
-0.008527150377631187,
-0.22867779433727264,
-0.10172038525342941,
-0.006798524409532547,
-0.047465261071920395,
-0.027790658175945282,
0.057537391781806946,
0.03199540078639984,
0.07122897356748581,
-0.018833016976714134,
-0.015044924803078175,
-0.0339016355574131,
0.09264149516820908,
-0.11309453099966049,
-0.06047242507338524
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 2000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 2000k (max: 2000k, i.e., 2M steps).
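As an illustration of the seed-sweep analyses this collection is meant for, the sketch below loads the same training step for several seeds and compares a simple statistic across them. It is only a sketch: it assumes the sibling repositories follow the naming pattern of this one (`google/multiberts-seed_<i>-step_2000k`) for the first five seeds, which are the runs released with intermediate checkpoints, and the embedding-norm comparison is an arbitrary example statistic.
```
from transformers import BertModel

# Step-tagged checkpoints are released for the first five runs only (see above).
# The naming pattern below is assumed to match this repository's.
seeds = [0, 1, 2, 3, 4]
models = {seed: BertModel.from_pretrained(f"google/multiberts-seed_{seed}-step_2000k")
          for seed in seeds}

# Example statistic: norm of the word-embedding matrix, compared across seeds.
for seed, model in models.items():
    norm = model.embeddings.word_embeddings.weight.norm().item()
    print(f"seed {seed}: embedding norm {norm:.2f}")
```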
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_2000k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_2000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_2000k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_2000k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
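Because the checkpoint was trained with the MLM objective, it can also be used to predict masked tokens directly. The snippet below is a minimal sketch, not part of the official card: it assumes the released weights include the masked-language-modelling head (the checkpoint is tagged for pretraining, so they should be; otherwise `BertForMaskedLM` initialises that head randomly and warns accordingly), and the example sentence is arbitrary.
```
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_2000k')
model = BertForMaskedLM.from_pretrained('google/multiberts-seed_3-step_2000k')

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary entry there.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```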
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_2000k"]}
| null |
google/multiberts-seed_3-step_2000k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_2000k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 2000k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 2000k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 2000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 2000k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_2000k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 2000k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 2000k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07200876623392105,
0.09652356803417206,
-0.0020853758323937654,
0.039029862731695175,
0.08210990577936172,
-0.016037287190556526,
0.08050043135881424,
0.09813366085290909,
-0.03660482540726662,
0.02239454723894596,
0.08646566420793533,
0.009534423239529133,
0.01446748897433281,
0.0947868600487709,
0.02088412083685398,
-0.22192837297916412,
0.027418572455644608,
-0.032773442566394806,
-0.07542382925748825,
0.07833551615476608,
0.09936603158712387,
-0.08340329676866531,
0.04597191512584686,
0.02491886541247368,
-0.11222143471240997,
0.05081703141331673,
-0.010130107402801514,
-0.020782088860869408,
0.12911848723888397,
0.00011523009743541479,
0.05170182138681412,
0.0535014346241951,
0.03596769645810127,
-0.13654986023902893,
0.007589597254991531,
0.05597306415438652,
0.05779707059264183,
0.03978264704346657,
0.022341107949614525,
0.07555841654539108,
0.0014023574767634273,
0.023278938606381416,
0.05089113116264343,
0.021194711327552795,
-0.07391542941331863,
-0.050856102257966995,
-0.10622473061084747,
0.038117535412311554,
0.0250269565731287,
0.013020842336118221,
0.01377280242741108,
0.13443981111049652,
-0.03859700262546539,
0.049432460218667984,
0.19013755023479462,
-0.3263591229915619,
-0.019529694691300392,
0.08520890772342682,
0.04650010168552399,
0.1219249814748764,
-0.005414807237684727,
-0.01600421965122223,
0.07147979736328125,
0.029498040676116943,
0.09455988556146622,
-0.042998481541872025,
0.021417774260044098,
-0.049783460795879364,
-0.16221684217453003,
-0.04114781320095062,
0.10431275516748428,
-0.003979919943958521,
-0.13366052508354187,
-0.04641561582684517,
-0.032608918845653534,
0.02888336032629013,
0.009320283308625221,
-0.033402569591999054,
0.03248980641365051,
0.005241919308900833,
-0.024909362196922302,
-0.005886220373213291,
-0.10202876478433609,
-0.048275917768478394,
0.033108100295066833,
0.08728723227977753,
0.10069335997104645,
0.05986916646361351,
0.0038173082284629345,
0.10745057463645935,
-0.18261976540088654,
-0.04681355506181717,
-0.03184547275304794,
-0.06067688763141632,
-0.049678925424814224,
-0.014275940135121346,
-0.11441455036401749,
-0.051396872848272324,
0.015528075397014618,
0.13826832175254822,
-0.0006668220739811659,
0.03446633368730545,
-0.024250725284218788,
0.004593051504343748,
0.06140883266925812,
0.046857308596372604,
-0.013590337708592415,
0.017269916832447052,
0.02253105118870735,
-0.011045843362808228,
-0.021850235760211945,
0.015313662588596344,
0.005202293395996094,
0.04219983518123627,
0.11966744065284729,
0.02990163303911686,
-0.10213939845561981,
0.08019471913576126,
-0.010241059586405754,
-0.04781464487314224,
0.017339374870061874,
-0.09325489401817322,
-0.06574563682079315,
-0.044135622680187225,
0.004186502192169428,
0.01782768778502941,
-0.010517725721001625,
-0.004833607468754053,
-0.018919872120022774,
-0.030455149710178375,
-0.08270275592803955,
-0.04997226595878601,
-0.05346313863992691,
-0.12877099215984344,
0.005339380353689194,
-0.17549891769886017,
-0.035961076617240906,
-0.11355346441268921,
-0.18811681866645813,
-0.026206914335489273,
0.06473299115896225,
-0.010248880833387375,
-0.051383621990680695,
0.08127790689468384,
0.03841409459710121,
-0.029672617092728615,
-0.0023993367794901133,
0.07995283603668213,
-0.0031417140271514654,
0.03789301589131355,
-0.024676332250237465,
0.0592171847820282,
0.0012581032933667302,
0.03546871244907379,
-0.05206669121980667,
0.06007300317287445,
-0.16742344200611115,
0.04182850196957588,
-0.0802038386464119,
-0.02042541652917862,
-0.08426190167665482,
-0.03593634441494942,
-0.006955393124371767,
0.009091595187783241,
0.021445659920573235,
0.07013913989067078,
-0.1773342490196228,
-0.02633565478026867,
0.10557486116886139,
-0.15485139191150665,
-0.027808569371700287,
0.06270019710063934,
-0.04804909974336624,
0.09143389016389847,
0.06788425147533417,
0.1494818776845932,
-0.017945894971489906,
-0.08100315928459167,
0.05736920237541199,
-0.016221387311816216,
0.015139936469495296,
-0.010089552029967308,
0.06864915788173676,
-0.021319875493645668,
-0.15189392864704132,
0.040426719933748245,
-0.14029471576213837,
-0.002347968053072691,
-0.074156753718853,
0.01932181976735592,
-0.00519123999401927,
-0.06803114712238312,
-0.07916093617677689,
-0.02808655984699726,
0.06642016768455505,
-0.07477658987045288,
-0.012789466418325901,
0.04153621941804886,
0.0701577290892601,
-0.07541311532258987,
0.0654279887676239,
-0.008173453621566296,
0.01458021067082882,
-0.08148615807294846,
-0.04036252200603485,
-0.19018645584583282,
0.03568203002214432,
0.09892597794532776,
0.014761070720851421,
-0.019579309970140457,
0.1309739649295807,
0.00483780587092042,
0.061829824000597,
-0.05120966210961342,
0.015271591953933239,
-0.003831251757219434,
-0.0006472488748840988,
-0.08047816902399063,
-0.09848683327436447,
-0.07419787347316742,
-0.07117462903261185,
0.08094435930252075,
-0.11636121571063995,
0.019508376717567444,
-0.0568728968501091,
0.038024429231882095,
0.01409351546317339,
-0.0824630931019783,
-0.007907585240900517,
0.018481288105249405,
-0.05918261036276817,
-0.059767648577690125,
0.04165166616439819,
0.07078858464956284,
-0.010051371529698372,
0.08630092442035675,
-0.05030803009867668,
-0.08580265194177628,
0.03355519473552704,
0.104065902531147,
-0.1122148334980011,
0.0008602039306424558,
-0.05833664909005165,
-0.04122017323970795,
-0.05609450489282608,
-0.018552325665950775,
0.08531231433153152,
-0.005314787849783897,
0.13538269698619843,
-0.07730165123939514,
-0.010545223020017147,
0.009894401766359806,
-0.014759468846023083,
-0.028545930981636047,
0.032767631113529205,
0.06883637607097626,
-0.07760146260261536,
0.01740841194987297,
0.03339950367808342,
0.009405055083334446,
0.07502162456512451,
-0.05385001003742218,
-0.08175184577703476,
0.01765485294163227,
0.03904688358306885,
0.0294816754758358,
0.06836633384227753,
-0.02055654302239418,
-0.017985224723815918,
0.03214170038700104,
0.016621895134449005,
0.010627697221934795,
-0.10632947832345963,
0.057451486587524414,
0.054477859288454056,
0.009808871895074844,
0.07332677394151688,
-0.01024626661092043,
-0.04263049364089966,
0.07575960457324982,
0.0392502099275589,
-0.0031255383510142565,
-0.011611251160502434,
-0.016817642375826836,
-0.11513657867908478,
0.1975240260362625,
-0.06721513718366623,
-0.17128115892410278,
-0.06965786963701248,
-0.12178181856870651,
-0.01417744904756546,
0.019243594259023666,
0.03880603611469269,
-0.021841133013367653,
-0.05120069906115532,
-0.12799246609210968,
0.056001968681812286,
-0.04407350346446037,
0.06556674093008041,
0.11015551537275314,
-0.04773480445146561,
0.05000472441315651,
-0.1284482181072235,
-0.0074368808418512344,
-0.08296050131320953,
-0.07000459730625153,
0.059146638959646225,
-0.050112806260585785,
0.03351183608174324,
0.09513434022665024,
0.0339927077293396,
-0.01407907996326685,
-0.029938768595457077,
0.2050653100013733,
0.0416112057864666,
0.03558337315917015,
0.1214279755949974,
-0.05966242030262947,
0.05091472715139389,
0.08306898176670074,
0.012821176089346409,
-0.047266021370887756,
0.055308371782302856,
0.04667263478040695,
-0.06962613761425018,
-0.1917925775051117,
-0.025237273424863815,
-0.014693014323711395,
-0.05430856719613075,
0.06983308494091034,
0.04084887355566025,
0.0150346914306283,
0.06822667270898819,
0.012589428573846817,
0.05268826708197594,
-0.00039191494579426944,
0.10508072376251221,
0.01782315969467163,
-0.034058358520269394,
0.09047883749008179,
-0.018872302025556564,
-0.008014767430722713,
0.08140836656093597,
-0.022656075656414032,
0.29032227396965027,
-0.027840571478009224,
0.003004330676048994,
0.13129965960979462,
0.032744601368904114,
0.061571795493364334,
0.12876775860786438,
-0.0736088752746582,
0.016466530039906502,
-0.07248539477586746,
-0.062387336045503616,
0.004500096198171377,
0.039050277322530746,
-0.061804670840501785,
0.009416060522198677,
-0.06789405643939972,
0.006996525451540947,
-0.017160069197416306,
0.3160047233104706,
0.09762357920408249,
-0.10851558297872543,
-0.050089530646800995,
0.000049015667173080146,
-0.09596787393093109,
-0.0657883957028389,
0.041591763496398926,
0.05594576895236969,
-0.13762205839157104,
0.01483901683241129,
-0.02858024835586548,
0.0700581967830658,
-0.014236104674637318,
0.016315922141075134,
0.044486381113529205,
0.047774504870176315,
-0.03970172256231308,
0.005184395704418421,
-0.19054870307445526,
0.19792082905769348,
0.007102737203240395,
0.018631253391504288,
-0.05231654644012451,
0.03525217995047569,
0.008939123712480068,
-0.033043526113033295,
0.059517111629247665,
0.018363671377301216,
-0.02077259123325348,
-0.04559929668903351,
-0.045246463268995285,
0.014203439466655254,
0.0829257071018219,
-0.035525739192962646,
0.10759919881820679,
-0.0035580911207944155,
0.04305108264088631,
0.02168995328247547,
0.10135138034820557,
-0.18718186020851135,
-0.10285355150699615,
0.035165902227163315,
-0.05414242297410965,
-0.10540113598108292,
-0.07857483625411987,
-0.09535477310419083,
-0.016521954908967018,
0.25423648953437805,
-0.1150343120098114,
-0.07585874199867249,
-0.10086531937122345,
0.023090196773409843,
0.10845670849084854,
-0.04315376654267311,
0.03104577027261257,
-0.014594094827771187,
0.12377676367759705,
-0.06632739305496216,
-0.13449977338314056,
0.01924702897667885,
-0.09809866547584534,
-0.15897293388843536,
-0.06493519991636276,
0.11331850290298462,
0.0590939037501812,
0.02981548011302948,
-0.03096509538590908,
0.02538176439702511,
0.03658951073884964,
-0.0423189178109169,
-0.01068636029958725,
0.07479553669691086,
0.098198302090168,
0.03489945828914642,
-0.11802755296230316,
0.020697832107543945,
-0.06435924768447876,
-0.06753918528556824,
0.08183485269546509,
0.25613105297088623,
-0.051941804587841034,
0.12422318756580353,
0.11179357022047043,
-0.07794893532991409,
-0.151212677359581,
0.0305631086230278,
0.08965930342674255,
-0.020166881382465363,
0.014039873145520687,
-0.15750789642333984,
0.09063012152910233,
0.11534233391284943,
-0.018818246200680733,
-0.00966592039912939,
-0.18112541735172272,
-0.12698204815387726,
0.06911219656467438,
0.10679783672094345,
0.2557004690170288,
-0.06960034370422363,
-0.040428828448057175,
0.01898176781833172,
-0.07854073494672775,
0.018965944647789,
0.127500981092453,
0.07006048411130905,
-0.026755239814519882,
-0.08548008650541306,
0.009021537378430367,
-0.044871341437101364,
0.08892709761857986,
0.05262408033013344,
0.06122530624270439,
-0.005460015032440424,
0.018196631222963333,
-0.01764119230210781,
-0.0450931042432785,
0.06424204260110855,
0.016634346917271614,
0.04962786287069321,
-0.07610690593719482,
-0.029355520382523537,
-0.07422387599945068,
0.02821137011051178,
-0.02540379762649536,
-0.07630575448274612,
-0.0605020597577095,
0.08350247889757156,
0.04927835986018181,
-0.027427954599261284,
0.02611180767416954,
0.022258572280406952,
0.11495817452669144,
0.15788142383098602,
0.0011128183687105775,
-0.05344989150762558,
-0.06153780594468117,
-0.040012720972299576,
-0.017493098974227905,
0.0658818781375885,
-0.036903511732816696,
0.015869000926613808,
0.06265413016080856,
0.02151670679450035,
0.09866391867399216,
0.05835516378283501,
-0.1139800101518631,
-0.024595387279987335,
0.03451034799218178,
-0.16610810160636902,
0.032988596707582474,
0.0011805722024291754,
0.033018942922353745,
-0.0371917299926281,
0.03212952986359596,
0.14320407807826996,
-0.058724336326122284,
-0.033281274139881134,
-0.04003840312361717,
0.06704918295145035,
0.02793116681277752,
0.14283938705921173,
0.030356213450431824,
0.03503045439720154,
-0.08454477041959763,
0.12308865040540695,
0.03511520475149155,
-0.026538075879216194,
0.024093976244330406,
-0.019753582775592804,
-0.11166927963495255,
0.012037798762321472,
0.06311026215553284,
0.03819289430975914,
-0.049882300198078156,
-0.007363833021372557,
-0.026731587946414948,
-0.07718463987112045,
0.06106679514050484,
0.19047509133815765,
0.06518427282571793,
0.07073855400085449,
-0.054571717977523804,
-0.04150013253092766,
-0.07713016122579575,
0.04124753177165985,
0.030316561460494995,
0.07727743685245514,
-0.07626798748970032,
0.08683112263679504,
0.015409363433718681,
0.03520311415195465,
-0.029867185279726982,
-0.05460452288389206,
-0.10518047958612442,
-0.05466168746352196,
-0.09251940995454788,
0.00009062438039109111,
-0.07831314951181412,
-0.038843784481287,
-0.0007820604951120913,
0.001456456957384944,
-0.010655620135366917,
0.050093717873096466,
-0.05966729298233986,
-0.010932493023574352,
-0.020312093198299408,
0.03778170794248581,
-0.06183644384145737,
-0.03579874709248543,
0.02206495776772499,
-0.10219777375459671,
0.08881763368844986,
0.04502926394343376,
0.0054087513126432896,
0.008645474910736084,
0.07978063821792603,
-0.025656968355178833,
0.02203178033232689,
0.014499402604997158,
-0.04668562859296799,
-0.08468697965145111,
-0.003441322362050414,
-0.0028824934270232916,
-0.017083361744880676,
-0.00470114266499877,
0.0808710902929306,
-0.08585664629936218,
0.030923964455723763,
-0.005136936902999878,
-0.0008289808174595237,
-0.06886331737041473,
-0.01189963985234499,
0.10704142600297928,
0.09527765214443207,
0.04671629145741463,
-0.09056472778320312,
0.011392524465918541,
-0.13433918356895447,
-0.03827210143208504,
0.003222797065973282,
-0.01660957746207714,
-0.133985236287117,
-0.007648211903870106,
0.021624255925416946,
-0.002492486732080579,
0.23046603798866272,
-0.057308584451675415,
-0.020034966990351677,
0.020368998870253563,
-0.08967208862304688,
0.12760984897613525,
-0.024090131744742393,
0.17881107330322266,
-0.008130152709782124,
-0.04054689779877663,
-0.007755234371870756,
0.043557826429605484,
0.015316500328481197,
-0.028797760605812073,
0.18506653606891632,
0.14375010132789612,
0.0308651365339756,
0.03719043731689453,
-0.01935785822570324,
-0.005163050256669521,
-0.022245416417717934,
-0.031213928014039993,
0.03860216960310936,
0.045073751360177994,
0.014992371201515198,
0.14610141515731812,
0.06430758535861969,
-0.16772927343845367,
0.03477171063423157,
-0.028582869097590446,
-0.03654637932777405,
-0.10959070175886154,
-0.09641332924365997,
-0.029565995559096336,
-0.06876673549413681,
0.01115387212485075,
-0.12961551547050476,
0.00027527965721674263,
0.17584437131881714,
0.06457337737083435,
0.030858388170599937,
0.0077711897902190685,
-0.11884672194719315,
-0.03439787030220032,
0.05386628583073616,
0.010636486113071442,
0.020370014011859894,
0.05001705884933472,
0.0018890983192250133,
0.05474843457341194,
0.0336230993270874,
0.01447685994207859,
0.0003008791827596724,
0.07938974350690842,
0.016481507569551468,
0.03950146585702896,
-0.06158895418047905,
-0.000800992944277823,
-0.042283836752176285,
0.07130526751279831,
0.10639925301074982,
0.04536820575594902,
-0.054737627506256104,
-0.00754416361451149,
0.15100312232971191,
-0.03543038293719292,
-0.0019262823043391109,
-0.1264326274394989,
0.32542160153388977,
0.01499970443546772,
0.010570742189884186,
0.04124961793422699,
-0.07225403189659119,
-0.050923749804496765,
0.2160889059305191,
0.09840603917837143,
-0.01599304750561714,
-0.020185457542538643,
0.00012564640201162547,
-0.029783017933368683,
-0.024664489552378654,
0.1493675857782364,
0.04055703058838844,
0.12367074191570282,
-0.04884074255824089,
-0.04806920140981674,
-0.02741171047091484,
-0.01010845322161913,
-0.1185343861579895,
0.1290021389722824,
-0.012918847613036633,
-0.029975825920701027,
-0.06848877668380737,
0.026433615013957024,
0.06554269790649414,
-0.30956047773361206,
-0.0058309901505708694,
-0.032418143004179,
-0.1113818809390068,
-0.010231186635792255,
-0.023853307589888573,
-0.02429736591875553,
0.046355657279491425,
-0.04039912670850754,
0.07032588869333267,
0.03190626576542854,
0.034146104007959366,
-0.022281767800450325,
-0.102684386074543,
0.17497393488883972,
0.05671221762895584,
0.09043731540441513,
0.023684455081820488,
0.07561501860618591,
0.0598018616437912,
0.031856223940849304,
-0.09545701742172241,
0.04914109408855438,
0.014516975730657578,
-0.0922473818063736,
-0.04734297841787338,
0.11708078533411026,
0.0008670426905155182,
0.051587436348199844,
0.035861462354660034,
-0.0961282029747963,
0.02312142588198185,
0.06968600302934647,
-0.0651790127158165,
-0.10024520754814148,
-0.0002836025960277766,
-0.09310716390609741,
0.16340738534927368,
0.14914698898792267,
-0.0154454680159688,
0.019932545721530914,
-0.06816180795431137,
-0.003464620793238282,
0.04857936128973961,
0.007485617883503437,
-0.02245965227484703,
-0.19705158472061157,
0.03756030648946762,
-0.09137462079524994,
-0.010626477189362049,
-0.2316737025976181,
-0.10226601362228394,
-0.008931606076657772,
-0.04853013530373573,
-0.026149190962314606,
0.05736793577671051,
0.03136828914284706,
0.06891996413469315,
-0.019534610211849213,
-0.019915631040930748,
-0.03216495364904404,
0.08648122847080231,
-0.1106020137667656,
-0.05993703380227089
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_200k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_200k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_200k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
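Intermediate checkpoints such as this one are mainly useful for studying how representations evolve over pre-training. The sketch below is illustrative only: the step values are an assumed subset of the saved checkpoints (the full list is documented in the MultiBERTs release), and the probe sentence and the use of the [CLS] vector are arbitrary choices.
```
import torch
from transformers import BertTokenizer, BertModel

steps = ['200k', '1900k', '2000k']  # illustrative subset of saved steps
text = "Replace me by any text you'd like."

vectors = []
for step in steps:
    name = f"google/multiberts-seed_3-step_{step}"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    encoded = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        output = model(**encoded)
    vectors.append(output.last_hidden_state[0, 0])  # [CLS] representation

# Cosine similarity of each checkpoint's [CLS] vector to the final (2000k) one.
final = vectors[-1]
for step, vec in zip(steps, vectors):
    sim = torch.nn.functional.cosine_similarity(vec, final, dim=0).item()
    print(f"step {step}: similarity to final {sim:.3f}")
```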
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_200k"]}
| null |
google/multiberts-seed_3-step_200k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_200k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 200k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 200k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 200k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_200k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 200k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 200k (max: 2000k, i.e., 2M steps)."
] |
[
-0.076405368745327,
0.09174814075231552,
-0.0021325084380805492,
0.03946276754140854,
0.08516877144575119,
-0.01611383631825447,
0.07880793511867523,
0.09881189465522766,
-0.023848881945014,
0.024501698091626167,
0.08153806626796722,
0.0117496307939291,
0.01671537384390831,
0.10166340321302414,
0.02396753802895546,
-0.2222602218389511,
0.026422033086419106,
-0.03135710582137108,
-0.08019206672906876,
0.07853318005800247,
0.09695600718259811,
-0.08154626190662384,
0.04635526239871979,
0.02179057151079178,
-0.11260641366243362,
0.049808572977781296,
-0.007303511258214712,
-0.020261256024241447,
0.13083653151988983,
-0.0015695493202656507,
0.0467258095741272,
0.05156014487147331,
0.03944037854671478,
-0.13552318513393402,
0.00680236890912056,
0.0564778633415699,
0.057849541306495667,
0.039256032556295395,
0.026109294965863228,
0.0790201872587204,
0.006414934992790222,
0.02742021530866623,
0.04735901579260826,
0.025736747309565544,
-0.07386535406112671,
-0.05270076543092728,
-0.10401192307472229,
0.035289451479911804,
0.028257060796022415,
0.013824679888784885,
0.010087531059980392,
0.1311637908220291,
-0.03743746504187584,
0.04585183039307594,
0.18735498189926147,
-0.33472272753715515,
-0.01790880411863327,
0.08468540757894516,
0.04801902547478676,
0.12392708659172058,
-0.0030938636045902967,
-0.01735735312104225,
0.07430762052536011,
0.034539010375738144,
0.09259412437677383,
-0.04121602326631546,
0.030777690932154655,
-0.05353758856654167,
-0.16263647377490997,
-0.04187864065170288,
0.10370021313428879,
-0.002056920900940895,
-0.1379012167453766,
-0.04405219107866287,
-0.03537820652127266,
0.030530696734786034,
0.011162619106471539,
-0.03647587448358536,
0.03102724626660347,
0.010178353637456894,
-0.02265270985662937,
-0.003559316275641322,
-0.10273431986570358,
-0.049576397985219955,
0.0348675474524498,
0.08088448643684387,
0.10172205418348312,
0.061280544847249985,
0.0031334899831563234,
0.10994846373796463,
-0.17764097452163696,
-0.04618899151682854,
-0.02878011204302311,
-0.06333774328231812,
-0.04664314538240433,
-0.014604587107896805,
-0.10854120552539825,
-0.0515618696808815,
0.011993383057415485,
0.1300382912158966,
0.0025956283789128065,
0.03279917314648628,
-0.024946920573711395,
0.005666954908519983,
0.05877664312720299,
0.04274015501141548,
-0.013527601957321167,
0.02033803053200245,
0.01916329748928547,
-0.007943219505250454,
-0.023236069828271866,
0.01601368747651577,
0.004769948776811361,
0.03817053511738777,
0.1217278391122818,
0.029871324077248573,
-0.10399187356233597,
0.07516314834356308,
-0.01601460948586464,
-0.048301562666893005,
0.012764319777488708,
-0.0902566984295845,
-0.06411989033222198,
-0.04535355418920517,
0.0037773274816572666,
0.01937500573694706,
-0.010289590805768967,
-0.010605861432850361,
-0.02212774008512497,
-0.030972767621278763,
-0.08402753621339798,
-0.047861579805612564,
-0.052971698343753815,
-0.1281319260597229,
0.005276752170175314,
-0.17784667015075684,
-0.038594383746385574,
-0.11717281490564346,
-0.1821938008069992,
-0.02491920068860054,
0.06261051446199417,
-0.01290980540215969,
-0.05039769038558006,
0.0807153657078743,
0.039496950805187225,
-0.02991539239883423,
-0.0035124861169606447,
0.07149545103311539,
-0.0056250980123877525,
0.04041760042309761,
-0.025205273181200027,
0.06570959091186523,
0.010904530063271523,
0.034551821649074554,
-0.05338197574019432,
0.06143835932016373,
-0.17272500693798065,
0.04112059623003006,
-0.07855674624443054,
-0.019636917859315872,
-0.08556059747934341,
-0.037813786417245865,
-0.00770053593441844,
0.009611190296709538,
0.021853132173419,
0.07687452435493469,
-0.18873900175094604,
-0.026181351393461227,
0.1156168207526207,
-0.15607386827468872,
-0.026377340778708458,
0.06948434561491013,
-0.04321238771080971,
0.09331917762756348,
0.07006547600030899,
0.15103067457675934,
-0.01594359613955021,
-0.08389262109994888,
0.05350999906659126,
-0.014111384749412537,
0.014755883254110813,
-0.014367288909852505,
0.06636373698711395,
-0.022670848295092583,
-0.1553211808204651,
0.0378720723092556,
-0.1303897649049759,
-0.0009972481057047844,
-0.07606290280818939,
0.01927403174340725,
-0.007181285880506039,
-0.06698045879602432,
-0.07411768287420273,
-0.02755674347281456,
0.06259546428918839,
-0.07866855710744858,
-0.013042599894106388,
0.035099368542432785,
0.07076500356197357,
-0.07479167729616165,
0.06349223107099533,
-0.011766658164560795,
0.01655120775103569,
-0.08345948904752731,
-0.037046920508146286,
-0.1875675618648529,
0.04128316417336464,
0.09886880964040756,
0.01485721580684185,
-0.021874375641345978,
0.14006833732128143,
0.005549114663153887,
0.06623544543981552,
-0.05058761686086655,
0.010993367061018944,
-0.0043833572417497635,
-0.0029488117434084415,
-0.08467801660299301,
-0.09484956413507462,
-0.07725178450345993,
-0.06898433715105057,
0.07319548726081848,
-0.1216541975736618,
0.01982821896672249,
-0.05696598440408707,
0.0376272015273571,
0.01777145080268383,
-0.08158435672521591,
-0.008204367011785507,
0.018798327073454857,
-0.05686572939157486,
-0.061239976435899734,
0.04160749912261963,
0.06969869881868362,
-0.010852064937353134,
0.0878080278635025,
-0.05276087671518326,
-0.07889910042285919,
0.033639393746852875,
0.10338101536035538,
-0.11161371320486069,
0.008151975460350513,
-0.05852484330534935,
-0.04101187735795975,
-0.06155669689178467,
-0.020598022267222404,
0.07546306401491165,
-0.0033829149324446917,
0.13274738192558289,
-0.07356476783752441,
-0.0062213316559791565,
0.011211447417736053,
-0.01732175424695015,
-0.0286303348839283,
0.03537561744451523,
0.06903906166553497,
-0.07638194411993027,
0.015489796176552773,
0.03979909047484398,
0.013701163232326508,
0.07431554794311523,
-0.05229891464114189,
-0.08332578837871552,
0.017785651609301567,
0.039766911417245865,
0.030811568722128868,
0.06800519675016403,
-0.015198254026472569,
-0.01585402339696884,
0.029852760955691338,
0.020091595128178596,
0.007160622160881758,
-0.1075747087597847,
0.05502192676067352,
0.05535883083939552,
0.006766030099242926,
0.0746041089296341,
-0.009401993826031685,
-0.043956536799669266,
0.07798023521900177,
0.03748008608818054,
-0.005430083256214857,
-0.011714393272995949,
-0.0155724436044693,
-0.11594777554273605,
0.19750507175922394,
-0.06568966060876846,
-0.1706438809633255,
-0.0715201273560524,
-0.11012107133865356,
-0.008127976208925247,
0.023375218734145164,
0.0384807325899601,
-0.0300596896559,
-0.050123561173677444,
-0.12789694964885712,
0.05944426357746124,
-0.04086099937558174,
0.0652279406785965,
0.10559296607971191,
-0.04596330225467682,
0.05407362058758736,
-0.12753085792064667,
-0.010416537523269653,
-0.08161911368370056,
-0.07520889490842819,
0.06144650653004646,
-0.05227399989962578,
0.03342343121767044,
0.0963197648525238,
0.03417712077498436,
-0.016370486468076706,
-0.02766118384897709,
0.20632697641849518,
0.04485199972987175,
0.03478959947824478,
0.12878216803073883,
-0.053196020424366,
0.05184950679540634,
0.08331669867038727,
0.012666499242186546,
-0.05187787488102913,
0.05840948596596718,
0.046799227595329285,
-0.06860307604074478,
-0.19520938396453857,
-0.02536703646183014,
-0.012234729714691639,
-0.04365057125687599,
0.07342705130577087,
0.04007933661341667,
0.009594066068530083,
0.0700576901435852,
0.011674872599542141,
0.05637797713279724,
-0.004448996856808662,
0.10295476019382477,
0.015350687317550182,
-0.03369836509227753,
0.0914657935500145,
-0.018824810162186623,
-0.008243278600275517,
0.08213632553815842,
-0.018763914704322815,
0.29602399468421936,
-0.027155807241797447,
0.003818526864051819,
0.12828931212425232,
0.03639831393957138,
0.060849569737911224,
0.12947310507297516,
-0.06659013777971268,
0.015971055254340172,
-0.07289767265319824,
-0.061433132737874985,
0.0004287186893634498,
0.039011936634778976,
-0.059320997446775436,
0.011375080794095993,
-0.07458926737308502,
0.016414688900113106,
-0.01628989726305008,
0.31267815828323364,
0.1024249866604805,
-0.10855352133512497,
-0.052406977862119675,
0.0035167434252798557,
-0.09792017191648483,
-0.06761710345745087,
0.045127082616090775,
0.06688722223043442,
-0.13758879899978638,
0.009192563593387604,
-0.029275650158524513,
0.07091014832258224,
-0.017206210643053055,
0.017791150137782097,
0.04193820059299469,
0.044587742537260056,
-0.039960816502571106,
0.00465137604624033,
-0.19207027554512024,
0.1965765804052353,
0.006647973321378231,
0.022884393110871315,
-0.04938454553484917,
0.03263653814792633,
0.009988734498620033,
-0.033267244696617126,
0.062385477125644684,
0.017985008656978607,
-0.028560055419802666,
-0.05316049978137016,
-0.04719974845647812,
0.014885656535625458,
0.07866232097148895,
-0.0381760336458683,
0.10475538671016693,
-0.0045143188908696175,
0.04303588345646858,
0.018114915117621422,
0.09917090833187103,
-0.18878602981567383,
-0.09485018253326416,
0.03084123134613037,
-0.05684299021959305,
-0.09903912991285324,
-0.07864948362112045,
-0.09605412930250168,
-0.0040650139562785625,
0.24944846332073212,
-0.11195523291826248,
-0.07646927982568741,
-0.09735704958438873,
0.019849834963679314,
0.10691166669130325,
-0.04467511177062988,
0.027206944301724434,
-0.0071369558572769165,
0.12361232191324234,
-0.06558157503604889,
-0.1352989226579666,
0.021480416879057884,
-0.09938763082027435,
-0.15744329988956451,
-0.06559640169143677,
0.11443443596363068,
0.06395810842514038,
0.032936982810497284,
-0.02983941324055195,
0.020313162356615067,
0.03923380747437477,
-0.04203493893146515,
-0.0042598010040819645,
0.0712587982416153,
0.10191764682531357,
0.033105939626693726,
-0.11415666341781616,
0.01962418481707573,
-0.0675470158457756,
-0.06742704659700394,
0.07710650563240051,
0.25485846400260925,
-0.05180171877145767,
0.11868322640657425,
0.11221480369567871,
-0.077095165848732,
-0.1502419412136078,
0.03387508541345596,
0.08796369284391403,
-0.019585993140935898,
0.014346390962600708,
-0.15786388516426086,
0.09278745204210281,
0.11654089391231537,
-0.0189333725720644,
0.003442820394411683,
-0.19008556008338928,
-0.12833772599697113,
0.07142001390457153,
0.10668554157018661,
0.26487186551094055,
-0.0681287944316864,
-0.038876648992300034,
0.016396068036556244,
-0.08560390025377274,
0.019404789432883263,
0.12768107652664185,
0.06932801008224487,
-0.02758067287504673,
-0.08119463175535202,
0.01033166702836752,
-0.042742013931274414,
0.08976714313030243,
0.055753957480192184,
0.06101429462432861,
-0.0059623559936881065,
0.024327607825398445,
-0.029821744188666344,
-0.04360378906130791,
0.06657824665307999,
0.013769494369626045,
0.04480611905455589,
-0.08008050173521042,
-0.029546136036515236,
-0.07297002524137497,
0.027283530682325363,
-0.02438306249678135,
-0.07784680277109146,
-0.05703854560852051,
0.08353383839130402,
0.04722043126821518,
-0.026980269700288773,
0.014985505491495132,
0.022535158321261406,
0.12224874645471573,
0.1514205038547516,
0.004202025942504406,
-0.05801917240023613,
-0.0697150006890297,
-0.03731236234307289,
-0.017307953909039497,
0.07135511189699173,
-0.03554919362068176,
0.0140678146854043,
0.06437411904335022,
0.01709876023232937,
0.09736800938844681,
0.059479743242263794,
-0.11317092180252075,
-0.021802255883812904,
0.0354495495557785,
-0.16526444256305695,
0.02252587489783764,
0.0008600107976235449,
0.03599737957119942,
-0.0371231772005558,
0.035042427480220795,
0.1430337131023407,
-0.06255845725536346,
-0.034538812935352325,
-0.040806856006383896,
0.0685323029756546,
0.024546606466174126,
0.14666058123111725,
0.03233163058757782,
0.03766939043998718,
-0.08184651285409927,
0.12281038612127304,
0.03124310076236725,
-0.03414345905184746,
0.02527746744453907,
-0.01972261816263199,
-0.11183960735797882,
0.010810543783009052,
0.06234481930732727,
0.03473235294222832,
-0.04901464283466339,
-0.006989376153796911,
-0.02357310801744461,
-0.07890390604734421,
0.06111884117126465,
0.18499881029129028,
0.06547696888446808,
0.07149841636419296,
-0.057344574481248856,
-0.0420059971511364,
-0.07670009881258011,
0.037770166993141174,
0.03707491233944893,
0.07584783434867859,
-0.07891294360160828,
0.09206986427307129,
0.013093424960970879,
0.0396697111427784,
-0.029478611424565315,
-0.05295928567647934,
-0.10622640699148178,
-0.0533318817615509,
-0.10014519095420837,
0.00035016483161598444,
-0.0788424015045166,
-0.039363741874694824,
-0.003008398925885558,
0.0034058885648846626,
-0.007059337105602026,
0.04895555227994919,
-0.06064211204648018,
-0.010863573290407658,
-0.018920157104730606,
0.03723636642098427,
-0.06182459369301796,
-0.03403887525200844,
0.023300033062696457,
-0.10368673503398895,
0.08823669701814651,
0.045224130153656006,
0.004750736523419619,
0.00836441945284605,
0.08916322141885757,
-0.025579238310456276,
0.02407134138047695,
0.011543194763362408,
-0.045812882483005524,
-0.0829881951212883,
-0.0016340682050213218,
-0.005013301502913237,
-0.017328336834907532,
-0.005807918030768633,
0.07766562700271606,
-0.08599936962127686,
0.033881351351737976,
-0.005795755423605442,
-0.0014525102451443672,
-0.07220976054668427,
-0.009829442016780376,
0.10934340953826904,
0.09546885639429092,
0.049430981278419495,
-0.09486886858940125,
0.010662505403161049,
-0.13751859962940216,
-0.03902624920010567,
0.004564951173961163,
-0.015594806522130966,
-0.12912526726722717,
-0.005038927309215069,
0.022973446175456047,
-0.003968071658164263,
0.2124210149049759,
-0.05818707495927811,
-0.01603497937321663,
0.01975325122475624,
-0.09481371194124222,
0.11928661912679672,
-0.024438533931970596,
0.17534726858139038,
-0.010475710034370422,
-0.041283078491687775,
-0.009381773881614208,
0.04249625280499458,
0.016894333064556122,
-0.025295767933130264,
0.18748268485069275,
0.13666804134845734,
0.02659056894481182,
0.039475224912166595,
-0.02340700849890709,
-0.004265947267413139,
-0.04223065450787544,
-0.0367625392973423,
0.03713994845747948,
0.0456509105861187,
0.016581030562520027,
0.14849209785461426,
0.06641978025436401,
-0.16626036167144775,
0.033395275473594666,
-0.03435004502534866,
-0.03874882310628891,
-0.1121232658624649,
-0.09949348866939545,
-0.027016473934054375,
-0.07244022935628891,
0.011164205148816109,
-0.12734341621398926,
0.0008737473981454968,
0.1808060258626938,
0.06382524222135544,
0.02638743817806244,
0.010271391831338406,
-0.11554223299026489,
-0.03246617317199707,
0.05637999251484871,
0.010598539374768734,
0.019067911431193352,
0.053835730999708176,
0.00485515221953392,
0.05200307071208954,
0.034028004854917526,
0.014887379482388496,
0.0005862268735654652,
0.0729355439543724,
0.018274160102009773,
0.038115907460451126,
-0.06100025773048401,
-0.0025029610842466354,
-0.03983300179243088,
0.06909792125225067,
0.11177168041467667,
0.04843147099018097,
-0.057179633527994156,
-0.006280120927840471,
0.15337496995925903,
-0.03799304738640785,
-0.00063427904387936,
-0.12557300925254822,
0.3332749307155609,
0.01603412814438343,
0.011437666602432728,
0.04106824845075607,
-0.07173015177249908,
-0.05099579691886902,
0.20685523748397827,
0.09557328373193741,
-0.01567700318992138,
-0.02140357345342636,
-0.0024155620485544205,
-0.03066919930279255,
-0.02690131962299347,
0.1555117517709732,
0.04192359372973442,
0.1298312395811081,
-0.05488114058971405,
-0.04102734848856926,
-0.027689030393958092,
-0.011374672874808311,
-0.12197329849004745,
0.12638919055461884,
-0.016802294179797173,
-0.023080164566636086,
-0.06780794262886047,
0.029503539204597473,
0.06455373764038086,
-0.3016173541545868,
-0.006029404234141111,
-0.026682820171117783,
-0.1073378399014473,
-0.009855941869318485,
-0.026598000898957253,
-0.024746011942625046,
0.0494036003947258,
-0.044375430792570114,
0.07062266767024994,
0.03466804325580597,
0.03199648857116699,
-0.02095196209847927,
-0.10398279875516891,
0.17174258828163147,
0.06194028630852699,
0.09099840372800827,
0.024821633473038673,
0.0701293870806694,
0.05788354203104973,
0.03386501222848892,
-0.0963471457362175,
0.04997674748301506,
0.013371564447879791,
-0.08633039146661758,
-0.046854473650455475,
0.11447638273239136,
0.0003592015418689698,
0.04949982836842537,
0.037333015352487564,
-0.10091304779052734,
0.017519723623991013,
0.06331954896450043,
-0.05910979211330414,
-0.10082187503576279,
0.002695857547223568,
-0.09219621866941452,
0.16292476654052734,
0.14855022728443146,
-0.01766919530928135,
0.016537396237254143,
-0.06859318912029266,
-0.004508704412728548,
0.0494614876806736,
0.005142250098288059,
-0.021567245945334435,
-0.1928788423538208,
0.03499828279018402,
-0.08640357851982117,
-0.010337875224649906,
-0.23144972324371338,
-0.10370701551437378,
-0.007347616367042065,
-0.04873262718319893,
-0.030500827357172966,
0.05787193030118942,
0.029632680118083954,
0.0701625719666481,
-0.02020815573632717,
-0.019841056317090988,
-0.03248971328139305,
0.08940423280000687,
-0.11140147596597672,
-0.0621357262134552
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 20k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 20k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
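As an informal illustration of the MLM objective described above (an editorial sketch, not part of the original card: it assumes the masked-language-modelling head was exported with this checkpoint; if it was not, `transformers` will warn and initialize that head randomly), you can ask the model to fill in a masked token. Predictions at step 20k are expected to be rough, since only 20k of the 2M pre-training steps have elapsed. The standard loading recipe follows in the next section.
```
from transformers import pipeline

# Masked-token prediction with the intermediate checkpoint.
unmasker = pipeline("fill-mask", model="google/multiberts-seed_3-step_20k")
for prediction in unmasker("The capital of France is [MASK]."):
    print(prediction["token_str"], prediction["score"])
```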
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_20k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_20k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_20k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_20k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
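The snippet below is an illustrative addition (not part of the original card) showing what the PyTorch call returns: one 768-dimensional vector per token plus a pooled sequence vector, as expected for a BERT-base encoder.
```
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_20k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_20k")

encoded_input = tokenizer("Replace me by any text you'd like.", return_tensors='pt')
with torch.no_grad():
    output = model(**encoded_input)

# Token-level representations: (batch_size, sequence_length, 768).
print(output.last_hidden_state.shape)
# Pooled [CLS]-based representation: (batch_size, 768).
print(output.pooler_output.shape)
```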
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_20k"]}
| null |
google/multiberts-seed_3-step_20k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_20k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 20k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 20k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 20k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 20k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_20k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 20k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 20k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07520235329866409,
0.09170985966920853,
-0.0021136102732270956,
0.0354466550052166,
0.08295781165361404,
-0.018005723133683205,
0.08571884781122208,
0.09956216812133789,
-0.02837272733449936,
0.028735799714922905,
0.08354321867227554,
0.013626286759972572,
0.01469556987285614,
0.10311029106378555,
0.021266696974635124,
-0.2240246832370758,
0.02612125128507614,
-0.03137488290667534,
-0.0724327489733696,
0.07899067550897598,
0.09863291680812836,
-0.08364428579807281,
0.04597913846373558,
0.02480209618806839,
-0.1137254536151886,
0.05107060819864273,
-0.007528681308031082,
-0.022073617205023766,
0.13067691028118134,
-0.001554893678985536,
0.047401003539562225,
0.05224974453449249,
0.035556286573410034,
-0.1349843144416809,
0.006816511042416096,
0.05778639391064644,
0.05378268286585808,
0.04044345021247864,
0.028276687487959862,
0.07613807916641235,
0.012100725434720516,
0.017829693853855133,
0.049964748322963715,
0.023837734013795853,
-0.07412690669298172,
-0.058430932462215424,
-0.10412312299013138,
0.037231963127851486,
0.02702968195080757,
0.014060402289032936,
0.009952843189239502,
0.13694368302822113,
-0.040058135986328125,
0.051285143941640854,
0.19492091238498688,
-0.3349466025829315,
-0.01847074180841446,
0.08379754424095154,
0.04986322671175003,
0.12608321011066437,
-0.0032361275516450405,
-0.01669377088546753,
0.0740317776799202,
0.03142388164997101,
0.09084925800561905,
-0.04233313351869583,
0.03553290292620659,
-0.05166317895054817,
-0.1636780947446823,
-0.04292277991771698,
0.1078253984451294,
-0.002899289596825838,
-0.13447368144989014,
-0.049560073763132095,
-0.036679111421108246,
0.03173356503248215,
0.00833873637020588,
-0.03559630736708641,
0.03328780084848404,
0.0077177113853394985,
-0.022982271388173103,
-0.0005965723539702594,
-0.10145644843578339,
-0.048627518117427826,
0.036843620240688324,
0.07939577102661133,
0.10280217975378036,
0.05866820365190506,
-0.0007582567632198334,
0.10668466240167618,
-0.18926306068897247,
-0.04709624871611595,
-0.02714032121002674,
-0.05927014723420143,
-0.046454764902591705,
-0.01615913398563862,
-0.11241362988948822,
-0.05726727098226547,
0.013204381801187992,
0.13778232038021088,
-0.0013541934313252568,
0.03515629842877388,
-0.030587654560804367,
0.005717935971915722,
0.06106439605355263,
0.043843213468790054,
-0.011565964668989182,
0.017479844391345978,
0.02000053972005844,
-0.013152187690138817,
-0.019009504467248917,
0.015260635875165462,
0.005676900502294302,
0.03974287584424019,
0.12186955660581589,
0.027850493788719177,
-0.10375948250293732,
0.07835766673088074,
-0.013275984674692154,
-0.04787038639187813,
0.015244229696691036,
-0.09132620692253113,
-0.06403151899576187,
-0.042512789368629456,
0.0035664404276758432,
0.0191878080368042,
-0.010309233330190182,
-0.009944154880940914,
-0.021056216210126877,
-0.033305831253528595,
-0.08567285537719727,
-0.049293432384729385,
-0.05240799859166145,
-0.1300681084394455,
0.0062039396725595,
-0.17909108102321625,
-0.03664149343967438,
-0.11165228486061096,
-0.189178928732872,
-0.02638622745871544,
0.060986295342445374,
-0.011161095462739468,
-0.05141010880470276,
0.07939073443412781,
0.03998299315571785,
-0.029525570571422577,
-0.002405111212283373,
0.07429686188697815,
-0.005182861816138029,
0.03918198496103287,
-0.02252895012497902,
0.06398936361074448,
0.009026344865560532,
0.035222023725509644,
-0.05155046656727791,
0.06275754421949387,
-0.17771534621715546,
0.04154576361179352,
-0.08113639056682587,
-0.019958972930908203,
-0.08473704755306244,
-0.03780515864491463,
-0.0007296125986613333,
0.009909315034747124,
0.021101323887705803,
0.07206998020410538,
-0.18137706816196442,
-0.030638940632343292,
0.11949167400598526,
-0.15682008862495422,
-0.023817792534828186,
0.0674513503909111,
-0.044680383056402206,
0.09202632308006287,
0.06956128776073456,
0.15758731961250305,
-0.0177958644926548,
-0.0812651664018631,
0.05408637970685959,
-0.014458095654845238,
0.016026806086301804,
-0.013400526717305183,
0.06825941056013107,
-0.02074609138071537,
-0.14825843274593353,
0.038550764322280884,
-0.1328849345445633,
-0.0031146120745688677,
-0.07559404522180557,
0.02073736861348152,
-0.007958204485476017,
-0.0638205036520958,
-0.07930213958024979,
-0.025983382016420364,
0.06515486538410187,
-0.08092522621154785,
-0.012950709089636803,
0.038256704807281494,
0.07173792272806168,
-0.07622738927602768,
0.06197421997785568,
-0.012524579651653767,
0.01757919415831566,
-0.0814543291926384,
-0.03780745714902878,
-0.1884591281414032,
0.040382757782936096,
0.09943677484989166,
0.01589430309832096,
-0.020182784646749496,
0.14190959930419922,
0.007059203460812569,
0.06642225384712219,
-0.0473984032869339,
0.012379586696624756,
-0.0020644410979002714,
-0.0033537589479237795,
-0.08488121628761292,
-0.10155454277992249,
-0.07390736788511276,
-0.07145146280527115,
0.07820241898298264,
-0.126196026802063,
0.020832302048802376,
-0.053809646517038345,
0.040038444101810455,
0.015935594215989113,
-0.0821748673915863,
-0.007300571538507938,
0.016659758985042572,
-0.05757918208837509,
-0.0608680434525013,
0.04171363264322281,
0.06878765672445297,
-0.010144147090613842,
0.08703317493200302,
-0.058973848819732666,
-0.08154098689556122,
0.03391444310545921,
0.09945188462734222,
-0.11075323820114136,
0.002618555910885334,
-0.06040383130311966,
-0.04184938967227936,
-0.05538325756788254,
-0.01768115907907486,
0.07939688116312027,
-0.004972895607352257,
0.1363530308008194,
-0.0744108036160469,
-0.009003973565995693,
0.012363847345113754,
-0.01620946265757084,
-0.02918793447315693,
0.03859124705195427,
0.06613105535507202,
-0.08095679432153702,
0.01771780289709568,
0.03900022432208061,
0.01276090182363987,
0.07618845254182816,
-0.05576058849692345,
-0.0856102779507637,
0.01812228374183178,
0.04149056598544121,
0.02974782884120941,
0.06565491855144501,
-0.024604544043540955,
-0.020693793892860413,
0.03281327337026596,
0.01726529560983181,
0.009012133814394474,
-0.10814110934734344,
0.05657988414168358,
0.05527610331773758,
0.008934687823057175,
0.06976193934679031,
-0.009082375094294548,
-0.04418114945292473,
0.07626334577798843,
0.03723660111427307,
-0.007748532108962536,
-0.01103957835584879,
-0.016451885923743248,
-0.11542201787233353,
0.19713903963565826,
-0.06609955430030823,
-0.16782350838184357,
-0.07001525908708572,
-0.11808782070875168,
-0.011395810171961784,
0.021242696791887283,
0.04043295606970787,
-0.027775963768363,
-0.05241086706519127,
-0.12584324181079865,
0.05708756297826767,
-0.04594064876437187,
0.0642278864979744,
0.10777358710765839,
-0.04790530353784561,
0.05199888348579407,
-0.12885747849941254,
-0.009472710080444813,
-0.08169443905353546,
-0.07353208214044571,
0.06285365670919418,
-0.052793584764003754,
0.030250050127506256,
0.09298509359359741,
0.03539333865046501,
-0.014039893634617329,
-0.028850026428699493,
0.20471811294555664,
0.04194607585668564,
0.030986379832029343,
0.1291901022195816,
-0.053598642349243164,
0.05129210650920868,
0.07886014133691788,
0.010950261726975441,
-0.05039036646485329,
0.054864928126335144,
0.0480816476047039,
-0.06701532006263733,
-0.19728784263134003,
-0.025229323655366898,
-0.011931139975786209,
-0.04433102160692215,
0.07322531938552856,
0.039650462567806244,
0.016717305406928062,
0.06825093179941177,
0.01100822165608406,
0.06101660057902336,
-0.0036606471985578537,
0.10530968010425568,
0.022935405373573303,
-0.03673291951417923,
0.09208077937364578,
-0.01843106746673584,
-0.006755610927939415,
0.08169648796319962,
-0.022272462025284767,
0.2919844686985016,
-0.024299001321196556,
0.015073271468281746,
0.12834247946739197,
0.03305993974208832,
0.05975767970085144,
0.128339946269989,
-0.06625073403120041,
0.01420868281275034,
-0.07439665496349335,
-0.06262197345495224,
0.0019153645262122154,
0.041332539170980453,
-0.06212164834141731,
0.007419432047754526,
-0.0712834969162941,
0.013302767649292946,
-0.015427161008119583,
0.3126450479030609,
0.09727545082569122,
-0.11268910020589828,
-0.05158321559429169,
0.0014654664555564523,
-0.10033520311117172,
-0.06692221760749817,
0.044885776937007904,
0.06150028854608536,
-0.13684475421905518,
0.014453503303229809,
-0.029089150950312614,
0.07125119864940643,
-0.016517825424671173,
0.019060062244534492,
0.036364201456308365,
0.04818287864327431,
-0.039504822343587875,
0.007009446155279875,
-0.1973920613527298,
0.1955815702676773,
0.00834107119590044,
0.018835987895727158,
-0.05306106060743332,
0.033265236765146255,
0.00835032295435667,
-0.03102940507233143,
0.06234695017337799,
0.016342202201485634,
-0.028477054089307785,
-0.04663510620594025,
-0.048483580350875854,
0.015222898684442043,
0.0821368619799614,
-0.03678524121642113,
0.10519228130578995,
-0.004056350793689489,
0.043306779116392136,
0.02039678953588009,
0.09819651395082474,
-0.18450434505939484,
-0.09477768838405609,
0.033716972917318344,
-0.05413275957107544,
-0.10434722155332565,
-0.07983682304620743,
-0.09578321129083633,
-0.016995787620544434,
0.25709298253059387,
-0.11122781783342361,
-0.074673131108284,
-0.09978432208299637,
0.02541990950703621,
0.10449374467134476,
-0.0451989471912384,
0.027972234413027763,
-0.008972786366939545,
0.12560375034809113,
-0.06705377995967865,
-0.13490667939186096,
0.024501265957951546,
-0.09973476082086563,
-0.15827475488185883,
-0.06471046060323715,
0.11496023088693619,
0.06110451743006706,
0.03302972391247749,
-0.03196345642209053,
0.022378521040081978,
0.03475499525666237,
-0.04022198170423508,
-0.00508371414616704,
0.07469838112592697,
0.10152583569288254,
0.030965324491262436,
-0.10925135761499405,
0.023788943886756897,
-0.06461356580257416,
-0.0663127526640892,
0.07977750897407532,
0.2573833763599396,
-0.04986131563782692,
0.12013532966375351,
0.10806407779455185,
-0.07438555359840393,
-0.14705350995063782,
0.03498118743300438,
0.08877421915531158,
-0.02009231224656105,
0.013373048044741154,
-0.15902119874954224,
0.09210868179798126,
0.11621745675802231,
-0.018546409904956818,
-0.0003031864180229604,
-0.1854470819234848,
-0.1259610652923584,
0.07356957346200943,
0.10763472318649292,
0.26253607869148254,
-0.06898907572031021,
-0.04082456976175308,
0.01928836479783058,
-0.07706606388092041,
0.020277010276913643,
0.11643033474683762,
0.06879952549934387,
-0.02615358866751194,
-0.08150690048933029,
0.009987604804337025,
-0.04474938288331032,
0.08687148243188858,
0.05568820983171463,
0.06290049850940704,
-0.005268303211778402,
0.02618543803691864,
-0.018741996958851814,
-0.04275864362716675,
0.06643935292959213,
0.01314909290522337,
0.04513227567076683,
-0.07995138317346573,
-0.02848956175148487,
-0.07338926196098328,
0.027907371520996094,
-0.025141209363937378,
-0.07656069099903107,
-0.05676191672682762,
0.07993216812610626,
0.046204425394535065,
-0.025950772687792778,
0.02657633274793625,
0.02314806915819645,
0.11983773112297058,
0.15543539822101593,
0.00293789803981781,
-0.054861269891262054,
-0.06824979931116104,
-0.041074253618717194,
-0.015525389462709427,
0.07022222876548767,
-0.040476348251104355,
0.011882517486810684,
0.062213048338890076,
0.021074611693620682,
0.10004540532827377,
0.057504355907440186,
-0.11605355888605118,
-0.023087019100785255,
0.03387707844376564,
-0.16962489485740662,
0.023005174472928047,
-0.00013138013309799135,
0.03474418818950653,
-0.0332743301987648,
0.03749696537852287,
0.14388799667358398,
-0.05970696359872818,
-0.033908575773239136,
-0.04078781604766846,
0.06666000187397003,
0.025834491476416588,
0.14417363703250885,
0.030660346150398254,
0.03765391558408737,
-0.08574826270341873,
0.11951670795679092,
0.034173790365457535,
-0.0359373576939106,
0.027014870196580887,
-0.016160961240530014,
-0.11169002205133438,
0.01203455962240696,
0.06378673762083054,
0.04233403876423836,
-0.04858655855059624,
-0.01079506054520607,
-0.028244707733392715,
-0.07485215365886688,
0.062239740043878555,
0.1898544728755951,
0.06569476425647736,
0.07021352648735046,
-0.05650443583726883,
-0.040447015315294266,
-0.07770030200481415,
0.03842724859714508,
0.02904096618294716,
0.07794992625713348,
-0.07844410091638565,
0.09716837108135223,
0.014155218377709389,
0.03714687377214432,
-0.029665494337677956,
-0.05401475355029106,
-0.10607084631919861,
-0.05441390722990036,
-0.09372798353433609,
-0.0005120630958117545,
-0.08050364255905151,
-0.03920020908117294,
-0.0019997358322143555,
0.005293971858918667,
-0.006978343706578016,
0.049159932881593704,
-0.06059816852211952,
-0.010546829551458359,
-0.01728382147848606,
0.039948251098394394,
-0.0653626024723053,
-0.03250550106167793,
0.02108023688197136,
-0.10279226303100586,
0.08949187397956848,
0.04873505234718323,
0.006153825204819441,
0.008995519950985909,
0.08496397733688354,
-0.02520708367228508,
0.02529839798808098,
0.01021020021289587,
-0.043166112154722214,
-0.08107511699199677,
-0.003296507755294442,
-0.005022221710532904,
-0.019373895600438118,
-0.005657390225678682,
0.07146589457988739,
-0.08550508320331573,
0.03365553542971611,
-0.004259447567164898,
-0.0055211554281413555,
-0.07268838584423065,
-0.009809273295104504,
0.10228244215250015,
0.09292542934417725,
0.04493040218949318,
-0.09305810928344727,
0.010714370757341385,
-0.13632449507713318,
-0.03963199630379677,
0.004346746020019054,
-0.017484180629253387,
-0.1281929761171341,
-0.005814074072986841,
0.022168094292283058,
-0.006271342281252146,
0.2206656038761139,
-0.056234732270240784,
-0.018321922048926353,
0.019169477745890617,
-0.09213733673095703,
0.12153921276330948,
-0.027340633794665337,
0.17820076644420624,
-0.009151662699878216,
-0.03868386149406433,
-0.008229784667491913,
0.04449950158596039,
0.019027477130293846,
-0.030226189643144608,
0.1824195235967636,
0.1378237009048462,
0.021690744906663895,
0.038039110600948334,
-0.02135942317545414,
-0.008014018647372723,
-0.043540772050619125,
-0.02706829458475113,
0.03887270763516426,
0.048464324325323105,
0.01641085185110569,
0.14810171723365784,
0.07007904350757599,
-0.16771967709064484,
0.033886078745126724,
-0.030549053102731705,
-0.037615757435560226,
-0.11085283011198044,
-0.10110853612422943,
-0.028961019590497017,
-0.07473021745681763,
0.011202991008758545,
-0.12896938621997833,
0.0034295099321752787,
0.17441406846046448,
0.06472744792699814,
0.02674560248851776,
0.011835974641144276,
-0.11378853023052216,
-0.03360169380903244,
0.054178912192583084,
0.010734324343502522,
0.01769513636827469,
0.047115929424762726,
0.00427795248106122,
0.05372852087020874,
0.03084516152739525,
0.0144309401512146,
-0.002526611788198352,
0.0796441063284874,
0.018881401047110558,
0.037962738424539566,
-0.06183791160583496,
-0.0009350114269182086,
-0.03975093364715576,
0.06835813820362091,
0.10483243316411972,
0.04556954279541969,
-0.057550642639398575,
-0.007915102876722813,
0.14831577241420746,
-0.037352073937654495,
0.0035896378103643656,
-0.12406278401613235,
0.33810463547706604,
0.013866028748452663,
0.010140892118215561,
0.04110442101955414,
-0.07144387066364288,
-0.048697330057621,
0.206965371966362,
0.09449703991413116,
-0.011208671145141125,
-0.019467882812023163,
-0.0013520121574401855,
-0.030920375138521194,
-0.02865676023066044,
0.14880792796611786,
0.041747815907001495,
0.12830141186714172,
-0.05278131738305092,
-0.04452146217226982,
-0.027582872658967972,
-0.011737878434360027,
-0.12191585451364517,
0.12373968958854675,
-0.010595027357339859,
-0.027005163952708244,
-0.06396837532520294,
0.02781929448246956,
0.06721945852041245,
-0.3015507161617279,
-0.0071972329169511795,
-0.02797755040228367,
-0.1078522726893425,
-0.011123670265078545,
-0.024562038481235504,
-0.022301815450191498,
0.048061665147542953,
-0.04249219223856926,
0.06808151304721832,
0.03007982298731804,
0.032015372067689896,
-0.022978851571679115,
-0.10077579319477081,
0.17292030155658722,
0.05619649216532707,
0.08853878080844879,
0.023219943046569824,
0.07336704432964325,
0.05982229486107826,
0.032144274562597275,
-0.09217627346515656,
0.04895823448896408,
0.014622725546360016,
-0.08913197368383408,
-0.05053536966443062,
0.118132583796978,
0.0014057847438380122,
0.05112404748797417,
0.03821947053074837,
-0.10274038463830948,
0.019633699208498,
0.06651212275028229,
-0.06207741051912308,
-0.10085456818342209,
0.0017147804610431194,
-0.09209568798542023,
0.16282328963279724,
0.15003500878810883,
-0.016352619975805283,
0.017162077128887177,
-0.07097871601581573,
-0.00306599005125463,
0.05041826143860817,
0.005923215299844742,
-0.021638300269842148,
-0.19330446422100067,
0.037280239164829254,
-0.08239442110061646,
-0.008711788803339005,
-0.2311343550682068,
-0.10207533836364746,
-0.008243560791015625,
-0.04882488027215004,
-0.03140026703476906,
0.052731048315763474,
0.031173905357718468,
0.06807316839694977,
-0.020498104393482208,
-0.02057822234928608,
-0.03227856010198593,
0.08861885964870453,
-0.11183366179466248,
-0.060686446726322174
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is often comparable to that of the original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
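As an informal illustration of the NSP objective described above (an editorial sketch, not part of the original card: it assumes the next-sentence-prediction head was exported with this checkpoint; if it was not, `transformers` will warn and initialize that head randomly), the checkpoint can score whether one sentence plausibly follows another. The standard loading recipe follows in the next section.
```
import torch
from transformers import BertTokenizer, BertForPreTraining

name = "google/multiberts-seed_3-step_300k"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForPreTraining.from_pretrained(name)

# Encode a sentence pair for the NSP head.
encoded = tokenizer("The cat sat on the mat.", "It started to purr.",
                    return_tensors="pt")
with torch.no_grad():
    outputs = model(**encoded)

# seq_relationship_logits: index 0 = "B follows A", index 1 = "B is random".
print(torch.softmax(outputs.seq_relationship_logits, dim=-1))
```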
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example based on
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_300k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_300k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_300k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
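Because intermediate checkpoints of the same seed share a common naming pattern, a simple robustness-style analysis is to track how a sentence representation drifts over pre-training. The sketch below is an editorial addition, not part of the original card; the step values are assumptions based on the `google/multiberts-seed_3-step_<N>k` pattern used above, and only steps that were actually released will load.
```
import torch
from transformers import BertTokenizer, BertModel

steps = ["20k", "200k", "300k"]  # assumed step names; adjust to released checkpoints
text = "MultiBERTs provides intermediate checkpoints for robustness analysis."

reps = {}
for step in steps:
    name = f"google/multiberts-seed_3-step_{step}"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    with torch.no_grad():
        out = model(**tokenizer(text, return_tensors="pt"))
    # Mean-pool the final hidden states into one vector per checkpoint.
    reps[step] = out.last_hidden_state.mean(dim=1).squeeze(0)

# Cosine similarity of later checkpoints against the earliest one.
for step in steps[1:]:
    sim = torch.nn.functional.cosine_similarity(reps[steps[0]], reps[step], dim=0)
    print(f"step {steps[0]} vs {step}: {sim.item():.3f}")
```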
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_300k"]}
| null |
google/multiberts-seed_3-step_300k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_300k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 300k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 300k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 300k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_300k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 300k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 300k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07425687462091446,
0.07839763909578323,
-0.0019248854368925095,
0.04232611507177353,
0.0863654688000679,
-0.015332803130149841,
0.0799567848443985,
0.09728140383958817,
-0.028597665950655937,
0.020740976557135582,
0.08460978418588638,
0.014231722801923752,
0.0157754085958004,
0.0919702798128128,
0.0228583887219429,
-0.2206868976354599,
0.027115873992443085,
-0.03241090476512909,
-0.0739724189043045,
0.08116327226161957,
0.09919824451208115,
-0.080091692507267,
0.04860652610659599,
0.02094557322561741,
-0.10481750965118408,
0.05310209468007088,
-0.009413101710379124,
-0.019353356212377548,
0.12860780954360962,
-0.00295460969209671,
0.04689235985279083,
0.05010418966412544,
0.04285337030887604,
-0.1376257687807083,
0.006264109164476395,
0.054324883967638016,
0.05789446458220482,
0.03747877478599548,
0.022388625890016556,
0.07649333775043488,
0.0021184775978326797,
0.024976899847388268,
0.05103905498981476,
0.026711594313383102,
-0.07751250267028809,
-0.04296908155083656,
-0.10728076845407486,
0.032952144742012024,
0.03019307367503643,
0.016645345836877823,
0.010825426317751408,
0.13930578529834747,
-0.04329104349017143,
0.048255883157253265,
0.19959156215190887,
-0.3314933776855469,
-0.01679627224802971,
0.09223354607820511,
0.05052100867033005,
0.12464830279350281,
-0.003712065750733018,
-0.017382392659783363,
0.07293397188186646,
0.03420575335621834,
0.09677357971668243,
-0.04185027629137039,
0.035405196249485016,
-0.05354413017630577,
-0.16439171135425568,
-0.04141689091920853,
0.1033291444182396,
-0.00030242803040891886,
-0.13749563694000244,
-0.05301058664917946,
-0.03469384089112282,
0.023531587794423103,
0.012121427804231644,
-0.03670892491936684,
0.03093455545604229,
0.007606035098433495,
-0.024135392159223557,
-0.00438584852963686,
-0.10075029730796814,
-0.05252043530344963,
0.03499847650527954,
0.0875006765127182,
0.09973635524511337,
0.06188761070370674,
0.005801096558570862,
0.11228626221418381,
-0.17094989120960236,
-0.04883820191025734,
-0.030079398304224014,
-0.06285834312438965,
-0.04551161825656891,
-0.016476118937134743,
-0.10848569869995117,
-0.05333195626735687,
0.011520732194185257,
0.13682925701141357,
0.0024817094672471285,
0.03317716345191002,
-0.019759301096200943,
0.004725925158709288,
0.062480490654706955,
0.047274548560380936,
-0.013390365056693554,
0.010275923646986485,
0.019632140174508095,
-0.0034368722699582577,
-0.018819686025381088,
0.016072362661361694,
0.006615560967475176,
0.037802666425704956,
0.11913453042507172,
0.031903237104415894,
-0.10391957312822342,
0.07718627899885178,
-0.01956922933459282,
-0.049380119889974594,
0.02398158423602581,
-0.0917004942893982,
-0.06617657095193863,
-0.04734184220433235,
0.004156508017331362,
0.02332582138478756,
-0.009252794086933136,
-0.009019549004733562,
-0.02312156930565834,
-0.030970869585871696,
-0.08117027580738068,
-0.04990734905004501,
-0.05490752309560776,
-0.12826640903949738,
0.0010843540076166391,
-0.17331726849079132,
-0.038222786039114,
-0.11679099500179291,
-0.18874970078468323,
-0.02518366277217865,
0.06299546360969543,
-0.011805756948888302,
-0.0474715530872345,
0.07978527992963791,
0.042266398668289185,
-0.029763873666524887,
-0.004002789035439491,
0.07096017897129059,
-0.005508284084498882,
0.03995737060904503,
-0.02010471001267433,
0.06403154134750366,
0.006710287649184465,
0.03366753086447716,
-0.054698724299669266,
0.06127025932073593,
-0.17075252532958984,
0.04191669076681137,
-0.07721947878599167,
-0.018413931131362915,
-0.08380601555109024,
-0.041129883378744125,
-0.00645237322896719,
0.01052845735102892,
0.02372616156935692,
0.07698672264814377,
-0.17509745061397552,
-0.02356601320207119,
0.10668960213661194,
-0.1547902226448059,
-0.028599359095096588,
0.06951745599508286,
-0.04446728900074959,
0.09024912118911743,
0.06955666095018387,
0.14541056752204895,
-0.019642377272248268,
-0.08623102307319641,
0.05586790665984154,
-0.014918162487447262,
0.015058132819831371,
-0.015169464983046055,
0.06888649612665176,
-0.02286527119576931,
-0.14926142990589142,
0.036996617913246155,
-0.13403286039829254,
-0.0016262722201645374,
-0.07627179473638535,
0.018458236008882523,
-0.005246952176094055,
-0.06998220086097717,
-0.07841487973928452,
-0.02818792313337326,
0.06217722222208977,
-0.08004839718341827,
-0.012430030852556229,
0.03427320346236229,
0.07209856063127518,
-0.07589121907949448,
0.0656915083527565,
-0.009402687661349773,
0.01616169698536396,
-0.08414759486913681,
-0.04089123755693436,
-0.18758858740329742,
0.0385131761431694,
0.09856323897838593,
0.02011495642364025,
-0.020009871572256088,
0.13361899554729462,
0.006232930347323418,
0.06638483703136444,
-0.052829742431640625,
0.010437577031552792,
-0.00824061594903469,
-0.0028472330886870623,
-0.0815572440624237,
-0.0916692465543747,
-0.08137613534927368,
-0.07007250934839249,
0.06623225659132004,
-0.11134368181228638,
0.019335655495524406,
-0.059167567640542984,
0.035573236644268036,
0.018343081697821617,
-0.083843894302845,
-0.008100523613393307,
0.020387882366776466,
-0.05846835672855377,
-0.061415791511535645,
0.04141317307949066,
0.0692097544670105,
-0.010360997170209885,
0.0828557088971138,
-0.05517129227519035,
-0.07842806726694107,
0.033807870000600815,
0.09881938993930817,
-0.11457174271345139,
0.0012942698085680604,
-0.057021889835596085,
-0.03947180137038231,
-0.06339161843061447,
-0.02479173243045807,
0.07838641852140427,
-0.0031299968250095844,
0.13463494181632996,
-0.07493165880441666,
-0.009034879505634308,
0.007033881731331348,
-0.017553115263581276,
-0.029546279460191727,
0.032220736145973206,
0.06867264956235886,
-0.0803278386592865,
0.017115924507379532,
0.039587125182151794,
0.012646162882447243,
0.07718008011579514,
-0.05085575953125954,
-0.0830802246928215,
0.018069500103592873,
0.03619709983468056,
0.030357981100678444,
0.0679718405008316,
-0.012622811831533909,
-0.0167915690690279,
0.028968608006834984,
0.022254258394241333,
0.008107629604637623,
-0.10355588048696518,
0.05554066598415375,
0.0567050501704216,
0.00722483778372407,
0.07967261970043182,
-0.010556969791650772,
-0.04596450924873352,
0.07671856880187988,
0.03874747082591057,
-0.005294996779412031,
-0.012734698131680489,
-0.018076004460453987,
-0.11545814573764801,
0.19704899191856384,
-0.061369240283966064,
-0.1658971756696701,
-0.07352331280708313,
-0.11686120927333832,
-0.013488744385540485,
0.022175053134560585,
0.03669142723083496,
-0.027041999623179436,
-0.05098840966820717,
-0.12957949936389923,
0.0637606605887413,
-0.039374466985464096,
0.06378809362649918,
0.1096312552690506,
-0.04642806202173233,
0.0506562665104866,
-0.12971751391887665,
-0.012780623510479927,
-0.08340192586183548,
-0.07710153609514236,
0.05874178931117058,
-0.05514199286699295,
0.03548828512430191,
0.09085763990879059,
0.03599527105689049,
-0.014405987225472927,
-0.029820803552865982,
0.20929637551307678,
0.044509898871183395,
0.03300297632813454,
0.13197742402553558,
-0.051798295229673386,
0.05007004737854004,
0.07808368653059006,
0.014440036378800869,
-0.05469902604818344,
0.060061920434236526,
0.0458543486893177,
-0.07141759991645813,
-0.1903776079416275,
-0.02627749927341938,
-0.014192551374435425,
-0.04888955131173134,
0.07288173586130142,
0.037935324013233185,
0.007138021755963564,
0.07027677446603775,
0.011186174117028713,
0.0509687103331089,
-0.0031601805239915848,
0.10487107932567596,
0.013339817523956299,
-0.031626101583242416,
0.09305242449045181,
-0.019704105332493782,
-0.011208846233785152,
0.08137667179107666,
-0.012491622939705849,
0.28751328587532043,
-0.02834545634686947,
0.005879709962755442,
0.12744060158729553,
0.03439756855368614,
0.058276381343603134,
0.13301916420459747,
-0.06767885386943817,
0.014523624442517757,
-0.07310427725315094,
-0.06148930639028549,
-0.002111127134412527,
0.03572104498744011,
-0.055835213512182236,
0.016257332637906075,
-0.07301430404186249,
0.0138899190351367,
-0.013489015400409698,
0.320962518453598,
0.1009320393204689,
-0.11292543262243271,
-0.04973382130265236,
0.00303307780995965,
-0.0975034236907959,
-0.06239771470427513,
0.04670347645878792,
0.05436619743704796,
-0.1396222561597824,
0.01562411803752184,
-0.026679720729589462,
0.07126020640134811,
-0.020303567871451378,
0.017822502180933952,
0.046180836856365204,
0.044054947793483734,
-0.04088457301259041,
0.0017895183991640806,
-0.19555243849754333,
0.19793593883514404,
0.006878880318254232,
0.022101037204265594,
-0.05115635320544243,
0.03309822827577591,
0.010068874806165695,
-0.037663690745830536,
0.06375438719987869,
0.016411716118454933,
-0.010801552794873714,
-0.054215461015701294,
-0.048585180193185806,
0.012858370319008827,
0.07958855479955673,
-0.03796951845288277,
0.10718942433595657,
-0.006298205815255642,
0.04413716495037079,
0.01783619076013565,
0.10274946689605713,
-0.18786342442035675,
-0.0974353700876236,
0.027410881593823433,
-0.05744452774524689,
-0.10692019015550613,
-0.07729070633649826,
-0.09432593733072281,
-0.0104972580447793,
0.25315558910369873,
-0.10865120589733124,
-0.07729076594114304,
-0.09907419234514236,
0.01764315739274025,
0.10384969413280487,
-0.04283633828163147,
0.02511146478354931,
-0.00781735684722662,
0.11884144693613052,
-0.06620410829782486,
-0.1343579888343811,
0.022550705820322037,
-0.10116805136203766,
-0.1581421047449112,
-0.06427260488271713,
0.11405139416456223,
0.06396044790744781,
0.033142562955617905,
-0.028984008356928825,
0.020546576008200645,
0.04029400274157524,
-0.044156935065984726,
-0.00891233142465353,
0.06778350472450256,
0.09909876435995102,
0.043780598789453506,
-0.11216656118631363,
0.015717480331659317,
-0.06653560698032379,
-0.06801397353410721,
0.07932466268539429,
0.25849154591560364,
-0.0481521338224411,
0.11802375316619873,
0.11575822532176971,
-0.07411479204893112,
-0.14990413188934326,
0.029507767409086227,
0.08659539371728897,
-0.018118586391210556,
0.021391108632087708,
-0.15187577903270721,
0.09035196900367737,
0.11573381721973419,
-0.019502269104123116,
-0.011346857994794846,
-0.1916942149400711,
-0.128612220287323,
0.07144148647785187,
0.11198119819164276,
0.2623574137687683,
-0.06974561512470245,
-0.03619607537984848,
0.013638362288475037,
-0.09049283713102341,
0.018777957186102867,
0.1318127065896988,
0.07194101810455322,
-0.029407277703285217,
-0.08512163162231445,
0.009450211189687252,
-0.04503580182790756,
0.09031888842582703,
0.05358578637242317,
0.06353115290403366,
-0.007829077541828156,
0.026439376175403595,
-0.020060613751411438,
-0.04314860329031944,
0.06995666772127151,
0.011321608908474445,
0.045614536851644516,
-0.07834203541278839,
-0.03209347650408745,
-0.07223665714263916,
0.02381194569170475,
-0.02445058338344097,
-0.08062832802534103,
-0.06015187129378319,
0.0838652029633522,
0.04889408126473427,
-0.028192391619086266,
0.0195630956441164,
0.020521165803074837,
0.11955242604017258,
0.1517585813999176,
0.00991030503064394,
-0.05610954016447067,
-0.0707130953669548,
-0.03981946036219597,
-0.01825353130698204,
0.06980831176042557,
-0.025610223412513733,
0.012804325670003891,
0.0626949742436409,
0.016781212761998177,
0.09748829156160355,
0.059904273599386215,
-0.11170640587806702,
-0.02048744447529316,
0.03635184466838837,
-0.16167788207530975,
0.03394639119505882,
0.001218136865645647,
0.031658247113227844,
-0.03860677778720856,
0.034431956708431244,
0.14345866441726685,
-0.05840562656521797,
-0.03383014351129532,
-0.04011373221874237,
0.06822559237480164,
0.020152946934103966,
0.146580770611763,
0.030975734815001488,
0.03723528981208801,
-0.0822242721915245,
0.12320474535226822,
0.029113061726093292,
-0.024395553395152092,
0.02335207536816597,
-0.014066644944250584,
-0.10988226532936096,
0.012218724936246872,
0.060389935970306396,
0.03362341597676277,
-0.05837688595056534,
-0.0035054355394095182,
-0.029109565541148186,
-0.07794145494699478,
0.0604620985686779,
0.17924317717552185,
0.07065004855394363,
0.06919160485267639,
-0.0580911822617054,
-0.040604691952466965,
-0.07543972879648209,
0.03375396504998207,
0.03294530138373375,
0.07595069706439972,
-0.07837199419736862,
0.08050478249788284,
0.013893816620111465,
0.03479597344994545,
-0.030744995921850204,
-0.0527992807328701,
-0.10806887596845627,
-0.04957709088921547,
-0.09305105358362198,
-0.0037082417402416468,
-0.08048002421855927,
-0.0369231142103672,
-0.003890133462846279,
0.004149266984313726,
-0.008742211386561394,
0.050190769135951996,
-0.05692692846059799,
-0.012019096873700619,
-0.017856186255812645,
0.03530626371502876,
-0.061890751123428345,
-0.03373158350586891,
0.023436374962329865,
-0.10446124523878098,
0.0880383551120758,
0.03972184658050537,
0.0030975176487118006,
0.006635539699345827,
0.08450249582529068,
-0.026798702776432037,
0.024773873388767242,
0.013129589147865772,
-0.04940885305404663,
-0.07941681146621704,
-0.0022739607375115156,
-0.002238994464278221,
-0.015434334054589272,
-0.003268048632889986,
0.08125973492860794,
-0.08672032505273819,
0.03512648493051529,
-0.00752503564581275,
-0.0011559794656932354,
-0.07161005586385727,
-0.009167595766484737,
0.11348214745521545,
0.09520063549280167,
0.0497886948287487,
-0.09434425830841064,
0.010436970740556717,
-0.1388254165649414,
-0.038151659071445465,
0.006420116871595383,
-0.015476307831704617,
-0.13866114616394043,
-0.0048810336738824844,
0.02368907444179058,
-0.0009679104550741613,
0.2227443903684616,
-0.05805008113384247,
-0.01983453519642353,
0.022087691351771355,
-0.08514831960201263,
0.1273604929447174,
-0.026718620210886,
0.17561937868595123,
-0.012465442530810833,
-0.041959669440984726,
-0.007361381780356169,
0.04277009144425392,
0.017873067408800125,
-0.027559872716665268,
0.19006821513175964,
0.13852038979530334,
0.026283517479896545,
0.0402076318860054,
-0.023360174149274826,
-0.00211129616945982,
-0.034337952733039856,
-0.046511340886354446,
0.04183654487133026,
0.0403343103826046,
0.018819794058799744,
0.13781361281871796,
0.06729309260845184,
-0.16663478314876556,
0.034787364304065704,
-0.031020142138004303,
-0.0376705601811409,
-0.11180444061756134,
-0.09710313379764557,
-0.025689950212836266,
-0.06952279061079025,
0.012153965421020985,
-0.12967056035995483,
-0.0010170744499191642,
0.18204176425933838,
0.0645950436592102,
0.028258252888917923,
0.015584823675453663,
-0.11937737464904785,
-0.032106462866067886,
0.05417485162615776,
0.008321576751768589,
0.020683903247117996,
0.054290495812892914,
0.0033068314660340548,
0.051192231476306915,
0.03231554105877876,
0.016145138069987297,
-0.0003506708890199661,
0.06624704599380493,
0.01672334037721157,
0.039912253618240356,
-0.06256085634231567,
-0.003071879968047142,
-0.04200999438762665,
0.06999029219150543,
0.12270037084817886,
0.047157201915979385,
-0.054791226983070374,
-0.0069588664919137955,
0.14917121827602386,
-0.03813658654689789,
-0.004172551445662975,
-0.12641547620296478,
0.33167460560798645,
0.018794851377606392,
0.01195901446044445,
0.04026487469673157,
-0.07039179652929306,
-0.051824480295181274,
0.20659367740154266,
0.10280168801546097,
-0.02208898961544037,
-0.02003769762814045,
-0.002396540017798543,
-0.029395900666713715,
-0.025373337790369987,
0.1547694206237793,
0.03985637426376343,
0.12954753637313843,
-0.05381369590759277,
-0.04096560552716255,
-0.02683626115322113,
-0.013902201317250729,
-0.11711785942316055,
0.12219073623418808,
-0.01197565346956253,
-0.023057570680975914,
-0.06931564211845398,
0.029626937583088875,
0.0660145953297615,
-0.29304954409599304,
-0.005738674197345972,
-0.03228744491934776,
-0.10778897255659103,
-0.010860530659556389,
-0.03195333853363991,
-0.02353864721953869,
0.04999536648392677,
-0.040463026612997055,
0.0722421407699585,
0.02797793410718441,
0.03211416304111481,
-0.023615693673491478,
-0.10413597524166107,
0.17368079721927643,
0.06593893468379974,
0.09208757430315018,
0.02622849866747856,
0.07196858525276184,
0.05915011093020439,
0.03024287335574627,
-0.094327911734581,
0.05465864762663841,
0.012818031944334507,
-0.0846378430724144,
-0.04706723615527153,
0.11323265731334686,
0.0007185591384768486,
0.051778394728899,
0.03193037956953049,
-0.1036890372633934,
0.02098211459815502,
0.0697079673409462,
-0.060048285871744156,
-0.10118288546800613,
0.0013447090750560164,
-0.09112204611301422,
0.16595809161663055,
0.14622075855731964,
-0.01598387211561203,
0.016529090702533722,
-0.06640476733446121,
-0.0015032014343887568,
0.05082697793841362,
0.003275711555033922,
-0.023665418848395348,
-0.19584889709949493,
0.03332449868321419,
-0.09350860863924026,
-0.013570912182331085,
-0.23239360749721527,
-0.10718376934528351,
-0.007245154120028019,
-0.04921234771609306,
-0.03395514935255051,
0.056582968682050705,
0.027602607384324074,
0.07114249467849731,
-0.01758664660155773,
-0.018200745806097984,
-0.03273573890328407,
0.08793073147535324,
-0.11587957292795181,
-0.0636768490076065
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. After full training, the performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the SQuAD dev set (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
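For reference, the architecture itself is standard BERT-base, and the key dimensions can be confirmed directly from the checkpoint's configuration. The snippet below is a minimal sketch and not part of the official release: it only assumes that `BertConfig.from_pretrained` can read this repository's config, and the values noted in the comments are those expected for BERT-base rather than independently verified here.
```
# Hedged sketch: inspect the checkpoint's architecture configuration.
# For a BERT-base checkpoint we expect 12 layers, hidden size 768,
# 12 attention heads, and a 512-token position limit; verify against
# the values actually printed.
from transformers import BertConfig

config = BertConfig.from_pretrained("google/multiberts-seed_3-step_400k")
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)
print(config.max_position_embeddings)  # maximum sequence length (512)
```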
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example using
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_400k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_400k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_400k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
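Because the checkpoint was trained with the MLM objective, it can also be queried through a masked-language-modeling head. The sketch below is illustrative only and not part of the original card: it assumes `BertForMaskedLM` and the `fill-mask` pipeline can load this checkpoint; if the MLM head weights are not present in the export, `transformers` will warn and initialize them randomly, in which case the predictions are meaningless.
```
# Hedged sketch: masked-token prediction with this checkpoint.
# Treat the output as illustrative; check the loading warnings to make
# sure the MLM head weights were actually restored.
from transformers import BertTokenizer, BertForMaskedLM, pipeline

name = "google/multiberts-seed_3-step_400k"
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForMaskedLM.from_pretrained(name)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("The capital of France is [MASK]."))
```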
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_400k"]}
| null |
google/multiberts-seed_3-step_400k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_400k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 400k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 400k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 400k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_400k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 400k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 400k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07659450173377991,
0.08044283092021942,
-0.001999753061681986,
0.04373839870095253,
0.08484112471342087,
-0.01340396422892809,
0.07579909265041351,
0.09901764988899231,
-0.029488323256373405,
0.022002195939421654,
0.07898332178592682,
0.015697656199336052,
0.01801356114447117,
0.09543318301439285,
0.021479403600096703,
-0.22216106951236725,
0.023166824132204056,
-0.03190331533551216,
-0.07944698631763458,
0.07859056442975998,
0.09808112680912018,
-0.08013130724430084,
0.05032702535390854,
0.022296743467450142,
-0.10635778307914734,
0.05088751018047333,
-0.006894175428897142,
-0.02173003926873207,
0.13232868909835815,
-0.0011653982801362872,
0.047410998493433,
0.05076196789741516,
0.04255249351263046,
-0.13336901366710663,
0.0072059668600559235,
0.054565779864788055,
0.05721233785152435,
0.04110110178589821,
0.025908349081873894,
0.0799185261130333,
-0.0009678064379841089,
0.024730930104851723,
0.04835977777838707,
0.02569836936891079,
-0.07631316035985947,
-0.053498104214668274,
-0.10738011449575424,
0.03252207860350609,
0.028493089601397514,
0.012473625130951405,
0.011075922288000584,
0.1312522441148758,
-0.040095631033182144,
0.0482676699757576,
0.18640758097171783,
-0.34096425771713257,
-0.017150137573480606,
0.08739840239286423,
0.046587761491537094,
0.12904302775859833,
-0.006001561880111694,
-0.015291310846805573,
0.07566868513822556,
0.035093288868665695,
0.09239652007818222,
-0.042464159429073334,
0.036953896284103394,
-0.052537936717271805,
-0.16086648404598236,
-0.038402434438467026,
0.10803746432065964,
-0.001386868767440319,
-0.13768863677978516,
-0.046584248542785645,
-0.03350134938955307,
0.025688536465168,
0.010616930201649666,
-0.034450408071279526,
0.031034113839268684,
0.00820824597030878,
-0.023902935907244682,
-0.0001406178780598566,
-0.10112862288951874,
-0.0496351532638073,
0.035200756043195724,
0.07939453423023224,
0.10148970782756805,
0.06255341321229935,
0.0056534125469625,
0.11191683262586594,
-0.17752590775489807,
-0.04670169577002525,
-0.028986681252717972,
-0.06367138773202896,
-0.04547042027115822,
-0.015582466498017311,
-0.1080690547823906,
-0.045416321605443954,
0.011179478839039803,
0.13500626385211945,
0.003826879896223545,
0.035476770251989365,
-0.02132279984652996,
0.004526577889919281,
0.05830024182796478,
0.04546641558408737,
-0.011199063621461391,
0.00859830155968666,
0.022970421239733696,
-0.007712156977504492,
-0.02136618085205555,
0.01784244179725647,
0.006129907444119453,
0.03502720966935158,
0.11844722181558609,
0.031168127432465553,
-0.10524585098028183,
0.07944104820489883,
-0.015172275714576244,
-0.047200385481119156,
0.023228051140904427,
-0.09050831943750381,
-0.06532567739486694,
-0.0448274090886116,
0.002317436970770359,
0.02023989148437977,
-0.009711568243801594,
-0.009586339816451073,
-0.02032374031841755,
-0.034356165677309036,
-0.0819975882768631,
-0.05006053298711777,
-0.053046856075525284,
-0.13002559542655945,
0.0012061261804774404,
-0.17012672126293182,
-0.03838261216878891,
-0.11862921714782715,
-0.18594567477703094,
-0.02695050835609436,
0.06158297508955002,
-0.011036185547709465,
-0.04925166815519333,
0.08332838863134384,
0.04238688573241234,
-0.0312713086605072,
-0.0029088323935866356,
0.07395292818546295,
-0.00605345144867897,
0.04071527346968651,
-0.022964781150221825,
0.06630093604326248,
0.007972120307385921,
0.034209150820970535,
-0.05247577279806137,
0.0600547231733799,
-0.1825791895389557,
0.04158474877476692,
-0.07768915593624115,
-0.01741587370634079,
-0.08334130048751831,
-0.04226144775748253,
-0.005705657415091991,
0.007179529871791601,
0.020821958780288696,
0.07590482383966446,
-0.18112827837467194,
-0.023860320448875427,
0.10192064195871353,
-0.15625277161598206,
-0.026832498610019684,
0.07055746018886566,
-0.0430341474711895,
0.09509559720754623,
0.07127701491117477,
0.1539430320262909,
-0.015296153724193573,
-0.08460382372140884,
0.05570775642991066,
-0.013852865435183048,
0.01238811295479536,
-0.010984166525304317,
0.06532862037420273,
-0.022762753069400787,
-0.15789222717285156,
0.039701659232378006,
-0.1260828822851181,
-0.002565309638157487,
-0.07563917338848114,
0.019157839938998222,
-0.0057798102498054504,
-0.06883430480957031,
-0.07686156034469604,
-0.02917293831706047,
0.06535379588603973,
-0.08060590922832489,
-0.011569794267416,
0.0240948349237442,
0.07176502794027328,
-0.07374636828899384,
0.06380963325500488,
-0.010239469818770885,
0.017973558977246284,
-0.08397382497787476,
-0.039300963282585144,
-0.1856336146593094,
0.03995842486619949,
0.09881246834993362,
0.015141603536903858,
-0.021833274513483047,
0.13628213107585907,
0.00831903051584959,
0.06814863532781601,
-0.05315529927611351,
0.010614529252052307,
-0.0005904950085096061,
-0.00297134043648839,
-0.0875169187784195,
-0.09352520108222961,
-0.08085830509662628,
-0.06723909080028534,
0.07345090806484222,
-0.11873941868543625,
0.020539380609989166,
-0.06139210984110832,
0.03607623279094696,
0.017385458573698997,
-0.08148854225873947,
-0.009643033146858215,
0.02041010372340679,
-0.05566220358014107,
-0.059918757528066635,
0.04213157296180725,
0.06756304949522018,
-0.009716800414025784,
0.08147168904542923,
-0.05332694947719574,
-0.07662195712327957,
0.03315339982509613,
0.09789375960826874,
-0.11517446488142014,
0.004597967956215143,
-0.05672644451260567,
-0.041927166283130646,
-0.057272542268037796,
-0.02349645271897316,
0.084571972489357,
-0.007237459532916546,
0.1344086229801178,
-0.07156173139810562,
-0.00611149612814188,
0.009043877013027668,
-0.014107838273048401,
-0.02480989880859852,
0.03473970293998718,
0.06053578481078148,
-0.07089995592832565,
0.012380273081362247,
0.04010114446282387,
0.01576869562268257,
0.07092512398958206,
-0.05219939351081848,
-0.08095487952232361,
0.019406922161579132,
0.03523654490709305,
0.02900642529129982,
0.06931966543197632,
-0.013243455439805984,
-0.016212305054068565,
0.03130398318171501,
0.02095750719308853,
0.0072392551228404045,
-0.10696464776992798,
0.05612504109740257,
0.053823720663785934,
0.008030129596590996,
0.06826148182153702,
-0.011044175364077091,
-0.04264219477772713,
0.07737142592668533,
0.03862213343381882,
-0.006264289375394583,
-0.011459309607744217,
-0.017063744366168976,
-0.11442221701145172,
0.19527828693389893,
-0.06552110612392426,
-0.16502676904201508,
-0.07323844730854034,
-0.11172415316104889,
-0.008187638595700264,
0.022099750116467476,
0.039585813879966736,
-0.030331958085298538,
-0.05163729563355446,
-0.12604984641075134,
0.06527465581893921,
-0.039440590888261795,
0.06460285931825638,
0.10753577947616577,
-0.048632897436618805,
0.05286122113466263,
-0.1287430226802826,
-0.009962408803403378,
-0.08269146084785461,
-0.07629136741161346,
0.06246525049209595,
-0.05397776514291763,
0.033426180481910706,
0.09309175610542297,
0.034382414072752,
-0.0170071329921484,
-0.029456883668899536,
0.21094101667404175,
0.04194674268364906,
0.03912720829248428,
0.12901519238948822,
-0.05143100023269653,
0.05073507875204086,
0.08259005099534988,
0.015030807815492153,
-0.051493655890226364,
0.05720942094922066,
0.044813185930252075,
-0.06762561947107315,
-0.19336111843585968,
-0.023881232365965843,
-0.011721649207174778,
-0.04421350359916687,
0.06998492032289505,
0.03870819881558418,
0.007163571193814278,
0.07166341692209244,
0.014094097539782524,
0.05725037679076195,
0.00016550545115023851,
0.1027572751045227,
0.019882807508111,
-0.034087471663951874,
0.09225951135158539,
-0.01968158222734928,
-0.009357225149869919,
0.08191248029470444,
-0.015706868842244148,
0.2884329855442047,
-0.028919119387865067,
0.007231360767036676,
0.12376298755407333,
0.04311661422252655,
0.06056157127022743,
0.13283997774124146,
-0.06752289086580276,
0.01638844795525074,
-0.07173765450716019,
-0.0610114149749279,
0.0008043713169172406,
0.03760651871562004,
-0.05913575366139412,
0.01229455228894949,
-0.07460426539182663,
0.011087334714829922,
-0.015913309529423714,
0.3146325349807739,
0.10056371241807938,
-0.11111225932836533,
-0.053076792508363724,
0.002792257582768798,
-0.09664374589920044,
-0.06566278636455536,
0.04830565303564072,
0.058224473148584366,
-0.14053630828857422,
0.011890261434018612,
-0.027675142511725426,
0.07141034305095673,
-0.020914409309625626,
0.019008547067642212,
0.0455843061208725,
0.04579703137278557,
-0.04226986691355705,
0.003264212515205145,
-0.190598264336586,
0.1890847086906433,
0.008224567398428917,
0.02347506582736969,
-0.049632079899311066,
0.03220551833510399,
0.012224482372403145,
-0.029803868383169174,
0.059646014124155045,
0.01639830507338047,
-0.02111196517944336,
-0.06102371960878372,
-0.048875898122787476,
0.016695642843842506,
0.07984793186187744,
-0.037568122148513794,
0.10605224967002869,
-0.004390326328575611,
0.04291890189051628,
0.01656567119061947,
0.09494216740131378,
-0.18991100788116455,
-0.09368759393692017,
0.029281560331583023,
-0.056530460715293884,
-0.09235691279172897,
-0.07783620804548264,
-0.09544622898101807,
-0.01844366081058979,
0.2507213354110718,
-0.10669411718845367,
-0.07846032828092575,
-0.098109170794487,
0.010822725482285023,
0.10673213750123978,
-0.04245232418179512,
0.025719359517097473,
-0.006167242769151926,
0.11681758612394333,
-0.06306224316358566,
-0.13633958995342255,
0.023212403059005737,
-0.0980100929737091,
-0.15793633460998535,
-0.06475897133350372,
0.11511892080307007,
0.060170888900756836,
0.03351407125592232,
-0.028696250170469284,
0.01999487914144993,
0.03542933985590935,
-0.04194316640496254,
-0.0057990713976323605,
0.0638340562582016,
0.10171288251876831,
0.040339164435863495,
-0.11758402734994888,
0.019925557076931,
-0.06760010868310928,
-0.06523976475000381,
0.07448728382587433,
0.25606948137283325,
-0.05021752789616585,
0.11772928386926651,
0.11032163351774216,
-0.07498625665903091,
-0.14881011843681335,
0.032950907945632935,
0.08571748435497284,
-0.019537486135959625,
0.0163434911519289,
-0.1560172587633133,
0.09305240958929062,
0.11679436266422272,
-0.018472006544470787,
0.008387956768274307,
-0.18260103464126587,
-0.12851916253566742,
0.06837888807058334,
0.10606642067432404,
0.2637310326099396,
-0.06954777985811234,
-0.037608176469802856,
0.018384233117103577,
-0.09255851805210114,
0.00748917693272233,
0.12592178583145142,
0.07114114612340927,
-0.029102973639965057,
-0.077760249376297,
0.009845085442066193,
-0.042729832231998444,
0.09184837341308594,
0.05745057016611099,
0.06413167715072632,
-0.0060600172728300095,
0.02530243806540966,
-0.02887909673154354,
-0.04313717037439346,
0.06584764271974564,
0.016903109848499298,
0.045623086392879486,
-0.08256737887859344,
-0.02828195132315159,
-0.07332415878772736,
0.025524551048874855,
-0.024943357333540916,
-0.07767782360315323,
-0.0588579922914505,
0.08150476217269897,
0.04481188952922821,
-0.02817605249583721,
0.022842220962047577,
0.023245815187692642,
0.12319888919591904,
0.15156085789203644,
0.0032938376534730196,
-0.05457508936524391,
-0.07130899280309677,
-0.03682509437203407,
-0.019659478217363358,
0.07090552896261215,
-0.03102414496243,
0.0151640884578228,
0.06596465408802032,
0.015561144798994064,
0.0985133945941925,
0.060383155941963196,
-0.109840989112854,
-0.019321531057357788,
0.03568587452173233,
-0.1629752814769745,
0.03299595043063164,
0.001763460342772305,
0.01988973654806614,
-0.03509797528386116,
0.03637506812810898,
0.1439116895198822,
-0.05936027318239212,
-0.03468975052237511,
-0.04101993143558502,
0.06860807538032532,
0.022127466276288033,
0.1482095718383789,
0.0321648009121418,
0.03640163689851761,
-0.08427058905363083,
0.12355844676494598,
0.030831415206193924,
-0.02920597605407238,
0.020500678569078445,
-0.01728721894323826,
-0.1136159598827362,
0.011036055162549019,
0.061100173741579056,
0.03931499645113945,
-0.05107152462005615,
-0.005610999651253223,
-0.028123974800109863,
-0.08143676072359085,
0.05827636271715164,
0.17817866802215576,
0.06651483476161957,
0.06838317960500717,
-0.055237628519535065,
-0.03968474641442299,
-0.0770619586110115,
0.03722413256764412,
0.03352494165301323,
0.07608995586633682,
-0.07817327231168747,
0.09139830619096756,
0.01287466287612915,
0.03890115022659302,
-0.030384382233023643,
-0.053314514458179474,
-0.10593806952238083,
-0.053672995418310165,
-0.10181494057178497,
-0.001333671505562961,
-0.07848566025495529,
-0.0356198213994503,
-0.0035695359110832214,
0.004413787741214037,
-0.007516307756304741,
0.04885578155517578,
-0.0599684938788414,
-0.01188165694475174,
-0.016518693417310715,
0.037244684994220734,
-0.05899014323949814,
-0.03527383133769035,
0.021622484549880028,
-0.10289791226387024,
0.08943406492471695,
0.045511435717344284,
0.0054204752668738365,
0.007832176983356476,
0.09404473006725311,
-0.023730013519525528,
0.023127630352973938,
0.012250184081494808,
-0.04693533480167389,
-0.08075074851512909,
-0.004082196392118931,
-0.007780944462865591,
-0.018320398405194283,
-0.00423915171995759,
0.08117265999317169,
-0.08764193952083588,
0.03621882572770119,
-0.00652725575491786,
0.001310052233748138,
-0.07167980819940567,
-0.011464301496744156,
0.10583700239658356,
0.09284137934446335,
0.046907879412174225,
-0.09560153633356094,
0.010732502676546574,
-0.1388128250837326,
-0.04005743935704231,
0.005291104316711426,
-0.017477478832006454,
-0.13057145476341248,
-0.007025029975920916,
0.02514924667775631,
-0.002566739683970809,
0.21778006851673126,
-0.05771328881382942,
-0.019893160089850426,
0.01939658261835575,
-0.08896155655384064,
0.11959020793437958,
-0.027376025915145874,
0.17679338157176971,
-0.011620461009442806,
-0.041783131659030914,
-0.0075286682695150375,
0.046160049736499786,
0.017154891043901443,
-0.02589932456612587,
0.1822461187839508,
0.13574226200580597,
0.027719872072339058,
0.03819788992404938,
-0.026199350133538246,
-0.0081281503662467,
-0.037023480981588364,
-0.04042889550328255,
0.04017454758286476,
0.044555094093084335,
0.015906795859336853,
0.1383030116558075,
0.06281870603561401,
-0.1654311716556549,
0.03331127017736435,
-0.03465327247977257,
-0.03861796483397484,
-0.11003198474645615,
-0.09044202417135239,
-0.02590865083038807,
-0.07114293426275253,
0.011322024278342724,
-0.12813131511211395,
0.0004203392018098384,
0.1851811558008194,
0.06332293897867203,
0.02792167104780674,
0.014984006062150002,
-0.12300800532102585,
-0.0321858674287796,
0.055050887167453766,
0.010224588215351105,
0.02025550976395607,
0.056466780602931976,
0.0021630723495036364,
0.05174655094742775,
0.02726765163242817,
0.013204829767346382,
-0.0015028318157419562,
0.06909066438674927,
0.016826482489705086,
0.041377436369657516,
-0.06255468726158142,
-0.004076201468706131,
-0.037668757140636444,
0.07101983577013016,
0.11777208000421524,
0.04704585671424866,
-0.054351091384887695,
-0.008060632273554802,
0.1540801078081131,
-0.03821795433759689,
0.003848935943096876,
-0.12547236680984497,
0.3359898328781128,
0.014071503654122353,
0.009378249756991863,
0.04156433045864105,
-0.07245803624391556,
-0.048532113432884216,
0.21432720124721527,
0.10163433849811554,
-0.019056037068367004,
-0.022602425888180733,
-0.0026958822272717953,
-0.030708132311701775,
-0.023661132901906967,
0.15471646189689636,
0.0418563187122345,
0.12248536944389343,
-0.05507670342922211,
-0.038450662046670914,
-0.02719283662736416,
-0.011437516659498215,
-0.1244124099612236,
0.12552905082702637,
-0.01796715147793293,
-0.024381812661886215,
-0.06788430362939835,
0.029004648327827454,
0.06776806712150574,
-0.30670005083084106,
-0.00100828951690346,
-0.028245167806744576,
-0.10750129818916321,
-0.010686646215617657,
-0.026061005890369415,
-0.022980842739343643,
0.04940304905176163,
-0.04315123334527016,
0.07160910964012146,
0.03968304768204689,
0.030490731820464134,
-0.019165992736816406,
-0.10601434111595154,
0.17276139557361603,
0.05690557882189751,
0.08762185275554657,
0.025668425485491753,
0.07511096447706223,
0.05779377371072769,
0.03282898664474487,
-0.0951782613992691,
0.05222993716597557,
0.012396237812936306,
-0.09275949746370316,
-0.04875573888421059,
0.11438292264938354,
0.0016490536509081721,
0.052804142236709595,
0.03391415625810623,
-0.09963339567184448,
0.018920576199889183,
0.07142891734838486,
-0.062348827719688416,
-0.09836476296186447,
-0.0018148624803870916,
-0.09002306312322617,
0.16380427777767181,
0.14919418096542358,
-0.015252412296831608,
0.01877572201192379,
-0.06756915152072906,
-0.006448091007769108,
0.053540393710136414,
-0.0014672863762825727,
-0.02537683956325054,
-0.1944587379693985,
0.03437347710132599,
-0.09022174030542374,
-0.012149357236921787,
-0.23112477362155914,
-0.10433804988861084,
-0.010559135116636753,
-0.04972298815846443,
-0.03155353665351868,
0.05746849998831749,
0.029524987563490868,
0.07025071233510971,
-0.017107799649238586,
-0.017346160486340523,
-0.032324038445949554,
0.0895824134349823,
-0.11372005194425583,
-0.06446216255426407
] |
null | null |
transformers
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 40k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
[the original BERT model](https://github.com/google-research/bert) but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
[http://goo.gle/multiberts](http://goo.gle/multiberts). We describe them in our
paper
[The MultiBERTs: BERT Reproductions for Robustness Analysis](https://arxiv.org/abs/2106.16163).
This is model #3, captured at step 40k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
[BERT-base uncased](https://github.com/google-research/bert), for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to [BERT-base uncased](https://github.com/google-research/bert). Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for [Turc et al., 2019](https://arxiv.org/abs/1908.08962).
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. After full training, the performance of MultiBERTs on GLUE is often comparable to that of the original
BERT, but we found significant differences on the SQuAD dev set (MultiBERTs outperforms the original BERT).
See our [technical report](https://arxiv.org/abs/2106.16163) for more details.
### How to use
Using code from
[BERT-base uncased](https://huggingface.co/bert-base-uncased), here is an example using
TensorFlow:
```
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_40k')
model = TFBertModel.from_pretrained("google/multiberts-seed_3-step_40k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
PyTorch version:
```
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('google/multiberts-seed_3-step_40k')
model = BertModel.from_pretrained("google/multiberts-seed_3-step_40k")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
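Because MultiBERTs ships many intermediate checkpoints per seed, a common use of an early checkpoint such as step 40k is to compare its representations against a later checkpoint from the same seed. The sketch below is purely illustrative and not an official MultiBERTs analysis recipe: it assumes both the step-40k and step-400k checkpoints for seed 3 load with the standard `BertModel` class and simply compares their [CLS] hidden states by cosine similarity.
```
# Hedged sketch: compare [CLS] representations of two intermediate
# checkpoints from the same seed (step 40k vs. step 400k).
import torch
from transformers import BertTokenizer, BertModel

text = "Replace me by any text you'd like."
reps = {}
for step in ("40k", "400k"):
    name = f"google/multiberts-seed_3-step_{step}"
    tokenizer = BertTokenizer.from_pretrained(name)
    model = BertModel.from_pretrained(name)
    with torch.no_grad():
        output = model(**tokenizer(text, return_tensors="pt"))
    reps[step] = output.last_hidden_state[:, 0]  # [CLS] vector

cos = torch.nn.functional.cosine_similarity(reps["40k"], reps["400k"])
print(f"[CLS] cosine similarity, step 40k vs. 400k: {cos.item():.3f}")
```
How such similarities evolve over pre-training is exactly the kind of question the intermediate checkpoints are meant to support; the statistical library described in the MultiBERTs paper can then be used to test whether a pattern holds across seeds.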
## Citation info
```bibtex
@article{sellam2021multiberts,
title={The MultiBERTs: BERT Reproductions for Robustness Analysis},
author={Thibault Sellam and Steve Yadlowsky and Jason Wei and Naomi Saphra and Alexander D'Amour and Tal Linzen and Jasmijn Bastings and Iulia Turc and Jacob Eisenstein and Dipanjan Das and Ian Tenney and Ellie Pavlick},
journal={arXiv preprint arXiv:2106.16163},
year={2021}
}
```
|
{"language": "en", "license": "apache-2.0", "tags": ["multiberts", "multiberts-seed_3", "multiberts-seed_3-step_40k"]}
| null |
google/multiberts-seed_3-step_40k
|
[
"transformers",
"pytorch",
"tf",
"bert",
"pretraining",
"multiberts",
"multiberts-seed_3",
"multiberts-seed_3-step_40k",
"en",
"arxiv:2106.16163",
"arxiv:1908.08962",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.16163",
"1908.08962"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us
|
# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 40k
MultiBERTs is a collection of checkpoints and a statistical library to support
robust research on BERT. We provide 25 BERT-base models trained with
similar hyper-parameters as
the original BERT model but
with different random seeds, which causes variations in the initial weights and order of
training instances. The aim is to distinguish findings that apply to a specific
artifact (i.e., a particular instance of the model) from those that apply to the
more general procedure.
We also provide 140 intermediate checkpoints captured
during the course of pre-training (we saved 28 checkpoints for the first 5 runs).
The models were originally released through
URL We describe them in our
paper
The MultiBERTs: BERT Reproductions for Robustness Analysis.
This is model #3, captured at step 40k (max: 2000k, i.e., 2M steps).
## Model Description
This model was captured during a reproduction of
BERT-base uncased, for English: it
is a Transformers model pretrained on a large corpus of English data, using the
Masked Language Modelling (MLM) and the Next Sentence Prediction (NSP)
objectives.
The intended uses, limitations, training data and training procedure for the fully trained model are similar
to BERT-base uncased. Two major
differences with the original model:
* We pre-trained the MultiBERTs models for 2 million steps using sequence
length 512 (instead of 1 million steps using sequence length 128 then 512).
* We used an alternative version of Wikipedia and Books Corpus, initially
collected for Turc et al., 2019.
This is a best-effort reproduction, and so it is probable that differences with
the original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original
BERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).
See our technical report for more details.
### How to use
Using code from
BERT-base uncased, here is an example based on
Tensorflow:
PyTorch version:
info
|
[
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 40k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
"TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 40k (max: 2000k, i.e., 2M steps).",
"## Model Description\n\nThis model was captured during a reproduction of\nBERT-base uncased, for English: it\nis a Transformers model pretrained on a large corpus of English data, using the\nMasked Language Modelling (MLM) and the Next Sentence Prediction (NSP)\nobjectives.\n\nThe intended uses, limitations, training data and training procedure for the fully trained model are similar\nto BERT-base uncased. Two major\ndifferences with the original model:\n\n* We pre-trained the MultiBERTs models for 2 million steps using sequence\n length 512 (instead of 1 million steps using sequence length 128 then 512).\n* We used an alternative version of Wikipedia and Books Corpus, initially\n collected for Turc et al., 2019.\n\nThis is a best-effort reproduction, and so it is probable that differences with\nthe original model have gone unnoticed. The performance of MultiBERTs on GLUE after full training is oftentimes comparable to that of original\nBERT, but we found significant differences on the dev set of SQuAD (MultiBERTs outperforms original BERT).\nSee our technical report for more details.",
"### How to use\n\nUsing code from\nBERT-base uncased, here is an example based on\nTensorflow:\n\n\n\nPyTorch version:\n\n\n\ninfo"
] |
[
82,
220,
259,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tf #bert #pretraining #multiberts #multiberts-seed_3 #multiberts-seed_3-step_40k #en #arxiv-2106.16163 #arxiv-1908.08962 #license-apache-2.0 #endpoints_compatible #region-us \n# MultiBERTs, Intermediate Checkpoint - Seed 3, Step 40k\n\nMultiBERTs is a collection of checkpoints and a statistical library to support\nrobust research on BERT. We provide 25 BERT-base models trained with\nsimilar hyper-parameters as\nthe original BERT model but\nwith different random seeds, which causes variations in the initial weights and order of\ntraining instances. The aim is to distinguish findings that apply to a specific\nartifact (i.e., a particular instance of the model) from those that apply to the\nmore general procedure.\n\nWe also provide 140 intermediate checkpoints captured\nduring the course of pre-training (we saved 28 checkpoints for the first 5 runs).\n\nThe models were originally released through\nURL We describe them in our\npaper\nThe MultiBERTs: BERT Reproductions for Robustness Analysis.\n\nThis is model #3, captured at step 40k (max: 2000k, i.e., 2M steps)."
] |
[
-0.07758188247680664,
0.08659851551055908,
-0.0021109452936798334,
0.03897210210561752,
0.08204630017280579,
-0.014087783172726631,
0.08042816072702408,
0.10031697154045105,
-0.025246134027838707,
0.025661982595920563,
0.07954229414463043,
0.014508700929582119,
0.016817431896924973,
0.09372053295373917,
0.021545281633734703,
-0.22415791451931,
0.02347620204091072,
-0.029895052313804626,
-0.07514968514442444,
0.07870316505432129,
0.0982954129576683,
-0.0821770429611206,
0.04819551855325699,
0.024081354960799217,
-0.10668476670980453,
0.048547614365816116,
-0.007945113815367222,
-0.019803572446107864,
0.13211274147033691,
-0.001514958799816668,
0.047171205282211304,
0.05208209902048111,
0.03944212943315506,
-0.13614322245121002,
0.0072096227668225765,
0.05610516667366028,
0.054212670773267746,
0.041534893214702606,
0.02729102596640587,
0.08078532665967941,
0.004440550692379475,
0.02137089893221855,
0.04704277962446213,
0.024937376379966736,
-0.07514616847038269,
-0.05628172680735588,
-0.10585752129554749,
0.03198695927858353,
0.02670334465801716,
0.011960596777498722,
0.010215290822088718,
0.13974761962890625,
-0.03998717665672302,
0.050293609499931335,
0.1884077489376068,
-0.3424582779407501,
-0.01481618918478489,
0.09009809792041779,
0.051526110619306564,
0.12654584646224976,
-0.0063752299174666405,
-0.015987392514944077,
0.0755794569849968,
0.033539026975631714,
0.09034709632396698,
-0.04279443621635437,
0.046786967664957047,
-0.051584392786026,
-0.16285116970539093,
-0.03960517793893814,
0.1006142646074295,
-0.0023037283681333065,
-0.135810986161232,
-0.04899745061993599,
-0.03721575811505318,
0.028831763193011284,
0.008667165413498878,
-0.03315367177128792,
0.03258812054991722,
0.009093740954995155,
-0.020058825612068176,
0.00034045433858409524,
-0.10080963373184204,
-0.04905775934457779,
0.03665389120578766,
0.07586606591939926,
0.10364242643117905,
0.06038592755794525,
0.0034344508312642574,
0.10955481231212616,
-0.18895235657691956,
-0.04804954677820206,
-0.028814930468797684,
-0.06014631316065788,
-0.044219885021448135,
-0.015655457973480225,
-0.11113785207271576,
-0.04744219779968262,
0.014139988459646702,
0.13619861006736755,
-0.0032805094961076975,
0.0347931943833828,
-0.024987634271383286,
0.006146331317722797,
0.060819558799266815,
0.04543180763721466,
-0.01080731675028801,
0.007964564487338066,
0.020839005708694458,
-0.009968248195946217,
-0.020919501781463623,
0.017417380586266518,
0.005040987394750118,
0.033502768725156784,
0.12029695510864258,
0.02872634120285511,
-0.1044897735118866,
0.08030687272548676,
-0.014174836687743664,
-0.04749240353703499,
0.024176232516765594,
-0.0894273966550827,
-0.0650181770324707,
-0.04247594252228737,
0.002100052312016487,
0.022070612758398056,
-0.010684525594115257,
-0.008385785855352879,
-0.022626005113124847,
-0.03806988149881363,
-0.08235319703817368,
-0.04939689859747887,
-0.05287930741906166,
-0.130788654088974,
0.0034207815770059824,
-0.17798325419425964,
-0.03703761100769043,
-0.11564049124717712,
-0.18938308954238892,
-0.02800893411040306,
0.05860400199890137,
-0.01164983306080103,
-0.04982760548591614,
0.08177825808525085,
0.04401751607656479,
-0.030300593003630638,
-0.0020035149063915014,
0.06958945840597153,
-0.006477238144725561,
0.03989400714635849,
-0.020992370322346687,
0.0663396418094635,
0.008947361260652542,
0.03425923362374306,
-0.05234942585229874,
0.06067374348640442,
-0.18082988262176514,
0.04205721244215965,
-0.07706927508115768,
-0.01830351911485195,
-0.08570487052202225,
-0.04129919037222862,
-0.00017934472998604178,
0.008931988850235939,
0.01867447979748249,
0.07482299208641052,
-0.17967331409454346,
-0.02665247954428196,
0.11020547896623611,
-0.15518146753311157,
-0.025608856230974197,
0.07135095447301865,
-0.04374758526682854,
0.0909104123711586,
0.07088037580251694,
0.15568645298480988,
-0.014638127759099007,
-0.08533470332622528,
0.056261204183101654,
-0.01255730353295803,
0.0158989317715168,
-0.012776114977896214,
0.06800590455532074,
-0.023004041984677315,
-0.15494945645332336,
0.03708207607269287,
-0.13408412039279938,
-0.002573595615103841,
-0.07540208101272583,
0.020758075639605522,
-0.00578021677210927,
-0.06708622723817825,
-0.07515604048967361,
-0.02731245569884777,
0.06659673899412155,
-0.07897219061851501,
-0.013187729753553867,
0.027778079733252525,
0.07309915870428085,
-0.07546499371528625,
0.06351945549249649,
-0.012731383554637432,
0.01939297467470169,
-0.08452960103750229,
-0.03948049992322922,
-0.18828727304935455,
0.04129854962229729,
0.09966249763965607,
0.01531236618757248,
-0.022739024832844734,
0.14240100979804993,
0.009485638700425625,
0.06848469376564026,
-0.050372861325740814,
0.010094346478581429,
-0.0004980513476766646,
-0.004466281738132238,
-0.08702331781387329,
-0.0977129116654396,
-0.07740123569965363,
-0.07007630169391632,
0.07048597931861877,
-0.12404067814350128,
0.020817922428250313,
-0.05916724354028702,
0.03815016150474548,
0.016686301678419113,
-0.07995614409446716,
-0.008405466563999653,
0.018846040591597557,
-0.058041978627443314,
-0.0589534267783165,
0.04381456598639488,
0.06854460388422012,
-0.011347870342433453,
0.08729703724384308,
-0.056434568017721176,
-0.07982481271028519,
0.032374829053878784,
0.09475259482860565,
-0.11156556755304337,
0.0012602697825059295,
-0.0579356849193573,
-0.042307160794734955,
-0.05466940999031067,
-0.020339360460639,
0.08312275260686874,
-0.006815773900598288,
0.13802151381969452,
-0.07243002206087112,
-0.009246132336556911,
0.010245535522699356,
-0.01578148640692234,
-0.024510947987437248,
0.036213360726833344,
0.06063266098499298,
-0.06974098086357117,
0.014791552908718586,
0.040092382580041885,
0.013434807769954205,
0.07463715970516205,
-0.05512890964746475,
-0.083578921854496,
0.01821725443005562,
0.037763264030218124,
0.028456952422857285,
0.06686873733997345,
-0.019848113879561424,
-0.01861775480210781,
0.03247896954417229,
0.01871269755065441,
0.007787052541971207,
-0.10692302137613297,
0.05713373050093651,
0.054344892501831055,
0.008651258423924446,
0.06642412394285202,
-0.010471830144524574,
-0.04366001486778259,
0.076870858669281,
0.03785492479801178,
-0.007423051632940769,
-0.013479538261890411,
-0.017544077709317207,
-0.11449985951185226,
0.19720065593719482,
-0.063116654753685,
-0.16285765171051025,
-0.07213509827852249,
-0.118245430290699,
-0.008254391141235828,
0.02217714861035347,
0.03847711160778999,
-0.030571628361940384,
-0.050850674510002136,
-0.12429159879684448,
0.06568857282400131,
-0.04129815474152565,
0.06599996238946915,
0.10735593736171722,
-0.048465944826602936,
0.05332709848880768,
-0.1274852603673935,
-0.009299169294536114,
-0.0820656418800354,
-0.07419336587190628,
0.06375144422054291,
-0.05374031886458397,
0.03162829577922821,
0.09384584426879883,
0.03563080355525017,
-0.017349720001220703,
-0.028782857581973076,
0.20982709527015686,
0.040772199630737305,
0.037300895899534225,
0.13003848493099213,
-0.053736474364995956,
0.05234538018703461,
0.07982298731803894,
0.012955734506249428,
-0.0502970889210701,
0.0543656051158905,
0.04542883113026619,
-0.06518350541591644,
-0.19505667686462402,
-0.024207809939980507,
-0.011285766959190369,
-0.04342310503125191,
0.07200807332992554,
0.03781241551041603,
0.016228891909122467,
0.07043241709470749,
0.012476470321416855,
0.06304950267076492,
-0.003776791738346219,
0.10355155915021896,
0.020406438037753105,
-0.03436441347002983,
0.0911058634519577,
-0.02002711407840252,
-0.008802785538136959,
0.08226054906845093,
-0.014165005646646023,
0.2873658239841461,
-0.027189143002033234,
0.017980234697461128,
0.125314399600029,
0.03933872655034065,
0.061372045427560806,
0.12867526710033417,
-0.07014264911413193,
0.015117025002837181,
-0.0727570429444313,
-0.06209554150700569,
0.002884211950004101,
0.038964398205280304,
-0.057732321321964264,
0.0077237319201231,
-0.07242462038993835,
0.010917781852185726,
-0.015793176367878914,
0.31485602259635925,
0.10149367898702621,
-0.11454199254512787,
-0.05386807397007942,
0.0016095454338937998,
-0.096111960709095,
-0.06581152975559235,
0.047211047261953354,
0.05785167217254639,
-0.1366698294878006,
0.014480238780379295,
-0.02844913676381111,
0.07198011875152588,
-0.021381007507443428,
0.020061073824763298,
0.0424821563065052,
0.04554509371519089,
-0.04052792116999626,
0.004342805128544569,
-0.19430947303771973,
0.19056570529937744,
0.008963899686932564,
0.01980837620794773,
-0.050101857632398605,
0.0326826348900795,
0.010962841100990772,
-0.02492832951247692,
0.061674416065216064,
0.016646940261125565,
-0.021742690354585648,
-0.05457000434398651,
-0.048693154007196426,
0.014515253715217113,
0.07953004539012909,
-0.03803212195634842,
0.10867787152528763,
-0.005723066162317991,
0.04271084442734718,
0.01822284795343876,
0.08790986239910126,
-0.18548749387264252,
-0.09106626361608505,
0.030611051246523857,
-0.05520829185843468,
-0.10143591463565826,
-0.07699514925479889,
-0.09506995975971222,
-0.020992524921894073,
0.24994167685508728,
-0.11324068158864975,
-0.07626890391111374,
-0.09853220731019974,
0.01655760407447815,
0.10398540645837784,
-0.044991787523031235,
0.025584453716874123,
-0.008078436367213726,
0.12275426834821701,
-0.06376184523105621,
-0.1349143385887146,
0.024915356189012527,
-0.09998437762260437,
-0.15933389961719513,
-0.06462293118238449,
0.11618771404027939,
0.06091431900858879,
0.03380933031439781,
-0.02954522892832756,
0.020463764667510986,
0.03206127882003784,
-0.039527375251054764,
-0.00411643460392952,
0.06714191287755966,
0.1062374860048294,
0.035123057663440704,
-0.11336644738912582,
0.02185446210205555,
-0.06505067646503448,
-0.06465458124876022,
0.07737985998392105,
0.2574886679649353,
-0.04986979439854622,
0.1180574968457222,
0.11248133331537247,
-0.07492811232805252,
-0.1491202563047409,
0.03297289460897446,
0.08664977550506592,
-0.019107352942228317,
0.016003726050257683,
-0.15859843790531158,
0.08992169052362442,
0.11350014805793762,
-0.018400888890028,
0.004223324824124575,
-0.18459820747375488,
-0.12683948874473572,
0.07098060101270676,
0.1042490303516388,
0.26418739557266235,
-0.07139113545417786,
-0.04096844792366028,
0.01897866651415825,
-0.08777446299791336,
0.009860523045063019,
0.11938948929309845,
0.06827440857887268,
-0.027943940833210945,
-0.07751405984163284,
0.010811123996973038,
-0.04437443986535072,
0.0910988375544548,
0.05819542706012726,
0.06234006956219673,
-0.006155918352305889,
0.026535458862781525,
-0.024318760260939598,
-0.04396917670965195,
0.06456360220909119,
0.013717364519834518,
0.045096252113580704,
-0.08629930019378662,
-0.02886854112148285,
-0.07349999248981476,
0.02633696235716343,
-0.02471315860748291,
-0.07726415246725082,
-0.057652778923511505,
0.08067912608385086,
0.0462893582880497,
-0.02716747857630253,
0.02742953412234783,
0.023614434525370598,
0.12050138413906097,
0.15442147850990295,
0.0034531806595623493,
-0.04656551778316498,
-0.073953777551651,
-0.03857455402612686,
-0.01698332466185093,
0.0715390220284462,
-0.039953626692295074,
0.014171578921377659,
0.06441064178943634,
0.01784057542681694,
0.09860347956418991,
0.05779176577925682,
-0.11160304397344589,
-0.020753536373376846,
0.033999472856521606,
-0.16448688507080078,
0.026289286091923714,
0.0020473506301641464,
0.023169420659542084,
-0.03198658302426338,
0.03888510540127754,
0.14496563374996185,
-0.05945473536849022,
-0.033005159348249435,
-0.04070110246539116,
0.06783578544855118,
0.02457248605787754,
0.14632219076156616,
0.03331046923995018,
0.038075003772974014,
-0.08481435477733612,
0.12211454659700394,
0.031383808702230453,
-0.031165456399321556,
0.022267555817961693,
-0.018310951068997383,
-0.11355023831129074,
0.009581654332578182,
0.06307777017354965,
0.04399016499519348,
-0.05287223309278488,
-0.009348256513476372,
-0.03075583092868328,
-0.075189508497715,
0.0608237199485302,
0.18165287375450134,
0.06763006001710892,
0.06845611333847046,
-0.05529586970806122,
-0.040424689650535583,
-0.07831420004367828,
0.03728038817644119,
0.029565194621682167,
0.07631862163543701,
-0.0787249282002449,
0.09531008452177048,
0.013123746030032635,
0.040116045624017715,
-0.030307695269584656,
-0.052722107619047165,
-0.10501333326101303,
-0.054262809455394745,
-0.10068299621343613,
0.0018376598600298166,
-0.0795530155301094,
-0.036480508744716644,
-0.004623936954885721,
0.004673457704484463,
-0.006529027130454779,
0.04980611428618431,
-0.060025352984666824,
-0.010455883108079433,
-0.016808584332466125,
0.038439732044935226,
-0.06081360578536987,
-0.033160626888275146,
0.02085859142243862,
-0.10226479172706604,
0.0917338877916336,
0.04397711902856827,
0.005554248113185167,
0.00879358034580946,
0.09435911476612091,
-0.022598009556531906,
0.02527082897722721,
0.010494149290025234,
-0.04714486002922058,
-0.07892382144927979,
-0.0037345855962485075,
-0.008543088100850582,
-0.018804257735610008,
-0.005610945168882608,
0.07971974462270737,
-0.0864020362496376,
0.038084059953689575,
-0.0050440398044884205,
-0.00036487498437054455,
-0.07281866669654846,
-0.011122793890535831,
0.10111429542303085,
0.09448608011007309,
0.046361811459064484,
-0.09373418241739273,
0.011983935721218586,
-0.13725458085536957,
-0.0397227518260479,
0.0061324904672801495,
-0.016712093725800514,
-0.12999922037124634,
-0.007494205143302679,
0.023749245330691338,
-0.004858499858528376,
0.22039547562599182,
-0.058867197483778,
-0.01678336411714554,
0.018819546326994896,
-0.08885587751865387,
0.11939117312431335,
-0.028225528076291084,
0.17964082956314087,
-0.011433074250817299,
-0.04063301160931587,
-0.009118850342929363,
0.04685564711689949,
0.018333883956074715,
-0.028149234130978584,
0.17921532690525055,
0.13752765953540802,
0.029406417161226273,
0.038813747465610504,
-0.024566616863012314,
-0.008348419331014156,
-0.0440664179623127,
-0.03140776604413986,
0.03851427137851715,
0.04659144580364227,
0.017024455592036247,
0.149322509765625,
0.06711976230144501,
-0.1659148782491684,
0.03357233107089996,
-0.03368307277560234,
-0.03751690313220024,
-0.11052251607179642,
-0.09441409260034561,
-0.02717200666666031,
-0.07384413480758667,
0.010036389343440533,
-0.12735773622989655,
0.0009158689063042402,
0.181818425655365,
0.0645579919219017,
0.02633473463356495,
0.01365068182349205,
-0.12101945281028748,
-0.03354165703058243,
0.055429670959711075,
0.011215985752642155,
0.019912289455533028,
0.054517604410648346,
0.0034305613953620195,
0.055098939687013626,
0.027486948296427727,
0.0134132644161582,
-0.0032501835376024246,
0.07452725619077682,
0.02005208656191826,
0.039660170674324036,
-0.06446290761232376,
-0.0028775345999747515,
-0.03716915845870972,
0.06894182413816452,
0.11148326843976974,
0.04642903059720993,
-0.05416405573487282,
-0.00887391623109579,
0.15016275644302368,
-0.036969028413295746,
0.005963190458714962,
-0.12594544887542725,
0.33115822076797485,
0.01446481328457594,
0.009953428991138935,
0.04152856022119522,
-0.07364816963672638,
-0.04754478484392166,
0.21047180891036987,
0.09816335886716843,
-0.01805163361132145,
-0.021599270403385162,
-0.00021336022473406047,
-0.030127091333270073,
-0.023272117599844933,
0.1517685353755951,
0.04169446602463722,
0.12313468754291534,
-0.053488120436668396,
-0.04611022397875786,
-0.02732485719025135,
-0.012463312596082687,
-0.12164201587438583,
0.125062957406044,
-0.014917897060513496,
-0.02591886930167675,
-0.0670957937836647,
0.029672594740986824,
0.0690860003232956,
-0.30790215730667114,
-0.0025790019426494837,
-0.028586335480213165,
-0.10749475657939911,
-0.01133950985968113,
-0.023020198568701744,
-0.021278051659464836,
0.047821950167417526,
-0.042469948530197144,
0.0692053809762001,
0.03971361741423607,
0.0307097677141428,
-0.02233877405524254,
-0.10055367648601532,
0.1714707762002945,
0.06106758117675781,
0.08762793242931366,
0.024851670488715172,
0.07534699141979218,
0.058888331055641174,
0.031164150685071945,
-0.09407056123018265,
0.050821494311094284,
0.013979802839457989,
-0.09183233976364136,
-0.05028543621301651,
0.11569389700889587,
0.001839524251408875,
0.04819469898939133,
0.0368526317179203,
-0.10233915597200394,
0.01951884664595127,
0.069966159760952,
-0.06549710780382156,
-0.10013269633054733,
-0.00010648096940713003,
-0.09011274576187134,
0.16279681026935577,
0.14914976060390472,
-0.015295577235519886,
0.01793299987912178,
-0.06861083954572678,
-0.004255073145031929,
0.05312377214431763,
0.001338776433840394,
-0.025212030857801437,
-0.1914367377758026,
0.03643340617418289,
-0.08751317858695984,
-0.008051950484514236,
-0.23144087195396423,
-0.10227304697036743,
-0.00874785054475069,
-0.04805769771337509,
-0.03382565453648567,
0.05599056929349899,
0.02795455977320671,
0.0697590783238411,
-0.01776663027703762,
-0.020322492346167564,
-0.03187869116663933,
0.08837617933750153,
-0.11322617530822754,
-0.06581871956586838
] |
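The "How to use" fields in the row above reference TensorFlow and PyTorch snippets that were stripped during text processing. As a minimal sketch only, assuming the checkpoint for this row is published under the name `google/multiberts-seed_3-step_40k` (inferred from the row's `multiberts-seed_3-step_40k` tag and not confirmed here), the PyTorch version would look like:

```
from transformers import BertTokenizer, BertModel

# Hypothetical checkpoint id inferred from the row's tags; verify it exists on the Hub.
checkpoint = "google/multiberts-seed_3-step_40k"

tokenizer = BertTokenizer.from_pretrained(checkpoint)
model = BertModel.from_pretrained(checkpoint)

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors="pt")
output = model(**encoded_input)
```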