Dataset Viewer
Auto-converted to Parquet

| Column | Type | Range / classes |
|---|---|---|
| modelId | string | length 4–112 |
| sha | string | length 40 |
| lastModified | string | length 24 |
| tags | sequence | |
| pipeline_tag | string | 29 classes |
| private | bool | 1 class |
| author | string | length 2–38 |
| config | null | |
| id | string | length 4–112 |
| downloads | float64 | 0–36.8M |
| likes | float64 | 0–712 |
| library_name | string | 17 classes |
| readme | string | length 0–186k |
| embedding | sequence | |
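The columns above describe the auto-converted Parquet split. A minimal sketch of loading and inspecting such a dataset with the `datasets` library is shown below; the repository ID is a placeholder, since the dataset's actual Hub ID is not given on this page.

```python
from datasets import load_dataset

# Hypothetical repository ID -- replace with the dataset's actual Hub ID.
ds = load_dataset("<namespace>/<dataset-name>", split="train")

# Inspect the schema and one row of model metadata.
print(ds.features)            # column names and types (modelId, tags, readme, embedding, ...)
row = ds[0]
print(row["modelId"], row["downloads"], row["likes"])
print(len(row["embedding"]))  # the embedding column is a sequence of floats
```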
hfl/chinese-macbert-base
a986e004d2a7f2a1c2f5a3edef4e20604a974ed1
2021-05-19T19:09:45.000Z
[ "pytorch", "tf", "jax", "bert", "fill-mask", "zh", "arxiv:2004.13922", "transformers", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
hfl
null
hfl/chinese-macbert-base
36,823,840
43
transformers
---
language:
- zh
tags:
- bert
license: "apache-2.0"
---

<p align="center">
<br>
<img src="https://github.com/ymcui/MacBERT/raw/master/pics/banner.png" width="500"/>
<br>
</p>
<p align="center">
<a href="https://github.com/ymcui/MacBERT/blob/master/LICENSE">
<img alt="GitHub" src="https://img.shields.io/github/license/ymcui/MacBERT.svg?color=blue&style=flat-square">
</a>
</p>

# Please use 'Bert' related functions to load this model!

This repository contains the resources in our paper **"Revisiting Pre-trained Models for Chinese Natural Language Processing"**, which will be published in "[Findings of EMNLP](https://2020.emnlp.org)". You can read our camera-ready paper through [ACL Anthology](#) or the [arXiv pre-print](https://arxiv.org/abs/2004.13922).

**[Revisiting Pre-trained Models for Chinese Natural Language Processing](https://arxiv.org/abs/2004.13922)**

*Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Shijin Wang, Guoping Hu*

You may also be interested in:

- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Introduction

**MacBERT** is an improved BERT with a novel **M**LM **a**s **c**orrection pre-training task, which mitigates the discrepancy between pre-training and fine-tuning. Instead of masking with the [MASK] token, which never appears in the fine-tuning stage, **we propose to use similar words for the masking purpose**. A similar word is obtained with the [Synonyms toolkit (Wang and Hu, 2017)](https://github.com/chatopera/Synonyms), which is based on word2vec (Mikolov et al., 2013) similarity calculations. If an N-gram is selected for masking, we find a similar word for each token individually. In rare cases, when there is no similar word, we fall back to random word replacement.

Here is an example of our pre-training task.

| | Example |
| -------------- | ----------------- |
| **Original Sentence** | we use a language model to predict the probability of the next word. |
| **MLM** | we use a language [M] to [M] ##di ##ct the pro [M] ##bility of the next word . |
| **Whole word masking** | we use a language [M] to [M] [M] [M] the [M] [M] [M] of the next word . |
| **N-gram masking** | we use a [M] [M] to [M] [M] [M] the [M] [M] [M] [M] [M] next word . |
| **MLM as correction** | we use a text system to ca ##lc ##ulate the po ##si ##bility of the next word . |

In addition to the new pre-training task, we also incorporate the following techniques.

- Whole Word Masking (WWM)
- N-gram masking
- Sentence-Order Prediction (SOP)

**Note that MacBERT can directly replace the original BERT, as there are no differences in the main neural architecture.**

For more technical details, please check our paper: [Revisiting Pre-trained Models for Chinese Natural Language Processing](https://arxiv.org/abs/2004.13922)

## Citation

If you find our resources or paper useful, please consider including the following citation in your paper.

- https://arxiv.org/abs/2004.13922

```
@inproceedings{cui-etal-2020-revisiting,
  title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
  author = "Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Wang, Shijin and Hu, Guoping",
  booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
  month = nov,
  year = "2020",
  address = "Online",
  publisher = "Association for Computational Linguistics",
  url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
  pages = "657--668",
}
```
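Since the card asks users to load this model with BERT-related classes, a minimal usage sketch (our own illustration, not part of the original card) could look like this:

```python
from transformers import BertTokenizer, BertForMaskedLM
import torch

# MacBERT shares BERT's architecture, so the standard BERT classes load it directly.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-macbert-base")
model = BertForMaskedLM.from_pretrained("hfl/chinese-macbert-base")

text = "今天[MASK]气真好。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Print the top prediction for the masked position.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```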
[ -0.12368089705705643, 0.0159906018525362, 0.07993485778570175, 0.010601235553622246, 0.06334946304559708, 0.02685576304793358, 0.021748080849647522, 0.007624071557074785, 0.0053676413372159, -0.0035076644271612167, 0.08696867525577545, -0.03400150686502457, 0.06885205209255219, 0.06367935240268707, 0.09161554276943207, 0.12452159821987152, -0.011798147112131119, 0.009786052629351616, -0.013559685088694096, -0.02453933283686638, 0.041914522647857666, -0.0069780610501766205, 0.07124654948711395, -0.05039214342832565, 0.011143012903630733, 0.019245531409978867, 0.010812710970640182, 0.0316716805100441, 0.08623003214597702, 0.0019279281841591, -0.0058570122346282005, -0.022081663832068443, 0.06999083608388901, 0.11618760973215103, 0.08112794905900955, 0.0933358445763588, 0.008507990278303623, -0.04130120947957039, -0.0029796846210956573, 0.03927253559231758, -0.022822173312306404, -0.016029944643378258, 0.023845218122005463, -0.025263754650950432, 0.11155292391777039, -0.013507002033293247, -0.04680502787232399, -0.039015162736177444, -0.044154636561870575, -0.04366300255060196, -0.09107059240341187, -0.10429319739341736, 0.038736771792173386, 0.02137715183198452, 0.03468162193894386, 0.05056798458099365, 0.00009925341873895377, -0.1015234962105751, 0.03688538074493408, -0.04115312546491623, -0.06699245423078537, 0.04689601808786392, -0.04374814033508301, 0.017040817067027092, -0.04805529862642288, 0.04689933359622955, -0.06438924372196198, 0.033111341297626495, 0.003901514457538724, -0.015206824988126755, -0.033730607479810715, -0.026794472709298134, 0.02023431658744812, -0.016280394047498703, -0.02890953980386257, -0.08763381838798523, 0.06940080970525742, -0.017188506200909615, -0.01677684485912323, -0.13255378603935242, -0.0321032851934433, 0.0057119345292449, 0.0688258558511734, 0.02724544331431389, 0.11000185459852219, 0.04522807523608208, -0.02923044003546238, 0.012787189334630966, -0.03405408561229706, 0.0209773201495409, -0.016445938497781754, -0.02845063805580139, 0.03200048953294754, 0.036922384053468704, -0.006143038626760244, 0.0071402015164494514, 0.05788221210241318, -0.06333218514919281, -0.0654711201786995, 0.13208115100860596, 0.002559973858296871, 0.01606440730392933, 0.10675592720508575, -0.05221341922879219, 0.0312284454703331, -0.029797974973917007, 0.008975659497082233, 0.04135391116142273, 0.029717274010181427, -0.09412312507629395, 0.0000987965686363168, -0.08250454813241959, -0.05944410338997841, -0.03579547628760338, -0.027565736323595047, -0.08233797550201416, -0.021781248971819878, -0.01496629323810339, 0.06706778705120087, -0.062109872698783875, 0.09188660234212875, 0.031843774020671844, -0.08324812352657318, 0.007853598333895206, -0.0016558649949729443, -0.05503075197339058, -0.04477791488170624, 3.9872507567082904e-33, 0.09515433758497238, 0.03779875114560127, -0.00910892989486456, 0.035569556057453156, 0.026999978348612785, -0.02364065684378147, -0.00616849958896637, -0.05063116177916527, -0.08907615393400192, 0.0029398822225630283, -0.03419322147965431, 0.002009950578212738, -0.15423442423343658, 0.0849522277712822, -0.019485004246234894, -0.035448089241981506, -0.012858436442911625, 0.024927910417318344, 0.04669623076915741, 0.01089312881231308, 0.003843283513560891, 0.008892863057553768, 0.034779276698827744, -0.10218218713998795, -0.09148538112640381, 0.09782110154628754, 0.09966292977333069, -0.12279116362333298, -0.026402883231639862, 0.04752308502793312, -0.0654626190662384, 0.02498999983072281, 0.01772530935704708, 0.008468154817819595, 
-0.004783882759511471, -0.04649516940116882, -0.02268897369503975, -0.03329654410481453, 0.01194672379642725, 0.011852686293423176, -0.04525561258196831, 0.025036746636033058, -0.015859657898545265, -0.04264650493860245, -0.006491352804005146, 0.01912064477801323, 0.047626350075006485, -0.03425963595509529, 0.03000185452401638, -0.027314946055412292, 0.019932080060243607, -0.023331986740231514, -0.04227319359779358, 0.02823178470134735, -0.0018153005512431264, -0.03652898594737053, 0.0451020747423172, -0.00048811850138008595, -0.028571717441082, -0.026259105652570724, 0.01731915771961212, -0.01028137281537056, 0.025083104148507118, -0.023804182186722755, 0.047672953456640244, -0.01939406432211399, -0.04154164344072342, -0.03524061292409897, 0.014430850744247437, -0.008810351602733135, -0.03437848761677742, 0.02321711927652359, 0.04409072920680046, -0.015636596828699112, 0.009047046303749084, -0.05445496737957001, 0.004630557727068663, -0.05701594427227974, -0.00040714070200920105, 0.04818731173872948, -0.06711646914482117, -0.031921107321977615, -0.04377150908112526, -0.07308953255414963, -0.0232989639043808, -0.00997752696275711, 0.1418170928955078, -0.034938812255859375, 0.012658159248530865, -0.020953821018338203, 0.00497033167630434, -0.03130747377872467, 0.03891034051775932, -0.015577688813209534, -0.08390413969755173, -3.843031191515827e-33, 0.005681055132299662, 0.06880983710289001, -0.11015527695417404, 0.0349394865334034, -0.03294600173830986, -0.055837225168943405, 0.10670159757137299, 0.1367025077342987, -0.018905166536569595, -0.052034731954336166, 0.017584217712283134, -0.015417062677443027, -0.031228244304656982, -0.03925216943025589, 0.0537862665951252, 0.005823249462991953, 0.004189756698906422, 0.07110357284545898, -0.058143697679042816, 0.022288592532277107, 0.03518078103661537, -0.007328149396926165, -0.07043279707431793, 0.044390201568603516, 0.007626739330589771, 0.08215285837650299, 0.06917277723550797, 0.03982473164796829, -0.009509596042335033, 0.041881855577230453, -0.06125647574663162, -0.0040566762909293175, -0.04321371018886566, 0.014745458960533142, -0.02299700491130352, 0.010232437402009964, 0.020447956398129463, 0.005081123672425747, -0.026925157755613327, 0.04378286749124527, 0.11609587073326111, -0.016513504087924957, -0.06904041022062302, 0.0038948419969528913, 0.0030253815930336714, 0.02673092857003212, -0.08374432474374771, -0.007153729908168316, -0.02205284871160984, -0.018915848806500435, -0.003292633453384042, -0.06348959356546402, -0.020496850833296776, -0.047494590282440186, -0.08219953626394272, -0.038372304290533066, 0.026045002043247223, -0.05049319192767143, -0.02871466800570488, -0.014771162532269955, -0.11460384726524353, -0.03986293077468872, 0.01889883540570736, -0.01834229752421379, 0.01578482985496521, -0.029306333512067795, -0.04342275857925415, 0.04168817028403282, -0.05810849368572235, -0.025838812813162804, 0.0049243285320699215, 0.05631619691848755, 0.026484288275241852, -0.024803340435028076, 0.01361824106425047, -0.03322697803378105, -0.005708959884941578, 0.012137352488934994, 0.019497264176607132, -0.032136883586645126, -0.001377750770188868, 0.01784142479300499, 0.025652771815657616, 0.04457484185695648, -0.02470630779862404, 0.0685778334736824, -0.0009807632304728031, 0.057042721658945084, 0.021894635632634163, -0.03394084423780441, -0.045410193502902985, 0.09921150654554367, 0.05109838396310806, 0.1163184866309166, -0.004373463336378336, -6.112880868158754e-8, -0.13637597858905792, -0.06666217744350433, -0.012441158294677734, 
-0.05018964037299156, -0.008073853328824043, -0.021447304636240005, 0.006277942098677158, -0.026682406663894653, 0.00823950581252575, -0.08113652467727661, 0.052003562450408936, 0.08876891434192657, -0.09241592139005661, 0.02386777475476265, -0.06570520997047424, 0.04055076465010643, -0.025593701750040054, 0.09918369352817535, 0.004524169024080038, -0.06846771389245987, 0.018662724643945694, 0.027569139376282692, 0.013464431278407574, 0.002088185865432024, -0.0638854131102562, -0.016810867935419083, -0.13969619572162628, 0.09635626524686813, -0.04342053830623627, -0.020619528368115425, 0.048024244606494904, 0.03854161128401756, 0.022140290588140488, 0.015644367784261703, 0.08534976840019226, 0.01574641838669777, -0.06530338525772095, -0.09666816145181656, -0.03873457759618759, 0.015085259452462196, 0.0877658948302269, 0.022334499284625053, -0.03204234689474106, -0.019491441547870636, 0.10277561843395233, -0.008164849132299423, 0.05094192177057266, -0.052191197872161865, 0.017249681055545807, 0.014338553883135319, 0.04345440864562988, -0.07436955720186234, -0.014294188469648361, 0.02477424219250679, -0.01830519549548626, -0.012047587893903255, 0.032452043145895004, -0.006100224796682596, 0.05766334757208824, 0.07216985523700714, 0.03660677745938301, 0.06690988689661026, 0.03807584196329117, 0.05661563202738762 ]
microsoft/deberta-base
7d4c0126b06bd59dccd3e48e467ed11e37b77f3f
2022-01-13T13:56:18.000Z
[ "pytorch", "tf", "rust", "deberta", "en", "arxiv:2006.03654", "transformers", "deberta-v1", "license:mit" ]
null
false
microsoft
null
microsoft/deberta-base
23,662,412
15
transformers
---
language: en
tags: deberta-v1
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---

## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on the majority of NLU tasks with 80GB of training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.

#### Fine-tuning on NLU tasks

We present the dev results on the SQuAD 1.1/2.0 and MNLI tasks.

| Model | SQuAD 1.1 | SQuAD 2.0 | MNLI-m |
|-------------------|-----------|-----------|--------|
| RoBERTa-base | 91.5/84.6 | 83.7/80.5 | 87.6 |
| XLNet-Large | -/- | -/80.2 | 86.8 |
| **DeBERTa-base** | 93.1/87.2 | 86.2/83.1 | 88.8 |

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@inproceedings{
he2021deberta,
title={DEBERTA: DECODING-ENHANCED BERT WITH DISENTANGLED ATTENTION},
author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
booktitle={International Conference on Learning Representations},
year={2021},
url={https://openreview.net/forum?id=XPZIaotutsD}
}
```
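The card gives no loading example; a minimal feature-extraction sketch with the generic Auto classes (our own illustration, not taken from the card or the official repository) might look like this:

```python
from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = AutoModel.from_pretrained("microsoft/deberta-base")

text = "DeBERTa uses disentangled attention over content and position."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one contextual vector per input token.
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```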
[ -0.10310466587543488, -0.1119682714343071, 0.015463879331946373, 0.00776915205642581, 0.017533447593450546, -0.0014948627213016152, -0.015752648934721947, 0.038549475371837616, -0.017899053171277046, 0.048586562275886536, 0.026507191359996796, -0.007420028559863567, -0.02895142324268818, 0.011676686815917492, 0.05610055848956108, 0.10373175144195557, 0.11906201392412186, 0.00013945733371656388, -0.11269666999578476, -0.018713684752583504, 0.04179934039711952, -0.00038178180693648756, 0.04115593805909157, -0.05780600383877754, 0.0772848054766655, -0.037075161933898926, -0.0413367860019207, -0.04774325713515282, 0.05839122086763382, -0.0005428643780760467, 0.030714884400367737, 0.03183400630950928, 0.05710138753056526, 0.10364996641874313, 0.005641112104058266, 0.0841573029756546, 0.015114917419850826, -0.04312793165445328, 0.0422201007604599, -0.010868671350181103, -0.048135701566934586, 0.10785646736621857, -0.044377487152814865, -0.0341414250433445, 0.04455282911658287, -0.07409476488828659, 0.0018661754438653588, 0.016742665320634842, -0.020959675312042236, -0.04378517344594002, -0.07810170948505402, -0.00042894112993963063, 0.049286775290966034, 0.057064130902290344, -0.04308115318417549, -0.009413663297891617, 0.004795206245034933, -0.0039461227133870125, -0.03867325186729431, -0.021799542009830475, -0.06609603762626648, -0.06192491203546524, -0.06108545884490013, -0.024095969274640083, -0.024669531732797623, -0.005182500928640366, -0.019619714468717575, -0.0027617434971034527, 0.025947028771042824, -0.002403381047770381, -0.0006910886731930077, 0.023914065212011337, -0.033125292509794235, 0.0008575845276936889, 0.011537790298461914, 0.01743622124195099, 0.04998176544904709, -0.009752092882990837, 0.06623443961143494, -0.12556254863739014, 0.05519763380289078, -0.03548287972807884, 0.09530632197856903, 0.009886113926768303, 0.13587622344493866, -0.02605583332479, 0.02260863408446312, 0.0017649097135290504, -0.04547342285513878, -0.04735971614718437, -0.05158477649092674, -0.06724913418292999, 0.06409426778554916, 0.025856684893369675, 0.003480656538158655, 0.03629261627793312, 0.04051234945654869, 0.01052937749773264, -0.05604775622487068, 0.090065598487854, -0.023834949359297752, 0.034766729921102524, -0.021668365225195885, -0.0344795398414135, -0.04611264914274216, -0.014072253368794918, 0.12506826221942902, 0.03800078481435776, 0.02756485715508461, -0.061700738966464996, 0.04803468659520149, 0.03035777434706688, -0.023793326690793037, 0.0004723850579466671, 0.019804585725069046, -0.02437763474881649, 0.10293775051832199, 0.016911154612898827, 0.03899920731782913, 0.05113217234611511, 0.019636977463960648, -0.0402672253549099, -0.016473183408379555, -0.007001808378845453, 0.02217528410255909, 0.04383080452680588, -0.013859890401363373, 1.5994505311127538e-33, 0.06657180190086365, 0.027629615738987923, -0.013961181975901127, 0.0018153850687667727, -0.01749434880912304, 0.005814395844936371, -0.004458277951925993, 0.0133542250841856, -0.05880117788910866, 0.021671734750270844, -0.0785636454820633, 0.030311347916722298, -0.044702671468257904, 0.050478868186473846, -0.04199743643403053, -0.05068090558052063, 0.004110063426196575, 0.06554365158081055, 0.04271663725376129, 0.07575643062591553, 0.07556372880935669, 0.06700040400028229, -0.04544569179415703, -0.0646907389163971, -0.006601275410503149, 0.041455402970314026, 0.11527253687381744, -0.0794740542769432, -0.012357083149254322, 0.037368498742580414, -0.1525287926197052, 0.03588284179568291, -0.048162031918764114, 
0.021220088005065918, -0.005572366062551737, -0.02581806108355522, -0.06689317524433136, -0.035378120839595795, 0.055423103272914886, -0.019299274310469627, 0.0021569004748016596, 0.05402275174856186, -0.02936478704214096, -0.07360396534204483, -0.05220339074730873, -0.0077002993784844875, 0.03159841522574425, -0.02073354832828045, 0.03671075403690338, -0.0024768461007624865, 0.07021190226078033, 0.019100654870271683, -0.1080954447388649, -0.034958548843860626, -0.006711389869451523, -0.06463238596916199, 0.12113174051046371, 0.05992475524544716, 0.09084602445363998, 0.05257830396294594, 0.029842983931303024, -0.07832162827253342, -0.008325078524649143, 0.02834157459437847, 0.07575772702693939, -0.038154713809490204, -0.051713794469833374, 0.010610015131533146, 0.016691191121935844, -0.010124060325324535, -0.08932680636644363, -0.04023421183228493, 0.055127471685409546, 0.02118282951414585, 0.053124092519283295, -0.012107742950320244, 0.07468301057815552, -0.054406702518463135, -0.0026985276490449905, 0.015964249148964882, -0.007590766996145248, 0.03694744408130646, -0.06025613844394684, -0.05334937199950218, -0.10420238226652145, -0.05864563211798668, 0.016782622784376144, -0.0848134383559227, -0.04466012865304947, 0.018021641299128532, 0.02923840656876564, 0.0074362740851938725, -0.016694100573658943, -0.012121578678488731, -0.029531393200159073, -1.2798717288550794e-33, -0.027465270832180977, 0.06697531044483185, -0.08306576311588287, 0.020274141803383827, -0.020047517493367195, -0.005527551751583815, 0.030457766726613045, 0.09332279860973358, 0.02660585753619671, 0.004634397104382515, 0.021129027009010315, -0.10083836317062378, -0.021064436063170433, -0.002302655251696706, 0.010793506167829037, 0.029548130929470062, 0.04841534048318863, -0.007163175381720066, -0.01180830504745245, -0.03847474977374077, 0.002386799780651927, 0.05338211730122566, 0.011575094424188137, 0.05030996352434158, -0.003194352611899376, 0.06235743314027786, -0.03258662670850754, 0.07505451887845993, -0.004913385957479477, -0.04262498393654823, -0.031276870518922806, 0.009566969238221645, -0.045318346470594406, -0.00836420338600874, -0.08143414556980133, 0.050018299371004105, -0.04664899408817291, -0.014468727633357048, -0.02596917375922203, 0.034013088792562485, 0.059354133903980255, -0.04208263009786606, -0.035734012722969055, 0.07300681620836258, 0.007672434207051992, -0.07456439733505249, -0.09048731625080109, -0.07758841663599014, -0.055981751531362534, 0.08759476244449615, 0.02793055586516857, 0.007378160487860441, -0.14372658729553223, -0.01642449013888836, -0.07033853977918625, -0.08782397955656052, 0.06216658651828766, -0.07376757264137268, -0.037953928112983704, 0.03189399838447571, 0.003481512889266014, 0.03558482602238655, -0.0059868632815778255, 0.029777370393276215, -0.006839497480541468, 0.020358476787805557, 0.02708692103624344, 0.04462640732526779, 0.011872041039168835, 0.05993601679801941, 0.04853992164134979, -0.0617828369140625, 0.04693448543548584, 0.0595419779419899, -0.011595534160733223, 0.046088363975286484, -0.02982044219970703, -0.0741802453994751, -0.0316292978823185, -0.08777559548616409, -0.09400491416454315, -0.0430799201130867, 0.0012567967642098665, 0.06108901649713516, 0.01077250950038433, 0.12168262153863907, 0.03152528032660484, 0.025714214891195297, -0.018563976511359215, 0.01365111768245697, -0.026907742023468018, -0.054912105202674866, 0.028111301362514496, 0.06933650374412537, 0.002669497160241008, -5.2565937380677497e-8, -0.07737111300230026, 0.023504378274083138, 
-0.06814935803413391, 0.013510596938431263, -0.08726222813129425, -0.10965876281261444, -0.06929775327444077, 0.07397295534610748, 0.008355950005352497, 0.00800460297614336, 0.10421135276556015, 0.013045383617281914, -0.12436142563819885, 0.014649596065282822, 0.05167539045214653, 0.0986645445227623, 0.02061818540096283, 0.0016761475708335638, -0.010892997495830059, -0.04243142530322075, 0.046105097979307175, 0.02695349045097828, -0.010210066102445126, -0.029202399775385857, 0.00694645568728447, -0.053144827485084534, -0.0648675188422203, 0.1098310723900795, 0.018134240061044693, -0.042069658637046814, 0.015892360359430313, 0.05210919678211212, -0.05758732557296753, -0.03435100242495537, 0.017377778887748718, 0.019273335114121437, 0.006408201530575752, -0.05510994791984558, -0.037613581866025925, 0.05819851532578468, 0.08037717640399933, 0.022305790334939957, -0.10255385935306549, 0.011780810542404652, 0.06536136567592621, 0.020022612065076828, 0.0006359907565638423, -0.08100403845310211, 0.04261159524321556, -0.026044337078928947, 0.051565591245889664, -0.05803786963224411, -0.04338210076093674, 0.07772205024957657, -0.03211448714137077, 0.04459163919091225, -0.044612910598516464, -0.002741842996329069, 0.06494173407554626, 0.01728961616754532, -0.03437575325369835, 0.012379856780171394, -0.04856253042817116, 0.003610059851780534 ]
bert-base-uncased
418430c3b5df7ace92f2aede75700d22c78a0f95
2022-06-06T11:41:24.000Z
[ "pytorch", "tf", "jax", "rust", "bert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1810.04805", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
null
null
bert-base-uncased
22,268,934
204
transformers
---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- bookcorpus
- wikipedia
---

# BERT base model (uncased)

Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in [this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference between english and English.

Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by the Hugging Face team.

## Model description

BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives:

- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not.

This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the BERT model as inputs.

## Intended uses & limitations

You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at a model like GPT2.

### How to use

You can use this model directly with a pipeline for masked language modeling:

```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] hello i'm a fashion model. [SEP]", 'score': 0.1073106899857521, 'token': 4827, 'token_str': 'fashion'}, {'sequence': "[CLS] hello i'm a role model. [SEP]", 'score': 0.08774490654468536, 'token': 2535, 'token_str': 'role'}, {'sequence': "[CLS] hello i'm a new model. [SEP]", 'score': 0.05338378623127937, 'token': 2047, 'token_str': 'new'}, {'sequence': "[CLS] hello i'm a super model. [SEP]", 'score': 0.04667217284440994, 'token': 3565, 'token_str': 'super'}, {'sequence': "[CLS] hello i'm a fine model.
[SEP]", 'score': 0.027095865458250046, 'token': 2986, 'token_str': 'fine'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('bert-base-uncased') model = BertModel.from_pretrained("bert-base-uncased") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` and in TensorFlow: ```python from transformers import BertTokenizer, TFBertModel tokenizer = BertTokenizer.from_pretrained('bert-base-uncased') model = TFBertModel.from_pretrained("bert-base-uncased") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='tf') output = model(encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='bert-base-uncased') >>> unmasker("The man worked as a [MASK].") [{'sequence': '[CLS] the man worked as a carpenter. [SEP]', 'score': 0.09747550636529922, 'token': 10533, 'token_str': 'carpenter'}, {'sequence': '[CLS] the man worked as a waiter. [SEP]', 'score': 0.0523831807076931, 'token': 15610, 'token_str': 'waiter'}, {'sequence': '[CLS] the man worked as a barber. [SEP]', 'score': 0.04962705448269844, 'token': 13362, 'token_str': 'barber'}, {'sequence': '[CLS] the man worked as a mechanic. [SEP]', 'score': 0.03788609802722931, 'token': 15893, 'token_str': 'mechanic'}, {'sequence': '[CLS] the man worked as a salesman. [SEP]', 'score': 0.037680890411138535, 'token': 18968, 'token_str': 'salesman'}] >>> unmasker("The woman worked as a [MASK].") [{'sequence': '[CLS] the woman worked as a nurse. [SEP]', 'score': 0.21981462836265564, 'token': 6821, 'token_str': 'nurse'}, {'sequence': '[CLS] the woman worked as a waitress. [SEP]', 'score': 0.1597415804862976, 'token': 13877, 'token_str': 'waitress'}, {'sequence': '[CLS] the woman worked as a maid. [SEP]', 'score': 0.1154729500412941, 'token': 10850, 'token_str': 'maid'}, {'sequence': '[CLS] the woman worked as a prostitute. [SEP]', 'score': 0.037968918681144714, 'token': 19215, 'token_str': 'prostitute'}, {'sequence': '[CLS] the woman worked as a cook. [SEP]', 'score': 0.03042375110089779, 'token': 5660, 'token_str': 'cook'}] ``` This bias will also affect all fine-tuned versions of this model. ## Training data The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. 
- In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ## Evaluation results When fine-tuned on downstream tasks, this model achieves the following results: Glue test results: | Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average | |:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:| | | 84.6/83.4 | 71.2 | 90.5 | 93.5 | 52.1 | 85.8 | 88.9 | 66.4 | 79.6 | ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-1810-04805, author = {Jacob Devlin and Ming{-}Wei Chang and Kenton Lee and Kristina Toutanova}, title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language Understanding}, journal = {CoRR}, volume = {abs/1810.04805}, year = {2018}, url = {http://arxiv.org/abs/1810.04805}, archivePrefix = {arXiv}, eprint = {1810.04805}, timestamp = {Tue, 30 Oct 2018 20:39:56 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=bert-base-uncased"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
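The `[CLS] Sentence A [SEP] Sentence B [SEP]` input format described in the preprocessing section can be reproduced with the tokenizer; this is a small illustration of ours, not part of the original card:

```python
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# Passing two texts builds the [CLS] A [SEP] B [SEP] pair used during pretraining.
encoded = tokenizer("The man went to the store.", "He bought a gallon of milk.")

print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
# token_type_ids mark which tokens belong to sentence A (0) and sentence B (1).
print(encoded["token_type_ids"])
```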
[ -0.10429023206233978, -0.07568823546171188, 0.0528205968439579, 0.029537051916122437, 0.03592930734157562, 0.06076014041900635, 0.016745345667004585, -0.011535749770700932, 0.014104398898780346, -0.024821121245622635, 0.0364287793636322, -0.055009908974170685, 0.05354085937142372, 0.04109733924269676, 0.030415909364819527, 0.022292472422122955, 0.07724115252494812, -0.04155919700860977, -0.07285065948963165, -0.021603459492325783, 0.07123256474733353, 0.07959746569395065, 0.03496096655726433, -0.021689286455512047, -0.0011311565758660436, -0.019583245739340782, -0.028005843982100487, -0.06651826202869415, 0.10060278326272964, 0.023515889421105385, 0.06312105059623718, -0.023379413411021233, 0.07627937197685242, 0.1127561703324318, -0.015401620417833328, 0.04292045906186104, -0.025678560137748718, -0.009867781773209572, 0.019827337935566902, -0.0017153871012851596, -0.06170213967561722, -0.05681595206260681, -0.039645206183195114, -0.039618514478206635, 0.08055847138166428, 0.031009085476398468, -0.051430873572826385, -0.009893628768622875, -0.11251183599233627, -0.03169631212949753, -0.08030558377504349, -0.046183954924345016, 0.040237344801425934, -0.014377943240106106, -0.04831799492239952, 0.006784780882298946, 0.024914342910051346, -0.059462979435920715, -0.02302601933479309, -0.052245479077100754, -0.12127243727445602, -0.035338759422302246, -0.007322712801396847, 0.06005391851067543, -0.08465824276208878, 0.038568250834941864, -0.03943444415926933, 0.011998992413282394, 0.018092691898345947, 0.009456639178097248, 0.042823031544685364, 0.04823748394846916, 0.05546247214078903, 0.006207224912941456, 0.03632885962724686, -0.026505418121814728, 0.07277990877628326, 0.015598254278302193, 0.04243132472038269, -0.06048264354467392, 0.01281778048723936, 0.017800219357013702, 0.0684138834476471, 0.018336299806833267, 0.03677145391702652, 0.013547367416322231, -0.0009019067510962486, 0.006275147665292025, -0.020071620121598244, 0.004158053081482649, -0.056557804346084595, -0.09862760454416275, 0.06195730343461037, 0.03797760605812073, -0.01949288696050644, -0.014099253341555595, 0.03103417344391346, 0.0181620754301548, -0.016791021451354027, 0.06994856894016266, -0.006534121930599213, 0.07210233062505722, 0.0189613476395607, -0.04980533942580223, 0.044844914227724075, -0.04316018149256706, -0.004327995236963034, 0.0023641972802579403, 0.0650322362780571, -0.08981825411319733, 0.05446161702275276, -0.055893052369356155, -0.03340187668800354, -0.03456137329339981, 0.005833787843585014, -0.05937950313091278, 0.023304801434278488, -0.040086738765239716, 0.027623619884252548, 0.09039068967103958, 0.012228255160152912, 0.026942668482661247, 0.07121511548757553, -0.0010299199493601918, -0.08900784701108932, -0.010268484242260456, -0.011618150398135185, 1.560576565589076e-33, 0.02798757515847683, 0.019054265692830086, -0.0018726870184764266, 0.0008465462015010417, 0.02259220741689205, -0.0293614212423563, 0.017971035093069077, 0.00021829857723787427, 0.03713684901595116, -0.021730972453951836, -0.020740043371915817, 0.035501983016729355, -0.09018325060606003, 0.11158109456300735, -0.05113924294710159, 0.04923287406563759, -0.02770504727959633, 0.06163402646780014, 0.05848699435591698, -0.005583863239735365, 0.07455001026391983, 0.04571005702018738, 0.04516781121492386, -0.12333088368177414, -0.05158194527029991, 0.0756906196475029, 0.07878211885690689, -0.1017572209239006, 0.012236510403454304, 0.05211499705910683, -0.11210931092500687, 0.03529820218682289, 0.0059121400117874146, 
0.019336694851517677, 0.008660600520670414, 0.014121998101472855, 0.03991656377911568, -0.06780894845724106, 0.010466977022588253, -0.022653650492429733, 0.007017610594630241, 0.051826849579811096, 0.016301730647683144, -0.09178503602743149, -0.03883574157953262, -0.004366348963230848, 0.010699099861085415, -0.001716689090244472, 0.01027305144816637, -0.00906254444271326, 0.07234599441289902, 0.019360147416591644, -0.046360082924366, -0.05153945833444595, -0.000567937211599201, -0.009697988629341125, 0.06307937204837799, 0.0359126515686512, 0.00974391307681799, 0.013350504450500011, -0.016284607350826263, -0.005902224685996771, 0.05779380351305008, 0.04801936447620392, 0.041418854147195816, -0.04367368668317795, -0.01939735747873783, -0.028333110734820366, 0.0012392213102430105, -0.01987081952393055, -0.06455826014280319, 0.0006630497518926859, -0.04320439696311951, 0.014889493584632874, -0.008174968883395195, -0.07123851031064987, 0.04021378606557846, -0.06644681096076965, -0.041741445660591125, 0.06044313684105873, 0.003186436602845788, 0.052693840116262436, -0.049192335456609726, -0.0552670992910862, -0.052911754697561264, -0.011907716281712055, 0.07949648797512054, -0.05003225803375244, 0.025191262364387512, 0.002472976455464959, 0.049676090478897095, -0.0530138798058033, -0.035932507365942, 0.013322019949555397, 0.007800266146659851, -3.379070550743978e-33, -0.05675952881574631, 0.0324503630399704, -0.11059564352035522, 0.0003600041090976447, -0.06146073713898659, -0.10774464905261993, 0.07546927034854889, 0.196961909532547, 0.029759524390101433, -0.03023693338036537, -0.01150856539607048, -0.053671009838581085, -0.0191506277769804, 0.010718809440732002, 0.048924144357442856, -0.03926469758152962, 0.008886643685400486, -0.004839547909796238, 0.0018254992319270968, 0.02795078232884407, 0.015735073015093803, 0.004713710397481918, -0.1070152297616005, 0.07044441998004913, -0.0236411914229393, 0.08595185726881027, -0.0490153543651104, 0.07679782062768936, 0.04320277273654938, 0.02268024906516075, -0.013761699199676514, 0.012756693176925182, -0.0161298755556345, 0.052596304565668106, -0.11429251730442047, 0.05207913741469383, -0.002032277174293995, -0.019945845007896423, -0.0212161373347044, 0.004185016732662916, 0.04137434437870979, -0.0006305171409621835, -0.0816548615694046, 0.020661611109972, -0.007113681174814701, -0.002498422283679247, -0.1024039015173912, -0.08833065629005432, 0.019499780610203743, -0.07748983055353165, -0.013187422417104244, -0.014179261401295662, -0.09319126605987549, -0.04971740394830704, -0.11694701015949249, -0.09161987155675888, -0.012389862909913063, -0.06748734414577484, 0.005984834395349026, 0.005329212173819542, -0.008531524799764156, -0.004468786064535379, 0.006214268505573273, -0.04963003844022751, -0.02888992242515087, -0.014041339978575706, 0.0038778004236519337, 0.029458027333021164, -0.06128936633467674, -0.051838453859090805, 0.01915237493813038, 0.0022809989750385284, 0.015416988171637058, 0.02977568469941616, 0.011403705924749374, 0.003642740659415722, -0.04854327067732811, -0.10323040187358856, -0.036950062960386276, -0.08564858138561249, -0.0457274429500103, -0.08215271681547165, -0.0034860060550272465, 0.09353037178516388, 0.04785042628645897, -0.00859615858644247, 0.0017457515932619572, 0.0561240054666996, -0.028591370210051537, 0.016605107113718987, 0.012148366309702396, 0.049835093319416046, -0.030539629980921745, 0.15094560384750366, -0.012119823135435581, -5.619254750399705e-8, -0.07957130670547485, 0.022745640948414803, 
-0.02321653626859188, 0.03208130598068237, -0.053214602172374725, -0.04183860868215561, -0.03516432270407677, -0.04093530774116516, -0.004731090739369392, -0.047763142734766006, 0.0036703343503177166, 0.064683698117733, -0.1152191162109375, 0.025712180882692337, -0.04605894163250923, 0.08092635124921799, -0.04210112988948822, 0.06226341798901558, 0.02523229271173477, -0.03689774498343468, -0.00689668720588088, 0.016664521768689156, 0.015193339437246323, -0.056794580072164536, 0.03505510091781616, -0.03567483276128769, -0.020500265061855316, 0.10180246829986572, 0.0013946517137810588, 0.025993801653385162, -0.055132217705249786, 0.03563198074698448, -0.046598173677921295, 0.052478961646556854, 0.0708688497543335, 0.0809711366891861, -0.019979014992713928, -0.047416169196367264, -0.010461571626365185, 0.03729459270834923, 0.11781864613294601, 0.07328666001558304, -0.10803867131471634, -0.02183680795133114, 0.1204158142209053, 0.04429095610976219, -0.021950552240014076, -0.1053876057267189, 0.030969461426138878, 0.023236751556396484, 0.02977236546576023, -0.06183570250868797, -0.029373912140727043, 0.07834409922361374, -0.04289044812321663, 0.04190199822187424, -0.050902288407087326, -0.030598202720284462, 0.06658688187599182, 0.022599341347813606, 0.013220781460404396, 0.04453897476196289, 0.0698499009013176, 0.06520812958478928 ]
gpt2
6c0e6080953db56375760c0471a8c5f2929baf11
2021-05-19T16:25:59.000Z
[ "pytorch", "tf", "jax", "tflite", "rust", "gpt2", "text-generation", "en", "transformers", "exbert", "license:mit" ]
text-generation
false
null
null
gpt2
11,350,803
164
transformers
--- language: en tags: - exbert license: mit --- # GPT-2 Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large Pretrained model on English language using a causal language modeling (CLM) objective. It was introduced in [this paper](https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) and first released at [this page](https://openai.com/blog/better-language-models/). Disclaimer: The team releasing GPT-2 also wrote a [model card](https://github.com/openai/gpt-2/blob/master/model_card.md) for their model. Content from this model card has been written by the Hugging Face team to complete the information they provided and give specific examples of bias. ## Model description GPT-2 is a transformers model pretrained on a very large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was trained to guess the next word in sentences. More precisely, inputs are sequences of continuous text of a certain length and the targets are the same sequence, shifted one token (word or piece of word) to the right. The model uses internally a mask-mechanism to make sure the predictions for the token `i` only uses the inputs from `1` to `i` but not the future tokens. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks. The model is best at what it was pretrained for however, which is generating texts from a prompt. ## Intended uses & limitations You can use the raw model for text generation or fine-tune it to a downstream task. See the [model hub](https://huggingface.co/models?filter=gpt2) to look for fine-tuned versions on a task that interests you. ### How to use You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility: ```python >>> from transformers import pipeline, set_seed >>> generator = pipeline('text-generation', model='gpt2') >>> set_seed(42) >>> generator("Hello, I'm a language model,", max_length=30, num_return_sequences=5) [{'generated_text': "Hello, I'm a language model, a language for thinking, a language for expressing thoughts."}, {'generated_text': "Hello, I'm a language model, a compiler, a compiler library, I just want to know how I build this kind of stuff. I don"}, {'generated_text': "Hello, I'm a language model, and also have more than a few of your own, but I understand that they're going to need some help"}, {'generated_text': "Hello, I'm a language model, a system model. I want to know my language so that it might be more interesting, more user-friendly"}, {'generated_text': 'Hello, I\'m a language model, not a language model"\n\nThe concept of "no-tricks" comes in handy later with new'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import GPT2Tokenizer, GPT2Model tokenizer = GPT2Tokenizer.from_pretrained('gpt2') model = GPT2Model.from_pretrained('gpt2') text = "Replace me by any text you'd like." 
encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` and in TensorFlow: ```python from transformers import GPT2Tokenizer, TFGPT2Model tokenizer = GPT2Tokenizer.from_pretrained('gpt2') model = TFGPT2Model.from_pretrained('gpt2') text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='tf') output = model(encoded_input) ``` ### Limitations and bias The training data used for this model has not been released as a dataset one can browse. We know it contains a lot of unfiltered content from the internet, which is far from neutral. As the openAI team themselves point out in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md#out-of-scope-use-cases): > Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases > that require the generated text to be true. > > Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do > not recommend that they be deployed into systems that interact with humans > unless the deployers first carry out a > study of biases relevant to the intended use-case. We found no statistically significant difference in gender, race, > and religious bias probes between 774M and 1.5B, implying all versions of GPT-2 should be approached with similar > levels of caution around use cases that are sensitive to biases around human attributes. Here's an example of how the model can have biased predictions: ```python >>> from transformers import pipeline, set_seed >>> generator = pipeline('text-generation', model='gpt2') >>> set_seed(42) >>> generator("The White man worked as a", max_length=10, num_return_sequences=5) [{'generated_text': 'The White man worked as a mannequin for'}, {'generated_text': 'The White man worked as a maniser of the'}, {'generated_text': 'The White man worked as a bus conductor by day'}, {'generated_text': 'The White man worked as a plumber at the'}, {'generated_text': 'The White man worked as a journalist. He had'}] >>> set_seed(42) >>> generator("The Black man worked as a", max_length=10, num_return_sequences=5) [{'generated_text': 'The Black man worked as a man at a restaurant'}, {'generated_text': 'The Black man worked as a car salesman in a'}, {'generated_text': 'The Black man worked as a police sergeant at the'}, {'generated_text': 'The Black man worked as a man-eating monster'}, {'generated_text': 'The Black man worked as a slave, and was'}] ``` This bias will also affect all fine-tuned versions of this model. ## Training data The OpenAI team wanted to train this model on a corpus as large as possible. To build it, they scraped all the web pages from outbound links on Reddit which received at least 3 karma. Note that all Wikipedia pages were removed from this dataset, so the model was not trained on any part of Wikipedia. The resulting dataset (called WebText) weights 40GB of texts but has not been publicly released. You can find a list of the top 1,000 domains present in WebText [here](https://github.com/openai/gpt-2/blob/master/domains.txt). ## Training procedure ### Preprocessing The texts are tokenized using a byte-level version of Byte Pair Encoding (BPE) (for unicode characters) and a vocabulary size of 50,257. The inputs are sequences of 1024 consecutive tokens. The larger model was trained on 256 cloud TPU v3 cores. The training duration was not disclosed, nor were the exact details of training. 
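To see the byte-level BPE and the 50,257-token vocabulary described above in action, a small check of ours (not part of the original card) is:

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
print(len(tokenizer))  # 50257: the byte-level BPE vocabulary size

# Byte-level BPE splits rare words into subword pieces; leading spaces become 'Ġ'.
tokens = tokenizer.tokenize("Unbelievably large language models")
print(tokens)
print(tokenizer.convert_tokens_to_ids(tokens))
```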
## Evaluation results

The model achieves the following results without any fine-tuning (zero-shot):

| Dataset | LAMBADA | LAMBADA | CBT-CN | CBT-NE | WikiText2 | PTB | enwiki8 | text8 | WikiText103 | 1BW |
|:--------:|:-------:|:-------:|:------:|:------:|:---------:|:------:|:-------:|:------:|:-----------:|:-----:|
| (metric) | (PPL) | (ACC) | (ACC) | (ACC) | (PPL) | (PPL) | (BPB) | (BPC) | (PPL) | (PPL) |
| | 35.13 | 45.99 | 87.65 | 83.4 | 29.41 | 65.85 | 1.16 | 1.17 | 37.50 | 75.20 |

### BibTeX entry and citation info

```bibtex
@article{radford2019language,
  title={Language Models are Unsupervised Multitask Learners},
  author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
  year={2019}
}
```

<a href="https://huggingface.co/exbert/?model=gpt2">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
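As a rough illustration of how perplexity numbers like those above are obtained (a sketch of ours; the actual zero-shot evaluation protocol differs per dataset), one can feed a text to the model with itself as labels and exponentiate the loss:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "The quick brown fox jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # When labels equal input_ids, the model returns the average next-token cross-entropy.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(torch.exp(loss).item())  # perplexity of this single sentence
```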
[ -0.05892602726817131, -0.08013296872377396, 0.060150083154439926, 0.0071877371519804, 0.07087277621030807, 0.028216511011123657, 0.01796095259487629, 0.037697840481996536, 0.03646615520119667, -0.03444734588265419, -0.016371622681617737, -0.00010747667693067342, 0.03819410130381584, 0.05765138193964958, 0.04737577587366104, -0.061539992690086365, 0.06632949411869049, 0.028909843415021896, -0.08990569412708282, -0.094761423766613, 0.030421243980526924, 0.01383698359131813, 0.03299061954021454, 0.03436225280165672, 0.0017351757269352674, -0.009580046869814396, 0.014636637642979622, -0.05279671028256416, 0.05539756640791893, 0.03692585602402687, 0.007830037735402584, 0.00678889499977231, -0.01637965999543667, 0.03975406661629677, -0.05051802843809128, 0.06477388739585876, -0.04540223255753517, -0.05227212980389595, -0.010725662112236023, -0.027529755607247353, -0.023442383855581284, -0.045487526804208755, 0.021480262279510498, 0.016000868752598763, 0.1313019096851349, 0.02186761237680912, -0.03497331589460373, 0.008501802571117878, -0.11692365258932114, -0.02137296460568905, -0.07339566946029663, -0.08148463815450668, 0.021148838102817535, -0.006485986988991499, 0.045673225075006485, 0.005467485170811415, 0.03788411617279053, -0.009330494329333305, -0.03211315721273422, -0.04229879006743431, -0.07183285802602768, -0.05039284750819206, -0.1307666152715683, 0.024172868579626083, -0.049373652786016464, 0.026828045025467873, 0.005804947577416897, 0.08132722973823547, -0.005065005738288164, 0.0006432677619159222, -0.028793940320611, 0.04731769114732742, -0.00325483875349164, 0.00993158109486103, 0.0571775883436203, 0.0039596245624125, 0.0689300149679184, 0.036464665085077286, 0.009981505572795868, -0.07787179201841354, 0.04967169463634491, 0.030853236094117165, 0.08807449042797089, -0.0347074456512928, 0.018793420866131783, -0.047435227781534195, 0.027273252606391907, 0.03675220161676407, 0.03532203286886215, 0.05430271849036217, -0.0038647225592285395, -0.03266637772321701, 0.05716825649142265, 0.050182588398456573, -0.00351748988032341, -0.0030094159301370382, -0.04303678125143051, -0.028791191056370735, -0.04690708965063095, 0.06546055525541306, -0.005448827054351568, 0.05810021236538887, 0.02932371385395527, 0.0445702001452446, -0.052096955478191376, -0.10741066932678223, -0.051412299275398254, 0.04708129167556763, -0.02220306172966957, -0.06347592920064926, 0.04827923700213432, -0.025859355926513672, -0.0382503867149353, -0.0033601392060518265, 0.002233545295894146, 0.00551485363394022, -0.015435918234288692, -0.037586480379104614, 0.045566316694021225, 0.06969806551933289, -0.040226612240076065, 0.013338961638510227, 0.06700313091278076, -0.06659594923257828, -0.013858318328857422, -0.04905235394835472, -0.06302788853645325, -1.2489529902272137e-35, 0.07270601391792297, 0.06798285245895386, 0.029750358313322067, 0.06972150504589081, 0.010230882093310356, 0.025935184210538864, -0.003155958605930209, -0.016568835824728012, 0.06521124392747879, -0.05870434641838074, -0.011086644604802132, 0.0788489505648613, -0.08268727362155914, 0.06103655695915222, -0.009343304671347141, 0.005125379655510187, -0.070246122777462, 0.07298942655324936, -0.0035335805732756853, 0.05677931383252144, 0.09347444772720337, 0.07324142009019852, 0.0204923078417778, -0.05690169334411621, -0.05799655243754387, 0.04223327338695526, 0.036750469356775284, -0.11744081974029541, -0.0025215810164809227, 0.05074131116271019, -0.10029198229312897, -0.04553619399666786, -0.005146225448697805, 0.059933699667453766, 
0.01956919953227043, -0.07415813952684402, -0.019019121304154396, -0.10239969938993454, 0.06828504800796509, -0.06309759616851807, 0.021241817623376846, -0.024189092218875885, 0.12045340240001678, -0.09623255580663681, -0.024697601795196533, -0.033679913729429245, 0.059292279183864594, -0.011447247117757797, -0.020612070336937904, -0.00021562726760748774, 0.05817294865846634, 0.027450017631053925, -0.09740268439054489, -0.037002161145210266, -0.004984162747859955, 0.09672603756189346, 0.041026730090379715, 0.051987241953611374, 0.032140083611011505, 0.02268791012465954, 0.004740583244711161, 0.022918658331036568, 0.06077612563967705, 0.02688775397837162, 0.06267997622489929, 0.0207382719963789, -0.12623639404773712, -0.06085523962974548, 0.07487807422876358, 0.04184544458985329, -0.08703786134719849, -0.039460379630327225, -0.03620414435863495, 0.019133953377604485, 0.06623239070177078, -0.045605577528476715, 0.05708731710910797, -0.06061842292547226, 0.020876772701740265, 0.04744112119078636, -0.0684313178062439, -0.01942598447203636, -0.004327951464802027, -0.07445637136697769, -0.04538948833942413, -0.041481900960206985, -0.0044249133206903934, -0.0869952067732811, 0.015393722802400589, -0.006811107043176889, -0.0033254569862037897, -0.025764407590031624, -0.03467931970953941, 0.02898344211280346, 0.02188977040350437, -2.3783062720027234e-33, 0.02381468191742897, -0.018739037215709686, -0.002850747900083661, 0.05096646398305893, -0.02067430131137371, -0.10257234424352646, 0.07381873577833176, 0.0957576334476471, 0.04118042439222336, -0.01577182486653328, 0.023941734805703163, -0.025464309379458427, 0.053104329854249954, 0.08983692526817322, 0.06763910502195358, -0.05681665986776352, 0.04867858812212944, -0.04733288288116455, 0.015139377675950527, 0.0789661779999733, 0.04384531453251839, 0.04216461256146431, -0.08557480573654175, 0.012195521034300327, -0.024544687941670418, 0.0031439668964594603, -0.023731812834739685, 0.02152595855295658, 0.050442565232515335, 0.03673405572772026, 0.0011454952182248235, 0.03430293872952461, -0.0519750602543354, 0.025835612788796425, -0.09194382280111313, 0.03727954253554344, 0.05420822277665138, 0.0038172476924955845, -0.08364070951938629, 0.07905913889408112, 0.04950482025742531, -0.03609387204051018, -0.05081098899245262, 0.004775262903422117, 0.00549783231690526, 0.03494369983673096, -0.0835636779665947, -0.012869508005678654, 0.006966410670429468, -0.02664158307015896, -0.04912186786532402, 0.000529336160980165, -0.02767839841544628, -0.00784996710717678, -0.07608485221862793, -0.13833121955394745, 0.058370377868413925, -0.1067751869559288, -0.011857377365231514, -0.025822240859270096, -0.004274724517017603, -0.026064341887831688, 0.022420359775424004, -0.07742084562778473, -0.014267791993916035, -0.08564025163650513, -0.018211327493190765, -0.041277557611465454, 0.06850147247314453, 0.005156074650585651, -0.01131095364689827, -0.016465729102492332, -0.044131889939308167, -0.047563645988702774, -0.05797361955046654, 0.03311511501669884, -0.0005417629145085812, -0.06224829703569412, 0.01707547716796398, -0.0956006571650505, -0.060542766004800797, 0.02891186811029911, 0.06433505564928055, 0.005930554587393999, 0.026528451591730118, -0.02709062583744526, 0.02539834752678871, 0.1081656888127327, 0.04401148483157158, 0.005174885503947735, -0.08458821475505829, 0.056444838643074036, 0.00823539774864912, 0.09974049776792526, -0.08906718343496323, -5.9169462218733315e-8, -0.07344075292348862, 0.004594176542013884, -0.012033337727189064, 0.08492157608270645, 
-0.045423947274684906, -0.08400595933198929, -0.06103430315852165, -0.0008042145636864007, -0.028577832505106926, 0.020548364147543907, -0.016068000346422195, 0.020363565534353256, -0.06973744183778763, -0.027238795533776283, 0.001125982729718089, 0.09258455038070679, -0.01401503011584282, 0.04134267196059227, -0.04207340627908707, -0.05647535249590874, 0.0519479401409626, -0.005748982075601816, 0.02122582122683525, -0.02296864241361618, -0.015435676090419292, -0.015048420988023281, -0.06611739844083786, 0.02353525348007679, 0.025763627141714096, -0.02352149784564972, 0.027265122160315514, -0.0006859628483653069, -0.028684215620160103, -0.03695642948150635, 0.06051561236381531, 0.05941610783338547, 0.019874505698680878, -0.018225662410259247, 0.06630377471446991, 0.03701598197221756, 0.08062226325273514, 0.09679914265871048, -0.0911552906036377, 0.054176874458789825, 0.08899198472499847, -0.001545113860629499, -0.08196742832660675, -0.1009916439652443, -0.007227270398288965, 0.02834734320640564, -0.002018164610490203, 0.03265596553683281, -0.07113093882799149, 0.04632430896162987, 0.007692320272326469, 0.10285839438438416, -0.019126037135720253, -0.04478902742266655, 0.019559696316719055, 0.05619129166007042, 0.024577109143137932, 0.06380170583724976, 0.025238923728466034, 0.027692148461937904 ]
distilbert-base-uncased
043235d6088ecd3dd5fb5ca3592b6913fd516027
2022-05-31T19:08:36.000Z
[ "pytorch", "tf", "jax", "rust", "distilbert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1910.01108", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
null
null
distilbert-base-uncased
11,250,037
70
transformers
--- language: en tags: - exbert license: apache-2.0 datasets: - bookcorpus - wikipedia --- # DistilBERT base model (uncased) This model is a distilled version of the [BERT base model](https://huggingface.co/bert-base-uncased). It was introduced in [this paper](https://arxiv.org/abs/1910.01108). The code for the distillation process can be found [here](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation). This model is uncased: it does not make a difference between english and English. ## Model description DistilBERT is a transformers model, smaller and faster than BERT, which was pretrained on the same corpus in a self-supervised fashion, using the BERT base model as a teacher. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts using the BERT base model. More precisely, it was pretrained with three objectives: - Distillation loss: the model was trained to return the same probabilities as the BERT base model. - Masked language modeling (MLM): this is part of the original training loss of the BERT base model. When taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Cosine embedding loss: the model was also trained to generate hidden states as close as possible as the BERT base model. This way, the model learns the same inner representation of the English language than its teacher model, while being faster for inference or downstream tasks. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=distilbert) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use You can use this model directly with a pipeline for masked language modeling: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='distilbert-base-uncased') >>> unmasker("Hello I'm a [MASK] model.") [{'sequence': "[CLS] hello i'm a role model. [SEP]", 'score': 0.05292855575680733, 'token': 2535, 'token_str': 'role'}, {'sequence': "[CLS] hello i'm a fashion model. [SEP]", 'score': 0.03968575969338417, 'token': 4827, 'token_str': 'fashion'}, {'sequence': "[CLS] hello i'm a business model. [SEP]", 'score': 0.034743521362543106, 'token': 2449, 'token_str': 'business'}, {'sequence': "[CLS] hello i'm a model model. [SEP]", 'score': 0.03462274372577667, 'token': 2944, 'token_str': 'model'}, {'sequence': "[CLS] hello i'm a modeling model. 
[SEP]", 'score': 0.018145186826586723, 'token': 11643, 'token_str': 'modeling'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import DistilBertTokenizer, DistilBertModel tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased') model = DistilBertModel.from_pretrained("distilbert-base-uncased") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` and in TensorFlow: ```python from transformers import DistilBertTokenizer, TFDistilBertModel tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased') model = TFDistilBertModel.from_pretrained("distilbert-base-uncased") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='tf') output = model(encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions. It also inherits some of [the bias of its teacher model](https://huggingface.co/bert-base-uncased#limitations-and-bias). ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='distilbert-base-uncased') >>> unmasker("The White man worked as a [MASK].") [{'sequence': '[CLS] the white man worked as a blacksmith. [SEP]', 'score': 0.1235365942120552, 'token': 20987, 'token_str': 'blacksmith'}, {'sequence': '[CLS] the white man worked as a carpenter. [SEP]', 'score': 0.10142576694488525, 'token': 10533, 'token_str': 'carpenter'}, {'sequence': '[CLS] the white man worked as a farmer. [SEP]', 'score': 0.04985016956925392, 'token': 7500, 'token_str': 'farmer'}, {'sequence': '[CLS] the white man worked as a miner. [SEP]', 'score': 0.03932540491223335, 'token': 18594, 'token_str': 'miner'}, {'sequence': '[CLS] the white man worked as a butcher. [SEP]', 'score': 0.03351764753460884, 'token': 14998, 'token_str': 'butcher'}] >>> unmasker("The Black woman worked as a [MASK].") [{'sequence': '[CLS] the black woman worked as a waitress. [SEP]', 'score': 0.13283951580524445, 'token': 13877, 'token_str': 'waitress'}, {'sequence': '[CLS] the black woman worked as a nurse. [SEP]', 'score': 0.12586183845996857, 'token': 6821, 'token_str': 'nurse'}, {'sequence': '[CLS] the black woman worked as a maid. [SEP]', 'score': 0.11708822101354599, 'token': 10850, 'token_str': 'maid'}, {'sequence': '[CLS] the black woman worked as a prostitute. [SEP]', 'score': 0.11499975621700287, 'token': 19215, 'token_str': 'prostitute'}, {'sequence': '[CLS] the black woman worked as a housekeeper. [SEP]', 'score': 0.04722772538661957, 'token': 22583, 'token_str': 'housekeeper'}] ``` This bias will also affect all fine-tuned versions of this model. ## Training data DistilBERT pretrained on the same data as BERT, which is [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. 
Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constraint is that the combined length of the two "sentences" is less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. - In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 8 16GB V100 GPUs for 90 hours. See the [training code](https://github.com/huggingface/transformers/tree/master/examples/distillation) for all hyperparameter details. ## Evaluation results When fine-tuned on downstream tasks, this model achieves the following results: Glue test results: | Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | |:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:| | | 82.2 | 88.5 | 89.2 | 91.3 | 51.3 | 85.8 | 87.5 | 59.9 | ### BibTeX entry and citation info ```bibtex @article{Sanh2019DistilBERTAD, title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter}, author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf}, journal={ArXiv}, year={2019}, volume={abs/1910.01108} } ``` <a href="https://huggingface.co/exbert/?model=distilbert-base-uncased"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
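The three pretraining objectives listed in the DistilBERT card above (distillation loss, masked language modeling, cosine embedding loss) can be combined into one training loss. The following is a minimal PyTorch sketch of that combination, not the card's actual training code; the `distillation_loss` name, the loss weights, and the temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, mlm_labels,
                      student_hidden, teacher_hidden,
                      temperature=2.0, w_kd=5.0, w_mlm=2.0, w_cos=1.0):
    # Distillation loss: KL divergence between softened teacher and student distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Masked language modeling loss on the true masked-token labels (-100 marks unmasked positions).
    mlm = F.cross_entropy(student_logits.view(-1, student_logits.size(-1)),
                          mlm_labels.view(-1), ignore_index=-100)
    # Cosine embedding loss pulling the student's hidden states toward the teacher's.
    flat_student = student_hidden.view(-1, student_hidden.size(-1))
    flat_teacher = teacher_hidden.view(-1, teacher_hidden.size(-1))
    target = torch.ones(flat_student.size(0), device=flat_student.device)
    cos = F.cosine_embedding_loss(flat_student, flat_teacher, target)
    return w_kd * kd + w_mlm * mlm + w_cos * cos
```

Because the student and teacher share the same tokenizer and sequence length, the logits and hidden states line up token by token, which is what makes the KL and cosine terms well defined.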
[ -0.13866068422794342, -0.06151656433939934, 0.08515582978725433, 0.01706661842763424, 0.014973160810768604, -0.052277322858572006, -0.007684790063649416, 0.061743155121803284, 0.00879606045782566, -0.05457921698689461, 0.02629678323864937, 0.030604751780629158, 0.04207247868180275, 0.0005483476561494172, 0.014708519913256168, 0.05287332832813263, 0.03225383162498474, 0.0015686923870816827, -0.12773533165454865, -0.03734550252556801, 0.07698962092399597, 0.06860435009002686, -0.050180211663246155, -0.029958562925457954, 0.05471629649400711, 0.043994203209877014, -0.013200372457504272, -0.003895346075296402, 0.13334470987319946, -0.03536169230937958, 0.03555809706449509, -0.009632274508476257, 0.0066370246931910515, 0.07664995640516281, -0.0262793879956007, 0.0980677381157875, -0.010166587308049202, -0.02836974896490574, 0.0224303025752306, 0.014163599349558353, 0.005492146592587233, 0.015870342031121254, -0.05575831979513168, 0.0499497689306736, 0.04519451782107353, 0.03155642747879028, -0.0732596218585968, -0.006343332584947348, -0.09396521002054214, -0.03327856585383415, -0.06597352772951126, -0.016565216705203056, 0.05134807154536247, 0.06337177753448486, -0.03188000246882439, -0.017840372398495674, 0.042888566851615906, -0.033106040209531784, -0.04361216351389885, -0.04262606427073479, -0.0620386004447937, 0.03457228094339371, -0.041028786450624466, -0.004051884636282921, -0.046589747071266174, 0.030886143445968628, 0.0010129439178854227, 0.0427219420671463, 0.07695875316858292, -0.07374277710914612, 0.006045246031135321, 0.03867769241333008, 0.030813246965408325, 0.06711991876363754, 0.02951856516301632, -0.04581641033291817, 0.03980309143662453, 0.03719314560294151, -0.004274957813322544, -0.08615348488092422, 0.035238757729530334, -0.009797911159694195, 0.05394163355231285, 0.0031858005095273256, 0.0226765014231205, 0.03764154389500618, 0.015251902863383293, -0.01770288310945034, 0.035333383828401566, -0.01625302992761135, -0.029845019802451134, -0.058868665248155594, 0.03357330709695816, 0.00042852849583141506, -0.03971968591213226, 0.0372404009103775, 0.005182690918445587, 0.06889118999242783, -0.04318602383136749, 0.09763853996992111, -0.012045313604176044, 0.055005915462970734, 0.031240057200193405, -0.07776404917240143, -0.05719825625419617, -0.07101406157016754, -0.002126740524545312, 0.025007259100675583, 0.02198890410363674, -0.0827011987566948, -0.008798069320619106, -0.010530608706176281, -0.04852433130145073, -0.026119904592633247, -0.057952091097831726, -0.1038437932729721, 0.049006473273038864, -0.041620101779699326, 0.008363115601241589, 0.005875155795365572, 0.03788350522518158, -0.002124059945344925, 0.022192146629095078, 0.037699151784181595, -0.09234733879566193, -0.04183609038591385, -0.05258256196975708, -4.290482748814346e-34, 0.0027436446398496628, 0.020729025825858116, -0.039631664752960205, 0.03557262942194939, -0.008844909258186817, -0.009994585067033768, -0.01999596692621708, 0.02260269783437252, -0.024533027783036232, 0.00670611159875989, 0.013180105946958065, 0.060102980583906174, -0.11612702161073685, 0.053942251950502396, -0.04928957670927048, -0.018838485702872276, 0.02298988215625286, 0.06348869204521179, 0.05751795321702957, -0.03640071302652359, 0.07812874764204025, 0.08885659277439117, -0.009904347360134125, -0.0607282817363739, -0.09397021681070328, 0.055701255798339844, 0.022420695051550865, -0.038807984441518784, 0.048465825617313385, 0.03386658802628517, -0.01367725059390068, -0.010899627581238747, -0.029229167848825455, 0.03713969513773918, 
-0.013916864059865475, 0.0028331372886896133, -0.017168892547488213, -0.07732103019952774, 0.02103414386510849, -0.052486173808574677, 0.007472499739378691, 0.031900614500045776, 0.03249064460396767, -0.06035960838198662, -0.01469445414841175, 0.03656918555498123, -0.009355376474559307, -0.009704616852104664, 0.017283465713262558, -0.09143408387899399, 0.02569335699081421, 0.057866111397743225, 0.017810985445976257, -0.03304614499211311, -0.026747504249215126, 0.020251043140888214, 0.07283515483140945, 0.0024413736537098885, 0.028642011806368828, 0.05231553688645363, -0.03455353155732155, 0.08731789886951447, -0.0019658859819173813, 0.045681972056627274, 0.030375801026821136, -0.023715386167168617, -0.022995399311184883, 0.023654896765947342, 0.07898537069559097, 0.025255588814616203, -0.10223675519227982, -0.004775127861648798, -0.06725259870290756, 0.04502946883440018, -0.0017559867119416595, -0.10625667124986649, 0.014228960499167442, -0.042501382529735565, 0.015166955068707466, 0.007429137360304594, -0.012694275937974453, -0.0013582846149802208, -0.05749308317899704, -0.006862750742584467, -0.08307014405727386, -0.008404718711972237, -0.008315090090036392, -0.027783188968896866, 0.005403038579970598, -0.017015784978866577, 0.06964680552482605, -0.016187960281968117, -0.04881814494729042, -0.003071339102461934, 0.03397241234779358, -1.4029377221915187e-33, -0.043970365077257156, -0.009492705576121807, -0.09502815455198288, 0.08846535533666611, -0.004949036054313183, -0.03445154428482056, 0.03520054370164871, 0.16870936751365662, 0.06270091235637665, -0.04204845055937767, 0.03255351260304451, -0.04988118261098862, -0.04037486016750336, -0.015974540263414383, 0.01875440776348114, 0.05644284188747406, -0.026288826018571854, 0.007352486252784729, -0.03833453357219696, 0.010890682227909565, -0.03119562938809395, 0.08879288285970688, -0.12538985908031464, 0.044788848608732224, -0.03632137551903725, 0.04794274643063545, -0.035498183220624924, 0.06326068937778473, 0.061599284410476685, 0.030905671417713165, -0.0154383834451437, 0.01219564862549305, 0.007340000011026859, 0.013058936223387718, -0.16829924285411835, -0.013579093851149082, -0.0308084636926651, -0.038538914173841476, -0.03655311092734337, 0.005015521310269833, 0.027925431728363037, 0.02792162261903286, -0.035187967121601105, 0.0320284329354763, -0.03265973553061485, 0.04372207447886467, -0.10598547011613846, -0.0786374881863594, 0.048408154398202896, -0.016050856560468674, 0.1057291328907013, 0.01524902693927288, -0.08973900973796844, -0.04363322630524635, -0.027621738612651825, -0.06312444061040878, 0.023775795474648476, -0.02694408781826496, -0.06128871068358421, 0.07573570311069489, -0.008661064319312572, -0.018770525231957436, 0.020870082080364227, -0.007565828040242195, -0.020920665934681892, -0.049926940351724625, -0.04733188450336456, 0.05930350720882416, -0.03248777240514755, -0.0837777629494667, 0.0288255512714386, -0.0034850130323320627, 0.06530279666185379, -0.053614694625139236, -0.00785803608596325, -0.002526352647691965, -0.029473934322595596, -0.0964900478720665, -0.04915656894445419, -0.0511137954890728, -0.07452675700187683, -0.04938884079456329, -0.00818438921123743, 0.07874418050050735, 0.04303198680281639, -0.062087416648864746, 0.006945619825273752, 0.047804612666368484, 0.00486692413687706, 0.01670561544597149, 0.032502103596925735, -0.010872378945350647, -0.03177475929260254, 0.1666155904531479, 0.03527480363845825, -5.4173835195570064e-8, -0.07095728814601898, 0.001606266014277935, -0.02845580503344536, 
0.024404840543866158, -0.061344537883996964, -0.031001504510641098, 0.06124451383948326, 0.05673748999834061, -0.05581453815102577, 0.01466374471783638, -0.016809973865747452, 0.03025517612695694, -0.12092714756727219, 0.007126490585505962, 0.009287535212934017, 0.1142575666308403, 0.017354948446154594, 0.10684161633253098, -0.02026248537003994, 0.004704112187027931, 0.03484029695391655, 0.00814052764326334, 0.0029335147701203823, -0.04058485105633736, 0.03694288432598114, -0.03130089491605759, -0.03278275206685066, 0.07669053971767426, 0.0019167590653523803, 0.013344875536859035, -0.008335051126778126, 0.05305621400475502, -0.04567776620388031, -0.018523626029491425, 0.02873382344841957, 0.09984202682971954, -0.046756863594055176, -0.04871577024459839, -0.05696769431233406, 0.060661062598228455, 0.08854454010725021, 0.07280514389276505, -0.1495671421289444, 0.04527703672647476, 0.09460712969303131, 0.02689356729388237, -0.028151459991931915, -0.013835445046424866, 0.032042331993579865, 0.08261910825967789, 0.06904201209545135, -0.02085300348699093, -0.013090851716697216, -0.013770508579909801, -0.05753115937113762, 0.041803210973739624, -0.07774621993303299, -0.004754333756864071, 0.027892375364899635, -0.030049391090869904, 0.039178017526865005, 0.003519226098433137, 0.16269099712371826, 0.04216780886054039 ]
Jean-Baptiste/camembert-ner
dbec8489a1c44ecad9da8a9185115bccabd799fe
2022-04-04T01:13:33.000Z
[ "pytorch", "camembert", "token-classification", "fr", "dataset:Jean-Baptiste/wikiner_fr", "transformers", "autotrain_compatible" ]
token-classification
false
Jean-Baptiste
null
Jean-Baptiste/camembert-ner
9,833,060
11
transformers
--- language: fr datasets: - Jean-Baptiste/wikiner_fr widget: - text: "Je m'appelle jean-baptiste et je vis à montréal" - text: "george washington est allé à washington" --- # camembert-ner: model fine-tuned from camemBERT for the NER task. ## Introduction [camembert-ner] is a NER model that was fine-tuned from camemBERT on the wikiner-fr dataset (~170 634 sentences). The model was validated on emails/chat data and outperformed other models on this type of data specifically. In particular, the model seems to work better on entities that don't start with an upper case. ## Training data Training data was classified as follows: Abbreviation|Description -|- O |Outside of a named entity MISC |Miscellaneous entity PER |Person’s name ORG |Organization LOC |Location ## How to use camembert-ner with HuggingFace ##### Load camembert-ner and its sub-word tokenizer: ```python from transformers import AutoTokenizer, AutoModelForTokenClassification tokenizer = AutoTokenizer.from_pretrained("Jean-Baptiste/camembert-ner") model = AutoModelForTokenClassification.from_pretrained("Jean-Baptiste/camembert-ner") ##### Process text sample (from wikipedia) from transformers import pipeline nlp = pipeline('ner', model=model, tokenizer=tokenizer, aggregation_strategy="simple") nlp("Apple est créée le 1er avril 1976 dans le garage de la maison d'enfance de Steve Jobs à Los Altos en Californie par Steve Jobs, Steve Wozniak et Ronald Wayne14, puis constituée sous forme de société le 3 janvier 1977 à l'origine sous le nom d'Apple Computer, mais pour ses 30 ans et pour refléter la diversification de ses produits, le mot « computer » est retiré le 9 janvier 2015.") [{'entity_group': 'ORG', 'score': 0.9472818374633789, 'word': 'Apple', 'start': 0, 'end': 5}, {'entity_group': 'PER', 'score': 0.9838564991950989, 'word': 'Steve Jobs', 'start': 74, 'end': 85}, {'entity_group': 'LOC', 'score': 0.9831605950991312, 'word': 'Los Altos', 'start': 87, 'end': 97}, {'entity_group': 'LOC', 'score': 0.9834540486335754, 'word': 'Californie', 'start': 100, 'end': 111}, {'entity_group': 'PER', 'score': 0.9841555754343668, 'word': 'Steve Jobs', 'start': 115, 'end': 126}, {'entity_group': 'PER', 'score': 0.9843501806259155, 'word': 'Steve Wozniak', 'start': 127, 'end': 141}, {'entity_group': 'PER', 'score': 0.9841533899307251, 'word': 'Ronald Wayne', 'start': 144, 'end': 157}, {'entity_group': 'ORG', 'score': 0.9468960364659628, 'word': 'Apple Computer', 'start': 243, 'end': 257}] ``` ## Model performance (metric: seqeval) Overall precision|recall|f1 -|-|- 0.8859|0.8971|0.8914 By entity entity|precision|recall|f1 -|-|-|- PER|0.9372|0.9598|0.9483 ORG|0.8099|0.8265|0.8181 LOC|0.8905|0.9005|0.8955 MISC|0.8175|0.8117|0.8146 For those who could be interested, here is a short article on how I used the results of this model to train an LSTM model for signature detection in emails: https://medium.com/@jean-baptiste.polle/lstm-model-for-email-signature-detection-8e990384fefa
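The per-entity precision/recall/F1 figures above are computed with seqeval. As a rough illustration of that metric (assuming `pip install seqeval`; the two IOB2 label sequences below are made up, not taken from wikiner-fr), the same report can be produced like this:

```python
from seqeval.metrics import classification_report, f1_score

# Toy gold and predicted IOB2 label sequences (illustrative only).
y_true = [["B-PER", "I-PER", "O", "O", "B-LOC"], ["B-ORG", "O"]]
y_pred = [["B-PER", "I-PER", "O", "O", "B-LOC"], ["B-MISC", "O"]]

print(classification_report(y_true, y_pred))  # per-entity precision/recall/F1
print("overall f1:", f1_score(y_true, y_pred))
```

seqeval scores whole entity spans rather than individual tokens, which is why a single wrong boundary counts as a missed entity in the table above.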
[ -0.11113610863685608, -0.03646893799304962, 0.059876054525375366, -0.003222051775082946, 0.012641402892768383, -0.04548738896846771, -0.01905476488173008, 0.06513305753469467, 0.05475887656211853, -0.036326371133327484, 0.01774558238685131, -0.053867973387241364, 0.0843958929181099, 0.014216315932571888, 0.014968344010412693, 0.015417986549437046, 0.12229806929826736, -0.03918964043259621, -0.10689777880907059, -0.15030120313167572, -0.03922760859131813, 0.12250178307294846, 0.05741746351122856, -0.06587070971727371, 0.024157488718628883, -0.06978852301836014, -0.03607677295804024, 0.0013306180480867624, 0.04913569241762161, 0.02288943901658058, -0.01566065475344658, 0.045882437378168106, -0.06527183204889297, 0.08393707126379013, -0.02127503789961338, 0.11218748241662979, -0.0598696731030941, 0.0006626221002079546, 0.029592473059892654, 0.04152301698923111, -0.04092497006058693, -0.0066643934696912766, -0.018043890595436096, 0.08008526265621185, -0.0037463405169546604, 0.0391949824988842, -0.07132649421691895, 0.017353955656290054, -0.08964718133211136, 0.029154097661376, -0.10570017248392105, -0.0007780709420330822, 0.05965246260166168, 0.11725851893424988, 0.007968098856508732, 0.032728131860494614, -0.03476317599415779, -0.004799895454198122, 0.019515741616487503, -0.06859120726585388, -0.05730503425002098, -0.05294766649603844, -0.030301326885819435, -0.02935441955924034, -0.04458209127187729, 0.0020704097114503384, -0.06652133911848068, 0.05566255748271942, 0.0063153225928545, 0.0060105533339083195, 0.0061968243680894375, 0.02943292073905468, -0.044217225164175034, 0.07919961959123611, -0.04450596123933792, -0.020503124222159386, 0.005020760465413332, 0.07576227188110352, 0.060975756496191025, -0.037642281502485275, 0.013675937429070473, 0.05059431865811348, 0.09006863832473755, -0.008867059834301472, 0.04339151084423065, -0.047485124319791794, 0.03800719976425171, -0.004271998070180416, 0.02562871389091015, 0.02675805240869522, -0.013719944283366203, -0.07358560711145401, 0.11448118090629578, 0.02392805740237236, -0.04391241446137428, -0.0043967668898403645, 0.019267892464995384, 0.06723225116729736, -0.030235616490244865, 0.07822909206151962, -0.03844386339187622, 0.08765698969364166, 0.07023965567350388, 0.009170414879918098, -0.06544094532728195, -0.11253324896097183, 0.027936235070228577, 0.05975547805428505, -0.002156394300982356, 0.00467282347381115, -0.0003596299793571234, 0.039281561970710754, -0.040850430727005005, -0.07208549231290817, 0.05379871279001236, -0.046511705964803696, -0.027691135182976723, -0.041770417243242264, -0.04661552608013153, 0.02741083689033985, 0.021671997383236885, 0.011550158262252808, -0.03807400166988373, 0.0405038557946682, 0.018705207854509354, 0.042623359709978104, -0.06469754129648209, 5.2800531989289034e-33, 0.015013108029961586, 0.05263020098209381, 0.01975894719362259, 0.06841922551393509, -0.015336090698838234, -0.024390147998929024, -0.04198630526661873, 0.05410590395331383, -0.029827255755662918, 0.04932790622115135, -0.04563392698764801, 0.015787111595273018, -0.018537165597081184, 0.0006495144334621727, 0.020806046202778816, -0.017140109091997147, -0.032201461493968964, -0.06487751752138138, -0.033459097146987915, 0.08403489738702774, 0.13827091455459595, 0.022321224212646484, -0.0032382372301071882, -0.02165602333843708, -0.029320403933525085, -0.008121415972709656, 0.02569066546857357, -0.058891575783491135, -0.05735340714454651, -0.0030404110439121723, -0.08305290341377258, -0.06102403625845909, 0.058034397661685944, 
0.053483862429857254, 0.02452142722904682, -0.060090553015470505, -0.0029465346597135067, -0.06016087532043457, 0.024524280801415443, -0.09834102541208267, -0.0046828738413751125, 0.052050378173589706, 0.0518612377345562, -0.04359980300068855, -0.022677041590213776, 0.02502189762890339, 0.032375533133745193, -0.08032338321208954, 0.04282551258802414, 0.021420355886220932, 0.06751525402069092, 0.00696571497246623, -0.024430055171251297, 0.022692883387207985, -0.04806601256132126, 0.031648870557546616, 0.024627121165394783, 0.025774413719773293, 0.013194812461733818, 0.024091357365250587, -0.015260152518749237, -0.0176228117197752, 0.07187039405107498, 0.020635392516851425, 0.013839191757142544, -0.06898347288370132, -0.054881539195775986, -0.020688891410827637, 0.012222028337419033, -0.01705925166606903, -0.010548465885221958, -0.0011176371481269598, 0.007868162356317043, 0.05877615138888359, 0.055277395993471146, 0.0006131745176389813, 0.03801919147372246, -0.08084682375192642, -0.05340782552957535, -0.06259773671627045, 0.04869084805250168, -0.0039108009077608585, -0.02593676559627056, -0.0029037578497081995, -0.05425160005688667, -0.09924351423978806, 0.013998592272400856, -0.06560065597295761, -0.03488638624548912, -0.018197469413280487, -0.022176600992679596, 0.0022853680420666933, -0.08904680609703064, -0.002297591418027878, -0.056822288781404495, -5.894475279896082e-33, 0.0007535744225606322, 0.004808481782674789, -0.042320091277360916, 0.027559245005249977, 0.010642518289387226, -0.0578451007604599, 0.06545911729335785, 0.06916594505310059, 0.07298053801059723, -0.03905705735087395, 0.06734099239110947, -0.0605936162173748, 0.0202538650482893, -0.004037340637296438, 0.05533233657479286, 0.04114983603358269, -0.016127822920680046, -0.009476795792579651, 0.03284406661987305, 0.04371843859553337, 0.012790813110768795, -0.0027820714749395847, -0.15957199037075043, -0.009113848209381104, -0.017832910642027855, 0.09532331675291061, 0.010531049221754074, 0.04784856736660004, 0.012213031761348248, 0.021404564380645752, -0.06455758213996887, -0.007384799420833588, 0.028988000005483627, 0.03932356461882591, -0.09802371263504028, 0.004345392342656851, 0.05476130172610283, -0.016947202384471893, -0.03432752937078476, 0.03640088811516762, 0.0830153077840805, 0.06727971136569977, -0.08862045407295227, 0.06232025846838951, -0.02235911227762699, -0.06274692714214325, -0.10423873364925385, -0.0507933646440506, -0.009763655252754688, -0.03030838631093502, 0.0007167113944888115, 0.02019338309764862, -0.054085541516542435, 0.01593957468867302, -0.05924670398235321, 0.016604876145720482, 0.05776022747159004, -0.12376472353935242, -0.06450589746236801, -0.017904050648212433, -0.010688378475606441, 0.036154620349407196, 0.022718384861946106, -0.017027856782078743, 0.004380940459668636, -0.09922469407320023, -0.04112768545746803, 0.047469235956668854, 0.030388832092285156, -0.049799393862485886, 0.035697679966688156, -0.024605372920632362, -0.009452070109546185, -0.02526065707206726, 0.03418448939919472, -0.02311251126229763, 0.0003811634087469429, -0.06654092669487, -0.06489627808332443, -0.01520877331495285, -0.12239542603492737, -0.026167958974838257, 0.03130819648504257, 0.0455431193113327, 0.04713885486125946, 0.03456360846757889, 0.049823399633169174, 0.0012735472992062569, 0.009743544273078442, -0.0007807163055986166, -0.0005487700109370053, -0.02438909001648426, 0.011977885849773884, 0.13153356313705444, -0.026188116520643234, -5.2864262301000053e-8, -0.09382908791303635, 0.008693438023328781, 
-0.053582727909088135, 0.050897080451250076, -0.0006317118532024324, -0.09812775254249573, -0.02697572112083435, 0.013033357448875904, -0.04261314123868942, 0.06341064721345901, -0.010319354012608528, 0.042974356561899185, -0.1675150841474533, -0.0641779825091362, 0.08121847361326218, 0.07427430897951126, -0.009759617038071156, 0.044422488659620285, -0.054953694343566895, 0.012440558522939682, 0.027682434767484665, 0.07117754220962524, -0.017020991072058678, -0.07022807747125626, 0.03586193174123764, 0.015685902908444405, -0.05558481067419052, 0.04176840931177139, 0.017012733966112137, -0.033894963562488556, -0.031923577189445496, 0.07417985051870346, -0.0028749771881848574, -0.044057514518499374, -0.020911777392029762, 0.05135897174477577, -0.02997220866382122, -0.05998954176902771, 0.0025116177275776863, 0.0382687970995903, 0.019243301823735237, 0.06451516598463058, -0.10073581337928772, 0.07338731735944748, 0.061767786741256714, 0.03015534020960331, -0.031070096418261528, -0.1122286319732666, 0.11825168132781982, 0.013375181704759598, 0.04620024561882019, 0.02777743712067604, -0.05635066702961922, 0.02325102873146534, -0.0048889704048633575, 0.03919510170817375, -0.01709754578769207, 0.022852273657917976, 0.0050256759859621525, -0.038156598806381226, -0.03663944825530052, -0.028621580451726913, 0.00690268212929368, -0.0035170684568583965 ]
bert-base-cased
a8d257ba9925ef39f3036bfc338acf5283c512d9
2021-09-06T08:07:18.000Z
[ "pytorch", "tf", "jax", "bert", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1810.04805", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
null
null
bert-base-cased
7,598,326
30
transformers
--- language: en tags: - exbert license: apache-2.0 datasets: - bookcorpus - wikipedia --- # BERT base model (cased) Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1810.04805) and first released in [this repository](https://github.com/google-research/bert). This model is case-sensitive: it makes a difference between english and English. Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with two objectives: - Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. - Next sentence prediction (NSP): the models concatenates two masked sentences as inputs during pretraining. Sometimes they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to predict if the two sentences were following each other or not. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the BERT model as inputs. ## Intended uses & limitations You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use You can use this model directly with a pipeline for masked language modeling: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='bert-base-cased') >>> unmasker("Hello I'm a [MASK] model.") [{'sequence': "[CLS] Hello I'm a fashion model. [SEP]", 'score': 0.09019174426794052, 'token': 4633, 'token_str': 'fashion'}, {'sequence': "[CLS] Hello I'm a new model. [SEP]", 'score': 0.06349995732307434, 'token': 1207, 'token_str': 'new'}, {'sequence': "[CLS] Hello I'm a male model. [SEP]", 'score': 0.06228214129805565, 'token': 2581, 'token_str': 'male'}, {'sequence': "[CLS] Hello I'm a professional model. [SEP]", 'score': 0.0441727414727211, 'token': 1848, 'token_str': 'professional'}, {'sequence': "[CLS] Hello I'm a super model. 
[SEP]", 'score': 0.03326151892542839, 'token': 7688, 'token_str': 'super'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import BertTokenizer, BertModel tokenizer = BertTokenizer.from_pretrained('bert-base-cased') model = BertModel.from_pretrained("bert-base-cased") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` and in TensorFlow: ```python from transformers import BertTokenizer, TFBertModel tokenizer = BertTokenizer.from_pretrained('bert-base-cased') model = TFBertModel.from_pretrained("bert-base-cased") text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='tf') output = model(encoded_input) ``` ### Limitations and bias Even if the training data used for this model could be characterized as fairly neutral, this model can have biased predictions: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='bert-base-cased') >>> unmasker("The man worked as a [MASK].") [{'sequence': '[CLS] The man worked as a lawyer. [SEP]', 'score': 0.04804691672325134, 'token': 4545, 'token_str': 'lawyer'}, {'sequence': '[CLS] The man worked as a waiter. [SEP]', 'score': 0.037494491785764694, 'token': 17989, 'token_str': 'waiter'}, {'sequence': '[CLS] The man worked as a cop. [SEP]', 'score': 0.035512614995241165, 'token': 9947, 'token_str': 'cop'}, {'sequence': '[CLS] The man worked as a detective. [SEP]', 'score': 0.031271643936634064, 'token': 9140, 'token_str': 'detective'}, {'sequence': '[CLS] The man worked as a doctor. [SEP]', 'score': 0.027423162013292313, 'token': 3995, 'token_str': 'doctor'}] >>> unmasker("The woman worked as a [MASK].") [{'sequence': '[CLS] The woman worked as a nurse. [SEP]', 'score': 0.16927455365657806, 'token': 7439, 'token_str': 'nurse'}, {'sequence': '[CLS] The woman worked as a waitress. [SEP]', 'score': 0.1501094549894333, 'token': 15098, 'token_str': 'waitress'}, {'sequence': '[CLS] The woman worked as a maid. [SEP]', 'score': 0.05600163713097572, 'token': 13487, 'token_str': 'maid'}, {'sequence': '[CLS] The woman worked as a housekeeper. [SEP]', 'score': 0.04838843643665314, 'token': 26458, 'token_str': 'housekeeper'}, {'sequence': '[CLS] The woman worked as a cook. [SEP]', 'score': 0.029980547726154327, 'token': 9834, 'token_str': 'cook'}] ``` This bias will also affect all fine-tuned versions of this model. ## Training data The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers). ## Training procedure ### Preprocessing The texts are tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are then of the form: ``` [CLS] Sentence A [SEP] Sentence B [SEP] ``` With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a consecutive span of text usually longer than a single sentence. The only constrain is that the result with the two "sentences" has a combined length of less than 512 tokens. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `[MASK]`. 
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. ### Pretraining The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after. ## Evaluation results When fine-tuned on downstream tasks, this model achieves the following results: Glue test results: | Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average | |:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:| | | 84.6/83.4 | 71.2 | 90.5 | 93.5 | 52.1 | 85.8 | 88.9 | 66.4 | 79.6 | ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-1810-04805, author = {Jacob Devlin and Ming{-}Wei Chang and Kenton Lee and Kristina Toutanova}, title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language Understanding}, journal = {CoRR}, volume = {abs/1810.04805}, year = {2018}, url = {http://arxiv.org/abs/1810.04805}, archivePrefix = {arXiv}, eprint = {1810.04805}, timestamp = {Tue, 30 Oct 2018 20:39:56 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=bert-base-cased"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
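The 80/10/10 masking rule described in the BERT card above lends itself to a short worked example. The following is an illustrative Python sketch, not the original BERT preprocessing code; the function name `mask_tokens` and the plain string-list interface are assumptions for the example.

```python
import random

def mask_tokens(tokens, vocab, mask_token="[MASK]", mlm_prob=0.15):
    """Apply the 80/10/10 masking rule to a list of tokens (illustrative sketch)."""
    masked, labels = [], []
    for tok in tokens:
        if random.random() < mlm_prob:
            labels.append(tok)                        # this position must be predicted
            r = random.random()
            if r < 0.8:
                masked.append(mask_token)             # 80%: replace with [MASK]
            elif r < 0.9:
                masked.append(random.choice(vocab))   # 10%: replace with a random token
            else:
                masked.append(tok)                    # 10%: keep the original token
        else:
            masked.append(tok)
            labels.append(None)                       # not predicted (ignored by the loss)
    return masked, labels

# Example with a toy vocabulary; real preprocessing works on WordPiece ids, not strings.
print(mask_tokens("the man worked as a doctor".split(), vocab=["cat", "dog", "house"]))
```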
[ -0.089531309902668, -0.07718607783317566, 0.05249395594000816, 0.018646515905857086, 0.0447709858417511, 0.06499486416578293, 0.01449884008616209, 0.013737021014094353, 0.022442683577537537, -0.017234643921256065, 0.04510723799467087, -0.0327119417488575, 0.06715553998947144, 0.0427323542535305, 0.051948729902505875, 0.01969432830810547, 0.07780064642429352, -0.041703589260578156, -0.07245732098817825, -0.020047221332788467, 0.07807008177042007, 0.07568361610174179, 0.032103944569826126, -0.022154154255986214, -0.024296052753925323, -0.03176116570830345, -0.019516821950674057, -0.07587572187185287, 0.09634166955947876, 0.03516039624810219, 0.0666544958949089, -0.008515086956322193, 0.07060568779706955, 0.10504449158906937, -0.014565989375114441, 0.04548881575465202, -0.027834974229335785, -0.01631050743162632, 0.006547209341078997, 0.0014306396478787065, -0.05591712146997452, -0.05727652087807655, -0.023139476776123047, -0.03502918407320976, 0.07555349171161652, 0.025608450174331665, -0.049424879252910614, 0.0027853455394506454, -0.1060907393693924, -0.028984952718019485, -0.09408117830753326, -0.04818697273731232, 0.046576980501413345, -0.014668128453195095, -0.04818219691514969, 0.008276250213384628, 0.029977522790431976, -0.04587174952030182, -0.014274115674197674, -0.059435002505779266, -0.11520417034626007, -0.044993169605731964, -0.004889159929007292, 0.055073171854019165, -0.07202785462141037, 0.03726043179631233, -0.04463006928563118, 0.009251738898456097, 0.007542753126472235, 0.0004234259540680796, 0.04537166282534599, 0.05031372606754303, 0.04745226725935936, 0.010242359712719917, 0.0024255472235381603, -0.026891911402344704, 0.09148892760276794, 0.010374490171670914, 0.05337757617235184, -0.054498109966516495, 0.027686988934874535, 0.019774554297327995, 0.07321810722351074, 0.016959337517619133, 0.02364237979054451, 0.009562404826283455, 0.009021895937621593, -0.0038732404354959726, -0.023675473406910896, 0.011834523640573025, -0.049714043736457825, -0.1093447282910347, 0.06522968411445618, 0.024691032245755196, -0.012528381310403347, -0.010350651107728481, 0.023978233337402344, 0.013276930898427963, -0.0028651715256273746, 0.058943748474121094, 0.008310630917549133, 0.08229421824216843, 0.02597595565021038, -0.052773818373680115, 0.04650958254933357, -0.04343169927597046, -0.0008739834884181619, 0.01645941101014614, 0.062154803425073624, -0.09576094150543213, 0.05197332426905632, -0.04421529546380043, -0.02001127041876316, -0.044246040284633636, 0.008192856796085835, -0.05881776660680771, 0.01545985322445631, -0.03040062077343464, 0.027347080409526825, 0.08610644191503525, 0.025982996448874474, 0.04569195955991745, 0.07730579376220703, 0.0023904191330075264, -0.07875856757164001, -0.013282506726682186, -0.01697215810418129, 1.357400081231121e-33, 0.04268280789256096, -0.00014798487245570868, -0.006979794707149267, 0.0019756839610636234, 0.028390269726514816, -0.040980640798807144, -0.008036523126065731, 0.0075150602497160435, 0.027914423495531082, -0.022181440144777298, -0.02032180316746235, 0.01974700763821602, -0.08073341101408005, 0.09627421200275421, -0.05151639133691788, 0.05650518462061882, -0.04788016155362129, 0.06085256114602089, 0.06545510143041611, 0.0029127788729965687, 0.07896560430526733, 0.031371548771858215, 0.04634787514805794, -0.10505971312522888, -0.048092808574438095, 0.08354596793651581, 0.08914259821176529, -0.09289815276861191, 0.018833234906196594, 0.04475601017475128, -0.12142839282751083, 0.03372875601053238, -0.0065977852791547775, 
0.019078604876995087, 0.024509305134415627, 0.022135887295007706, 0.04598105326294899, -0.06743569672107697, 0.02359652705490589, -0.02216353453695774, 0.0006976609583944082, 0.04630599543452263, 0.018412476405501366, -0.08920019119977951, -0.04274723306298256, -0.0062667205929756165, -0.006542707793414593, -0.012158089317381382, 0.011639715172350407, -0.0007159750675782561, 0.08255664259195328, 0.016062339767813683, -0.04147819057106972, -0.04076427221298218, 0.014602158218622208, -0.006132692098617554, 0.04269050806760788, 0.027964789420366287, 0.02674713544547558, 0.03524129465222359, -0.01587456837296486, -0.0092091029509902, 0.05500652268528938, 0.04097604379057884, 0.0423637218773365, -0.057863447815179825, -0.010552209801971912, -0.028551330789923668, 0.0031932671554386616, -0.025534525513648987, -0.05708659440279007, -0.015979284420609474, -0.041284676641225815, 0.020512519404292107, 0.0009385264711454511, -0.07481833547353745, 0.04885474592447281, -0.06443916261196136, -0.05602208152413368, 0.049499619752168655, -0.00995589792728424, 0.04134364426136017, -0.05146399140357971, -0.04640667885541916, -0.05723816528916359, -0.0064347912557423115, 0.06912200152873993, -0.04905078932642937, 0.01563132181763649, -0.0027254954911768436, 0.047401949763298035, -0.05579875409603119, -0.029527654871344566, 0.037568770349025726, 0.0055297487415373325, -3.262752078652251e-33, -0.08756501227617264, 0.024840809404850006, -0.11436457186937332, -0.005051448941230774, -0.0644272044301033, -0.11625798046588898, 0.08047740906476974, 0.18613049387931824, 0.027496598660945892, -0.03250066563487053, -0.03733949735760689, -0.06095917150378227, -0.018609661608934402, 0.022323131561279297, 0.05310913175344467, -0.0225063543766737, 0.002106194384396076, -0.016011875122785568, -0.00514870323240757, 0.03316808491945267, 0.028667394071817398, 0.0015836149686947465, -0.10878629982471466, 0.059927813708782196, -0.013996516354382038, 0.09435109049081802, -0.06366360932588577, 0.06689576804637909, 0.03615027293562889, 0.020274367183446884, -0.017755623906850815, 0.02518705651164055, -0.030143214389681816, 0.06646382808685303, -0.11433769017457962, 0.06250106543302536, 0.000536567997187376, -0.025522438809275627, -0.017754359170794487, 0.010217498987913132, 0.027043277397751808, -0.005991458427160978, -0.08598023653030396, 0.01709730178117752, -0.0103088254109025, -0.00046578943147324026, -0.09195385873317719, -0.10261750966310501, 0.03088230825960636, -0.08251959085464478, -0.025727692991495132, -0.009403418749570847, -0.08610673993825912, -0.036910898983478546, -0.13493192195892334, -0.10599235445261002, -0.005458411294966936, -0.07260340452194214, -0.004219879396259785, 0.004861523862928152, 0.0024637815076857805, -0.003529423614963889, 0.004173181485384703, -0.04515686258673668, -0.022059109061956406, -0.014078098349273205, -0.000879601517226547, 0.024047719314694405, -0.06497428566217422, -0.04905511066317558, 0.012798408046364784, -0.0033942321315407753, 0.014244364574551582, 0.04243399575352669, 0.014849185012280941, 0.009840867482125759, -0.04434835910797119, -0.09208904206752777, -0.044061437249183655, -0.07562600076198578, -0.03262712061405182, -0.07847381383180618, 0.0012263846583664417, 0.08634983003139496, 0.022294754162430763, 0.0022249913308769464, 0.013521641492843628, 0.0448823906481266, -0.04626850411295891, 0.02253107912838459, -0.007803646847605705, 0.04298356547951698, -0.02981039509177208, 0.1568523645401001, -0.004549895413219929, -5.669204838909536e-8, -0.08373355865478516, 
0.017326457425951958, -0.022021034732460976, 0.037600547075271606, -0.041430529206991196, -0.04608028009533882, -0.061801727861166, -0.022069474682211876, -0.004016488324850798, -0.053303103893995285, -0.01536548975855112, 0.07557529211044312, -0.11526832729578018, 0.02382061816751957, -0.03464876115322113, 0.08503061532974243, -0.04904837906360626, 0.04142264276742935, 0.024586280807852745, -0.029331007972359657, -0.006415247451514006, 0.017927490174770355, 0.0044896225444972515, -0.05858573317527771, 0.023150930181145668, -0.03343328461050987, -0.006923678331077099, 0.1117820143699646, -0.014445005916059017, 0.02986587956547737, -0.05785840377211571, 0.05217581242322922, -0.06586463749408722, 0.04344407469034195, 0.07361704111099243, 0.07715494930744171, -0.013297432102262974, -0.0483381412923336, -0.004351658280938864, 0.022122571244835854, 0.0947931706905365, 0.06653829663991928, -0.10222460329532623, -0.015618362464010715, 0.11952386796474457, 0.045025475323200226, -0.024398408830165863, -0.10285982489585876, 0.036406997591257095, 0.027866436168551445, 0.019199801608920097, -0.06946895271539688, -0.023697461932897568, 0.07563816756010056, -0.044870782643556595, 0.04493724927306175, -0.045577503740787506, -0.027278954163193703, 0.05682523176074028, 0.01655655913054943, 0.012739447876811028, 0.0639270767569542, 0.07415040582418442, 0.05126776546239853 ]
roberta-base
251c3c36356d3ad6845eb0554fdb9703d632c6cc
2021-07-06T10:34:50.000Z
[ "pytorch", "tf", "jax", "rust", "roberta", "fill-mask", "en", "dataset:bookcorpus", "dataset:wikipedia", "arxiv:1907.11692", "arxiv:1806.02847", "transformers", "exbert", "license:mit", "autotrain_compatible" ]
fill-mask
false
null
null
roberta-base
7,254,067
45
transformers
--- language: en tags: - exbert license: mit datasets: - bookcorpus - wikipedia --- # RoBERTa base model Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in [this paper](https://arxiv.org/abs/1907.11692) and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/roberta). This model is case-sensitive: it makes a difference between english and English. Disclaimer: The team releasing RoBERTa did not write a model card for this model so this model card has been written by the Hugging Face team. ## Model description RoBERTa is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input then run the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. This way, the model learns an inner representation of the English language that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard classifier using the features produced by the BERT model as inputs. ## Intended uses & limitations You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation you should look at model like GPT2. ### How to use You can use this model directly with a pipeline for masked language modeling: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='roberta-base') >>> unmasker("Hello I'm a <mask> model.") [{'sequence': "<s>Hello I'm a male model.</s>", 'score': 0.3306540250778198, 'token': 2943, 'token_str': 'Ġmale'}, {'sequence': "<s>Hello I'm a female model.</s>", 'score': 0.04655390977859497, 'token': 2182, 'token_str': 'Ġfemale'}, {'sequence': "<s>Hello I'm a professional model.</s>", 'score': 0.04232972860336304, 'token': 2038, 'token_str': 'Ġprofessional'}, {'sequence': "<s>Hello I'm a fashion model.</s>", 'score': 0.037216778844594955, 'token': 2734, 'token_str': 'Ġfashion'}, {'sequence': "<s>Hello I'm a Russian model.</s>", 'score': 0.03253649175167084, 'token': 1083, 'token_str': 'ĠRussian'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import RobertaTokenizer, RobertaModel tokenizer = RobertaTokenizer.from_pretrained('roberta-base') model = RobertaModel.from_pretrained('roberta-base') text = "Replace me by any text you'd like." 
encoded_input = tokenizer(text, return_tensors='pt') output = model(**encoded_input) ``` and in TensorFlow: ```python from transformers import RobertaTokenizer, TFRobertaModel tokenizer = RobertaTokenizer.from_pretrained('roberta-base') model = TFRobertaModel.from_pretrained('roberta-base') text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='tf') output = model(encoded_input) ``` ### Limitations and bias The training data used for this model contains a lot of unfiltered content from the internet, which is far from neutral. Therefore, the model can have biased predictions: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='roberta-base') >>> unmasker("The man worked as a <mask>.") [{'sequence': '<s>The man worked as a mechanic.</s>', 'score': 0.08702439814805984, 'token': 25682, 'token_str': 'Ġmechanic'}, {'sequence': '<s>The man worked as a waiter.</s>', 'score': 0.0819653645157814, 'token': 38233, 'token_str': 'Ġwaiter'}, {'sequence': '<s>The man worked as a butcher.</s>', 'score': 0.073323555290699, 'token': 32364, 'token_str': 'Ġbutcher'}, {'sequence': '<s>The man worked as a miner.</s>', 'score': 0.046322137117385864, 'token': 18678, 'token_str': 'Ġminer'}, {'sequence': '<s>The man worked as a guard.</s>', 'score': 0.040150221437215805, 'token': 2510, 'token_str': 'Ġguard'}] >>> unmasker("The Black woman worked as a <mask>.") [{'sequence': '<s>The Black woman worked as a waitress.</s>', 'score': 0.22177888453006744, 'token': 35698, 'token_str': 'Ġwaitress'}, {'sequence': '<s>The Black woman worked as a prostitute.</s>', 'score': 0.19288744032382965, 'token': 36289, 'token_str': 'Ġprostitute'}, {'sequence': '<s>The Black woman worked as a maid.</s>', 'score': 0.06498628109693527, 'token': 29754, 'token_str': 'Ġmaid'}, {'sequence': '<s>The Black woman worked as a secretary.</s>', 'score': 0.05375480651855469, 'token': 2971, 'token_str': 'Ġsecretary'}, {'sequence': '<s>The Black woman worked as a nurse.</s>', 'score': 0.05245552211999893, 'token': 9008, 'token_str': 'Ġnurse'}] ``` This bias will also affect all fine-tuned versions of this model. ## Training data The RoBERTa model was pretrained on the union of five datasets: - [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038 unpublished books; - [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and headers); - [CC-News](https://commoncrawl.org/2016/10/news-dataset-available/), a dataset containing 63 million English news articles crawled between September 2016 and February 2019. - [OpenWebText](https://github.com/jcpeterson/openwebtext), an open-source recreation of the WebText dataset used to train GPT-2, - [Stories](https://arxiv.org/abs/1806.02847), a dataset containing a subset of CommonCrawl data filtered to match the story-like style of Winograd schemas. Together, these datasets total 160GB of text. ## Training procedure ### Preprocessing The texts are tokenized using a byte version of Byte-Pair Encoding (BPE) and a vocabulary size of 50,000. The inputs of the model take pieces of 512 contiguous tokens that may span over documents. The beginning of a new document is marked with `<s>` and the end of one by `</s>`. The details of the masking procedure for each sentence are the following: - 15% of the tokens are masked. - In 80% of the cases, the masked tokens are replaced by `<mask>`.
- In 10% of the cases, the masked tokens are replaced by a random token (different) from the one they replace. - In the 10% remaining cases, the masked tokens are left as is. Contrary to BERT, the masking is done dynamically during pretraining (e.g., it changes at each epoch and is not fixed). ### Pretraining The model was trained on 1024 V100 GPUs for 500K steps with a batch size of 8K and a sequence length of 512. The optimizer used is Adam with a learning rate of 6e-4, \\(\beta_{1} = 0.9\\), \\(\beta_{2} = 0.98\\) and \\(\epsilon = 1e-6\\), a weight decay of 0.01, learning rate warmup for 24,000 steps and linear decay of the learning rate after. ## Evaluation results When fine-tuned on downstream tasks, this model achieves the following results: Glue test results: | Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | |:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:| | | 87.6 | 91.9 | 92.8 | 94.8 | 63.6 | 91.2 | 90.2 | 78.7 | ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-1907-11692, author = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and Luke Zettlemoyer and Veselin Stoyanov}, title = {RoBERTa: {A} Robustly Optimized {BERT} Pretraining Approach}, journal = {CoRR}, volume = {abs/1907.11692}, year = {2019}, url = {http://arxiv.org/abs/1907.11692}, archivePrefix = {arXiv}, eprint = {1907.11692}, timestamp = {Thu, 01 Aug 2019 08:59:33 +0200}, biburl = {https://dblp.org/rec/journals/corr/abs-1907-11692.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=roberta-base"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
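The dynamic masking mentioned above (masks re-sampled at every pass over the data rather than fixed once during preprocessing) can be observed with the `transformers` masked-LM data collator. This is a minimal sketch for illustration, not the original fairseq training setup:

```python
from transformers import RobertaTokenizer, DataCollatorForLanguageModeling

tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=True, mlm_probability=0.15)

# The collator re-samples which 15% of tokens are masked every time a batch is built,
# so successive epochs see different masked positions for the same example.
example = tokenizer("Replace me by any text you'd like.")
print(collator([example])["input_ids"])
print(collator([example])["input_ids"])  # usually different <mask> positions
```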
[ -0.08212313055992126, -0.10390054434537888, -0.027432728558778763, 0.05361437052488327, -0.008713857270777225, 0.08931197971105576, 0.012921089306473732, -0.0319417305290699, 0.03874579817056656, 0.02321513742208481, 0.06043674796819687, -0.031464289873838425, 0.07827506959438324, 0.029055174440145493, 0.012395328842103481, 0.02689121477305889, 0.07593533396720886, 0.014687705785036087, -0.11450008302927017, -0.049616776406764984, 0.04797573760151863, 0.06781342625617981, 0.03838324174284935, 0.02702377550303936, 0.03736986219882965, -0.012935413978993893, -0.0256693996489048, -0.02736048586666584, 0.09874799102544785, -0.0026962950360029936, -0.0042382520623505116, 0.03558741882443428, 0.09679447114467621, 0.09779024869203568, -0.043239593505859375, 0.05962347984313965, -0.03975430503487587, -0.020670050755143166, -0.002412152709439397, -0.031007153913378716, -0.06964299827814102, -0.057625092566013336, -0.007961327210068703, -0.010580910369753838, 0.044617339968681335, 0.016297124326229095, -0.0064804647117853165, 0.039289023727178574, -0.11242145299911499, -0.032335150986909866, -0.12197995185852051, -0.04704497754573822, -0.03094966523349285, -0.00464315852150321, -0.06423099339008331, -0.046304307878017426, 0.029733624309301376, -0.03291700780391693, -0.01744566112756729, -0.011818883009254932, -0.09874549508094788, -0.040970370173454285, -0.07536564767360687, 0.06539982557296753, -0.016042007133364677, -0.021375149488449097, -0.05620294809341431, 0.0354902409017086, 0.02111370489001274, -0.02198796346783638, 0.012204588390886784, -0.005389190744608641, 0.02833700180053711, 0.06678900122642517, 0.04813894256949425, 0.031781211495399475, 0.09661594033241272, 0.027613528072834015, 0.045708831399679184, -0.039990294724702835, 0.024119101464748383, 0.025410819798707962, 0.0879463255405426, 0.028227079659700394, 0.017665376886725426, 0.012731572613120079, -0.023859171196818352, 0.05712113156914711, 0.007102148607373238, -0.03457454964518547, -0.05969918146729469, -0.05344460904598236, 0.05333346873521805, 0.032954588532447815, -0.05042494460940361, 0.008642683736979961, 0.00035550445318222046, 0.024960121139883995, -0.012947848998010159, 0.08578070998191833, -0.015691136941313744, 0.09398066252470016, 0.016177991405129433, 0.010003107599914074, -0.023397225886583328, -0.1328941434621811, 0.008100180886685848, 0.04217952862381935, 0.04603559523820877, -0.08931060135364532, 0.10133129358291626, -0.02825361303985119, -0.09132593870162964, 0.014613982290029526, 0.049795038998126984, -0.05340542644262314, 0.04130291938781738, -0.06003325432538986, 0.011421226896345615, 0.05992741137742996, -0.004191403277218342, 0.03738957270979881, 0.04031217843294144, -0.004342162050306797, -0.024098537862300873, 0.015412067994475365, -0.043065208941698074, 1.927960427207289e-33, 0.04944639280438423, 0.06868238747119904, 0.033271364867687225, 0.03514889255166054, 0.018731746822595596, -0.03777240216732025, -0.013376017101109028, 0.00999488402158022, 0.04441818594932556, 0.019227314740419388, -0.010336397215723991, 0.03241290897130966, -0.051111288368701935, 0.06539622694253922, -0.05680163949728012, 0.02364775538444519, -0.06406411528587341, 0.026163509115576744, 0.004441630095243454, 0.025813192129135132, 0.09431494027376175, 0.06419012695550919, 0.02332470193505287, -0.07896998524665833, -0.045751191675662994, 0.07581622898578644, 0.09220792353153229, -0.10862000286579132, -0.000056342658353969455, 0.03090069256722927, -0.07661359012126923, 0.014879279769957066, -0.00026731155230663717, 
0.004728981759399176, 0.006047264207154512, -0.03535827249288559, 0.048186931759119034, -0.08132076263427734, 0.02142195589840412, -0.037886932492256165, 0.006791887804865837, 0.0373186394572258, 0.05131097882986069, -0.058145780116319656, -0.036002468317747116, -0.016969937831163406, -0.0022517910692840815, 0.0006288925069384277, 0.017965272068977356, -0.007713486440479755, 0.07044688612222672, 0.027205444872379303, 0.0002800181391648948, -0.006074720527976751, -0.004662784282118082, 0.06409762799739838, 0.08798649162054062, 0.04811446741223335, 0.007085307966917753, 0.043195974081754684, 0.0002739969058893621, -0.024268193170428276, 0.056253332644701004, 0.042472828179597855, 0.08052994310855865, -0.006212517619132996, -0.03828892484307289, -0.09502533078193665, 0.090201236307621, 0.01849748007953167, -0.09753229469060898, -0.02583504095673561, -0.04901733994483948, 0.021024266257882118, -0.00937916710972786, -0.03969297930598259, 0.061341483145952225, -0.046558674424886703, -0.020008303225040436, 0.013840388506650925, -0.02930927649140358, 0.05483552813529968, -0.015573572367429733, -0.022343512624502182, -0.036905378103256226, -0.04976329579949379, 0.049847815185785294, -0.06438089162111282, -0.015683777630329132, -0.029729878529906273, 0.06090894713997841, 0.0642743781208992, -0.027143731713294983, 0.03274845704436302, 0.022321658208966255, -2.758533695100616e-33, -0.03470517694950104, 0.017889956012368202, -0.07858152687549591, 0.024247851222753525, -0.010427725501358509, -0.11836204677820206, 0.0502910278737545, 0.11374001950025558, 0.03747328743338585, -0.03274431824684143, 0.023625079542398453, -0.10022133588790894, 0.036454252898693085, 0.046416353434324265, 0.0596897229552269, -0.04499993100762367, 0.034443944692611694, -0.05274884030222893, 0.021231207996606827, 0.06322304159402847, -0.07724914699792862, 0.10372144728899002, -0.0858403742313385, -0.009318646974861622, -0.03507218882441521, 0.006336480379104614, -0.005695057101547718, 0.0486026257276535, 0.039652954787015915, 0.015386028215289116, -0.002450275234878063, -0.04296000301837921, -0.01596658118069172, -0.0005649054073728621, -0.15281100571155548, 0.04266441985964775, -0.004778658039867878, -0.04854978621006012, -0.016432397067546844, 0.03894307091832161, 0.048911526799201965, 0.026461448520421982, -0.11923982948064804, 0.025022100657224655, -0.015559940598905087, -0.025814849883317947, -0.09695779532194138, -0.06778528541326523, -0.016015075147151947, -0.05128163844347, 0.0002789662394206971, -0.022119460627436638, -0.08185230940580368, -0.015931744128465652, -0.08617742359638214, -0.08042497932910919, 0.03598605841398239, -0.09678564965724945, 0.040427032858133316, -0.006684636697173119, -0.001651445054449141, 0.015331625007092953, -0.015752486884593964, -0.07972411066293716, 0.013031522743403912, -0.043249472975730896, -0.00029807983082719147, -0.011639262549579144, -0.04093516245484352, -0.014722185209393501, 0.07255052030086517, -0.0005694770952686667, -0.00788575317710638, 0.02917758747935295, 0.04271109029650688, 0.00512972567230463, -0.07428991049528122, -0.06994529813528061, -0.055213894695043564, -0.11628760397434235, -0.08632394671440125, -0.04732079803943634, -0.0018510075751692057, 0.09274037927389145, 0.0772918239235878, 0.0021164030767977238, 0.01278630644083023, 0.09536950290203094, -0.015195210464298725, 0.050031427294015884, 0.018412554636597633, 0.0329139344394207, 0.013269810006022453, 0.12701024115085602, -0.06576225161552429, -5.79329402228268e-8, -0.10214338451623917, 0.04635382816195488, 
-0.059428539127111435, 0.034549593925476074, -0.005696326959878206, -0.024402407929301262, -0.004926423542201519, -0.032293785363435745, -0.009865719825029373, 0.01583954691886902, -0.02458330988883972, 0.016254164278507233, -0.08997861295938492, 0.016522444784641266, -0.04440576583147049, 0.06921971589326859, -0.004341252148151398, 0.07178453356027603, -0.010262193158268929, 0.0018308067228645086, 0.03906615078449249, 0.029810750856995583, -0.010954804718494415, -0.022193333134055138, 0.00012949503434356302, -0.01880479045212269, -0.07516337186098099, 0.08114933222532272, 0.01779184117913246, -0.028539028018712997, -0.036318857222795486, 0.04221895709633827, 0.02146512269973755, 0.016880476847290993, -0.0295562744140625, 0.07540080696344376, 0.012916159816086292, -0.04382641240954399, -0.013711965642869473, 0.015656251460313797, 0.08663709461688995, 0.01223217323422432, -0.127671018242836, -0.019443152472376823, 0.09095695614814758, 0.026450932025909424, -0.03210148587822914, -0.16094635426998138, 0.030573509633541107, 0.04203788563609123, 0.019353268668055534, -0.06058424711227417, -0.034888722002506256, 0.035000648349523544, 0.008549763821065426, 0.026006324216723442, -0.0044225784949958324, -0.030993474647402763, 0.09247840195894241, 0.036541227251291275, 0.023664690554142, 0.02576891891658306, 0.021909277886152267, -0.015276530757546425 ]
SpanBERT/spanbert-large-cased
a49cba45de9565a5d3e7b089a94dbae679e64e79
2021-05-19T11:31:33.000Z
[ "pytorch", "jax", "bert", "transformers" ]
null
false
SpanBERT
null
SpanBERT/spanbert-large-cased
7,120,559
3
transformers
Entry not found
[ 0.0461147278547287, -0.038838207721710205, -0.01049656979739666, -0.03682169318199158, 0.011261860840022564, 0.013094935566186905, 0.0019101888174191117, -0.013979103416204453, 0.027092741802334785, -0.015212527476251125, 0.017284274101257324, -0.08189476281404495, 0.03817418962717056, -0.04920130595564842, 0.021389011293649673, -0.015245908871293068, -0.03203780576586723, -0.1245758980512619, 0.03150877356529236, 0.032381657510995865, -0.060957908630371094, 0.05409295856952667, -0.025087490677833557, 0.01568586938083172, 0.028129950165748596, -0.04710396006703377, -0.018688226118683815, 0.013785239309072495, -0.04001208767294884, 0.01173911802470684, -0.04317743331193924, 0.05500618368387222, 0.004543041344732046, 0.02973111905157566, 0.14852192997932434, 0.02658126689493656, 0.02907961793243885, -0.05169107764959335, 0.05803573504090309, -0.07732241600751877, -0.017637968063354492, -0.04219653457403183, 0.041807834059000015, 0.023620979860424995, 0.021563321352005005, 0.016478516161441803, -0.0021814992651343346, -0.06400240957736969, 0.06393089145421982, 0.019599027931690216, -0.08565037697553635, 0.00934905931353569, -0.008718925528228283, -0.028583496809005737, -0.07310017943382263, 0.09416428208351135, 0.001759322709403932, 0.06184990331530571, 0.011840506456792355, -0.035997264087200165, 0.08358278125524521, -0.02619801089167595, 0.03736566752195358, -0.028206506744027138, -0.07454850524663925, -0.08883563429117203, -0.06279942393302917, -0.008695344440639019, 0.014119276776909828, -0.0825355276465416, 0.0649217739701271, -0.00223911227658391, -0.14716917276382446, 0.07743025571107864, -0.03548373281955719, -0.055201586335897446, 0.006981803569942713, -0.012166670523583889, 0.055111464112997055, -0.007116836030036211, -0.023175746202468872, -0.005835152696818113, -0.09185640513896942, 0.055196937173604965, 0.034148022532463074, 0.03835180774331093, 0.038685429841279984, -0.025987252593040466, 0.017804903909564018, 0.022428328171372414, 0.025005368515849113, -0.10761535167694092, -0.048001550137996674, -0.04343584179878235, 0.012374646961688995, -0.019502125680446625, 0.029218152165412903, 0.0842173621058464, -0.011719699949026108, 0.09283553808927536, -0.007015465293079615, -0.03543110564351082, -0.06936459988355637, 0.09425332397222519, -0.010958523489534855, -0.00805904995650053, 0.004974212497472763, -0.0031528924591839314, 0.06105927750468254, -0.03964288905262947, -0.03619541600346565, -0.019901901483535767, 0.07134733349084854, 0.039514873176813126, -0.012729483656585217, -0.006646515801548958, -0.04746140539646149, -0.014432490803301334, -0.05157482624053955, 0.09506245702505112, -0.049747664481401443, -0.04591796174645424, -0.008965466171503067, -0.0325421579182148, -0.08626784384250641, -0.06624380499124527, 0.02538885548710823, -4.303924894057984e-33, 0.01133066974580288, 0.0033434738870710135, -0.002155609894543886, 0.04871906340122223, -0.023564351722598076, -0.07933273911476135, 0.0600903145968914, 0.02335330657660961, -0.03844716399908066, -0.020433755591511726, -0.06952055543661118, -0.03235611692070961, 0.0062485747039318085, 0.064804308116436, -0.03201229125261307, 0.061689723283052444, 0.0417000837624073, -0.00761845987290144, 0.03340127319097519, -0.047770582139492035, 0.00887306872755289, -0.04066338762640953, -0.010506896302103996, 0.0106519665569067, 0.021333497017621994, 0.12854498624801636, -0.009705503471195698, 0.010055632330477238, -0.017507633194327354, 0.006515394430607557, 0.06334009766578674, -0.057817306369543076, 0.013668818399310112, 
-0.020286159589886665, 0.05430467426776886, -0.023184705525636673, 0.0828516036272049, 0.0005449643940664828, -0.10372652113437653, -0.07634282112121582, -0.005381610710173845, -0.039263784885406494, 0.0006114727002568543, -0.013281986117362976, 0.07119110971689224, 0.043696220964193344, 0.03168422728776932, 0.04338686540722847, 0.05728672817349434, 0.0832006186246872, -0.07961414009332657, 0.015234283171594143, 0.017002005130052567, 0.047004107385873795, -0.09794387966394424, 0.004990279674530029, -0.07062993198633194, -0.028000490739941597, -0.04018733277916908, -0.0702052190899849, 0.011351344175636768, 0.06020182743668556, -0.03297270089387894, 0.09396500885486603, 0.03417910635471344, -0.019825750961899757, -0.034690454602241516, -0.013036907650530338, 0.05896938592195511, -0.012359356507658958, -0.017275206744670868, -0.07982361316680908, 0.02059139870107174, 0.06737419217824936, 0.04176458343863487, -0.04978838190436363, -0.05877475067973137, -0.06289287656545639, -0.03354167565703392, -0.03871942684054375, 0.009898529388010502, -0.05514208599925041, -0.11629002541303635, -0.011855563148856163, 0.10663620382547379, 0.037354156374931335, -0.0065480442717671394, -0.051189567893743515, 0.06663123518228531, 0.01874656230211258, 0.032841797918081284, 0.041593004018068314, -0.06879369914531708, 0.04216769337654114, -0.01628219522535801, 5.4139394340936695e-34, 0.05697013810276985, -0.006972255185246468, 0.015711724758148193, -0.17956365644931793, 0.02320219948887825, 0.007923615165054798, -0.008062449283897877, 0.0074974060989916325, 0.07391711324453354, 0.0309313777834177, 0.060510627925395966, 0.058605875819921494, 0.09515274316072464, -0.002282935893163085, 0.001603541080839932, 0.07024981826543808, 0.012629246339201927, 0.07425693422555923, -0.038426291197538376, 0.01861148327589035, 0.030608950182795525, -0.02449394389986992, 0.021528491750359535, -0.003039651783183217, -0.03676343336701393, 0.03130284696817398, 0.07998586446046829, 0.010451192036271095, -0.07930229604244232, -0.013543923385441303, 0.018781835213303566, 0.05168003588914871, -0.07191970944404602, 0.15783067047595978, 0.026191607117652893, 0.01262354850769043, 0.08218053728342056, -0.029807550832629204, -0.07528624683618546, -0.04250097647309303, 0.017244765534996986, 0.04411793500185013, 0.03708017244935036, 0.009233047254383564, -0.040271829813718796, 0.022496428340673447, 0.02495843544602394, 0.07633638381958008, 0.005147108342498541, 0.013892097398638725, 0.05610476806759834, -0.06684739887714386, 0.05862557515501976, -0.020688841119408607, 0.05377643182873726, 0.06718500703573227, 0.005329249892383814, -0.01388032827526331, 0.029931528493762016, 0.009508464485406876, -0.045173756778240204, 0.11534366756677628, -0.06510116159915924, 0.05117698386311531, -0.0026125339791178703, -0.08554837852716446, -0.03784770518541336, 0.0804959163069725, 0.011298024095594883, -0.07695550471544266, -0.04868878796696663, 0.02515520341694355, 0.06252261996269226, -0.04509226232767105, -0.01246943511068821, 0.028559505939483643, -0.030573077499866486, 0.05066261067986488, -0.08187384903430939, 0.04469604790210724, 0.0034051244147121906, 0.04145054519176483, -0.021858664229512215, -0.06112268194556236, -0.00908052921295166, -0.05903250351548195, 0.0259539932012558, 0.059690944850444794, -0.07613514363765717, -0.03720718249678612, -0.036316655576229095, 0.07058046013116837, -0.008224100805819035, 0.041961874812841415, -0.0285952128469944, -1.496900736697171e-8, -0.0014124972512945533, 0.03401879221200943, -0.040338415652513504, 
0.04116074740886688, 0.0935964286327362, -0.05115952715277672, 0.0008746005478315055, -0.03389839455485344, -0.00567849725484848, -0.010686947964131832, -0.04789939522743225, -0.04820054769515991, -0.02011880651116371, -0.03209094703197479, -0.04211259260773659, -0.10229527950286865, -0.07819421589374542, -0.031228765845298767, -0.02154778689146042, -0.04960230365395546, 0.08087796717882156, -0.07801242172718048, 0.06919731199741364, -0.04999840259552002, 0.03687043860554695, 0.03889009356498718, -0.049989692866802216, -0.04254625365138054, -0.04606937617063522, 0.08682432025671005, -0.031148413196206093, 0.11826753616333008, 0.034102488309144974, -0.0208592489361763, -0.0205202866345644, 0.027134142816066742, 0.09741277992725372, 0.051608603447675705, 0.013477512635290623, -0.13649295270442963, -0.022304272279143333, 0.02385953813791275, 0.038732077926397324, -0.09249968826770782, -0.04549082741141319, 0.054220106452703476, 0.01160438358783722, 0.051190607249736786, 0.07713303714990616, -0.022097084671258926, -0.06127818301320076, -0.01857956498861313, 0.006740490905940533, -0.00496308971196413, 0.024095389991998672, 0.0736224576830864, -0.003481915919110179, -0.0699305310845375, -0.006629763171076775, -0.0598808117210865, 0.05297163128852844, -0.02902800403535366, -0.027858933433890343, -0.01287526823580265 ]
xlm-roberta-base
f6d161e8f5f6f2ed433fb4023d6cb34146506b3f
2022-06-06T11:40:43.000Z
[ "pytorch", "tf", "jax", "xlm-roberta", "fill-mask", "multilingual", "af", "am", "ar", "as", "az", "be", "bg", "bn", "br", "bs", "ca", "cs", "cy", "da", "de", "el", "en", "eo", "es", "et", "eu", "fa", "fi", "fr", "fy", "ga", "gd", "gl", "gu", "ha", "he", "hi", "hr", "hu", "hy", "id", "is", "it", "ja", "jv", "ka", "kk", "km", "kn", "ko", "ku", "ky", "la", "lo", "lt", "lv", "mg", "mk", "ml", "mn", "mr", "ms", "my", "ne", "nl", "no", "om", "or", "pa", "pl", "ps", "pt", "ro", "ru", "sa", "sd", "si", "sk", "sl", "so", "sq", "sr", "su", "sv", "sw", "ta", "te", "th", "tl", "tr", "ug", "uk", "ur", "uz", "vi", "xh", "yi", "zh", "arxiv:1911.02116", "transformers", "exbert", "license:mit", "autotrain_compatible" ]
fill-mask
false
null
null
xlm-roberta-base
6,960,013
42
transformers
--- tags: - exbert language: - multilingual - af - am - ar - as - az - be - bg - bn - br - bs - ca - cs - cy - da - de - el - en - eo - es - et - eu - fa - fi - fr - fy - ga - gd - gl - gu - ha - he - hi - hr - hu - hy - id - is - it - ja - jv - ka - kk - km - kn - ko - ku - ky - la - lo - lt - lv - mg - mk - ml - mn - mr - ms - my - ne - nl - no - om - or - pa - pl - ps - pt - ro - ru - sa - sd - si - sk - sl - so - sq - sr - su - sv - sw - ta - te - th - tl - tr - ug - uk - ur - uz - vi - xh - yi - zh license: mit --- # XLM-RoBERTa (base-sized model) XLM-RoBERTa model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper [Unsupervised Cross-lingual Representation Learning at Scale](https://arxiv.org/abs/1911.02116) by Conneau et al. and first released in [this repository](https://github.com/pytorch/fairseq/tree/master/examples/xlmr). Disclaimer: The team releasing XLM-RoBERTa did not write a model card for this model, so this model card has been written by the Hugging Face team. ## Model description XLM-RoBERTa is a multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. RoBERTa is a transformers model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data), with an automatic process to generate inputs and labels from those texts. More precisely, it was pretrained with the masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence. This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa model as inputs. ## Intended uses & limitations You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?search=xlm-roberta) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
## Usage You can use this model directly with a pipeline for masked language modeling: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='xlm-roberta-base') >>> unmasker("Hello I'm a <mask> model.") [{'score': 0.10563907772302628, 'sequence': "Hello I'm a fashion model.", 'token': 54543, 'token_str': 'fashion'}, {'score': 0.08015287667512894, 'sequence': "Hello I'm a new model.", 'token': 3525, 'token_str': 'new'}, {'score': 0.033413201570510864, 'sequence': "Hello I'm a model model.", 'token': 3299, 'token_str': 'model'}, {'score': 0.030217764899134636, 'sequence': "Hello I'm a French model.", 'token': 92265, 'token_str': 'French'}, {'score': 0.026436051353812218, 'sequence': "Hello I'm a sexy model.", 'token': 17473, 'token_str': 'sexy'}] ``` Here is how to use this model to get the features of a given text in PyTorch: ```python from transformers import AutoTokenizer, AutoModelForMaskedLM tokenizer = AutoTokenizer.from_pretrained('xlm-roberta-base') model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base") # prepare input text = "Replace me by any text you'd like." encoded_input = tokenizer(text, return_tensors='pt') # forward pass output = model(**encoded_input) ``` ### BibTeX entry and citation info ```bibtex @article{DBLP:journals/corr/abs-1911-02116, author = {Alexis Conneau and Kartikay Khandelwal and Naman Goyal and Vishrav Chaudhary and Guillaume Wenzek and Francisco Guzm{\'{a}}n and Edouard Grave and Myle Ott and Luke Zettlemoyer and Veselin Stoyanov}, title = {Unsupervised Cross-lingual Representation Learning at Scale}, journal = {CoRR}, volume = {abs/1911.02116}, year = {2019}, url = {http://arxiv.org/abs/1911.02116}, eprinttype = {arXiv}, eprint = {1911.02116}, timestamp = {Mon, 11 Nov 2019 18:38:09 +0100}, biburl = {https://dblp.org/rec/journals/corr/abs-1911-02116.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` <a href="https://huggingface.co/exbert/?model=xlm-roberta-base"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
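Since the tags above list TensorFlow weights for this checkpoint, a comparable TensorFlow sketch can mirror the PyTorch snippet. This is an illustrative addition rather than part of the original card:

```python
from transformers import AutoTokenizer, TFAutoModelForMaskedLM

# Sketch only: same checkpoint as above, loaded with the TensorFlow classes.
tokenizer = AutoTokenizer.from_pretrained('xlm-roberta-base')
model = TFAutoModelForMaskedLM.from_pretrained('xlm-roberta-base')

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)   # output.logits holds the masked-LM scores
```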
[ -0.08281730860471725, 0.003249414497986436, -0.041813094168901443, -0.047962404787540436, 0.060015056282281876, 0.03705935552716255, 0.03477831184864044, 0.03974342346191406, -0.008022570051252842, 0.021637847647070885, 0.08089376986026764, -0.04877686873078346, 0.08540689200162888, -0.035379793494939804, -0.05062985420227051, 0.03418223559856415, -0.036097630858421326, 0.020237574353814125, -0.14425572752952576, -0.004613094497472048, 0.02864249423146248, 0.02947583608329296, -0.023463759571313858, -0.012288014404475689, 0.04746605083346367, 0.036739058792591095, -0.02791615203022957, 0.08251674473285675, 0.0382915623486042, -0.04493943601846695, -0.01109251193702221, 0.1110743060708046, 0.0772641971707344, 0.08860312402248383, 0.038153860718011856, 0.028096405789256096, -0.09501869976520538, -0.08599246293306351, 0.044307030737400055, 0.027998656034469604, 0.029834924265742302, -0.018668241798877716, 0.0027537939604371786, -0.0032636604737490416, 0.08310531079769135, -0.019244085997343063, -0.07809913903474808, -0.000776743923779577, -0.016486477106809616, -0.019175734370946884, -0.15297456085681915, 0.001768144778907299, -0.05551907792687416, 0.042555101215839386, -0.04154287278652191, -0.05034345015883446, -0.04697342589497566, -0.07802332937717438, 0.017777161672711372, -0.07275139540433884, -0.07300341129302979, -0.019891194999217987, -0.1178426742553711, 0.07353843003511429, -0.021757183596491814, -0.014677219092845917, -0.020926380529999733, 0.019956985488533974, -0.03534978628158569, 0.061466820538043976, 0.014512757770717144, 0.021760640665888786, -0.059575218707323074, 0.09421637654304504, 0.02839736081659794, 0.03230004757642746, 0.021487684920430183, 0.06585147231817245, 0.031181544065475464, -0.0669073835015297, -0.085780568420887, -0.005546950735151768, 0.022333966568112373, -0.07278377562761307, -0.03415834903717041, 0.0030434008222073317, 0.005892851389944553, 0.05750686675310135, 0.03694901615381241, -0.05128937587141991, 0.006422250531613827, -0.04039490967988968, -0.024935804307460785, 0.033873897045850754, -0.1082722619175911, 0.027332985773682594, 0.008891538716852665, 0.06092298775911331, -0.02602071315050125, 0.09636182337999344, -0.000953576760366559, 0.008766921237111092, 0.07067617028951645, 0.010455954819917679, -0.11030442267656326, -0.06324313580989838, 0.07838018238544464, 0.152840256690979, -0.010513335466384888, -0.06772302836179733, 0.01347855944186449, -0.0017712214030325413, -0.005600190721452236, -0.05192319676280022, 0.0005696949083358049, 0.05918079987168312, 0.02221277914941311, -0.0793992206454277, 0.03996582329273224, 0.00867485161870718, -0.022685537114739418, -0.018555298447608948, -0.02126438356935978, -0.022611740976572037, -0.04660337045788765, -0.036407966166734695, -0.017830288037657738, -5.6318114345468264e-33, 0.018457574769854546, -0.013587369583547115, -0.003177803475409746, -0.024275347590446472, -0.017692716792225838, 0.005816428456455469, -0.06478535383939743, 0.027852557599544525, -0.05442121624946594, 0.05322468280792236, -0.05989943444728851, 0.05562322959303856, -0.07951730489730835, -0.011185137555003166, -0.015848815441131592, 0.03896721452474594, 0.05149108171463013, 0.0061616310849785805, -0.05700710043311119, 0.04521261155605316, 0.1228703036904335, -0.015513170510530472, 0.01824849471449852, -0.0028821490705013275, -0.06153037026524544, 0.09463315457105637, 0.05455092340707779, -0.09098367393016815, 0.0038257879205048084, 0.06324567645788193, -0.02578427642583847, 0.012249138206243515, -0.02117711678147316, 
0.0010365599300712347, -0.0649496465921402, 0.01805831305682659, -0.04280754551291466, -0.008568361401557922, 0.020615100860595703, -0.04533670097589493, -0.03259552642703056, 0.014753271825611591, 0.01580297388136387, -0.024884605780243874, 0.021439163014292717, -0.03023614175617695, -0.009700670838356018, -0.01174770388752222, -0.055873531848192215, 0.045548487454652786, 0.02011461742222309, -0.015923790633678436, 0.022687561810016632, 0.05654555559158325, -0.08824550360441208, 0.10945647209882736, 0.03612837567925453, 0.09502732008695602, 0.024472743272781372, 0.05092456564307213, -0.016261495649814606, -0.00675303814932704, 0.004061791114509106, 0.06443434208631516, 0.1268225461244583, 0.008744585327804089, 0.04779292643070221, -0.019829729571938515, 0.05886654183268547, 0.03148362785577774, -0.06195065379142761, -0.11515632271766663, 0.07588869333267212, 0.07098479568958282, 0.09257650375366211, -0.05428113043308258, 0.07513223588466644, -0.033446330577135086, 0.019400326535105705, 0.0011833560420200229, -0.0944288820028305, 0.011343048885464668, -0.06811488419771194, -0.09018751233816147, -0.07277720421552658, -0.016562331467866898, 0.041017159819602966, -0.10367842763662338, -0.04268224164843559, -0.022681836038827896, -0.001943620853126049, 0.06786537170410156, -0.04666311293840408, -0.05446922034025192, -0.024051262065768242, 4.307925982954595e-33, 0.02927684411406517, 0.02473670430481434, -0.00730359461158514, 0.06567075848579407, 0.04975317046046257, -0.043955568224191666, 0.10664968937635422, 0.07942792028188705, 0.03227192908525467, 0.0016607856377959251, 0.08409073948860168, -0.08409922569990158, -0.0030871855560690165, -0.04746316000819206, 0.1042633205652237, 0.03323349729180336, 0.03747016564011574, 0.04032249003648758, -0.0014772710856050253, 0.08360491693019867, -0.037960734218358994, 0.08552755415439606, -0.06699495017528534, 0.08624345064163208, -0.006871153134852648, 0.0338614359498024, 0.003953146282583475, 0.06719767302274704, 0.0352039709687233, 0.01776283048093319, 0.00748350890353322, -0.007872789166867733, -0.07293678820133209, 0.029392030090093613, -0.08071400225162506, -0.04681556671857834, -0.021643172949552536, 0.0763869360089302, 0.025475574657320976, 0.12731412053108215, -0.021447651088237762, 0.014906303025782108, -0.08710325509309769, 0.024161433801054955, 0.0032107275910675526, 0.001372150145471096, 0.014508230611681938, -0.07287387549877167, -0.005187300965189934, -0.045692674815654755, 0.02829768881201744, 0.083643838763237, -0.043828707188367844, 0.020603585988283157, -0.02178143337368965, -0.020921146497130394, 0.005685606971383095, -0.09147240221500397, -0.11369822174310684, -0.03821299597620964, -0.04065840691328049, 0.014469864778220654, -0.025570005178451538, -0.04891706258058548, 0.09647815674543381, -0.02074797824025154, -0.0036030281335115433, 0.016031337901949883, 0.020497210323810577, -0.009751959703862667, 0.012645588256418705, -0.023753564804792404, -0.07546000927686691, -0.05613580718636513, 0.04199549928307533, -0.01006949134171009, -0.06682123243808746, -0.02513960376381874, 0.011232016608119011, -0.08045299351215363, -0.06016146019101143, -0.030346058309078217, 0.03464629873633385, 0.06832406669855118, 0.022138357162475586, 0.002787449164316058, -0.017108144238591194, -0.0009226190741173923, -0.004626354202628136, 0.02437295764684677, -0.021744905039668083, 0.011401963420212269, 0.030895933508872986, 0.06541696935892105, -0.07869194447994232, -5.5407490151537786e-8, -0.08769413828849792, 0.014649382792413235, -0.024911191314458847, 
0.044224727898836136, 0.0685027465224266, -0.013683382421731949, -0.06761698424816132, 0.010718715377151966, -0.012818120419979095, 0.02858855202794075, 0.04478464648127556, 0.0037659923546016216, -0.14068704843521118, -0.01394902914762497, -0.02907126396894455, 0.02611912414431572, -0.03023781254887581, 0.05059933662414551, -0.056442514061927795, -0.02243898995220661, 0.03751762956380844, 0.05011224001646042, 0.013110699132084846, -0.012432933785021305, 0.052495915442705154, -0.022730913013219833, -0.026057280600070953, 0.059689249843358994, 0.04844257980585098, 0.030781982466578484, 0.004751097410917282, -0.056531164795160294, -0.009044178761541843, -0.0034333174116909504, 0.0000055284785958065186, -0.004781386349350214, 0.06540637463331223, -0.029928866773843765, -0.02884620800614357, 0.052054837346076965, 0.05254862457513809, 0.07164739817380905, -0.05654052644968033, 0.008308994583785534, 0.0022141665685921907, -0.01405635941773653, 0.004416080191731453, -0.06004299595952034, 0.044131189584732056, -0.03237607702612877, 0.03300559148192406, 0.015884263440966606, -0.05324851721525192, -0.012414139695465565, -0.026974666863679886, 0.014853843487799168, 0.07596500962972641, 0.01680518314242363, 0.040262386202812195, -0.009760027751326561, 0.07420459389686584, 0.000252932048169896, 0.03853527456521988, -0.027812540531158447 ]
distilbert-base-uncased-finetuned-sst-2-english
00c3f1ef306e837efb641eaca05d24d161d9513c
2022-07-22T08:00:55.000Z
[ "pytorch", "tf", "rust", "distilbert", "text-classification", "en", "dataset:sst2", "dataset:glue", "transformers", "license:apache-2.0", "model-index" ]
text-classification
false
null
null
distilbert-base-uncased-finetuned-sst-2-english
5,401,984
77
transformers
--- language: en license: apache-2.0 datasets: - sst2 - glue model-index: - name: distilbert-base-uncased-finetuned-sst-2-english results: - task: type: text-classification name: Text Classification dataset: name: glue type: glue config: sst2 split: validation metrics: - name: Accuracy type: accuracy value: 0.9105504587155964 verified: true - name: Precision type: precision value: 0.8978260869565218 verified: true - name: Recall type: recall value: 0.9301801801801802 verified: true - name: AUC type: auc value: 0.9716626673402374 verified: true - name: F1 type: f1 value: 0.9137168141592922 verified: true - name: loss type: loss value: 0.39013850688934326 verified: true --- # DistilBERT base uncased finetuned SST-2 ## Table of Contents - [Model Details](#model-details) - [How to Get Started With the Model](#how-to-get-started-with-the-model) - [Uses](#uses) - [Risks, Limitations and Biases](#risks-limitations-and-biases) - [Training](#training) ## Model Details **Model Description:** This model is a fine-tuned checkpoint of [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased), fine-tuned on SST-2. This model reaches an accuracy of 91.3 on the dev set (for comparison, the BERT bert-base-uncased version reaches an accuracy of 92.7). - **Developed by:** Hugging Face - **Model Type:** Text Classification - **Language(s):** English - **License:** Apache-2.0 - **Parent Model:** For more details about DistilBERT, we encourage users to check out [this model card](https://huggingface.co/distilbert-base-uncased). - **Resources for more information:** - [Model Documentation](https://huggingface.co/docs/transformers/main/en/model_doc/distilbert#transformers.DistilBertForSequenceClassification) ## How to Get Started With the Model Example of single-label classification: ```python import torch from transformers import DistilBertTokenizer, DistilBertForSequenceClassification tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english") model = DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english") inputs = tokenizer("Hello, my dog is cute", return_tensors="pt") with torch.no_grad(): logits = model(**inputs).logits predicted_class_id = logits.argmax().item() model.config.id2label[predicted_class_id] ``` ## Uses #### Direct Use This model can be used for single-label text classification, in particular sentiment analysis of English text, as it is already fine-tuned on SST-2. See the model hub to look for versions fine-tuned on a task that interests you. #### Misuse and Out-of-scope Use The model should not be used to intentionally create hostile or alienating environments for people. In addition, the model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model. ## Risks, Limitations and Biases Based on a few experiments, we observed that this model could produce biased predictions that target underrepresented populations. For instance, for sentences like `This film was filmed in COUNTRY`, this binary classification model will give radically different probabilities for the positive label depending on the country (0.89 if the country is France, but 0.08 if the country is Afghanistan) when nothing in the input indicates such a strong semantic shift.
In this [colab](https://colab.research.google.com/gist/ageron/fb2f64fb145b4bc7c49efc97e5f114d3/biasmap.ipynb), [Aurélien Géron](https://twitter.com/aureliengeron) made an interesting map plotting these probabilities for each country. <img src="https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/map.jpeg" alt="Map of positive probabilities per country." width="500"/> We strongly advise users to thoroughly probe these aspects on their use cases in order to evaluate the risks of this model. We recommend looking at the following bias evaluation datasets as a place to start: [WinoBias](https://huggingface.co/datasets/wino_bias), [WinoGender](https://huggingface.co/datasets/super_glue), [Stereoset](https://huggingface.co/datasets/stereoset). # Training #### Training Data The authors use the Stanford Sentiment Treebank ([sst2](https://huggingface.co/datasets/sst2)) corpus to fine-tune the model. #### Training Procedure ###### Fine-tuning hyper-parameters - learning_rate = 1e-5 - batch_size = 32 - warmup = 600 - max_seq_length = 128 - num_train_epochs = 3.0
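The hyper-parameters above map naturally onto the `transformers` Trainer API. The following is only a sketch of that mapping under the stated values; the output directory and the tokenization call are hypothetical, and this is not the authors' original fine-tuning script.

```python
from transformers import TrainingArguments

# Illustrative mapping of the listed fine-tuning hyper-parameters.
training_args = TrainingArguments(
    output_dir="distilbert-sst2-finetune",  # hypothetical output directory
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    warmup_steps=600,
    num_train_epochs=3.0,
)

# max_seq_length = 128 would be enforced at tokenization time, e.g.:
# tokenizer(examples["sentence"], truncation=True, max_length=128)
```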
[ -0.034528184682130814, -0.04061563313007355, -0.060734909027814865, 0.035735320299863815, 0.08161136507987976, 0.04066551476716995, 0.0003900852461811155, 0.05125672370195389, -0.01557959709316492, -0.04621459171175957, 0.06463723629713058, -0.0745718702673912, 0.007900773547589779, -0.04636524245142937, -0.045924101024866104, 0.0031687221489846706, -0.0017600350547581911, -0.026327982544898987, -0.08090730011463165, -0.054016292095184326, 0.02613011561334133, 0.07434148341417313, 0.019266663119196892, 0.004441873170435429, -0.012956242077052593, 0.03759530559182167, -0.05946134403347969, 0.056175075471401215, -0.05009042099118233, -0.09794259816408157, -0.017057521268725395, 0.014577332884073257, -0.04968097433447838, 0.04638285934925079, 0.056561652570962906, 0.03576315566897392, -0.020063502714037895, -0.0919356569647789, 0.07058654725551605, -0.05232078954577446, -0.02592923305928707, -0.0803440734744072, 0.007064383011311293, 0.006914000958204269, 0.03280770033597946, 0.010106046684086323, -0.0706280842423439, -0.07566897571086884, -0.001110800076276064, -0.04233204573392868, -0.09990938007831573, 0.08299244940280914, 0.024374952539801598, -0.007397995796054602, -0.01947414129972458, 0.06554175913333893, -0.00684709195047617, -0.008281921036541462, -0.03309927508234978, -0.041304927319288254, 0.08803742378950119, -0.07627736777067184, -0.08528819680213928, -0.041889455169439316, -0.05190161243081093, -0.04456804692745209, -0.028959011659026146, -0.06239091604948044, 0.031282976269721985, -0.0029180366545915604, -0.033848635852336884, 0.06049459055066109, -0.03166172653436661, 0.03997395187616348, 0.00801291037350893, 0.0011991526698693633, -0.005749389063566923, 0.01925067789852619, -0.04317876324057579, -0.04210835322737694, -0.05182720348238945, -0.0006485650083050132, -0.04021994024515152, 0.019248107448220253, 0.024702949449419975, -0.022287992760539055, 0.02800622582435608, 0.03267811983823776, -0.026479540392756462, 0.03933087736368179, 0.03451079502701759, -0.026402641087770462, 0.040765728801488876, 0.01867460086941719, -0.03403506428003311, 0.03444633260369301, 0.01977524720132351, 0.05638619139790535, -0.054377343505620956, 0.0598503015935421, -0.023708172142505646, 0.006410863716155291, -0.042800046503543854, -0.022928087040781975, -0.01947651244699955, -0.05400276556611061, -0.01880214549601078, 0.06993740051984787, 0.03560532256960869, -0.08651524782180786, 0.029732944443821907, -0.019008547067642212, -0.038918234407901764, -0.02222507633268833, -0.024155884981155396, 0.02972250245511532, -0.02510223165154457, -0.002537656109780073, -0.016032809391617775, 0.017509792000055313, 0.02781536616384983, 0.061235249042510986, 0.01542685367166996, 0.0390188954770565, -0.07524999231100082, -0.06420542299747467, -0.05854025483131409, 9.667859165809658e-33, 0.05958928167819977, -0.0028450831305235624, -0.03959986940026283, -0.03999707102775574, 0.0021915575489401817, 0.015939783304929733, -0.06429584324359894, 0.014883936382830143, -0.047469958662986755, -0.004204314202070236, -0.0277580376714468, -0.042214442044496536, -0.09099645912647247, -0.012465905398130417, -0.039101626724004745, -0.03763199597597122, -0.04291344806551933, 0.046967118978500366, -0.018478646874427795, -0.010804920457303524, 0.07159097492694855, 0.012949157506227493, 0.02397306263446808, -0.06120394915342331, 0.0032404509838670492, -0.013659643009305, 0.005895700305700302, 0.032473839819431305, -0.07821336388587952, 0.045776356011629105, 0.0135103864595294, -0.016131220385432243, 0.02127915434539318, 
-0.05513791739940643, 0.08323783427476883, -0.011589126661419868, 0.048335570842027664, -0.006733706220984459, -0.010028962045907974, -0.045086078345775604, 0.05397498607635498, 0.04180573672056198, 0.008122152648866177, -0.017986014485359192, 0.006682120263576508, -0.09693867713212967, 0.023120801895856857, 0.04297223314642906, -0.000038180052797542885, 0.0008401974919252098, -0.05790770798921585, -0.02784038707613945, -0.0017650899244472384, -0.01746472902595997, -0.07562286406755447, 0.023482557386159897, 0.03394586965441704, 0.12510034441947937, 0.032982975244522095, 0.10657306015491486, -0.04819806292653084, 0.026356616988778114, -0.05753549188375473, -0.004564085975289345, -0.0009628639090806246, 0.02028582990169525, 0.01173920463770628, 0.038027312606573105, 0.01020195335149765, 0.047100018709897995, 0.00441410206258297, -0.02083936706185341, 0.108701691031456, 0.04777595400810242, 0.043552543967962265, -0.10530558973550797, 0.0494094043970108, -0.05879805237054825, -0.004578579217195511, -0.026602517813444138, -0.022856637835502625, 0.03693356364965439, 0.010566340759396553, -0.08011618256568909, -0.09869521856307983, 0.013430068269371986, 0.008022122085094452, 0.005497898440808058, -0.04337068647146225, -0.007987105287611485, -0.026876864954829216, 0.010413826443254948, 0.017402904108166695, -0.04356294870376587, -0.02914242632687092, -1.1607266152929071e-32, -0.03284565359354019, 0.03691665828227997, -0.03520732372999191, 0.04411090537905693, 0.06927577406167984, -0.0062076193280518055, -0.023015175014734268, 0.08327069133520126, -0.018633710220456123, 0.019540026783943176, 0.09128665924072266, -0.062486521899700165, -0.03804779797792435, -0.02974804863333702, -0.058082081377506256, 0.03174576535820961, -0.05296424403786659, -0.05509765073657036, -0.022013984620571136, 0.043118637055158615, 0.005678188055753708, 0.13868626952171326, -0.004333973862230778, 0.10798435658216476, -0.020478306338191032, 0.0032737180590629578, -0.04111756756901741, 0.032071735709905624, -0.006569514516741037, 0.024222519248723984, 0.048995841294527054, 0.00881223939359188, -0.09206900000572205, 0.040232982486486435, -0.04007026553153992, -0.07744421809911728, 0.043821901082992554, -0.02283964678645134, -0.019168931990861893, 0.11192497611045837, 0.07756718248128891, 0.05921250209212303, -0.1229737251996994, 0.0297628752887249, 0.015957774594426155, -0.00436796760186553, 0.08666378259658813, 0.00618850439786911, 0.0289131049066782, -0.039977848529815674, 0.014746847562491894, -0.05074803903698921, -0.03861381113529205, 0.07625307887792587, 0.06032727286219597, -0.016713254153728485, 0.0029029289726167917, -0.030257131904363632, -0.13130109012126923, -0.005305160768330097, 0.003214683150872588, 0.09248856455087662, -0.011832328513264656, 0.020441250875592232, 0.14355997741222382, -0.06608074903488159, -0.06789001077413559, 0.012670592404901981, -0.03645924851298332, 0.005259866360574961, -0.07741650193929672, -0.04224354773759842, 0.03872883319854736, -0.05146412178874016, 0.049612756818532944, -0.04086856171488762, -0.09289322793483734, 0.07550373673439026, 0.0009474398102611303, 0.07460238039493561, -0.08214838802814484, 0.015505080111324787, 0.050381939858198166, 0.04444338008761406, 0.04156315699219704, 0.0293880682438612, 0.011009065434336662, 0.16796134412288666, -0.05520200729370117, 0.046316877007484436, -0.04974700137972832, 0.02146417833864689, 0.0268064197152853, 0.09620455652475357, -0.020324304699897766, -6.90094239530481e-8, 0.008209466002881527, -0.07367204129695892, -0.08163882791996002, 
0.04309520870447159, 0.012591507285833359, -0.047700025141239166, 0.0006253309547901154, 0.05527471378445625, -0.05759308859705925, -0.04510695859789848, -0.013958062045276165, -0.04644084721803665, -0.1921624392271042, -0.033200111240148544, 0.03955411911010742, -0.022615846246480942, 0.012790380977094173, 0.14348255097866058, -0.03382507711648941, -0.05411389097571373, 0.09257788956165314, -0.04351550340652466, 0.059021059423685074, -0.019651958718895912, 0.060329362750053406, -0.005352836102247238, -0.07960563898086548, 0.0958612784743309, 0.011653048917651176, 0.10848495364189148, 0.04428793489933014, -0.07739012688398361, -0.014249623753130436, -0.083491250872612, 0.0284982081502676, 0.0222613662481308, 0.04554552212357521, 0.0020384853705763817, 0.037506598979234695, 0.04831733554601669, -0.00461548799648881, 0.08809906244277954, -0.0730617418885231, 0.06679829955101013, 0.05864988639950752, 0.007102853152900934, -0.029017852619290352, -0.024738090112805367, 0.044977787882089615, -0.07851994782686234, -0.016312159597873688, -0.058491870760917664, -0.038087066262960434, 0.048060476779937744, -0.03180398419499397, 0.07673823088407516, 0.021402671933174133, 0.0128202885389328, 0.03310016170144081, -0.08764824271202087, 0.08194759488105774, -0.04170574992895126, 0.019586944952607155, 0.0467221774160862 ]
distilroberta-base
c1149320821601524a8d373726ed95bbd2bc0dc2
2022-07-22T08:13:21.000Z
[ "pytorch", "tf", "jax", "rust", "roberta", "fill-mask", "en", "dataset:openwebtext", "arxiv:1910.01108", "arxiv:1910.09700", "transformers", "exbert", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
null
null
distilroberta-base
5,192,102
21
transformers
--- language: en tags: - exbert license: apache-2.0 datasets: - openwebtext --- # Model Card for DistilRoBERTa base # Table of Contents 1. [Model Details](#model-details) 2. [Uses](#uses) 3. [Bias, Risks, and Limitations](#bias-risks-and-limitations) 4. [Training Details](#training-details) 5. [Evaluation](#evaluation) 6. [Environmental Impact](#environmental-impact) 7. [Citation](#citation) 8. [How To Get Started With the Model](#how-to-get-started-with-the-model) # Model Details ## Model Description This model is a distilled version of the [RoBERTa-base model](https://huggingface.co/roberta-base). It follows the same training procedure as [DistilBERT](https://huggingface.co/distilbert-base-uncased). The code for the distillation process can be found [here](https://github.com/huggingface/transformers/tree/master/examples/distillation). This model is case-sensitive: it makes a difference between english and English. The model has 6 layers, a hidden dimension of 768 and 12 heads, for a total of 82M parameters (compared to 125M parameters for RoBERTa-base). On average, DistilRoBERTa is twice as fast as RoBERTa-base. We encourage users of this model card to check out the [RoBERTa-base model card](https://huggingface.co/roberta-base) to learn more about usage, limitations and potential biases. - **Developed by:** Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf (Hugging Face) - **Model type:** Transformer-based language model - **Language(s) (NLP):** English - **License:** Apache 2.0 - **Related Models:** [RoBERTa-base model card](https://huggingface.co/roberta-base) - **Resources for more information:** - [GitHub Repository](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md) - [Associated Paper](https://arxiv.org/abs/1910.01108) # Uses ## Direct Use and Downstream Use You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=roberta) to look for fine-tuned versions on a task that interests you. Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2. ## Out of Scope Use The model should not be used to intentionally create hostile or alienating environments for people. The model was not trained to be factual or true representations of people or events, and therefore using the model to generate such content is out-of-scope for the abilities of this model. # Bias, Risks, and Limitations Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Predictions generated by the model may include disturbing and harmful stereotypes across protected classes; identity characteristics; and sensitive, social, and occupational groups.
For example: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='distilroberta-base') >>> unmasker("The man worked as a <mask>.") [{'score': 0.1237526461482048, 'sequence': 'The man worked as a waiter.', 'token': 38233, 'token_str': ' waiter'}, {'score': 0.08968018740415573, 'sequence': 'The man worked as a waitress.', 'token': 35698, 'token_str': ' waitress'}, {'score': 0.08387645334005356, 'sequence': 'The man worked as a bartender.', 'token': 33080, 'token_str': ' bartender'}, {'score': 0.061059024184942245, 'sequence': 'The man worked as a mechanic.', 'token': 25682, 'token_str': ' mechanic'}, {'score': 0.03804653510451317, 'sequence': 'The man worked as a courier.', 'token': 37171, 'token_str': ' courier'}] >>> unmasker("The woman worked as a <mask>.") [{'score': 0.23149248957633972, 'sequence': 'The woman worked as a waitress.', 'token': 35698, 'token_str': ' waitress'}, {'score': 0.07563332468271255, 'sequence': 'The woman worked as a waiter.', 'token': 38233, 'token_str': ' waiter'}, {'score': 0.06983394920825958, 'sequence': 'The woman worked as a bartender.', 'token': 33080, 'token_str': ' bartender'}, {'score': 0.05411609262228012, 'sequence': 'The woman worked as a nurse.', 'token': 9008, 'token_str': ' nurse'}, {'score': 0.04995106905698776, 'sequence': 'The woman worked as a maid.', 'token': 29754, 'token_str': ' maid'}] ``` ## Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. # Training Details DistilRoBERTa was pre-trained on [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), a reproduction of OpenAI's WebText dataset (it is ~4 times less training data than the teacher RoBERTa). See the [roberta-base model card](https://huggingface.co/roberta-base/blob/main/README.md) for further details on training. # Evaluation When fine-tuned on downstream tasks, this model achieves the following results (see [GitHub Repo](https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md)): Glue test results: | Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | |:----:|:----:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:| | | 84.0 | 89.4 | 90.8 | 92.5 | 59.3 | 88.3 | 86.6 | 67.9 | # Environmental Impact Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** More information needed - **Hours used:** More information needed - **Cloud Provider:** More information needed - **Compute Region:** More information needed - **Carbon Emitted:** More information needed # Citation ```bibtex @article{Sanh2019DistilBERTAD, title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter}, author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf}, journal={ArXiv}, year={2019}, volume={abs/1910.01108} } ``` APA - Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108. 
# How to Get Started With the Model You can use the model directly with a pipeline for masked language modeling: ```python >>> from transformers import pipeline >>> unmasker = pipeline('fill-mask', model='distilroberta-base') >>> unmasker("Hello I'm a <mask> model.") [{'score': 0.04673689603805542, 'sequence': "Hello I'm a business model.", 'token': 265, 'token_str': ' business'}, {'score': 0.03846118599176407, 'sequence': "Hello I'm a freelance model.", 'token': 18150, 'token_str': ' freelance'}, {'score': 0.03308931365609169, 'sequence': "Hello I'm a fashion model.", 'token': 2734, 'token_str': ' fashion'}, {'score': 0.03018997237086296, 'sequence': "Hello I'm a role model.", 'token': 774, 'token_str': ' role'}, {'score': 0.02111748233437538, 'sequence': "Hello I'm a Playboy model.", 'token': 24526, 'token_str': ' Playboy'}] ``` <a href="https://huggingface.co/exbert/?model=distilroberta-base"> <img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png"> </a>
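Beyond the fill-mask pipeline, the checkpoint can also be used as a plain encoder to extract hidden-state features. This is a small illustrative sketch (the example text is arbitrary), not part of the original card:

```python
from transformers import AutoTokenizer, AutoModel

# Sketch only: use the distilled encoder to get contextual features.
tokenizer = AutoTokenizer.from_pretrained('distilroberta-base')
model = AutoModel.from_pretrained('distilroberta-base')

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
print(output.last_hidden_state.shape)  # (batch_size, sequence_length, 768)
```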
[ -0.1325169950723648, -0.03109726682305336, 0.035468652844429016, 0.0266091488301754, 0.027898555621504784, -0.04462097957730293, -0.04662588611245155, 0.06629492342472076, -0.08276285231113434, -0.04405451565980911, 0.023149758577346802, -0.024889251217246056, 0.013186859898269176, -0.002421727403998375, -0.014946619980037212, 0.12591679394245148, -0.0013454101281240582, 0.08119090646505356, -0.1671920269727707, 0.044374074786901474, 0.0542934350669384, 0.006431962363421917, -0.032872628420591354, -0.008708999492228031, 0.023900935426354408, 0.022487478330731392, 0.017797812819480896, 0.08263954520225525, 0.061395879834890366, -0.0976102352142334, 0.044183604419231415, 0.0161800105124712, -0.003678069217130542, 0.005930530838668346, -0.05205381289124489, 0.10172851383686066, -0.03470184653997421, -0.02751070261001587, -0.015759697183966637, 0.02269050106406212, 0.017614217475056648, 0.0497620552778244, -0.01757735386490822, 0.03951349854469299, 0.032661307603120804, -0.06495597213506699, -0.024326777085661888, -0.024492338299751282, -0.05080601945519447, -0.04108262062072754, -0.07629881799221039, -0.06326238811016083, 0.037413302809000015, 0.053506411612033844, -0.02010401524603367, -0.04143150523304939, -0.009262456558644772, -0.07746787369251251, -0.06472862511873245, -0.1090773418545723, 0.04138338193297386, 0.0269842017441988, -0.1261567920446396, -0.036122094839811325, -0.04870771989226341, -0.048726629465818405, 0.023937208577990532, 0.0470610149204731, 0.09384818375110626, -0.08601564913988113, -0.04502706602215767, 0.027576610445976257, -0.06939943879842758, -0.009706144221127033, 0.02386690489947796, 0.04178176820278168, 0.08084473013877869, 0.05546773597598076, 0.027096891775727272, -0.06857219338417053, 0.013360084034502506, -0.006022179964929819, 0.06926318258047104, -0.03423211723566055, 0.04581581801176071, 0.02274204045534134, 0.006079469341784716, 0.0773886889219284, 0.04232882335782051, -0.01608322001993656, 0.0009516353020444512, 0.050605569034814835, 0.007820384576916695, -0.026651350781321526, 0.001210397225804627, 0.10721268504858017, 0.013919197022914886, 0.032583486288785934, -0.05503128841519356, 0.1474369764328003, -0.012733235023915768, 0.020458100363612175, 0.0025749155320227146, -0.03172674775123596, -0.060721129179000854, -0.07257993519306183, 0.0238973256200552, 0.10621699690818787, 0.005107710603624582, -0.09425338357686996, 0.01540915947407484, 0.0276627279818058, -0.037179846316576004, 0.013870888389647007, -0.03712299466133118, 0.006143073085695505, 0.03549015894532204, -0.06912566721439362, 0.011352432891726494, 0.009742965921759605, 0.03163883090019226, -0.02456299029290676, 0.0017130362102761865, 0.04796706885099411, -0.08469254523515701, -0.031150996685028076, -0.08420047909021378, 2.3316826763002575e-33, 0.07299400866031647, 0.05987376719713211, -0.03133464232087135, 0.057264089584350586, 0.07374782115221024, -0.027884559705853462, -0.0032120603136718273, 0.017464425414800644, -0.06839979439973831, -0.0008248352678492665, -0.0026582127902656794, 0.04926937818527222, -0.05923222750425339, -0.020153271034359932, -0.06105899065732956, -0.06138071045279503, -0.04812508076429367, 0.04714326187968254, 0.04905850067734718, 0.012488368898630142, 0.06765265762805939, 0.015080473385751247, -0.07770277559757233, 0.029032068327069283, -0.06039832532405853, 0.0679718479514122, 0.03505299612879753, 0.014001683332026005, -0.058269716799259186, 0.041873760521411896, 0.058909203857183456, -0.019230060279369354, -0.05510164797306061, -0.012352090328931808, 
0.02712513506412506, -0.024885864928364754, -0.01668708212673664, -0.053808484226465225, -0.009272005409002304, -0.10155223309993744, 0.011651335284113884, 0.028475802391767502, 0.045124392956495285, -0.008451412431895733, -0.01855388842523098, -0.039208609610795975, 0.05267312750220299, 0.014439444988965988, 0.07894431054592133, -0.026199206709861755, -0.005201333202421665, 0.05620149150490761, 0.028044521808624268, -0.040110040456056595, -0.10580434650182724, 0.07983432710170746, 0.06394259631633759, 0.02879471518099308, 0.030921347439289093, 0.03405029699206352, -0.029047252610325813, 0.06798513978719711, -0.03324683755636215, 0.015993893146514893, 0.027547787874937057, 0.014532877132296562, -0.06474550068378448, -0.022552357986569405, 0.08339890837669373, 0.018369177356362343, -0.07779113948345184, -0.013456657528877258, -0.01680343970656395, 0.0021891763899475336, -0.0072923870757222176, -0.04857650771737099, 0.05051117017865181, -0.013507521711289883, -0.0076170251704752445, -0.024633992463350296, -0.026623297482728958, 0.08902285993099213, -0.04196152836084366, -0.03058716654777527, -0.05840864032506943, -0.01396047044545412, 0.01612778753042221, -0.050541333854198456, -0.051140427589416504, 0.0034438734874129295, 0.05482067912817001, 0.06268476694822311, -0.017143700271844864, -0.03800712898373604, 0.07731347531080246, -1.1311569079611786e-33, 0.02517421916127205, 0.026804441586136818, 0.016558585688471794, 0.0743376836180687, 0.08600375056266785, -0.04863206297159195, -0.00431816978380084, 0.11391730606555939, 0.07400654256343842, -0.04443538933992386, 0.07879638671875, -0.0011662041069939733, -0.02974288910627365, -0.012111865915358067, 0.03370830416679382, 0.04141715541481972, -0.04202897846698761, -0.041979145258665085, -0.010330306366086006, 0.04095667973160744, 0.004297059495002031, 0.14148351550102234, -0.10823095589876175, 0.012274316512048244, -0.037739768624305725, 0.04205487668514252, -0.00885002687573433, 0.03372102603316307, 0.08823005855083466, 0.004169171676039696, -0.019187744706869125, -0.009914612397551537, -0.015211015939712524, 0.0025936150923371315, -0.14285196363925934, 0.0362696573138237, -0.00376747059635818, -0.015000796876847744, -0.00006845729512860999, 0.08795682340860367, 0.028506649658083916, -0.0052198562771081924, -0.04046019911766052, 0.012806599028408527, -0.02423112839460373, -0.013542354106903076, -0.02931845188140869, -0.03243974223732948, 0.036380473524332047, 0.010074494406580925, 0.06389858573675156, 0.023577744141221046, -0.0968029573559761, 0.02731814794242382, 0.024276474490761757, -0.05137030780315399, 0.0772944763302803, -0.05822744965553284, -0.04060572758316994, 0.036089882254600525, -0.0575345940887928, 0.015057104639708996, 0.011315255425870419, 0.008356581442058086, -0.030069243162870407, -0.0610218308866024, -0.09328675270080566, 0.04821012169122696, -0.05386818200349808, -0.003991599194705486, -0.027244796976447105, 0.0173236932605505, 0.08668476343154907, -0.023462722077965736, 0.0580298937857151, -0.08969233930110931, -0.0005509175243787467, -0.019572198390960693, -0.0018802896374836564, -0.01655222475528717, -0.13073746860027313, -0.012169732712209225, 0.020733971148729324, 0.01482128445059061, 0.10539527982473373, -0.02298569492995739, 0.038234300911426544, 0.06595458090305328, -0.030938301235437393, -0.018857557326555252, -0.05457845702767372, -0.017761850729584694, 0.04020891711115837, 0.13055892288684845, 0.03845702484250069, -5.93711533269925e-8, -0.08411537110805511, 0.0008437753422185779, -0.020894238725304604, 
0.03504718840122223, -0.059033215045928955, -0.006568064913153648, 0.04675130918622017, 0.02817852422595024, -0.07576336711645126, 0.02890443429350853, 0.041653238236904144, 0.06890301406383514, -0.06914074718952179, -0.0011602322338148952, -0.002868970623239875, 0.05173512548208237, 0.0338614284992218, 0.07747440040111542, -0.0724874809384346, -0.018088605254888535, 0.04728223755955696, -0.01775083877146244, 0.05918387696146965, -0.042197998613119125, 0.08685232698917389, -0.05253151059150696, -0.0637534111738205, 0.04341726750135422, 0.013659508898854256, 0.0005539839621633291, -0.007450522854924202, 0.0432833656668663, 0.0049882628954946995, -0.021427758038043976, -0.029947549104690552, 0.059231117367744446, -0.036460813134908676, -0.04680398479104042, -0.04349459335207939, 0.0782858356833458, 0.08170107007026672, 0.04206938296556473, -0.11869895458221436, 0.036482229828834534, 0.08470788598060608, 0.03461912274360657, -0.007449630182236433, -0.04762262850999832, 0.016825463622808456, 0.08508694171905518, 0.06770002096891403, -0.05067925155162811, -0.021951263770461082, 0.013346041552722454, -0.024475157260894775, 0.02930784784257412, -0.0021265968680381775, -0.023417361080646515, 0.03339109942317009, -0.03386706858873367, 0.03316865488886833, 0.00410274975001812, 0.04872361198067665, -0.019148804247379303 ]
distilgpt2
ca98be8f8f0994e707b944a9ef55e66fbcf9e586
2022-07-22T08:12:56.000Z
[ "pytorch", "tf", "jax", "tflite", "rust", "gpt2", "text-generation", "en", "dataset:openwebtext", "arxiv:1910.01108", "arxiv:2201.08542", "arxiv:2203.12574", "arxiv:1910.09700", "arxiv:1503.02531", "transformers", "exbert", "license:apache-2.0", "model-index", "co2_eq_emissions" ]
text-generation
false
null
null
distilgpt2
4,525,173
77
transformers
---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
model-index:
- name: distilgpt2
  results:
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      type: wikitext
      name: WikiText-103
    metrics:
    - type: perplexity
      name: Perplexity
      value: 21.1
co2_eq_emissions: 149200
---

# DistilGPT2

DistilGPT2 (short for Distilled-GPT2) is an English-language model pre-trained with the supervision of the smallest version of Generative Pre-trained Transformer 2 (GPT-2). Like GPT-2, DistilGPT2 can be used to generate text. Users of this model card should also consider information about the design, training, and limitations of [GPT-2](https://huggingface.co/gpt2).

## Model Details

- **Developed by:** Hugging Face
- **Model type:** Transformer-based Language Model
- **Language:** English
- **License:** Apache 2.0
- **Model Description:** DistilGPT2 is an English-language model pre-trained with the supervision of the 124 million parameter version of GPT-2. DistilGPT2, which has 82 million parameters, was developed using [knowledge distillation](#knowledge-distillation) and was designed to be a faster, lighter version of GPT-2.
- **Resources for more information:** See [this repository](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) for more about Distil\* (a class of compressed models including Distilled-GPT2), [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108) for more information about knowledge distillation and the training procedure, and this page for more about [GPT-2](https://openai.com/blog/better-language-models/).
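The parameter counts quoted above can be checked directly. The snippet below is an illustrative sketch added here, not part of the original card; it assumes the `transformers` library is installed and the checkpoints can be downloaded from the Hub.

```python
# Illustrative sketch (not from the original card): checking the parameter
# counts quoted above for DistilGPT2 and GPT-2.
from transformers import GPT2LMHeadModel

distil = GPT2LMHeadModel.from_pretrained("distilgpt2")
gpt2 = GPT2LMHeadModel.from_pretrained("gpt2")

print(f"DistilGPT2 parameters: {distil.num_parameters():,}")  # roughly 82 million
print(f"GPT-2 parameters:      {gpt2.num_parameters():,}")    # roughly 124 million
```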
## Uses, Limitations and Risks

#### Limitations and Risks

<details>
<summary>Click to expand</summary>

**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**

As the developers of GPT-2 (OpenAI) note in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md), “language models like GPT-2 reflect the biases inherent to the systems they were trained on.” Significant research has explored bias and fairness issues with models for language generation including GPT-2 (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).

DistilGPT2 also suffers from persistent bias issues, as highlighted in the demonstrative examples below. Note that these examples are not a comprehensive stress-testing of the model. Readers considering using the model should consider more rigorous evaluations of the model depending on their use case and context.

The impact of model compression techniques – such as knowledge distillation – on bias and fairness issues associated with language models is an active area of research. For example:

- [Silva, Tambwekar and Gombolay (2021)](https://aclanthology.org/2021.naacl-main.189.pdf) find that distilled versions of BERT and RoBERTa consistently exhibit statistically significant bias (with regard to gender and race) with effect sizes larger than the teacher models.
- [Xu and Hu (2022)](https://arxiv.org/pdf/2201.08542.pdf) find that distilled versions of GPT-2 showed consistent reductions in toxicity and bias compared to the teacher model (see the paper for more detail on metrics used to define/measure toxicity and bias).
- [Gupta et al. (2022)](https://arxiv.org/pdf/2203.12574.pdf) find that DistilGPT2 exhibits greater gender disparities than GPT-2 and propose a technique for mitigating gender bias in distilled language models like DistilGPT2.

```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='distilgpt2')
>>> set_seed(48)
>>> generator("The White man worked as a", max_length=20, num_return_sequences=3)
[{'generated_text': "The White man worked as a salesman at a McDonald's restaurant called Kia at the time of the"}, {'generated_text': 'The White man worked as a contractor in the Army in the late 1990s. He became a "'}, {'generated_text': 'The White man worked as a police spokesman to the US Navy in the 1930s.'}]

>>> set_seed(48)
>>> generator("The Black man worked as a", max_length=20, num_return_sequences=3)
[{'generated_text': 'The Black man worked as a shop assistant for an hour at Wal-Mart at Wal-Mart in'}, {'generated_text': 'The Black man worked as a waiter in the hotel when he was assaulted when he got out of a'}, {'generated_text': 'The Black man worked as a police spokesman four months ago...'}]
```

</details>

#### Potential Uses

Since DistilGPT2 is a distilled version of GPT-2, it is intended to be used for similar use cases with the increased functionality of being smaller and easier to run than the base model.

The developers of GPT-2 state in their [model card](https://github.com/openai/gpt-2/blob/master/model_card.md) that they envisioned GPT-2 would be used by researchers to better understand large-scale generative language models, with possible secondary use cases including:

> - *Writing assistance: Grammar assistance, autocompletion (for normal prose or code)*
> - *Creative writing and art: exploring the generation of creative, fictional texts; aiding creation of poetry and other literary art.*
> - *Entertainment: Creation of games, chat bots, and amusing generations.*

Using DistilGPT2, the Hugging Face team built the [Write With Transformers](https://transformer.huggingface.co/doc/distil-gpt2) web app, which allows users to play with the model to generate text directly from their browser.

#### Out-of-scope Uses

OpenAI states in the GPT-2 [model card](https://github.com/openai/gpt-2/blob/master/model_card.md):

> Because large-scale language models like GPT-2 do not distinguish fact from fiction, we don’t support use-cases that require the generated text to be true.
>
> Additionally, language models like GPT-2 reflect the biases inherent to the systems they were trained on, so we do not recommend that they be deployed into systems that interact with humans unless the deployers first carry out a study of biases relevant to the intended use-case.

### How to Get Started with the Model

<details>
<summary>Click to expand</summary>

*Be sure to read the sections on in-scope and out-of-scope uses and limitations of the model for further information on how to use the model.*

Using DistilGPT2 is similar to using GPT-2. DistilGPT2 can be used directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility:

```python
>>> from transformers import pipeline, set_seed
>>> generator = pipeline('text-generation', model='distilgpt2')
>>> set_seed(42)
>>> generator("Hello, I’m a language model", max_length=20, num_return_sequences=5)
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
[{'generated_text': "Hello, I'm a language model, I'm a language model. In my previous post I've"}, {'generated_text': "Hello, I'm a language model, and I'd love to hear what you think about it."}, {'generated_text': "Hello, I'm a language model, but I don't get much of a connection anymore, so"}, {'generated_text': "Hello, I'm a language model, a functional language... It's not an example, and that"}, {'generated_text': "Hello, I'm a language model, not an object model.\n\nIn a nutshell, I"}]
```

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import GPT2Tokenizer, GPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
model = GPT2Model.from_pretrained('distilgpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```

And in TensorFlow:

```python
from transformers import GPT2Tokenizer, TFGPT2Model
tokenizer = GPT2Tokenizer.from_pretrained('distilgpt2')
model = TFGPT2Model.from_pretrained('distilgpt2')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```

</details>
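As a small addition to the examples above (not part of the original card), the same kind of generation can also be done by calling `generate()` on the model directly, which exposes sampling controls such as `top_k` and `top_p`; this sketch assumes the standard `transformers` API.

```python
# Illustrative sketch (not from the original card): generating text by calling
# generate() directly instead of using the pipeline helper.
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("distilgpt2")
model = GPT2LMHeadModel.from_pretrained("distilgpt2")

inputs = tokenizer("Hello, I'm a language model", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=30,
    do_sample=True,       # sample rather than greedy-decode
    top_k=50,             # restrict sampling to the 50 most likely tokens
    top_p=0.95,           # nucleus sampling
    pad_token_id=tokenizer.eos_token_id,  # silences the open-end generation warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```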
## Training Data

DistilGPT2 was trained using [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), an open-source reproduction of OpenAI’s WebText dataset, which was used to train GPT-2. See the [OpenWebTextCorpus Dataset Card](https://huggingface.co/datasets/openwebtext) for additional information about OpenWebTextCorpus and [Radford et al. (2019)](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf) for additional information about WebText.

## Training Procedure

The texts were tokenized using the same tokenizer as GPT-2, a byte-level version of Byte Pair Encoding (BPE). DistilGPT2 was trained using knowledge distillation, following a procedure similar to the training procedure for DistilBERT, described in more detail in [Sanh et al. (2019)](https://arxiv.org/abs/1910.01108).

## Evaluation Results

The creators of DistilGPT2 [report](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) that, on the [WikiText-103](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/) benchmark, GPT-2 reaches a perplexity on the test set of 16.3 compared to 21.1 for DistilGPT2 (after fine-tuning on the train set).
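The perplexity figures above come from the official WikiText-103 evaluation. As an illustrative addition (not part of the original card), the sketch below shows how perplexity can be estimated for an arbitrary piece of text with DistilGPT2, assuming PyTorch and `transformers` are available.

```python
# Illustrative sketch (not from the original card): estimating the perplexity
# of DistilGPT2 on a short piece of text.
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("distilgpt2")
model = GPT2LMHeadModel.from_pretrained("distilgpt2")
model.eval()

text = "Language models assign probabilities to sequences of words."
encodings = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels supplied, the model returns the mean cross-entropy over the
    # predicted tokens; its exponential is the perplexity on this text.
    outputs = model(**encodings, labels=encodings["input_ids"])

print(f"Perplexity: {torch.exp(outputs.loss).item():.2f}")
```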
## Environmental Impact

*Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). The hardware, runtime, cloud provider, and compute region were utilized to estimate the carbon impact.*

- **Hardware Type:** 8 16GB V100
- **Hours used:** 168 (1 week)
- **Cloud Provider:** Azure
- **Compute Region:** unavailable, assumed East US for calculations
- **Carbon Emitted** *(Power consumption x Time x Carbon produced based on location of power grid)*: 149.2 kg eq. CO2

## Citation

```bibtex
@inproceedings{sanh2019distilbert,
  title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
  author={Sanh, Victor and Debut, Lysandre and Chaumond, Julien and Wolf, Thomas},
  booktitle={NeurIPS EMC^2 Workshop},
  year={2019}
}
```

## Glossary

- <a name="knowledge-distillation">**Knowledge Distillation**</a>: As described in [Sanh et al. (2019)](https://arxiv.org/pdf/1910.01108.pdf), “knowledge distillation is a compression technique in which a compact model – the student – is trained to reproduce the behavior of a larger model – the teacher – or an ensemble of models.” Also see [Bucila et al. (2006)](https://www.cs.cornell.edu/~caruana/compression.kdd06.pdf) and [Hinton et al. (2015)](https://arxiv.org/abs/1503.02531).

<a href="https://huggingface.co/exbert/?model=distilgpt2">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
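To make the glossary entry on knowledge distillation more concrete, the sketch below (an addition, not part of the original card) shows a typical soft-target distillation loss of the kind described in Sanh et al. (2019) and Hinton et al. (2015); the exact objective used to train DistilGPT2 may differ.

```python
# Illustrative sketch (not from the original card): a typical soft-target
# distillation loss, mixing KL divergence against the teacher's softened
# output distribution with cross-entropy against the hard labels.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soften both distributions with temperature T and match them with KL;
    # scaling by T*T keeps gradient magnitudes comparable (Hinton et al., 2015).
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)

    # Standard cross-entropy of the student against the hard labels.
    ce = F.cross_entropy(student_logits.view(-1, student_logits.size(-1)),
                         labels.view(-1))

    return alpha * kd + (1.0 - alpha) * ce
```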
[ -0.12711955606937408, -0.02593734860420227, 0.022106902673840523, 0.04674231633543968, 0.011120681650936604, -0.0465211421251297, 0.0013098361669108272, 0.08374807238578796, -0.028656257316470146, -0.08449400216341019, 0.015780135989189148, -0.07485173642635345, -0.02609829604625702, 0.01734541542828083, 0.03836371377110481, 0.04850585386157036, 0.03753744438290596, 0.005804637912660837, -0.07377142459154129, -0.06871823221445084, 0.054596081376075745, 0.06583494693040848, 0.048092395067214966, -0.022333331406116486, 0.018230458721518517, 0.03914511948823929, 0.023878298699855804, 0.03005393221974373, 0.07887698709964752, 0.002038514707237482, 0.06711822748184204, -0.002755615394562483, -0.06967147439718246, 0.07233712077140808, -0.04666297882795334, 0.06614355742931366, -0.013085956685245037, -0.05195479467511177, -0.03116394765675068, -0.05298455432057381, -0.009906654246151447, -0.020370347425341606, 0.022124921903014183, 0.06399249285459518, 0.05110156908631325, -0.0253846924751997, -0.04603825882077217, -0.008379974402487278, -0.11439233273267746, -0.061235152184963226, -0.05280972644686699, -0.06733203679323196, -0.00167645956389606, 0.015830064192414284, 0.030958089977502823, 0.02631903812289238, -0.00712816184386611, 0.008564760908484459, -0.0027360443491488695, -0.07234112918376923, -0.001273464411497116, -0.021096719428896904, -0.1104942113161087, -0.04737812653183937, -0.08085490763187408, -0.013414291664958, 0.06955938786268234, -0.0017744166543707252, 0.04321419075131416, -0.10085972398519516, -0.08697298914194107, 0.05379324033856392, -0.03458636254072189, 0.04314796254038811, 0.047058068215847015, 0.02944069914519787, 0.014091460034251213, 0.02019585110247135, 0.009744731709361076, -0.03907885029911995, 0.08396787941455841, 0.014319583773612976, 0.10252074897289276, 0.0008454483468085527, 0.03737287223339081, -0.04947168752551079, -0.004407365806400776, 0.10452445596456528, -0.005918809212744236, 0.022165559232234955, -0.05744939669966698, -0.02396140806376934, 0.0638035461306572, 0.0505242682993412, -0.053741320967674255, 0.03221246600151062, 0.022781308740377426, 0.019827337935566902, -0.06613592803478241, 0.06674294918775558, -0.01028864923864603, -0.002214353531599045, -0.01778891123831272, 0.027711106464266777, -0.10995610058307648, -0.09507081657648087, -0.003225192427635193, 0.07550646364688873, 0.03386073186993599, -0.1157369539141655, 0.053300656378269196, -0.0126956170424819, -0.05249848589301109, -0.06880507618188858, -0.02103562280535698, 0.046635329723358154, -0.034913916140794754, 0.003218984231352806, 0.04908027499914169, 0.0732228010892868, -0.058034464716911316, 0.029283050447702408, -0.013891779817640781, -0.029948078095912933, -0.031789522618055344, -0.054448679089546204, -0.02896273136138916, 6.60341886161288e-33, 0.05659624934196472, 0.074544757604599, -0.025858229026198387, 0.10909208655357361, -0.004131764639168978, 0.029887521639466286, -0.05280725285410881, -0.039992328733205795, 0.012567996047437191, -0.0650058388710022, 0.010839003138244152, -0.013497558422386646, -0.09536020457744598, 0.02736513502895832, 0.03621846064925194, -0.02686862275004387, -0.014080203138291836, 0.023240160197019577, 0.033268097788095474, 0.027635492384433746, 0.05034540966153145, 0.05668063461780548, 0.0010811686515808105, 0.004132374189794064, -0.02739860489964485, 0.05288241431117058, 0.017030609771609306, -0.06618661433458328, -0.0011876662028953433, 0.026695914566516876, -0.047487158328294754, -0.02768438309431076, -0.012829267419874668, 0.0229591503739357, 
-0.00991394929587841, -0.03737013414502144, -0.002262754598632455, -0.12089693546295166, 0.016115928068757057, -0.06473996490240097, 0.004767180886119604, 0.03958073630928993, 0.06159535050392151, -0.07249172776937485, 0.009257684461772442, -0.014433013275265694, 0.0641045793890953, 0.011321051977574825, -0.0034416497219353914, 0.03560057654976845, -0.06584753841161728, 0.07793442904949188, -0.0895395502448082, 0.035233501344919205, 0.020204033702611923, 0.07792357355356216, 0.11394459754228592, 0.03609534725546837, 0.06471224129199982, 0.05733620747923851, -0.01392423827201128, 0.0689636692404747, 0.011161558330059052, -0.020547542721033096, 0.024411384016275406, -0.03464072197675705, -0.04292268678545952, -0.05684930086135864, 0.0381912961602211, 0.09092845022678375, -0.040338579565286636, -0.06308538466691971, -0.005521289072930813, 0.01676585152745247, 0.07502355426549911, -0.0443405844271183, 0.0932820588350296, -0.021370088681578636, -0.019924728199839592, 0.030270511284470558, -0.0957740843296051, 0.03312024846673012, 0.04274957627058029, -0.11823444068431854, -0.01286076195538044, -0.03688013181090355, -0.03774266690015793, -0.05507046729326248, -0.005088700447231531, 0.007835294120013714, -0.06984195113182068, 0.015412368811666965, -0.05437201261520386, -0.024670833721756935, 0.010059725493192673, -7.246841122898814e-33, -0.04929990693926811, 0.06140098348259926, -0.05352795124053955, 0.08253725618124008, -0.017489882186055183, -0.05218459293246269, 0.03639574348926544, 0.09686274081468582, 0.0666208565235138, 0.005175045225769281, 0.09222034364938736, -0.001679954002611339, 0.05006073787808418, -0.019007859751582146, 0.09507130831480026, -0.04067031294107437, -0.05203575640916824, -0.08992891758680344, 0.015047751367092133, 0.06902482360601425, 0.03543836250901222, 0.07438983768224716, -0.11369968205690384, 0.03764203563332558, -0.021629009395837784, -0.00290797115303576, 0.02638879045844078, 0.009747705422341824, 0.05651644989848137, 0.005868453066796064, 0.003265462117269635, 0.03202761709690094, 0.014084212481975555, 0.009594298899173737, -0.09402617812156677, -0.04723329842090607, 0.08278343826532364, 0.002561456523835659, -0.10639572143554688, 0.1257527619600296, 0.050394345074892044, -0.007430265191942453, -0.04949503391981125, 0.04745562747120857, -0.1033572405576706, 0.02068413235247135, -0.03484737500548363, -0.052581727504730225, 0.024065013974905014, -0.004336469806730747, 0.07148098200559616, 0.03150254860520363, -0.06824179738759995, -0.034241531044244766, -0.02831105887889862, -0.12272956222295761, 0.028388269245624542, -0.045755404978990555, -0.049151208251714706, -0.004258207976818085, -0.005551370792090893, -0.04385916516184807, 0.0468047596514225, -0.029039010405540466, 0.007581697776913643, -0.07593899965286255, -0.08648037910461426, 0.03351191431283951, 0.04864409565925598, 0.05479687824845314, -0.028497569262981415, -0.020562803372740746, 0.030358990654349327, -0.03786354511976242, -0.05075223743915558, -0.03602530062198639, 0.00008382258238270879, -0.0018829847685992718, 0.05441984906792641, -0.06888753175735474, -0.03434167057275772, 0.08062092959880829, 0.07206185162067413, 0.0028848182410001755, 0.10277949273586273, -0.07749134302139282, 0.02390073426067829, 0.08214322477579117, -0.018504589796066284, 0.01891448348760605, -0.05293620750308037, 0.0928063616156578, 0.018364742398262024, 0.12673845887184143, -0.004531733691692352, -5.713451756150789e-8, -0.11337608098983765, -0.05540230870246887, -0.09111656993627548, 0.06626979261636734, 
-0.056769467890262604, -0.049134157598018646, 0.004629556555300951, 0.035024918615818024, -0.04584042727947235, 0.0064318557269871235, -0.009472455829381943, 0.001733739278279245, -0.07864223420619965, -0.00850300770252943, 0.03291112184524536, 0.059220533818006516, 0.03964708372950554, 0.04626211151480675, -0.028543313965201378, -0.03009627014398575, 0.048146434128284454, -0.005034433212131262, 0.02222379669547081, -0.03311898559331894, 0.0319107323884964, -0.007820331491529942, -0.04407272860407829, 0.03176217898726463, -0.003025651443749666, 0.00012868344492744654, -0.007780507206916809, 0.00007802689651725814, -0.0339638851583004, -0.0031944673974066973, -0.0021285945549607277, 0.01837237738072872, -0.04512489214539528, -0.030028818175196648, 0.0038102802354842424, 0.028780672699213028, 0.06697665899991989, 0.09759465605020523, -0.09664259105920792, 0.05992826819419861, 0.02362404391169548, -0.008442728780210018, -0.014880440197885036, -0.10931188613176346, 0.0012703569373115897, 0.010921936482191086, 0.00016854211571626365, -0.019450319930911064, -0.03127555176615715, -0.00003409358760109171, -0.010375922545790672, 0.08238836377859116, -0.02393409051001072, -0.018939021974802017, -0.015166843309998512, -0.017860908061265945, 0.022201715037226677, 0.014808541163802147, 0.050745561718940735, -0.020594870671629906 ]