Dataset schema (column name, feature type, observed size or value range):

| Column | Feature type | Size / values |
|---|---|---|
| sha | null | n/a |
| last_modified | null | n/a |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1–900k |
| metadata | stringlengths | 2–348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5–122 |
| tags | listlengths | 1–1.84k |
| created_at | stringlengths | 25–25 |
| arxiv | listlengths | 0–201 |
| languages | listlengths | 0–1.83k |
| tags_str | stringlengths | 17–9.34k |
| text_str | stringlengths | 0–389k |
| text_lists | listlengths | 0–722 |
| processed_texts | listlengths | 1–723 |
| tokens_length | listlengths | 1–723 |
| input_texts | listlengths | 1–61 |
| embeddings | listlengths | 768–768 |

The rows below follow this column order, with `|` separating the column values.
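For orientation, a minimal sketch of how a dataset with this schema could be inspected using the `datasets` library. The repository id below is a placeholder, since the actual dataset name is not given in this dump:

```python
from datasets import load_dataset

# Placeholder repository id; substitute the real dataset name.
ds = load_dataset("your-namespace/model-cards-with-embeddings", split="train")

print(ds.features)             # column names and feature types, as tabulated above
row = ds[0]
print(row["id"])               # a Hub model id, e.g. "anas-awadalla/spanbert-base-cased-..."
print(len(row["embeddings"]))  # fixed-length 768-dimensional embedding per row
```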
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
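The card gives no usage details, but the repository is tagged for `question-answering`, so a minimal inference sketch with the standard `transformers` pipeline looks like the following; the question and context strings are purely illustrative:

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2",
)

# Any SQuAD-style question/context pair works; these strings are illustrative only.
result = qa(
    question="What was the model fine-tuned on?",
    context="This checkpoint was fine-tuned on 256 examples drawn from the SQuAD dataset.",
)
print(result["answer"], result["score"])
```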
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
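For readers reproducing this setup, the list above maps onto `transformers.TrainingArguments` roughly as sketched below. This is an approximation, not the exact command used, and it assumes single-device training so that the logged batch size of 24 equals the per-device batch size:

```python
from transformers import TrainingArguments

# Approximate reconstruction of the logged hyperparameters (sketch only).
training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2",
    learning_rate=3e-5,
    per_device_train_batch_size=24,  # assumes a single device
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```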
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[768 floating-point values: the row's embedding vector] |
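Each row of this dataset carries a fixed-length 768-dimensional embedding (values elided above). A minimal sketch of comparing two such vectors with cosine similarity, assuming plain NumPy; the random vectors stand in for any two rows' `embeddings` fields:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for two rows' `embeddings` columns (each 768-dimensional).
emb_a = np.random.rand(768).astype(np.float32)
emb_b = np.random.rand(768).astype(np.float32)
print(cosine_similarity(emb_a, emb_b))
```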
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
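The "Training results" section above is empty. For completeness, here is a sketch of how SQuAD-style exact-match and F1 scores could be computed for a checkpoint like this one, assuming the `evaluate` library; the prediction/reference pair is illustrative, not a reported result:

```python
import evaluate

squad_metric = evaluate.load("squad")

# Illustrative example in the format the SQuAD metric expects.
predictions = [{"id": "example-0", "prediction_text": "256 examples"}]
references = [{
    "id": "example-0",
    "answers": {"text": ["256 examples"], "answer_start": [34]},
}]
print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```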
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[768 floating-point values: the row's embedding vector] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
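As with the sibling checkpoints, usage is not documented. A minimal sketch of extractive QA without the pipeline wrapper, assuming the checkpoint loads as a standard BERT-style question-answering head; the question and context are illustrative:

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "How many training examples were used?"                 # illustrative
context = "This checkpoint was fine-tuned on 256 SQuAD examples."  # illustrative

inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Greedy span extraction: most likely start and end token positions.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```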
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[768 floating-point values: the row's embedding vector]
0.04647800698876381,
-0.04365590214729309,
0.0057638115249574184,
0.016431588679552078,
-0.1420828104019165,
-0.101497121155262,
0.02811262011528015,
-0.0577232763171196,
-0.15519827604293823,
0.03450038284063339,
0.12304256856441498,
-0.03803040087223053,
-0.016171781346201897,
-0.006857586558908224,
0.008915852755308151,
-0.011689823120832443,
0.18592633306980133,
0.04231688007712364,
0.05487504228949547,
-0.09138303995132446,
0.11438249796628952,
0.03547251597046852,
-0.042458001524209976,
0.05403415858745575,
0.0677531510591507,
-0.09951659291982651,
0.012661641463637352,
0.07303567975759506,
0.1494629681110382,
-0.0664593055844307,
-0.012286141514778137,
-0.09182029217481613,
-0.07721967995166779,
0.04474629834294319,
0.14441703259944916,
0.05349651351571083,
-0.0056678494438529015,
-0.060871608555316925,
0.03558872267603874,
-0.11896095424890518,
0.06857617199420929,
0.05178975686430931,
0.08301086723804474,
-0.1085646003484726,
0.12268650531768799,
-0.007359572686254978,
0.024845991283655167,
-0.02808772400021553,
0.01887296512722969,
-0.10054414719343185,
-0.03459993004798889,
-0.10782849043607712,
-0.014472488313913345,
-0.0179241131991148,
-0.0033906553871929646,
-0.019712179899215698,
-0.07496973127126694,
-0.04288638010621071,
0.03332255035638809,
-0.07707314938306808,
-0.04879918694496155,
0.017727011814713478,
0.04038049653172493,
-0.16037575900554657,
0.003115374594926834,
0.025762315839529037,
-0.08719516545534134,
0.08752837032079697,
0.06843283772468567,
0.016098329797387123,
0.028768321499228477,
-0.12236052751541138,
-0.03318532556295395,
0.0008535300730727613,
0.010110766626894474,
0.07749436050653458,
-0.09277983754873276,
-0.029894856736063957,
-0.0308107640594244,
0.04984612390398979,
0.014655581675469875,
0.10190269351005554,
-0.11907292902469635,
-0.013682231307029724,
-0.04763225093483925,
-0.03819047659635544,
-0.056907590478658676,
0.027132200077176094,
0.11431443691253662,
0.04511943459510803,
0.15787498652935028,
-0.07021201401948929,
0.05449474975466728,
-0.20455829799175262,
-0.032997164875268936,
0.010887875221669674,
-0.04732292890548706,
-0.07492067664861679,
-0.044999413192272186,
0.08432972431182861,
-0.050854019820690155,
0.12020742148160934,
-0.015503679402172565,
0.09382221102714539,
0.043780598789453506,
-0.004386188928037882,
-0.07016514241695404,
-0.011004259809851646,
0.18249693512916565,
0.05711350589990616,
-0.021143712103366852,
0.12170993536710739,
0.004085435997694731,
0.042142294347286224,
0.06957272440195084,
0.23461920022964478,
0.1528048813343048,
-0.011998885311186314,
0.07553248852491379,
0.06726159155368805,
-0.07542289793491364,
-0.14051350951194763,
0.12328511476516724,
-0.02024291828274727,
0.10660578310489655,
-0.05260084196925163,
0.18881359696388245,
0.037524476647377014,
-0.17543327808380127,
0.054195743054151535,
-0.025747308507561684,
-0.10827741026878357,
-0.12474998831748962,
-0.01567872241139412,
-0.08220061659812927,
-0.11680600047111511,
0.028121838346123695,
-0.12427365779876709,
0.0672697201371193,
0.097078837454319,
0.007357392925769091,
0.035287920385599136,
0.18532109260559082,
-0.05604834854602814,
0.011708113364875317,
0.07305154204368591,
0.020194580778479576,
-0.0037041327450424433,
-0.03834041580557823,
-0.06628098338842392,
0.03806370869278908,
0.04342738911509514,
0.07085980474948883,
-0.05183114483952522,
0.008643876761198044,
0.01535320095717907,
-0.010245200246572495,
-0.07790607959032059,
0.00808943435549736,
0.014852123335003853,
0.04912996664643288,
0.035682085901498795,
0.046840425580739975,
0.007899695076048374,
-0.05348087474703789,
0.2759856879711151,
-0.06798730790615082,
-0.0609833188354969,
-0.12338440865278244,
0.19495448470115662,
0.033424027264118195,
-0.018596915528178215,
0.05594586953520775,
-0.0925811380147934,
-0.012470502406358719,
0.1614169329404831,
0.13409040868282318,
-0.09130249917507172,
-0.02172037959098816,
-0.024088917300105095,
-0.00884418934583664,
-0.012769877910614014,
0.10539711266756058,
0.07156574726104736,
0.0009517869912087917,
-0.06743204593658447,
-0.014508131891489029,
-0.029561588540673256,
-0.047720056027173996,
-0.06186247617006302,
0.058572981506586075,
0.028292862698435783,
-0.006298460997641087,
-0.058375317603349686,
0.06337269395589828,
-0.004064241424202919,
-0.23400761187076569,
0.037432439625263214,
-0.1727382391691208,
-0.17405781149864197,
-0.014306584373116493,
0.07010842114686966,
0.0017492888728156686,
0.05669410154223442,
-0.007448614574968815,
0.010057750158011913,
0.1145237386226654,
-0.0170243289321661,
-0.013561583124101162,
-0.1180097684264183,
0.1094338521361351,
-0.10873474180698395,
0.21176981925964355,
-0.0016602486139163375,
0.0650615319609642,
0.09898346662521362,
0.03741194307804108,
-0.1358819305896759,
0.01854456216096878,
0.061992496252059937,
-0.12608171999454498,
0.0023818351328372955,
0.14638054370880127,
-0.03423341363668442,
0.06273989379405975,
0.03098211996257305,
-0.1499023735523224,
-0.00310093373991549,
0.027279792353510857,
-0.036713857203722,
-0.06956394761800766,
-0.008980288170278072,
-0.05681468918919563,
0.1660158783197403,
0.20810925960540771,
-0.028920037671923637,
0.012437586672604084,
-0.08443623036146164,
0.02204689383506775,
0.04787500202655792,
0.05991097912192345,
-0.038952477276325226,
-0.2163068652153015,
0.021300969645380974,
0.07272624969482422,
-0.003130511846393347,
-0.19680023193359375,
-0.0958777442574501,
0.043298572301864624,
-0.03677055984735489,
-0.046355266124010086,
0.09182168543338776,
0.025436008349061012,
0.037256430834531784,
-0.019400782883167267,
-0.11686301976442337,
-0.028140839189291,
0.14555558562278748,
-0.176206573843956,
-0.042861208319664
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
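The card above gives the checkpoint name and training setup but no usage snippet. The sketch below is an illustrative example (not part of the original card) of querying this checkpoint for extractive question answering through the `transformers` pipeline API; the question and context strings are invented for the demonstration.

```python
# Hypothetical usage sketch: ask the fine-tuned checkpoint a question about a
# short context passage. The strings below are made-up examples.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This SpanBERT checkpoint was fine-tuned on the SQuAD dataset "
            "using a 256-example training subset.",
)
print(result["answer"], result["score"])
```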
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-256-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09660891443490982,
0.11594275385141373,
-0.0022939175833016634,
0.09240933507680893,
0.12006375193595886,
0.022873030975461006,
0.10037589818239212,
0.12763796746730804,
-0.09690495580434799,
0.08590693771839142,
0.08741127699613571,
0.03814011812210083,
0.04656323790550232,
0.14407165348529816,
-0.019002964720129967,
-0.2604576051235199,
0.010023348964750767,
-0.00287490151822567,
-0.032424069941043854,
0.11141398549079895,
0.08493701368570328,
-0.11074067652225494,
0.08569508790969849,
0.014249629341065884,
-0.15487700700759888,
0.0200912244617939,
-0.03714637830853462,
-0.03450867161154747,
0.11361522972583771,
-0.03284633159637451,
0.10846753418445587,
0.025074703618884087,
0.1346234530210495,
-0.209883451461792,
0.004643324296921492,
0.0735207200050354,
0.046060703694820404,
0.10034556686878204,
0.050346337258815765,
0.016798678785562515,
0.08929542452096939,
-0.1540527045726776,
0.0926218256354332,
0.029027897864580154,
-0.09037990868091583,
-0.12859958410263062,
-0.09468056261539459,
0.024567363783717155,
0.05257558077573776,
0.06878430396318436,
0.0018517444841563702,
0.15287615358829498,
-0.05974341183900833,
0.07933807373046875,
0.2658204138278961,
-0.3270338773727417,
-0.06395169347524643,
0.03239760175347328,
0.059529952704906464,
0.05399635061621666,
-0.12256857752799988,
-0.006444322410970926,
0.02703256532549858,
0.030081987380981445,
0.11865938454866409,
-0.017682917416095734,
-0.1155630350112915,
-0.013385211117565632,
-0.12839238345623016,
-0.0007901607896201313,
0.07114572823047638,
0.036090534180402756,
-0.052429407835006714,
-0.09401651471853256,
-0.07576585561037064,
-0.09263817220926285,
-0.02573348581790924,
-0.06492692232131958,
0.05644727125763893,
-0.05513268709182739,
-0.08062766492366791,
-0.03632901981472969,
-0.05658743903040886,
-0.07596825063228607,
-0.017734460532665253,
0.15722419321537018,
0.04035162925720215,
0.020410209894180298,
-0.03131626173853874,
0.108339823782444,
0.0008907117880880833,
-0.14126308262348175,
-0.015316096134483814,
-0.0005542942672036588,
-0.09730064123868942,
-0.04738088324666023,
-0.05176674947142601,
-0.01805596984922886,
0.009564812295138836,
0.17608274519443512,
-0.08073500543832779,
0.07595723867416382,
0.010734181851148605,
-0.029301680624485016,
-0.006593712605535984,
0.14820720255374908,
-0.04395946487784386,
-0.046067021787166595,
-0.010405274108052254,
0.0736556351184845,
0.0030683695804327726,
-0.014740333892405033,
-0.0659775361418724,
-0.02807306870818138,
0.10330983251333237,
0.044986121356487274,
-0.06095373257994652,
0.0395544059574604,
-0.023113321512937546,
-0.028452882543206215,
0.01799052208662033,
-0.11514420062303543,
0.04422278329730034,
-0.0028435764834284782,
-0.08456351608037949,
-0.0017121840501204133,
0.0004803251940757036,
-0.005027483217418194,
-0.008271697908639908,
0.10929813235998154,
-0.09888572990894318,
-0.001411779085174203,
-0.06397267431020737,
-0.08280790597200394,
0.009137190878391266,
-0.15709516406059265,
-0.014558189548552036,
-0.05784938856959343,
-0.1710127890110016,
-0.03130291402339935,
0.03677091374993324,
-0.07399597764015198,
-0.009206295944750309,
-0.0480223074555397,
-0.06412561237812042,
0.024305248633027077,
-0.014936883002519608,
0.1731480360031128,
-0.05377255007624626,
0.07198631763458252,
0.0003676059131976217,
0.04557833820581436,
0.013548259623348713,
0.036144547164440155,
-0.10458287596702576,
0.024678558111190796,
-0.13774023950099945,
0.06934612989425659,
-0.0847153514623642,
-0.0024689147248864174,
-0.1327403038740158,
-0.09855187684297562,
0.010701990686357021,
-0.021466989070177078,
0.09123402088880539,
0.1385006159543991,
-0.19282600283622742,
-0.0172786433249712,
0.1260719895362854,
-0.07517998665571213,
-0.06357865780591965,
0.06318139284849167,
-0.061223104596138,
0.030629128217697144,
0.05210324004292488,
0.21102431416511536,
0.041887614876031876,
-0.16518568992614746,
-0.03289134427905083,
-0.0043618637137115,
0.04219619184732437,
0.024520186707377434,
0.039703961461782455,
0.005653927568346262,
0.06383057683706284,
0.015165993012487888,
-0.07558291405439377,
-0.03189771622419357,
-0.09146220237016678,
-0.06522722542285919,
-0.05487681180238724,
-0.07217097282409668,
0.04129716008901596,
0.004347868263721466,
0.04259774461388588,
-0.06473434716463089,
-0.10134294629096985,
0.1208370178937912,
0.09684126079082489,
-0.04825441166758537,
0.03649936243891716,
-0.07899176329374313,
0.019220145419239998,
-0.021365661174058914,
-0.03897527605295181,
-0.20667411386966705,
-0.12984679639339447,
0.05277381092309952,
-0.05567743256688118,
0.03382328525185585,
0.007410047575831413,
0.08059201389551163,
0.0613645501434803,
-0.04334452003240585,
-0.011614620685577393,
-0.0929216742515564,
0.0027462018188089132,
-0.11870867758989334,
-0.18764719367027283,
-0.07857084274291992,
-0.04013534262776375,
0.09353578090667725,
-0.1748964935541153,
-0.006915098987519741,
0.015094561502337456,
0.14283357560634613,
0.026434484869241714,
-0.06766866147518158,
-0.002790502505376935,
0.03847728297114372,
0.0025988288689404726,
-0.09546604007482529,
0.04522198811173439,
0.008360743522644043,
-0.09360211342573166,
-0.06422030925750732,
-0.13508088886737823,
-0.010169179178774357,
0.05920986458659172,
0.05220121517777443,
-0.09714514762163162,
-0.0465090312063694,
-0.07001978158950806,
-0.04089651629328728,
-0.07504414021968842,
0.013182701542973518,
0.20257677137851715,
0.03465012088418007,
0.11319273710250854,
-0.06699973344802856,
-0.07661055028438568,
-0.0028453811537474394,
0.022733446210622787,
0.013229778967797756,
0.0763556957244873,
0.04091906547546387,
-0.05230487510561943,
0.07320322841405869,
0.09920230507850647,
-0.024021577090024948,
0.12363121658563614,
-0.04714059457182884,
-0.08360258489847183,
-0.033038340508937836,
-0.023761112242937088,
-0.029008449986577034,
0.123371422290802,
-0.04089965298771858,
0.004565287381410599,
0.03595065325498581,
0.044552430510520935,
0.017576118931174278,
-0.1624908149242401,
0.008237140253186226,
0.02220269851386547,
-0.05307353287935257,
-0.03648049756884575,
-0.0017747521633282304,
0.026143386960029602,
0.09137332439422607,
0.030708329752087593,
-0.014991515316069126,
0.0037573191802948713,
-0.01118615735322237,
-0.061853744089603424,
0.18538141250610352,
-0.09761416912078857,
-0.08465886116027832,
-0.07614236325025558,
0.005630496423691511,
-0.059949371963739395,
-0.03687925264239311,
0.016115104779601097,
-0.0875081717967987,
-0.038471002131700516,
-0.08768977969884872,
-0.019178781658411026,
-0.017953654751181602,
0.02023346535861492,
0.03141528367996216,
-0.023082446306943893,
0.08117401599884033,
-0.13874636590480804,
0.0018447416368871927,
-0.051871899515390396,
-0.09201523661613464,
0.00047090352745726705,
0.07482922822237015,
0.0992441475391388,
0.07969322055578232,
-0.017897319048643112,
0.029239444062113762,
-0.03421841934323311,
0.24188221991062164,
-0.044955749064683914,
0.010575386695563793,
0.10431099683046341,
-0.013551024720072746,
0.056477684527635574,
0.09475480765104294,
0.03750636801123619,
-0.09394723922014236,
0.02080417424440384,
0.08218743652105331,
-0.029562881216406822,
-0.22890731692314148,
-0.025629142299294472,
-0.004353605676442385,
-0.07970144599676132,
0.1064695343375206,
0.031681351363658905,
-0.0386386513710022,
0.04646061733365059,
0.021738961338996887,
0.0031019661109894514,
-0.057036351412534714,
0.08123414218425751,
0.07590582966804504,
0.05715465545654297,
0.10053269565105438,
-0.008588739670813084,
-0.0289025716483593,
0.061278827488422394,
0.008129152469336987,
0.24674583971500397,
-0.025803515687584877,
0.10055696219205856,
0.03183342516422272,
0.1527666449546814,
-0.02702602744102478,
0.06553223729133606,
0.0034036925062537193,
-0.010171202942728996,
-0.015072260983288288,
-0.06685642153024673,
-0.025734392926096916,
0.023464461788535118,
-0.04767264798283577,
0.02972574532032013,
-0.08244118094444275,
0.027085645124316216,
0.02676786482334137,
0.28048983216285706,
0.03464389219880104,
-0.2747097909450531,
-0.06618443876504898,
-0.013729200698435307,
-0.04213427007198334,
-0.06373891979455948,
0.006320635788142681,
0.12110531330108643,
-0.13190612196922302,
0.06417690217494965,
-0.07618501782417297,
0.08974217623472214,
-0.03876972571015358,
0.011497672647237778,
0.046375423669815063,
0.15436951816082,
-0.017451997846364975,
0.0517527312040329,
-0.18619833886623383,
0.24174289405345917,
0.02527017891407013,
0.10795578360557556,
-0.06467042118310928,
0.010743455961346626,
0.018367525190114975,
0.008988065645098686,
0.108883336186409,
0.001213589683175087,
-0.06827961653470993,
-0.1389939934015274,
-0.09907452762126923,
0.04781953990459442,
0.14141105115413666,
-0.03333171084523201,
0.0991780087351799,
-0.02767886035144329,
0.012539335526525974,
0.03484518453478813,
-0.02986793778836727,
-0.15749996900558472,
-0.07398346811532974,
0.009104226715862751,
0.028293561190366745,
-0.016609657555818558,
-0.05127957835793495,
-0.10422265529632568,
-0.039174217730760574,
0.11854296922683716,
0.005179506726562977,
-0.04576897248625755,
-0.15087175369262695,
0.08622666448354721,
0.1452796310186386,
-0.05843203514814377,
0.014883211813867092,
0.01453807856887579,
0.1112993136048317,
0.0328819565474987,
-0.08541698008775711,
0.06718123704195023,
-0.053658224642276764,
-0.1722797304391861,
-0.057648904621601105,
0.11774621903896332,
0.07890895009040833,
0.045472364872694016,
0.0006619741907343268,
0.0566248744726181,
0.0017906082794070244,
-0.09722816199064255,
0.035825613886117935,
0.003317074151709676,
0.05148433893918991,
0.029242824763059616,
-0.08611011505126953,
0.0782768577337265,
-0.03378568962216377,
0.018888846039772034,
0.1299893856048584,
0.23238438367843628,
-0.09938392788171768,
0.10053471475839615,
0.08187302947044373,
-0.07643775641918182,
-0.15929976105690002,
0.06186666339635849,
0.1253844052553177,
0.004652710631489754,
0.08478561043739319,
-0.20061849057674408,
0.13520179688930511,
0.10684642940759659,
-0.013038484379649162,
0.019542619585990906,
-0.2715294063091278,
-0.13153348863124847,
0.06588338315486908,
0.11033093184232712,
0.051358312368392944,
-0.12184128165245056,
-0.034906964749097824,
-0.010578745976090431,
-0.12002124637365341,
0.12750424444675446,
-0.07663556188344955,
0.11731354892253876,
-0.021544426679611206,
0.12248032540082932,
0.023953013122081757,
-0.037326179444789886,
0.11248892545700073,
0.072651706635952,
0.08598288148641586,
-0.03903026878833771,
-0.0013843430206179619,
0.06399045884609222,
-0.06248985603451729,
0.037347204983234406,
-0.0383266806602478,
0.06350439786911011,
-0.14890924096107483,
0.006581475026905537,
-0.07753404229879379,
0.06049145385622978,
-0.046372584998607635,
-0.0652947649359703,
-0.026658598333597183,
0.046505074948072433,
0.07219377160072327,
-0.03611728921532631,
0.04546603187918663,
0.009267359972000122,
0.08985539525747299,
0.1018035039305687,
0.07214216142892838,
-0.025189749896526337,
-0.0827055349946022,
0.013604152016341686,
0.00481441942974925,
0.04718458652496338,
-0.08590571582317352,
0.015645550563931465,
0.14674660563468933,
0.06014425307512283,
0.10212893038988113,
0.04543355107307434,
-0.043098583817481995,
0.005883889738470316,
0.01594882644712925,
-0.1419604867696762,
-0.100608691573143,
0.02755427546799183,
-0.05804605409502983,
-0.15476654469966888,
0.03355433791875839,
0.12358328700065613,
-0.03874462842941284,
-0.015561042353510857,
-0.007126145530492067,
0.007658562622964382,
-0.011567349545657635,
0.18550069630146027,
0.042928557842969894,
0.05467825010418892,
-0.09129692614078522,
0.11395347118377686,
0.035932958126068115,
-0.04151114076375961,
0.054379526525735855,
0.06732511520385742,
-0.09981381893157959,
0.012495793402194977,
0.07301333546638489,
0.14936591684818268,
-0.06723026186227798,
-0.01332948263734579,
-0.0923643484711647,
-0.07617995887994766,
0.04430307820439339,
0.14337359368801117,
0.053873591125011444,
-0.005975607316941023,
-0.06118239834904671,
0.034864187240600586,
-0.11909633129835129,
0.06806404143571854,
0.051476139575242996,
0.08330940455198288,
-0.10884224623441696,
0.12406528741121292,
-0.0066903941333293915,
0.02480112574994564,
-0.028028756380081177,
0.018480390310287476,
-0.10052336007356644,
-0.03439553454518318,
-0.10921546071767807,
-0.014218290336430073,
-0.01784392260015011,
-0.0029599126428365707,
-0.019805781543254852,
-0.07475250959396362,
-0.04290010407567024,
0.0332057885825634,
-0.07635199278593063,
-0.04863252118229866,
0.018087495118379593,
0.0400206558406353,
-0.15991543233394623,
0.0027862493880093098,
0.02540387213230133,
-0.08708616346120834,
0.08784591406583786,
0.06781355291604996,
0.01567053236067295,
0.02838805690407753,
-0.12259089946746826,
-0.03317432850599289,
0.0009079048759303987,
0.010982939973473549,
0.07752785831689835,
-0.09117718040943146,
-0.029055003076791763,
-0.030381087213754654,
0.04975909739732742,
0.014538734219968319,
0.1022830381989479,
-0.11926307529211044,
-0.013545902445912361,
-0.04738793522119522,
-0.03818495199084282,
-0.05732795223593712,
0.027075698599219322,
0.1141282171010971,
0.04433266445994377,
0.15783901512622833,
-0.07010569423437119,
0.05433986335992813,
-0.20482726395130157,
-0.03321265056729317,
0.011089639738202095,
-0.04711590334773064,
-0.07483040541410446,
-0.0458422526717186,
0.08401711285114288,
-0.050274670124053955,
0.12201591581106186,
-0.015508403070271015,
0.0944395437836647,
0.04338032007217407,
-0.004966467618942261,
-0.07009084522724152,
-0.011686825193464756,
0.18337030708789825,
0.05825401842594147,
-0.02114408276975155,
0.1209075078368187,
0.00436515873298049,
0.043236296623945236,
0.06889896094799042,
0.23225915431976318,
0.1527746468782425,
-0.012755798175930977,
0.07557403296232224,
0.06708145141601562,
-0.07493174076080322,
-0.14053627848625183,
0.12292690575122833,
-0.020348049700260162,
0.10701119899749756,
-0.05256844684481621,
0.18883676826953888,
0.03748589754104614,
-0.1753225177526474,
0.05386146157979965,
-0.025051940232515335,
-0.1084703579545021,
-0.12488483637571335,
-0.01510030496865511,
-0.0822862982749939,
-0.11684613674879074,
0.02765747718513012,
-0.12372583895921707,
0.06686879694461823,
0.09737326949834824,
0.006851743441075087,
0.035070501267910004,
0.18462154269218445,
-0.056078966706991196,
0.012033055536448956,
0.07278682291507721,
0.020071882754564285,
-0.0033323802053928375,
-0.03940886631608009,
-0.06713466346263885,
0.037969447672367096,
0.04359026998281479,
0.0713900625705719,
-0.05189223960042,
0.010567697696387768,
0.015745310112833977,
-0.009989725425839424,
-0.07837661355733871,
0.008047835901379585,
0.01424796599894762,
0.048845209181308746,
0.03451721742749214,
0.047116830945014954,
0.008039302192628384,
-0.05364425107836723,
0.27442625164985657,
-0.06729774177074432,
-0.061564017087221146,
-0.12329519540071487,
0.19385266304016113,
0.033827632665634155,
-0.01835550181567669,
0.05591506510972977,
-0.09257826209068298,
-0.011766724288463593,
0.1616879254579544,
0.13344238698482513,
-0.0920015498995781,
-0.0216678474098444,
-0.023786604404449463,
-0.008981889113783836,
-0.0137874074280262,
0.10595278441905975,
0.07169071584939957,
-0.00019968205015175045,
-0.06664247810840607,
-0.014668297953903675,
-0.029673917219042778,
-0.04770153760910034,
-0.06308766454458237,
0.057838112115859985,
0.028396541252732277,
-0.005582099314779043,
-0.05794670060276985,
0.06287825852632523,
-0.0031856826972216368,
-0.23460066318511963,
0.03757234290242195,
-0.17302381992340088,
-0.17394232749938965,
-0.014125008136034012,
0.07039304077625275,
0.0019941036589443684,
0.05626882240176201,
-0.007340960204601288,
0.010477796196937561,
0.11528418213129044,
-0.017310166731476784,
-0.013955574482679367,
-0.11687645316123962,
0.10876777023077011,
-0.10755512863397598,
0.2114909142255783,
-0.0016065433155745268,
0.06570212543010712,
0.09894607961177826,
0.037789005786180496,
-0.13513995707035065,
0.018792463466525078,
0.06168599799275398,
-0.12514916062355042,
0.0021209048572927713,
0.14492809772491455,
-0.0342196561396122,
0.06257011741399765,
0.03143903240561485,
-0.14954189956188202,
-0.0038023199886083603,
0.02615940384566784,
-0.03673716261982918,
-0.06918322294950485,
-0.01043025217950344,
-0.0562095083296299,
0.16606460511684418,
0.20723198354244232,
-0.02867380529642105,
0.011853715404868126,
-0.0845348909497261,
0.021853027865290642,
0.048364993184804916,
0.059775080531835556,
-0.03916340321302414,
-0.2160986065864563,
0.021795162931084633,
0.07262589782476425,
-0.00290510430932045,
-0.19645161926746368,
-0.09643978625535965,
0.04325177147984505,
-0.03716134652495384,
-0.0461021363735199,
0.09190284460783005,
0.025476522743701935,
0.0375407338142395,
-0.019468046724796295,
-0.11703924089670181,
-0.028624743223190308,
0.14557506144046783,
-0.176247701048851,
-0.0430610217154026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
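The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction of an equivalent configuration, not the original training script (which is not included in the card); SQuAD preprocessing is omitted, and the Adam betas/epsilon shown in the card match the `Trainer` defaults, so they are not set explicitly.

```python
# Hypothetical reconstruction of the configuration implied by the hyperparameters
# above; the actual fine-tuning script used for this checkpoint is not shown here.
from transformers import AutoModelForQuestionAnswering, TrainingArguments

model = AutoModelForQuestionAnswering.from_pretrained("SpanBERT/spanbert-base-cased")

args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # corresponds to "training_steps: 200" in the card
)
# A Trainer built from `model`, `args`, and a tokenized 32-example SQuAD subset
# would then produce a run of this shape (dataset preparation is omitted here).
```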
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09857524931430817,
0.09789428859949112,
-0.002358255675062537,
0.09164634346961975,
0.12360285967588425,
0.018879951909184456,
0.0933256521821022,
0.1295999437570572,
-0.0999157652258873,
0.06915223598480225,
0.08869295567274094,
0.033378686755895615,
0.04174807667732239,
0.14055795967578888,
-0.006309207528829575,
-0.2775788903236389,
-0.0006739717209711671,
-0.003385382005944848,
-0.053037721663713455,
0.12033379822969437,
0.08888635784387589,
-0.1096535250544548,
0.07459181547164917,
0.007612418383359909,
-0.152072474360466,
0.016420722007751465,
-0.030734622851014137,
-0.0348871573805809,
0.12345098704099655,
-0.02699551172554493,
0.10732850432395935,
0.02834458462893963,
0.13622553646564484,
-0.21106760203838348,
0.007419598288834095,
0.07853086292743683,
0.05532877892255783,
0.09811637550592422,
0.047243472188711166,
0.011612510308623314,
0.101263627409935,
-0.14747972786426544,
0.09509944170713425,
0.030328329652547836,
-0.09003821760416031,
-0.15564410388469696,
-0.08850023150444031,
0.027406198903918266,
0.04981593042612076,
0.07612768560647964,
0.003172264201566577,
0.13406871259212494,
-0.07145257294178009,
0.08591347932815552,
0.2521916329860687,
-0.3124181032180786,
-0.06773930042982101,
0.02428918331861496,
0.05998457968235016,
0.06189024820923805,
-0.12508969008922577,
-0.002360164187848568,
0.01657453551888466,
0.027512677013874054,
0.12352945655584335,
-0.011904319748282433,
-0.1035810261964798,
-0.01067600678652525,
-0.12206027656793594,
-0.0021632022690027952,
0.062008317559957504,
0.02889268286526203,
-0.04930701106786728,
-0.10777385532855988,
-0.06770946085453033,
-0.08033910393714905,
-0.022785276174545288,
-0.050759270787239075,
0.04638848826289177,
-0.05119309574365616,
-0.09862659871578217,
-0.042879778891801834,
-0.05933251976966858,
-0.08095046132802963,
-0.007402525283396244,
0.17059093713760376,
0.033491428941488266,
0.01992524042725563,
-0.03137236461043358,
0.11616477370262146,
0.0301158856600523,
-0.1389746516942978,
-0.011467691510915756,
-0.0039337798953056335,
-0.09689826518297195,
-0.039807144552469254,
-0.05592269077897072,
-0.006721255369484425,
0.003161252476274967,
0.1657295972108841,
-0.07287505269050598,
0.07434195280075073,
0.013937640003859997,
-0.02650478482246399,
-0.015145305544137955,
0.15328171849250793,
-0.042052801698446274,
-0.04541639983654022,
-0.015963105484843254,
0.08131176978349686,
-0.0018992989789694548,
-0.021890416741371155,
-0.06404407322406769,
-0.02861454337835312,
0.09449753165245056,
0.05488870292901993,
-0.05868060514330864,
0.037649743258953094,
-0.02720893919467926,
-0.024882327765226364,
0.016242198646068573,
-0.118272565305233,
0.03999975696206093,
0.001996675506234169,
-0.07669937610626221,
-0.003002728568390012,
0.0011647290084511042,
-0.010821468196809292,
-0.0052072759717702866,
0.10044974088668823,
-0.08611289411783218,
-0.005453294143080711,
-0.06720774620771408,
-0.07862738519906998,
-0.000515446939971298,
-0.1425522267818451,
-0.009410507045686245,
-0.058205585926771164,
-0.16141846776008606,
-0.03675154596567154,
0.04330424591898918,
-0.07502451539039612,
-0.014680561609566212,
-0.04361090064048767,
-0.06208418309688568,
0.023038506507873535,
-0.01236721407622099,
0.19969502091407776,
-0.05076201632618904,
0.08418929576873779,
-0.010057761333882809,
0.04860275238752365,
0.028993699699640274,
0.038779545575380325,
-0.09649484604597092,
0.025994107127189636,
-0.13278602063655853,
0.08415858447551727,
-0.08480783551931381,
-0.0061844284646213055,
-0.1363307386636734,
-0.09862534701824188,
0.007540466263890266,
-0.01893651857972145,
0.08954204618930817,
0.13385766744613647,
-0.1956453174352646,
-0.022173522040247917,
0.12597501277923584,
-0.07398252934217453,
-0.04439331963658333,
0.06303448230028152,
-0.06468883901834488,
0.03775124251842499,
0.05385682359337807,
0.20612725615501404,
0.060265202075242996,
-0.14958007633686066,
-0.005618989933282137,
0.013927476480603218,
0.05119091644883156,
0.030137833207845688,
0.04302091896533966,
0.00016716100799385458,
0.05427522957324982,
0.013175250962376595,
-0.09304738789796829,
-0.020969601348042488,
-0.09154953062534332,
-0.06452420353889465,
-0.049886226654052734,
-0.07495597749948502,
0.05412406846880913,
0.010430019348859787,
0.03825896233320236,
-0.05978512018918991,
-0.10640811175107956,
0.11532089859247208,
0.10012563318014145,
-0.056146781891584396,
0.0387846864759922,
-0.07681280374526978,
0.010574947111308575,
-0.004501188639551401,
-0.03469756245613098,
-0.21295617520809174,
-0.12364828586578369,
0.04735467582941055,
-0.03540565073490143,
0.02108345739543438,
0.013434487394988537,
0.0851110965013504,
0.05675245448946953,
-0.052942026406526566,
-0.013474696315824986,
-0.09890645742416382,
0.0021986565552651882,
-0.11366868019104004,
-0.1920257806777954,
-0.0867251306772232,
-0.04507143795490265,
0.09678143262863159,
-0.17674751579761505,
-0.009417184628546238,
0.023762237280607224,
0.13278570771217346,
0.026533853262662888,
-0.06839405000209808,
-0.001799576566554606,
0.04364262521266937,
0.011827107518911362,
-0.09654484689235687,
0.05494490638375282,
0.01298143994063139,
-0.10745897889137268,
-0.04495273530483246,
-0.12689223885536194,
-0.017748836427927017,
0.05149058252573013,
0.05909903347492218,
-0.09912052005529404,
-0.058887165039777756,
-0.07436507195234299,
-0.03777701407670975,
-0.079025037586689,
0.01650303788483143,
0.21254658699035645,
0.04051206260919571,
0.11050346493721008,
-0.06181267276406288,
-0.08174702525138855,
-0.00844074971973896,
0.030007049441337585,
0.024208953604102135,
0.08947869390249252,
0.0230821892619133,
-0.043501656502485275,
0.06683510541915894,
0.10300995409488678,
-0.022053858265280724,
0.13109362125396729,
-0.055517613887786865,
-0.08455649763345718,
-0.03036651946604252,
-0.018599728122353554,
-0.02568242698907852,
0.12528809905052185,
-0.037420038133859634,
0.0016822535544633865,
0.035129714757204056,
0.04087414965033531,
0.01128216739743948,
-0.16828063130378723,
0.0015225327806547284,
0.03136580064892769,
-0.05629555135965347,
-0.043553005903959274,
-0.003966138698160648,
0.019040698185563087,
0.08677763491868973,
0.03119952790439129,
-0.002119694370776415,
0.006613490637391806,
-0.013831255957484245,
-0.05680624395608902,
0.19074611365795135,
-0.09387664496898651,
-0.0763261541724205,
-0.07248029112815857,
0.01766609586775303,
-0.04431798309087753,
-0.03663873299956322,
0.006372081581503153,
-0.09257812052965164,
-0.029096217826008797,
-0.0811418816447258,
-0.020436303690075874,
-0.027602940797805786,
0.020093176513910294,
0.023510176688432693,
-0.018734736368060112,
0.07966871559619904,
-0.13650347292423248,
0.007151873782277107,
-0.04854517802596092,
-0.09737181663513184,
0.0036463767755776644,
0.07482778280973434,
0.09065389633178711,
0.08480440825223923,
-0.013663023710250854,
0.024610158056020737,
-0.03950360044836998,
0.23225915431976318,
-0.05595932528376579,
0.011044054292142391,
0.1173558458685875,
-0.015038713812828064,
0.052054282277822495,
0.09370984137058258,
0.038310568779706955,
-0.0917968600988388,
0.023622091859579086,
0.07973190397024155,
-0.037644606083631516,
-0.22786518931388855,
-0.01537142600864172,
-0.00693148747086525,
-0.08302333205938339,
0.1023908481001854,
0.031631357967853546,
-0.05075884982943535,
0.04052462801337242,
0.018549606204032898,
-0.010151468217372894,
-0.04095952585339546,
0.0688871517777443,
0.0776354968547821,
0.047685544937849045,
0.10846377164125443,
-0.0050396062433719635,
-0.019099179655313492,
0.05549578368663788,
0.014752209186553955,
0.26061487197875977,
-0.041412413120269775,
0.1048530712723732,
0.03301529958844185,
0.14985620975494385,
-0.0211020614951849,
0.06333069503307343,
0.00031932853744365275,
-0.009613635949790478,
-0.012173817493021488,
-0.06242990866303444,
-0.030221683904528618,
0.01332408282905817,
-0.042801156640052795,
0.023523129522800446,
-0.08142010867595673,
0.02644820138812065,
0.02165834978222847,
0.286399245262146,
0.029641224071383476,
-0.2539968192577362,
-0.07794493436813354,
-0.014403361827135086,
-0.05030372738838196,
-0.06048048287630081,
0.008518744260072708,
0.13829351961612701,
-0.13956321775913239,
0.045389533042907715,
-0.07814832031726837,
0.08686554431915283,
-0.05075492709875107,
0.011510670185089111,
0.05115087702870369,
0.1493704468011856,
-0.0178054291754961,
0.053751785308122635,
-0.19479864835739136,
0.2544858753681183,
0.017407217994332314,
0.10349921137094498,
-0.06554127484560013,
0.013058343902230263,
0.022877775132656097,
0.018546365201473236,
0.1149417832493782,
0.0023475110065191984,
-0.07014615088701248,
-0.14509998261928558,
-0.0909082442522049,
0.04763183742761612,
0.14173240959644318,
-0.045662540942430496,
0.08971570432186127,
-0.03721855953335762,
0.013235214166343212,
0.03699818253517151,
-0.03489183261990547,
-0.14745867252349854,
-0.08686787635087967,
-0.0006565021467395127,
0.008957608602941036,
-0.006993747781962156,
-0.06198931857943535,
-0.10540718585252762,
-0.009815339930355549,
0.10501058399677277,
0.002637607976794243,
-0.05447068437933922,
-0.15807203948497772,
0.08848961442708969,
0.1437869369983673,
-0.058114778250455856,
0.011727558448910713,
0.015312286093831062,
0.11220011115074158,
0.03567105159163475,
-0.07829003781080246,
0.06174624711275101,
-0.06140070781111717,
-0.17994239926338196,
-0.05594974383711815,
0.12307451665401459,
0.08260050415992737,
0.049567487090826035,
-0.001273868139833212,
0.051153719425201416,
0.0003303121484350413,
-0.0961952731013298,
0.03396902605891228,
0.008459734730422497,
0.03600417822599411,
0.016859635710716248,
-0.08732306212186813,
0.09828510880470276,
-0.03554469719529152,
0.009513042867183685,
0.13077984750270844,
0.2081141471862793,
-0.10587961971759796,
0.11271850764751434,
0.08552434295415878,
-0.073317751288414,
-0.1672377735376358,
0.05991847440600395,
0.1299392729997635,
0.012416231445968151,
0.0836852639913559,
-0.21384039521217346,
0.12247665226459503,
0.09969879686832428,
-0.01080789789557457,
0.010111457668244839,
-0.2791346311569214,
-0.1282568871974945,
0.05876632779836655,
0.10932847112417221,
0.042545247822999954,
-0.11728201061487198,
-0.0360734798014164,
-0.004201301373541355,
-0.09309133142232895,
0.1122521162033081,
-0.0721668154001236,
0.11586446315050125,
-0.01628151908516884,
0.11147815734148026,
0.02579965814948082,
-0.031005213037133217,
0.10907450318336487,
0.05914188176393509,
0.08028001338243484,
-0.03505498170852661,
0.00832764059305191,
0.054612692445516586,
-0.05591920018196106,
0.015180973336100578,
-0.04440940171480179,
0.0670602023601532,
-0.15016525983810425,
-0.00026617778348736465,
-0.09186552464962006,
0.050547920167446136,
-0.04864292964339256,
-0.07177093625068665,
-0.014665904454886913,
0.05374361574649811,
0.07505688816308975,
-0.03985706344246864,
0.027884358540177345,
-0.00688947131857276,
0.0982230007648468,
0.09526558965444565,
0.08029252290725708,
-0.015441780909895897,
-0.0925222784280777,
0.011342592537403107,
0.004293104168027639,
0.054769307374954224,
-0.10518768429756165,
0.014103125780820847,
0.13731953501701355,
0.06594116240739822,
0.09565599262714386,
0.047451507300138474,
-0.03961751610040665,
0.003520675702020526,
0.013469547964632511,
-0.12147213518619537,
-0.11257050186395645,
0.024061299860477448,
-0.04558897763490677,
-0.15441732108592987,
0.020913688465952873,
0.12030354142189026,
-0.039907608181238174,
-0.016995869576931,
-0.007579623721539974,
0.004849303048104048,
-0.013695620000362396,
0.1845337301492691,
0.04563838616013527,
0.06268902122974396,
-0.08758049458265305,
0.10723206400871277,
0.03553600609302521,
-0.05223521590232849,
0.05115048959851265,
0.06307279318571091,
-0.10352542996406555,
0.0090431347489357,
0.07597725838422775,
0.12508046627044678,
-0.04924225062131882,
-0.009820356033742428,
-0.08917030692100525,
-0.08431876450777054,
0.041533973067998886,
0.13279925286769867,
0.05345606058835983,
0.0000344317959388718,
-0.07166706025600433,
0.04152395576238632,
-0.11849662661552429,
0.07129715383052826,
0.045149702578783035,
0.06966716796159744,
-0.09983368963003159,
0.1313454657793045,
-0.0013905916130170226,
0.025096694007515907,
-0.02600882388651371,
0.014614979736506939,
-0.09603406488895416,
-0.024212699383497238,
-0.10876376926898956,
-0.025276968255639076,
-0.009253970347344875,
0.0006230009021237493,
-0.022278383374214172,
-0.07451418787240982,
-0.027113894000649452,
0.03870315104722977,
-0.07564548403024673,
-0.050133682787418365,
0.01446348987519741,
0.040243424475193024,
-0.15144766867160797,
0.0019897068850696087,
0.029059788212180138,
-0.09308840334415436,
0.09090939164161682,
0.0630597174167633,
0.014487562701106071,
0.026484746485948563,
-0.11385966092348099,
-0.027977272868156433,
-0.011339538730680943,
0.0057068755850195885,
0.06499752402305603,
-0.09737318009138107,
-0.02684154361486435,
-0.03931591659784317,
0.045608725398778915,
0.017913416028022766,
0.09954124689102173,
-0.11759981513023376,
-0.004472943022847176,
-0.038394372910261154,
-0.04114575684070587,
-0.06270254403352737,
0.035546910017728806,
0.1030164361000061,
0.05504555255174637,
0.14872358739376068,
-0.07407873868942261,
0.05884998291730881,
-0.20094987750053406,
-0.03535540774464607,
0.011003013700246811,
-0.04343282803893089,
-0.08352184295654297,
-0.05220154672861099,
0.08917184174060822,
-0.04463899880647659,
0.10514482855796814,
-0.020471204072237015,
0.11004116386175156,
0.04302608221769333,
-0.010780155658721924,
-0.05931415036320686,
-0.0069175646640360355,
0.1893530786037445,
0.05880630016326904,
-0.016772959381341934,
0.13018488883972168,
-0.00048406890709884465,
0.029686501249670982,
0.0857197567820549,
0.22501538693904877,
0.16162413358688354,
0.0019477496389299631,
0.06353549659252167,
0.060332514345645905,
-0.07296038419008255,
-0.152186319231987,
0.11706409603357315,
-0.019272202625870705,
0.10238984227180481,
-0.06768966466188431,
0.19056501984596252,
0.038914408534765244,
-0.1830752193927765,
0.06363505870103836,
-0.025463176891207695,
-0.1108139306306839,
-0.12230919301509857,
-0.02449517697095871,
-0.06886153668165207,
-0.11989918351173401,
0.023670174181461334,
-0.11690288782119751,
0.06280817836523056,
0.10159517079591751,
0.008880887180566788,
0.0381440743803978,
0.18460707366466522,
-0.045952945947647095,
0.00981347355991602,
0.08360444754362106,
0.020214155316352844,
0.005719375796616077,
-0.045013535767793655,
-0.06697659939527512,
0.035186491906642914,
0.03298734128475189,
0.062791608273983,
-0.05124978721141815,
-0.000024294380637002178,
0.009220149368047714,
-0.007326733786612749,
-0.07711749523878098,
0.010441344231367111,
0.010313859209418297,
0.05443735420703888,
0.05233278125524521,
0.046029169112443924,
0.00615824107080698,
-0.053577519953250885,
0.29700329899787903,
-0.07016897201538086,
-0.06981083005666733,
-0.12968292832374573,
0.20727872848510742,
0.021912872791290283,
-0.022161385044455528,
0.054213304072618484,
-0.08395291119813919,
-0.015049342066049576,
0.1708548665046692,
0.1320357322692871,
-0.09413231164216995,
-0.01566404476761818,
-0.014174871146678925,
-0.009971873834729195,
-0.014248362742364407,
0.11649530380964279,
0.07646515220403671,
-0.009942775592207909,
-0.06854362040758133,
-0.018761005252599716,
-0.021665286272764206,
-0.05611569806933403,
-0.0616300068795681,
0.06990204751491547,
0.02511008083820343,
-0.007506960071623325,
-0.06274428218603134,
0.0688505694270134,
-0.002178699942305684,
-0.24341483414173126,
0.0435156412422657,
-0.17213396728038788,
-0.1700136959552765,
-0.026129860430955887,
0.07248935848474503,
0.005674028769135475,
0.05702953785657883,
0.0018236830364912748,
0.019870970398187637,
0.12336047738790512,
-0.012301299721002579,
-0.0032782885245978832,
-0.10970382392406464,
0.11779661476612091,
-0.0849636048078537,
0.19748853147029877,
-0.006880991626530886,
0.05315748229622841,
0.09692715108394623,
0.041605569422245026,
-0.13812775909900665,
0.017644127830863,
0.06571134179830551,
-0.13042764365673065,
-0.002876995364204049,
0.14815998077392578,
-0.03332124277949333,
0.063759446144104,
0.026605796068906784,
-0.15276379883289337,
0.007393678650259972,
0.015483861789107323,
-0.038246020674705505,
-0.06722551584243774,
-0.008977876044809818,
-0.05152153596282005,
0.16757914423942566,
0.21866947412490845,
-0.029130278155207634,
0.004676268436014652,
-0.08925586938858032,
0.010549803264439106,
0.04569592699408531,
0.06343235820531845,
-0.042951442301273346,
-0.20465721189975739,
0.010870925150811672,
0.06426047533750534,
-0.0047602178528904915,
-0.1941950023174286,
-0.09940063953399658,
0.0533333458006382,
-0.03982504829764366,
-0.041618578135967255,
0.0951608344912529,
0.019481031224131584,
0.037176117300987244,
-0.011941318400204182,
-0.11926557123661041,
-0.021329578012228012,
0.13890637457370758,
-0.1777951568365097,
-0.028934722766280174
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
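For a lower-level view than the pipeline helper, the sketch below extracts the answer span from the start/end logits directly. It is an illustrative example only: the question and context are invented, and it assumes the checkpoint repository ships its tokenizer files (otherwise a standard `bert-base-cased` tokenizer, whose vocabulary SpanBERT follows, could be substituted).

```python
# Hypothetical inference sketch without the pipeline wrapper: pick the answer
# span from the start/end logits. Question and context are made-up examples.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

name = "anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10"
tokenizer = AutoTokenizer.from_pretrained(name)  # assumes tokenizer files are in the repo
model = AutoModelForQuestionAnswering.from_pretrained(name)

question = "How many labeled examples were used for fine-tuning?"
context = "This checkpoint was fine-tuned on a 32-example subset of SQuAD."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```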
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09936357289552689,
0.09747885167598724,
-0.002289424417540431,
0.09123210608959198,
0.12337643653154373,
0.018942054361104965,
0.09370537847280502,
0.12853789329528809,
-0.09885383397340775,
0.06783896684646606,
0.08750726282596588,
0.03421362489461899,
0.042067669332027435,
0.14188829064369202,
-0.00540816830471158,
-0.27846765518188477,
-0.0009189741685986519,
-0.0027134830597788095,
-0.05131833627820015,
0.1200861856341362,
0.08927436918020248,
-0.10969793051481247,
0.07370396703481674,
0.0075662825256586075,
-0.15150520205497742,
0.016877885907888412,
-0.031017791479825974,
-0.03453119099140167,
0.12342212349176407,
-0.02859857864677906,
0.10687289386987686,
0.028114374727010727,
0.13814273476600647,
-0.20997372269630432,
0.007402637042105198,
0.07754290848970413,
0.05513838678598404,
0.09771224111318588,
0.04681209847331047,
0.012482469901442528,
0.10066750645637512,
-0.1484249234199524,
0.09575342386960983,
0.029479704797267914,
-0.08955425769090652,
-0.1535816192626953,
-0.08844901621341705,
0.028226008638739586,
0.052151087671518326,
0.07547765970230103,
0.002745124977082014,
0.13412484526634216,
-0.07204200327396393,
0.08592662215232849,
0.251868337392807,
-0.3133012652397156,
-0.06759215146303177,
0.02613353729248047,
0.061789173632860184,
0.06227542832493782,
-0.12533040344715118,
-0.003234296338632703,
0.017107129096984863,
0.02678525261580944,
0.12442006915807724,
-0.012652192264795303,
-0.1035037487745285,
-0.010913833044469357,
-0.12313209474086761,
-0.0011667076032608747,
0.06236005946993828,
0.02965322509407997,
-0.04941863566637039,
-0.10884105414152145,
-0.06762491911649704,
-0.08097690343856812,
-0.023821083828806877,
-0.05188333988189697,
0.046271953731775284,
-0.051788829267024994,
-0.09798158705234528,
-0.043257903307676315,
-0.05871506407856941,
-0.08150286227464676,
-0.005645533557981253,
0.16907690465450287,
0.03392452746629715,
0.01897117868065834,
-0.030818914994597435,
0.11580748856067657,
0.027650082483887672,
-0.13849841058254242,
-0.01047028973698616,
-0.0040155453607439995,
-0.09831739962100983,
-0.040945857763290405,
-0.05623650178313255,
-0.0064124795608222485,
0.0018724693218246102,
0.16639605164527893,
-0.07009952515363693,
0.07427787780761719,
0.015872344374656677,
-0.02730896696448326,
-0.014865617267787457,
0.1532583087682724,
-0.04284895956516266,
-0.04754113033413887,
-0.016153564676642418,
0.08214359730482101,
-0.0022594663314521313,
-0.02057765983045101,
-0.06489022821187973,
-0.029213985428214073,
0.09481799602508545,
0.05477322265505791,
-0.06019897386431694,
0.037619370967149734,
-0.02626483142375946,
-0.02478310652077198,
0.017466841265559196,
-0.11890188604593277,
0.04034661501646042,
0.0011057078372687101,
-0.07777594774961472,
-0.004091319628059864,
-0.00014645302144344896,
-0.009832236915826797,
-0.00530972657725215,
0.09922519326210022,
-0.08583585917949677,
-0.005690590478479862,
-0.06763375550508499,
-0.07921753823757172,
-0.0002572344965301454,
-0.14583858847618103,
-0.008085131645202637,
-0.05728127062320709,
-0.1647653877735138,
-0.0374334417283535,
0.042286258190870285,
-0.07414824515581131,
-0.015206866897642612,
-0.04463842883706093,
-0.06187959015369415,
0.021231038495898247,
-0.011705024167895317,
0.20045062899589539,
-0.04970023036003113,
0.0830317884683609,
-0.010051572695374489,
0.04937930777668953,
0.02912093698978424,
0.03907514736056328,
-0.09605203568935394,
0.025837523862719536,
-0.1319994330406189,
0.08406201750040054,
-0.08504169434309006,
-0.0043462589383125305,
-0.1360233873128891,
-0.09874065965414047,
0.005973580293357372,
-0.018851645290851593,
0.08923730254173279,
0.13383720815181732,
-0.19658122956752777,
-0.0210595540702343,
0.12723998725414276,
-0.07350147515535355,
-0.043662115931510925,
0.06192668527364731,
-0.06502832472324371,
0.039078403264284134,
0.055493131279945374,
0.20565059781074524,
0.06259622424840927,
-0.1488521248102188,
-0.00619543669745326,
0.013973613269627094,
0.050998885184526443,
0.02813098393380642,
0.043089620769023895,
0.001635246560908854,
0.05543822422623634,
0.013322887010872364,
-0.09232497215270996,
-0.02111940085887909,
-0.09077874571084976,
-0.0649239644408226,
-0.04949821159243584,
-0.07589489966630936,
0.05457393079996109,
0.01085750199854374,
0.03862479701638222,
-0.059925422072410583,
-0.10615774989128113,
0.11606840044260025,
0.10073491930961609,
-0.05599913001060486,
0.03750632703304291,
-0.0766761302947998,
0.010070090182125568,
-0.005439582746475935,
-0.03475680574774742,
-0.2127823829650879,
-0.12202302366495132,
0.047679487615823746,
-0.035891417413949966,
0.021070446819067,
0.015325641259551048,
0.08571495115756989,
0.05642085149884224,
-0.05258195474743843,
-0.013967222534120083,
-0.09887146949768066,
0.002077516634017229,
-0.11474188417196274,
-0.1904357522726059,
-0.08770357072353363,
-0.045587897300720215,
0.09746979922056198,
-0.17788615822792053,
-0.008484098128974438,
0.02178124710917473,
0.1322643905878067,
0.025709662586450577,
-0.06848161667585373,
-0.0008250556420534849,
0.0428626723587513,
0.012608670629560947,
-0.09682145714759827,
0.05465497076511383,
0.011747628450393677,
-0.10709444433450699,
-0.04666292667388916,
-0.12802870571613312,
-0.019366048276424408,
0.05074367672204971,
0.06118125095963478,
-0.09881076961755753,
-0.05931088700890541,
-0.07418845593929291,
-0.03715168312191963,
-0.07788903266191483,
0.01592577062547207,
0.21155284345149994,
0.03956329822540283,
0.10989712923765182,
-0.06183910742402077,
-0.08276089280843735,
-0.008502556011080742,
0.03143705055117607,
0.024872148409485817,
0.0891575738787651,
0.023072224110364914,
-0.043542731553316116,
0.0662018433213234,
0.10406754165887833,
-0.0217219777405262,
0.1303342580795288,
-0.05563168600201607,
-0.08530843257904053,
-0.030058354139328003,
-0.019009489566087723,
-0.026767831295728683,
0.12497849017381668,
-0.037719838321208954,
-0.00002319723535038065,
0.034672435373067856,
0.03954921290278435,
0.011293447576463223,
-0.16859805583953857,
0.0017952329944819212,
0.03131190314888954,
-0.0553877130150795,
-0.045078933238983154,
-0.004902105778455734,
0.017852403223514557,
0.08612117916345596,
0.03049519658088684,
-0.00276545831002295,
0.006039902102202177,
-0.013522044755518436,
-0.05632244423031807,
0.19076001644134521,
-0.09281764179468155,
-0.07498972117900848,
-0.07181480526924133,
0.01820189133286476,
-0.042484432458877563,
-0.036709222942590714,
0.005449129734188318,
-0.09248201549053192,
-0.028599141165614128,
-0.08061614632606506,
-0.021306892856955528,
-0.0275783222168684,
0.019471989944577217,
0.02487146109342575,
-0.018201416358351707,
0.07792533934116364,
-0.136537104845047,
0.007810215000063181,
-0.049038052558898926,
-0.09702398627996445,
0.003040814772248268,
0.07396548241376877,
0.09098968654870987,
0.08514624089002609,
-0.014328064396977425,
0.024668652564287186,
-0.04006252810359001,
0.2314894050359726,
-0.056342918425798416,
0.012538796290755272,
0.11718982458114624,
-0.014537773095071316,
0.051819901913404465,
0.09431356191635132,
0.03758623078465462,
-0.09146709740161896,
0.02321411482989788,
0.07900440692901611,
-0.037154827266931534,
-0.22851695120334625,
-0.014755421318113804,
-0.006184256635606289,
-0.08435054123401642,
0.10300703346729279,
0.031314656138420105,
-0.048805173486471176,
0.042277831584215164,
0.018402447924017906,
-0.008637582883238792,
-0.039837706834077835,
0.0685960203409195,
0.0754493772983551,
0.046921100467443466,
0.10855384171009064,
-0.00526599632576108,
-0.02018672041594982,
0.05403916910290718,
0.015546135604381561,
0.26133283972740173,
-0.04026266559958458,
0.10462137311697006,
0.03206140920519829,
0.1492159515619278,
-0.021863630041480064,
0.06572001427412033,
0.0011734863510355353,
-0.009925499558448792,
-0.012121273204684258,
-0.062163665890693665,
-0.02880127727985382,
0.01383728813380003,
-0.04245569556951523,
0.023177649825811386,
-0.08082093298435211,
0.025990359485149384,
0.021044736728072166,
0.285076767206192,
0.031051676720380783,
-0.25490865111351013,
-0.0773174837231636,
-0.014111388474702835,
-0.051311809569597244,
-0.06016945466399193,
0.008159372955560684,
0.13707099854946136,
-0.13919506967067719,
0.046197231858968735,
-0.07822040468454361,
0.08703894168138504,
-0.04984211176633835,
0.011714689433574677,
0.04992048069834709,
0.14925864338874817,
-0.017526159062981606,
0.054916150867938995,
-0.1948857456445694,
0.25317588448524475,
0.01747012324631214,
0.10472282022237778,
-0.06649947911500931,
0.013068901374936104,
0.02285575121641159,
0.017999423667788506,
0.11610587686300278,
0.0020415352191776037,
-0.07014025002717972,
-0.1456286758184433,
-0.0911208763718605,
0.04757460579276085,
0.14195546507835388,
-0.04519129917025566,
0.09027576446533203,
-0.03621148690581322,
0.012203698046505451,
0.03717842325568199,
-0.036326028406620026,
-0.14873068034648895,
-0.08642824739217758,
-0.0010171042522415519,
0.007954725995659828,
-0.007085909601300955,
-0.06142401322722435,
-0.10532663762569427,
-0.012046150863170624,
0.10345259308815002,
0.0038455005269497633,
-0.054559506475925446,
-0.1578812450170517,
0.08990084379911423,
0.14419783651828766,
-0.057812899351119995,
0.01242029294371605,
0.01670178771018982,
0.11278636008501053,
0.035122305154800415,
-0.07795346528291702,
0.06099862605333328,
-0.061463311314582825,
-0.17874595522880554,
-0.05496794357895851,
0.1242353767156601,
0.08296522498130798,
0.04969821125268936,
0.00019447511294856668,
0.05042169243097305,
0.00017417811613995582,
-0.09593909978866577,
0.03348348289728165,
0.00812902394682169,
0.03551968187093735,
0.017067568376660347,
-0.08838033676147461,
0.09691168367862701,
-0.036127831786870956,
0.011596969328820705,
0.13094598054885864,
0.2051355093717575,
-0.10566136986017227,
0.11191894859075546,
0.08540354669094086,
-0.07373632490634918,
-0.16678482294082642,
0.06052099168300629,
0.12984679639339447,
0.012984278611838818,
0.08411899209022522,
-0.21347413957118988,
0.12273503094911575,
0.09931061416864395,
-0.010209296829998493,
0.00905698537826538,
-0.27911728620529175,
-0.1277589648962021,
0.059888459742069244,
0.10959655791521072,
0.04038091376423836,
-0.11668988317251205,
-0.03592666611075401,
-0.0047561535611748695,
-0.0930095985531807,
0.1117786094546318,
-0.07332811504602432,
0.11503486335277557,
-0.015706736594438553,
0.11063634604215622,
0.025716345757246017,
-0.031105419620871544,
0.10739251226186752,
0.06100841984152794,
0.08042357861995697,
-0.03469022363424301,
0.007268994115293026,
0.05693037062883377,
-0.05559268593788147,
0.017021719366312027,
-0.04329835623502731,
0.06667009741067886,
-0.14998272061347961,
-0.0008138256962411106,
-0.09129668027162552,
0.050400570034980774,
-0.04824690520763397,
-0.07197106629610062,
-0.014068350195884705,
0.053595151752233505,
0.07463029026985168,
-0.03991147503256798,
0.025793805718421936,
-0.005500860046595335,
0.09737600386142731,
0.09150661528110504,
0.08134999126195908,
-0.014523197896778584,
-0.09159030020236969,
0.011266149580478668,
0.004418509546667337,
0.05419273301959038,
-0.10575361549854279,
0.013477851636707783,
0.13735370337963104,
0.06639797985553741,
0.09582298994064331,
0.046851545572280884,
-0.039749931544065475,
0.0029762398917227983,
0.012836707755923271,
-0.11929544061422348,
-0.11369987577199936,
0.023836858570575714,
-0.046537160873413086,
-0.15484978258609772,
0.02240907773375511,
0.11839546263217926,
-0.04070684686303139,
-0.018027469515800476,
-0.008718320168554783,
0.004501460585743189,
-0.013274912722408772,
0.18588975071907043,
0.04641936346888542,
0.06283707171678543,
-0.08800198882818222,
0.10675417631864548,
0.03578804060816765,
-0.05259611830115318,
0.050783175975084305,
0.06298142671585083,
-0.10421362519264221,
0.008524390868842602,
0.07708205282688141,
0.12539424002170563,
-0.04670603945851326,
-0.010084369219839573,
-0.08898907899856567,
-0.0841803103685379,
0.04176866635680199,
0.1325099617242813,
0.05371540039777756,
-0.0014235166599974036,
-0.07114121317863464,
0.04193580523133278,
-0.1190074011683464,
0.07127045840024948,
0.04523380100727081,
0.0697036013007164,
-0.10012631118297577,
0.13161152601242065,
-0.0013149961596354842,
0.02570721134543419,
-0.025972982868552208,
0.015601984225213528,
-0.09528985619544983,
-0.02469918131828308,
-0.10724616795778275,
-0.02664785273373127,
-0.010638956911861897,
0.0007056133472360671,
-0.022948209196329117,
-0.07398837059736252,
-0.0268009752035141,
0.03832937404513359,
-0.0755058154463768,
-0.05036332458257675,
0.013970276340842247,
0.039290718734264374,
-0.1505708396434784,
0.0019825673662126064,
0.028219038620591164,
-0.09243252873420715,
0.0901758074760437,
0.06238941475749016,
0.015353906899690628,
0.02717563509941101,
-0.11328272521495819,
-0.027311066165566444,
-0.010443251579999924,
0.006003822200000286,
0.06457465887069702,
-0.0969260111451149,
-0.02636106126010418,
-0.03907857462763786,
0.046730153262615204,
0.017054513096809387,
0.0968153178691864,
-0.11670607328414917,
-0.005471707321703434,
-0.03970248997211456,
-0.04067232832312584,
-0.0631164014339447,
0.035942912101745605,
0.10244642943143845,
0.053554557263851166,
0.14918211102485657,
-0.07237908989191055,
0.058809418231248856,
-0.2015233188867569,
-0.03589644283056259,
0.010200937278568745,
-0.04350237548351288,
-0.08323213458061218,
-0.05282082408666611,
0.08921816945075989,
-0.04430292919278145,
0.10612299293279648,
-0.020310992375016212,
0.11110328137874603,
0.04212234914302826,
-0.007759158033877611,
-0.059057269245386124,
-0.006332903169095516,
0.18971097469329834,
0.05894704908132553,
-0.01726050116121769,
0.12921196222305298,
0.0002469784230925143,
0.030015360563993454,
0.08418957889080048,
0.22241830825805664,
0.16170205175876617,
0.001075885258615017,
0.06351037323474884,
0.061092864722013474,
-0.07315662503242493,
-0.1511552780866623,
0.11806409806013107,
-0.0197418462485075,
0.1005953773856163,
-0.0675068125128746,
0.19283203780651093,
0.03847375512123108,
-0.18295173346996307,
0.06419219076633453,
-0.02458048425614834,
-0.11161557585000992,
-0.12107770889997482,
-0.026038585230708122,
-0.06912830471992493,
-0.11868146806955338,
0.02390602044761181,
-0.11691761761903763,
0.061308931559324265,
0.10177426040172577,
0.008720184676349163,
0.03741523250937462,
0.18591807782649994,
-0.046659789979457855,
0.010135093703866005,
0.08365952223539352,
0.01981906034052372,
0.00626253429800272,
-0.04506656527519226,
-0.06607636064291,
0.03629951551556587,
0.03220634162425995,
0.06344016641378403,
-0.05350198969244957,
-0.0005407003918662667,
0.009070458821952343,
-0.0066018179059028625,
-0.07652872055768967,
0.010592763312160969,
0.01007077656686306,
0.054457277059555054,
0.05125746503472328,
0.0462614968419075,
0.0055017550475895405,
-0.05401046946644783,
0.29563823342323303,
-0.07013764977455139,
-0.07062248140573502,
-0.12971341609954834,
0.20483005046844482,
0.02307446114718914,
-0.022274097427725792,
0.05491180345416069,
-0.0843721255660057,
-0.012662098743021488,
0.17167586088180542,
0.13214299082756042,
-0.0918048769235611,
-0.016205361112952232,
-0.013871499337255955,
-0.010268012061715126,
-0.014855424873530865,
0.11597707122564316,
0.07697748392820358,
-0.012172083370387554,
-0.06832163035869598,
-0.018001699820160866,
-0.020264852792024612,
-0.05714522674679756,
-0.06183259189128876,
0.06962276250123978,
0.026063064113259315,
-0.00791248120367527,
-0.060841143131256104,
0.07045082747936249,
-0.0000027220171432418283,
-0.24336272478103638,
0.042230598628520966,
-0.17104272544384003,
-0.17002727091312408,
-0.026688961312174797,
0.07233931124210358,
0.007415956351906061,
0.05690542981028557,
0.0020162740256637335,
0.020146219059824944,
0.12267564982175827,
-0.011601647362112999,
-0.003991955891251564,
-0.10973202437162399,
0.11789856106042862,
-0.08632592111825943,
0.19642430543899536,
-0.007065426558256149,
0.05412943288683891,
0.09681832045316696,
0.0399189293384552,
-0.1380169540643692,
0.01820756122469902,
0.06576190143823624,
-0.12859243154525757,
-0.0012425478780642152,
0.1487300544977188,
-0.0330992266535759,
0.06145521253347397,
0.025611866265535355,
-0.15303373336791992,
0.007663262076675892,
0.01585240289568901,
-0.03767586871981621,
-0.06797901540994644,
-0.006564506329596043,
-0.05084875598549843,
0.1683325171470642,
0.21843235194683075,
-0.029566753655672073,
0.00521131232380867,
-0.08980946242809296,
0.00992043036967516,
0.04601665213704109,
0.06346386671066284,
-0.04282134771347046,
-0.20413249731063843,
0.009733149781823158,
0.06194477528333664,
-0.0041379136964678764,
-0.1932765692472458,
-0.0982736349105835,
0.05217117816209793,
-0.04111715406179428,
-0.04180268570780754,
0.09426815062761307,
0.021232526749372482,
0.037122078239917755,
-0.011684118770062923,
-0.1197519302368164,
-0.02163669466972351,
0.13893137872219086,
-0.17875716090202332,
-0.02815152145922184
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
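As a usage illustration (not part of the auto-generated card), the fine-tuned checkpoint can be loaded through the `question-answering` pipeline; the question and context strings below are made up for the example.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2",
)

# Illustrative inputs; any (question, context) pair works.
result = qa(
    question="What was the model fine-tuned on?",
    context="This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the SQuAD dataset.",
)
print(result["answer"], result["score"])
```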
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09795872867107391,
0.09827897697687149,
-0.002397244330495596,
0.09156182408332825,
0.12289761751890182,
0.01940619759261608,
0.09373132884502411,
0.12929759919643402,
-0.0994158685207367,
0.0684191957116127,
0.08812866359949112,
0.03271307423710823,
0.04162152484059334,
0.1408972442150116,
-0.005587295163422823,
-0.27868884801864624,
-0.001023312914185226,
-0.004353750497102737,
-0.05341112241148949,
0.12007037550210953,
0.09006665647029877,
-0.10866093635559082,
0.07420029491186142,
0.008038877509534359,
-0.15230749547481537,
0.01671528071165085,
-0.02998812310397625,
-0.03379830718040466,
0.12331430613994598,
-0.0277558583766222,
0.10741065442562103,
0.02925604209303856,
0.13790187239646912,
-0.21074263751506805,
0.007057345937937498,
0.07683892548084259,
0.055587105453014374,
0.09810815751552582,
0.047518469393253326,
0.012532862834632397,
0.10024131834506989,
-0.14768201112747192,
0.09473247826099396,
0.029741477221250534,
-0.08961771428585052,
-0.15461157262325287,
-0.08918095380067825,
0.02835964411497116,
0.05213269218802452,
0.07420995086431503,
0.003684300696477294,
0.13271623849868774,
-0.0715143233537674,
0.08549422025680542,
0.25185227394104004,
-0.31424108147621155,
-0.06788322329521179,
0.02467513084411621,
0.060480207204818726,
0.0631454661488533,
-0.12356137484312057,
-0.0024565551429986954,
0.0170296560972929,
0.027055827900767326,
0.12330905348062515,
-0.012192205525934696,
-0.10288706421852112,
-0.010850798338651657,
-0.12313365191221237,
-0.003125190967693925,
0.06253841519355774,
0.02904040552675724,
-0.05051944777369499,
-0.10678732395172119,
-0.0682959109544754,
-0.0785926878452301,
-0.022239090874791145,
-0.052563972771167755,
0.04656754061579704,
-0.05033091455698013,
-0.09769432991743088,
-0.04551626369357109,
-0.06014327332377434,
-0.08177485316991806,
-0.006671293638646603,
0.17082205414772034,
0.03388971835374832,
0.02061878703534603,
-0.03049015998840332,
0.1171816736459732,
0.029484407976269722,
-0.138270303606987,
-0.010942655615508556,
-0.0034883564803749323,
-0.09744949638843536,
-0.040390338748693466,
-0.05613487958908081,
-0.005749909207224846,
0.0026651560328900814,
0.1652631312608719,
-0.07232897728681564,
0.07337479293346405,
0.014123824425041676,
-0.026070425286889076,
-0.014812457375228405,
0.1532088667154312,
-0.040773387998342514,
-0.04435234144330025,
-0.015814712271094322,
0.08140164613723755,
-0.0012630157871171832,
-0.020891012623906136,
-0.06495265662670135,
-0.02992059476673603,
0.09546488523483276,
0.05613870918750763,
-0.059764374047517776,
0.03544265031814575,
-0.02798403985798359,
-0.024880563840270042,
0.0157332606613636,
-0.1189892515540123,
0.040157388895750046,
0.0016173359472304583,
-0.07618295401334763,
-0.004408305510878563,
0.000736517074983567,
-0.009498217143118382,
-0.004177963361144066,
0.09833578765392303,
-0.08453744649887085,
-0.004537281114608049,
-0.06618639081716537,
-0.07717461884021759,
-0.0003523191262502223,
-0.14262327551841736,
-0.009148404002189636,
-0.058015305548906326,
-0.16213920712471008,
-0.03529902175068855,
0.04333130270242691,
-0.07587923109531403,
-0.01765732280910015,
-0.04350388050079346,
-0.06091039255261421,
0.022546375170350075,
-0.012419384904205799,
0.20003913342952728,
-0.04991712421178818,
0.08425305038690567,
-0.00984133593738079,
0.049396663904190063,
0.028957456350326538,
0.03801742196083069,
-0.09497008472681046,
0.02706230990588665,
-0.132969930768013,
0.0845547541975975,
-0.08381544798612595,
-0.00773730268701911,
-0.13735930621623993,
-0.09820862114429474,
0.007793168071657419,
-0.019808398559689522,
0.08855016529560089,
0.1336262971162796,
-0.19527682662010193,
-0.0207398422062397,
0.12577609717845917,
-0.07524042576551437,
-0.043565474450588226,
0.06338007748126984,
-0.0649210587143898,
0.0408114492893219,
0.05319608375430107,
0.20582544803619385,
0.06160710006952286,
-0.14934416115283966,
-0.003773903474211693,
0.015576810576021671,
0.05044802650809288,
0.030216233804821968,
0.04493670165538788,
0.00006200658390298486,
0.052931491285562515,
0.01284894160926342,
-0.09474803507328033,
-0.021312179043889046,
-0.09172383695840836,
-0.06571274250745773,
-0.04989762604236603,
-0.07535475492477417,
0.055830709636211395,
0.008463438600301743,
0.0389939583837986,
-0.0592336542904377,
-0.10513973236083984,
0.11513659358024597,
0.10106322169303894,
-0.05561318248510361,
0.03918375447392464,
-0.07714702188968658,
0.010089066810905933,
-0.006280544213950634,
-0.03551457077264786,
-0.21239487826824188,
-0.12316923588514328,
0.04851487651467323,
-0.03623737022280693,
0.021024372428655624,
0.015607393346726894,
0.08475738763809204,
0.056794486939907074,
-0.05280950665473938,
-0.013821505010128021,
-0.09922326356172562,
0.0016358870780095458,
-0.11417388916015625,
-0.1914801299571991,
-0.08735539019107819,
-0.04545692354440689,
0.09783533960580826,
-0.17691916227340698,
-0.009701299481093884,
0.02317226119339466,
0.13250279426574707,
0.02612287923693657,
-0.06786710768938065,
-0.0023914913181215525,
0.041793715208768845,
0.011853859759867191,
-0.09604508429765701,
0.054730117321014404,
0.013027478009462357,
-0.10806136578321457,
-0.044470760971307755,
-0.1257369965314865,
-0.018509039655327797,
0.049433302134275436,
0.06011528521776199,
-0.0988273099064827,
-0.059652578085660934,
-0.07453057169914246,
-0.0380413681268692,
-0.07912848889827728,
0.01599722169339657,
0.2114982157945633,
0.03931545466184616,
0.1103070080280304,
-0.06223167106509209,
-0.0820658877491951,
-0.008793270215392113,
0.028957687318325043,
0.023687364533543587,
0.08871959894895554,
0.022984690964221954,
-0.04529868811368942,
0.06524237245321274,
0.10465235263109207,
-0.0219082273542881,
0.12948934733867645,
-0.055640652775764465,
-0.0849890485405922,
-0.031819093972444534,
-0.016557062044739723,
-0.02597333863377571,
0.12439745664596558,
-0.036414243280887604,
0.002192945219576359,
0.03480004519224167,
0.040839191526174545,
0.011097577400505543,
-0.16965632140636444,
0.0016886935336515307,
0.03152740001678467,
-0.05746287852525711,
-0.04237193614244461,
-0.004905570298433304,
0.018884940072894096,
0.08669993281364441,
0.03135420382022858,
-0.00252960785292089,
0.007879971526563168,
-0.013859963975846767,
-0.057506028562784195,
0.1896706372499466,
-0.0927761048078537,
-0.07637238502502441,
-0.07372409105300903,
0.017932619899511337,
-0.04311142861843109,
-0.036299463361501694,
0.0062638153322041035,
-0.09132228791713715,
-0.028279997408390045,
-0.08099975436925888,
-0.019894884899258614,
-0.02880753017961979,
0.020655062049627304,
0.025747016072273254,
-0.018312640488147736,
0.08062851428985596,
-0.13579247891902924,
0.007356063462793827,
-0.04861420765519142,
-0.09836462885141373,
0.0037610915023833513,
0.07481575012207031,
0.09030260890722275,
0.08432573080062866,
-0.013371898792684078,
0.024516720324754715,
-0.039090950042009354,
0.23254553973674774,
-0.055092163383960724,
0.011465638875961304,
0.1178494244813919,
-0.01658697985112667,
0.05269565060734749,
0.09400434046983719,
0.0372421033680439,
-0.0915001928806305,
0.02359168231487274,
0.07847355306148529,
-0.03770344331860542,
-0.22828933596611023,
-0.015486698597669601,
-0.0055609638802707195,
-0.08389091491699219,
0.10248751193284988,
0.03139983490109444,
-0.05283293128013611,
0.040786102414131165,
0.019122406840324402,
-0.009881348349153996,
-0.040186841040849686,
0.06871918588876724,
0.07705076783895493,
0.047028057277202606,
0.1081131100654602,
-0.004811742343008518,
-0.020180607214570045,
0.055844422429800034,
0.016240200027823448,
0.26072198152542114,
-0.040398165583610535,
0.10383831709623337,
0.03272252902388573,
0.15078452229499817,
-0.021644555032253265,
0.06353617459535599,
0.0013895153533667326,
-0.009182280860841274,
-0.012844085693359375,
-0.062237825244665146,
-0.02963740937411785,
0.014735822565853596,
-0.04159463196992874,
0.023383213207125664,
-0.08191104978322983,
0.028611697256565094,
0.020891061052680016,
0.28672581911087036,
0.030960964038968086,
-0.252623587846756,
-0.07679799944162369,
-0.013210685923695564,
-0.05109461024403572,
-0.05965953692793846,
0.008180801756680012,
0.13896073400974274,
-0.14025293290615082,
0.04468736797571182,
-0.07797051966190338,
0.08613384515047073,
-0.05058789253234863,
0.011045909486711025,
0.049289945513010025,
0.14811861515045166,
-0.016356071457266808,
0.055099014192819595,
-0.1929399073123932,
0.2537788450717926,
0.017243461683392525,
0.10282913595438004,
-0.06479447335004807,
0.013406097888946533,
0.022070828825235367,
0.018173418939113617,
0.11631528288125992,
0.0029895370826125145,
-0.07143205404281616,
-0.14522434771060944,
-0.09251262247562408,
0.04719840735197067,
0.14271780848503113,
-0.04675021767616272,
0.08981708437204361,
-0.03716319426894188,
0.013166331686079502,
0.03670274466276169,
-0.034255411475896835,
-0.14774346351623535,
-0.08580923825502396,
-0.0006210631690919399,
0.006723176687955856,
-0.007651496212929487,
-0.06270170211791992,
-0.1054895669221878,
-0.00833851844072342,
0.10580640286207199,
0.0032400304917246103,
-0.054816849529743195,
-0.15712831914424896,
0.08933151513338089,
0.14363393187522888,
-0.058781594038009644,
0.01140524446964264,
0.016177503392100334,
0.11312270164489746,
0.03539377823472023,
-0.07811400294303894,
0.06111154705286026,
-0.061136458069086075,
-0.1807926744222641,
-0.0552452951669693,
0.12505289912223816,
0.0823594406247139,
0.05001861974596977,
0.000050319769798079506,
0.05022701248526573,
0.0015053662937134504,
-0.09578320384025574,
0.03488561138510704,
0.00885236170142889,
0.03501546010375023,
0.017147134989500046,
-0.08800768107175827,
0.09993697702884674,
-0.035854533314704895,
0.00959885586053133,
0.13278332352638245,
0.2097105085849762,
-0.10630248486995697,
0.11403478682041168,
0.08482969552278519,
-0.07414726912975311,
-0.16666100919246674,
0.058357123285532,
0.1315416693687439,
0.011920184828341007,
0.08584465086460114,
-0.2136896401643753,
0.12209532409906387,
0.09919652342796326,
-0.011635194532573223,
0.007956196554005146,
-0.2802911400794983,
-0.1279761642217636,
0.058257926255464554,
0.1092551127076149,
0.044048819690942764,
-0.116241455078125,
-0.03698151931166649,
-0.0035220349673181772,
-0.09353400766849518,
0.11136005073785782,
-0.07123380899429321,
0.11539929360151291,
-0.015899192541837692,
0.11159465461969376,
0.025944169610738754,
-0.030333872884511948,
0.11009784787893295,
0.05929984897375107,
0.07873080670833588,
-0.034507859498262405,
0.007841654121875763,
0.054790180176496506,
-0.056399647146463394,
0.0157464686781168,
-0.042794931679964066,
0.06721709668636322,
-0.1505684107542038,
-0.0009026589686982334,
-0.0908455103635788,
0.049879010766744614,
-0.049147505313158035,
-0.07164790481328964,
-0.014319936744868755,
0.05307280272245407,
0.07503478974103928,
-0.039562977850437164,
0.02565917931497097,
-0.0047232117503881454,
0.09574293345212936,
0.09613479673862457,
0.08027508854866028,
-0.012866292148828506,
-0.09201902896165848,
0.009777621366083622,
0.004020887427031994,
0.0542205274105072,
-0.10472539812326431,
0.01496774610131979,
0.13641931116580963,
0.0652616024017334,
0.09603388607501984,
0.046898338943719864,
-0.040556151419878006,
0.00426797941327095,
0.012943262234330177,
-0.12111470848321915,
-0.1147228479385376,
0.023230837658047676,
-0.045634377747774124,
-0.15528830885887146,
0.02033783122897148,
0.12137524038553238,
-0.03951917216181755,
-0.017774436622858047,
-0.008292139507830143,
0.005475287325680256,
-0.014050282537937164,
0.1838875561952591,
0.04519160836935043,
0.0632292702794075,
-0.0865606889128685,
0.1068362444639206,
0.03580009937286377,
-0.05083988234400749,
0.05041017383337021,
0.06237594038248062,
-0.10334429889917374,
0.00944004487246275,
0.07619346678256989,
0.12459532171487808,
-0.04947658255696297,
-0.009514627046883106,
-0.08897417783737183,
-0.08439937978982925,
0.0408138781785965,
0.13118454813957214,
0.05445639416575432,
-0.0003078613372053951,
-0.07109517604112625,
0.04180370271205902,
-0.11763659864664078,
0.07159050554037094,
0.045973390340805054,
0.06987439841032028,
-0.10113513469696045,
0.13158270716667175,
-0.001965027768164873,
0.027423759922385216,
-0.02624175138771534,
0.014582933858036995,
-0.09508660435676575,
-0.024770664051175117,
-0.10924425721168518,
-0.024446288123726845,
-0.00871602725237608,
0.0005489566246978939,
-0.02183673158288002,
-0.07511506974697113,
-0.02678564377129078,
0.0392548106610775,
-0.07544931769371033,
-0.05036086589097977,
0.01330940704792738,
0.03996938467025757,
-0.15057240426540375,
0.0011298882309347391,
0.029695579782128334,
-0.09339619427919388,
0.09185141324996948,
0.06309156864881516,
0.015502562746405602,
0.026705825701355934,
-0.11182057857513428,
-0.027705350890755653,
-0.010316191241145134,
0.00588919036090374,
0.06422615796327591,
-0.09848611056804657,
-0.027652502059936523,
-0.0387713648378849,
0.04617379978299141,
0.0174267441034317,
0.10027758032083511,
-0.11750767379999161,
-0.004890201613306999,
-0.03978874161839485,
-0.042328886687755585,
-0.06285916268825531,
0.034833312034606934,
0.10166174173355103,
0.05596340820193291,
0.14895059168338776,
-0.0738777369260788,
0.059208232909440994,
-0.20094062387943268,
-0.03535367548465729,
0.010230275802314281,
-0.041441403329372406,
-0.08394894003868103,
-0.05282086506485939,
0.0879194512963295,
-0.04444318637251854,
0.10496503859758377,
-0.02078990451991558,
0.10922610014677048,
0.042521849274635315,
-0.010209811851382256,
-0.0573512502014637,
-0.006696292664855719,
0.1885349601507187,
0.0587911494076252,
-0.016589781269431114,
0.12940557301044464,
-0.0017059316160157323,
0.030595332384109497,
0.08370135724544525,
0.2246125489473343,
0.1615375131368637,
0.0014750907430425286,
0.06360717862844467,
0.060990385711193085,
-0.07232832908630371,
-0.15315815806388855,
0.11674173176288605,
-0.018996568396687508,
0.10085730999708176,
-0.06626313179731369,
0.19089315831661224,
0.03921598941087723,
-0.18364301323890686,
0.062402382493019104,
-0.02467847242951393,
-0.11147603392601013,
-0.12281077355146408,
-0.0245650764554739,
-0.07003810256719589,
-0.1193423792719841,
0.023652540519833565,
-0.11672830581665039,
0.06259477138519287,
0.1012483611702919,
0.007704087533056736,
0.03828054293990135,
0.18298394978046417,
-0.046072885394096375,
0.010177215561270714,
0.08254566043615341,
0.019918598234653473,
0.007098004221916199,
-0.04309333115816116,
-0.06709811836481094,
0.03557012602686882,
0.034444767981767654,
0.06263312697410583,
-0.051823679357767105,
0.0008290193509310484,
0.007967803627252579,
-0.007728721480816603,
-0.07702518999576569,
0.010286057367920876,
0.010103368200361729,
0.05438641086220741,
0.0501902736723423,
0.04657962918281555,
0.005964227020740509,
-0.0534060113132,
0.2970721423625946,
-0.06991829723119736,
-0.0701148584485054,
-0.12952497601509094,
0.20702166855335236,
0.0209639985114336,
-0.021349992603063583,
0.05576727166771889,
-0.08405634760856628,
-0.014777772128582,
0.16962426900863647,
0.1313537210226059,
-0.09472738206386566,
-0.015917642042040825,
-0.014253856614232063,
-0.010181643068790436,
-0.013815862126648426,
0.11703085154294968,
0.07628930360078812,
-0.01049893070012331,
-0.06926406174898148,
-0.018766416236758232,
-0.022042809054255486,
-0.05575009807944298,
-0.06264946609735489,
0.07022389024496078,
0.024660658091306686,
-0.006352285388857126,
-0.062393251806497574,
0.06908796727657318,
-0.000009816583769861609,
-0.24267861247062683,
0.0423167385160923,
-0.17077268660068512,
-0.1703803688287735,
-0.025294935330748558,
0.07332796603441238,
0.005238668993115425,
0.05693725496530533,
0.00046677514910697937,
0.019703468307852745,
0.12360420823097229,
-0.012096112594008446,
-0.004231972619891167,
-0.10796509683132172,
0.11883544921875,
-0.08623415231704712,
0.19766367971897125,
-0.005801123566925526,
0.05480094999074936,
0.0961320549249649,
0.040793806314468384,
-0.1388048678636551,
0.016868488863110542,
0.0657753273844719,
-0.12886886298656464,
-0.0022919774055480957,
0.1494663655757904,
-0.033303771167993546,
0.06305232644081116,
0.027519240975379944,
-0.15272730588912964,
0.005835824646055698,
0.016966158524155617,
-0.038180116564035416,
-0.06705647706985474,
-0.009404017589986324,
-0.05282484367489815,
0.1670239269733429,
0.21750223636627197,
-0.029959937557578087,
0.005756816361099482,
-0.08887755870819092,
0.010679415427148342,
0.046025071293115616,
0.06525589525699615,
-0.04169795289635658,
-0.20446240901947021,
0.009631410241127014,
0.06348646432161331,
-0.0038464744575321674,
-0.1943059265613556,
-0.10060496628284454,
0.05251585692167282,
-0.03962205722928047,
-0.04187890887260437,
0.09569893777370453,
0.01982884109020233,
0.03640967234969139,
-0.011207631789147854,
-0.12092480063438416,
-0.022269977256655693,
0.1382671743631363,
-0.1781710535287857,
-0.028459150344133377
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
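The card does not document how the k=32 training examples were drawn; the snippet below is only an assumed illustration of one way a seeded few-shot split could be sampled from SQuAD with the `datasets` library, with k and the seed taken from the model name.

```python
from datasets import load_dataset

k, seed = 32, 4  # taken from the model name; the actual sampling procedure is not documented
squad_train = load_dataset("squad", split="train")

# One possible seeded few-shot split (assumption, for illustration only).
few_shot_train = squad_train.shuffle(seed=seed).select(range(k))
print(few_shot_train)
```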
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09856618195772171,
0.09722921252250671,
-0.0023534668143838644,
0.0924082100391388,
0.12409114837646484,
0.019835934042930603,
0.09448842704296112,
0.12851624190807343,
-0.0996614620089531,
0.0676981657743454,
0.08818597346544266,
0.03241783007979393,
0.04085610434412956,
0.14063866436481476,
-0.005322934594005346,
-0.2787233293056488,
-0.0012044509639963508,
-0.0036761516239494085,
-0.052534278482198715,
0.12026920914649963,
0.08879533410072327,
-0.10928794741630554,
0.0746401846408844,
0.007626797072589397,
-0.15306013822555542,
0.017304232344031334,
-0.030979394912719727,
-0.033785343170166016,
0.12347979843616486,
-0.027727171778678894,
0.10755335539579391,
0.029016446322202682,
0.13778690993785858,
-0.20938873291015625,
0.007294342387467623,
0.07674791663885117,
0.05484885349869728,
0.09735118597745895,
0.04709124192595482,
0.012199736200273037,
0.0993381142616272,
-0.14768928289413452,
0.09487465769052505,
0.02947470173239708,
-0.08975382894277573,
-0.15522857010364532,
-0.08824213594198227,
0.0272148959338665,
0.05160609632730484,
0.07545657455921173,
0.0029107530135661364,
0.13206246495246887,
-0.07223804295063019,
0.08580416440963745,
0.2492484152317047,
-0.3155210614204407,
-0.06830451637506485,
0.024977095425128937,
0.060843952000141144,
0.06320231407880783,
-0.12451035529375076,
-0.0017611879156902432,
0.017175938934087753,
0.028151478618383408,
0.12371501326560974,
-0.012722242623567581,
-0.10279542952775955,
-0.010266098193824291,
-0.12308456748723984,
-0.0013832170516252518,
0.06332097947597504,
0.029269680380821228,
-0.05033263564109802,
-0.10765790194272995,
-0.06684164702892303,
-0.08002132177352905,
-0.022740831598639488,
-0.05164754390716553,
0.04653538018465042,
-0.05164714902639389,
-0.09839251637458801,
-0.04325637221336365,
-0.06000330671668053,
-0.0806889683008194,
-0.006228815298527479,
0.1693059802055359,
0.03379173204302788,
0.0202716663479805,
-0.02996165119111538,
0.11659948527812958,
0.029141150414943695,
-0.13772805035114288,
-0.00983436219394207,
-0.0043649133294820786,
-0.09748619049787521,
-0.04037533700466156,
-0.05678047612309456,
-0.003976102452725172,
0.0028125704266130924,
0.16550609469413757,
-0.07258131355047226,
0.07424183189868927,
0.01552243996411562,
-0.02704344131052494,
-0.014560354873538017,
0.15160441398620605,
-0.0425211526453495,
-0.04624859616160393,
-0.01638564094901085,
0.08194080740213394,
-0.002352413022890687,
-0.021005891263484955,
-0.065036840736866,
-0.028475526720285416,
0.09444697946310043,
0.055496279150247574,
-0.06098077818751335,
0.03756260126829147,
-0.026714639738202095,
-0.024577470496296883,
0.016125768423080444,
-0.1186445876955986,
0.03990579769015312,
0.001641851500608027,
-0.07656221091747284,
-0.003720021340996027,
-0.00010068763367598876,
-0.010543876327574253,
-0.004820285364985466,
0.09867843985557556,
-0.08545416593551636,
-0.005448766052722931,
-0.06657540053129196,
-0.07846234738826752,
-0.000933001923840493,
-0.14233587682247162,
-0.00806388258934021,
-0.05757324770092964,
-0.16275274753570557,
-0.0370631217956543,
0.04306173324584961,
-0.07577802240848541,
-0.016367489472031593,
-0.043736040592193604,
-0.061853449791669846,
0.02200477197766304,
-0.01252073235809803,
0.20059457421302795,
-0.05039437487721443,
0.0829957127571106,
-0.008483062498271465,
0.04919416457414627,
0.028597556054592133,
0.03811107575893402,
-0.09482456743717194,
0.026235077530145645,
-0.1329731047153473,
0.08385545760393143,
-0.08457636088132858,
-0.005836042575538158,
-0.1359792798757553,
-0.09964988380670547,
0.00871208030730486,
-0.01852933131158352,
0.0879557728767395,
0.13349199295043945,
-0.19493576884269714,
-0.021071093156933784,
0.1249728873372078,
-0.07413184642791748,
-0.04370734095573425,
0.06317742168903351,
-0.06502329558134079,
0.03932571038603783,
0.053897976875305176,
0.205850288271904,
0.061897438019514084,
-0.14859165251255035,
-0.00545237073674798,
0.01438240334391594,
0.05098758265376091,
0.029333898797631264,
0.04336642101407051,
0.0017423885874450207,
0.05343113839626312,
0.013658602721989155,
-0.0936029702425003,
-0.021443786099553108,
-0.09120666235685349,
-0.06523628532886505,
-0.04881730675697327,
-0.0752740278840065,
0.055413950234651566,
0.009381561540067196,
0.038905348628759384,
-0.060512397438287735,
-0.1059841588139534,
0.11467419564723969,
0.10071088373661041,
-0.056275662034749985,
0.03753986209630966,
-0.07757384330034256,
0.010789717547595501,
-0.006377349141985178,
-0.03486878052353859,
-0.21189731359481812,
-0.12453949451446533,
0.04723614826798439,
-0.03338802978396416,
0.021024556830525398,
0.016116227954626083,
0.0856318548321724,
0.057231366634368896,
-0.05333486199378967,
-0.014598223380744457,
-0.09806067496538162,
0.0021270443685352802,
-0.11373299360275269,
-0.1919436752796173,
-0.0880415290594101,
-0.04531591013073921,
0.09845733642578125,
-0.17849808931350708,
-0.009157619439065456,
0.023592745885252953,
0.13143698871135712,
0.0255572572350502,
-0.06730823218822479,
-0.001188878552056849,
0.04344610869884491,
0.012850461527705193,
-0.0957893654704094,
0.055555664002895355,
0.012721247039735317,
-0.1068386361002922,
-0.045843642204999924,
-0.12626715004444122,
-0.016810528934001923,
0.050602976232767105,
0.05922115594148636,
-0.09928480535745621,
-0.0594145692884922,
-0.07432777434587479,
-0.038054209202528,
-0.07728932052850723,
0.015950961038470268,
0.21256110072135925,
0.03894508257508278,
0.10979709774255753,
-0.06105656921863556,
-0.08162017911672592,
-0.008641230873763561,
0.030568033456802368,
0.02479943446815014,
0.08862161636352539,
0.021737579256296158,
-0.043045785278081894,
0.0656534731388092,
0.10339909791946411,
-0.02236519753932953,
0.13058623671531677,
-0.055789705365896225,
-0.08440166711807251,
-0.031044330447912216,
-0.01770041137933731,
-0.026549823582172394,
0.1253751665353775,
-0.036970894783735275,
0.0006049562944099307,
0.034436777234077454,
0.040007349103689194,
0.011455610394477844,
-0.16859246790409088,
0.0016304095042869449,
0.03064795210957527,
-0.0562688484787941,
-0.043737832456827164,
-0.004538868088275194,
0.01814430020749569,
0.08616620302200317,
0.031060978770256042,
-0.0040696957148611546,
0.007734157610684633,
-0.01363828219473362,
-0.05680695176124573,
0.19032660126686096,
-0.09283661097288132,
-0.07528062909841537,
-0.07331263273954391,
0.016333140432834625,
-0.04424070939421654,
-0.036888062953948975,
0.005973865278065205,
-0.09300072491168976,
-0.02843460626900196,
-0.08078692853450775,
-0.021176490932703018,
-0.028423134237527847,
0.020424114540219307,
0.0245920792222023,
-0.018584420904517174,
0.0799143984913826,
-0.13535435497760773,
0.0076689873822033405,
-0.04895927757024765,
-0.09840530902147293,
0.004617823753505945,
0.07513311505317688,
0.09078311175107956,
0.08422400802373886,
-0.01310309674590826,
0.02464439906179905,
-0.03950512036681175,
0.23314329981803894,
-0.05521978810429573,
0.011651570908725262,
0.11754690855741501,
-0.015513194724917412,
0.05135231837630272,
0.09409389644861221,
0.038270607590675354,
-0.09191235154867172,
0.023338962346315384,
0.07904940843582153,
-0.0371520034968853,
-0.22798795998096466,
-0.014862652868032455,
-0.005950283724814653,
-0.08340656012296677,
0.10230616480112076,
0.031340450048446655,
-0.052186764776706696,
0.041574422270059586,
0.020132260397076607,
-0.008963345550000668,
-0.04034031555056572,
0.06819314509630203,
0.07848221063613892,
0.04641924053430557,
0.10927248001098633,
-0.00496390787884593,
-0.020562371239066124,
0.054892558604478836,
0.01641303487122059,
0.26141828298568726,
-0.04082100838422775,
0.1033686101436615,
0.03335022181272507,
0.1499563455581665,
-0.021153191104531288,
0.06453517079353333,
0.0008485317230224609,
-0.010002410970628262,
-0.01248115859925747,
-0.06205742061138153,
-0.028217727318406105,
0.01385065633803606,
-0.042556364089250565,
0.022662699222564697,
-0.0822032243013382,
0.026923924684524536,
0.021204277873039246,
0.28627195954322815,
0.031106462702155113,
-0.25555214285850525,
-0.07788590341806412,
-0.014096236787736416,
-0.05061859264969826,
-0.05911095067858696,
0.00834775622934103,
0.13846081495285034,
-0.13949717581272125,
0.04559464752674103,
-0.0780915915966034,
0.08564810454845428,
-0.050653424113988876,
0.012134920805692673,
0.05153318867087364,
0.149030402302742,
-0.017269983887672424,
0.05478023365139961,
-0.193323016166687,
0.25256091356277466,
0.017369138076901436,
0.10377106815576553,
-0.06510438024997711,
0.013230604119598866,
0.022944459691643715,
0.01961693912744522,
0.11605370789766312,
0.0022559750359505415,
-0.07128559052944183,
-0.1452927589416504,
-0.09137038141489029,
0.04859713464975357,
0.1417204588651657,
-0.045530300587415695,
0.09085510671138763,
-0.036023568361997604,
0.012667241506278515,
0.03624993935227394,
-0.035281017422676086,
-0.14810869097709656,
-0.08640772849321365,
-0.001268095220439136,
0.008179113268852234,
-0.007301327306777239,
-0.0619383379817009,
-0.10608877241611481,
-0.009854946285486221,
0.10521494597196579,
0.004740022588521242,
-0.0550324022769928,
-0.15784889459609985,
0.08878785371780396,
0.1437738537788391,
-0.0575019046664238,
0.011226898059248924,
0.017092859372496605,
0.11211799085140228,
0.0364341177046299,
-0.07775276154279709,
0.06114675849676132,
-0.062047358602285385,
-0.17924632132053375,
-0.05522890016436577,
0.12400210648775101,
0.08202080428600311,
0.04924866184592247,
-0.0002893990313168615,
0.05003349855542183,
0.0007865708903409541,
-0.0964808315038681,
0.03533606603741646,
0.006531109102070332,
0.035990092903375626,
0.016766732558608055,
-0.08883478492498398,
0.09965910017490387,
-0.03513651713728905,
0.010708280839025974,
0.1306837499141693,
0.20661687850952148,
-0.10549477487802505,
0.11174798011779785,
0.08539754897356033,
-0.07361005246639252,
-0.16636359691619873,
0.059477679431438446,
0.13017427921295166,
0.012675588950514793,
0.08419252932071686,
-0.21453341841697693,
0.12329817563295364,
0.09844162315130234,
-0.010388107970356941,
0.010032459162175655,
-0.2772678732872009,
-0.1270759403705597,
0.05882444977760315,
0.11007296293973923,
0.04469433054327965,
-0.1167004331946373,
-0.036217253655195236,
-0.0036267489194869995,
-0.09279700368642807,
0.110432468354702,
-0.07435665279626846,
0.11559256911277771,
-0.016222873702645302,
0.11203614622354507,
0.025109317153692245,
-0.030452363193035126,
0.10891195386648178,
0.06028275191783905,
0.07987023144960403,
-0.034670427441596985,
0.008698984980583191,
0.054884009063243866,
-0.05570802465081215,
0.01594417728483677,
-0.04402433708310127,
0.06651447713375092,
-0.15103496611118317,
-0.000842225446831435,
-0.09190591424703598,
0.0497063510119915,
-0.04884544387459755,
-0.07151643186807632,
-0.013146888464689255,
0.05355779826641083,
0.07367818802595139,
-0.039702340960502625,
0.024029452353715897,
-0.0058667296543717384,
0.09625468403100967,
0.09506641328334808,
0.08110488951206207,
-0.014226756989955902,
-0.09298322349786758,
0.01101037859916687,
0.003580233780667186,
0.05414873734116554,
-0.10416576266288757,
0.013409372419118881,
0.13733357191085815,
0.06457929313182831,
0.09579133242368698,
0.04787035286426544,
-0.03954463079571724,
0.0036921510472893715,
0.014090600423514843,
-0.1213630959391594,
-0.11354407668113708,
0.023453976958990097,
-0.04946104437112808,
-0.15495699644088745,
0.021777860820293427,
0.12132775783538818,
-0.03950805589556694,
-0.017543701454997063,
-0.008287792094051838,
0.00477270083501935,
-0.014296581037342548,
0.1855267882347107,
0.04521452262997627,
0.06295880675315857,
-0.08774007856845856,
0.10619242489337921,
0.03560502827167511,
-0.05167455971240997,
0.050549790263175964,
0.063291996717453,
-0.10442926734685898,
0.008167529478669167,
0.07526130229234695,
0.125996395945549,
-0.048115868121385574,
-0.010626127943396568,
-0.09048280119895935,
-0.08488566428422928,
0.04088543355464935,
0.1297738403081894,
0.05411530286073685,
-0.0012893242528662086,
-0.07133440673351288,
0.04123008996248245,
-0.11846097558736801,
0.07122943550348282,
0.045095376670360565,
0.07015004754066467,
-0.10129448026418686,
0.13093475997447968,
-0.002011187607422471,
0.026708420366048813,
-0.02625739760696888,
0.015182791277766228,
-0.0960269644856453,
-0.024717114865779877,
-0.10941373556852341,
-0.02569805271923542,
-0.009141507558524609,
0.0010699565755203366,
-0.022155912593007088,
-0.07355953752994537,
-0.027572106570005417,
0.039287909865379333,
-0.07582490891218185,
-0.04971650242805481,
0.015276237390935421,
0.04068494588136673,
-0.14994904398918152,
0.0015174287836998701,
0.028239110484719276,
-0.09327083826065063,
0.09163636714220047,
0.06301958858966827,
0.015145630575716496,
0.027097446843981743,
-0.1099843829870224,
-0.02837461419403553,
-0.01081047859042883,
0.0050344932824373245,
0.06495804339647293,
-0.09775607287883759,
-0.026988618075847626,
-0.03892771899700165,
0.04643333703279495,
0.017806995660066605,
0.09922077506780624,
-0.11648905277252197,
-0.004629259929060936,
-0.03905737027525902,
-0.04098382219672203,
-0.06332655251026154,
0.03548547998070717,
0.10244432836771011,
0.055187635123729706,
0.14952075481414795,
-0.07392486929893494,
0.05838238447904587,
-0.201217383146286,
-0.03583848476409912,
0.010100643150508404,
-0.04286472126841545,
-0.08286925405263901,
-0.05270306393504143,
0.08881117403507233,
-0.045175474137067795,
0.10641200840473175,
-0.021067669615149498,
0.1100383922457695,
0.04190356656908989,
-0.010114564560353756,
-0.0585324652493,
-0.005947061348706484,
0.1880689561367035,
0.058276593685150146,
-0.017460109665989876,
0.1291009783744812,
-0.00025727925822138786,
0.029652530327439308,
0.08541226387023926,
0.22270351648330688,
0.1603516787290573,
0.0023857466876506805,
0.06397581845521927,
0.06179584935307503,
-0.07294352352619171,
-0.15179675817489624,
0.1176961213350296,
-0.018742384389042854,
0.10181429237127304,
-0.0674109160900116,
0.1896403282880783,
0.03847783803939819,
-0.18262086808681488,
0.0628872960805893,
-0.02572452463209629,
-0.11175421625375748,
-0.12133938819169998,
-0.02382618933916092,
-0.06918510794639587,
-0.12009796500205994,
0.023923907428979874,
-0.11726569384336472,
0.0610092394053936,
0.10230638831853867,
0.008797572925686836,
0.037838518619537354,
0.1848052591085434,
-0.04477894306182861,
0.01116535346955061,
0.08307985216379166,
0.01920795440673828,
0.006527040619403124,
-0.042851127684116364,
-0.06596006453037262,
0.03503748029470444,
0.03266120329499245,
0.06212777644395828,
-0.05285760015249252,
-0.00012224428064655513,
0.00888286717236042,
-0.00674837501719594,
-0.07661324739456177,
0.010018842294812202,
0.010480538941919804,
0.05420828238129616,
0.05136862024664879,
0.04610539227724075,
0.004871825221925974,
-0.05357274413108826,
0.2957582473754883,
-0.0700177252292633,
-0.06886604428291321,
-0.13040795922279358,
0.2068452537059784,
0.020775936543941498,
-0.02196904830634594,
0.054717253893613815,
-0.08313529938459396,
-0.013428906910121441,
0.1709737926721573,
0.1336042582988739,
-0.09443926066160202,
-0.016318276524543762,
-0.013746967539191246,
-0.010173513554036617,
-0.014179743826389313,
0.11725394427776337,
0.07628676295280457,
-0.010923018679022789,
-0.06935769319534302,
-0.018575219437479973,
-0.021318063139915466,
-0.055852893739938736,
-0.06214861944317818,
0.06972214579582214,
0.026165973395109177,
-0.006739525590091944,
-0.06159020960330963,
0.0693962424993515,
-0.0006781182601116598,
-0.24246738851070404,
0.04339165240526199,
-0.17012840509414673,
-0.1708153635263443,
-0.02652624435722828,
0.07264241576194763,
0.006105635315179825,
0.05797558277845383,
0.00045445034629665315,
0.01938806287944317,
0.12413448840379715,
-0.012417137622833252,
-0.0032855598255991936,
-0.10892845690250397,
0.11884798109531403,
-0.08588673919439316,
0.19627182185649872,
-0.0066946507431566715,
0.05488789454102516,
0.09682553261518478,
0.040277689695358276,
-0.1386277675628662,
0.017702026292681694,
0.06496939063072205,
-0.12941880524158478,
-0.0015490418300032616,
0.14877399802207947,
-0.03324778378009796,
0.06231800466775894,
0.02604665420949459,
-0.15186646580696106,
0.006826217286288738,
0.017171122133731842,
-0.03786275163292885,
-0.06702999025583267,
-0.00898053776472807,
-0.052301984280347824,
0.16769666969776154,
0.2191711664199829,
-0.029812825843691826,
0.005560098681598902,
-0.08893892168998718,
0.010952712036669254,
0.04657783359289169,
0.06387326121330261,
-0.042485252022743225,
-0.20438827574253082,
0.009949494153261185,
0.06400616466999054,
-0.004350744653493166,
-0.19574034214019775,
-0.0992458313703537,
0.052768729627132416,
-0.03952072188258171,
-0.04207904636859894,
0.09566440433263779,
0.02112814225256443,
0.03765639662742615,
-0.011963692493736744,
-0.11853064596652985,
-0.02223835326731205,
0.13805057108402252,
-0.178501158952713,
-0.029247364029288292
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
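Until this section is filled in, the checkpoint can be exercised like any extractive question-answering model in `transformers`; the snippet below is a minimal sketch rather than an officially documented example (the question/context strings are placeholders):
```python
# Minimal extractive-QA sketch for this checkpoint; assumes transformers with a PyTorch backend.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6",
)
result = qa(
    question="Which base model was fine-tuned?",
    context="This checkpoint is a fine-tuned version of SpanBERT/spanbert-base-cased on SQuAD.",
)
print(result["answer"], round(result["score"], 3))
```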
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an approximate `TrainingArguments` mapping is sketched after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
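A rough translation of the list above into `TrainingArguments` — the original training script is not part of this card, so treat this as a sketch under the assumption that unlisted arguments keep their defaults:
```python
# Approximate reconstruction of the listed hyperparameters; not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # "training_steps: 200" above
)
```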
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.0986977368593216,
0.09716949611902237,
-0.0023116290103644133,
0.09207946062088013,
0.1242663636803627,
0.01926584355533123,
0.09400489926338196,
0.1290326863527298,
-0.09887884557247162,
0.0683203637599945,
0.08735819160938263,
0.033442314714193344,
0.04136065021157265,
0.14060406386852264,
-0.005303698126226664,
-0.2780936658382416,
-0.0010462818900123239,
-0.0034778807312250137,
-0.052460312843322754,
0.12019984424114227,
0.08880690485239029,
-0.10953827202320099,
0.07369787991046906,
0.0070250676944851875,
-0.15324610471725464,
0.017516659572720528,
-0.03124157153069973,
-0.033429794013500214,
0.12332598119974136,
-0.02828005515038967,
0.10701756179332733,
0.02911519818007946,
0.1376994550228119,
-0.21013259887695312,
0.007351486478000879,
0.07738464325666428,
0.05502588301897049,
0.09756718575954437,
0.04806572571396828,
0.012608363293111324,
0.10114133358001709,
-0.14758378267288208,
0.09469737112522125,
0.03007120080292225,
-0.08956607431173325,
-0.15335892140865326,
-0.08887612819671631,
0.026593174785375595,
0.05228765681385994,
0.07609419524669647,
0.0025279794353991747,
0.1330440491437912,
-0.07251296937465668,
0.08611247688531876,
0.2512308359146118,
-0.31378498673439026,
-0.068245068192482,
0.02601276896893978,
0.0608009397983551,
0.062232598662376404,
-0.12551836669445038,
-0.0027521520387381315,
0.017316851764917374,
0.02768980711698532,
0.12265290319919586,
-0.012255944311618805,
-0.10433943569660187,
-0.010844222269952297,
-0.12343057245016098,
-0.0015781112015247345,
0.06148137152194977,
0.029413040727376938,
-0.04968716576695442,
-0.10744228959083557,
-0.06755983084440231,
-0.07955159991979599,
-0.022668087854981422,
-0.05153510347008705,
0.04678110405802727,
-0.05170676112174988,
-0.09761445969343185,
-0.04320508614182472,
-0.05966291204094887,
-0.0818822979927063,
-0.005889789201319218,
0.16904638707637787,
0.03380902484059334,
0.019946638494729996,
-0.030807092785835266,
0.11630375683307648,
0.029387956485152245,
-0.13801456987857819,
-0.011012658476829529,
-0.003469478338956833,
-0.09790125489234924,
-0.04069007188081741,
-0.056114740669727325,
-0.005721575114876032,
0.0025443092454224825,
0.16387756168842316,
-0.07268530130386353,
0.07477481663227081,
0.014426002278923988,
-0.027094140648841858,
-0.01532353088259697,
0.15181122720241547,
-0.04142264649271965,
-0.044890351593494415,
-0.016838766634464264,
0.08191804587841034,
-0.0028213663026690483,
-0.02058490552008152,
-0.06430046260356903,
-0.02871669828891754,
0.09508528560400009,
0.055507879704236984,
-0.06022977456450462,
0.037020258605480194,
-0.026942741125822067,
-0.02467700093984604,
0.016404082998633385,
-0.11861372739076614,
0.04005352035164833,
0.0011395805049687624,
-0.0768456757068634,
-0.004851918667554855,
0.000034595010220073164,
-0.010865225456655025,
-0.005127550568431616,
0.09927020967006683,
-0.08573008328676224,
-0.005224813707172871,
-0.06741565465927124,
-0.078215092420578,
-0.00031230837339535356,
-0.14412415027618408,
-0.00844544917345047,
-0.05684566870331764,
-0.16321289539337158,
-0.03731180727481842,
0.04245350882411003,
-0.07569587975740433,
-0.015836546197533607,
-0.04423420876264572,
-0.06246667355298996,
0.022328751161694527,
-0.012066529132425785,
0.20166566967964172,
-0.050100311636924744,
0.08370240032672882,
-0.008847318589687347,
0.048889804631471634,
0.029311304911971092,
0.038787174969911575,
-0.09573396295309067,
0.025963064283132553,
-0.13240239024162292,
0.08398476988077164,
-0.08558029681444168,
-0.0057861278764903545,
-0.13711383938789368,
-0.09877575188875198,
0.007460368797183037,
-0.019199427217245102,
0.08882082253694534,
0.13428851962089539,
-0.195358544588089,
-0.020642157644033432,
0.1255474090576172,
-0.0749702900648117,
-0.04376658797264099,
0.06211799383163452,
-0.06470534205436707,
0.03886525705456734,
0.05347545072436333,
0.20611394941806793,
0.06116865575313568,
-0.1482594758272171,
-0.006386274471879005,
0.013776788488030434,
0.05156060308218002,
0.028839463368058205,
0.0431368350982666,
0.0015425996389240026,
0.05453529208898544,
0.013302839361131191,
-0.09347045421600342,
-0.02166658826172352,
-0.09135176986455917,
-0.06458494067192078,
-0.04946291446685791,
-0.07537328451871872,
0.054595693945884705,
0.010607720352709293,
0.03852401301264763,
-0.060025762766599655,
-0.10519374907016754,
0.11465928703546524,
0.10071079432964325,
-0.05565860867500305,
0.03794766589999199,
-0.07680066674947739,
0.009630486369132996,
-0.006744957994669676,
-0.034735653549432755,
-0.21341601014137268,
-0.12471567094326019,
0.04778089374303818,
-0.03424627706408501,
0.021068386733531952,
0.015482468530535698,
0.08539941906929016,
0.056328509002923965,
-0.053350843489170074,
-0.014531338587403297,
-0.09884702414274216,
0.0016610861057415605,
-0.11442398279905319,
-0.19206708669662476,
-0.08805128186941147,
-0.04610442370176315,
0.09596884250640869,
-0.17794305086135864,
-0.009442637674510479,
0.023442337289452553,
0.1320587545633316,
0.02572542615234852,
-0.06794265657663345,
-0.0011628023348748684,
0.04311605542898178,
0.012473917566239834,
-0.09623707830905914,
0.055301278829574585,
0.01182502694427967,
-0.10633287578821182,
-0.045951567590236664,
-0.12656541168689728,
-0.01866566762328148,
0.0502232201397419,
0.06076934561133385,
-0.09958414733409882,
-0.059495989233255386,
-0.07427410781383514,
-0.03827454149723053,
-0.07870862632989883,
0.016924839466810226,
0.21199484169483185,
0.03928840905427933,
0.10939765721559525,
-0.061313752084970474,
-0.08194854110479355,
-0.00803228560835123,
0.03091179020702839,
0.024471847340464592,
0.0897752046585083,
0.02366858907043934,
-0.044976428151130676,
0.0665096864104271,
0.10400468856096268,
-0.02159482054412365,
0.1307450234889984,
-0.05603479593992233,
-0.08484150469303131,
-0.02962246723473072,
-0.01716057024896145,
-0.026415372267365456,
0.1249934583902359,
-0.03672210872173309,
0.0010010219411924481,
0.03427145630121231,
0.04043944180011749,
0.011347724124789238,
-0.16876772046089172,
0.0016598625807091594,
0.0305044986307621,
-0.056153956800699234,
-0.04424033313989639,
-0.004594410303980112,
0.01816064491868019,
0.0865691751241684,
0.030689191073179245,
-0.003627092344686389,
0.007003704085946083,
-0.013673017732799053,
-0.05688369274139404,
0.1912386119365692,
-0.09242814779281616,
-0.07417639344930649,
-0.07176267355680466,
0.017323412001132965,
-0.04368085041642189,
-0.03719639778137207,
0.006250767968595028,
-0.09389844536781311,
-0.02845214307308197,
-0.08068429678678513,
-0.02096414752304554,
-0.028230594471096992,
0.019575685262680054,
0.023991188034415245,
-0.018293581902980804,
0.07919798791408539,
-0.13622444868087769,
0.007901154458522797,
-0.04941317439079285,
-0.0986897423863411,
0.004126048646867275,
0.07489531487226486,
0.09063506871461868,
0.08439408242702484,
-0.013510128483176231,
0.02456515282392502,
-0.03958263620734215,
0.23191678524017334,
-0.055789556354284286,
0.01119632925838232,
0.11739075183868408,
-0.01432717964053154,
0.05192841589450836,
0.09430710971355438,
0.037506137043237686,
-0.09195567667484283,
0.023609885945916176,
0.07974694669246674,
-0.037486810237169266,
-0.22925402224063873,
-0.014959522522985935,
-0.006141891703009605,
-0.08365152776241302,
0.10246917605400085,
0.03177691996097565,
-0.05223754793405533,
0.041391484439373016,
0.019296742975711823,
-0.010264054872095585,
-0.040335021913051605,
0.06870583444833755,
0.07624876499176025,
0.04725653678178787,
0.108880914747715,
-0.004970644600689411,
-0.01974673941731453,
0.054231494665145874,
0.016356736421585083,
0.2625410556793213,
-0.0410456620156765,
0.10327089577913284,
0.032868482172489166,
0.1496991664171219,
-0.021736452355980873,
0.06487269699573517,
0.0007111371960490942,
-0.010009394027292728,
-0.012285695411264896,
-0.06182705610990524,
-0.02844899706542492,
0.013829909265041351,
-0.043439608067274094,
0.02286297082901001,
-0.082034170627594,
0.027209265157580376,
0.02060331404209137,
0.2869991064071655,
0.031101157888770103,
-0.2541731297969818,
-0.07698088139295578,
-0.014387518167495728,
-0.05117417499423027,
-0.05966813862323761,
0.00821943674236536,
0.1380586475133896,
-0.1392732560634613,
0.04531624913215637,
-0.0784531682729721,
0.08677312731742859,
-0.0493021160364151,
0.011418309062719345,
0.05073627084493637,
0.14914095401763916,
-0.01723884418606758,
0.055148687213659286,
-0.19450509548187256,
0.2546672523021698,
0.01742849498987198,
0.10372726619243622,
-0.06537918746471405,
0.013099892996251583,
0.02245408482849598,
0.017806582152843475,
0.11653142422437668,
0.002250244375318289,
-0.07213209569454193,
-0.14507801830768585,
-0.09079046547412872,
0.048267658799886703,
0.14254681766033173,
-0.04575280100107193,
0.0901329293847084,
-0.03597624972462654,
0.012685112655162811,
0.03709934651851654,
-0.0355391651391983,
-0.14854055643081665,
-0.08621430397033691,
-0.0009407549514435232,
0.0074321129359304905,
-0.007957415655255318,
-0.061614055186510086,
-0.1058604046702385,
-0.008858710527420044,
0.1047619879245758,
0.004572833422571421,
-0.05438484624028206,
-0.15787792205810547,
0.08973678946495056,
0.1439332813024521,
-0.058037638664245605,
0.011400829069316387,
0.016720963642001152,
0.11168339848518372,
0.03546442836523056,
-0.07824642956256866,
0.062102265655994415,
-0.06210973858833313,
-0.17988814413547516,
-0.055129408836364746,
0.12357719987630844,
0.08260282129049301,
0.04966917261481285,
-0.00040064414497464895,
0.05042470991611481,
0.0010630508186295629,
-0.09622829407453537,
0.03546219691634178,
0.007021818310022354,
0.03596419095993042,
0.017180219292640686,
-0.08832859247922897,
0.09829789400100708,
-0.03585689887404442,
0.010038649663329124,
0.1305558830499649,
0.20778919756412506,
-0.10545558482408524,
0.11219681799411774,
0.085820771753788,
-0.07419341057538986,
-0.16698119044303894,
0.06039188429713249,
0.13062144815921783,
0.012009457685053349,
0.08506997674703598,
-0.21437968313694,
0.12328255921602249,
0.09845418483018875,
-0.010784272104501724,
0.009166683070361614,
-0.27824652194976807,
-0.12721776962280273,
0.05868542566895485,
0.1100124716758728,
0.042409464716911316,
-0.1167992651462555,
-0.036159127950668335,
-0.003188605885952711,
-0.0922379121184349,
0.11110704392194748,
-0.07280688732862473,
0.11574111133813858,
-0.016558295115828514,
0.11187443882226944,
0.02530510537326336,
-0.031011033803224564,
0.10748428106307983,
0.06054132431745529,
0.08008121699094772,
-0.03440588712692261,
0.008545706048607826,
0.0548444464802742,
-0.055818770080804825,
0.015847815200686455,
-0.04399874806404114,
0.06698208302259445,
-0.14980871975421906,
-0.000413268047850579,
-0.09239093214273453,
0.05031393840909004,
-0.04884765297174454,
-0.0713646337389946,
-0.013279806822538376,
0.05392899364233017,
0.07434232532978058,
-0.04014575481414795,
0.025550734251737595,
-0.005708906799554825,
0.09811169654130936,
0.09506839513778687,
0.08144976198673248,
-0.013859527185559273,
-0.09203255921602249,
0.010372204706072807,
0.004369921050965786,
0.05421353504061699,
-0.10525479912757874,
0.013270213268697262,
0.13742505013942719,
0.065956249833107,
0.09560099244117737,
0.048197146505117416,
-0.04006802290678024,
0.0037445530761033297,
0.013073809444904327,
-0.12046001106500626,
-0.11434996128082275,
0.023812847211956978,
-0.04560919106006622,
-0.15530557930469513,
0.022191133350133896,
0.11976318061351776,
-0.04055274277925491,
-0.01765815168619156,
-0.0081228232011199,
0.005365766119211912,
-0.013472890481352806,
0.18622344732284546,
0.04538007080554962,
0.06364911794662476,
-0.0875796526670456,
0.10657127946615219,
0.035124771296978,
-0.05287205055356026,
0.05078873038291931,
0.06363870948553085,
-0.10365036129951477,
0.00815916620194912,
0.07706254720687866,
0.1249031350016594,
-0.04817810654640198,
-0.009130607359111309,
-0.0892772227525711,
-0.08446715027093887,
0.04116535931825638,
0.13179463148117065,
0.05397311970591545,
-0.0013800554443150759,
-0.07096122205257416,
0.041708238422870636,
-0.11900978535413742,
0.07158973813056946,
0.044823676347732544,
0.07042568922042847,
-0.10083996504545212,
0.12968780100345612,
-0.0017610196955502033,
0.026862820610404015,
-0.026146119460463524,
0.015595907345414162,
-0.09575311839580536,
-0.024729426950216293,
-0.10724297910928726,
-0.025701679289340973,
-0.009313964284956455,
0.0005390124279074371,
-0.022745458409190178,
-0.07404002547264099,
-0.02660762146115303,
0.03937503695487976,
-0.07600745558738708,
-0.05037873983383179,
0.014504818245768547,
0.040237586945295334,
-0.15032511949539185,
0.0016270654741674662,
0.028360718861222267,
-0.09273751080036163,
0.0905848890542984,
0.06221059337258339,
0.01526377908885479,
0.02730458602309227,
-0.11265084892511368,
-0.02792479284107685,
-0.010519103147089481,
0.005130521021783352,
0.06474852561950684,
-0.09708988666534424,
-0.02725270390510559,
-0.039185866713523865,
0.04656212776899338,
0.017500869929790497,
0.09837936609983444,
-0.11713524907827377,
-0.0054887873120605946,
-0.04052447900176048,
-0.04168161004781723,
-0.06271746009588242,
0.03579718992114067,
0.1027824878692627,
0.05530141666531563,
0.14893871545791626,
-0.07363106310367584,
0.059222932904958725,
-0.20120589435100555,
-0.03579162061214447,
0.01028489414602518,
-0.04273327440023422,
-0.08354853093624115,
-0.05188589543104172,
0.08919982612133026,
-0.045133233070373535,
0.10411714762449265,
-0.020705081522464752,
0.11081884056329727,
0.042350564152002335,
-0.00987725704908371,
-0.05893867835402489,
-0.006139643024653196,
0.18777331709861755,
0.05812094733119011,
-0.016833245754241943,
0.13082638382911682,
0.00006657731137238443,
0.029395362362265587,
0.08670996129512787,
0.22483594715595245,
0.16175521910190582,
0.0013438570313155651,
0.06401550769805908,
0.06130263954401016,
-0.07364987581968307,
-0.15163056552410126,
0.1176610067486763,
-0.018352266401052475,
0.10170881450176239,
-0.06778533011674881,
0.1902708262205124,
0.03834983706474304,
-0.1823710799217224,
0.0635082796216011,
-0.026065798476338387,
-0.11153407394886017,
-0.12150554358959198,
-0.02353002317249775,
-0.06918095797300339,
-0.11976855248212814,
0.02426471933722496,
-0.1174858883023262,
0.06190316006541252,
0.10253610461950302,
0.008728046901524067,
0.038016218692064285,
0.18528339266777039,
-0.044401392340660095,
0.011011461727321148,
0.08357298374176025,
0.01948610134422779,
0.006273453123867512,
-0.0433061458170414,
-0.06578768044710159,
0.036219481378793716,
0.03263106569647789,
0.06227320432662964,
-0.05264870822429657,
-0.0011024253908544779,
0.008787858299911022,
-0.0068701403215527534,
-0.07681722193956375,
0.010282261297106743,
0.010772976092994213,
0.054403726011514664,
0.05211775377392769,
0.04606860876083374,
0.005472821183502674,
-0.05365559086203575,
0.29760321974754333,
-0.07029671221971512,
-0.06868284195661545,
-0.1294003427028656,
0.20750504732131958,
0.02199656516313553,
-0.022021273151040077,
0.05468007177114487,
-0.08405088633298874,
-0.013644689694046974,
0.16945478320121765,
0.13151437044143677,
-0.09323886036872864,
-0.015986111015081406,
-0.01406671479344368,
-0.010071593336760998,
-0.013948342762887478,
0.11697913706302643,
0.0766688734292984,
-0.010359316132962704,
-0.06955424696207047,
-0.01842193491756916,
-0.021313417702913284,
-0.05678478255867958,
-0.061380743980407715,
0.06976177543401718,
0.026432521641254425,
-0.0072057899087667465,
-0.06168941780924797,
0.06975621730089188,
-0.0008303184295073152,
-0.24228064715862274,
0.042565394192934036,
-0.17120273411273956,
-0.17039676010608673,
-0.026463937014341354,
0.07255928963422775,
0.006372199393808842,
0.05740118399262428,
0.000826039060484618,
0.020098136737942696,
0.12210941314697266,
-0.012335383333265781,
-0.003526065731421113,
-0.11029944568872452,
0.11811774969100952,
-0.08659633249044418,
0.19682207703590393,
-0.006819764152169228,
0.053875964134931564,
0.09678394347429276,
0.040521956980228424,
-0.1393292397260666,
0.017362095415592194,
0.0653749480843544,
-0.13014306128025055,
-0.0015971452230587602,
0.14987480640411377,
-0.03321990743279457,
0.06261564791202545,
0.025737185031175613,
-0.15349054336547852,
0.007175884209573269,
0.016026854515075684,
-0.03733941540122032,
-0.06770605593919754,
-0.007119782734662294,
-0.05202222615480423,
0.16782356798648834,
0.21928487718105316,
-0.02958107367157936,
0.0053253937512636185,
-0.08943041414022446,
0.010548874735832214,
0.045428402721881866,
0.06473004072904587,
-0.04231264442205429,
-0.20457394421100616,
0.010392826981842518,
0.06366275250911713,
-0.004461780656129122,
-0.1948833465576172,
-0.09894385188817978,
0.0529630221426487,
-0.04055685177445412,
-0.04217951372265816,
0.09502807259559631,
0.021055735647678375,
0.036844950169324875,
-0.01195201463997364,
-0.12097208946943283,
-0.02191832661628723,
0.1387224644422531,
-0.1786680668592453,
-0.028823137283325195
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
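No usage guidance is given yet; the sketch below queries the checkpoint through the raw model outputs instead of the `pipeline` helper, which makes the start/end-logit span extraction explicit (the question and context strings are placeholders, everything else is standard `transformers` usage):
```python
# Span extraction from start/end logits; assumes transformers with a PyTorch backend.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

name = "anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

question = "What was the base model?"
context = "The checkpoint is a fine-tuned version of SpanBERT/spanbert-base-cased."
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]))
```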
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
55,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-32-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09926343709230423,
0.09803375601768494,
-0.0023355018347501755,
0.09246624261140823,
0.12422728538513184,
0.019909044727683067,
0.09348292648792267,
0.1289600133895874,
-0.0984756350517273,
0.06783876568078995,
0.08768877387046814,
0.032717280089855194,
0.041286688297986984,
0.13975128531455994,
-0.00544354971498251,
-0.2781513035297394,
-0.0015492640668526292,
-0.0028630965389311314,
-0.05055268853902817,
0.1197071298956871,
0.08887344598770142,
-0.10957146435976028,
0.0736064612865448,
0.00679375696927309,
-0.15256191790103912,
0.01736271195113659,
-0.030724558979272842,
-0.03400476276874542,
0.12361639738082886,
-0.02756159007549286,
0.10705176740884781,
0.028345391154289246,
0.1377917230129242,
-0.21090422570705414,
0.007097307126969099,
0.07763848453760147,
0.05495075508952141,
0.09763973206281662,
0.046752285212278366,
0.013422627002000809,
0.10015816986560822,
-0.14810626208782196,
0.09499699622392654,
0.029699556529521942,
-0.08931837975978851,
-0.15384726226329803,
-0.08794067054986954,
0.027826376259326935,
0.051772069185972214,
0.07551656663417816,
0.003563088132068515,
0.13475173711776733,
-0.07128344476222992,
0.0865008607506752,
0.25137999653816223,
-0.3134840130805969,
-0.06745388358831406,
0.02500215545296669,
0.06044599041342735,
0.06351906806230545,
-0.1246546134352684,
-0.002563095185905695,
0.016985265538096428,
0.02749260514974594,
0.12314444780349731,
-0.01303727738559246,
-0.1058385893702507,
-0.010917415842413902,
-0.12288815528154373,
-0.0020169690251350403,
0.062309980392456055,
0.029655249789357185,
-0.04990458860993385,
-0.10744916647672653,
-0.06815753877162933,
-0.08033624291419983,
-0.023103831335902214,
-0.051911309361457825,
0.04616463556885719,
-0.05130735784769058,
-0.09678557515144348,
-0.04323834553360939,
-0.05912158265709877,
-0.08114129304885864,
-0.005671611521393061,
0.16911624372005463,
0.03395792469382286,
0.019880300387740135,
-0.029962701722979546,
0.11585086584091187,
0.027409590780735016,
-0.1381019949913025,
-0.010539739392697811,
-0.0031912620179355145,
-0.09801606088876724,
-0.040828339755535126,
-0.05644059553742409,
-0.006900042295455933,
0.00204021530225873,
0.1653245985507965,
-0.07236246764659882,
0.07451476156711578,
0.014995289035141468,
-0.026483530178666115,
-0.014875844120979309,
0.1527080088853836,
-0.0421452522277832,
-0.04562118276953697,
-0.01596636325120926,
0.08146243542432785,
-0.001948941731825471,
-0.02150275558233261,
-0.06548025459051132,
-0.029529446735978127,
0.09568056464195251,
0.054878268390893936,
-0.06019465625286102,
0.03645315393805504,
-0.02713933028280735,
-0.02488700859248638,
0.017208851873874664,
-0.1184844970703125,
0.04028394818305969,
0.001058027846738696,
-0.07660867273807526,
-0.004087965004146099,
0.000889801187440753,
-0.010258805006742477,
-0.005507303401827812,
0.09816838055849075,
-0.08488655835390091,
-0.004721370525658131,
-0.06708651781082153,
-0.07836337387561798,
-0.00003171892240061425,
-0.14404207468032837,
-0.007539673242717981,
-0.0580294094979763,
-0.16339142620563507,
-0.03678203001618385,
0.042780403047800064,
-0.07515361160039902,
-0.015539252199232578,
-0.04323657229542732,
-0.06102365627884865,
0.022166814655065536,
-0.01268794946372509,
0.1995120644569397,
-0.05026286840438843,
0.08335345983505249,
-0.008747042156755924,
0.048627693206071854,
0.027985336259007454,
0.03888208791613579,
-0.09528813511133194,
0.025828272104263306,
-0.13243378698825836,
0.08400779217481613,
-0.08489136397838593,
-0.0057389624416828156,
-0.13574625551700592,
-0.09870912879705429,
0.00841568037867546,
-0.018571794033050537,
0.08960539847612381,
0.1334824562072754,
-0.19517704844474792,
-0.02040979638695717,
0.12471303343772888,
-0.07452888041734695,
-0.043979279696941376,
0.06363382935523987,
-0.06487400084733963,
0.03886809200048447,
0.05402583256363869,
0.2061096429824829,
0.06296603381633759,
-0.1483597457408905,
-0.005682696122676134,
0.015290958806872368,
0.052227918058633804,
0.02770003117620945,
0.04301607981324196,
0.0019807773642241955,
0.05479472875595093,
0.013620484620332718,
-0.0923023670911789,
-0.020838109776377678,
-0.09130105376243591,
-0.06505297869443893,
-0.04971465468406677,
-0.07516548782587051,
0.0543082021176815,
0.011390991508960724,
0.038539908826351166,
-0.059761445969343185,
-0.10583803057670593,
0.11625438928604126,
0.1004091203212738,
-0.05600910261273384,
0.03794921189546585,
-0.07644076645374298,
0.0101796118542552,
-0.006140409503132105,
-0.03465558961033821,
-0.21294155716896057,
-0.12351595610380173,
0.04794970899820328,
-0.033965736627578735,
0.02058158814907074,
0.015744071453809738,
0.08472390472888947,
0.05677873641252518,
-0.05279712378978729,
-0.013755188323557377,
-0.09834739565849304,
0.001912650652229786,
-0.11523997038602829,
-0.19084139168262482,
-0.08820419758558273,
-0.04552225396037102,
0.09649118036031723,
-0.17869232594966888,
-0.008944082073867321,
0.02298351749777794,
0.1313338279724121,
0.025209838524460793,
-0.06789133697748184,
-0.0011771749705076218,
0.04376176372170448,
0.012271976098418236,
-0.09653905034065247,
0.05548252537846565,
0.012468075379729271,
-0.1073247566819191,
-0.04700207710266113,
-0.12691834568977356,
-0.017765812575817108,
0.04996737837791443,
0.05983272194862366,
-0.09952420741319656,
-0.060073330998420715,
-0.07371972501277924,
-0.038082629442214966,
-0.07792865484952927,
0.016165880486369133,
0.2130226343870163,
0.039507877081632614,
0.11030997335910797,
-0.061178456991910934,
-0.0813475251197815,
-0.007902517914772034,
0.030708573758602142,
0.025120310485363007,
0.0888807624578476,
0.02317127399146557,
-0.04269816726446152,
0.06573382765054703,
0.1031356230378151,
-0.022625315934419632,
0.130062073469162,
-0.05570349469780922,
-0.0844443142414093,
-0.029894577339291573,
-0.017464328557252884,
-0.026232779026031494,
0.12500542402267456,
-0.03829805925488472,
0.0003585406520869583,
0.034517984837293625,
0.03998730331659317,
0.011526827700436115,
-0.16873984038829803,
0.0015327517176046968,
0.031489718705415726,
-0.05562065169215202,
-0.04338363930583,
-0.005464727059006691,
0.017686229199171066,
0.08567342907190323,
0.030370695516467094,
-0.003966694697737694,
0.00769948773086071,
-0.013315990567207336,
-0.0571967288851738,
0.19057431817054749,
-0.09239900857210159,
-0.07559353858232498,
-0.07322592288255692,
0.0178266279399395,
-0.04376048222184181,
-0.03720879927277565,
0.0065116072073578835,
-0.09219399839639664,
-0.028211094439029694,
-0.08102203160524368,
-0.02278212457895279,
-0.02762029506266117,
0.019542863592505455,
0.02440614253282547,
-0.018733995035290718,
0.07974172383546829,
-0.1360035240650177,
0.007633812725543976,
-0.04873354732990265,
-0.09778883308172226,
0.004571565892547369,
0.07454220950603485,
0.09154071658849716,
0.08491799235343933,
-0.014399862848222256,
0.024360155686736107,
-0.0393020398914814,
0.23160965740680695,
-0.05529900640249252,
0.011533095501363277,
0.11768528074026108,
-0.015353837981820107,
0.05207790806889534,
0.09366792440414429,
0.0378030464053154,
-0.09197868406772614,
0.02314174734055996,
0.07862844318151474,
-0.03817174211144447,
-0.22835604846477509,
-0.014901414513587952,
-0.00623167073354125,
-0.08337629586458206,
0.10268282890319824,
0.03160896524786949,
-0.051676660776138306,
0.04192708432674408,
0.019562631845474243,
-0.00862675067037344,
-0.041160888969898224,
0.06854420155286789,
0.07679495215415955,
0.04724826291203499,
0.10868638008832932,
-0.004651697352528572,
-0.019883008673787117,
0.05451061949133873,
0.015699194744229317,
0.26044192910194397,
-0.04131641983985901,
0.10416863858699799,
0.0316375307738781,
0.15044298768043518,
-0.021776108071208,
0.06510432064533234,
0.0003940443566534668,
-0.010344412177801132,
-0.012445330619812012,
-0.06202547624707222,
-0.03008536994457245,
0.014401298947632313,
-0.04392479732632637,
0.023281633853912354,
-0.08232077211141586,
0.02800697274506092,
0.020269043743610382,
0.28651800751686096,
0.030928559601306915,
-0.25459200143814087,
-0.0772835910320282,
-0.014770781621336937,
-0.050941091030836105,
-0.06029089167714119,
0.00815594382584095,
0.1389959454536438,
-0.13878150284290314,
0.04482736065983772,
-0.07758119702339172,
0.08687935769557953,
-0.05041588097810745,
0.011721456423401833,
0.0497933067381382,
0.14901098608970642,
-0.016878735274076462,
0.05557812750339508,
-0.1951000988483429,
0.25295984745025635,
0.017840122804045677,
0.10409659147262573,
-0.065742626786232,
0.01362514030188322,
0.021951982751488686,
0.019699109718203545,
0.11580029875040054,
0.002691019792109728,
-0.07068829238414764,
-0.14660263061523438,
-0.09114933013916016,
0.048296041786670685,
0.14129161834716797,
-0.04419543966650963,
0.08978275209665298,
-0.036076344549655914,
0.012793561443686485,
0.037366028875112534,
-0.03485000506043434,
-0.14840079843997955,
-0.08693540841341019,
-0.001524685532785952,
0.008694256655871868,
-0.007671569474041462,
-0.061327144503593445,
-0.10541583597660065,
-0.010131504386663437,
0.10500910133123398,
0.0054207053035497665,
-0.05464291200041771,
-0.1578587293624878,
0.09029972553253174,
0.14306262135505676,
-0.0582045279443264,
0.01110503263771534,
0.01669435203075409,
0.11182998865842819,
0.03524256870150566,
-0.07741579413414001,
0.062130145728588104,
-0.06178595498204231,
-0.17873986065387726,
-0.05548153817653656,
0.12271704524755478,
0.08215665072202682,
0.049881402403116226,
0.00006488761573564261,
0.05012769252061844,
0.0009344883146695793,
-0.09626071900129318,
0.03423244133591652,
0.00757526746019721,
0.03506181389093399,
0.017126135528087616,
-0.08836211264133453,
0.09931986778974533,
-0.0355682410299778,
0.010303233750164509,
0.13171035051345825,
0.20717328786849976,
-0.10581304877996445,
0.11132868379354477,
0.0867212638258934,
-0.07408463954925537,
-0.16673749685287476,
0.05994101241230965,
0.13001921772956848,
0.012148838490247726,
0.0851266160607338,
-0.2138296663761139,
0.12329067289829254,
0.09942955523729324,
-0.010317943058907986,
0.008847622200846672,
-0.27820971608161926,
-0.1273958832025528,
0.0595981739461422,
0.10994471609592438,
0.04455704987049103,
-0.11730580031871796,
-0.03606439754366875,
-0.004028360825031996,
-0.09353125840425491,
0.1102006733417511,
-0.072503000497818,
0.11553394794464111,
-0.016292179003357887,
0.1104108914732933,
0.02549699880182743,
-0.031190142035484314,
0.10793350636959076,
0.061151765286922455,
0.08005395531654358,
-0.034665293991565704,
0.008984201587736607,
0.054601434618234634,
-0.05600022152066231,
0.0169502105563879,
-0.043802279978990555,
0.06708955764770508,
-0.1516462117433548,
-0.0007709331694059074,
-0.0910164937376976,
0.050864022225141525,
-0.04844692721962929,
-0.07178536057472229,
-0.013177560642361641,
0.05279207229614258,
0.07415671646595001,
-0.04001924768090248,
0.026513902470469475,
-0.00531373405829072,
0.09722844511270523,
0.09674122184515,
0.07999762892723083,
-0.017144372686743736,
-0.09218724071979523,
0.010401892475783825,
0.0041431849822402,
0.05463920533657074,
-0.10494855791330338,
0.013769567012786865,
0.13762697577476501,
0.06594453006982803,
0.09584075212478638,
0.04714398831129074,
-0.03947184234857559,
0.003768053837120533,
0.012574638240039349,
-0.12026602774858475,
-0.11350958049297333,
0.02317047119140625,
-0.04598565399646759,
-0.1547568440437317,
0.021162565797567368,
0.12029165774583817,
-0.04141208156943321,
-0.016958575695753098,
-0.008422449231147766,
0.004056185018271208,
-0.013505125418305397,
0.18584948778152466,
0.04603175073862076,
0.06342316418886185,
-0.08745048940181732,
0.106195367872715,
0.03567364811897278,
-0.05204557254910469,
0.05128731578588486,
0.06306961178779602,
-0.10400150716304779,
0.007982708513736725,
0.07713545858860016,
0.12483023852109909,
-0.04898742586374283,
-0.01029442623257637,
-0.089841328561306,
-0.08339638262987137,
0.04074441269040108,
0.13053975999355316,
0.05439111962914467,
-0.001716137514449656,
-0.07117263972759247,
0.0409245491027832,
-0.11925824731588364,
0.0710538998246193,
0.044312238693237305,
0.07063081115484238,
-0.10111155360937119,
0.1310303658246994,
-0.0009873805101960897,
0.026939835399389267,
-0.02615530975162983,
0.015090840868651867,
-0.09575331956148148,
-0.02438288740813732,
-0.10867127031087875,
-0.02548130415380001,
-0.00917030218988657,
0.0010223612189292908,
-0.0227594505995512,
-0.07379776984453201,
-0.026595966890454292,
0.039279185235500336,
-0.07528334110975266,
-0.05018424615263939,
0.01485705841332674,
0.039833344519138336,
-0.1498723328113556,
0.0012485008919611573,
0.02802547998726368,
-0.09255437552928925,
0.0908665880560875,
0.061686672270298004,
0.014891421422362328,
0.02696862816810608,
-0.11276312172412872,
-0.027758928015828133,
-0.010479702614247799,
0.0060648429207503796,
0.06473970413208008,
-0.09549792855978012,
-0.026394633576273918,
-0.038700416684150696,
0.04643448442220688,
0.01737915351986885,
0.09885995835065842,
-0.11737392097711563,
-0.005380831193178892,
-0.04041833057999611,
-0.04174135997891426,
-0.06319449841976166,
0.035873644053936005,
0.10250070691108704,
0.05455784127116203,
0.14881110191345215,
-0.07347214221954346,
0.059027206152677536,
-0.20151175558567047,
-0.036052003502845764,
0.010553311556577682,
-0.0423104353249073,
-0.08327064663171768,
-0.05283114314079285,
0.0889257863163948,
-0.044564589858055115,
0.1058555543422699,
-0.02055731788277626,
0.11143022775650024,
0.04186518117785454,
-0.010520683601498604,
-0.058819856494665146,
-0.006872687954455614,
0.1887088268995285,
0.05934930220246315,
-0.016819758340716362,
0.1300118863582611,
0.0003921398310922086,
0.030505826696753502,
0.08611442893743515,
0.22213056683540344,
0.16160792112350464,
0.0005529632908292115,
0.06407225877046585,
0.06112869828939438,
-0.07309403270483017,
-0.15177351236343384,
0.11731421202421188,
-0.018465396016836166,
0.10198380053043365,
-0.06764987111091614,
0.1902623325586319,
0.03844861313700676,
-0.18224704265594482,
0.06326964497566223,
-0.025303591042757034,
-0.11181355267763138,
-0.12167225033044815,
-0.02303396724164486,
-0.06922456622123718,
-0.11987947672605515,
0.0238109789788723,
-0.11685209721326828,
0.06139471381902695,
0.10290201753377914,
0.00824659038335085,
0.03781485557556152,
0.18475341796875,
-0.04445694386959076,
0.011442173272371292,
0.08321110159158707,
0.019295180216431618,
0.00666129169985652,
-0.04447546601295471,
-0.06659048795700073,
0.03631235659122467,
0.03273800387978554,
0.0629219189286232,
-0.052715789526700974,
0.0008350475109182298,
0.00921336468309164,
-0.006492447108030319,
-0.07726766914129257,
0.010170924477279186,
0.010141984559595585,
0.05405089259147644,
0.050718873739242554,
0.04636791720986366,
0.005581536330282688,
-0.05389363318681717,
0.2958619296550751,
-0.0696897953748703,
-0.06933937221765518,
-0.12930428981781006,
0.20656098425388336,
0.0224453154951334,
-0.021660950034856796,
0.05465502664446831,
-0.08405798673629761,
-0.012987415306270123,
0.1696757972240448,
0.13075155019760132,
-0.09389356523752213,
-0.01600019261240959,
-0.013748697005212307,
-0.010172701440751553,
-0.015107087790966034,
0.11756132543087006,
0.07685165107250214,
-0.011455175466835499,
-0.06883551180362701,
-0.0185954961925745,
-0.021550238132476807,
-0.05679558217525482,
-0.06257553398609161,
0.06895127892494202,
0.0265792328864336,
-0.006426288280636072,
-0.06128358095884323,
0.0692233145236969,
0.00009735589992487803,
-0.24285008013248444,
0.04288073256611824,
-0.17143514752388,
-0.17018282413482666,
-0.026311885565519333,
0.07278940081596375,
0.0064856442622840405,
0.05707075819373131,
0.0008821299998089671,
0.02049754559993744,
0.12285879999399185,
-0.012557774782180786,
-0.004148528911173344,
-0.10897624492645264,
0.11738470941781998,
-0.08525194227695465,
0.19652768969535828,
-0.006806211080402136,
0.05471091344952583,
0.09670887887477875,
0.040860455483198166,
-0.13848525285720825,
0.01778442971408367,
0.06505777686834335,
-0.12897038459777832,
-0.0018385557923465967,
0.14835071563720703,
-0.033294226974248886,
0.06252314895391464,
0.026216620579361916,
-0.1531832218170166,
0.006387670058757067,
0.014782514423131943,
-0.03730807825922966,
-0.0673183724284172,
-0.008708355017006397,
-0.05138631910085678,
0.16788196563720703,
0.2183661013841629,
-0.029319116845726967,
0.0047046123072505,
-0.08961118757724762,
0.01037047989666462,
0.04597717523574829,
0.06455966830253601,
-0.042471274733543396,
-0.20418675243854523,
0.010852782987058163,
0.0635291337966919,
-0.0042596496641635895,
-0.19470083713531494,
-0.09944775700569153,
0.05291980132460594,
-0.041059985756874084,
-0.04178059473633766,
0.09505394101142883,
0.021166730672121048,
0.03722022473812103,
-0.012165398336946964,
-0.1211882084608078,
-0.02237573079764843,
0.13862189650535583,
-0.17861929535865784,
-0.029125096276402473
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
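The exact split is not documented; judging from the repository name, training presumably used a 512-example subsample of SQuAD drawn with seed 0. The snippet below is only a hypothetical illustration of such a subsample with the `datasets` library, not a description of the actual selection procedure:
```python
# Hypothetical k=512 few-shot subsample; the real selection procedure is not documented here.
from datasets import load_dataset

squad = load_dataset("squad")
few_shot_train = squad["train"].shuffle(seed=0).select(range(512))  # k=512, seed-0 assumed
eval_split = squad["validation"]
print(len(few_shot_train), len(eval_split))
```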
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an approximate `TrainingArguments` mapping is sketched after this list):
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
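Unlike the step-capped k-32 runs above, this run uses an epoch budget; an approximate `TrainingArguments` sketch follows, with unlisted arguments assumed to stay at their defaults:
```python
# Approximate reconstruction of the listed hyperparameters; not the original training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0",
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,  # epoch budget rather than a fixed step count
)
```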
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09619186073541641,
0.11424469202756882,
-0.0023460325319319963,
0.09160967171192169,
0.11931473761796951,
0.0222989022731781,
0.10062847286462784,
0.12891706824302673,
-0.09665769338607788,
0.08748069405555725,
0.0878918468952179,
0.039508260786533356,
0.047779690474271774,
0.1463567018508911,
-0.020621174946427345,
-0.259102463722229,
0.010331019759178162,
-0.004004384391009808,
-0.03417292237281799,
0.11181710660457611,
0.0846916139125824,
-0.11070802807807922,
0.08647225797176361,
0.014961148612201214,
-0.1533486545085907,
0.019199365749955177,
-0.03679472208023071,
-0.03508223220705986,
0.11322050541639328,
-0.032233331352472305,
0.10896486788988113,
0.025081219151616096,
0.1334080994129181,
-0.21045327186584473,
0.005117871332913637,
0.07437635958194733,
0.0454922690987587,
0.10076765716075897,
0.05206679925322533,
0.01448766142129898,
0.0894092470407486,
-0.15310493111610413,
0.09271236509084702,
0.030401045456528664,
-0.09170734882354736,
-0.13117776811122894,
-0.09629125148057938,
0.025488832965493202,
0.051327429711818695,
0.06898684054613113,
0.001692401710897684,
0.15219981968402863,
-0.059466443955898285,
0.07903765141963959,
0.267377644777298,
-0.32606977224349976,
-0.06381887197494507,
0.03211598843336105,
0.05971338227391243,
0.05209185183048248,
-0.12322580069303513,
-0.006130116526037455,
0.0270221009850502,
0.029414743185043335,
0.11783584952354431,
-0.016664739698171616,
-0.11225339025259018,
-0.013319111429154873,
-0.12740354239940643,
-0.0008079814142547548,
0.07087966799736023,
0.03511830419301987,
-0.051906611770391464,
-0.09469397366046906,
-0.07560622692108154,
-0.09402250498533249,
-0.024952426552772522,
-0.0639747902750969,
0.05660509690642357,
-0.054803479462862015,
-0.08085417747497559,
-0.035656481981277466,
-0.05665753409266472,
-0.07618462294340134,
-0.019526276737451553,
0.157478928565979,
0.039754509925842285,
0.021099666133522987,
-0.033816125243902206,
0.10834866762161255,
0.0028346313629299402,
-0.14183028042316437,
-0.015819285064935684,
-0.0014104042202234268,
-0.09694512188434601,
-0.04671577736735344,
-0.04987572878599167,
-0.019077427685260773,
0.010794983245432377,
0.1778779923915863,
-0.08025147765874863,
0.07574822753667831,
0.00899700541049242,
-0.02891162782907486,
-0.0065866573713719845,
0.14800970256328583,
-0.04361874982714653,
-0.0466899648308754,
-0.010015858337283134,
0.07322606444358826,
0.002995099173858762,
-0.015265284106135368,
-0.06446895748376846,
-0.027004852890968323,
0.101927749812603,
0.04583033546805382,
-0.05848934128880501,
0.0402611680328846,
-0.02355027385056019,
-0.028529632836580276,
0.016800720244646072,
-0.11475755274295807,
0.04455243796110153,
-0.0011247835354879498,
-0.08443493396043777,
-0.0005749976262450218,
0.0008479595999233425,
-0.005419112741947174,
-0.007729162462055683,
0.1118205264210701,
-0.09971515834331512,
-0.0024350646417587996,
-0.0642186626791954,
-0.08359704911708832,
0.009063446894288063,
-0.1547849029302597,
-0.016726642847061157,
-0.05762948840856552,
-0.16977302730083466,
-0.032238662242889404,
0.03751494362950325,
-0.07378493994474411,
-0.007376588881015778,
-0.04843074083328247,
-0.06529387086629868,
0.025596410036087036,
-0.01430688239634037,
0.1734120398759842,
-0.05409983918070793,
0.07298780232667923,
-0.0008298468892462552,
0.0462181381881237,
0.014396118931472301,
0.03584081679582596,
-0.10540584474802017,
0.02492368035018444,
-0.13772448897361755,
0.06862855702638626,
-0.08460333198308945,
-0.002697039395570755,
-0.13303126394748688,
-0.09764165431261063,
0.010056615807116032,
-0.022511612623929977,
0.09100847691297531,
0.13833729922771454,
-0.19417117536067963,
-0.018945571035146713,
0.12678620219230652,
-0.07527472823858261,
-0.0641639307141304,
0.061803966760635376,
-0.06061090901494026,
0.029406895861029625,
0.05168112367391586,
0.21143889427185059,
0.03849511966109276,
-0.16785651445388794,
-0.03284803777933121,
-0.00724639231339097,
0.03998812288045883,
0.02722969651222229,
0.03983620926737785,
0.003954550717025995,
0.06415817141532898,
0.014155492186546326,
-0.07575850188732147,
-0.032694291323423386,
-0.09157669544219971,
-0.06477023661136627,
-0.05464804172515869,
-0.07194286584854126,
0.04057120904326439,
0.0035588389728218317,
0.04229973256587982,
-0.0645512044429779,
-0.1013663187623024,
0.11935868859291077,
0.09596791863441467,
-0.04745170846581459,
0.03720896691083908,
-0.07945312559604645,
0.01980036124587059,
-0.01938987709581852,
-0.0391198992729187,
-0.2063767910003662,
-0.13026070594787598,
0.05199168622493744,
-0.05775579810142517,
0.033630866557359695,
0.0055394358932971954,
0.08142763376235962,
0.06121581792831421,
-0.04343240708112717,
-0.011632254347205162,
-0.09368254244327545,
0.003233879804611206,
-0.11717666685581207,
-0.1891748011112213,
-0.07731674611568451,
-0.03973378986120224,
0.09261500835418701,
-0.17308592796325684,
-0.006887109484523535,
0.01585649698972702,
0.14450620114803314,
0.028097044676542282,
-0.0686451718211174,
-0.00317175080999732,
0.03825271502137184,
0.0019121242221444845,
-0.09514423459768295,
0.044897548854351044,
0.007875144481658936,
-0.09310827404260635,
-0.06265608966350555,
-0.13610562682151794,
-0.010584814473986626,
0.060325976461172104,
0.05218540132045746,
-0.09685302525758743,
-0.04563253000378609,
-0.07083559781312943,
-0.04030855745077133,
-0.07669606059789658,
0.013339054770767689,
0.201598659157753,
0.036005012691020966,
0.11288684606552124,
-0.06671249866485596,
-0.07771597057580948,
-0.003317014081403613,
0.022426892071962357,
0.0124419080093503,
0.07681751251220703,
0.04168710485100746,
-0.05292532220482826,
0.07522477954626083,
0.09880527853965759,
-0.022387251257896423,
0.1250949651002884,
-0.04617089033126831,
-0.08372470736503601,
-0.03346883878111839,
-0.02476789988577366,
-0.028039734810590744,
0.1246037632226944,
-0.03907759487628937,
0.006133650429546833,
0.03680386021733284,
0.04517799988389015,
0.01699812337756157,
-0.16167931258678436,
0.008118926547467709,
0.02166798897087574,
-0.05336626619100571,
-0.03810938820242882,
-0.000434326590038836,
0.02779272012412548,
0.09269742667675018,
0.03158632665872574,
-0.01243904884904623,
0.0024411845952272415,
-0.012024673633277416,
-0.0615462027490139,
0.18478941917419434,
-0.09893979132175446,
-0.08558402955532074,
-0.07526203244924545,
0.005949090700596571,
-0.059986114501953125,
-0.036210767924785614,
0.01663881354033947,
-0.0878937765955925,
-0.039561718702316284,
-0.08728528022766113,
-0.017251938581466675,
-0.01707299053668976,
0.020833805203437805,
0.030794432386755943,
-0.022674962878227234,
0.0804881826043129,
-0.13951241970062256,
0.001092705992050469,
-0.051940854638814926,
-0.09185586124658585,
-0.0003813373332377523,
0.07496082782745361,
0.09811288118362427,
0.07953781634569168,
-0.016926415264606476,
0.029906783252954483,
-0.034467823803424835,
0.24170050024986267,
-0.046041056513786316,
0.010888660326600075,
0.10354969650506973,
-0.012676064856350422,
0.05645483732223511,
0.0957883968949318,
0.037929434329271317,
-0.09403964132070541,
0.020644567906856537,
0.08352236449718475,
-0.02887541987001896,
-0.22940510511398315,
-0.02573966048657894,
-0.005254583898931742,
-0.07847007364034653,
0.10558392107486725,
0.0319654680788517,
-0.03711985424160957,
0.044593121856451035,
0.020246472209692,
0.0016111737350001931,
-0.055412907153367996,
0.08193688839673996,
0.07552821189165115,
0.057326480746269226,
0.09993652254343033,
-0.008791168220341206,
-0.027332903817296028,
0.062270838767290115,
0.007644591853022575,
0.24653734266757965,
-0.02502221241593361,
0.1007436215877533,
0.03255615755915642,
0.15093937516212463,
-0.026356147602200508,
0.0641200914978981,
0.0031399994622915983,
-0.009516408666968346,
-0.014355744235217571,
-0.06699754297733307,
-0.025594437494874,
0.022777803242206573,
-0.046272676438093185,
0.029985958710312843,
-0.08145537972450256,
0.02453037165105343,
0.02882813662290573,
0.2794438600540161,
0.03369833901524544,
-0.2734116017818451,
-0.06662212312221527,
-0.013350858353078365,
-0.04100620746612549,
-0.06392408907413483,
0.006088618654757738,
0.11969815939664841,
-0.13309521973133087,
0.06550832092761993,
-0.07637372612953186,
0.09016426652669907,
-0.03836498036980629,
0.011001801118254662,
0.046851374208927155,
0.1535278558731079,
-0.01893579214811325,
0.04929836839437485,
-0.18624038994312286,
0.24303379654884338,
0.025215381756424904,
0.10750674456357956,
-0.06417720019817352,
0.009801266714930534,
0.019428934901952744,
0.00800356175750494,
0.10805553197860718,
0.0011225570924580097,
-0.06751200556755066,
-0.13835805654525757,
-0.09948893636465073,
0.04685185104608536,
0.1416385918855667,
-0.03482412174344063,
0.09886644780635834,
-0.028843341395258904,
0.012758365832269192,
0.033525366336107254,
-0.030289657413959503,
-0.15701408684253693,
-0.07260612398386002,
0.010742639191448689,
0.027982166036963463,
-0.014748184941709042,
-0.05165497213602066,
-0.10390570014715195,
-0.03858664631843567,
0.11900676786899567,
0.0006511809770017862,
-0.04572102054953575,
-0.15115362405776978,
0.0835893526673317,
0.1454702913761139,
-0.0582464225590229,
0.015483342111110687,
0.013312660157680511,
0.11138994246721268,
0.032444216310977936,
-0.0865766853094101,
0.06737996637821198,
-0.0534631609916687,
-0.1742396503686905,
-0.05875902995467186,
0.11838242411613464,
0.0793711319565773,
0.0453827865421772,
0.000012073453035554849,
0.057796988636255264,
0.0010414727730676532,
-0.09675030410289764,
0.036751966923475266,
0.005543984472751617,
0.051947157829999924,
0.028712300583720207,
-0.08488698303699493,
0.07538945972919464,
-0.033974114805459976,
0.017901012673974037,
0.12864172458648682,
0.23409856855869293,
-0.0991453230381012,
0.1036028191447258,
0.07955274730920792,
-0.07598898559808731,
-0.15946532785892487,
0.0616634376347065,
0.1252821683883667,
0.00455169752240181,
0.08336757868528366,
-0.19962358474731445,
0.13336943089962006,
0.10763288289308548,
-0.013850190676748753,
0.022477690130472183,
-0.2716895639896393,
-0.1324421614408493,
0.06468367576599121,
0.10929688811302185,
0.04953569918870926,
-0.12289167195558548,
-0.03516143932938576,
-0.010614436119794846,
-0.12101023644208908,
0.12943199276924133,
-0.07563531398773193,
0.11740424484014511,
-0.021933233365416527,
0.12316179275512695,
0.024540681391954422,
-0.03722750395536423,
0.11327318102121353,
0.07054434716701508,
0.0862170159816742,
-0.03948551416397095,
-0.00346121727488935,
0.0646812915802002,
-0.06263161450624466,
0.034868910908699036,
-0.03743794932961464,
0.06269535422325134,
-0.1482759714126587,
0.007390081882476807,
-0.07828138768672943,
0.06073397397994995,
-0.046366434544324875,
-0.06551038473844528,
-0.028090165928006172,
0.04740043357014656,
0.07305458933115005,
-0.03570307418704033,
0.04753480106592178,
0.007652612403035164,
0.09257252514362335,
0.10064250975847244,
0.07295867055654526,
-0.023336270824074745,
-0.08324671536684036,
0.014557232148945332,
0.00446806475520134,
0.04763925075531006,
-0.08546293526887894,
0.015926167368888855,
0.1468295305967331,
0.06042960658669472,
0.10234645754098892,
0.04599648714065552,
-0.04325208440423012,
0.005930156912654638,
0.017465343698859215,
-0.14340147376060486,
-0.09902460873126984,
0.02872556634247303,
-0.05712555721402168,
-0.15320105850696564,
0.03357134386897087,
0.12314195930957794,
-0.03682403266429901,
-0.016289832070469856,
-0.006126491818577051,
0.009375265799462795,
-0.011290247552096844,
0.18421998620033264,
0.042121149599552155,
0.054280102252960205,
-0.09108210355043411,
0.11433450132608414,
0.035539399832487106,
-0.041998691856861115,
0.05419333279132843,
0.06794506311416626,
-0.0990629717707634,
0.013627412728965282,
0.07318148761987686,
0.15055422484874725,
-0.06749893724918365,
-0.012533819302916527,
-0.09168146550655365,
-0.07607265561819077,
0.04485778138041496,
0.14670351147651672,
0.05250377207994461,
-0.0046648853458464146,
-0.06055436655879021,
0.03580280765891075,
-0.11771895736455917,
0.06800747662782669,
0.052116863429546356,
0.08184562623500824,
-0.10742872208356857,
0.12492690980434418,
-0.007089314516633749,
0.022349106147885323,
-0.028177671134471893,
0.01821018010377884,
-0.10165401548147202,
-0.034130197018384933,
-0.10831982642412186,
-0.014114723540842533,
-0.017720788717269897,
-0.003476113546639681,
-0.019586989656090736,
-0.07584299147129059,
-0.04329336807131767,
0.03282928839325905,
-0.07707703113555908,
-0.04845108091831207,
0.01753004640340805,
0.03992336988449097,
-0.16212397813796997,
0.0032643477898091078,
0.026494672521948814,
-0.0873534083366394,
0.08757968991994858,
0.06976421177387238,
0.015716230496764183,
0.027714410796761513,
-0.1255611628293991,
-0.033318739384412766,
-0.00030684194643981755,
0.01030963845551014,
0.07770883291959763,
-0.093647800385952,
-0.02958768419921398,
-0.031220193952322006,
0.04834846779704094,
0.015543215908110142,
0.10369674116373062,
-0.119207002222538,
-0.012637410312891006,
-0.045154038816690445,
-0.03833571821451187,
-0.05686607584357262,
0.02624964714050293,
0.11429909616708755,
0.04518614709377289,
0.157021626830101,
-0.07052519172430038,
0.054608747363090515,
-0.2041202336549759,
-0.0325244665145874,
0.011209671385586262,
-0.047381866723299026,
-0.0743965283036232,
-0.04435279220342636,
0.08416140079498291,
-0.05037989094853401,
0.12138772755861282,
-0.015301558189094067,
0.09190934896469116,
0.044671639800071716,
-0.004723771475255489,
-0.0720113143324852,
-0.012123584747314453,
0.1835772842168808,
0.05727427452802658,
-0.02097560279071331,
0.12073978781700134,
0.004119975958019495,
0.04218287765979767,
0.06758814305067062,
0.23600022494792938,
0.15154831111431122,
-0.012149279937148094,
0.07423187047243118,
0.06602797657251358,
-0.0755506157875061,
-0.1411580890417099,
0.12114766985177994,
-0.02109590545296669,
0.10661554336547852,
-0.05287281051278114,
0.1897527426481247,
0.038827501237392426,
-0.17653848230838776,
0.05450022965669632,
-0.025243466719985008,
-0.10731296986341476,
-0.1259671300649643,
-0.01651308685541153,
-0.08154504746198654,
-0.1165395975112915,
0.027606550604104996,
-0.12330912053585052,
0.06949884444475174,
0.09504813700914383,
0.00762645760551095,
0.0358503982424736,
0.1834729015827179,
-0.058300625532865524,
0.01071237027645111,
0.07188870757818222,
0.021557465195655823,
-0.004731169901788235,
-0.04028328135609627,
-0.06705498695373535,
0.03649964928627014,
0.043046850711107254,
0.07112697511911392,
-0.0500105656683445,
0.009391219355165958,
0.015023408457636833,
-0.010254410095512867,
-0.07846033573150635,
0.007820549421012402,
0.014371522702276707,
0.04858968406915665,
0.036534227430820465,
0.047088563442230225,
0.009272085502743721,
-0.05330520495772362,
0.2759115397930145,
-0.06771377474069595,
-0.06165669858455658,
-0.12333023548126221,
0.19554302096366882,
0.03336431458592415,
-0.01876183971762657,
0.05504624545574188,
-0.09300592541694641,
-0.013392237946391106,
0.16222122311592102,
0.135042205452919,
-0.09095106273889542,
-0.02097851224243641,
-0.02454080618917942,
-0.00849419366568327,
-0.011935090646147728,
0.10399936139583588,
0.07106059044599533,
0.0020418937783688307,
-0.06626754999160767,
-0.013301372528076172,
-0.02990497462451458,
-0.04739980399608612,
-0.06202109903097153,
0.059269897639751434,
0.02639712393283844,
-0.00695628160610795,
-0.05966729298233986,
0.06317722052335739,
-0.005218928214162588,
-0.23565173149108887,
0.03924039006233215,
-0.1740521639585495,
-0.1735990047454834,
-0.013794326223433018,
0.07103235274553299,
0.000603867752943188,
0.05607287585735321,
-0.005973500199615955,
0.009563214145600796,
0.11620095372200012,
-0.01676705852150917,
-0.013835775665938854,
-0.1184966191649437,
0.10868009179830551,
-0.1086229458451271,
0.21272911131381989,
-0.0015255971811711788,
0.06374379992485046,
0.0991949588060379,
0.03811347484588623,
-0.13467341661453247,
0.019020333886146545,
0.0619763508439064,
-0.1276823729276657,
0.0004618045350071043,
0.14574481546878815,
-0.034819845110177994,
0.06375051289796829,
0.03141941502690315,
-0.14964573085308075,
-0.0026046643033623695,
0.027147045359015465,
-0.03768931329250336,
-0.06861033290624619,
-0.009650888852775097,
-0.05562086030840874,
0.1656474471092224,
0.20687083899974823,
-0.028518924489617348,
0.012014019303023815,
-0.08461031317710876,
0.021947434172034264,
0.048489153385162354,
0.05822794884443283,
-0.040055569261312485,
-0.21644027531147003,
0.022532885894179344,
0.07264996320009232,
-0.0031036960426717997,
-0.194415882229805,
-0.09568168222904205,
0.042193688452243805,
-0.035804592072963715,
-0.04612903669476509,
0.09135933965444565,
0.02373286336660385,
0.03711015731096268,
-0.0193007942289114,
-0.11547468602657318,
-0.02668777108192444,
0.14594659209251404,
-0.1753440946340561,
-0.042742807418107986
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
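The card above describes an extractive question-answering checkpoint. As a minimal usage sketch (assuming the model id recorded later in this row, `anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10`, is still available on the Hugging Face Hub), it can be queried with the standard `transformers` question-answering pipeline; the question and context strings are illustrative only.

```python
# Minimal inference sketch for the checkpoint described in this card.
# Assumes the hub id recorded in this row is still reachable; strings are illustrative.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
)
print(result["answer"], round(result["score"], 3))
```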
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09701802581548691,
0.11383537203073502,
-0.0022766750771552324,
0.09110088646411896,
0.11912036687135696,
0.022329431027173996,
0.10108508169651031,
0.12778528034687042,
-0.09577248245477676,
0.08611389249563217,
0.08684318512678146,
0.04014597088098526,
0.048279859125614166,
0.14775799214839935,
-0.01965445652604103,
-0.26001161336898804,
0.010203367099165916,
-0.0032289254013448954,
-0.03271442651748657,
0.11168364435434341,
0.08515287935733795,
-0.11079072952270508,
0.08566240221261978,
0.014980755746364594,
-0.15286840498447418,
0.01955465041100979,
-0.037138260900974274,
-0.034848056733608246,
0.11315848678350449,
-0.03374398872256279,
0.1086292713880539,
0.024884512647986412,
0.13550513982772827,
-0.20942309498786926,
0.00512869656085968,
0.07343244552612305,
0.04549374058842659,
0.10046039521694183,
0.051450587809085846,
0.015382049605250359,
0.08877576142549515,
-0.15389113128185272,
0.09351978451013565,
0.029453257098793983,
-0.09123632311820984,
-0.1290239542722702,
-0.09618140757083893,
0.02629212476313114,
0.05367782339453697,
0.06842755526304245,
0.0012667491100728512,
0.1521807610988617,
-0.060248393565416336,
0.07900897413492203,
0.26677367091178894,
-0.3268178701400757,
-0.06371834874153137,
0.0339965894818306,
0.06147186458110809,
0.0525456927716732,
-0.12347808480262756,
-0.007017144933342934,
0.027574097737669945,
0.028710637241601944,
0.11894030123949051,
-0.017180927097797394,
-0.11199697107076645,
-0.013515787199139595,
-0.1283177137374878,
0.000027767142455559224,
0.07140809297561646,
0.03595226630568504,
-0.05194179341197014,
-0.09616955369710922,
-0.07531559467315674,
-0.09461918473243713,
-0.025937408208847046,
-0.06521232426166534,
0.056450873613357544,
-0.055386126041412354,
-0.08041124790906906,
-0.03592563793063164,
-0.056029755622148514,
-0.07674642652273178,
-0.017746448516845703,
0.1556023359298706,
0.04007955268025398,
0.02020322158932686,
-0.033287838101387024,
0.10819850862026215,
0.0005921275587752461,
-0.14143306016921997,
-0.014862060546875,
-0.0016162422252818942,
-0.09851841628551483,
-0.04791910946369171,
-0.050137441605329514,
-0.018409643322229385,
0.009398486465215683,
0.17855997383594513,
-0.07720024883747101,
0.07573147118091583,
0.010960468091070652,
-0.02967916615307331,
-0.006160777527838945,
0.14804291725158691,
-0.04443381354212761,
-0.0489073321223259,
-0.010357716120779514,
0.07428180426359177,
0.0025233018677681684,
-0.014000666327774525,
-0.06533205509185791,
-0.02758776769042015,
0.10224581509828568,
0.04565608873963356,
-0.06005227193236351,
0.040413182228803635,
-0.022626405581831932,
-0.028410498052835464,
0.017992878332734108,
-0.11539081484079361,
0.04476960748434067,
-0.0019191449973732233,
-0.08558745682239532,
-0.001781352562829852,
-0.0005153703968971968,
-0.004347524605691433,
-0.007703899405896664,
0.11043468117713928,
-0.09942328929901123,
-0.002845207927748561,
-0.06468645483255386,
-0.08419079333543777,
0.009179564192891121,
-0.1581045687198639,
-0.015392069704830647,
-0.05669834464788437,
-0.17324158549308777,
-0.032992828637361526,
0.0365758016705513,
-0.07293061912059784,
-0.008132335729897022,
-0.04962072893977165,
-0.06509232521057129,
0.02387017197906971,
-0.013670030049979687,
0.17441439628601074,
-0.05310934782028198,
0.07185959070920944,
-0.0009348921594209969,
0.047113899141550064,
0.014745855703949928,
0.03618713840842247,
-0.1049225926399231,
0.024724818766117096,
-0.13666972517967224,
0.06883653253316879,
-0.08484695851802826,
-0.0008632008684799075,
-0.13267582654953003,
-0.09781260043382645,
0.008332242257893085,
-0.022341188043355942,
0.0907282754778862,
0.13838617503643036,
-0.19518108665943146,
-0.017945613712072372,
0.12804065644741058,
-0.07473035156726837,
-0.06342747062444687,
0.06071864441037178,
-0.061091259121894836,
0.030827181413769722,
0.05334502458572388,
0.21090398728847504,
0.0410408191382885,
-0.16708222031593323,
-0.03321126103401184,
-0.006960955914109945,
0.03983140364289284,
0.025368979200720787,
0.03990543261170387,
0.005357000976800919,
0.0652921125292778,
0.014277747832238674,
-0.07504131644964218,
-0.03272572159767151,
-0.09092678129673004,
-0.06517297774553299,
-0.05422947183251381,
-0.0728851780295372,
0.04097806289792061,
0.003984123934060335,
0.04266269505023956,
-0.06470084190368652,
-0.10124953091144562,
0.1200474351644516,
0.09668624401092529,
-0.04740916192531586,
0.035741522908210754,
-0.07946240901947021,
0.019287681207060814,
-0.020254608243703842,
-0.039250731468200684,
-0.2061840295791626,
-0.12844125926494598,
0.05234281346201897,
-0.058016687631607056,
0.03343138098716736,
0.007721117697656155,
0.08211733400821686,
0.060837145894765854,
-0.04316331446170807,
-0.012205288745462894,
-0.0936446562409401,
0.0031431957613676786,
-0.11816412955522537,
-0.18767505884170532,
-0.07831492274999619,
-0.0402819998562336,
0.09351501613855362,
-0.17449192702770233,
-0.006022047717124224,
0.014000071212649345,
0.1440434604883194,
0.027354897931218147,
-0.06871871650218964,
-0.002290382981300354,
0.03732200339436531,
0.0027037246618419886,
-0.09555843472480774,
0.044604603201150894,
0.00682598352432251,
-0.09286724776029587,
-0.0643497183918953,
-0.1371707320213318,
-0.012178617529571056,
0.0596407875418663,
0.054705556482076645,
-0.09658391028642654,
-0.0461479052901268,
-0.07074414193630219,
-0.039702240377664566,
-0.07557564228773117,
0.01278223842382431,
0.20075754821300507,
0.0350768081843853,
0.11232133954763412,
-0.06678793579339981,
-0.07875574380159378,
-0.003503350308164954,
0.02391866035759449,
0.01314693782478571,
0.07640170305967331,
0.04144919291138649,
-0.052943769842386246,
0.0746125653386116,
0.09975798428058624,
-0.021896928548812866,
0.12442947179079056,
-0.04641120508313179,
-0.08444516360759735,
-0.033150359988212585,
-0.02514112927019596,
-0.029137151315808296,
0.12428659945726395,
-0.03945300728082657,
0.004434140399098396,
0.03636851906776428,
0.043896619230508804,
0.016946885734796524,
-0.16204339265823364,
0.008401579223573208,
0.021783828735351562,
-0.05249115452170372,
-0.03947457671165466,
-0.0013920639175921679,
0.026624172925949097,
0.09203092753887177,
0.030929360538721085,
-0.013099105097353458,
0.0018205660162493587,
-0.011793818324804306,
-0.06112369894981384,
0.1848343163728714,
-0.09776836633682251,
-0.08435055613517761,
-0.0746181458234787,
0.006690310779958963,
-0.05803777277469635,
-0.036271676421165466,
0.015421448275446892,
-0.08796947449445724,
-0.03898521885275841,
-0.08684530854225159,
-0.01814785972237587,
-0.01695442572236061,
0.02012082189321518,
0.032344479113817215,
-0.021980248391628265,
0.07870956510305405,
-0.13964082300662994,
0.001753970980644226,
-0.05245301499962807,
-0.09158510714769363,
-0.0010022231144830585,
0.07393387705087662,
0.09834460914134979,
0.08006593585014343,
-0.017523745074868202,
0.030055904760956764,
-0.03508920222520828,
0.24106159806251526,
-0.046576615422964096,
0.012331601232290268,
0.10324877500534058,
-0.012340093962848186,
0.05605250597000122,
0.0965767577290535,
0.03718312084674835,
-0.09374964982271194,
0.020343171432614326,
0.08267120271921158,
-0.028355177491903305,
-0.23000583052635193,
-0.025123760104179382,
-0.004517331253737211,
-0.07981421798467636,
0.10629867762327194,
0.031453706324100494,
-0.035061147063970566,
0.046321261674165726,
0.020037632435560226,
0.003011892084032297,
-0.05409251153469086,
0.08148735761642456,
0.07323739677667618,
0.056429922580718994,
0.10021014511585236,
-0.00904412567615509,
-0.02850104495882988,
0.060757022351026535,
0.008374359458684921,
0.24748502671718597,
-0.023957500234246254,
0.10053069144487381,
0.03153910115361214,
0.1503297984600067,
-0.02704549767076969,
0.06646022945642471,
0.004095335956662893,
-0.009918506257236004,
-0.014329012483358383,
-0.06669950485229492,
-0.024343542754650116,
0.02329356037080288,
-0.04553817957639694,
0.02978256344795227,
-0.08081842213869095,
0.024251006543636322,
0.028140638023614883,
0.27820825576782227,
0.03523040935397148,
-0.27445077896118164,
-0.0659874677658081,
-0.012978078797459602,
-0.04203435778617859,
-0.06377560645341873,
0.005752839148044586,
0.11868655681610107,
-0.1329948455095291,
0.06623770296573639,
-0.07637423276901245,
0.09036371856927872,
-0.0376916341483593,
0.011104848235845566,
0.045587942004203796,
0.1532742977142334,
-0.01875404082238674,
0.05044617876410484,
-0.18641217052936554,
0.24187621474266052,
0.025165865197777748,
0.10881216824054718,
-0.0653098002076149,
0.009961440227925777,
0.019432729110121727,
0.007538078352808952,
0.10933203250169754,
0.0007804777123965323,
-0.0676531046628952,
-0.1388792097568512,
-0.09972994774580002,
0.04683365300297737,
0.141724094748497,
-0.034466370940208435,
0.09942649304866791,
-0.02797507308423519,
0.011802898719906807,
0.03378822281956673,
-0.03171870484948158,
-0.15830914676189423,
-0.07244873046875,
0.010135291144251823,
0.0268408190459013,
-0.014846546575427055,
-0.05123889818787575,
-0.1039411649107933,
-0.04066044092178345,
0.11760152876377106,
0.0018615216249600053,
-0.04589197039604187,
-0.15098997950553894,
0.085231252014637,
0.14582377672195435,
-0.05777551606297493,
0.01608869433403015,
0.014704436995089054,
0.11223164945840836,
0.03192651644349098,
-0.08610665053129196,
0.06654344499111176,
-0.053507041186094284,
-0.1729840636253357,
-0.057858482003211975,
0.11964801698923111,
0.0798150897026062,
0.045664746314287186,
0.0015027533518150449,
0.057101864367723465,
0.0010688757756724954,
-0.09643002599477768,
0.03620709851384163,
0.005288176704198122,
0.051385778933763504,
0.02891489863395691,
-0.08586716651916504,
0.07396531850099564,
-0.03456532210111618,
0.019884031265974045,
0.1288772076368332,
0.23083259165287018,
-0.09897062182426453,
0.10284709185361862,
0.07936853170394897,
-0.07643915712833405,
-0.15916217863559723,
0.06245391070842743,
0.12537115812301636,
0.005356854293495417,
0.08365661650896072,
-0.19926956295967102,
0.13353364169597626,
0.1070304661989212,
-0.013255809433758259,
0.02117743156850338,
-0.2716525197029114,
-0.13194634020328522,
0.06590050458908081,
0.10944140702486038,
0.04726649820804596,
-0.1220836490392685,
-0.034987643361091614,
-0.011183010414242744,
-0.12093627452850342,
0.1288571059703827,
-0.07667522132396698,
0.1165110319852829,
-0.021268120035529137,
0.12210357934236526,
0.02453400194644928,
-0.037333112210035324,
0.11164409667253494,
0.0722794383764267,
0.08646518737077713,
-0.03921503946185112,
-0.00442904606461525,
0.06713326275348663,
-0.062326084822416306,
0.03665319085121155,
-0.03645529970526695,
0.06237918138504028,
-0.14809194207191467,
0.006712295580655336,
-0.07775150239467621,
0.06048397347331047,
-0.04606272280216217,
-0.0657467246055603,
-0.027588307857513428,
0.04741428419947624,
0.07269015908241272,
-0.0357772558927536,
0.04534502699971199,
0.00893367175012827,
0.0918073058128357,
0.09675964713096619,
0.07419103384017944,
-0.022241173312067986,
-0.08237680792808533,
0.014461299404501915,
0.004683061968535185,
0.04705186188220978,
-0.08615993708372116,
0.015320445410907269,
0.14683645963668823,
0.06086720898747444,
0.10260313749313354,
0.04543743282556534,
-0.0433783121407032,
0.00539566483348608,
0.01682276837527752,
-0.14093628525733948,
-0.10039439052343369,
0.028559353202581406,
-0.058144133538007736,
-0.15384025871753693,
0.034791912883520126,
0.12116234749555588,
-0.0376499705016613,
-0.017180336639285088,
-0.007238544523715973,
0.008934194222092628,
-0.010893060825765133,
0.18555134534835815,
0.0429266095161438,
0.05443131923675537,
-0.09155045449733734,
0.11386317014694214,
0.03582952544093132,
-0.04229452833533287,
0.0539262555539608,
0.06742458790540695,
-0.0998077318072319,
0.01327080000191927,
0.07439279556274414,
0.15093667805194855,
-0.06438474357128143,
-0.012870639562606812,
-0.09147821366786957,
-0.07611779868602753,
0.04519877955317497,
0.1461360603570938,
0.052721548825502396,
-0.005955343134701252,
-0.060162972658872604,
0.036208223551511765,
-0.11824918538331985,
0.06801987439393997,
0.05222401022911072,
0.08172135800123215,
-0.10772542655467987,
0.12515166401863098,
-0.006986947730183601,
0.02289852499961853,
-0.028148796409368515,
0.019023990258574486,
-0.10080239921808243,
-0.034514542669057846,
-0.10685495287179947,
-0.015530886128544807,
-0.019101755693554878,
-0.003333761589601636,
-0.02022595889866352,
-0.07529252022504807,
-0.042996007949113846,
0.03246142715215683,
-0.07682792097330093,
-0.04855259880423546,
0.01709122210741043,
0.03912845999002457,
-0.16113221645355225,
0.003251671325415373,
0.025763938203454018,
-0.08686971664428711,
0.08691408485174179,
0.0690549910068512,
0.016476595774292946,
0.02836538851261139,
-0.12483806908130646,
-0.03277609869837761,
0.0006305581191554666,
0.010712041519582272,
0.07728992402553558,
-0.09338998794555664,
-0.029076432809233665,
-0.030992677435278893,
0.04943234100937843,
0.014706145040690899,
0.10087069123983383,
-0.11821824312210083,
-0.013607270084321499,
-0.04634145274758339,
-0.037769172340631485,
-0.057329390197992325,
0.026909012347459793,
0.11376544088125229,
0.043691299855709076,
0.15761007368564606,
-0.06877119839191437,
0.05453219264745712,
-0.20489008724689484,
-0.03307840973138809,
0.01049004215747118,
-0.04735901951789856,
-0.07414544373750687,
-0.045121412724256516,
0.08431795984506607,
-0.049919433891773224,
0.12241184711456299,
-0.015315842814743519,
0.09289387613534927,
0.04375402629375458,
-0.0016213774215430021,
-0.07155350595712662,
-0.011406551115214825,
0.1840817779302597,
0.05751248821616173,
-0.0215297844260931,
0.11989721655845642,
0.004672515206038952,
0.04254377633333206,
0.06611550599336624,
0.23332759737968445,
0.1516849845647812,
-0.01308975089341402,
0.07422996312379837,
0.06686556339263916,
-0.07574039697647095,
-0.13994182646274567,
0.12243298441171646,
-0.021810133010149002,
0.10480066388845444,
-0.052624840289354324,
0.19201919436454773,
0.03845350816845894,
-0.1765505075454712,
0.05507785826921463,
-0.024484137073159218,
-0.10819602757692337,
-0.12472803890705109,
-0.01828979328274727,
-0.08183987438678741,
-0.11539217084646225,
0.027919737622141838,
-0.12334755808115005,
0.06778419017791748,
0.09527573734521866,
0.007608390878885984,
0.03514673933386803,
0.1847342550754547,
-0.058865275233983994,
0.011046036146581173,
0.07215776294469833,
0.021114090457558632,
-0.004178482107818127,
-0.04070371016860008,
-0.06612630933523178,
0.03771935775876045,
0.042229361832141876,
0.07180971652269363,
-0.052313946187496185,
0.008777616545557976,
0.014930108562111855,
-0.009408412501215935,
-0.07784626632928848,
0.007963844574987888,
0.013962958008050919,
0.04851960390806198,
0.035425927489995956,
0.04734444618225098,
0.008716740645468235,
-0.05372147262096405,
0.2744821012020111,
-0.06768449395895004,
-0.06261806935071945,
-0.12369920313358307,
0.19282501935958862,
0.03438832610845566,
-0.01879298873245716,
0.05594943091273308,
-0.09332943707704544,
-0.011136398650705814,
0.16311459243297577,
0.13524256646633148,
-0.08871638029813766,
-0.021528199315071106,
-0.024139173328876495,
-0.008876577019691467,
-0.012719869613647461,
0.10350129753351212,
0.07145664840936661,
-0.0003708606236614287,
-0.0660696029663086,
-0.012637191452085972,
-0.028342658653855324,
-0.0486622229218483,
-0.06185729056596756,
0.05911947041749954,
0.027386654168367386,
-0.007308262400329113,
-0.05780360475182533,
0.06479351967573166,
-0.0030132620595395565,
-0.2355911284685135,
0.03795027732849121,
-0.1729860007762909,
-0.17372868955135345,
-0.014448394067585468,
0.07077663391828537,
0.0023709714878350496,
0.055966686457395554,
-0.005781505722552538,
0.009876882657408714,
0.11558526009321213,
-0.01612097956240177,
-0.014447974972426891,
-0.11855733394622803,
0.1089099645614624,
-0.10985484719276428,
0.21160735189914703,
-0.00176405836828053,
0.06476353853940964,
0.09905688464641571,
0.03646567091345787,
-0.13450093567371368,
0.0194613765925169,
0.06210321560502052,
-0.1255643665790558,
0.002036954276263714,
0.1461997777223587,
-0.03457355499267578,
0.061456143856048584,
0.030595026910305023,
-0.14991839230060577,
-0.002259587636217475,
0.027283454313874245,
-0.03712068125605583,
-0.0695255696773529,
-0.007106670178472996,
-0.0548626147210598,
0.16635717451572418,
0.2066783457994461,
-0.028951648622751236,
0.012545452453196049,
-0.0851154625415802,
0.02132675051689148,
0.0488370843231678,
0.05837768316268921,
-0.03983912244439125,
-0.21613018214702606,
0.021347787231206894,
0.07001469284296036,
-0.0025592695455998182,
-0.19367146492004395,
-0.09468641877174377,
0.041071873158216476,
-0.0372159481048584,
-0.0461987629532814,
0.0905754491686821,
0.025403259322047234,
0.03713355213403702,
-0.019018245860934258,
-0.1157686784863472,
-0.02698887139558792,
0.1459505707025528,
-0.17629772424697876,
-0.04187227785587311
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
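The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below is an assumed reconstruction of that configuration from the values in this card (the author's original training script is not included here); the `output_dir` name is a placeholder.

```python
# Hedged reconstruction of the training configuration listed in this card.
# Not the author's original script; it only mirrors the hyperparameter values above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2",  # placeholder name
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```

Passing these arguments to a `Trainer` together with the SQuAD few-shot subset would approximate the run summarized above, though exact results also depend on the data sampling, which this card does not specify.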
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09579430520534515,
0.11478172242641449,
-0.002373844152316451,
0.09141865372657776,
0.11867187917232513,
0.02269737794995308,
0.10071933269500732,
0.1285841166973114,
-0.09636662155389786,
0.08655231446027756,
0.08752021193504333,
0.03893868252635002,
0.04766376316547394,
0.14678867161273956,
-0.019829167053103447,
-0.2600948214530945,
0.009911463595926762,
-0.004685971420258284,
-0.03450951725244522,
0.11164409667253494,
0.0857575535774231,
-0.10982316732406616,
0.08600731939077377,
0.015367548912763596,
-0.15389181673526764,
0.019543496891856194,
-0.036203037947416306,
-0.034123439341783524,
0.11300452053546906,
-0.032965514808893204,
0.10895974189043045,
0.02587384730577469,
0.13520576059818268,
-0.2100490778684616,
0.004798361100256443,
0.07267912477254868,
0.04591664299368858,
0.10080882906913757,
0.052060775458812714,
0.015394861809909344,
0.08822319656610489,
-0.15332038700580597,
0.0924358144402504,
0.029785452410578728,
-0.09112973511219025,
-0.1300676167011261,
-0.09658592194318771,
0.026441019028425217,
0.053493432700634,
0.06733841449022293,
0.0021163923665881157,
0.15091198682785034,
-0.05954392999410629,
0.07869445532560349,
0.2668158710002899,
-0.3275105059146881,
-0.06411159038543701,
0.03254278376698494,
0.06007636711001396,
0.05342306196689606,
-0.12193991988897324,
-0.006263501942157745,
0.027565835043787956,
0.028995456174016,
0.11792141944169998,
-0.01702667772769928,
-0.11192586272954941,
-0.013565157540142536,
-0.12835544347763062,
-0.001754482858814299,
0.0718558058142662,
0.03532055392861366,
-0.05298001319169998,
-0.09388123452663422,
-0.07596689462661743,
-0.09236659854650497,
-0.024614036083221436,
-0.06562840938568115,
0.056745368987321854,
-0.054047439247369766,
-0.08007879555225372,
-0.03813827782869339,
-0.05741528794169426,
-0.07704498618841171,
-0.01868819259107113,
0.15753135085105896,
0.04006305709481239,
0.0216514952480793,
-0.03294060006737709,
0.10938235372304916,
0.002325245412066579,
-0.14109301567077637,
-0.015518924221396446,
-0.000953750975895673,
-0.0974632129073143,
-0.04727952554821968,
-0.05007705092430115,
-0.017797304317355156,
0.010176965966820717,
0.17744329571723938,
-0.07951266318559647,
0.07496720552444458,
0.009462709538638592,
-0.0284718070179224,
-0.00615668622776866,
0.1479114294052124,
-0.04249773547053337,
-0.045748598873615265,
-0.009871212765574455,
0.07331844419240952,
0.0033594134729355574,
-0.014233261346817017,
-0.06539811193943024,
-0.028239324688911438,
0.1029568612575531,
0.04690643772482872,
-0.059613704681396484,
0.03835996985435486,
-0.02428254298865795,
-0.028419166803359985,
0.01615932583808899,
-0.11542918533086777,
0.04453651234507561,
-0.0016373289981856942,
-0.08398830890655518,
-0.002095672767609358,
0.00041442844667471945,
-0.004285983741283417,
-0.006675101816654205,
0.10939689725637436,
-0.09823817759752274,
-0.0016963205998763442,
-0.06320653855800629,
-0.08208917826414108,
0.009100326336920261,
-0.15483926236629486,
-0.016241420060396194,
-0.05754644423723221,
-0.17078189551830292,
-0.03105422854423523,
0.037377454340457916,
-0.07448877394199371,
-0.010389922186732292,
-0.04849493131041527,
-0.06425539404153824,
0.025114281103014946,
-0.014377846382558346,
0.17406709492206573,
-0.05321725085377693,
0.0728791132569313,
-0.0006273504695855081,
0.046930886805057526,
0.014483424834907055,
0.03525197133421898,
-0.10404328256845474,
0.025815851986408234,
-0.13776199519634247,
0.06933755427598953,
-0.08384741097688675,
-0.003997830208390951,
-0.13398520648479462,
-0.09732749313116074,
0.010081226006150246,
-0.023351356387138367,
0.09009906649589539,
0.13827691972255707,
-0.19375170767307281,
-0.017536696046590805,
0.12671977281570435,
-0.07626175135374069,
-0.06327594816684723,
0.062134966254234314,
-0.06087316945195198,
0.0325440987944603,
0.051164865493774414,
0.21112921833992004,
0.040193960070610046,
-0.16727283596992493,
-0.030961541458964348,
-0.0052517992444336414,
0.03947335481643677,
0.026969389989972115,
0.04158440977334976,
0.003947946708649397,
0.06256907433271408,
0.013960890471935272,
-0.07749001681804657,
-0.03292708098888397,
-0.0918823704123497,
-0.06584332883358002,
-0.05470629781484604,
-0.07245628535747528,
0.04209166765213013,
0.0019313590601086617,
0.04300301522016525,
-0.0639885738492012,
-0.10016898065805435,
0.11930732429027557,
0.09704777598381042,
-0.047002147883176804,
0.03738206997513771,
-0.07974077016115189,
0.019412389025092125,
-0.02119266800582409,
-0.03992631658911705,
-0.20600122213363647,
-0.12942585349082947,
0.05298718437552452,
-0.05825446546077728,
0.033551935106515884,
0.007785916328430176,
0.08108341693878174,
0.061056192964315414,
-0.0432722233235836,
-0.012112845666706562,
-0.09400244057178497,
0.002720574149861932,
-0.11766768991947174,
-0.1883922815322876,
-0.07784554362297058,
-0.040112193673849106,
0.09383498132228851,
-0.17337125539779663,
-0.007129597011953592,
0.015023821033537388,
0.14398394525051117,
0.027534306049346924,
-0.06817556917667389,
-0.0035639218986034393,
0.03652556613087654,
0.0020329905673861504,
-0.09483257681131363,
0.04481348395347595,
0.008064057677984238,
-0.09382428973913193,
-0.062320172786712646,
-0.1347937136888504,
-0.011246968992054462,
0.05823764204978943,
0.053379807621240616,
-0.0967692956328392,
-0.04647046700119972,
-0.07097698748111725,
-0.040539659559726715,
-0.07681945711374283,
0.012957338243722916,
0.20105072855949402,
0.03481580689549446,
0.11264923959970474,
-0.06718060374259949,
-0.07799083739519119,
-0.0037286251317709684,
0.02155200205743313,
0.012197072617709637,
0.07619768381118774,
0.04150266572833061,
-0.05473069101572037,
0.073735311627388,
0.10027004778385162,
-0.022192982956767082,
0.12366735935211182,
-0.04640227183699608,
-0.0840810164809227,
-0.034784235060214996,
-0.022856775671243668,
-0.028528500348329544,
0.12361498177051544,
-0.03832898288965225,
0.006458928342908621,
0.03646887093782425,
0.044984254986047745,
0.016786886379122734,
-0.1628274768590927,
0.008245040662586689,
0.021984698250889778,
-0.05437102168798447,
-0.03688148036599159,
-0.0013336410047486424,
0.02758917585015297,
0.09249075502157211,
0.031599581241607666,
-0.012827994301915169,
0.0035286066122353077,
-0.011930689215660095,
-0.06210670992732048,
0.18402422964572906,
-0.0978786051273346,
-0.0857355073094368,
-0.07629840075969696,
0.006771714426577091,
-0.058629103004932404,
-0.036003440618515015,
0.016303211450576782,
-0.08671513944864273,
-0.038577426224946976,
-0.08723345398902893,
-0.01691088080406189,
-0.01828858256340027,
0.021121658384799957,
0.03289109095931053,
-0.022218754515051842,
0.08129020780324936,
-0.13891544938087463,
0.0013864225475117564,
-0.052055954933166504,
-0.09299609065055847,
-0.00025686449953354895,
0.0748368352651596,
0.09781751036643982,
0.07912095636129379,
-0.016805404797196388,
0.029799679294228554,
-0.03416941687464714,
0.24203208088874817,
-0.0452045276761055,
0.011301006190478802,
0.10392311960458755,
-0.013951918110251427,
0.05693044140934944,
0.09608107805252075,
0.03691193088889122,
-0.09380123764276505,
0.02068098448216915,
0.08213120698928833,
-0.028876155614852905,
-0.2297896295785904,
-0.02576267346739769,
-0.0039816065691411495,
-0.07929400354623795,
0.10581254959106445,
0.031589847058057785,
-0.03905956447124481,
0.04509864002466202,
0.02086590602993965,
0.0017928702291101217,
-0.054798442870378494,
0.0816638320684433,
0.07501517981290817,
0.05659596994519234,
0.09981310367584229,
-0.008589807897806168,
-0.028421755880117416,
0.06243916228413582,
0.008908904157578945,
0.24683748185634613,
-0.024181434884667397,
0.09993551671504974,
0.03220214322209358,
0.1520429104566574,
-0.02680416963994503,
0.06433158367872238,
0.004266409669071436,
-0.009166840463876724,
-0.014987306669354439,
-0.06671800464391708,
-0.025383105501532555,
0.023995401337742805,
-0.04494290426373482,
0.02997821755707264,
-0.08188068866729736,
0.026666564866900444,
0.02786186710000038,
0.27983495593070984,
0.03488132730126381,
-0.2722266614437103,
-0.06565624475479126,
-0.012316322885453701,
-0.041778940707445145,
-0.0633065328001976,
0.005833946168422699,
0.12054918706417084,
-0.13397888839244843,
0.06459037214517593,
-0.07600131630897522,
0.08954234421253204,
-0.03828725963830948,
0.010447698645293713,
0.04488025978207588,
0.1523570865392685,
-0.017611132934689522,
0.05071721598505974,
-0.1844499111175537,
0.2424861639738083,
0.025009801611304283,
0.10689183324575424,
-0.06362046301364899,
0.010197040624916553,
0.018729092553257942,
0.00786596816033125,
0.1095341295003891,
0.0018103467300534248,
-0.06888329982757568,
-0.1386641412973404,
-0.10097158700227737,
0.04655231162905693,
0.14229942858219147,
-0.03590868040919304,
0.09880569577217102,
-0.028806159272789955,
0.0126761170104146,
0.03331267833709717,
-0.029937855899333954,
-0.15756383538246155,
-0.07175734639167786,
0.010395122691988945,
0.02598702162504196,
-0.015337899327278137,
-0.05237428843975067,
-0.10406992584466934,
-0.03695376589894295,
0.11970171332359314,
0.0014230801025405526,
-0.046133849769830704,
-0.150373175740242,
0.08474363386631012,
0.1453283578157425,
-0.058854833245277405,
0.015159251168370247,
0.014072498306632042,
0.11242985725402832,
0.03231602907180786,
-0.08632266521453857,
0.06654933094978333,
-0.053320448845624924,
-0.17470993101596832,
-0.05799722671508789,
0.12023398280143738,
0.07924322783946991,
0.04589907452464104,
0.0013411202235147357,
0.05679796263575554,
0.0021711636800318956,
-0.09645459800958633,
0.03736645728349686,
0.0057824659161269665,
0.051118265837430954,
0.029027441516518593,
-0.08560772240161896,
0.07694485038518906,
-0.034222908318042755,
0.017964348196983337,
0.13075491786003113,
0.234883114695549,
-0.09963145852088928,
0.10453014075756073,
0.0789095014333725,
-0.0769088938832283,
-0.1590869277715683,
0.06044163182377815,
0.12691593170166016,
0.0042861029505729675,
0.08528471738100052,
-0.19962146878242493,
0.13307087123394012,
0.10702701658010483,
-0.014532066881656647,
0.020127173513174057,
-0.272757887840271,
-0.1321210116147995,
0.06445720791816711,
0.10929969698190689,
0.05069076642394066,
-0.1216418668627739,
-0.03590517118573189,
-0.009896607138216496,
-0.12110109627246857,
0.12835371494293213,
-0.07460147142410278,
0.11692823469638824,
-0.0214754119515419,
0.12326613813638687,
0.02467968873679638,
-0.03649280592799187,
0.11405190825462341,
0.0707852691411972,
0.08490326255559921,
-0.03895503655076027,
-0.003545787651091814,
0.06494259089231491,
-0.06298397481441498,
0.03553839772939682,
-0.03595864027738571,
0.06308357417583466,
-0.14885014295578003,
0.006718717515468597,
-0.07743574678897858,
0.0601823665201664,
-0.046828776597976685,
-0.06543959677219391,
-0.027700645849108696,
0.04687413573265076,
0.07290911674499512,
-0.035486310720443726,
0.0451008602976799,
0.009602919220924377,
0.090276800096035,
0.10148028284311295,
0.07307682931423187,
-0.020850086584687233,
-0.08299297839403152,
0.01306266337633133,
0.0043144868686795235,
0.047086238861083984,
-0.08521178364753723,
0.01666800118982792,
0.14600352942943573,
0.05999935045838356,
0.10266514867544174,
0.04545105993747711,
-0.043896857649087906,
0.006514646578580141,
0.01671544462442398,
-0.14266164600849152,
-0.10135892778635025,
0.027926340699195862,
-0.057159267365932465,
-0.15414458513259888,
0.03289274871349335,
0.12389224022626877,
-0.03683595731854439,
-0.01700529456138611,
-0.006929941009730101,
0.009725536219775677,
-0.011550490744411945,
0.18377123773097992,
0.041875287890434265,
0.05485478788614273,
-0.09023826569318771,
0.11397656798362732,
0.03571471571922302,
-0.04058675840497017,
0.05368360877037048,
0.06683404743671417,
-0.09888938814401627,
0.01388928759843111,
0.073716901242733,
0.15011455118656158,
-0.0672837644815445,
-0.012278452515602112,
-0.09132981300354004,
-0.07613623887300491,
0.04428676888346672,
0.14484675228595734,
0.0534934364259243,
-0.005168212112039328,
-0.060117870569229126,
0.035869449377059937,
-0.11703356355428696,
0.06833949685096741,
0.05288789048790932,
0.08192367851734161,
-0.10860386490821838,
0.125105619430542,
-0.007547816261649132,
0.024556364864110947,
-0.028414888307452202,
0.018014023080468178,
-0.10056424140930176,
-0.034596603363752365,
-0.10867138206958771,
-0.013353555463254452,
-0.01722397841513157,
-0.003512031165882945,
-0.01917913556098938,
-0.07631610333919525,
-0.042985279113054276,
0.03341543301939964,
-0.07685564458370209,
-0.04863729327917099,
0.01665761135518551,
0.039852242916822433,
-0.16107332706451416,
0.002473817439749837,
0.027009200304746628,
-0.08760792762041092,
0.08834456652402878,
0.06980155408382416,
0.0166326854377985,
0.027857551351189613,
-0.1233827993273735,
-0.03310762718319893,
0.0006961298058740795,
0.010589714162051678,
0.07684600353240967,
-0.09471230953931808,
-0.030272435396909714,
-0.030650854110717773,
0.04892286658287048,
0.014975570142269135,
0.10416820645332336,
-0.11906784772872925,
-0.013130550272762775,
-0.046597011387348175,
-0.03913405165076256,
-0.05710483342409134,
0.02568470872938633,
0.11306064575910568,
0.04597057029604912,
0.15726013481616974,
-0.07016714662313461,
0.05493343994021416,
-0.20433300733566284,
-0.032638486474752426,
0.010490021668374538,
-0.045522887259721756,
-0.07488688826560974,
-0.04518578574061394,
0.08307936042547226,
-0.05009860172867775,
0.12118878960609436,
-0.01577063463628292,
0.09131952375173569,
0.044123973697423935,
-0.00410277396440506,
-0.06991184502840042,
-0.011918927542865276,
0.18295283615589142,
0.05745582655072212,
-0.020863527432084084,
0.12037012726068497,
0.0030861084815114737,
0.04313277080655098,
0.06587384641170502,
0.23530463874340057,
0.15151359140872955,
-0.012651698663830757,
0.07431323081254959,
0.06657245010137558,
-0.0748758465051651,
-0.1419381946325302,
0.12109861522912979,
-0.02086622454226017,
0.10512376576662064,
-0.05156835913658142,
0.1900835931301117,
0.03908466547727585,
-0.17701727151870728,
0.05342958867549896,
-0.02451150119304657,
-0.10813076794147491,
-0.1263124644756317,
-0.016770394518971443,
-0.08268535137176514,
-0.11602013558149338,
0.027611345052719116,
-0.12320311367511749,
0.06893337517976761,
0.09479749947786331,
0.0066474927589297295,
0.03592139482498169,
0.18209415674209595,
-0.058199603110551834,
0.01098841056227684,
0.07135489583015442,
0.0211209487169981,
-0.003358916612342,
-0.0389830619096756,
-0.06718671321868896,
0.03720833733677864,
0.04437153786420822,
0.07107213884592056,
-0.050652164965867996,
0.01005376037210226,
0.013964813202619553,
-0.010612204670906067,
-0.0783904418349266,
0.007506481371819973,
0.014207634143531322,
0.04846961796283722,
0.034413158893585205,
0.047615569084882736,
0.009083135053515434,
-0.0531502328813076,
0.27585262060165405,
-0.06742891669273376,
-0.0619342178106308,
-0.1232747882604599,
0.19502206146717072,
0.032442256808280945,
-0.01805809885263443,
0.056686270982027054,
-0.09288012981414795,
-0.013041583821177483,
0.16118505597114563,
0.13431525230407715,
-0.09137705713510513,
-0.021217811852693558,
-0.02457880601286888,
-0.008796240203082561,
-0.011809870600700378,
0.10463918745517731,
0.07090283930301666,
0.0009379620896652341,
-0.06701567769050598,
-0.013362045399844646,
-0.03012789599597454,
-0.047317396849393845,
-0.06292315572500229,
0.059368427842855453,
0.02604948915541172,
-0.0058343433775007725,
-0.059140581637620926,
0.06343551725149155,
-0.0034661253448575735,
-0.23495268821716309,
0.037973690778017044,
-0.1726563721895218,
-0.17402495443820953,
-0.013247871771454811,
0.07152704149484634,
0.0004300063301343471,
0.05595802143216133,
-0.007319062948226929,
0.009637599810957909,
0.11645487695932388,
-0.01666608452796936,
-0.01461083348840475,
-0.11698472499847412,
0.10971491783857346,
-0.10964954644441605,
0.21268072724342346,
-0.0006186257814988494,
0.06535909324884415,
0.09846477210521698,
0.037286244332790375,
-0.13541638851165771,
0.018291940912604332,
0.06203562021255493,
-0.12584218382835388,
0.0012423532316461205,
0.14692743122577667,
-0.03473607823252678,
0.06306399405002594,
0.032192256301641464,
-0.14967632293701172,
-0.00405661016702652,
0.028204862028360367,
-0.03728610277175903,
-0.0686410665512085,
-0.009881678968667984,
-0.05677032470703125,
0.1652427464723587,
0.2057962566614151,
-0.029385870322585106,
0.012834514491260052,
-0.08434412628412247,
0.02205999195575714,
0.04883703216910362,
0.059952884912490845,
-0.038807936012744904,
-0.2162420153617859,
0.021289603784680367,
0.07184264808893204,
-0.002327770460397005,
-0.19453869760036469,
-0.09671548753976822,
0.041489679366350174,
-0.03582504764199257,
-0.046142563223838806,
0.09188299626111984,
0.023999173194169998,
0.03646843880414963,
-0.01853249780833721,
-0.11710601300001144,
-0.027671435847878456,
0.1453811377286911,
-0.17564968764781952,
-0.042156487703323364
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
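The card above does not include a usage snippet, so the following is a minimal sketch (not part of the original card) of how this checkpoint could be loaded for extractive question answering with the `transformers` pipeline. The model id is simply the repository name, and the question/context pair is an illustrative placeholder.

```python
# Hedged usage sketch: loads the checkpoint by its repository name and runs one QA query.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4",
)

result = qa(
    question="What dataset was the model fine-tuned on?",  # placeholder example
    context="SpanBERT was fine-tuned on a small few-shot sample of the SQuAD dataset.",
)
print(result["answer"], result["score"])  # extracted answer span and its confidence
```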
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09636647254228592,
0.11389192193746567,
-0.0023361151106655598,
0.09222032129764557,
0.11973018944263458,
0.02301490306854248,
0.10135083645582199,
0.12783251702785492,
-0.09656552970409393,
0.08606206625699997,
0.08757563680410385,
0.03866757079958916,
0.04701310768723488,
0.14656303822994232,
-0.019635280594229698,
-0.26008906960487366,
0.00969864521175623,
-0.00416589668020606,
-0.033847060054540634,
0.11179300397634506,
0.08478758484125137,
-0.11040694266557693,
0.08651191741228104,
0.015089966356754303,
-0.15457557141780853,
0.01999802701175213,
-0.03689328953623772,
-0.03403201699256897,
0.1131887286901474,
-0.03293192759156227,
0.10922179371118546,
0.02558165043592453,
0.13506823778152466,
-0.20866847038269043,
0.005040515214204788,
0.07263773679733276,
0.045343317091464996,
0.10026831179857254,
0.05174073949456215,
0.015167376026511192,
0.08736202120780945,
-0.15326544642448425,
0.09254438430070877,
0.029541144147515297,
-0.09132153540849686,
-0.130467489361763,
-0.09594997763633728,
0.025355765596032143,
0.05284778028726578,
0.06832504272460938,
0.0015664368402212858,
0.15019330382347107,
-0.06008044630289078,
0.07895202189683914,
0.26447105407714844,
-0.32857614755630493,
-0.0644669160246849,
0.032847337424755096,
0.06034928187727928,
0.053505707532167435,
-0.12275084108114243,
-0.005685027688741684,
0.02765846811234951,
0.029911130666732788,
0.1183386966586113,
-0.017313621938228607,
-0.11171883344650269,
-0.01314003486186266,
-0.12821520864963531,
-0.00019229618192184716,
0.0724218562245369,
0.03554863855242729,
-0.052698682993650436,
-0.09467010945081711,
-0.07491982728242874,
-0.09363677352666855,
-0.0250538419932127,
-0.06482025235891342,
0.056758277118206024,
-0.055030666291713715,
-0.08064161241054535,
-0.036216188222169876,
-0.057257793843746185,
-0.07604221254587173,
-0.01850574091076851,
0.15615339577198029,
0.0399773083627224,
0.02134396694600582,
-0.03255024179816246,
0.1088489294052124,
0.0022534988820552826,
-0.14085711538791656,
-0.01441989652812481,
-0.0016351350350305438,
-0.09730883687734604,
-0.04730205237865448,
-0.050737135112285614,
-0.016138695180416107,
0.01036752574145794,
0.17762477695941925,
-0.07966198772192001,
0.0756688192486763,
0.010573796927928925,
-0.029322892427444458,
-0.0060268668457865715,
0.1466447412967682,
-0.04395686835050583,
-0.04741982743144035,
-0.01021014153957367,
0.07384289801120758,
0.0025060977786779404,
-0.014283482916653156,
-0.06537934392690659,
-0.027215447276830673,
0.10198287665843964,
0.04625524580478668,
-0.060744911432266235,
0.040130309760570526,
-0.023311829194426537,
-0.02815794013440609,
0.01656976155936718,
-0.11525996029376984,
0.04423743858933449,
-0.0016196348005905747,
-0.08424769341945648,
-0.001334928092546761,
-0.0003594112058635801,
-0.005131897050887346,
-0.007084650918841362,
0.10968244820833206,
-0.09915333241224289,
-0.0025380458682775497,
-0.0636177659034729,
-0.08312297612428665,
0.008538603782653809,
-0.15467268228530884,
-0.015280558727681637,
-0.057105548679828644,
-0.17114540934562683,
-0.03258483484387398,
0.03720230236649513,
-0.07436522096395493,
-0.009367069229483604,
-0.04858141019940376,
-0.06489178538322449,
0.024667570367455482,
-0.014398564584553242,
0.17465412616729736,
-0.05364185571670532,
0.0718812569975853,
0.0004932644078508019,
0.04685388132929802,
0.014218688011169434,
0.03532833233475685,
-0.10386408120393753,
0.025127239525318146,
-0.13767372071743011,
0.06876038759946823,
-0.08442537486553192,
-0.002299319487065077,
-0.13283850252628326,
-0.09848304837942123,
0.010825552977621555,
-0.02223672904074192,
0.08960723876953125,
0.13805995881557465,
-0.19343076646327972,
-0.017813561484217644,
0.1261398047208786,
-0.07526040822267532,
-0.06340213119983673,
0.06186447665095329,
-0.06110391765832901,
0.03132098913192749,
0.051862187683582306,
0.21116206049919128,
0.040427688509225845,
-0.16670534014701843,
-0.032225314527750015,
-0.006181784905493259,
0.039856214076280594,
0.026372229680418968,
0.04016066715121269,
0.005289600696414709,
0.0628846064209938,
0.014637176878750324,
-0.07649289071559906,
-0.03290173038840294,
-0.09134134650230408,
-0.06543582677841187,
-0.05383435636758804,
-0.07236839830875397,
0.04164326190948486,
0.0025160950608551502,
0.04300585389137268,
-0.06501153111457825,
-0.10065112262964249,
0.11880739778280258,
0.09677378088235855,
-0.047546789050102234,
0.03595313802361488,
-0.08010516315698624,
0.01995123364031315,
-0.02117759920656681,
-0.039388351142406464,
-0.20555974543094635,
-0.1305827796459198,
0.051939450204372406,
-0.05583524703979492,
0.03344739228487015,
0.008165544830262661,
0.0818607360124588,
0.06136833503842354,
-0.04367561265826225,
-0.012585713528096676,
-0.09290478378534317,
0.0030641949269920588,
-0.11739740520715714,
-0.18884781002998352,
-0.07830183953046799,
-0.03996086120605469,
0.0942518338561058,
-0.17454028129577637,
-0.006731708999723196,
0.015375521965324879,
0.14319108426570892,
0.027105143293738365,
-0.06774485111236572,
-0.0026554756332188845,
0.03780398890376091,
0.0029440121725201607,
-0.09466811269521713,
0.045415934175252914,
0.007929409854114056,
-0.09272261708974838,
-0.06335967034101486,
-0.13522207736968994,
-0.00961508322507143,
0.0593392439186573,
0.05266181379556656,
-0.09714744985103607,
-0.04644017666578293,
-0.07073188573122025,
-0.04055042564868927,
-0.07519199699163437,
0.012943761423230171,
0.20200327038764954,
0.034413211047649384,
0.11237095296382904,
-0.06633469462394714,
-0.07760497182607651,
-0.003628031350672245,
0.022959891706705093,
0.013241161592304707,
0.07607674598693848,
0.04037747532129288,
-0.05263064429163933,
0.0740610808134079,
0.09933816641569138,
-0.02251007966697216,
0.12447723746299744,
-0.04664275050163269,
-0.08356861025094986,
-0.034130435436964035,
-0.023848731070756912,
-0.02902797982096672,
0.12435095757246017,
-0.03877097740769386,
0.0051711443811655045,
0.0361422561109066,
0.04431688413023949,
0.017106343060731888,
-0.16197223961353302,
0.008227293379604816,
0.02125246450304985,
-0.05337049439549446,
-0.03799238055944443,
-0.0009477607090957463,
0.026970231905579567,
0.09208817780017853,
0.03140145540237427,
-0.014100944623351097,
0.0033738131169229746,
-0.011781699024140835,
-0.061489287763834,
0.18453668057918549,
-0.09799674153327942,
-0.08462966233491898,
-0.07606099545955658,
0.005109516438096762,
-0.0596076063811779,
-0.036426007747650146,
0.015874361619353294,
-0.08816348761320114,
-0.03875686973333359,
-0.08691509068012238,
-0.017896423116326332,
-0.01796109415590763,
0.02095247246325016,
0.03205423802137375,
-0.022537311539053917,
0.08075820654630661,
-0.1386203020811081,
0.0016285841120406985,
-0.05229029059410095,
-0.09300516545772552,
0.0003278481017332524,
0.07514499872922897,
0.09812919050455093,
0.07895917445421219,
-0.016554003581404686,
0.029825584962964058,
-0.03435191884636879,
0.24275611340999603,
-0.04538355767726898,
0.011429265141487122,
0.10376542061567307,
-0.01312102098017931,
0.055798955261707306,
0.09613066911697388,
0.0378354974091053,
-0.09420536458492279,
0.02050076797604561,
0.08259960263967514,
-0.02840321697294712,
-0.22956213355064392,
-0.02517562173306942,
-0.0043410989455878735,
-0.07929353415966034,
0.10574524849653244,
0.03155447915196419,
-0.03852761909365654,
0.045845530927181244,
0.021665913984179497,
0.0026906842831522226,
-0.05503368750214577,
0.08120110630989075,
0.07638117671012878,
0.056075453758239746,
0.10080127418041229,
-0.008681890554726124,
-0.02869376167654991,
0.06164753809571266,
0.00911522563546896,
0.24743041396141052,
-0.024484850466251373,
0.09964904189109802,
0.03274295851588249,
0.1514597088098526,
-0.026316456496715546,
0.06512860208749771,
0.003748528426513076,
-0.009806099347770214,
-0.014726940542459488,
-0.06654110550880432,
-0.024184972047805786,
0.023096969351172447,
-0.04599223658442497,
0.029381541535258293,
-0.08195248991250992,
0.02512384206056595,
0.028034035116434097,
0.2794007360935211,
0.03483222797513008,
-0.27470824122428894,
-0.06646589189767838,
-0.01308794692158699,
-0.041440531611442566,
-0.06284578889608383,
0.006023978348821402,
0.12001912295818329,
-0.1333668977022171,
0.06533237546682358,
-0.07606633007526398,
0.0892045870423317,
-0.0383809469640255,
0.011395707726478577,
0.04688572883605957,
0.1533481627702713,
-0.018484320491552353,
0.05029628425836563,
-0.18489649891853333,
0.24132156372070312,
0.025116832926869392,
0.10761623829603195,
-0.06381233036518097,
0.01008464116603136,
0.019623301923274994,
0.009082739241421223,
0.10917555540800095,
0.0011781378416344523,
-0.06850776076316833,
-0.1385248601436615,
-0.10009738802909851,
0.04774917662143707,
0.14141042530536652,
-0.0350019708275795,
0.09960996359586716,
-0.027916021645069122,
0.01228269748389721,
0.03298942372202873,
-0.0309952050447464,
-0.1577037125825882,
-0.07234734296798706,
0.009689852595329285,
0.027154404670000076,
-0.014883394353091717,
-0.05169888958334923,
-0.10456822067499161,
-0.038398802280426025,
0.11898189038038254,
0.002707442967221141,
-0.046357255429029465,
-0.15090475976467133,
0.0840822383761406,
0.14549008011817932,
-0.05776262283325195,
0.01511793676763773,
0.014834502711892128,
0.11156073212623596,
0.03328206390142441,
-0.08600354194641113,
0.06660539656877518,
-0.05399399623274803,
-0.17350977659225464,
-0.057980842888355255,
0.11937963962554932,
0.07896528393030167,
0.04517342150211334,
0.0011020867386832833,
0.05670300871133804,
0.0013495555613189936,
-0.09704636037349701,
0.03768577054142952,
0.0038838479667901993,
0.05202065035700798,
0.02861243300139904,
-0.08639291673898697,
0.07674823701381683,
-0.03363504260778427,
0.01890716515481472,
0.12891241908073425,
0.23211154341697693,
-0.0988755002617836,
0.10263542085886002,
0.07934939861297607,
-0.07649710029363632,
-0.15868531167507172,
0.06140763312578201,
0.12580466270446777,
0.005019594915211201,
0.08387643098831177,
-0.2004532366991043,
0.1341915726661682,
0.10648079961538315,
-0.013535960577428341,
0.021963132545351982,
-0.2701011896133423,
-0.13141898810863495,
0.06480718404054642,
0.11002248525619507,
0.051438722759485245,
-0.1220540702342987,
-0.03530125692486763,
-0.00998794473707676,
-0.12051299214363098,
0.12757544219493866,
-0.07717025279998779,
0.11699137091636658,
-0.02181112766265869,
0.12362458556890488,
0.023960905149579048,
-0.03655809164047241,
0.11315961927175522,
0.07165674865245819,
0.08590475469827652,
-0.03909866139292717,
-0.0028153148014098406,
0.06487762928009033,
-0.062291525304317474,
0.03545406833291054,
-0.03702815622091293,
0.062498971819877625,
-0.14923739433288574,
0.006717330310493708,
-0.07827671617269516,
0.059928588569164276,
-0.04663076251745224,
-0.06522249430418015,
-0.026785705238580704,
0.04719770699739456,
0.0718841478228569,
-0.035610586404800415,
0.04377424716949463,
0.0086557911708951,
0.09068798273801804,
0.10023418813943863,
0.07373519986867905,
-0.022021815180778503,
-0.0838383138179779,
0.014216083101928234,
0.003943488467484713,
0.04704831913113594,
-0.08482038974761963,
0.015388943254947662,
0.14671377837657928,
0.05931031331419945,
0.10236909985542297,
0.04625982418656349,
-0.04294029623270035,
0.006018566899001598,
0.017771324142813683,
-0.14279092848300934,
-0.1001889705657959,
0.02812776528298855,
-0.06055419147014618,
-0.1539086252450943,
0.03393898904323578,
0.12375432252883911,
-0.03668975457549095,
-0.01679457537829876,
-0.0069040837697684765,
0.009097050875425339,
-0.011672930791974068,
0.18496261537075043,
0.042017608880996704,
0.05444536730647087,
-0.09132885187864304,
0.11337512731552124,
0.03562378138303757,
-0.04122580960392952,
0.05362242832779884,
0.06758026033639908,
-0.09986765682697296,
0.012931491248309612,
0.07291305065155029,
0.15143190324306488,
-0.06617303937673569,
-0.013133461587131023,
-0.09255807846784592,
-0.07661091536283493,
0.04431324824690819,
0.1437211036682129,
0.053241048008203506,
-0.0058891004882752895,
-0.06044284999370575,
0.03537994623184204,
-0.11765825748443604,
0.06794927269220352,
0.05211399868130684,
0.08217759430408478,
-0.1086520329117775,
0.12443113327026367,
-0.0076184640638530254,
0.0237674992531538,
-0.028420547023415565,
0.018542518839240074,
-0.10126270353794098,
-0.03466380015015602,
-0.10873495787382126,
-0.014266574755311012,
-0.017622698098421097,
-0.0029984579887241125,
-0.019495569169521332,
-0.07504408806562424,
-0.04361603409051895,
0.03342796489596367,
-0.07714197784662247,
-0.048052966594696045,
0.018272358924150467,
0.04045674577355385,
-0.16046451032161713,
0.0028573856689035892,
0.025865742936730385,
-0.08755304664373398,
0.08816517889499664,
0.06972607225179672,
0.016406426206231117,
0.028242751955986023,
-0.12181217968463898,
-0.03363807126879692,
0.00022128266573417932,
0.009886031970381737,
0.07737720757722855,
-0.0939892902970314,
-0.029636375606060028,
-0.03085690177977085,
0.04917563498020172,
0.01520751416683197,
0.10332448035478592,
-0.11830739676952362,
-0.012767148204147816,
-0.04580902308225632,
-0.03788832947611809,
-0.05757354199886322,
0.026163341477513313,
0.11377429962158203,
0.04538292437791824,
0.15769895911216736,
-0.07017457485198975,
0.05409333482384682,
-0.2046044021844864,
-0.03299839422106743,
0.010350736789405346,
-0.04683378338813782,
-0.07418330758810043,
-0.045167434960603714,
0.08381333947181702,
-0.05062993988394737,
0.1227082833647728,
-0.01600630208849907,
0.09173721820116043,
0.04360831156373024,
-0.0038537480868399143,
-0.07098953425884247,
-0.011335829272866249,
0.18271049857139587,
0.05701731517910957,
-0.021529970690608025,
0.12012320756912231,
0.004279577173292637,
0.04229928180575371,
0.06710321456193924,
0.23353266716003418,
0.15069998800754547,
-0.01194670982658863,
0.07449742406606674,
0.0672612264752388,
-0.075349360704422,
-0.1407022327184677,
0.122002013027668,
-0.020847134292125702,
0.10574648529291153,
-0.05245376378297806,
0.1889537274837494,
0.0383225679397583,
-0.17619235813617706,
0.05384599789977074,
-0.02535737119615078,
-0.1082797423005104,
-0.12511177361011505,
-0.01588360220193863,
-0.08202698081731796,
-0.11660642921924591,
0.027751736342906952,
-0.12371360510587692,
0.06759736686944962,
0.0959123820066452,
0.007444146554917097,
0.035440701991319656,
0.18360589444637299,
-0.05710562318563461,
0.01162729226052761,
0.07168944180011749,
0.020603463053703308,
-0.0036964186001569033,
-0.038784053176641464,
-0.06639973819255829,
0.03663387894630432,
0.04288534075021744,
0.07061787694692612,
-0.05155215784907341,
0.009379005990922451,
0.014637153595685959,
-0.009830690920352936,
-0.07797420769929886,
0.007310560904443264,
0.014442160725593567,
0.04817245528101921,
0.03540467098355293,
0.04715682566165924,
0.008264635689556599,
-0.0533100850880146,
0.2747902572154999,
-0.0674225389957428,
-0.06069957837462425,
-0.12409050762653351,
0.19468559324741364,
0.03231482580304146,
-0.01864861138164997,
0.05579182505607605,
-0.0920141413807869,
-0.011940686032176018,
0.16243581473827362,
0.13616971671581268,
-0.09137700498104095,
-0.021556351333856583,
-0.02409820817410946,
-0.008786432445049286,
-0.011965488083660603,
0.10498786717653275,
0.07089193910360336,
0.0004747453494928777,
-0.06703300774097443,
-0.013248683884739876,
-0.02952105738222599,
-0.04730212315917015,
-0.06254316121339798,
0.05884716287255287,
0.02719961106777191,
-0.006208570208400488,
-0.05848436430096626,
0.06364821642637253,
-0.0038012682925909758,
-0.23481181263923645,
0.03887913376092911,
-0.17212152481079102,
-0.17446637153625488,
-0.014355337247252464,
0.07080388069152832,
0.0012158745666965842,
0.056707751005887985,
-0.007174353115260601,
0.009459855034947395,
0.11701134592294693,
-0.016912294551730156,
-0.013750853948295116,
-0.11778565496206284,
0.10984992235898972,
-0.10938248783349991,
0.21146337687969208,
-0.0013090487336739898,
0.06531909108161926,
0.09891043603420258,
0.03692229092121124,
-0.13516055047512054,
0.018948007375001907,
0.06127805635333061,
-0.12637865543365479,
0.0018172813579440117,
0.14607609808444977,
-0.03456898033618927,
0.062265098094940186,
0.031030280515551567,
-0.1487579643726349,
-0.0032235595863312483,
0.028402017429471016,
-0.03709486126899719,
-0.06869477033615112,
-0.009548110887408257,
-0.05635152757167816,
0.16576586663722992,
0.20733506977558136,
-0.029212726280093193,
0.012708340771496296,
-0.08425500988960266,
0.02225920557975769,
0.04934092238545418,
0.058709729462862015,
-0.03962881118059158,
-0.21623627841472626,
0.02142135426402092,
0.07233966141939163,
-0.0028608727734535933,
-0.19559212028980255,
-0.09555477648973465,
0.04154803231358528,
-0.0356936939060688,
-0.04636048525571823,
0.09182127565145493,
0.02490277774631977,
0.037511635571718216,
-0.01901240460574627,
-0.1147991344332695,
-0.027601495385169983,
0.14510421454906464,
-0.17597483098506927,
-0.04284917563199997
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
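The original training script is not included in this card; the snippet below is a hedged sketch of how the hyperparameters listed above could be expressed as `TrainingArguments`. The `output_dir` value and the exact mapping of the listed values onto argument names are assumptions.

```python
# Sketch only: mirrors the hyperparameters listed in the card, not the author's script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```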
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09642046689987183,
0.11360721290111542,
-0.002294799778610468,
0.09194186329841614,
0.11989547312259674,
0.02256525680422783,
0.10098964720964432,
0.12825758755207062,
-0.09586231410503387,
0.0864546075463295,
0.08679883927106857,
0.0394660122692585,
0.04748095944523811,
0.14642339944839478,
-0.019612878561019897,
-0.2596695125102997,
0.009991263039410114,
-0.003865421749651432,
-0.03376559913158417,
0.11179567873477936,
0.084788978099823,
-0.11062474548816681,
0.0857725590467453,
0.014583570882678032,
-0.15475311875343323,
0.020253999158740044,
-0.03716207668185234,
-0.033696312457323074,
0.11303438246250153,
-0.03339816629886627,
0.10883768647909164,
0.025709308683872223,
0.13499967753887177,
-0.20930980145931244,
0.005069805774837732,
0.07321576029062271,
0.04548269137740135,
0.10044515877962112,
0.052541159093379974,
0.015335045754909515,
0.08912965655326843,
-0.15300370752811432,
0.09240961074829102,
0.030106911435723305,
-0.09119559079408646,
-0.1287587583065033,
-0.09641487151384354,
0.024665968492627144,
0.05348048731684685,
0.06901250034570694,
0.0011287516681477427,
0.1511327028274536,
-0.060583215206861496,
0.07916630059480667,
0.2661176323890686,
-0.3271316587924957,
-0.06442460417747498,
0.033873915672302246,
0.06020798534154892,
0.05256075784564018,
-0.12362100183963776,
-0.006582872476428747,
0.027679763734340668,
0.0295905489474535,
0.11738349497318268,
-0.01687634550035,
-0.11317881941795349,
-0.013506334275007248,
-0.12852726876735687,
-0.0002670397807378322,
0.07085280120372772,
0.035610273480415344,
-0.05215342342853546,
-0.09445640444755554,
-0.07539073377847672,
-0.09307487308979034,
-0.024871325120329857,
-0.06476088613271713,
0.05695845186710358,
-0.05508987233042717,
-0.08011168241500854,
-0.03606618195772171,
-0.05700657516717911,
-0.07715148478746414,
-0.018265314400196075,
0.15586380660533905,
0.03996129706501961,
0.021180221810936928,
-0.033275309950113297,
0.10867337882518768,
0.00226855231449008,
-0.14102546870708466,
-0.015418772585690022,
-0.0009728699806146324,
-0.09766360372304916,
-0.04751253500580788,
-0.05021566152572632,
-0.017854416742920876,
0.009927001781761646,
0.1758861392736435,
-0.07988618314266205,
0.07610364258289337,
0.009653570130467415,
-0.029421629384160042,
-0.0066565158776938915,
0.14680734276771545,
-0.04305073991417885,
-0.045957665890455246,
-0.01087644137442112,
0.07381787151098251,
0.0020132227800786495,
-0.013946975581347942,
-0.06479504704475403,
-0.027333201840519905,
0.10241540521383286,
0.04611097648739815,
-0.06024572625756264,
0.03979267552495003,
-0.023540038615465164,
-0.028294259682297707,
0.01689610257744789,
-0.11513155698776245,
0.04441584646701813,
-0.0020599630661308765,
-0.08464944362640381,
-0.0024009409826248884,
-0.0001558464573463425,
-0.005425706971436739,
-0.007433285936713219,
0.11036641150712967,
-0.0995054617524147,
-0.0023176188115030527,
-0.06459855288267136,
-0.08296104520559311,
0.008988240733742714,
-0.15629012882709503,
-0.015666713938117027,
-0.056344956159591675,
-0.1715591847896576,
-0.03269573673605919,
0.03685836121439934,
-0.07439681142568588,
-0.008758557960391045,
-0.0491693913936615,
-0.06547694653272629,
0.024962401017546654,
-0.014014746993780136,
0.17561893165111542,
-0.053365133702754974,
0.07253816723823547,
0.0002464493445586413,
0.04660837724804878,
0.014695907011628151,
0.03594253584742546,
-0.10475919395685196,
0.024879414588212967,
-0.13716816902160645,
0.06882194429636002,
-0.08536018431186676,
-0.002465201308950782,
-0.1337529718875885,
-0.09779925644397736,
0.009655552916228771,
-0.02274318039417267,
0.0903489962220192,
0.1387631744146347,
-0.193728506565094,
-0.01747463457286358,
0.12661759555339813,
-0.07604467868804932,
-0.0634380504488945,
0.06091257557272911,
-0.06080053746700287,
0.030923519283533096,
0.051436230540275574,
0.2114666849374771,
0.03976069763302803,
-0.1664963960647583,
-0.03330899775028229,
-0.006773252040147781,
0.040336061269044876,
0.025829145684838295,
0.039945974946022034,
0.005198841914534569,
0.06395743042230606,
0.014360986649990082,
-0.07632821053266525,
-0.03318501263856888,
-0.0915123000741005,
-0.06483383476734161,
-0.0543450303375721,
-0.0724785104393959,
0.04108769819140434,
0.00374047108925879,
0.0426696352660656,
-0.06458279490470886,
-0.10020020604133606,
0.11881407350301743,
0.09675147384405136,
-0.046947017312049866,
0.03621349483728409,
-0.07966773211956024,
0.018864639103412628,
-0.021655183285474777,
-0.0393175333738327,
-0.2068847268819809,
-0.13086332380771637,
0.05236292630434036,
-0.056409258395433426,
0.033584896475076675,
0.007636439986526966,
0.08176777511835098,
0.06054115295410156,
-0.04382071644067764,
-0.012582877650856972,
-0.09364153444766998,
0.0026347984094172716,
-0.11788808554410934,
-0.18896009027957916,
-0.07838854193687439,
-0.04062023386359215,
0.09209877252578735,
-0.17402639985084534,
-0.007032996509224176,
0.015411938540637493,
0.14378857612609863,
0.02735006809234619,
-0.06822661310434341,
-0.0026511179748922586,
0.0375528410077095,
0.0026466655544936657,
-0.09493044763803482,
0.04523202404379845,
0.007079083938151598,
-0.09227553009986877,
-0.06345705687999725,
-0.13555002212524414,
-0.011272178962826729,
0.05904361978173256,
0.05404407158493996,
-0.09733153134584427,
-0.046420030295848846,
-0.07067999988794327,
-0.040723104029893875,
-0.07633364200592041,
0.01391527894884348,
0.20144018530845642,
0.034697871655225754,
0.11191035062074661,
-0.06649724394083023,
-0.07780495285987854,
-0.0030975572299212217,
0.023234853520989418,
0.01279633678495884,
0.07705644518136978,
0.042102713137865067,
-0.05439648777246475,
0.07481075078248978,
0.09983278065919876,
-0.021930767223238945,
0.12480384111404419,
-0.04678472876548767,
-0.08399098366498947,
-0.032834794372320175,
-0.023526882752776146,
-0.028901778161525726,
0.12416382879018784,
-0.03849567100405693,
0.005551429931074381,
0.03597494587302208,
0.04477860778570175,
0.017133772373199463,
-0.16210904717445374,
0.008252634666860104,
0.021066397428512573,
-0.053229790180921555,
-0.038698937743902206,
-0.000988768064416945,
0.026916971430182457,
0.09241397678852081,
0.031112058088183403,
-0.013772035017609596,
0.0027358634397387505,
-0.011811234056949615,
-0.06152019649744034,
0.18535590171813965,
-0.09762219339609146,
-0.08355285972356796,
-0.07479128986597061,
0.005853038281202316,
-0.059190232306718826,
-0.03670111671090126,
0.01604977808892727,
-0.08918260037899017,
-0.03879266977310181,
-0.08688816428184509,
-0.017685189843177795,
-0.017780061811208725,
0.020158762112259865,
0.03139534592628479,
-0.022222518920898438,
0.08001235872507095,
-0.13934551179409027,
0.0018297981005162,
-0.05275607109069824,
-0.09321986883878708,
0.00002007995317399036,
0.07476724684238434,
0.09803158789873123,
0.07920093834400177,
-0.016925623640418053,
0.029884329065680504,
-0.03445564582943916,
0.2415635585784912,
-0.04587766155600548,
0.010994805954396725,
0.10363160073757172,
-0.012065021321177483,
0.05621069669723511,
0.0963793471455574,
0.03716617822647095,
-0.0942336767911911,
0.020801758393645287,
0.08329347521066666,
-0.028714817017316818,
-0.23061025142669678,
-0.025363657623529434,
-0.004508309997618198,
-0.07943837344646454,
0.10574615001678467,
0.03186497092247009,
-0.03853659704327583,
0.04567969962954521,
0.021029168739914894,
0.0013920770725235343,
-0.054975152015686035,
0.08152499049901962,
0.07430751621723175,
0.05679268762469292,
0.10048335790634155,
-0.008721034973859787,
-0.0280764102935791,
0.061001259833574295,
0.009112785570323467,
0.24854376912117004,
-0.024763545021414757,
0.09938880056142807,
0.032310426235198975,
0.15114878118038177,
-0.026845578104257584,
0.0655936598777771,
0.003514631651341915,
-0.009914970956742764,
-0.014549742452800274,
-0.06636171787977219,
-0.024318426847457886,
0.02306700125336647,
-0.04666392132639885,
0.029510682448744774,
-0.08181749284267426,
0.025477157905697823,
0.027501311153173447,
0.2800476551055908,
0.034933630377054214,
-0.27372851967811584,
-0.06574980914592743,
-0.013329146429896355,
-0.04174301028251648,
-0.06345582008361816,
0.00584045983850956,
0.11961808800697327,
-0.1329651027917862,
0.06514129042625427,
-0.07648135721683502,
0.09010004997253418,
-0.03710673376917839,
0.010742129758000374,
0.0462329238653183,
0.15349777042865753,
-0.018446380272507668,
0.05070953071117401,
-0.18592411279678345,
0.24331586062908173,
0.02523416467010975,
0.10775291174650192,
-0.06408713757991791,
0.0098970802500844,
0.019019443541765213,
0.007561637554317713,
0.10966593772172928,
0.0011391532607376575,
-0.0691586285829544,
-0.13829784095287323,
-0.0994124785065651,
0.047525323927402496,
0.14227335155010223,
-0.035053689032793045,
0.09906353801488876,
-0.027891209349036217,
0.01235394086688757,
0.033753763884305954,
-0.0310931745916605,
-0.15805256366729736,
-0.07227113097906113,
0.00997029710561037,
0.026584269478917122,
-0.01562468335032463,
-0.0513777919113636,
-0.1043272316455841,
-0.03726213797926903,
0.1186324805021286,
0.0026785864029079676,
-0.04575787112116814,
-0.1509196162223816,
0.08486225455999374,
0.14555475115776062,
-0.05812809243798256,
0.015035355463624,
0.01454128697514534,
0.1111704558134079,
0.03241078928112984,
-0.08649042248725891,
0.067536361515522,
-0.05401330441236496,
-0.17382720112800598,
-0.05790426954627037,
0.11891162395477295,
0.07957867532968521,
0.04556893929839134,
0.0009130560792982578,
0.05710512027144432,
0.0016070660203695297,
-0.09676431119441986,
0.037965767085552216,
0.004148500971496105,
0.051958102732896805,
0.029097532853484154,
-0.08594424277544022,
0.0754704475402832,
-0.03422268107533455,
0.0183577723801136,
0.1285337656736374,
0.23304051160812378,
-0.0987737700343132,
0.10299580544233322,
0.07982639968395233,
-0.07686470448970795,
-0.15936064720153809,
0.06227599456906319,
0.12610390782356262,
0.0044982717372477055,
0.08459094911813736,
-0.2003256231546402,
0.13414838910102844,
0.10631150007247925,
-0.01387124601751566,
0.02118772454559803,
-0.27086204290390015,
-0.13143973052501678,
0.06474652141332626,
0.1098850890994072,
0.04939214512705803,
-0.12197753041982651,
-0.03511510416865349,
-0.009707598015666008,
-0.12014244496822357,
0.12830041348934174,
-0.07602944225072861,
0.1171974316239357,
-0.02213590405881405,
0.12371845543384552,
0.024121426045894623,
-0.03708484023809433,
0.11180223524570465,
0.07176734507083893,
0.08616801351308823,
-0.03887159004807472,
-0.003039828035980463,
0.0649431124329567,
-0.06243182718753815,
0.0356200709939003,
-0.037115056067705154,
0.06289707124233246,
-0.14808626472949982,
0.007082692813128233,
-0.07877844572067261,
0.060413699597120285,
-0.046559013426303864,
-0.06522748619318008,
-0.026907240971922874,
0.04763583838939667,
0.07229950278997421,
-0.03604019433259964,
0.04508155211806297,
0.008690237998962402,
0.0923452079296112,
0.10040424019098282,
0.07422662526369095,
-0.02180386148393154,
-0.08277968317270279,
0.013681459240615368,
0.004654655233025551,
0.04720257967710495,
-0.0857764333486557,
0.01519719511270523,
0.1468157321214676,
0.06039465591311455,
0.10225096344947815,
0.04665375128388405,
-0.04356441646814346,
0.006061484105885029,
0.01696857437491417,
-0.14209714531898499,
-0.10107149928808212,
0.028453705832362175,
-0.05710305646061897,
-0.15428423881530762,
0.03446044400334358,
0.12239217758178711,
-0.03760850802063942,
-0.016830159351229668,
-0.006696566008031368,
0.00965464673936367,
-0.011107091791927814,
0.1855962723493576,
0.04206862673163414,
0.05502390116453171,
-0.09118866920471191,
0.11377735435962677,
0.03519183769822121,
-0.042407263070344925,
0.053815122693777084,
0.06783320754766464,
-0.09927279502153397,
0.012908858247101307,
0.07442646473646164,
0.150536447763443,
-0.06620966643095016,
-0.011892172507941723,
-0.09154563397169113,
-0.07634437084197998,
0.04461401700973511,
0.14540621638298035,
0.053077369928359985,
-0.0059877256862819195,
-0.060104139149188995,
0.03583851829171181,
-0.11820579320192337,
0.06821248680353165,
0.05168147012591362,
0.08237290382385254,
-0.10829269140958786,
0.12316182255744934,
-0.0074384487234056,
0.023974033072590828,
-0.028344256803393364,
0.01894528418779373,
-0.10105025768280029,
-0.03461163491010666,
-0.1067834198474884,
-0.014464996755123138,
-0.017687171697616577,
-0.0034650955349206924,
-0.019968615844845772,
-0.075442835688591,
-0.04283902049064636,
0.033490173518657684,
-0.07737518846988678,
-0.048549551516771317,
0.0176447331905365,
0.0399993434548378,
-0.16089531779289246,
0.002998596988618374,
0.02601815201342106,
-0.08698096871376038,
0.08720434457063675,
0.06905282288789749,
0.016475515440106392,
0.028568854555487633,
-0.124242402613163,
-0.03328406438231468,
0.00045815121848136187,
0.00992994848638773,
0.07735676318407059,
-0.09336908906698227,
-0.02995801903307438,
-0.031056959182024002,
0.049300793558359146,
0.015032989904284477,
0.10260935872793198,
-0.11878025531768799,
-0.013482395559549332,
-0.047120943665504456,
-0.03856493532657623,
-0.05698820576071739,
0.02669767662882805,
0.1141221672296524,
0.04549606889486313,
0.15735024213790894,
-0.06996087729930878,
0.05476323142647743,
-0.20459839701652527,
-0.03289756923913956,
0.010638011619448662,
-0.046639516949653625,
-0.07462745904922485,
-0.04443458467721939,
0.08418264985084534,
-0.0506848469376564,
0.12043973058462143,
-0.015457704663276672,
0.09239137917757034,
0.04398559406399727,
-0.0037386808544397354,
-0.07141722738742828,
-0.011460854671895504,
0.18228094279766083,
0.05679601803421974,
-0.021023355424404144,
0.12171448022127151,
0.0045236810110509396,
0.04194312542676926,
0.06853294372558594,
0.23553210496902466,
0.1518261730670929,
-0.01290070079267025,
0.07460445165634155,
0.06702842563390732,
-0.07594318687915802,
-0.14057372510433197,
0.121913842856884,
-0.020517945289611816,
0.10563791543245316,
-0.052785828709602356,
0.1896078884601593,
0.03827821463346481,
-0.1758984923362732,
0.05444847047328949,
-0.025782618671655655,
-0.10810411721467972,
-0.12519125640392303,
-0.015783822163939476,
-0.08196881413459778,
-0.11645987629890442,
0.028113385662436485,
-0.12392506003379822,
0.06832844763994217,
0.096228688955307,
0.007465093396604061,
0.03565327078104019,
0.18429210782051086,
-0.056844644248485565,
0.011682278476655483,
0.07203424721956253,
0.020751310512423515,
-0.0040816948749125,
-0.03906438499689102,
-0.06601332873106003,
0.03773276507854462,
0.04278077185153961,
0.07073958218097687,
-0.05142039433121681,
0.008328328840434551,
0.014491496607661247,
-0.009799035266041756,
-0.07809648662805557,
0.007533466909080744,
0.014770337380468845,
0.048354875296354294,
0.03606657683849335,
0.04706699401140213,
0.008744337595999241,
-0.05340222269296646,
0.27646300196647644,
-0.0677923932671547,
-0.060507405549287796,
-0.12322978675365448,
0.19548408687114716,
0.03334607928991318,
-0.018591852858662605,
0.055689048022031784,
-0.09284066408872604,
-0.012194015085697174,
0.16100488603115082,
0.1344861388206482,
-0.09034498780965805,
-0.02133149467408657,
-0.024460548534989357,
-0.008657816797494888,
-0.011805707588791847,
0.10466457903385162,
0.07121682167053223,
0.0012196199968457222,
-0.06729638576507568,
-0.013019097037613392,
-0.02950848452746868,
-0.04808833450078964,
-0.06159035861492157,
0.05899001657962799,
0.02758033573627472,
-0.006669904571026564,
-0.058670297265052795,
0.0639258399605751,
-0.003959985915571451,
-0.23441798985004425,
0.03819877654314041,
-0.17314934730529785,
-0.17401118576526642,
-0.014302724041044712,
0.07073872536420822,
0.0013147869613021612,
0.05645569786429405,
-0.007008813321590424,
0.009944230318069458,
0.11502749472856522,
-0.016776567324995995,
-0.014040675014257431,
-0.11902020126581192,
0.10919231921434402,
-0.10996362566947937,
0.21200910210609436,
-0.001427471055649221,
0.06447609513998032,
0.09891625493764877,
0.03703764081001282,
-0.1356925666332245,
0.018713725730776787,
0.06156587600708008,
-0.12699171900749207,
0.001671388978138566,
0.14709952473640442,
-0.034623220562934875,
0.06257328391075134,
0.03067830577492714,
-0.1503181755542755,
-0.0028129236306995153,
0.027361368760466576,
-0.03668814152479172,
-0.06925930082798004,
-0.007870296016335487,
-0.05613664537668228,
0.16584105789661407,
0.20753943920135498,
-0.028935248032212257,
0.012500807642936707,
-0.08475927263498306,
0.021995821967720985,
0.048327285796403885,
0.059514183551073074,
-0.0394366979598999,
-0.21629206836223602,
0.021805595606565475,
0.07217446714639664,
-0.0030467906035482883,
-0.19530805945396423,
-0.09515783935785294,
0.04175853356719017,
-0.03662335127592087,
-0.04640722647309303,
0.09120611846446991,
0.025058921426534653,
0.0369698591530323,
-0.01913807913661003,
-0.11687704920768738,
-0.02724407985806465,
0.145633265376091,
-0.17607171833515167,
-0.04249197244644165
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
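Since the card reports no evaluation results, here is a minimal sketch (an assumption, not the author's evaluation setup) of how the checkpoint could be scored on a small slice of the SQuAD validation set using the `squad` metric from the `datasets` library.

```python
# Hedged evaluation sketch: runs the QA pipeline over a few validation examples
# and computes exact-match/F1 on that slice only.
from datasets import load_dataset, load_metric
from transformers import pipeline

model_id = "anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8"
qa = pipeline("question-answering", model=model_id)

validation = load_dataset("squad", split="validation[:20]")  # small slice for a quick check
metric = load_metric("squad")

predictions, references = [], []
for example in validation:
    result = qa(question=example["question"], context=example["context"])
    predictions.append({"id": example["id"], "prediction_text": result["answer"]})
    references.append({"id": example["id"], "answers": example["answers"]})

# Returns a dict with "exact_match" and "f1" for the sampled slice.
print(metric.compute(predictions=predictions, references=references))
```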
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
106,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-512-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 10.0### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09692684561014175,
0.11446870863437653,
-0.002318326383829117,
0.09235680848360062,
0.11996675282716751,
0.023216156288981438,
0.10062595456838608,
0.12818671762943268,
-0.09549134224653244,
0.08604642003774643,
0.08706024289131165,
0.038765717297792435,
0.047408830374479294,
0.14574600756168365,
-0.019753625616431236,
-0.25976550579071045,
0.009522806853055954,
-0.003348028054460883,
-0.031922709196805954,
0.11133016645908356,
0.08483558148145676,
-0.11068841069936752,
0.08570443093776703,
0.014409852214157581,
-0.1540999561548233,
0.020097265020012856,
-0.036637287586927414,
-0.034199126064777374,
0.1133672371506691,
-0.03279668092727661,
0.10887372493743896,
0.02497907727956772,
0.13506853580474854,
-0.2099633365869522,
0.004873128142207861,
0.07346853613853455,
0.04535544663667679,
0.10051597654819489,
0.05137670412659645,
0.0162298996001482,
0.08815287053585052,
-0.15351499617099762,
0.09261339902877808,
0.029726451262831688,
-0.0910201147198677,
-0.12910126149654388,
-0.09561518579721451,
0.025799501687288284,
0.05291115120053291,
0.06838304549455643,
0.0021001610439270735,
0.15271107852458954,
-0.0593637190759182,
0.07955213636159897,
0.26637253165245056,
-0.3269422650337219,
-0.06365898251533508,
0.032869912683963776,
0.06005106493830681,
0.05377212166786194,
-0.1228020116686821,
-0.006413757801055908,
0.027335958555340767,
0.029457079246640205,
0.11790210008621216,
-0.01763787679374218,
-0.11462865769863129,
-0.01365263108164072,
-0.12802790105342865,
-0.0006951794493943453,
0.07134053856134415,
0.03579716384410858,
-0.05237060412764549,
-0.09450096637010574,
-0.07609216123819351,
-0.09387565404176712,
-0.025349341332912445,
-0.06513546407222748,
0.056409187614917755,
-0.054738715291023254,
-0.07926109433174133,
-0.03618976101279259,
-0.056511055678129196,
-0.076368048787117,
-0.01796838827431202,
0.15583522617816925,
0.040090952068567276,
0.021108442917466164,
-0.032516948878765106,
0.10831805318593979,
0.0003400622808840126,
-0.14107446372509003,
-0.014998945407569408,
-0.0007133971084840596,
-0.097781240940094,
-0.04771988093852997,
-0.0504496768116951,
-0.01889202743768692,
0.009520307183265686,
0.17732374370098114,
-0.07951730489730835,
0.07584933191537857,
0.010105871595442295,
-0.028865180909633636,
-0.006260774563997984,
0.14757922291755676,
-0.04372752457857132,
-0.0468410961329937,
-0.00993757788091898,
0.07349439710378647,
0.002892149845138192,
-0.014835596084594727,
-0.06582048535346985,
-0.02809528075158596,
0.10311184823513031,
0.045594722032547,
-0.06023484095931053,
0.03927934914827347,
-0.023661477491259575,
-0.028492102399468422,
0.017733586952090263,
-0.1149742379784584,
0.04461154714226723,
-0.002119308803230524,
-0.08435339480638504,
-0.0016476032324135303,
0.0005550001515075564,
-0.004728185478597879,
-0.007722282316535711,
0.10947395116090775,
-0.09869015216827393,
-0.0018566915532574058,
-0.06423617899417877,
-0.08315188437700272,
0.009318622760474682,
-0.1562443971633911,
-0.014905725605785847,
-0.057523347437381744,
-0.17184396088123322,
-0.032216958701610565,
0.03701567277312279,
-0.07384076714515686,
-0.008477979339659214,
-0.04811999574303627,
-0.06420575827360153,
0.024798959493637085,
-0.014602022245526314,
0.173637256026268,
-0.05357670783996582,
0.07214950770139694,
0.0002988309133797884,
0.04623569920659065,
0.013421732932329178,
0.0360175259411335,
-0.10433146357536316,
0.024725178256630898,
-0.13725613057613373,
0.06872888654470444,
-0.08471179008483887,
-0.002350699156522751,
-0.13248766958713531,
-0.09773652255535126,
0.010683565400540829,
-0.022124554961919785,
0.09101573377847672,
0.13807769119739532,
-0.1937568187713623,
-0.017259709537029266,
0.12577414512634277,
-0.07573904097080231,
-0.06361258029937744,
0.06238589435815811,
-0.060972459614276886,
0.030857127159833908,
0.05198715627193451,
0.21132883429527283,
0.041342027485370636,
-0.16651003062725067,
-0.0326043926179409,
-0.005564031656831503,
0.040958471596241,
0.024751530960202217,
0.03980891406536102,
0.005584953352808952,
0.06440157443284988,
0.014667108654975891,
-0.07526416331529617,
-0.03239240124821663,
-0.09140530973672867,
-0.06521880626678467,
-0.05458793044090271,
-0.07221460342407227,
0.040759842842817307,
0.004354613367468119,
0.04269721359014511,
-0.06439119577407837,
-0.10071230679750443,
0.12029105424880981,
0.09642884880304337,
-0.04729333147406578,
0.036256566643714905,
-0.07927890866994858,
0.019399357959628105,
-0.020928917452692986,
-0.03923691809177399,
-0.20644831657409668,
-0.12970498204231262,
0.0525527149438858,
-0.056253451853990555,
0.03308623656630516,
0.008039834909141064,
0.081119604408741,
0.06098049879074097,
-0.04329323023557663,
-0.011831110343337059,
-0.09308885782957077,
0.002904461929574609,
...
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
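A minimal sketch of how the hyperparameters above could be expressed with the `transformers` Trainer API. This is not the original training script: the output directory name and the omitted SQuAD feature preparation (tokenization with offset mappings) are assumptions added for illustration.

```python
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    TrainingArguments,
)

model_name = "SpanBERT/spanbert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

# Mirrors the hyperparameters listed in this card.
args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    max_steps=200,               # "training_steps: 200"
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
# A Trainer would then combine `args`, `model`, and SQuAD features
# prepared in the usual extractive-QA way; that preprocessing is not
# shown here.
```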
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-0\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[ 768 floating-point embedding values ] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
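No usage documentation is provided, but as a SQuAD-style extractive question-answering checkpoint it can presumably be queried through the standard pipeline. A minimal sketch (the repository id is taken from this entry; the question and context are invented for illustration):

```python
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10",
)

result = qa(
    question="What was SpanBERT fine-tuned on?",
    context="SpanBERT was fine-tuned on the SQuAD dataset for extractive question answering.",
)
print(result["answer"], result["score"])
```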
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
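With `warmup_ratio: 0.1` over 200 training steps, the linear schedule warms up for roughly 20 steps. A hedged sketch of the equivalent manual setup (the Trainer normally constructs the optimizer and schedule internally, using AdamW rather than plain Adam):

```python
import torch
from transformers import AutoModelForQuestionAnswering, get_linear_schedule_with_warmup

model = AutoModelForQuestionAnswering.from_pretrained("SpanBERT/spanbert-base-cased")

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=3e-5,
    betas=(0.9, 0.999),
    eps=1e-8,
)

total_steps = 200
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 20 warmup steps

scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=warmup_steps,
    num_training_steps=total_steps,
)
```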
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-10\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[ 768 floating-point embedding values ] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
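The card lists `seed: 42` for training, while the model name carries `k-64` and `seed-2`; a plausible reading is that 64 SQuAD training examples were drawn with a separate data seed of 2, but that interpretation is an assumption, not something stated here. Under that assumption, the few-shot subset could be drawn roughly like this:

```python
from datasets import load_dataset

# Assumption: "k-64 ... seed-2" = 64 SQuAD training examples sampled
# with data seed 2 (distinct from the training seed 42 listed above).
squad_train = load_dataset("squad", split="train")
few_shot_train = squad_train.shuffle(seed=2).select(range(64))

print(len(few_shot_train))  # 64
```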
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-2\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09707863628864288,
0.10870766639709473,
-0.0024108830839395523,
0.09516728669404984,
0.12005501240491867,
0.016284910961985588,
0.09861568361520767,
0.13093328475952148,
-0.10488006472587585,
0.06816889345645905,
0.08780702948570251,
0.03434590995311737,
0.043502144515514374,
0.1503830999135971,
-0.006851539947092533,
-0.2733902335166931,
-0.0005533320363610983,
-0.000960816687438637,
-0.04312215745449066,
0.12054871767759323,
0.0889689177274704,
-0.10941648483276367,
0.07728185504674911,
0.010753287002444267,
-0.15475237369537354,
0.01934470422565937,
-0.03108428418636322,
-0.03606957942247391,
0.12180734425783157,
-0.036851465702056885,
0.10715480893850327,
0.03057374246418476,
0.13707134127616882,
-0.20715714991092682,
0.006231564097106457,
0.0758201852440834,
0.05281204730272293,
0.10037019848823547,
0.04675960913300514,
0.01046296302229166,
0.08793529123067856,
-0.14910835027694702,
0.09232403337955475,
0.02970975451171398,
-0.09086517244577408,
-0.1539522111415863,
-0.09255113452672958,
0.03169844672083855,
0.05054781213402748,
0.0697208121418953,
0.0033388277515769005,
0.14691345393657684,
-0.06302934139966965,
0.08368893712759018,
0.26251107454299927,
-0.3231040835380554,
-0.0673830434679985,
0.02932424284517765,
0.056489866226911545,
0.06103070452809334,
-0.11828326433897018,
-0.002815582789480686,
0.02005940116941929,
0.027292396873235703,
0.12739288806915283,
-0.01619032584130764,
-0.10689835995435715,
-0.009808730334043503,
-0.12478026747703552,
-0.0027907455805689096,
0.06060459464788437,
0.02514057792723179,
-0.05390743538737297,
-0.10549397021532059,
-0.0688498243689537,
-0.08290623873472214,
-0.021807663142681122,
-0.05595896765589714,
0.051754895597696304,
-0.05363571271300316,
-0.0972055047750473,
-0.04244179651141167,
-0.05819861963391304,
-0.080929696559906,
-0.009131098166108131,
0.1681416928768158,
0.03577738627791405,
0.021303169429302216,
-0.029772590845823288,
0.11916392296552658,
0.0206303708255291,
-0.13945774734020233,
-0.009266320616006851,
-0.00422052014619112,
-0.09343904256820679,
-0.04081207513809204,
-0.053540538996458054,
-0.010874012485146523,
0.005364574491977692,
0.16831907629966736,
-0.08088159561157227,
0.07332108169794083,
0.012720027007162571,
-0.02316012606024742,
-0.012687725014984608,
0.1526699811220169,
-0.03872344642877579,
-0.037397101521492004,
-0.015455449931323528,
0.07928797602653503,
0.005155824590474367,
-0.0184729415923357,
-0.06733326613903046,
-0.028959164395928383,
0.09558677673339844,
0.05620117485523224,
-0.06308984011411667,
0.03750041499733925,
-0.030480988323688507,
-0.026329364627599716,
0.017152369022369385,
-0.11962016671895981,
0.04166441038250923,
-0.0037723882123827934,
-0.08108925819396973,
-0.00851630698889494,
-0.0016728929476812482,
-0.008188833482563496,
-0.009307555854320526,
0.09745923429727554,
-0.08608254045248032,
0.00019645677821245044,
-0.0691167414188385,
-0.07930512726306915,
-0.0009022161830216646,
-0.15462130308151245,
-0.014735410921275616,
-0.05949551239609718,
-0.1667672097682953,
-0.03179575875401497,
0.044283173978328705,
-0.07634144276380539,
-0.014042296446859837,
-0.04289242997765541,
-0.060989703983068466,
0.01740112528204918,
-0.013143445365130901,
0.19226224720478058,
-0.05220161750912666,
0.08192078769207001,
-0.006964354310184717,
0.05114556849002838,
0.026005635038018227,
0.03527792543172836,
-0.10249299556016922,
0.029152071103453636,
-0.14053764939308167,
0.07813551276922226,
-0.08415929973125458,
-0.007011221256107092,
-0.13741327822208405,
-0.10182632505893707,
0.015009205788373947,
-0.021511755883693695,
0.09334984421730042,
0.13318322598934174,
-0.19842158257961273,
-0.018639327958226204,
0.12545591592788696,
-0.07688532769680023,
-0.05076046288013458,
0.06084024906158447,
-0.06178762763738632,
0.042555518448352814,
0.05048409476876259,
0.2126055806875229,
0.05508238449692726,
-0.15681777894496918,
-0.008630407974123955,
0.007486927323043346,
0.04420776292681694,
0.027092568576335907,
0.039710406213998795,
0.002733493223786354,
0.05588440224528313,
0.015682494267821312,
-0.09234286844730377,
-0.024973629042506218,
-0.09040071070194244,
-0.0671091303229332,
-0.05115269124507904,
-0.07572359591722488,
0.05608081445097923,
0.005995669402182102,
0.04207754507660866,
-0.0645609050989151,
-0.10276695340871811,
0.11394833028316498,
0.09625673294067383,
-0.051184944808483124,
0.03896313160657883,
-0.08099276572465897,
0.013199816457927227,
-0.005513609386980534,
-0.036401063203811646,
-0.21087466180324554,
-0.11542998254299164,
0.052180755883455276,
-0.04542353004217148,
0.025694124400615692,
0.0034141221549361944,
0.08366159349679947,
0.056638237088918686,
-0.050968945026397705,
-0.015139113180339336,
-0.09726431220769882,
0.0009381487034261227,
-0.11405499279499054,
-0.18990826606750488,
-0.08509980142116547,
-0.04279957711696625,
0.09456991404294968,
-0.16673460602760315,
-0.0056273615919053555,
0.021381663158535957,
0.1384178102016449,
0.02771526202559471,
-0.06715860962867737,
0.0004962055245414376,
0.04528981074690819,
0.012888981960713863,
-0.09661051630973816,
0.05481693521142006,
0.013369680382311344,
-0.10468921810388565,
-0.04695191606879234,
-0.1308760643005371,
-0.017433203756809235,
0.05273013934493065,
0.06063469499349594,
-0.102913036942482,
-0.060306891798973083,
-0.07318486273288727,
-0.03792243450880051,
-0.07824303954839706,
0.0221063494682312,
0.21195171773433685,
0.03822299465537071,
0.11379366368055344,
-0.06586966663599014,
-0.08461544662714005,
-0.008203343488276005,
0.025099486112594604,
0.021929988637566566,
0.08415407687425613,
0.023204823955893517,
-0.04047452285885811,
0.0675526112318039,
0.10315771400928497,
-0.02710745856165886,
0.13194352388381958,
-0.05553547292947769,
-0.08110757917165756,
-0.03275030106306076,
-0.020244350656867027,
-0.025326846167445183,
0.12859897315502167,
-0.027142802253365517,
0.001387579832226038,
0.03500857204198837,
0.03810708224773407,
0.011043894104659557,
-0.1699868142604828,
0.0029743914492428303,
0.027745362371206284,
-0.056638579815626144,
-0.03886386752128601,
-0.006264947354793549,
0.021801399067044258,
0.0889393761754036,
0.030440865084528923,
-0.006864908151328564,
0.009577730670571327,
-0.011023801751434803,
-0.05835214629769325,
0.18674936890602112,
-0.09644006937742233,
-0.08131692558526993,
-0.07339934259653091,
0.021048307418823242,
-0.0503624826669693,
-0.03793476149439812,
0.008131919428706169,
-0.09134262055158615,
-0.02955036796629429,
-0.08801117539405823,
-0.02578834630548954,
-0.02063000574707985,
0.02080002799630165,
0.026105929166078568,
-0.016681157052516937,
0.08662368357181549,
-0.1378379464149475,
0.005204836837947369,
-0.0485963337123394,
-0.09571980684995651,
0.008877716027200222,
0.07627213001251221,
0.09108626842498779,
0.08054452389478683,
-0.01889839768409729,
0.028053972870111465,
-0.038510385900735855,
0.23247209191322327,
-0.052408356219530106,
0.012744070962071419,
0.11527860164642334,
-0.012254355475306511,
0.05562463402748108,
0.09195228666067123,
0.03781476989388466,
-0.09057552367448807,
0.023951325565576553,
0.07638601958751678,
-0.037939462810754776,
-0.2254825383424759,
-0.019119367003440857,
-0.0003688695724122226,
-0.07772071659564972,
0.1073688417673111,
0.031892772763967514,
-0.050910502672195435,
0.043736379593610764,
0.02300131693482399,
-0.008452647365629673,
-0.046100303530693054,
0.07468296587467194,
0.07201990485191345,
0.051313839852809906,
0.1061757355928421,
-0.005068876780569553,
-0.026788262650370598,
0.056887250393629074,
0.017652617767453194,
0.25315359234809875,
-0.043519023805856705,
0.10125954449176788,
0.031579334288835526,
0.15495170652866364,
-0.020010417327284813,
0.06624389439821243,
0.0013353396207094193,
-0.009010041132569313,
-0.011481606401503086,
-0.0660218670964241,
-0.025366419926285744,
0.018411429598927498,
-0.04569872096180916,
0.0244281068444252,
-0.07598348706960678,
0.025620197877287865,
0.026649540290236473,
0.29239413142204285,
0.027972958981990814,
-0.259025514125824,
-0.07295612245798111,
-0.01490265503525734,
-0.0451681911945343,
-0.0613265223801136,
0.007055206224322319,
0.135086327791214,
-0.13988979160785675,
0.053089775145053864,
-0.07843472063541412,
0.08788087218999863,
-0.045431289821863174,
0.01184807624667883,
0.04571101814508438,
0.15033024549484253,
-0.017196420580148697,
0.05230459198355675,
-0.1968189775943756,
0.2512758672237396,
0.020716188475489616,
0.10594857484102249,
-0.0642583891749382,
0.011539781466126442,
0.019696291536092758,
0.010604935698211193,
0.11170510202646255,
0.0037261084653437138,
-0.06579132378101349,
-0.14651380479335785,
-0.09236379712820053,
0.04483751952648163,
0.1418026089668274,
-0.04019032418727875,
0.08937641978263855,
-0.02963816002011299,
0.011723036877810955,
0.029653504490852356,
-0.040805790573358536,
-0.15159529447555542,
-0.07715137302875519,
0.0010906619718298316,
0.013572490774095058,
-0.006708435248583555,
-0.0623142309486866,
-0.10520103573799133,
-0.01678086817264557,
0.11342291533946991,
-0.0020832957234233618,
-0.05822513625025749,
-0.15392035245895386,
0.08391518890857697,
0.14167441427707672,
-0.05487746000289917,
0.011977954767644405,
0.016628630459308624,
0.11602916568517685,
0.029712889343500137,
-0.08104289323091507,
0.06398171186447144,
-0.05680757015943527,
-0.1823003888130188,
-0.0555579774081707,
0.12340781837701797,
0.08179005980491638,
0.04981285333633423,
-0.0002006951253861189,
0.053820375353097916,
0.0026684345211833715,
-0.09476341307163239,
0.040096476674079895,
0.003146240720525384,
0.04283374547958374,
0.018016083166003227,
-0.08468440175056458,
0.09639645367860794,
-0.03697764873504639,
0.009588196873664856,
0.1300828903913498,
0.21392805874347687,
-0.10804878175258636,
0.11645252257585526,
0.08484518527984619,
-0.07387989014387131,
-0.16597776114940643,
0.05962786823511124,
0.13215836882591248,
0.008655752055346966,
0.08612604439258575,
-0.2130197435617447,
0.1231963187456131,
0.10347115248441696,
-0.013823317363858223,
0.0064440215937793255,
-0.27692732214927673,
-0.13141077756881714,
0.05802024155855179,
0.11134087294340134,
0.04319867491722107,
-0.1141105592250824,
-0.034135300666093826,
-0.006640581879764795,
-0.09961085766553879,
0.11480233818292618,
-0.0722905695438385,
0.1141539141535759,
-0.019800525158643723,
0.11736894398927689,
0.026124553754925728,
-0.033467549830675125,
0.1103205680847168,
0.06065782904624939,
0.08635444939136505,
-0.036599330604076385,
0.007995683699846268,
0.06090322509407997,
-0.059636782854795456,
0.027148712426424026,
-0.041627369821071625,
0.06770681589841843,
-0.14801500737667084,
0.0058177621103823185,
-0.08638380467891693,
0.053602371364831924,
-0.04627460241317749,
-0.07216878980398178,
-0.01755034364759922,
0.05243009328842163,
0.06959445774555206,
-0.04102731868624687,
0.028368430212140083,
-0.0002452954649925232,
0.09739425033330917,
0.10645965486764908,
0.08045565336942673,
-0.022741517052054405,
-0.08683104813098907,
0.012597962282598019,
0.0029576795641332865,
0.0551602765917778,
-0.0953131690621376,
0.014982456341385841,
0.140447735786438,
0.06466870754957199,
0.0958424061536789,
0.0454716756939888,
-0.0431499183177948,
0.005542940925806761,
0.013989065773785114,
-0.13311626017093658,
-0.10511980950832367,
0.024864301085472107,
-0.04112745821475983,
-0.15139292180538177,
0.026424098759889603,
0.12143490463495255,
-0.039176058024168015,
-0.021517911925911903,
-0.005411508493125439,
0.004232712090015411,
-0.012637095525860786,
0.18272268772125244,
0.04398160055279732,
0.06298530101776123,
-0.0898783877491951,
0.11096026748418808,
0.03443914279341698,
-0.04797552153468132,
0.0535549633204937,
0.06600765138864517,
-0.10394938290119171,
0.011465661227703094,
0.08089976012706757,
0.13205069303512573,
-0.05591180920600891,
-0.011747573502361774,
-0.0953875184059143,
-0.08386549353599548,
0.04101414978504181,
0.135215163230896,
0.05620509013533592,
-0.0014985550660640001,
-0.06507161259651184,
0.03618820384144783,
-0.11832111328840256,
0.06884847581386566,
0.04900478944182396,
0.07579245418310165,
-0.10091274976730347,
0.13429011404514313,
-0.002567801158875227,
0.02749892883002758,
-0.027370097115635872,
0.013096174225211143,
-0.09847521036863327,
-0.02457686886191368,
-0.10784357041120529,
-0.022289885208010674,
-0.01056064572185278,
-0.0013278715778142214,
-0.021888015791773796,
-0.0703732892870903,
-0.02896263636648655,
0.03966685011982918,
-0.07856491953134537,
-0.0481819212436676,
0.015593770891427994,
0.03575770929455757,
-0.15401987731456757,
0.002414410002529621,
0.02629055827856064,
-0.09012419730424881,
0.09101741015911102,
0.06358721852302551,
0.012650853022933006,
0.024197040125727654,
-0.11782791465520859,
-0.030702166259288788,
-0.009110260754823685,
0.005033021792769432,
0.06805476546287537,
-0.09582044184207916,
-0.028739027678966522,
-0.035210371017456055,
0.04107498750090599,
0.01945311203598976,
0.105613112449646,
-0.1204034835100174,
-0.003764713415876031,
-0.037691861391067505,
-0.04010135307908058,
-0.06522372364997864,
0.03571125119924545,
0.1058112233877182,
0.0543673112988472,
0.15136244893074036,
-0.07712054252624512,
0.0546964593231678,
-0.19882380962371826,
-0.03763028606772423,
0.0118137551471591,
-0.04580629989504814,
-0.0820106491446495,
-0.04776294156908989,
0.0870659127831459,
-0.04668837785720825,
0.11437160521745682,
-0.013088881969451904,
0.10024785250425339,
0.043926939368247986,
-0.01075845118612051,
-0.06273317337036133,
-0.006910789757966995,
0.182992085814476,
0.05288316309452057,
-0.01723420061171055,
0.12618118524551392,
0.0023233420215547085,
0.03006833605468273,
0.08425603061914444,
0.22011204063892365,
0.15374402701854706,
0.0006183413788676262,
0.06221862509846687,
0.06052181124687195,
-0.06782606989145279,
-0.1506248414516449,
0.12046388536691666,
-0.01929355226457119,
0.1052207499742508,
-0.065007284283638,
0.19010399281978607,
0.03865274786949158,
-0.18103213608264923,
0.06332503259181976,
-0.023746326565742493,
-0.10896051675081253,
-0.11964745819568634,
-0.026971329003572464,
-0.07219918817281723,
-0.12199695408344269,
0.02475670911371708,
-0.11689773201942444,
0.06205989792943001,
0.102674700319767,
0.007034665439277887,
0.03875013440847397,
0.18098250031471252,
-0.048375967890024185,
0.01221354492008686,
0.08287887275218964,
0.018935101106762886,
0.003508656984195113,
-0.03885019198060036,
-0.0662916824221611,
0.035917624831199646,
0.03600703552365303,
0.06433422863483429,
-0.047877464443445206,
0.006708031054586172,
0.0034537087194621563,
-0.009328332729637623,
-0.0763714611530304,
0.01110912673175335,
0.009185302071273327,
0.051386963576078415,
0.04446239769458771,
0.04800622537732124,
0.006251661106944084,
-0.05514312908053398,
0.292617529630661,
-0.06929411739110947,
-0.07000100612640381,
-0.12874948978424072,
0.21726636588573456,
0.022875996306538582,
-0.025055568665266037,
0.05693582817912102,
-0.08657446503639221,
-0.01522909477353096,
0.1624649167060852,
0.13769251108169556,
-0.09012342244386673,
-0.015878966078162193,
-0.024049239233136177,
-0.01121521182358265,
-0.014195716939866543,
0.1155790314078331,
0.07502032816410065,
-0.0121165094897151,
-0.07048778980970383,
-0.01265262346714735,
-0.028295215219259262,
-0.05522480979561806,
-0.06952511519193649,
0.0690508559346199,
0.026632940396666527,
-0.008525398559868336,
-0.06499366462230682,
0.06622350215911865,
0.0006793720531277359,
-0.23319873213768005,
0.04243546724319458,
-0.17300505936145782,
-0.17048408091068268,
-0.01822224073112011,
0.07207047194242477,
0.003182373009622097,
0.05601878464221954,
0.0005776412435807288,
0.02383328601717949,
0.1143881231546402,
-0.014108267612755299,
-0.0038939749356359243,
-0.11338520795106888,
0.11587905883789062,
-0.10202524811029434,
0.20015496015548706,
-0.006017846986651421,
0.05918010324239731,
0.09587953239679337,
0.03764069825410843,
-0.1373787820339203,
0.022107699885964394,
0.0633482038974762,
-0.12668956816196442,
-0.004328800365328789,
0.15020233392715454,
-0.030841579660773277,
0.0635099783539772,
0.027734043076634407,
-0.14608480036258698,
0.0006374241784214973,
0.01892741210758686,
-0.036037854850292206,
-0.06904689222574234,
-0.009145542979240417,
-0.051791053265333176,
0.16426241397857666,
0.21457591652870178,
-0.0297471322119236,
0.008507602848112583,
-0.09114618599414825,
0.012916374951601028,
0.047384195029735565,
0.05209491774439812,
-0.0401853509247303,
-0.20571456849575043,
0.014403294771909714,
0.07291506975889206,
-0.00525568937882781,
-0.19590938091278076,
-0.09748135507106781,
0.04575662687420845,
-0.03919162601232529,
-0.042863473296165466,
0.09316077828407288,
0.02038874290883541,
0.04027901962399483,
-0.0124673992395401,
-0.11689083278179169,
-0.022046659141778946,
0.13818368315696716,
-0.17566721141338348,
-0.03219081833958626
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
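As a sketch only (not taken from the original training script), the hyperparameters listed above can be expressed with the standard `Trainer` API roughly as follows. The `output_dir` name is a placeholder, and the reported batch sizes are treated as per-device values; the Adam betas/epsilon simply restate the defaults listed in the card.

```python
from transformers import TrainingArguments

# Sketch: maps the hyperparameters reported above onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,
)
```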
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-4\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09767802059650421,
0.1077912300825119,
-0.002368568442761898,
0.09604807198047638,
0.1212543398141861,
0.016602901741862297,
0.09927284717559814,
0.13008186221122742,
-0.105131134390831,
0.06756774336099625,
0.08794160932302475,
0.03415761888027191,
0.04270967096090317,
0.15017397701740265,
-0.006577004212886095,
-0.2733900845050812,
-0.0007922647055238485,
-0.00037023864570073783,
-0.04235648736357689,
0.12071692943572998,
0.0878448560833931,
-0.11010075360536575,
0.07780495285987854,
0.01034170389175415,
-0.15560679137706757,
0.01996534690260887,
-0.031970854848623276,
-0.03601132705807686,
0.12196534126996994,
-0.03685707598924637,
0.10724995285272598,
0.03031492605805397,
0.1369011402130127,
-0.20569533109664917,
0.00647837994620204,
0.07566317915916443,
0.05214780941605568,
0.09970390051603317,
0.04648442938923836,
0.010230342857539654,
0.08698880672454834,
-0.14902625977993011,
0.09236755967140198,
0.029493071138858795,
-0.09105754643678665,
-0.15463869273662567,
-0.09176619350910187,
0.030514994636178017,
0.049958620220422745,
0.07088790088891983,
0.0026564327999949455,
0.14608067274093628,
-0.06357911974191666,
0.08404020220041275,
0.2600495219230652,
-0.32427433133125305,
-0.06779828667640686,
0.029638413339853287,
0.05687713623046875,
0.06127187982201576,
-0.11921359598636627,
-0.0021436037495732307,
0.02022949792444706,
0.02829587645828724,
0.1278732419013977,
-0.016629043966531754,
-0.10663383454084396,
-0.009344963356852531,
-0.12472017109394073,
-0.0011250965762883425,
0.061498016119003296,
0.025491364300251007,
-0.05366664007306099,
-0.1063152477145195,
-0.067556232213974,
-0.08425722271203995,
-0.022326840087771416,
-0.05511823669075966,
0.051817312836647034,
-0.054787635803222656,
-0.09782526642084122,
-0.04028351604938507,
-0.058094244450330734,
-0.07973149418830872,
-0.008697515353560448,
0.1666794717311859,
0.035748548805713654,
0.020934663712978363,
-0.02925322949886322,
0.1185087189078331,
0.020579008385539055,
-0.1391199231147766,
-0.008069452829658985,
-0.00495053268969059,
-0.09322366118431091,
-0.040809571743011475,
-0.054233744740486145,
-0.008884682320058346,
0.005730013363063335,
0.16853854060173035,
-0.08116566389799118,
0.07419484853744507,
0.014070353470742702,
-0.024126600474119186,
-0.012507949955761433,
0.1511552482843399,
-0.04023001715540886,
-0.03926096856594086,
-0.015872599557042122,
0.07977558672428131,
0.004124770872294903,
-0.018472647294402122,
-0.06736918538808823,
-0.027614755555987358,
0.09458965063095093,
0.05559859424829483,
-0.06428635865449905,
0.03939229995012283,
-0.029378825798630714,
-0.026052085682749748,
0.017599809914827347,
-0.11944110691547394,
0.041307296603918076,
-0.003776340978220105,
-0.08130481094121933,
-0.007765403017401695,
-0.0025892218109220266,
-0.009165594354271889,
-0.009788203984498978,
0.09752338379621506,
-0.087046317756176,
-0.0006775257643312216,
-0.06945017725229263,
-0.0803927406668663,
-0.0015052700182422996,
-0.1543872207403183,
-0.013605070300400257,
-0.059022657573223114,
-0.16713793575763702,
-0.03348546847701073,
0.0440346896648407,
-0.07625128328800201,
-0.013041481375694275,
-0.04295972362160683,
-0.061733342707157135,
0.016865719109773636,
-0.013196990825235844,
0.19295567274093628,
-0.052655577659606934,
0.08074185252189636,
-0.0055437772534787655,
0.05106698349118233,
0.025829147547483444,
0.035373203456401825,
-0.10223405808210373,
0.028390156105160713,
-0.14051032066345215,
0.07744628936052322,
-0.08487488329410553,
-0.004972080700099468,
-0.13624241948127747,
-0.10313212126493454,
0.015946319326758385,
-0.020341739058494568,
0.09269467741250992,
0.13301117718219757,
-0.19791212677955627,
-0.01894865743815899,
0.12472613900899887,
-0.07575888931751251,
-0.05094582214951515,
0.06056017801165581,
-0.06191982328891754,
0.04120257496833801,
0.051227495074272156,
0.2126815766096115,
0.055324628949165344,
-0.15606239438056946,
-0.009996285662055016,
0.006495344452559948,
0.04471796378493309,
0.02633163332939148,
0.0382080078125,
0.004143895115703344,
0.056131862103939056,
0.016537271440029144,
-0.09129292517900467,
-0.025054490193724632,
-0.08988397568464279,
-0.06665241718292236,
-0.050170671194791794,
-0.07562156021595001,
0.05562303587794304,
0.006661824882030487,
0.04205786809325218,
-0.06575463712215424,
-0.1033157929778099,
0.11331997811794281,
0.0959511250257492,
-0.05183582752943039,
0.03741680830717087,
-0.08126609772443771,
0.013883676379919052,
-0.005649884697049856,
-0.03581593185663223,
-0.21040070056915283,
-0.11670272052288055,
0.05103486403822899,
-0.04276594519615173,
0.0255820844322443,
0.003891585161909461,
0.08443333953619003,
0.05698852241039276,
-0.0514211505651474,
-0.015738561749458313,
-0.0960327535867691,
0.0013750967336818576,
-0.11372188478708267,
-0.19047199189662933,
-0.08563262969255447,
-0.04264609143137932,
0.09508421272039413,
-0.16817398369312286,
-0.005204774439334869,
0.02174539305269718,
0.13740472495555878,
0.027097493410110474,
-0.0666857585310936,
0.001540879369713366,
0.04666053131222725,
0.013898398727178574,
-0.0964222401380539,
0.05556803196668625,
0.013180695474147797,
-0.10346806794404984,
-0.04815828427672386,
-0.1312953680753708,
-0.015566932037472725,
0.053866609930992126,
0.05971341207623482,
-0.10334073007106781,
-0.0601225309073925,
-0.07295636832714081,
-0.03798777237534523,
-0.07642661780118942,
0.022103916853666306,
0.21292008459568024,
0.03765891119837761,
0.1134277731180191,
-0.06491837650537491,
-0.08421137183904648,
-0.008047920651733875,
0.026605837047100067,
0.02302713505923748,
0.08411123603582382,
0.021913252770900726,
-0.03824146091938019,
0.06790273636579514,
0.10220573097467422,
-0.027342217043042183,
0.13281266391277313,
-0.0558154433965683,
-0.08052179962396622,
-0.032095059752464294,
-0.021294213831424713,
-0.025966322049498558,
0.12940311431884766,
-0.02774692513048649,
-0.00006318746454780921,
0.034650884568691254,
0.037341222167015076,
0.011397991329431534,
-0.16896957159042358,
0.002974457573145628,
0.02692863717675209,
-0.055565569549798965,
-0.03989504277706146,
-0.005902041215449572,
0.02100284770131111,
0.08844852447509766,
0.030180681496858597,
-0.00830103550106287,
0.009474639780819416,
-0.010792877525091171,
-0.05772628262639046,
0.18739701807498932,
-0.09657604247331619,
-0.08022217452526093,
-0.07301854342222214,
0.019240569323301315,
-0.051460038870573044,
-0.03846587613224983,
0.007848518900573254,
-0.0929945856332779,
-0.029770655557513237,
-0.08768986910581589,
-0.026971638202667236,
-0.020454488694667816,
0.020645178854465485,
0.025263864547014236,
-0.017039049416780472,
0.08608206361532211,
-0.13748139142990112,
0.005459759384393692,
-0.048878949135541916,
-0.0957372784614563,
0.009596028365194798,
0.07663530111312866,
0.0914374515414238,
0.0802718847990036,
-0.01850653812289238,
0.028114350512623787,
-0.038769371807575226,
0.2333322912454605,
-0.05254644155502319,
0.012942027300596237,
0.115077905356884,
-0.011248663999140263,
0.054376859217882156,
0.09196429699659348,
0.03883398696780205,
-0.09104768186807632,
0.023737594485282898,
0.07689069956541061,
-0.037406932562589645,
-0.22530491650104523,
-0.018454276025295258,
-0.0007095924811437726,
-0.07759147882461548,
0.10728996247053146,
0.031877271831035614,
-0.05028166249394417,
0.04455368220806122,
0.02397538349032402,
-0.007502858527004719,
-0.04639020189642906,
0.07418595254421234,
0.07365478575229645,
0.0506906732916832,
0.10730884969234467,
-0.005229239817708731,
-0.027065420523285866,
0.056023478507995605,
0.017822977155447006,
0.25400012731552124,
-0.043649572879076004,
0.10102970153093338,
0.03229088336229324,
0.15432670712471008,
-0.01955333724617958,
0.06705895066261292,
0.0008693868876434863,
-0.009786226786673069,
-0.011159210465848446,
-0.06584183871746063,
-0.02385214902460575,
0.01742158830165863,
-0.04677440598607063,
0.023693745955824852,
-0.07616443932056427,
0.023917758837342262,
0.02683698758482933,
0.2918229401111603,
0.02797439508140087,
-0.2617294490337372,
-0.07384612411260605,
-0.015762703493237495,
-0.0448634959757328,
-0.060725726187229156,
0.007246397435665131,
0.1344376504421234,
-0.1392526626586914,
0.05392108112573624,
-0.07849874347448349,
0.0874491035938263,
-0.04550398141145706,
0.012879087589681149,
0.04784148558974266,
0.15129263699054718,
-0.018088871613144875,
0.05195019021630287,
-0.19722384214401245,
0.25008606910705566,
0.020796962082386017,
0.1066824272274971,
-0.06443215906620026,
0.011437192559242249,
0.02058592438697815,
0.011939025484025478,
0.11139953136444092,
0.0030768555589020252,
-0.06561066210269928,
-0.1463802456855774,
-0.09135660529136658,
0.04614930972456932,
0.14084143936634064,
-0.039197858422994614,
0.09029500931501389,
-0.0285783763974905,
0.011264335364103317,
0.0292341411113739,
-0.04179411381483078,
-0.151957169175148,
-0.07775390148162842,
0.0003918722504749894,
0.014846513979136944,
-0.006297634448856115,
-0.061586592346429825,
-0.10578860342502594,
-0.018475214019417763,
0.11268763244152069,
-0.0005763702793046832,
-0.0585327073931694,
-0.15456782281398773,
0.08332972228527069,
0.14186911284923553,
-0.05370418727397919,
0.01197032630443573,
0.01744549348950386,
0.11503934860229492,
0.03084450028836727,
-0.08066622167825699,
0.06402748823165894,
-0.05763901770114899,
-0.18094362318515778,
-0.055475544184446335,
0.12253566086292267,
0.08147508651018143,
0.04898454621434212,
-0.0004887543618679047,
0.0537031814455986,
0.001786737353540957,
-0.0954861268401146,
0.04056167975068092,
0.0009023535531014204,
0.043940845876932144,
0.017522228881716728,
-0.08552086353302002,
0.09627527743577957,
-0.03637031838297844,
0.01051370333880186,
0.12811727821826935,
0.2110086977481842,
-0.10723378509283066,
0.1143057644367218,
0.08531520515680313,
-0.07350331544876099,
-0.16548708081245422,
0.06063121184706688,
0.13099060952663422,
0.009316104464232922,
0.08469186723232269,
-0.21396814286708832,
0.12455521523952484,
0.10282576829195023,
-0.012678384780883789,
0.00848972238600254,
-0.27402180433273315,
-0.13062521815299988,
0.05848512053489685,
0.1121625080704689,
0.043960023671388626,
-0.11461013555526733,
-0.033475637435913086,
-0.006719536148011684,
-0.09895649552345276,
0.1137913390994072,
-0.07508208602666855,
0.11427631229162216,
-0.020184004679322243,
0.11770313233137131,
0.025259701535105705,
-0.0335126556456089,
0.10937269777059555,
0.06170617789030075,
0.08742006123065948,
-0.036741625517606735,
0.008856688626110554,
0.06086166948080063,
-0.058908216655254364,
0.027171388268470764,
-0.0428314208984375,
0.06706064939498901,
-0.14850910007953644,
0.005833533126860857,
-0.08733398467302322,
0.05333036184310913,
-0.04609375447034836,
-0.07186643034219742,
-0.01646995358169079,
0.052712902426719666,
0.06844339519739151,
-0.04113657772541046,
0.0268253181129694,
-0.0012364555150270462,
0.09777891635894775,
0.10527270287275314,
0.08109100908041,
-0.024033470079302788,
-0.08796536922454834,
0.01369580440223217,
0.0025488927494734526,
0.05499745532870293,
-0.09487728774547577,
0.013527669943869114,
0.14121662080287933,
0.06391425430774689,
0.095543272793293,
0.04635542631149292,
-0.042050063610076904,
0.004946134053170681,
0.015100089833140373,
-0.13327035307884216,
-0.10396593809127808,
0.02502612955868244,
-0.045051660388708115,
-0.1511855572462082,
0.027663936838507652,
0.12130282074213028,
-0.03902116045355797,
-0.02133387140929699,
-0.005501679610460997,
0.0035826663952320814,
-0.0128035182133317,
0.18419696390628815,
0.0441509373486042,
0.06264317780733109,
-0.09107816964387894,
0.1103520542383194,
0.03432897850871086,
-0.04867585003376007,
0.05351480096578598,
0.06691547483205795,
-0.10500495135784149,
0.010282736271619797,
0.08009639382362366,
0.13350921869277954,
-0.05464737117290497,
-0.012710495851933956,
-0.09682154655456543,
-0.08436983078718185,
0.04109292849898338,
0.13383424282073975,
0.05597848817706108,
-0.002400052733719349,
-0.0653245598077774,
0.03562217950820923,
-0.11905427277088165,
0.06853167712688446,
0.04819035902619362,
0.07610131800174713,
-0.10106854140758514,
0.13345110416412354,
-0.0025493649300187826,
0.02675532177090645,
-0.027364090085029602,
0.013619380071759224,
-0.09928349405527115,
-0.024670051410794258,
-0.10796590894460678,
-0.02335307002067566,
-0.0110058868303895,
-0.0007843587663955986,
-0.02221691980957985,
-0.0689479187130928,
-0.029633227735757828,
0.039786871522665024,
-0.07890182733535767,
-0.047603681683540344,
0.017458729445934296,
0.0365375354886055,
-0.1532895416021347,
0.0028148172423243523,
0.02495882660150528,
-0.09004728496074677,
0.09085147827863693,
0.06351830810308456,
0.012397369369864464,
0.024548744782805443,
-0.11587657779455185,
-0.031234806403517723,
-0.009555336087942123,
0.004195435903966427,
0.06860902160406113,
-0.09524181485176086,
-0.028160082176327705,
-0.03535078093409538,
0.04142919182777405,
0.019707772880792618,
0.1046401783823967,
-0.11959672719240189,
-0.0034983092918992043,
-0.03696676343679428,
-0.03879385441541672,
-0.06565887480974197,
0.03617456182837486,
0.10648573189973831,
0.05359923839569092,
0.15187081694602966,
-0.07715766131877899,
0.053845759481191635,
-0.1991513967514038,
-0.03806755691766739,
0.011615961790084839,
-0.04728643223643303,
-0.0811939686536789,
-0.0477350652217865,
0.08785761892795563,
-0.047319043427705765,
0.11599503457546234,
-0.0133846839889884,
0.1008639857172966,
0.043331682682037354,
-0.010456779040396214,
-0.06377851963043213,
-0.006253871601074934,
0.18275651335716248,
0.05232932046055794,
-0.017999447882175446,
0.12597399950027466,
0.0037211275193840265,
0.029241269454360008,
0.08560924977064133,
0.21806277334690094,
0.15283000469207764,
0.00131309125572443,
0.06241951510310173,
0.061187174171209335,
-0.06838627904653549,
-0.14935708045959473,
0.12144918739795685,
-0.019161028787493706,
0.10611361265182495,
-0.06593725830316544,
0.1886863112449646,
0.037859782576560974,
-0.1801358163356781,
0.06373970955610275,
-0.024643873795866966,
-0.10918164253234863,
-0.11826866120100021,
-0.026067910715937614,
-0.07149738073348999,
-0.12278652936220169,
0.024950558319687843,
-0.1175098717212677,
0.06056869402527809,
0.10387445241212845,
0.007963588461279869,
0.038231320679187775,
0.18269513547420502,
-0.04694744944572449,
0.012897390872240067,
0.08327971398830414,
0.01827186904847622,
0.0030496090184897184,
-0.03862486407160759,
-0.06522833555936813,
0.03528681769967079,
0.03433872014284134,
0.06378044933080673,
-0.04889633134007454,
0.005804595537483692,
0.004266119562089443,
-0.008494891226291656,
-0.07595419883728027,
0.010895689949393272,
0.009513027966022491,
0.051105353981256485,
0.04553418979048729,
0.047450367361307144,
0.005240621976554394,
-0.05529490113258362,
0.2913624942302704,
-0.069244883954525,
-0.06866278499364853,
-0.12950804829597473,
0.21688973903656006,
0.022781506180763245,
-0.02575604058802128,
0.05599318444728851,
-0.08558771014213562,
-0.013890998438000679,
0.1639009714126587,
0.13971130549907684,
-0.08989841490983963,
-0.01626049354672432,
-0.02351602353155613,
-0.011244887486100197,
-0.014322895556688309,
0.11592583358287811,
0.07504843920469284,
-0.012706231325864792,
-0.07057447731494904,
-0.012544880621135235,
-0.027641385793685913,
-0.05526737868785858,
-0.0692572221159935,
0.06846778094768524,
0.02803046815097332,
-0.008933552540838718,
-0.06417547166347504,
0.06650817394256592,
0.0003574518486857414,
-0.23293434083461761,
0.04347586631774902,
-0.17226643860340118,
-0.17094571888446808,
-0.019453812390565872,
0.07140997797250748,
0.004100894555449486,
0.05680171400308609,
0.0006715530762448907,
0.023608990013599396,
0.11507076770067215,
-0.01438390463590622,
-0.0030248684342950583,
-0.11430350691080093,
0.11601521074771881,
-0.10188629478216171,
0.1987621784210205,
-0.006807422265410423,
0.05927411839365959,
0.09640948474407196,
0.03718072921037674,
-0.1372341364622116,
0.022841449826955795,
0.06252061575651169,
-0.12718957662582397,
-0.0035632899962365627,
0.14930294454097748,
-0.030632810667157173,
0.06258615851402283,
0.026355987414717674,
-0.14505866169929504,
0.0014935298822820187,
0.01932535693049431,
-0.035731200128793716,
-0.06898404657840729,
-0.008793883956968784,
-0.051411259919404984,
0.1648467779159546,
0.21618948876857758,
-0.029685774818062782,
0.008305174298584461,
-0.09106981009244919,
0.01317670289427042,
0.04798934981226921,
0.05075560510158539,
-0.04104231297969818,
-0.20582328736782074,
0.014592139981687069,
0.07348296046257019,
-0.005820111371576786,
-0.19708524644374847,
-0.09623254835605621,
0.04589085280895233,
-0.03905361890792847,
-0.04316813126206398,
0.09308017790317535,
0.021488353610038757,
0.041379474103450775,
-0.013068431988358498,
-0.11438921093940735,
-0.022077666595578194,
0.1379465013742447,
-0.17606361210346222,
-0.032939840108156204
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
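A minimal inference sketch, assuming the checkpoint is loaded directly with the Auto classes rather than a pipeline; the question and context strings are illustrative placeholders, not from the original card.

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What dataset was used for fine-tuning?"
context = "The checkpoint was fine-tuned on the SQuAD dataset for extractive question answering."

# Encode the (question, context) pair and run a forward pass without gradients.
inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions and decode the answer span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```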
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-6\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09777898341417313,
0.107546865940094,
-0.0023258288856595755,
0.09569839388132095,
0.12140414118766785,
0.01608388125896454,
0.09882712364196777,
0.1306215226650238,
-0.1043536514043808,
0.06813018769025803,
0.08706463873386383,
0.035100165754556656,
0.04328155890107155,
0.14996610581874847,
-0.00663397554308176,
-0.27285388112068176,
-0.0005295031005516648,
-0.0000883662432897836,
-0.04228359833359718,
0.12072888016700745,
0.08783917129039764,
-0.11031193286180496,
0.07693285495042801,
0.00981320533901453,
-0.15572799742221832,
0.020151928067207336,
-0.03220953419804573,
-0.03563927486538887,
0.12176290154457092,
-0.037356533110141754,
0.10682477802038193,
0.03041241690516472,
0.13680411875247955,
-0.2064426839351654,
0.006496849004179239,
0.07634976506233215,
0.052289701998233795,
0.09990031272172928,
0.04735320061445236,
0.010453016497194767,
0.08889801800251007,
-0.14883308112621307,
0.09221987426280975,
0.030116280540823936,
-0.09091341495513916,
-0.152669295668602,
-0.09233477711677551,
0.029786696657538414,
0.05057786405086517,
0.07161691039800644,
0.002222685841843486,
0.14721570909023285,
-0.06408757716417313,
0.08428174257278442,
0.26194196939468384,
-0.3226184546947479,
-0.06774245202541351,
0.030704110860824585,
0.05668703839182854,
0.06016622111201286,
-0.12019983679056168,
-0.0031408127397298813,
0.020306501537561417,
0.02792207896709442,
0.1267264038324356,
-0.01608944684267044,
-0.10818953812122345,
-0.009821366518735886,
-0.12506747245788574,
-0.001263801590539515,
0.0596500039100647,
0.0255765151232481,
-0.053045693784952164,
-0.106071338057518,
-0.06821498274803162,
-0.08375895023345947,
-0.02214215137064457,
-0.055017389357089996,
0.052032217383384705,
-0.05486934632062912,
-0.0971289798617363,
-0.040094196796417236,
-0.05778472498059273,
-0.08098185062408447,
-0.00849564466625452,
0.16645920276641846,
0.03571672737598419,
0.020707029849290848,
-0.030136091634631157,
0.1182703971862793,
0.020655406638979912,
-0.13940857350826263,
-0.009230867959558964,
-0.004154473077505827,
-0.09362461417913437,
-0.0410405658185482,
-0.05358048900961876,
-0.010897299274802208,
0.005236864555627108,
0.16666316986083984,
-0.08132319152355194,
0.07467205077409744,
0.012982841581106186,
-0.02420063689351082,
-0.013228108175098896,
0.15143926441669464,
-0.039270542562007904,
-0.03765469044446945,
-0.01654095947742462,
0.07978955656290054,
0.0036182475741952658,
-0.018129991367459297,
-0.06664303690195084,
-0.02787727676331997,
0.09513041377067566,
0.05545537918806076,
-0.06366387009620667,
0.038987983018159866,
-0.029597612097859383,
-0.02618587017059326,
0.017929449677467346,
-0.1193147823214531,
0.041481487452983856,
-0.004218959249556065,
-0.08176753669977188,
-0.008845602162182331,
-0.002367812441661954,
-0.009495602920651436,
-0.010163776576519012,
0.0983736589550972,
-0.08742527663707733,
-0.00045321404468268156,
-0.07045796513557434,
-0.0802326425909996,
-0.0009717930806800723,
-0.15613113343715668,
-0.0140887051820755,
-0.058196280151605606,
-0.16760039329528809,
-0.03366679698228836,
0.04356779530644417,
-0.07623573392629623,
-0.012303952127695084,
-0.04363042116165161,
-0.06237848475575447,
0.017236366868019104,
-0.01272959727793932,
0.1939471960067749,
-0.05237434804439545,
0.08152440190315247,
-0.0059139421209692955,
0.050777535885572433,
0.02638520859181881,
0.03604734316468239,
-0.10325292497873306,
0.028105996549129486,
-0.13993452489376068,
0.07756082713603973,
-0.08586592227220535,
-0.005165427923202515,
-0.1372445523738861,
-0.10234592109918594,
0.014613059349358082,
-0.020961740985512733,
0.09357590973377228,
0.1337982565164566,
-0.19831421971321106,
-0.01858336664736271,
0.12530627846717834,
-0.07662627846002579,
-0.05098449066281319,
0.059499382972717285,
-0.061586689203977585,
0.04068760946393013,
0.05073539912700653,
0.2129608392715454,
0.054630909115076065,
-0.15582163631916046,
-0.011169971898198128,
0.005827017594128847,
0.04526890441775322,
0.025812363252043724,
0.0379629023373127,
0.003991940524429083,
0.05732925981283188,
0.016160333529114723,
-0.09108322113752365,
-0.02531842514872551,
-0.09007782489061356,
-0.06599440425634384,
-0.05077654868364334,
-0.07571133971214294,
0.05495802313089371,
0.00800997857004404,
0.041659533977508545,
-0.06524792313575745,
-0.10277613997459412,
0.11327964812517166,
0.09594698995351791,
-0.05116187781095505,
0.03771338611841202,
-0.08070321381092072,
0.012718310579657555,
-0.006116610486060381,
-0.035719551146030426,
-0.21188409626483917,
-0.11694890260696411,
0.05152994394302368,
-0.04351473227143288,
0.02571663074195385,
0.003256110241636634,
0.08432339876890182,
0.05607263743877411,
-0.05153505876660347,
-0.015702135860919952,
-0.09685834497213364,
0.000868123141117394,
-0.1143113449215889,
-0.19057804346084595,
-0.08566402643918991,
-0.043358154594898224,
0.092580147087574,
-0.1675075888633728,
-0.005509964190423489,
0.021743498742580414,
0.1381046622991562,
0.02736913599073887,
-0.0672382265329361,
0.0015690759755671024,
0.046404674649238586,
0.013558303005993366,
-0.09673131257295609,
0.05531326308846474,
0.012249194085597992,
-0.10295172780752182,
-0.04826725274324417,
-0.13163352012634277,
-0.017502617090940475,
0.05353470519185066,
0.06135016307234764,
-0.10355447977781296,
-0.06020871177315712,
-0.07289407402276993,
-0.038172245025634766,
-0.0777106061577797,
0.02316858060657978,
0.2124014049768448,
0.0380806066095829,
0.11293923854827881,
-0.06516076624393463,
-0.08448019623756409,
-0.007451930548995733,
0.026937277987599373,
0.022588670253753662,
0.08524464815855026,
0.0238955095410347,
-0.04024727642536163,
0.06878522038459778,
0.10269921272993088,
-0.026707075536251068,
0.13314838707447052,
-0.05597805604338646,
-0.08100332319736481,
-0.03064119629561901,
-0.020851124078035355,
-0.025785619392991066,
0.12916426360607147,
-0.027450725436210632,
0.0003908916551154107,
0.0344776026904583,
0.03782254457473755,
0.011359396390616894,
-0.1691378355026245,
0.0029824578668922186,
0.02676771767437458,
-0.05540746822953224,
-0.04065496101975441,
-0.0058968341909348965,
0.021050453186035156,
0.08887124806642532,
0.029852688312530518,
-0.00790190789848566,
0.008716423064470291,
-0.010875233449041843,
-0.057746488600969315,
0.18829548358917236,
-0.09614212065935135,
-0.07904542237520218,
-0.0715818926692009,
0.020240111276507378,
-0.051019489765167236,
-0.03878413513302803,
0.008004573173820972,
-0.09404858946800232,
-0.029785001650452614,
-0.08760460466146469,
-0.026698263362050056,
-0.020181335508823395,
0.01980604976415634,
0.02452309988439083,
-0.01671764999628067,
0.08529528975486755,
-0.13832439482212067,
0.005684251897037029,
-0.04938662424683571,
-0.09599118679761887,
0.009161816909909248,
0.07626698911190033,
0.09132691472768784,
0.08058759570121765,
-0.019001968204975128,
0.02816920541226864,
-0.03885268792510033,
0.23200859129428864,
-0.05310504138469696,
0.012385918758809566,
0.11491032689809799,
-0.010054262354969978,
0.05488269031047821,
0.09225538372993469,
0.03808412700891495,
-0.09105651080608368,
0.02403908409178257,
0.07764088362455368,
-0.03769661858677864,
-0.2264568954706192,
-0.018684906885027885,
-0.0009347833693027496,
-0.07777862250804901,
0.10730457305908203,
0.03224179893732071,
-0.05028706416487694,
0.044345010071992874,
0.02316088043153286,
-0.00894606951624155,
-0.04640530049800873,
0.07460442930459976,
0.07132231444120407,
0.05154747515916824,
0.10690125823020935,
-0.0052426946349442005,
-0.02634275145828724,
0.055321499705314636,
0.017757248133420944,
0.25509756803512573,
-0.044042542576789856,
0.10073726624250412,
0.03175009787082672,
0.15400364995002747,
-0.02011219970881939,
0.06752200424671173,
0.0006195006426423788,
-0.009866730310022831,
-0.010952728800475597,
-0.06567295640707016,
-0.02417064644396305,
0.017420930787920952,
-0.04753734543919563,
0.023944763466715813,
-0.0759989470243454,
0.0242436733096838,
0.026272002607584,
0.29261600971221924,
0.028057442978024483,
-0.26054683327674866,
-0.07304086536169052,
-0.016068274155259132,
-0.04523468390107155,
-0.061390191316604614,
0.00706939771771431,
0.1340191662311554,
-0.13883593678474426,
0.05369164049625397,
-0.07893987745046616,
0.08846879005432129,
-0.04412275552749634,
0.012149286456406116,
0.04713333025574684,
0.1514817327260971,
-0.01808297261595726,
0.05236982926726341,
-0.19837097823619843,
0.2522664964199066,
0.020907405763864517,
0.10675522685050964,
-0.06476714462041855,
0.011219415813684464,
0.019976885989308357,
0.010203847661614418,
0.1119227483868599,
0.0029995590448379517,
-0.066181480884552,
-0.1461319476366043,
-0.09068308025598526,
0.04587026685476303,
0.14178769290447235,
-0.03931110352277756,
0.08964759856462479,
-0.02856704406440258,
0.011314088478684425,
0.030101658776402473,
-0.04204469919204712,
-0.15224750339984894,
-0.0775725245475769,
0.000732355925720185,
0.014192217029631138,
-0.007075628265738487,
-0.06123463064432144,
-0.10555071383714676,
-0.01724897511303425,
0.11229363083839417,
-0.0007163331029005349,
-0.057825226336717606,
-0.15453487634658813,
0.0841473937034607,
0.14197410643100739,
-0.05415697395801544,
0.011967675760388374,
0.017060237005352974,
0.11459562927484512,
0.029856428503990173,
-0.08119074255228043,
0.06502187252044678,
-0.05760942026972771,
-0.18140770494937897,
-0.05541788041591644,
0.12207238376140594,
0.08213692158460617,
0.049432422965765,
-0.0007148728473111987,
0.05412708967924118,
0.0021017498802393675,
-0.09515831619501114,
0.04073132202029228,
0.0013334270333871245,
0.04387892410159111,
0.018074313178658485,
-0.08499373495578766,
0.09480664879083633,
-0.03704362362623215,
0.009954683482646942,
0.12775695323944092,
0.21208840608596802,
-0.10715193301439285,
0.11466509103775024,
0.08578082919120789,
-0.07387382537126541,
-0.16617831587791443,
0.06162098050117493,
0.1313498169183731,
0.008750137872993946,
0.08540293574333191,
-0.21373164653778076,
0.12438883632421494,
0.10269587486982346,
-0.013068568892776966,
0.007682406809180975,
-0.27496957778930664,
-0.13067540526390076,
0.05836181342601776,
0.11202295124530792,
0.041756853461265564,
-0.11456963419914246,
-0.03333786129951477,
-0.006369804963469505,
-0.0984833687543869,
0.11458727717399597,
-0.0736134797334671,
0.11448710411787033,
-0.020561998710036278,
0.11776848882436752,
0.025505773723125458,
-0.03412001579999924,
0.10787849873304367,
0.06180274859070778,
0.0877143070101738,
-0.036530278623104095,
0.008586300536990166,
0.06081994250416756,
-0.05904517322778702,
0.02727908454835415,
-0.042897678911685944,
0.06753870844841003,
-0.14722996950149536,
0.006247975397855043,
-0.08792583644390106,
0.053901296108961105,
-0.046011775732040405,
-0.07184511423110962,
-0.016666250303387642,
0.05318548530340195,
0.06894933432340622,
-0.0416107252240181,
0.028367450460791588,
-0.0012177537428215146,
0.09966447204351425,
0.10536479204893112,
0.08163916319608688,
-0.023724107071757317,
-0.08685392886400223,
0.013149193488061428,
0.0033451991621404886,
0.055180612951517105,
-0.09597016125917435,
0.013311734423041344,
0.14137190580368042,
0.06524745374917984,
0.09539393335580826,
0.046735115349292755,
-0.04271213710308075,
0.004963931627571583,
0.014198850840330124,
-0.13253247737884521,
-0.10484681278467178,
0.025378772988915443,
-0.041104044765233994,
-0.15158508718013763,
0.02821214497089386,
0.11972327530384064,
-0.040085319429636,
-0.021361131221055984,
-0.005251175258308649,
0.004172706510871649,
-0.012089984491467476,
0.18481560051441193,
0.04424356669187546,
0.06328018754720688,
-0.09089699387550354,
0.11079259216785431,
0.03384515643119812,
-0.049938954412937164,
0.05375044792890549,
0.0672120675444603,
-0.10425771027803421,
0.010308816097676754,
0.08180484175682068,
0.13243722915649414,
-0.054718680679798126,
-0.011272256262600422,
-0.09564775973558426,
-0.08399150520563126,
0.04139869660139084,
0.13593369722366333,
0.055802322924137115,
-0.002450663363561034,
-0.06498724222183228,
0.03611820563673973,
-0.11962618678808212,
0.0688074380159378,
0.047767430543899536,
0.07630105316638947,
-0.10063287615776062,
0.13214097917079926,
-0.002365939551964402,
0.026903195306658745,
-0.027266882359981537,
0.014071032404899597,
-0.09899573773145676,
-0.024605540558695793,
-0.1057400107383728,
-0.02349870093166828,
-0.011067749001085758,
-0.0013115585315972567,
-0.022753115743398666,
-0.06947484612464905,
-0.028754280880093575,
0.039821743965148926,
-0.07912629842758179,
-0.04815105348825455,
0.016727015376091003,
0.036003973335027695,
-0.1537998765707016,
0.0029626504983752966,
0.02511964552104473,
-0.08943955600261688,
0.08974587917327881,
0.06274019926786423,
0.012473040260374546,
0.024894189089536667,
-0.11868146061897278,
-0.030846688896417618,
-0.009278900921344757,
0.004288539756089449,
0.06855522841215134,
-0.09448988735675812,
-0.028429968282580376,
-0.03560382127761841,
0.041511036455631256,
0.019497448578476906,
0.10383350402116776,
-0.1201653778553009,
-0.004270133562386036,
-0.03833237662911415,
-0.03952817991375923,
-0.06503944098949432,
0.03668677434325218,
0.10692715644836426,
0.053742170333862305,
0.15137244760990143,
-0.0768810510635376,
0.054651644080877304,
-0.19913434982299805,
-0.03796691074967384,
0.011913408525288105,
-0.04710284620523453,
-0.08174987137317657,
-0.04688010737299919,
0.08826284110546112,
-0.047362618148326874,
0.11342622339725494,
-0.012852787040174007,
0.10159250348806381,
0.04380470886826515,
-0.010301466099917889,
-0.06426651775836945,
-0.0064306557178497314,
0.18229499459266663,
0.05214657261967659,
-0.017408357933163643,
0.12769414484500885,
0.003978712018579245,
0.028839057311415672,
0.08705747127532959,
0.22032396495342255,
0.15407609939575195,
0.00029535958310589194,
0.06251338869333267,
0.06089470535516739,
-0.06908982992172241,
-0.1491972804069519,
0.12134160846471786,
-0.01881040446460247,
0.10594362020492554,
-0.06631682813167572,
0.18951573967933655,
0.037799082696437836,
-0.1797926276922226,
0.0644197016954422,
-0.025089673697948456,
-0.10897386819124222,
-0.11840883642435074,
-0.025938980281352997,
-0.07143149524927139,
-0.12250592559576035,
0.02533828653395176,
-0.11768448352813721,
0.0614873431622982,
0.10415042191743851,
0.007960663177073002,
0.038435839116573334,
0.18336373567581177,
-0.04673207923769951,
0.012882581911981106,
0.08376779407262802,
0.01854723133146763,
0.0026918656658381224,
-0.03904644027352333,
-0.06491760909557343,
0.036521513015031815,
0.03426232933998108,
0.06394224613904953,
-0.04868558049201965,
0.004736601375043392,
0.0040967767126858234,
-0.008475039154291153,
-0.0761314183473587,
0.011147082783281803,
0.009842014871537685,
0.05130956321954727,
0.04630271717905998,
0.047403983771800995,
0.005866611376404762,
-0.05540059879422188,
0.2931637167930603,
-0.06966686248779297,
-0.06844969838857651,
-0.12854275107383728,
0.21778573095798492,
0.023923873901367188,
-0.025698857381939888,
0.05588128790259361,
-0.08655089884996414,
-0.014190280809998512,
0.1622716188430786,
0.1377304196357727,
-0.08878596127033234,
-0.015958502888679504,
-0.0239175483584404,
-0.01109298411756754,
-0.01416196022182703,
0.11551601439714432,
0.07540316134691238,
-0.0119058508425951,
-0.07083387672901154,
-0.01230524480342865,
-0.02765977941453457,
-0.056115735322237015,
-0.06816904991865158,
0.06856173276901245,
0.028347669169306755,
-0.009432111866772175,
-0.06441693007946014,
0.06682045757770538,
0.0001346346689388156,
-0.23259499669075012,
0.04269114136695862,
-0.17346207797527313,
-0.17046815156936646,
-0.019375305622816086,
0.07128400355577469,
0.004258084576576948,
0.056448958814144135,
0.0009218797204084694,
0.02420460432767868,
0.11288271099328995,
-0.014272745698690414,
-0.0033166927751153708,
-0.11569610238075256,
0.11524412035942078,
-0.10248223692178726,
0.19934697449207306,
-0.006908203475177288,
0.058288026601076126,
0.09638956934213638,
0.037390872836112976,
-0.13777801394462585,
0.022577382624149323,
0.06287883967161179,
-0.1279369741678238,
-0.0037347038742154837,
0.1504690796136856,
-0.030674539506435394,
0.06299509853124619,
0.026020921766757965,
-0.14674945175647736,
0.0019539655186235905,
0.018081901594996452,
-0.03527475893497467,
-0.06964217871427536,
-0.0069008637219667435,
-0.05110898241400719,
0.16495679318904877,
0.2164195328950882,
-0.02933463081717491,
0.0080825574696064,
-0.09162671864032745,
0.012818926945328712,
0.04682813212275505,
0.05161470174789429,
-0.040845468640327454,
-0.2058800309896469,
0.01506477314978838,
0.07320583611726761,
-0.005972175393253565,
-0.19658009707927704,
-0.09583037346601486,
0.04614808037877083,
-0.040088869631290436,
-0.04320066049695015,
0.09241282194852829,
0.02148953452706337,
0.04074576124548912,
-0.013155779801309109,
-0.11672773212194443,
-0.021647417917847633,
0.1385541409254074,
-0.17617419362068176,
-0.032539937645196915
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8
This model is a fine-tuned version of [SpanBERT/spanbert-base-cased](https://huggingface.co/SpanBERT/spanbert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
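In the absence of a documented usage example, the following is a minimal sketch, assuming the standard `transformers` question-answering pipeline; the model id is taken from this repository, while the question and context strings are invented placeholders.
```
# Hypothetical usage sketch (not from the original card): extractive QA via pipeline.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8",
)

# Placeholder inputs; any SQuAD-style question/context pair works the same way.
result = qa(
    question="Who generated the model card?",
    context="The model card was generated automatically by the Trainer.",
)
print(result["answer"], result["score"])
```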
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
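The settings above can be approximated with the `transformers` Trainer API as in the sketch below; this is a reconstruction, not the original training script, and the `output_dir` name is only illustrative.
```
# Approximate TrainingArguments mirroring the listed hyperparameters (assumed, not the original script).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8",  # illustrative
    learning_rate=3e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=200,  # corresponds to "training_steps: 200" above
)
```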
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8", "results": []}]}
|
question-answering
|
anas-awadalla/spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us
|
# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8
This model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200
### Training results
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
[
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n",
"# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200",
"### Training results",
"### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
42,
56,
6,
12,
8,
3,
104,
4,
38
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #endpoints_compatible #region-us \n# spanbert-base-cased-few-shot-k-64-finetuned-squad-seed-8\n\nThis model is a fine-tuned version of SpanBERT/spanbert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 24\n- eval_batch_size: 24\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.1\n- training_steps: 200### Training results### Framework versions\n\n- Transformers 4.16.0.dev0\n- Pytorch 1.10.2+cu102\n- Datasets 1.17.0\n- Tokenizers 0.10.3"
] |
[
-0.09828430414199829,
0.10841035842895508,
-0.002351732226088643,
0.09611917287111282,
0.12141726166009903,
0.016754688695073128,
0.09844475239515305,
0.13051797449588776,
-0.10399731993675232,
0.06768608838319778,
0.08740446716547012,
0.03439464420080185,
0.043160196393728256,
0.14920392632484436,
-0.006781809497624636,
-0.2729329466819763,
-0.0010383655317127705,
0.0005562950973398983,
-0.040428679436445236,
0.1202133372426033,
0.0879170224070549,
-0.11038452386856079,
0.076813243329525,
0.009662116877734661,
-0.15504592657089233,
0.01999465934932232,
-0.031711094081401825,
-0.036174360662698746,
0.1221359595656395,
-0.03664083406329155,
0.10689456760883331,
0.029632722958922386,
0.13685467839241028,
-0.20718054473400116,
0.006291546393185854,
0.07659147679805756,
0.052187480032444,
0.09996909648180008,
0.04607250168919563,
0.011400271207094193,
0.08788478374481201,
-0.14931271970272064,
0.09247418493032455,
0.02968958579003811,
-0.09070193767547607,
-0.153184711933136,
-0.09140068292617798,
0.031046215444803238,
0.04996049776673317,
0.07100442051887512,
0.00325956498272717,
0.14895184338092804,
-0.06281698495149612,
0.08471720665693283,
0.26218700408935547,
-0.3223406672477722,
-0.0669490173459053,
0.02969234809279442,
0.05641356110572815,
0.06145823746919632,
-0.11930844187736511,
-0.0029703250620514154,
0.01995467022061348,
0.02773340977728367,
0.12726011872291565,
-0.016857611015439034,
-0.10967116057872772,
-0.009923674166202545,
-0.12450223416090012,
-0.0017262265319004655,
0.06032304838299751,
0.025773216038942337,
-0.053235974162817,
-0.10611815750598907,
-0.06894466280937195,
-0.08459898829460144,
-0.022649679332971573,
-0.055388160049915314,
0.05143413692712784,
-0.0544426366686821,
-0.09625962376594543,
-0.04014580696821213,
-0.057242028415203094,
-0.08010131120681763,
-0.00825288612395525,
0.16645826399326324,
0.03587803244590759,
0.020630016922950745,
-0.029322881251573563,
0.11786764115095139,
0.018658366054296494,
-0.13947485387325287,
-0.00877363234758377,
-0.003837631782516837,
-0.09365863353013992,
-0.04123410955071449,
-0.05388810113072395,
-0.01200917549431324,
0.004837534390389919,
0.16829811036586761,
-0.08099097013473511,
0.07441195845603943,
0.013507670722901821,
-0.023590700700879097,
-0.012802604585886002,
0.15228933095932007,
-0.039916884154081345,
-0.03852321580052376,
-0.01557155605405569,
0.07937747240066528,
0.00453455513343215,
-0.019079608842730522,
-0.06774254143238068,
-0.02857852727174759,
0.0958133414387703,
0.05482696369290352,
-0.06361938267946243,
0.038442742079496384,
-0.029785465449094772,
-0.02639657072722912,
0.018744153901934624,
-0.11913847178220749,
0.041677799075841904,
-0.004290567245334387,
-0.08143000304698944,
-0.008030216209590435,
-0.001550187123939395,
-0.008803172037005424,
-0.01050045806914568,
0.09738708287477493,
-0.0865759328007698,
0.000030605297069996595,
-0.07007783651351929,
-0.080417700111866,
-0.0006662480300292373,
-0.1560567170381546,
-0.013244467787444592,
-0.05947289988398552,
-0.16776780784130096,
-0.0331944040954113,
0.043823275715112686,
-0.075621098279953,
-0.012050917372107506,
-0.04253442585468292,
-0.060951974242925644,
0.01707754284143448,
-0.0133591890335083,
0.1917768269777298,
-0.05255793035030365,
0.08110345900058746,
-0.005856906995177269,
0.050431374460458755,
0.025045890361070633,
0.03614131361246109,
-0.10280460864305496,
0.027963686734437943,
-0.13994769752025604,
0.07747790962457657,
-0.08517186343669891,
-0.005032880697399378,
-0.13587446510791779,
-0.10224130749702454,
0.015670007094740868,
-0.020276488736271858,
0.09432082623243332,
0.13302212953567505,
-0.19823475182056427,
-0.01841038651764393,
0.12443044036626816,
-0.0762171521782875,
-0.05117601901292801,
0.0610857829451561,
-0.06178754195570946,
0.04065893590450287,
0.0513019785284996,
0.2129088193178177,
0.05628785118460655,
-0.15582585334777832,
-0.010363646782934666,
0.007240366656333208,
0.04597732052206993,
0.02469959855079651,
0.03780665621161461,
0.004389192909002304,
0.05771278217434883,
0.016480213031172752,
-0.08990197628736496,
-0.024491365998983383,
-0.09002158045768738,
-0.06643863022327423,
-0.0510205514729023,
-0.07546044886112213,
0.054580312222242355,
0.008734210394322872,
0.04172825813293457,
-0.0650351494550705,
-0.10333267599344254,
0.11490166932344437,
0.09561257064342499,
-0.0515662282705307,
0.037775956094264984,
-0.08031714707612991,
0.013285871595144272,
-0.00532962242141366,
-0.03563370555639267,
-0.21138866245746613,
-0.11572366952896118,
0.05169852450489998,
-0.043330561369657516,
0.02520778961479664,
0.0036423103883862495,
0.0836176723241806,
0.056554894894361496,
-0.05097232013940811,
-0.014899620786309242,
-0.09626756608486176,
0.0011722816852852702,
-0.11517778784036636,
-0.18940727412700653,
-0.08583047240972519,
-0.04279247298836708,
0.09309540688991547,
-0.16834545135498047,
-0.0049539972096681595,
0.021251613274216652,
0.13741351664066315,
0.026913123205304146,
-0.06718098372220993,
0.00151330407243222,
0.04713863879442215,
0.013307503424584866,
-0.09702989459037781,
0.05550611764192581,
0.012897665612399578,
-0.10389342904090881,
-0.04938644543290138,
-0.13205640017986298,
-0.016464248299598694,
0.053290415555238724,
0.060325197875499725,
-0.10346515476703644,
-0.060770027339458466,
-0.07233819365501404,
-0.03797828033566475,
-0.07696357369422913,
0.02230307087302208,
0.2133350670337677,
0.03833233565092087,
0.11397142708301544,
-0.06499205529689789,
-0.08385670930147171,
-0.007402643095701933,
0.026765016838908195,
0.02325575239956379,
0.0843387022614479,
0.0233466736972332,
-0.03792194277048111,
0.06800660490989685,
0.10179460793733597,
-0.027789097279310226,
0.13250192999839783,
-0.05566883459687233,
-0.0805373266339302,
-0.030884958803653717,
-0.02118055708706379,
-0.02555488422513008,
0.12912443280220032,
-0.0291574839502573,
-0.0002494690124876797,
0.034749314188957214,
0.03736772760748863,
0.011515042744576931,
-0.1691274642944336,
0.002854169812053442,
0.027728481218218803,
-0.05487940087914467,
-0.03968791663646698,
-0.006666919682174921,
0.02062036283314228,
0.08800844103097916,
0.02947733923792839,
-0.008255133405327797,
0.009423783980309963,
-0.01053114328533411,
-0.05804218351840973,
0.18761637806892395,
-0.0961475819349289,
-0.08046934008598328,
-0.07294828444719315,
0.020643703639507294,
-0.051125459372997284,
-0.03881176933646202,
0.008385399356484413,
-0.0922774001955986,
-0.029604393988847733,
-0.0880034863948822,
-0.028563346713781357,
-0.019546927884221077,
0.019769733771681786,
0.024975381791591644,
-0.017250292003154755,
0.08587636798620224,
-0.13815489411354065,
0.005426865536719561,
-0.04871071130037308,
-0.09504835307598114,
0.009675012901425362,
0.07605395466089249,
0.0922311320900917,
0.0810953676700592,
-0.01979556865990162,
0.027864588424563408,
-0.03853999450802803,
0.23184648156166077,
-0.0525873564183712,
0.012765951454639435,
0.11526253074407578,
-0.01111164502799511,
0.05508079007267952,
0.09161880612373352,
0.03844209015369415,
-0.09107261151075363,
0.02353859134018421,
0.07653538882732391,
-0.038354791700839996,
-0.22553132474422455,
-0.018589546903967857,
-0.0010820399038493633,
-0.07753899693489075,
0.10757429152727127,
0.032124318182468414,
-0.04973535239696503,
0.0448228120803833,
0.02335026115179062,
-0.00724813062697649,
-0.0472353957593441,
0.07449445873498917,
0.07202918827533722,
0.05151434242725372,
0.10676993429660797,
-0.0048909434117376804,
-0.02644968591630459,
0.055617984384298325,
0.01707085408270359,
0.2529353201389313,
-0.0442495159804821,
0.1017046719789505,
0.030643058940768242,
0.15471912920475006,
-0.02006690390408039,
0.06769653409719467,
0.0002855632919818163,
-0.010219087824225426,
-0.011161848902702332,
-0.06583477556705475,
-0.02572384662926197,
0.017954658716917038,
-0.04812829941511154,
0.024313772097229958,
-0.07622917741537094,
0.025013402104377747,
0.0259782113134861,
0.2921144664287567,
0.027834508568048477,
-0.2608294188976288,
-0.07334993779659271,
-0.016402162611484528,
-0.04505472630262375,
-0.061960138380527496,
0.007019816432148218,
0.13487687706947327,
-0.13843917846679688,
0.05319537594914436,
-0.07819470018148422,
0.08862658590078354,
-0.045271504670381546,
0.012549792416393757,
0.04628291726112366,
0.151418998837471,
-0.01777486316859722,
0.05267045646905899,
-0.19907492399215698,
0.25058451294898987,
0.021263444796204567,
0.10706401616334915,
-0.06517742574214935,
0.011769939213991165,
0.019567295908927917,
0.012025394476950169,
0.1110101267695427,
0.0034379353746771812,
-0.06490780413150787,
-0.14765772223472595,
-0.09098272770643234,
0.045880917459726334,
0.14045944809913635,
-0.03771696239709854,
0.08925977349281311,
-0.028666289523243904,
0.011446681804955006,
0.03040451928973198,
-0.04122775048017502,
-0.15217307209968567,
-0.07834047824144363,
0.00023873911413829774,
0.015532289631664753,
-0.0067454744130373,
-0.06100030988454819,
-0.10516443848609924,
-0.01871737465262413,
0.1124952882528305,
0.00014084384019952267,
-0.05808103084564209,
-0.15453651547431946,
0.08473041653633118,
0.1410956233739853,
-0.054351046681404114,
0.011789528653025627,
0.017042890191078186,
0.11473080515861511,
0.029594730585813522,
-0.08030666410923004,
0.06505001336336136,
-0.05738508701324463,
-0.18043698370456696,
-0.055778197944164276,
0.12114258855581284,
0.08167380839586258,
0.04959728568792343,
-0.00026878868811763823,
0.05382303521037102,
0.0019365883199498057,
-0.09520303457975388,
0.03942514955997467,
0.0019450954860076308,
0.04291410744190216,
0.017943089827895164,
-0.08502141386270523,
0.09583113342523575,
-0.036685045808553696,
0.010150542482733727,
0.1289702206850052,
0.2116280198097229,
-0.10752428323030472,
0.1137981191277504,
0.08661101013422012,
-0.07377360761165619,
-0.16583366692066193,
0.0611281618475914,
0.13075114786624908,
0.008822551928460598,
0.08554256707429886,
-0.2131592035293579,
0.12444280833005905,
0.10376270115375519,
-0.012586156837642193,
0.007359504699707031,
-0.2749796211719513,
-0.1309305876493454,
0.059261664748191833,
0.11199534684419632,
0.044058118015527725,
-0.1151835098862648,
-0.03324471414089203,
-0.007185773458331823,
-0.0997919887304306,
0.11365395039319992,
-0.07325796782970428,
0.11428235471248627,
-0.020374804735183716,
0.1162644550204277,
0.025647949427366257,
-0.03431636095046997,
0.10830026865005493,
0.06247171759605408,
0.08767640590667725,
-0.036776576191186905,
0.009006732143461704,
0.06063035875558853,
-0.0591939352452755,
0.028139228001236916,
-0.042759090662002563,
0.06757295876741409,
-0.1490938514471054,
0.005968168377876282,
-0.08655378222465515,
0.05443302541971207,
-0.04568518325686455,
-0.07220885157585144,
-0.016534211114048958,
0.05203410983085632,
0.06883103400468826,
-0.041458502411842346,
0.02944866009056568,
-0.0008296617888845503,
0.09887432307004929,
0.10697872191667557,
0.08007731288671494,
-0.027124058455228806,
-0.08704374730587006,
0.01315640565007925,
0.0031304776202887297,
0.055555377155542374,
-0.09554024785757065,
0.013863716274499893,
0.14154022932052612,
0.06525864452123642,
0.09567548334598541,
0.04569695517420769,
-0.04208783805370331,
0.005031772423535585,
0.013649989850819111,
-0.13233838975429535,
-0.1039593517780304,
0.024785717949271202,
-0.0414733961224556,
-0.15103429555892944,
0.02717968076467514,
0.12030453234910965,
-0.04085931181907654,
-0.020689455792307854,
-0.005536484997719526,
0.0028195464983582497,
-0.012040994130074978,
0.18442845344543457,
0.04489691182971001,
0.06307321786880493,
-0.09080052375793457,
0.11030558496713638,
0.03434896096587181,
-0.04900038242340088,
0.05416366457939148,
0.06671885401010513,
-0.10458045452833176,
0.010125440545380116,
0.08188635855913162,
0.13241109251976013,
-0.055600665509700775,
-0.012388939969241619,
-0.09625215828418732,
-0.08294300734996796,
0.04097997397184372,
0.13468606770038605,
0.0562070831656456,
-0.002808312186971307,
-0.06532072275876999,
0.035335127264261246,
-0.11982175707817078,
0.0682583823800087,
0.04741012305021286,
0.07659752666950226,
-0.10087043046951294,
0.13359355926513672,
-0.0016360072186216712,
0.026847073808312416,
-0.027219869196414948,
0.013603096827864647,
-0.09903702884912491,
-0.02431718073785305,
-0.10723591595888138,
-0.023285899311304092,
-0.01095852442085743,
-0.0008515712106600404,
-0.02279934659600258,
-0.06917057186365128,
-0.028741834685206413,
0.039718419313430786,
-0.07838386297225952,
-0.04796460270881653,
0.01711852289736271,
0.035612259060144424,
-0.15329013764858246,
0.0026266940403729677,
0.024756431579589844,
-0.0893077403306961,
0.0900755524635315,
0.06215127930045128,
0.012007658369839191,
0.024469954892992973,
-0.11888349056243896,
-0.030763613060116768,
-0.009261266328394413,
0.005235457327216864,
0.06858404725790024,
-0.09280727803707123,
-0.027576392516493797,
-0.03513962775468826,
0.04141608253121376,
0.019369447603821754,
0.1043037697672844,
-0.12037117034196854,
-0.004157292656600475,
-0.038137152791023254,
-0.03957687318325043,
-0.06551678478717804,
0.036642368882894516,
0.10662158578634262,
0.052950240671634674,
0.151315838098526,
-0.0767727643251419,
0.05446501076221466,
-0.1994047313928604,
-0.03822407126426697,
0.012191234156489372,
-0.04682037606835365,
-0.08157768845558167,
-0.04783445969223976,
0.08797347545623779,
-0.04675787687301636,
0.11529158055782318,
-0.012791311368346214,
0.10215268284082413,
0.04333064705133438,
-0.010968739166855812,
-0.06413295120000839,
-0.007197785656899214,
0.18332761526107788,
0.05334113538265228,
-0.01738101802766323,
0.12682890892028809,
0.004322560504078865,
0.02999279648065567,
0.08642961829900742,
0.21776100993156433,
0.1540316641330719,
-0.00048689061077311635,
0.06254785507917404,
0.060619648545980453,
-0.06854818761348724,
-0.1493275910615921,
0.12098375707864761,
-0.018951179459691048,
0.10636457055807114,
-0.0662522092461586,
0.18942497670650482,
0.037817683070898056,
-0.1796664297580719,
0.06411474198102951,
-0.024381259456276894,
-0.10921035706996918,
-0.11856599152088165,
-0.02540953829884529,
-0.07150031626224518,
-0.12262171506881714,
0.024867383763194084,
-0.1171097382903099,
0.06102396547794342,
0.10450723022222519,
0.007471360731869936,
0.03824860230088234,
0.18270693719387054,
-0.04675915092229843,
0.013275792822241783,
0.08338561654090881,
0.01837637461721897,
0.0030452904757112265,
-0.04023201763629913,
-0.06580568104982376,
0.036464061588048935,
0.0343472994863987,
0.06451047956943512,
-0.04876529425382614,
0.0067152478732168674,
0.004524510819464922,
-0.008151200599968433,
-0.07661917060613632,
0.011082977056503296,
0.00917788501828909,
0.0510258674621582,
0.045024048537015915,
0.047664251178503036,
0.005997122265398502,
-0.05560895428061485,
0.2914558947086334,
-0.06896056979894638,
-0.06914307922124863,
-0.12843672931194305,
0.21670666337013245,
0.02436533011496067,
-0.025417398661375046,
0.05585230514407158,
-0.08649427443742752,
-0.013496289029717445,
0.1625296175479889,
0.13705874979496002,
-0.08952663838863373,
-0.015938755124807358,
-0.023566873744130135,
-0.011225437745451927,
-0.015279085375368595,
0.11613790690898895,
0.07560092210769653,
-0.013073093257844448,
-0.07001796364784241,
-0.012506418861448765,
-0.027834253385663033,
-0.05610887333750725,
-0.06948506832122803,
0.06776240468025208,
0.02855149284005165,
-0.008686299435794353,
-0.06396348029375076,
0.06625358760356903,
0.0010501667857170105,
-0.23313027620315552,
0.04297913610935211,
-0.1737307757139206,
-0.17031879723072052,
-0.019202323630452156,
0.0715717002749443,
0.004459436517208815,
0.056021105498075485,
0.0010102103697136045,
0.02463231422007084,
0.11367674171924591,
-0.014547574333846569,
-0.003793677082285285,
-0.11444143950939178,
0.11453922837972641,
-0.10121726244688034,
0.1990378051996231,
-0.006873325444757938,
0.05902962014079094,
0.09636402875185013,
0.03779313713312149,
-0.13692373037338257,
0.0229229424148798,
0.06255415081977844,
-0.12689101696014404,
-0.0040198685601353645,
0.14884035289287567,
-0.030675075948238373,
0.0628877580165863,
0.026497410610318184,
-0.14650245010852814,
0.0011825242545455694,
0.016937725245952606,
-0.03527334704995155,
-0.06925068795681,
-0.008499392308294773,
-0.05048159882426262,
0.16499759256839752,
0.21552015841007233,
-0.02908594161272049,
0.007433941122144461,
-0.09170860052108765,
0.012647365219891071,
0.047404300421476364,
0.051468029618263245,
-0.04107625037431717,
-0.20562857389450073,
0.015571820549666882,
0.07305392622947693,
-0.00576857291162014,
-0.19624847173690796,
-0.09639862924814224,
0.046031445264816284,
-0.040569864213466644,
-0.04290955513715744,
0.0924375131726265,
0.02160651609301567,
0.0410766527056694,
-0.013287484645843506,
-0.11693703383207321,
-0.02217494510114193,
0.13854853808879852,
-0.17615969479084015,
-0.03279801458120346
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
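Until this section is filled in, a hedged sketch of running the checkpoint with the lower-level `transformers` API is shown below; the model id comes from this repository and the inputs are invented placeholders.
```
# Hypothetical inference sketch (not from the original card).
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "anasaqsme/distilbert-base-uncased-finetuned-squad"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "What dataset was the model fine-tuned on?"      # placeholder
context = "The model was fine-tuned on the SQuAD dataset."  # placeholder

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the highest-scoring start/end token positions and decode the span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```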
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "distilbert-base-uncased-finetuned-squad", "results": []}]}
|
question-answering
|
anasaqsme/distilbert-base-uncased-finetuned-squad
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of distilbert-base-uncased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
[
"# distilbert-base-uncased-finetuned-squad\n\nThis model is a fine-tuned version of distilbert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3",
"### Framework versions\n\n- Transformers 4.17.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.4\n- Tokenizers 0.11.6"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-finetuned-squad\n\nThis model is a fine-tuned version of distilbert-base-uncased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3",
"### Framework versions\n\n- Transformers 4.17.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.4\n- Tokenizers 0.11.6"
] |
[
56,
43,
6,
12,
8,
3,
90,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# distilbert-base-uncased-finetuned-squad\n\nThis model is a fine-tuned version of distilbert-base-uncased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3### Framework versions\n\n- Transformers 4.17.0\n- Pytorch 1.10.0+cu111\n- Datasets 1.18.4\n- Tokenizers 0.11.6"
] |
[
-0.07956094294786453,
0.07565832138061523,
-0.002049687784165144,
0.07811261713504791,
0.16935135424137115,
0.032394446432590485,
0.14314541220664978,
0.08737340569496155,
-0.10581380128860474,
0.04573038965463638,
0.07196767628192902,
0.09371573477983475,
0.01692243292927742,
0.08578015118837357,
-0.03747858107089996,
-0.24850711226463318,
0.011834061704576015,
0.03176911175251007,
-0.11484844982624054,
0.09492959827184677,
0.10360485315322876,
-0.11621373891830444,
0.06629014015197754,
0.02048727311193943,
-0.1928252875804901,
0.03707636520266533,
-0.02163739688694477,
-0.043038442730903625,
0.10310681909322739,
0.020365022122859955,
0.12509068846702576,
0.003539963159710169,
0.12612715363502502,
-0.19616658985614777,
0.00255411583930254,
0.08264464884996414,
0.03270382434129715,
0.07171010226011276,
0.015415409579873085,
0.026649869978427887,
0.09531762450933456,
-0.10755489766597748,
0.09261782467365265,
0.02356511913239956,
-0.07250165939331055,
-0.183389350771904,
-0.09016814827919006,
0.058151815086603165,
0.08317108452320099,
0.1023942157626152,
0.009326936677098274,
0.13870905339717865,
-0.08597627282142639,
0.07354284077882767,
0.20646236836910248,
-0.28165480494499207,
-0.08921345323324203,
0.0652402713894844,
0.04822944849729538,
0.05311136320233345,
-0.08691255003213882,
-0.022983526811003685,
0.05260440707206726,
0.05732424557209015,
0.10179895162582397,
-0.03488140553236008,
-0.09065038710832596,
-0.012079432606697083,
-0.14564856886863708,
0.00900491327047348,
0.18703657388687134,
0.031932130455970764,
-0.04600522667169571,
-0.05707505717873573,
-0.07784099131822586,
-0.05787324905395508,
-0.017953356727957726,
-0.07794030010700226,
0.0486963726580143,
-0.039259541779756546,
-0.07727315276861191,
-0.06626596301794052,
-0.08072756975889206,
-0.055241841822862625,
-0.025379156693816185,
0.13243871927261353,
0.046995777636766434,
0.024062978103756905,
-0.056967463344335556,
0.08816520124673843,
-0.02672540210187435,
-0.11504462361335754,
0.003176153404638171,
-0.013731158338487148,
-0.03960571065545082,
-0.0555729940533638,
-0.06852967292070389,
-0.020587457343935966,
0.008776555769145489,
0.2050991803407669,
-0.08029735088348389,
0.05857892334461212,
0.038009997457265854,
0.008673373609781265,
-0.02490781992673874,
0.13470374047756195,
-0.0533900149166584,
-0.04024988040328026,
-0.00024869447224773467,
0.07091563940048218,
0.013015702366828918,
-0.004331254865974188,
-0.09481498599052429,
-0.0027989959344267845,
0.06995806843042374,
0.019194968044757843,
-0.05295779928565025,
0.04038017988204956,
-0.009062616154551506,
-0.04876897856593132,
-0.0025663478299975395,
-0.10936777293682098,
0.03530150651931763,
-0.02508505992591381,
-0.07246560603380203,
0.03623360022902489,
0.015341617166996002,
0.015200859867036343,
-0.01781221106648445,
0.10776376724243164,
-0.10081794112920761,
0.012860726565122604,
-0.10818419605493546,
-0.08726257085800171,
0.003969072364270687,
-0.09243640303611755,
0.009645971469581127,
-0.08149947226047516,
-0.19083747267723083,
-0.027719199657440186,
0.0700116902589798,
-0.03381514921784401,
-0.029172493144869804,
-0.02308962121605873,
-0.0654992014169693,
0.00816608127206564,
-0.014125747606158257,
0.09512817859649658,
-0.04049016162753105,
0.061216868460178375,
0.030688609927892685,
0.034434251487255096,
-0.04609283432364464,
0.04209193214774132,
-0.09857737272977829,
0.022002577781677246,
-0.15828649699687958,
0.04698118194937706,
-0.08286799490451813,
0.036495476961135864,
-0.08852030336856842,
-0.1173890084028244,
-0.010027342475950718,
-0.0031807466875761747,
0.06799418479204178,
0.0803903192281723,
-0.1786995828151703,
-0.04073544591665268,
0.12245186418294907,
-0.08084967732429504,
-0.09100216627120972,
0.0939997062087059,
-0.051568109542131424,
0.05757220461964607,
0.05529472976922989,
0.12807010114192963,
0.09085242450237274,
-0.12390775978565216,
-0.03537836670875549,
0.0005600904696621001,
0.06196843832731247,
-0.0003048342186957598,
0.04250892251729965,
-0.0002162072923965752,
0.042402446269989014,
0.015209812670946121,
-0.07327049225568771,
-0.013706491328775883,
-0.09193601459264755,
-0.08042968809604645,
-0.0611594021320343,
-0.08665037900209427,
0.03925296664237976,
0.04107310622930527,
0.0548122301697731,
-0.0709192082285881,
-0.09459205716848373,
0.17812125384807587,
0.11019866913557053,
-0.06684749573469162,
0.021311931312084198,
-0.07372191548347473,
0.04062234237790108,
-0.025141308084130287,
-0.021087484434247017,
-0.21089260280132294,
-0.11125530302524567,
0.017408806830644608,
-0.02067943848669529,
0.05138205736875534,
0.050521861761808395,
0.05513014271855354,
0.06105688959360123,
-0.04850892350077629,
0.003960004076361656,
-0.09633664041757584,
-0.0037187044508755207,
-0.09968142956495285,
-0.1763273924589157,
-0.06528225541114807,
-0.022475039586424828,
0.14061109721660614,
-0.20516017079353333,
0.032996486872434616,
-0.04715000092983246,
0.13158242404460907,
0.006387566216289997,
-0.030132390558719635,
-0.06176244094967842,
0.07032568007707596,
-0.025614749640226364,
-0.08241084218025208,
0.051627565175294876,
0.01469460129737854,
-0.05588926002383232,
-0.12863898277282715,
-0.12593398988246918,
0.07171987742185593,
0.10278143733739853,
-0.03272605314850807,
-0.06837180256843567,
-0.0010575182968750596,
-0.06612422317266464,
-0.040373794734478,
-0.06392206251621246,
-0.0014975892845541239,
0.16398616135120392,
-0.01558323297649622,
0.12044289708137512,
-0.0643109679222107,
-0.047054801136255264,
0.0022425863426178694,
-0.01814493164420128,
0.0001849926629802212,
0.03976568207144737,
0.12213492393493652,
-0.08628235012292862,
0.10842252522706985,
0.13025417923927307,
-0.09236040711402893,
0.13748422265052795,
-0.03588304668664932,
-0.06718143820762634,
-0.023913031443953514,
-0.025437375530600548,
-0.023760678246617317,
0.1134243980050087,
-0.1246948093175888,
0.007847211323678493,
0.024923069402575493,
0.02389387972652912,
0.05283832177519798,
-0.180781751871109,
0.004530312959104776,
0.016374532133340836,
-0.0171793382614851,
-0.015057643875479698,
-0.019912956282496452,
0.021440444514155388,
0.07809212803840637,
0.018417872488498688,
-0.025115948170423508,
0.032843027263879776,
0.010806383565068245,
-0.06988078355789185,
0.19263841211795807,
-0.125325009226799,
-0.1292896270751953,
-0.11205054819583893,
0.004096580669283867,
-0.08006005734205246,
-0.014626994729042053,
0.03789946809411049,
-0.08676093071699142,
-0.05512140691280365,
-0.052584875375032425,
0.0172105859965086,
-0.021916091442108154,
-0.005804636050015688,
0.08146274089813232,
0.0025111616123467684,
0.0842180848121643,
-0.140372171998024,
0.0015804078429937363,
-0.03458132967352867,
-0.10445541888475418,
-0.001092261984013021,
0.059484176337718964,
0.1014636978507042,
0.12418471276760101,
-0.01785559579730034,
0.010651341639459133,
-0.017762988805770874,
0.26838064193725586,
-0.056626174598932266,
-0.010379955172538757,
0.15153396129608154,
0.017648138105869293,
0.047958746552467346,
0.09735226631164551,
0.06118107587099075,
-0.1050882637500763,
0.02851833589375019,
0.09251904487609863,
-0.030694054439663887,
-0.2220306545495987,
-0.04388050362467766,
-0.033719275146722794,
-0.08144818246364594,
0.09040733426809311,
0.03176137059926987,
0.04078537970781326,
0.07353884726762772,
-0.0023824211675673723,
0.08951966464519501,
-0.035644304007291794,
0.08549191802740097,
0.1441689133644104,
0.04063209146261215,
0.12157517671585083,
-0.02637030929327011,
-0.044875338673591614,
0.05683290213346481,
-0.004751648288220167,
0.27990686893463135,
0.011259329505264759,
0.054844919592142105,
0.07187086343765259,
0.14140279591083527,
-0.031055008992552757,
0.06501343101263046,
-0.004665217828005552,
-0.025920655578374863,
-0.000953818904235959,
-0.05122408643364906,
-0.021157333627343178,
0.013953625224530697,
-0.05344333127140999,
0.05534355342388153,
-0.07957776635885239,
0.061012450605630875,
0.045346345752477646,
0.27630284428596497,
0.008856742642819881,
-0.2858753502368927,
-0.09682097285985947,
0.00196057022549212,
-0.02091345191001892,
-0.04919067770242691,
0.023992912843823433,
0.10227543860673904,
-0.10598651319742203,
0.051842231303453445,
-0.06871019303798676,
0.09332368522882462,
0.006216808222234249,
0.028113115578889847,
0.0962923988699913,
0.1508220136165619,
0.015777571126818657,
0.07082033902406693,
-0.23744715750217438,
0.1893463283777237,
0.019983744248747826,
0.12643694877624512,
-0.05632011964917183,
0.030368460342288017,
0.014925088733434677,
0.08808916807174683,
0.0626353770494461,
-0.003512768307700753,
-0.01995234377682209,
-0.13961376249790192,
-0.0290103517472744,
0.04295087605714798,
0.12623140215873718,
-0.016439009457826614,
0.10050556063652039,
-0.04740498214960098,
0.022082580253481865,
0.060869526118040085,
-0.03576861321926117,
-0.17706066370010376,
-0.1319575011730194,
0.011151047423481941,
0.01988043636083603,
-0.06920737028121948,
-0.07285120338201523,
-0.10131803154945374,
-0.057112812995910645,
0.17253421247005463,
-0.013488934375345707,
-0.03936292231082916,
-0.12531721591949463,
0.09253479540348053,
0.11991818994283676,
-0.05555281043052673,
0.03494876250624657,
0.016099072992801666,
0.09790413081645966,
0.028576165437698364,
-0.10939028859138489,
0.05407090485095978,
-0.0917186513543129,
-0.1504073292016983,
-0.03732040151953697,
0.09254332631826401,
0.05450546741485596,
0.03258951008319855,
-0.0006003400776535273,
0.022444287315011024,
0.0022740012500435114,
-0.10263857990503311,
-0.014631528407335281,
0.04292833060026169,
0.08338098973035812,
0.0457388311624527,
-0.0934009850025177,
0.031169624999165535,
-0.03338741883635521,
0.008412003517150879,
0.11330509185791016,
0.18069277703762054,
-0.08598203957080841,
0.013531491160392761,
0.07935323566198349,
-0.08311626315116882,
-0.16984044015407562,
0.0706067755818367,
0.10029692947864532,
-0.010671177878975868,
0.044510725885629654,
-0.21489326655864716,
0.17386938631534576,
0.14657725393772125,
-0.011038161814212799,
0.0733911544084549,
-0.31237339973449707,
-0.13108336925506592,
0.08585958182811737,
0.10827143490314484,
0.06835848838090897,
-0.15050986409187317,
-0.0250973142683506,
-0.04676723852753639,
-0.17928387224674225,
0.1447104960680008,
-0.14802095293998718,
0.1027131900191307,
-0.0033694917801767588,
0.09045290946960449,
0.0026545843575149775,
-0.040860746055841446,
0.1343722641468048,
0.050634562969207764,
0.10248341411352158,
-0.05046067759394646,
0.005110458470880985,
0.11099252849817276,
-0.04578722268342972,
0.03987046703696251,
-0.010205895639955997,
0.05695676803588867,
-0.08621431142091751,
-0.024380063638091087,
-0.07192382961511612,
0.05964980646967888,
-0.05733572691679001,
-0.07235337048768997,
-0.05429820716381073,
0.03937402740120888,
0.03640658035874367,
-0.029927335679531097,
0.10343281924724579,
0.038253333419561386,
0.12935477495193481,
0.08079414814710617,
0.09448327869176865,
-0.0768168494105339,
-0.10014388710260391,
-0.004446136765182018,
-0.012185723520815372,
0.07673649489879608,
-0.09716399013996124,
0.028523370623588562,
0.13573622703552246,
0.04558201879262924,
0.1306448131799698,
0.06801604479551315,
-0.033025987446308136,
0.0087288161739707,
0.043885402381420135,
-0.12629105150699615,
-0.1718844473361969,
0.0036807137075811625,
-0.057848114520311356,
-0.12420225143432617,
0.0775413066148758,
0.11840477585792542,
-0.04913102462887764,
-0.007574379909783602,
-0.00945531390607357,
-0.002239964436739683,
-0.04397078976035118,
0.1895928829908371,
0.038165267556905746,
0.04801735281944275,
-0.09172486513853073,
0.11369717121124268,
0.054808542132377625,
-0.0656997412443161,
0.02756466157734394,
0.054838377982378006,
-0.08532784879207611,
-0.020621825009584427,
0.040085263550281525,
0.1358214020729065,
-0.09745071828365326,
-0.045896656811237335,
-0.11293164640665054,
-0.09584376960992813,
0.04270830750465393,
0.12625613808631897,
0.0739825889468193,
-0.02866196632385254,
-0.05850578844547272,
0.058105796575546265,
-0.14290963113307953,
0.06719226390123367,
0.026658710092306137,
0.08583974838256836,
-0.15814585983753204,
0.11064516007900238,
0.02134859934449196,
0.03598076105117798,
-0.022234158590435982,
0.0184892900288105,
-0.10009028762578964,
-0.021742697805166245,
-0.1514139175415039,
-0.04524412378668785,
-0.04345840960741043,
0.001135434489697218,
0.000477567664347589,
-0.039099518209695816,
-0.07258333265781403,
0.05053583160042763,
-0.07107238471508026,
-0.04360709711909294,
0.032651979476213455,
0.03268321603536606,
-0.1464887261390686,
0.01528663095086813,
0.016552863642573357,
-0.086356982588768,
0.06604548543691635,
0.07127850502729416,
0.024165110662579536,
0.04873039200901985,
-0.11129912734031677,
-0.03268137574195862,
0.03983763977885246,
0.04239555075764656,
0.08623156696557999,
-0.06398377567529678,
-0.021201424300670624,
-0.0021014453377574682,
0.08765926957130432,
0.011669673025608063,
0.06331561505794525,
-0.12254253774881363,
-0.008729908615350723,
-0.0554252490401268,
-0.05653060972690582,
-0.06401528418064117,
0.01959007978439331,
0.1216728612780571,
0.04850788414478302,
0.19846108555793762,
-0.07603351771831512,
0.023544836789369583,
-0.1929795891046524,
-0.0257957112044096,
0.0005485390429385006,
-0.04345777630805969,
-0.03399834781885147,
-0.03614586219191551,
0.05647612735629082,
-0.05775698274374008,
0.13618874549865723,
-0.0246012844145298,
0.09889882057905197,
0.03674127906560898,
-0.03120577521622181,
-0.03877990320324898,
-0.001859559677541256,
0.20035450160503387,
0.06005493178963661,
-0.012615453451871872,
0.048834383487701416,
0.02711118571460247,
0.07388827204704285,
0.06861784309148788,
0.21522939205169678,
0.1422039121389389,
-0.058321405202150345,
0.0666990876197815,
0.07228773832321167,
-0.0849510058760643,
-0.1544213742017746,
0.08794435113668442,
-0.015965072438120842,
0.10606437176465988,
-0.04151974245905876,
0.1588960886001587,
0.10891617089509964,
-0.16167345643043518,
0.04678675904870033,
-0.06797464191913605,
-0.10411202162504196,
-0.12084983289241791,
-0.024175597354769707,
-0.07339373975992203,
-0.14868921041488647,
0.016840990632772446,
-0.14801865816116333,
0.0215312447398901,
0.10869129747152328,
0.009243017062544823,
-0.0030858267564326525,
0.1745208203792572,
-0.04687343165278435,
0.014630230143666267,
0.028684906661510468,
0.0010542634408921003,
-0.022382792085409164,
-0.06729723513126373,
-0.05933135747909546,
0.01745006814599037,
0.006238109897822142,
0.07611311972141266,
-0.055910930037498474,
-0.02514897845685482,
0.023064037784934044,
-0.013938791118562222,
-0.047298163175582886,
0.014912854880094528,
0.02972758747637272,
0.025458266958594322,
0.03827293962240219,
0.020247312262654305,
-0.008391499519348145,
-0.037684522569179535,
0.25159063935279846,
-0.0844125747680664,
-0.11556795984506607,
-0.14820189774036407,
0.1991211473941803,
0.042088426649570465,
-0.005539631005376577,
0.057607490569353104,
-0.0968957245349884,
-0.02472645975649357,
0.18389849364757538,
0.16675332188606262,
-0.07578101754188538,
-0.025335801765322685,
0.006913683842867613,
-0.015168150886893272,
-0.08131328225135803,
0.11903678625822067,
0.1325710415840149,
0.07487353682518005,
-0.04703247547149658,
-0.03996971994638443,
-0.029744386672973633,
-0.019302185624837875,
-0.09441645443439484,
0.04134739935398102,
0.033679451793432236,
0.006548155564814806,
-0.021575376391410828,
0.059905242174863815,
-0.0019254429498687387,
-0.15872061252593994,
0.05305268242955208,
-0.1342078000307083,
-0.16335709393024445,
-0.022660672664642334,
0.08225154876708984,
-0.04291442781686783,
0.06476952880620956,
-0.024377895519137383,
-0.025370262563228607,
0.14018692076206207,
-0.02212369069457054,
-0.05251980200409889,
-0.1016155257821083,
0.11418446898460388,
-0.07834528386592865,
0.20798781514167786,
-0.021218402311205864,
0.08121469616889954,
0.11962294578552246,
0.043854422867298126,
-0.09995348751544952,
0.0402316115796566,
0.05978715047240257,
-0.07924099266529083,
0.011632664129137993,
0.10196766257286072,
-0.042517680674791336,
0.08677350729703903,
0.04047604650259018,
-0.1636388748884201,
-0.00889796856790781,
0.015339503064751625,
-0.034546349197626114,
-0.07517961412668228,
-0.007840554229915142,
-0.09118738025426865,
0.13599300384521484,
0.20957212150096893,
-0.026845276355743408,
0.006591746583580971,
-0.07262121140956879,
0.0468372143805027,
0.06370067596435547,
0.09939389675855637,
-0.05498907342553139,
-0.23638056218624115,
0.03024020418524742,
0.011545768938958645,
-0.0111926244571805,
-0.2157316505908966,
-0.08785829693078995,
0.05472338944673538,
-0.05112982541322708,
-0.050989001989364624,
0.0818411335349083,
0.07571344077587128,
0.051135316491127014,
-0.0497271753847599,
-0.11693914979696274,
-0.09123611450195312,
0.15344414114952087,
-0.15844281017780304,
-0.06194758415222168
] |
null | null |
transformers
|
# XLM-RoBERTa large for QA on Vietnamese (also supports various other languages)
## Overview
- Language model: xlm-roberta-large
- Fine-tune: [deepset/xlm-roberta-large-squad2](https://huggingface.co/deepset/xlm-roberta-large-squad2)
- Language: Vietnamese
- Downstream-task: Extractive QA
- Dataset: [mailong25/bert-vietnamese-question-answering](https://github.com/mailong25/bert-vietnamese-question-answering/tree/master/dataset)
- Training data: train-v2.0.json (SQuAD 2.0 format)
- Eval data: dev-v2.0.json (SQuAD 2.0 format)
- Infrastructure: 1x Tesla P100 (Google Colab)
## Performance
Evaluated on dev-v2.0.json
```
exact: 136 / 141
f1: 0.9692671394799054
```
Evaluated on Vietnamese XQuAD: [xquad.vi.json](https://github.com/deepmind/xquad/blob/master/xquad.vi.json)
```
exact: 604 / 1190
f1: 0.7224454217571596
```
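## Usage

A minimal usage sketch, assuming the `transformers` question-answering pipeline; the question and context are adapted (lightly shortened) from the widget example bundled with this repository.
```
# Usage sketch; inputs adapted from the repository's widget example.
from transformers import pipeline

qa = pipeline("question-answering", model="ancs21/xlm-roberta-large-vi-qa")

question = "Toà nhà nào cao nhất Việt Nam?"  # "Which building is the tallest in Vietnam?"
context = (
    "Landmark 81 là một toà nhà chọc trời trong tổ hợp dự án Vinhomes Tân Cảng. "
    "Toà tháp cao 81 tầng, hiện tại là toà nhà cao nhất Việt Nam "
    "và là toà nhà cao nhất Đông Nam Á từ tháng 3 năm 2018."
)

print(qa(question=question, context=context))
# Illustrative output shape: {"score": ..., "start": ..., "end": ..., "answer": "Landmark 81"}
```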
## Author
An Pham (ancs21.ps [at] gmail.com)
## License
MIT
|
{"language": "vi", "license": "mit", "tags": ["vi", "xlm-roberta"], "metrics": ["f1", "em"], "widget": [{"text": "To\u00e0 nh\u00e0 n\u00e0o cao nh\u1ea5t Vi\u1ec7t Nam?", "context": "Landmark 81 l\u00e0 m\u1ed9t to\u00e0 nh\u00e0 ch\u1ecdc tr\u1eddi trong t\u1ed5 h\u1ee3p d\u1ef1 \u00e1n Vinhomes T\u00e2n C\u1ea3ng, m\u1ed9t d\u1ef1 \u00e1n c\u00f3 t\u1ed5ng m\u1ee9c \u0111\u1ea7u t\u01b0 40.000 t\u1ef7 \u0111\u1ed3ng, do C\u00f4ng ty C\u1ed5 ph\u1ea7n \u0110\u1ea7u t\u01b0 x\u00e2y d\u1ef1ng T\u00e2n Li\u00ean Ph\u00e1t thu\u1ed9c Vingroup l\u00e0m ch\u1ee7 \u0111\u1ea7u t\u01b0. To\u00e0 th\u00e1p cao 81 t\u1ea7ng, hi\u1ec7n t\u1ea1i l\u00e0 to\u00e0 nh\u00e0 cao nh\u1ea5t Vi\u1ec7t Nam v\u00e0 l\u00e0 to\u00e0 nh\u00e0 cao nh\u1ea5t \u0110\u00f4ng Nam \u00c1 t\u1eeb th\u00e1ng 3 n\u0103m 2018."}]}
|
question-answering
|
ancs21/xlm-roberta-large-vi-qa
|
[
"transformers",
"pytorch",
"xlm-roberta",
"question-answering",
"vi",
"license:mit",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"vi"
] |
TAGS
#transformers #pytorch #xlm-roberta #question-answering #vi #license-mit #endpoints_compatible #region-us
|
# XLM-RoBERTa large for QA on Vietnamese languages (also support various languages)
## Overview
- Language model: xlm-roberta-large
- Fine-tune: deepset/xlm-roberta-large-squad2
- Language: Vietnamese
- Downstream-task: Extractive QA
- Dataset: mailong25/bert-vietnamese-question-answering
- Training data: train-v2.0.json (SQuAD 2.0 format)
- Eval data: dev-v2.0.json (SQuAD 2.0 format)
- Infrastructure: 1x Tesla P100 (Google Colab)
## Performance
Evaluated on dev-v2.0.json
Evaluated on Vietnamese XQuAD: URL
## Author
An Pham (URL [at] URL)
## License
MIT
|
[
"# XLM-RoBERTa large for QA on Vietnamese languages (also support various languages)",
"## Overview\n\n- Language model: xlm-roberta-large\n- Fine-tune: deepset/xlm-roberta-large-squad2\n- Language: Vietnamese\n- Downstream-task: Extractive QA\n- Dataset: mailong25/bert-vietnamese-question-answering\n- Training data: train-v2.0.json (SQuAD 2.0 format)\n- Eval data: dev-v2.0.json (SQuAD 2.0 format)\n- Infrastructure: 1x Tesla P100 (Google Colab)",
"## Performance\n\nEvaluated on dev-v2.0.json\n\n\nEvaluated on Vietnamese XQuAD: URL",
"## Author\n\nAn Pham (URL [at] URL)",
"## License\n\nMIT"
] |
[
"TAGS\n#transformers #pytorch #xlm-roberta #question-answering #vi #license-mit #endpoints_compatible #region-us \n",
"# XLM-RoBERTa large for QA on Vietnamese languages (also support various languages)",
"## Overview\n\n- Language model: xlm-roberta-large\n- Fine-tune: deepset/xlm-roberta-large-squad2\n- Language: Vietnamese\n- Downstream-task: Extractive QA\n- Dataset: mailong25/bert-vietnamese-question-answering\n- Training data: train-v2.0.json (SQuAD 2.0 format)\n- Eval data: dev-v2.0.json (SQuAD 2.0 format)\n- Infrastructure: 1x Tesla P100 (Google Colab)",
"## Performance\n\nEvaluated on dev-v2.0.json\n\n\nEvaluated on Vietnamese XQuAD: URL",
"## Author\n\nAn Pham (URL [at] URL)",
"## License\n\nMIT"
] |
[
40,
24,
122,
22,
12,
3
] |
[
"passage: TAGS\n#transformers #pytorch #xlm-roberta #question-answering #vi #license-mit #endpoints_compatible #region-us \n# XLM-RoBERTa large for QA on Vietnamese languages (also support various languages)## Overview\n\n- Language model: xlm-roberta-large\n- Fine-tune: deepset/xlm-roberta-large-squad2\n- Language: Vietnamese\n- Downstream-task: Extractive QA\n- Dataset: mailong25/bert-vietnamese-question-answering\n- Training data: train-v2.0.json (SQuAD 2.0 format)\n- Eval data: dev-v2.0.json (SQuAD 2.0 format)\n- Infrastructure: 1x Tesla P100 (Google Colab)## Performance\n\nEvaluated on dev-v2.0.json\n\n\nEvaluated on Vietnamese XQuAD: URL## Author\n\nAn Pham (URL [at] URL)## License\n\nMIT"
] |
[
-0.08989952504634857,
0.18837641179561615,
-0.0033161186147481203,
0.05788253992795944,
0.08367085456848145,
-0.06694314628839493,
0.06643754243850708,
0.13882870972156525,
-0.0033959404099732637,
0.001243707723915577,
0.05349394306540489,
0.05788441002368927,
0.07322078198194504,
0.07397552579641342,
-0.012625010684132576,
-0.18490740656852722,
0.0016256539383903146,
0.042759668081998825,
-0.10824836790561676,
0.08018641173839569,
0.12192446738481522,
-0.07202886044979095,
0.1431141346693039,
0.030957205221056938,
0.04674481227993965,
0.04714294523000717,
-0.036274440586566925,
-0.11220809817314148,
0.08189578354358673,
0.009664790704846382,
-0.005697933956980705,
0.04177560284733772,
0.032478272914886475,
-0.15967527031898499,
0.01812690682709217,
0.013802370056509972,
-0.01768593117594719,
0.03529695048928261,
0.08137623965740204,
0.030053889378905296,
0.1389094740152359,
-0.05506192892789841,
-0.018573641777038574,
0.06126575171947479,
-0.000345122127328068,
-0.23884093761444092,
-0.11064646393060684,
0.03764956071972847,
0.0928582176566124,
0.10366351157426834,
-0.005446399096399546,
0.1652139127254486,
-0.22175171971321106,
0.05281250551342964,
0.0032591870985925198,
-0.2747602164745331,
-0.031616464257240295,
0.04659327119588852,
0.11001121252775192,
0.07610854506492615,
-0.09273792803287506,
-0.009280053898692131,
0.08816556632518768,
0.00952790305018425,
-0.06265920400619507,
-0.10736542195081711,
-0.057834524661302567,
0.0783037543296814,
-0.07240313291549683,
0.051218003034591675,
0.23527838289737701,
0.05775832012295723,
-0.04601601883769035,
-0.047611020505428314,
-0.015899451449513435,
-0.019222578033804893,
-0.009982314892113209,
-0.07502125948667526,
0.07024174928665161,
0.006352215074002743,
0.07164214551448822,
-0.04466498643159866,
-0.09090230613946915,
-0.04966457933187485,
-0.057620517909526825,
-0.16648489236831665,
0.06356777995824814,
0.028250737115740776,
-0.09883541613817215,
0.024335287511348724,
-0.022141562774777412,
-0.09030278027057648,
-0.06197596713900566,
-0.10249348729848862,
0.01461565401405096,
0.04587479680776596,
0.10618904232978821,
-0.01439733523875475,
0.11775098741054535,
0.031008757650852203,
-0.05884026736021042,
0.02594750002026558,
-0.08263419568538666,
0.010984373278915882,
-0.03149374574422836,
0.24185222387313843,
-0.12725530564785004,
-0.06232375651597977,
0.04729319363832474,
0.0366998128592968,
-0.03936947509646416,
0.06655726581811905,
-0.09373336285352707,
-0.02772510051727295,
-0.03993004932999611,
0.045064084231853485,
-0.041001204401254654,
0.012769599445164204,
-0.021428896114230156,
-0.05072353407740593,
0.09897544980049133,
-0.0537630096077919,
0.0013670151820406318,
0.019981948658823967,
-0.06808524578809738,
0.15056787431240082,
0.025718819350004196,
0.07974586635828018,
-0.07601438462734222,
-0.037686605006456375,
-0.006113619077950716,
-0.004507340490818024,
-0.04382677748799324,
-0.09513945877552032,
0.027524756267666817,
0.003732029814273119,
-0.00781344249844551,
-0.07685142755508423,
-0.13545800745487213,
-0.006973064970225096,
0.030733222141861916,
-0.0514640286564827,
-0.04870139807462692,
-0.03442344069480896,
-0.10198348015546799,
-0.03984291851520538,
0.019484810531139374,
0.058429744094610214,
-0.07326915115118027,
0.06325526535511017,
-0.027172695845365524,
0.04947548359632492,
-0.060368552803993225,
0.07783093303442001,
-0.06609386205673218,
-0.00239856936968863,
-0.05481669306755066,
0.1030493751168251,
-0.13423557579517365,
0.036358434706926346,
-0.10088933259248734,
-0.10944012552499771,
0.07176847755908966,
-0.004399701952934265,
-0.02235882356762886,
0.07416711002588272,
-0.16228312253952026,
-0.026506902649998665,
0.1419607698917389,
-0.04666982218623161,
-0.13830873370170593,
0.154027059674263,
0.0005969994817860425,
0.0334632433950901,
0.04591986536979675,
0.11698424816131592,
0.17078736424446106,
-0.18576572835445404,
-0.019942566752433777,
0.0795358195900917,
-0.015129524283111095,
-0.11538544297218323,
0.15615519881248474,
0.03896620124578476,
-0.014986122027039528,
0.029715271666646004,
-0.0847734659910202,
0.018246062099933624,
-0.018703773617744446,
-0.10047366470098495,
0.0015137853333726525,
-0.06985209882259369,
0.04846334084868431,
0.02866223081946373,
0.08360429853200912,
-0.029024245217442513,
-0.06330366432666779,
-0.06668959558010101,
0.09586529433727264,
0.03274042159318924,
-0.05245807766914368,
-0.11866989731788635,
0.06879882514476776,
-0.01241571456193924,
0.013628511689603329,
-0.039880815893411636,
0.10079292207956314,
0.08379668742418289,
0.023863328620791435,
0.007663180585950613,
0.142259418964386,
0.004872986581176519,
0.045531198382377625,
-0.0076417624950408936,
-0.01654353365302086,
-0.054993998259305954,
-0.02457614615559578,
-0.03373267874121666,
-0.08035971224308014,
0.11676400154829025,
-0.04226098582148552,
-0.0012575160944834352,
-0.1398858278989792,
-0.050377171486616135,
0.08039085566997528,
0.025063948705792427,
0.050649989396333694,
0.12103249132633209,
0.02377692423760891,
0.05000120401382446,
-0.022196408361196518,
0.018759436905384064,
0.027372634038329124,
-0.030765146017074585,
-0.011624141596257687,
0.1528683751821518,
0.09639015793800354,
0.14254271984100342,
0.10625883936882019,
0.07103487849235535,
0.03594948723912239,
-0.00777056161314249,
-0.050530172884464264,
-0.023501290008425713,
-0.026961686089634895,
0.05919468030333519,
0.11265600472688675,
0.005408333148807287,
0.1683516502380371,
-0.127651184797287,
0.011770816519856453,
-0.012694409117102623,
0.0005442594992928207,
0.035606272518634796,
0.2096916288137436,
0.14368413388729095,
-0.049622584134340286,
0.09372305124998093,
0.11056952178478241,
0.0038257786072790623,
0.158776193857193,
-0.05852016061544418,
-0.05769490450620651,
-0.016713010147213936,
0.10393594205379486,
-0.026597099378705025,
0.09425348043441772,
-0.10115532577037811,
0.0570923276245594,
0.06776846945285797,
0.048025161027908325,
0.0013139358488842845,
-0.14357160031795502,
-0.04936951398849487,
-0.0740562379360199,
-0.09511493146419525,
-0.10145691782236099,
0.10453126579523087,
0.07751672714948654,
0.07220294326543808,
-0.009814958088099957,
0.008559753187000751,
-0.01999017409980297,
-0.0253727026283741,
-0.06966015696525574,
0.1630924940109253,
-0.11131434142589569,
-0.23896940052509308,
-0.06834203004837036,
-0.005768221337348223,
-0.05762326717376709,
-0.05858824774622917,
0.05759255960583687,
-0.21713431179523468,
-0.08088408410549164,
-0.026331786066293716,
-0.0042889355681836605,
0.014585085213184357,
-0.07509122788906097,
-0.045745376497507095,
0.11935865134000778,
-0.02379142865538597,
-0.12720468640327454,
-0.004559606313705444,
-0.04683571308851242,
-0.08159055560827255,
0.033292386680841446,
-0.048544708639383316,
0.005164504051208496,
0.03947998583316803,
-0.025217849761247635,
0.02586524933576584,
-0.044191308319568634,
0.1496819108724594,
-0.07924660295248032,
0.038149621337652206,
0.20644427835941315,
0.14349442720413208,
0.050353631377220154,
0.15862227976322174,
-0.014904500916600227,
-0.037080053240060806,
0.06453148275613785,
0.03949231654405594,
0.010119205340743065,
-0.31419751048088074,
-0.058860328048467636,
-0.08845388144254684,
0.11141342669725418,
-0.04676742106676102,
0.052769020199775696,
-0.05540032312273979,
0.046426039189100266,
0.018863456323742867,
0.1377415657043457,
-0.0361199788749218,
0.0007375759887509048,
0.13710446655750275,
-0.004031449556350708,
0.05462133139371872,
-0.11218968033790588,
-0.012752746231853962,
0.14658762514591217,
0.15358908474445343,
0.17361871898174286,
-0.05635380744934082,
0.0538724847137928,
0.11675689369440079,
0.27805429697036743,
0.030682628974318504,
-0.034973688423633575,
-0.061154529452323914,
0.005372239276766777,
-0.029019158333539963,
-0.07769285142421722,
0.05007413029670715,
0.09374520927667618,
0.052026454359292984,
0.011114112101495266,
-0.0100737065076828,
0.1809108853340149,
0.03653119131922722,
0.2220364809036255,
0.010631809942424297,
-0.09368637949228287,
0.002025234280154109,
0.04416979104280472,
0.009992759674787521,
0.012255133129656315,
0.1236797571182251,
0.1411677598953247,
-0.16785748302936554,
0.03733561560511589,
-0.013653972186148167,
0.10941758006811142,
0.011235885322093964,
0.018615106120705605,
0.05738134682178497,
-0.011051228269934654,
0.07583259791135788,
0.1120910793542862,
-0.26574987173080444,
0.20376327633857727,
-0.008084370754659176,
0.034845080226659775,
-0.05460617318749428,
0.04225670546293259,
0.04303816333413124,
0.04861178249120712,
0.14389851689338684,
-0.015107020735740662,
0.08806160092353821,
-0.04222415015101433,
-0.13282105326652527,
0.07853724807500839,
-0.048770155757665634,
0.07580786943435669,
-0.023863552138209343,
-0.02188211865723133,
-0.03801846504211426,
-0.05851500481367111,
0.004138389602303505,
-0.13565309345722198,
-0.05696609988808632,
0.00919271633028984,
-0.018310701474547386,
-0.009265066124498844,
-0.06741879135370255,
-0.06950302422046661,
-0.07293286919593811,
0.1183803454041481,
-0.11580198258161545,
-0.05884522199630737,
-0.03596912696957588,
-0.04024864360690117,
0.061398234218358994,
-0.09280712902545929,
-0.02314218133687973,
-0.057190362364053726,
-0.00565917557105422,
0.030780527740716934,
-0.030515149235725403,
0.06505205482244492,
-0.06150967255234718,
-0.09193389117717743,
-0.017669174820184708,
0.11501261591911316,
-0.06542326509952545,
0.029337218031287193,
0.024180712178349495,
-0.06425698101520538,
-0.08022168278694153,
-0.16421663761138916,
-0.07371010631322861,
0.06399034708738327,
0.06740177422761917,
0.05854979157447815,
-0.12513577938079834,
-0.16718429327011108,
-0.02917524427175522,
-0.08805610984563828,
0.13758867979049683,
0.12330134212970734,
-0.08738413453102112,
0.17854996025562286,
0.09706389158964157,
-0.04632025212049484,
-0.21859616041183472,
-0.00457467045634985,
0.05420023947954178,
0.04557260870933533,
0.007882347330451012,
-0.10383851826190948,
0.09579748660326004,
0.03801603615283966,
-0.01764874905347824,
-0.01210697926580906,
-0.2232857644557953,
-0.1284869909286499,
0.032151609659194946,
0.02638668566942215,
0.015985088422894478,
-0.09237994253635406,
-0.05187632143497467,
-0.029985889792442322,
-0.2636169195175171,
0.0836576521396637,
-0.11356183886528015,
0.044859956949949265,
-0.032734520733356476,
0.05524514243006706,
-0.02509741671383381,
-0.02377302199602127,
0.10435255616903305,
-0.0397004634141922,
0.01340692862868309,
-0.03889402002096176,
-0.06461750715970993,
0.05474954843521118,
0.0300492811948061,
0.1524328589439392,
-0.04938359186053276,
0.08611158281564713,
-0.1912832409143448,
-0.020289618521928787,
-0.07247033715248108,
-0.026516882702708244,
-0.029245629906654358,
-0.056185923516750336,
-0.11658737063407898,
0.11397571116685867,
0.010976923629641533,
-0.012797350063920021,
0.04684058576822281,
0.04211754351854324,
-0.05345800146460533,
0.07945816218852997,
0.16833694279193878,
-0.0594140850007534,
0.02177940122783184,
-0.05818069353699684,
0.006181662902235985,
0.10305541753768921,
-0.21333028376102448,
0.06107807904481888,
0.1561795473098755,
-0.036951497197151184,
0.12306792289018631,
-0.027294505387544632,
-0.022774355486035347,
0.10135980695486069,
0.0461563766002655,
-0.013478806242346764,
-0.2327168881893158,
-0.05390944331884384,
0.03331688046455383,
0.014881103299558163,
0.013733678497374058,
0.06870764493942261,
-0.08027055114507675,
-0.04312999173998833,
0.016261661425232887,
0.041993141174316406,
-0.018232764676213264,
0.05009021982550621,
0.02807547152042389,
0.048907727003097534,
-0.08854620903730392,
0.08831498771905899,
0.10355177521705627,
-0.08425500988960266,
-0.008850856684148312,
0.13914526998996735,
-0.08842851221561432,
-0.058991916477680206,
0.05360996723175049,
0.18128447234630585,
0.03521295264363289,
-0.06140977144241333,
-0.08058027923107147,
-0.1101548969745636,
0.06049643084406853,
0.009353341534733772,
0.03255908191204071,
-0.008870216086506844,
-0.022068973630666733,
-0.10531137883663177,
-0.013901536352932453,
0.1438671201467514,
0.04516914114356041,
-0.0616832971572876,
-0.11357752233743668,
-0.1205044835805893,
0.019668733701109886,
0.19689197838306427,
-0.011190976947546005,
-0.026338784024119377,
-0.09640046954154968,
0.011541023850440979,
-0.34082600474357605,
0.10078504681587219,
-0.014987041242420673,
0.032686393707990646,
-0.0689355731010437,
-0.12225639820098877,
-0.09052541106939316,
0.05072176456451416,
-0.10230009257793427,
0.022291235625743866,
-0.008417535573244095,
0.15413302183151245,
-0.11496417224407196,
-0.029124148190021515,
0.0703505277633667,
0.034299466758966446,
0.0909038707613945,
-0.06432048231363297,
-0.07140094041824341,
0.035718586295843124,
-0.05737604573369026,
-0.05043278634548187,
0.028157323598861694,
0.09039624780416489,
0.09037069231271744,
-0.06056061014533043,
0.043138571083545685,
0.06335709244012833,
0.039443690329790115,
-0.0011119104456156492,
0.04552663490176201,
-0.09734423458576202,
-0.07372315227985382,
-0.12185411900281906,
-0.07441020011901855,
-0.059765662997961044,
0.07882926613092422,
0.04143669456243515,
0.048349373042583466,
0.1068854033946991,
-0.0908099114894867,
0.03900356963276863,
-0.12022164463996887,
-0.04066095128655434,
-0.01913243718445301,
-0.034062471240758896,
-0.1542777717113495,
-0.024661829695105553,
0.05041442811489105,
-0.013134058564901352,
0.14636771380901337,
0.029356781393289566,
0.005007976200431585,
0.025586670264601707,
-0.05532437562942505,
-0.012397192418575287,
0.012818253599107265,
0.11121949553489685,
0.02136591449379921,
0.06810908019542694,
-0.019501332193613052,
-0.0186447873711586,
-0.004989929497241974,
0.09583836048841476,
0.06098426133394241,
0.2300427407026291,
0.16888615489006042,
0.0367550365626812,
0.05958066135644913,
0.021027106791734695,
-0.0954073965549469,
0.0884549468755722,
-0.07028986513614655,
0.07006479054689407,
-0.11868095397949219,
0.059634286910295486,
0.11260222643613815,
-0.13592444360256195,
0.05935077369213104,
-0.07332846522331238,
-0.068851999938488,
-0.11583025753498077,
-0.08765838295221329,
-0.10405442863702774,
-0.18337960541248322,
0.030511705204844475,
-0.09814972430467606,
0.019295966252684593,
-0.057034049183130264,
0.1435755342245102,
-0.09990499913692474,
0.007575918920338154,
-0.07904504984617233,
-0.06642672419548035,
0.07795356959104538,
-0.0067808483727276325,
0.05166597664356232,
0.0030315108597278595,
0.07759389281272888,
0.02486041560769081,
-0.01962817646563053,
0.0044425856322050095,
0.04457749426364899,
-0.09260396659374237,
0.0022690314799547195,
-0.0712369978427887,
0.016403555870056152,
-0.01642906665802002,
0.019840072840452194,
0.027805881574749947,
0.1632116138935089,
0.07674074918031693,
-0.017524277791380882,
0.053905896842479706,
0.17400506138801575,
-0.014318355359137058,
-0.10669649392366409,
-0.18005573749542236,
0.01870470866560936,
-0.026682056486606598,
0.019064538180828094,
0.035027459263801575,
-0.011947551742196083,
-0.0820077583193779,
0.23953156173229218,
0.16233204305171967,
-0.09478738158941269,
-0.05108021944761276,
-0.010500391013920307,
0.0019311901414766908,
-0.07182158529758453,
0.05668587610125542,
0.17180249094963074,
0.2092389315366745,
-0.06419691443443298,
-0.07845177501440048,
-0.041424814611673355,
-0.017255930230021477,
-0.09845726937055588,
0.038782358169555664,
0.010528870858252048,
-0.05172333866357803,
-0.014539309777319431,
0.08247487992048264,
-0.05857391655445099,
0.033392298966646194,
-0.04334262013435364,
-0.14181411266326904,
-0.17323046922683716,
-0.018788378685712814,
0.0997379943728447,
0.09168713539838791,
-0.06511643528938293,
-0.008741877973079681,
0.014858894981443882,
0.12172739952802658,
-0.030296554788947105,
-0.06233461946249008,
-0.02137676253914833,
0.1125153973698616,
-0.0191962793469429,
0.12178895622491837,
0.018748773261904716,
0.03544878587126732,
0.07815568149089813,
0.022031012922525406,
-0.06705844402313232,
0.06635315716266632,
0.057200100272893906,
-0.09947652369737625,
-0.001610613544471562,
-0.040359292179346085,
-0.012913435697555542,
0.10988850891590118,
0.051071640104055405,
0.08211375027894974,
0.019979508593678474,
-0.05916845053434372,
-0.04800677299499512,
-0.14431804418563843,
0.1406610757112503,
-0.12620218098163605,
0.07810881733894348,
0.14227797091007233,
-0.04758385568857193,
0.029943937435746193,
-0.07749053835868835,
0.09824013710021973,
-0.06938262283802032,
-0.09042497724294662,
0.0026509270537644625,
-0.0810990035533905,
0.04209659993648529,
0.03548668697476387,
0.08754590898752213,
-0.1607319861650467,
-0.01878754235804081,
-0.0666535496711731,
0.02073119394481182,
-0.07855545729398727,
0.1084858626127243,
0.05252464488148689,
0.03052695095539093,
-0.008666591718792915,
-0.26829805970191956,
-0.047321707010269165,
0.04713156074285507,
-0.05573643743991852,
-0.07146703451871872
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-cased-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0620
- Precision: 0.9406
- Recall: 0.9463
- F1: 0.9434
- Accuracy: 0.9861
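A minimal inference sketch (not part of the original card): it loads this checkpoint through the `pipeline` helper, assuming the repository id `andi611/bert-base-cased-ner-conll2003` listed in this record; the example sentence and the `aggregation_strategy` choice are illustrative assumptions.
```python
from transformers import pipeline

# Hypothetical usage sketch; the repository id is taken from this record,
# the sentence and aggregation_strategy are illustrative assumptions.
ner = pipeline(
    "token-classification",
    model="andi611/bert-base-cased-ner-conll2003",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

print(ner("Hugging Face is opening an office in New York City."))
```
The output is a list of dictionaries giving the predicted entity group, confidence score, surface form, and character offsets.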
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
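The preprocessing used for the original run is not documented here; as a rough orientation, the conll2003 NER label set referenced above can be inspected with the `datasets` library (a sketch, not the original training code):
```python
from datasets import load_dataset

conll = load_dataset("conll2003")
label_names = conll["train"].features["ner_tags"].feature.names
print(label_names)  # nine BIO tags covering PER, ORG, LOC and MISC entities
```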
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 5
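A hedged sketch of how the hyperparameters above map onto `TrainingArguments`; the output directory name is an assumption, and the dataset preprocessing and `Trainer` wiring are omitted because they are not documented in this card. The Adam betas and epsilon listed above match the Trainer defaults, so they are not set explicitly here.
```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
)

# conll2003 uses nine BIO tags (O plus B-/I- for PER, ORG, LOC, MISC).
model = AutoModelForTokenClassification.from_pretrained("bert-base-cased", num_labels=9)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

args = TrainingArguments(
    output_dir="bert-base-cased-ner",  # assumed name, mirrors the card title
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
)
# A Trainer would then be built with these args and the tokenized conll2003 splits.
```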
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.5855 | 1.0 | 878 | 0.0848 | 0.8965 | 0.8980 | 0.8973 | 0.9760 |
| 0.058 | 2.0 | 1756 | 0.0607 | 0.9357 | 0.9379 | 0.9368 | 0.9840 |
| 0.0282 | 3.0 | 2634 | 0.0604 | 0.9354 | 0.9420 | 0.9387 | 0.9852 |
| 0.0148 | 4.0 | 3512 | 0.0606 | 0.9386 | 0.9485 | 0.9435 | 0.9861 |
| 0.0101 | 5.0 | 4390 | 0.0620 | 0.9406 | 0.9463 | 0.9434 | 0.9861 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "bert-base-cased-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.9860628716077}}]}]}
|
token-classification
|
andi611/bert-base-cased-ner-conll2003
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-cased-ner
===================
This model is a fine-tuned version of bert-base-cased on the conll2003 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0620
* Precision: 0.9406
* Recall: 0.9463
* F1: 0.9434
* Accuracy: 0.9861
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
63,
116,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.10684671252965927,
0.12163993716239929,
-0.0030159379821270704,
0.12252704054117203,
0.14122141897678375,
0.020981865003705025,
0.11480814218521118,
0.14673390984535217,
-0.09154226630926132,
0.0340488962829113,
0.12447600066661835,
0.16301530599594116,
0.019621621817350388,
0.12087827175855637,
-0.051374003291130066,
-0.2789058983325958,
-0.005799348931759596,
0.03091929294168949,
-0.1041601151227951,
0.1239766851067543,
0.08474834263324738,
-0.12680046260356903,
0.09443235397338867,
-0.00009708118159323931,
-0.1549399346113205,
0.0136216776445508,
0.010816016234457493,
-0.057701025158166885,
0.1401607245206833,
0.02820136956870556,
0.1042487844824791,
0.015545900911092758,
0.1061839759349823,
-0.1849210560321808,
0.005024987738579512,
0.061859775334596634,
0.01124685350805521,
0.10177682340145111,
0.07014184445142746,
0.009123546071350574,
0.1138138696551323,
-0.09201453626155853,
0.05997243523597717,
0.019079895690083504,
-0.11402378231287003,
-0.22473160922527313,
-0.09126577526330948,
0.03350788354873657,
0.08037477731704712,
0.08518347144126892,
0.0023241122253239155,
0.12475313246250153,
-0.0660930797457695,
0.1076764464378357,
0.24987587332725525,
-0.3052060306072235,
-0.06587057560682297,
0.04803990200161934,
0.029389280825853348,
0.0588088221848011,
-0.10806898772716522,
-0.019215011969208717,
0.034008365124464035,
0.03953789547085762,
0.14096081256866455,
-0.040045227855443954,
-0.09523690491914749,
0.018213558942079544,
-0.14157159626483917,
-0.021825823932886124,
0.12131254374980927,
0.03383283317089081,
-0.029405172914266586,
-0.046188514679670334,
-0.07162638753652573,
-0.16810837388038635,
-0.03461674973368645,
-0.019120104610919952,
0.05618971586227417,
-0.03644653037190437,
-0.0741233304142952,
-0.01891818828880787,
-0.09121508151292801,
-0.07131660729646683,
-0.05587216839194298,
0.13045494258403778,
0.046605125069618225,
0.010845904238522053,
-0.015635723248124123,
0.11017727106809616,
0.015402804128825665,
-0.13563278317451477,
0.008659636601805687,
0.03554914891719818,
-0.03372254967689514,
-0.0443698987364769,
-0.04293541982769966,
-0.02670026570558548,
0.011545024812221527,
0.15187405049800873,
-0.055314093828201294,
0.05767460912466049,
0.03783411532640457,
0.026452088728547096,
-0.09775462746620178,
0.18069367110729218,
-0.06387098878622055,
-0.046158578246831894,
0.001392737147398293,
0.06384126842021942,
0.02081451378762722,
-0.01262674666941166,
-0.10947536677122116,
0.007872220128774643,
0.09966804832220078,
0.02866414003074169,
-0.05471939966082573,
0.07178618013858795,
-0.047938503324985504,
-0.024073965847492218,
0.017273321747779846,
-0.10598865151405334,
0.038757599890232086,
0.002231946447864175,
-0.09646914899349213,
-0.04176066070795059,
0.026462743058800697,
0.0026879380457103252,
-0.015624946914613247,
0.12029802054166794,
-0.08780473470687866,
0.018494850024580956,
-0.08647222816944122,
-0.11931363493204117,
0.020684871822595596,
-0.09465201199054718,
0.011911336332559586,
-0.08063677698373795,
-0.17159056663513184,
-0.014404232613742352,
0.06969103962182999,
-0.044349972158670425,
-0.04689040780067444,
-0.04424343258142471,
-0.08666113764047623,
0.01638266071677208,
-0.019677093252539635,
0.13786335289478302,
-0.06491151452064514,
0.09499582648277283,
0.020401790738105774,
0.06724202632904053,
-0.025406550616025925,
0.05660485103726387,
-0.09653082489967346,
0.023783965036273003,
-0.1586836576461792,
0.03382008895277977,
-0.06848403811454773,
0.04538260027766228,
-0.11383070051670074,
-0.100408174097538,
0.03546658903360367,
-0.001766860019415617,
0.077501080930233,
0.0910506471991539,
-0.1847292184829712,
-0.06752041727304459,
0.13996051251888275,
-0.06388869136571884,
-0.1081438809633255,
0.10473968088626862,
-0.05693817883729935,
0.03302650526165962,
0.05714396387338638,
0.16166572272777557,
0.07819382846355438,
-0.08948104828596115,
0.001645195297896862,
0.011320739984512329,
0.05766618996858597,
-0.04734354838728905,
0.06797166168689728,
0.006961209233850241,
0.03952069580554962,
0.018872801214456558,
-0.04845571145415306,
0.040133558213710785,
-0.09014206379652023,
-0.09335753321647644,
-0.03128201514482498,
-0.08846396207809448,
0.0346142053604126,
0.06890825182199478,
0.06061059609055519,
-0.09052013605833054,
-0.09747865796089172,
0.07237070053815842,
0.09664777666330338,
-0.06693583726882935,
0.030953502282500267,
-0.06995168328285217,
0.05646064132452011,
-0.036892764270305634,
-0.019726885482668877,
-0.18065960705280304,
-0.046828743070364,
0.01665511168539524,
-0.023236047476530075,
0.023358270525932312,
0.03470715135335922,
0.07877469807863235,
0.06629019230604172,
-0.06091306731104851,
-0.02773333713412285,
-0.012993193231523037,
0.003005493897944689,
-0.13026481866836548,
-0.22357392311096191,
-0.05165664479136467,
-0.02501535788178444,
0.09839381277561188,
-0.20577718317508698,
0.030479148030281067,
0.006778940558433533,
0.09032891690731049,
0.030242575332522392,
-0.009019127115607262,
-0.030947178602218628,
0.06316783279180527,
-0.04823730140924454,
-0.06958412379026413,
0.06756068021059036,
-0.004744467791169882,
-0.0923672690987587,
-0.038478750735521317,
-0.11605420708656311,
0.1474997103214264,
0.11710864305496216,
-0.07338345050811768,
-0.0851215124130249,
0.006198924966156483,
-0.05945242568850517,
-0.04157179966568947,
-0.04053590074181557,
0.03501983359456062,
0.15345998108386993,
0.00256506004370749,
0.15136294066905975,
-0.06863701343536377,
-0.056920453906059265,
0.022115977481007576,
-0.019952846691012383,
0.017465073615312576,
0.13036265969276428,
0.11821846663951874,
-0.08023740351200104,
0.14273671805858612,
0.14425207674503326,
-0.08641543239355087,
0.12406820058822632,
-0.04050104320049286,
-0.07420984655618668,
-0.030666081234812737,
-0.009233186021447182,
0.006404480431228876,
0.1096319630742073,
-0.09578991681337357,
-0.004575315862894058,
0.02216535434126854,
0.02367323637008667,
0.005023385398089886,
-0.2169017195701599,
-0.0223903376609087,
0.02872599847614765,
-0.061571259051561356,
0.003341584000736475,
-0.012374013662338257,
-0.003878332208842039,
0.10902086645364761,
0.004942171275615692,
-0.10478463768959045,
0.030424945056438446,
-0.002957821125164628,
-0.06313399970531464,
0.20523415505886078,
-0.09033693373203278,
-0.14683133363723755,
-0.11279068142175674,
-0.0703674778342247,
-0.05583933740854263,
0.004497178830206394,
0.049815092235803604,
-0.07667763531208038,
-0.04000324010848999,
-0.0759553462266922,
0.003268056781962514,
-0.004429869819432497,
0.03680512681603432,
-0.013725115917623043,
-0.0007048079860396683,
0.07837861776351929,
-0.11561417579650879,
-0.004506116267293692,
-0.050633884966373444,
-0.07255095988512039,
0.031235652044415474,
0.055167246609926224,
0.10371094197034836,
0.1369403749704361,
-0.011665038764476776,
0.01116883009672165,
-0.030672596767544746,
0.2121092528104782,
-0.06552505493164062,
0.0021287575364112854,
0.12247573584318161,
-0.015043350867927074,
0.050221119076013565,
0.13375185430049896,
0.06782331317663193,
-0.09245191514492035,
0.0031678027007728815,
0.054895829409360886,
-0.03183101490139961,
-0.22687093913555145,
-0.03520457074046135,
-0.04821009188890457,
0.014144638553261757,
0.10775426030158997,
0.04112214222550392,
0.02722112275660038,
0.05121591314673424,
0.0357658714056015,
0.06067069619894028,
-0.027154533192515373,
0.0632658451795578,
0.11189085990190506,
0.0436149463057518,
0.13108311593532562,
-0.03647294640541077,
-0.050093334168195724,
0.048546046018600464,
0.012439358048141003,
0.22327975928783417,
-0.007499504368752241,
0.15080636739730835,
0.04415097460150719,
0.17408069968223572,
-0.016460442915558815,
0.0694180503487587,
-0.004133394919335842,
-0.02672535367310047,
-0.016295360401272774,
-0.04446615278720856,
-0.019892726093530655,
0.03087259829044342,
-0.04673241451382637,
0.04594011977314949,
-0.1035996600985527,
0.018795551732182503,
0.04843657836318016,
0.27561962604522705,
0.04971963167190552,
-0.318155437707901,
-0.0881633311510086,
0.0007291475776582956,
-0.05148302763700485,
-0.018542444333434105,
0.030609501525759697,
0.1139673963189125,
-0.0752941370010376,
0.044801976531744,
-0.08261210471391678,
0.08540406823158264,
-0.06233818456530571,
0.038188204169273376,
0.1075420081615448,
0.1118694543838501,
0.007050806190818548,
0.07257919013500214,
-0.28649458289146423,
0.2797132730484009,
0.014528442174196243,
0.06273042410612106,
-0.07001378387212753,
0.016986064612865448,
0.0399700365960598,
0.051855187863111496,
0.08433029055595398,
-0.01716773398220539,
-0.07031261920928955,
-0.19120720028877258,
-0.0743778795003891,
0.021034669131040573,
0.09430211037397385,
-0.037822309881448746,
0.10131960362195969,
-0.04668797552585602,
-0.013316940516233444,
0.07477427273988724,
-0.04966435953974724,
-0.04901330918073654,
-0.08890576660633087,
0.009970009326934814,
0.0250004343688488,
-0.04767593741416931,
-0.059672918170690536,
-0.11154244840145111,
-0.10638604313135147,
0.17205247282981873,
-0.04979659616947174,
-0.03282998129725456,
-0.12408816814422607,
0.0868641585111618,
0.10552560538053513,
-0.09105083346366882,
0.03829187899827957,
0.007779906503856182,
0.06442045420408249,
0.04673810675740242,
-0.0742318332195282,
0.12333513051271439,
-0.0736493244767189,
-0.19000622630119324,
-0.05698191747069359,
0.10891924053430557,
0.03319238871335983,
0.06451495736837387,
-0.019435204565525055,
0.028698716312646866,
-0.023614713922142982,
-0.0828472152352333,
0.026012899354100227,
-0.00422708922997117,
0.06444716453552246,
0.004930790513753891,
-0.0689101442694664,
0.024340998381376266,
-0.046962328255176544,
-0.02649744413793087,
0.15849027037620544,
0.2595628798007965,
-0.10476633161306381,
0.03894803300499916,
0.04295806959271431,
-0.07351946085691452,
-0.2027466595172882,
0.030944878235459328,
0.05702855437994003,
-0.002999547403305769,
0.0545443519949913,
-0.19147159159183502,
0.12113181501626968,
0.10616900771856308,
-0.01929657720029354,
0.10158565640449524,
-0.32318833470344543,
-0.13135206699371338,
0.11458948254585266,
0.13273386657238007,
0.07587216794490814,
-0.14975541830062866,
-0.027094606310129166,
0.000994518748484552,
-0.0903833732008934,
0.11632131040096283,
-0.07547122240066528,
0.12762925028800964,
-0.020732592791318893,
0.06963960826396942,
0.007908727042376995,
-0.05361757054924965,
0.11151406913995743,
0.018319571390748024,
0.10032662749290466,
-0.049703486263751984,
-0.032065026462078094,
0.03718361631035805,
-0.05010288581252098,
0.02836446277797222,
-0.09117540717124939,
0.0333721749484539,
-0.07739143073558807,
-0.016476212069392204,
-0.07780282944440842,
0.04278353601694107,
-0.03858872503042221,
-0.07023753225803375,
-0.04075117036700249,
0.04462161287665367,
0.05823737755417824,
-0.019420357421040535,
0.16578324139118195,
0.02186906524002552,
0.1249682754278183,
0.12788821756839752,
0.07644063979387283,
-0.04731198400259018,
-0.06910384446382523,
-0.008705748245120049,
-0.011892488226294518,
0.06084604561328888,
-0.13886094093322754,
0.039100002497434616,
0.15088698267936707,
0.018539683893322945,
0.1298917979001999,
0.0722663626074791,
-0.025620201602578163,
-0.008541364222764969,
0.05522831529378891,
-0.1602872908115387,
-0.08661115914583206,
0.008696682751178741,
-0.06464658677577972,
-0.11288691312074661,
0.06279227137565613,
0.1167125403881073,
-0.06879466027021408,
-0.004718456417322159,
0.01386629045009613,
0.02283211424946785,
-0.03662782534956932,
0.21738183498382568,
0.05392024666070938,
0.046925704926252365,
-0.10364688187837601,
0.08632487803697586,
0.04536363109946251,
-0.08375802636146545,
0.014377511106431484,
0.09026583284139633,
-0.08292114734649658,
-0.039055969566106796,
0.06807053089141846,
0.14935682713985443,
-0.05931248515844345,
-0.03964762017130852,
-0.14403122663497925,
-0.101837657392025,
0.08381453156471252,
0.1571367383003235,
0.09730564057826996,
0.02071533538401127,
-0.05611838027834892,
0.022346660494804382,
-0.11277927458286285,
0.11312530189752579,
0.048750102519989014,
0.0753997415304184,
-0.1530901938676834,
0.15574006736278534,
0.0050461627542972565,
0.03887612745165825,
-0.021520858630537987,
0.035265274345874786,
-0.11010020971298218,
-0.0016304051969200373,
-0.1200907826423645,
-0.024437185376882553,
-0.03829478472471237,
0.006005586590617895,
0.0007418150780722499,
-0.06242266669869423,
-0.06039922684431076,
0.01629435271024704,
-0.11393138766288757,
-0.02912972867488861,
0.02293544076383114,
0.05454938858747482,
-0.12833169102668762,
-0.04364260286092758,
0.01520838774740696,
-0.06895679235458374,
0.07842320203781128,
0.019379356876015663,
0.016407139599323273,
0.050356049090623856,
-0.11320262402296066,
0.00550078833475709,
0.05646674335002899,
0.015095776878297329,
0.07415664196014404,
-0.1004885658621788,
-0.012257127091288567,
-0.012727479450404644,
0.05136367678642273,
0.00788186490535736,
0.08000394701957703,
-0.13302867114543915,
0.007186888717114925,
-0.040644340217113495,
-0.06927406787872314,
-0.0653480663895607,
0.04149793088436127,
0.08400918543338776,
0.021585121750831604,
0.1993909627199173,
-0.08194614946842194,
0.03375108540058136,
-0.21188296377658844,
0.0021872662473469973,
-0.009441465139389038,
-0.12415162473917007,
-0.11838370561599731,
-0.05567358061671257,
0.07367000728845596,
-0.06856060773134232,
0.1180567666888237,
0.023727145045995712,
0.051207173615694046,
0.03423192724585533,
-0.04085296764969826,
0.004185989964753389,
0.027101770043373108,
0.1766303926706314,
0.0292072631418705,
-0.03130181506276131,
0.062001585960388184,
0.045000847429037094,
0.09388945996761322,
0.10139118880033493,
0.21044939756393433,
0.13211221992969513,
0.0005032142507843673,
0.0938720703125,
0.04406271129846573,
-0.0986618846654892,
-0.17956408858299255,
0.06165052577853203,
-0.05352064594626427,
0.1272374391555786,
-0.027457062155008316,
0.21697324514389038,
0.0627751350402832,
-0.17336171865463257,
0.04690355807542801,
-0.0493035651743412,
-0.08717738091945648,
-0.12682290375232697,
-0.06049109250307083,
-0.08385873585939407,
-0.13093635439872742,
-0.007763555273413658,
-0.11026014387607574,
0.038087792694568634,
0.1047615185379982,
0.021934043616056442,
-0.0070207081735134125,
0.14470086991786957,
0.004094862379133701,
0.02770751155912876,
0.04503393918275833,
0.025088248774409294,
-0.025167981162667274,
-0.08664560317993164,
-0.07738529145717621,
-0.02395041286945343,
-0.005747572984546423,
0.022827809676527977,
-0.062420133501291275,
-0.04458185285329819,
0.04474911466240883,
-0.019348958507180214,
-0.09506871551275253,
0.012279581278562546,
0.017075182870030403,
0.06245410442352295,
0.04430336877703667,
0.014032488688826561,
0.008132492192089558,
-0.008761237375438213,
0.2312925010919571,
-0.09637078642845154,
-0.05708303302526474,
-0.10595310479402542,
0.2637002468109131,
0.02839106321334839,
-0.011283003725111485,
0.032233599573373795,
-0.06626884639263153,
-0.00918584130704403,
0.23189039528369904,
0.207015722990036,
-0.08564604818820953,
-0.01210513524711132,
0.007757560350000858,
-0.008923985995352268,
-0.029797352850437164,
0.10556532442569733,
0.11705698817968369,
0.0664895698428154,
-0.08661011606454849,
-0.05479961633682251,
-0.03728845715522766,
-0.02084336429834366,
-0.04589574784040451,
0.07012403756380081,
0.033529382199048996,
0.009453291073441505,
-0.032794591039419174,
0.04579613357782364,
-0.04796972498297691,
-0.1448466181755066,
0.06580892950296402,
-0.20440174639225006,
-0.17285288870334625,
-0.019184039905667305,
0.10660183429718018,
0.00764403585344553,
0.054765988141298294,
-0.020997099578380585,
-0.0004655742668546736,
0.08209846168756485,
-0.016328411176800728,
-0.08785679936408997,
-0.08704240620136261,
0.10241628438234329,
-0.10741083323955536,
0.21872290968894958,
-0.04281768947839737,
0.04230158030986786,
0.12536382675170898,
0.048152856528759,
-0.0920872911810875,
0.04921920597553253,
0.06136615201830864,
-0.10060154646635056,
0.028775854036211967,
0.09669464081525803,
-0.04094354808330536,
0.10032645612955093,
0.04066856577992439,
-0.1172420084476471,
0.0076019312255084515,
-0.06513344496488571,
-0.07069529592990875,
-0.042617250233888626,
-0.041309624910354614,
-0.04546051099896431,
0.1380748599767685,
0.22099417448043823,
-0.0316789336502552,
0.004434193018823862,
-0.07848232984542847,
0.011192484758794308,
0.05449604243040085,
0.02916116639971733,
-0.05526284500956535,
-0.22443141043186188,
0.026043571531772614,
0.035675082355737686,
-0.004493676591664553,
-0.19841806590557098,
-0.09617773443460464,
0.01654965803027153,
-0.06325005739927292,
-0.108017697930336,
0.08873183280229568,
0.07317037135362625,
0.04880779609084129,
-0.05118012800812721,
-0.044857703149318695,
-0.0718294233083725,
0.14710843563079834,
-0.16818344593048096,
-0.07864486426115036
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-ner
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set (see the metric sketch after this list):
- Loss: 2.1258
- Precision: 0.0269
- Recall: 0.1379
- F1: 0.0451
- Accuracy: 0.1988
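For orientation, a hedged sketch of how these four numbers (precision, recall, F1, accuracy) can be computed for token-classification predictions with the `seqeval` metric from the `datasets` library; the label sequences below are toy placeholders, not outputs of this model.
```python
from datasets import load_metric  # requires the seqeval package to be installed

metric = load_metric("seqeval")

# Toy placeholders: gold labels and predictions for a single four-token sentence.
references = [["B-PER", "I-PER", "O", "B-LOC"]]
predictions = [["B-PER", "O", "O", "B-LOC"]]

results = metric.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```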
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
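The preprocessing for this run is likewise undocumented. A common pattern for token classification, shown here as a hedged sketch rather than the original code, is to align the word-level conll2003 tags with BERT's sub-word tokens and mask everything but the first sub-word with -100:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize_and_align_labels(examples):
    # examples["tokens"] holds word lists, examples["ner_tags"] the word-level tag ids.
    tokenized = tokenizer(examples["tokens"], truncation=True, is_split_into_words=True)
    all_labels = []
    for i, labels in enumerate(examples["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        previous = None
        aligned = []
        for word_id in word_ids:
            if word_id is None:
                aligned.append(-100)             # special tokens are ignored by the loss
            elif word_id != previous:
                aligned.append(labels[word_id])  # label the first sub-word of each word
            else:
                aligned.append(-100)             # mask the remaining sub-words
            previous = word_id
        all_labels.append(aligned)
    tokenized["labels"] = all_labels
    return tokenized

# Toy batch with made-up tag ids, just to show the shape of the output.
demo = {"tokens": [["EU", "rejects", "German", "call"]], "ner_tags": [[3, 0, 7, 0]]}
print(tokenize_and_align_labels(demo)["labels"])
```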
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 4 | 2.1296 | 0.0270 | 0.1389 | 0.0452 | 0.1942 |
| No log | 2.0 | 8 | 2.1258 | 0.0269 | 0.1379 | 0.0451 | 0.1988 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "bert-base-uncased-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.19881805328292054}}]}]}
|
token-classification
|
andi611/bert-base-uncased-ner-conll2003
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-uncased-ner
=====================
This model is a fine-tuned version of bert-base-uncased on the conll2003 dataset.
It achieves the following results on the evaluation set:
* Loss: 2.1258
* Precision: 0.0269
* Recall: 0.1379
* F1: 0.0451
* Accuracy: 0.1988
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
63,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.10909858345985413,
0.10547416657209396,
-0.0018119445303454995,
0.12297603487968445,
0.1679358184337616,
0.035931725054979324,
0.12001754343509674,
0.11613373458385468,
-0.10717010498046875,
0.02439703606069088,
0.12795861065387726,
0.1678411215543747,
0.00792995747178793,
0.10627112537622452,
-0.054348476231098175,
-0.2464005947113037,
-0.002987428568303585,
0.0475882813334465,
-0.09248116612434387,
0.1269427239894867,
0.0958186686038971,
-0.13499830663204193,
0.09417884796857834,
0.0017378629418089986,
-0.22286520898342133,
0.011748525314033031,
0.025641893967986107,
-0.05587988346815109,
0.1440795511007309,
0.035262782126665115,
0.1332864761352539,
-0.0018437056569382548,
0.0949646607041359,
-0.17838039994239807,
0.006789271719753742,
0.05662422627210617,
-0.0002041700790869072,
0.09400812536478043,
0.0540035218000412,
0.009945042431354523,
0.11827317625284195,
-0.08126769959926605,
0.050513576716184616,
0.01917032152414322,
-0.11816280335187912,
-0.21621449291706085,
-0.08997879922389984,
0.02949027344584465,
0.06919457763433456,
0.09517935663461685,
0.006271079182624817,
0.15284816920757294,
-0.08401747047901154,
0.09098813682794571,
0.22006762027740479,
-0.30675607919692993,
-0.06724698841571808,
0.05496193841099739,
0.010036405175924301,
0.056124184280633926,
-0.10499193519353867,
-0.027967777103185654,
0.06140587478876114,
0.04356073960661888,
0.12678180634975433,
-0.04095381125807762,
-0.12662437558174133,
0.023180682212114334,
-0.1434372067451477,
-0.01643308624625206,
0.15229199826717377,
0.04204745218157768,
-0.030511684715747833,
-0.019895387813448906,
-0.06133802980184555,
-0.15253114700317383,
-0.023561298847198486,
-0.012025580741465092,
0.05013848468661308,
-0.032723039388656616,
-0.08286435157060623,
0.002277409890666604,
-0.10687842220067978,
-0.07047515362501144,
-0.08705475181341171,
0.14267726242542267,
0.03622736036777496,
0.022595888003706932,
-0.024104023352265358,
0.09961149841547012,
-0.0000754722350393422,
-0.11827308684587479,
0.01877029798924923,
0.03631308674812317,
-0.015668224543333054,
-0.05954990163445473,
-0.065679632127285,
-0.04509381949901581,
0.016182055696845055,
0.13071206212043762,
-0.04633624479174614,
0.041207075119018555,
0.04720387980341911,
0.0411650724709034,
-0.0883694738149643,
0.18685248494148254,
-0.05438840761780739,
-0.021763116121292114,
0.008853876031935215,
0.036213040351867676,
0.018300725147128105,
-0.004781140014529228,
-0.12383481115102768,
0.002072082832455635,
0.10551651567220688,
0.006900979671627283,
-0.07311931997537613,
0.0699697732925415,
-0.05867324024438858,
-0.022376587614417076,
0.009806462563574314,
-0.08899306505918503,
0.03827179595828056,
-0.0035808440297842026,
-0.08744475245475769,
-0.022928522899746895,
0.01611950248479843,
0.01579391211271286,
-0.0026957045774906874,
0.1067754253745079,
-0.09722351282835007,
0.02754284255206585,
-0.09448361396789551,
-0.11302327364683151,
0.025172293186187744,
-0.08129947632551193,
0.031659409403800964,
-0.09922247380018234,
-0.1604534387588501,
-0.009282956831157207,
0.06274372339248657,
-0.023945428431034088,
-0.05437932163476944,
-0.043382491916418076,
-0.07894254475831985,
0.008022566325962543,
-0.017578357830643654,
0.12824073433876038,
-0.06481204181909561,
0.09350013732910156,
0.03328974172472954,
0.06413920223712921,
-0.07104026526212692,
0.050236962735652924,
-0.09704361110925674,
0.011232509277760983,
-0.15759854018688202,
0.02313351444900036,
-0.055219098925590515,
0.06316210329532623,
-0.09332390129566193,
-0.09936890006065369,
0.03519896790385246,
-0.0002785641117952764,
0.06820189207792282,
0.07358565181493759,
-0.1822279989719391,
-0.07183773070573807,
0.13927523791790009,
-0.0658981204032898,
-0.12337014079093933,
0.11050065606832504,
-0.06558419018983841,
0.041111379861831665,
0.06919814646244049,
0.1451391726732254,
0.06904485076665878,
-0.07721905410289764,
-0.0015170590486377478,
0.016862498596310616,
0.05074930563569069,
-0.0847182422876358,
0.06917781382799149,
0.012520468793809414,
0.013863902539014816,
0.024081019684672356,
-0.038115907460451126,
0.05591341853141785,
-0.09800522029399872,
-0.09212274849414825,
-0.03839883208274841,
-0.10442838072776794,
0.031585779041051865,
0.07050753384828568,
0.0738958790898323,
-0.09605219960212708,
-0.07833192497491837,
0.09401890635490417,
0.0923089012503624,
-0.05710851401090622,
0.026227932423353195,
-0.0678141713142395,
0.06155483424663544,
-0.04878917708992958,
-0.032372817397117615,
-0.1779627501964569,
-0.04396611452102661,
0.00912321824580431,
0.004612626042217016,
0.02374286763370037,
0.041162338107824326,
0.06643414497375488,
0.06277447938919067,
-0.05442088097333908,
-0.010830906219780445,
-0.017301976680755615,
0.0036747462581843138,
-0.13431429862976074,
-0.21052680909633636,
-0.04086076468229294,
-0.018758051097393036,
0.1385951191186905,
-0.21945105493068695,
0.02912295237183571,
0.0013541535008698702,
0.08115095645189285,
0.016616204753518105,
-0.005511585157364607,
-0.0505557544529438,
0.0800098180770874,
-0.05046030879020691,
-0.05626825988292694,
0.0643467828631401,
0.007677561137825251,
-0.0953545942902565,
-0.06031324341893196,
-0.10040897130966187,
0.17297497391700745,
0.137332022190094,
-0.12821021676063538,
-0.07983977347612381,
0.0024052783846855164,
-0.05429501459002495,
-0.03204770386219025,
-0.04428546875715256,
0.03815562650561333,
0.1563681960105896,
-0.01569928601384163,
0.149240180850029,
-0.06486517190933228,
-0.045610737055540085,
0.01900593936443329,
-0.0340820774435997,
0.021703757345676422,
0.11148209869861603,
0.13438475131988525,
-0.08935186266899109,
0.15089116990566254,
0.1519785225391388,
-0.10762390494346619,
0.10981334000825882,
-0.04236331209540367,
-0.07004708796739578,
-0.025164779275655746,
-0.024961939081549644,
0.0018154910067096353,
0.11074694991111755,
-0.11760441213846207,
-0.004639383405447006,
0.020502595230937004,
0.02237643115222454,
0.019459648057818413,
-0.23169951140880585,
-0.037034254521131516,
0.029439430683851242,
-0.03621790185570717,
-0.002684907289221883,
-0.02758593112230301,
-0.0021183451171964407,
0.10365468263626099,
0.00491373473778367,
-0.10279887914657593,
0.048187777400016785,
0.004121769685298204,
-0.07649847120046616,
0.21451526880264282,
-0.09360256791114807,
-0.12601596117019653,
-0.1246097981929779,
-0.0860978439450264,
-0.05031801015138626,
0.01226833462715149,
0.048271600157022476,
-0.08064505457878113,
-0.037491727620363235,
-0.061507467180490494,
0.005801449529826641,
-0.008561479859054089,
0.040449123829603195,
0.00829384010285139,
0.0001938382483785972,
0.07656078040599823,
-0.1125379130244255,
-0.004313433542847633,
-0.05430261418223381,
-0.0807710811495781,
0.04275636374950409,
0.03767097368836403,
0.11283230036497116,
0.1568669080734253,
-0.0193543191999197,
0.004930899944156408,
-0.03000623546540737,
0.23520806431770325,
-0.058231718838214874,
-0.0280222836881876,
0.11691047251224518,
-0.016025401651859283,
0.04167114570736885,
0.11347288638353348,
0.07636485993862152,
-0.09397869557142258,
0.004543949384242296,
0.03715891018509865,
-0.03457300737500191,
-0.22447188198566437,
-0.04467463493347168,
-0.05199765786528587,
-0.01604725793004036,
0.09300883859395981,
0.030425652861595154,
0.042050015181303024,
0.06832975149154663,
0.04712888225913048,
0.0997152179479599,
-0.04900568723678589,
0.05626033991575241,
0.11316933482885361,
0.0520709827542305,
0.1223457083106041,
-0.044061318039894104,
-0.05099990591406822,
0.04642273858189583,
-0.001582156983204186,
0.2246587574481964,
0.009687424637377262,
0.14193153381347656,
0.053676437586545944,
0.18222862482070923,
-0.011533382348716259,
0.07296290993690491,
-0.004362466279417276,
-0.03995177522301674,
-0.01341270562261343,
-0.03885624557733536,
-0.030862901359796524,
0.028284547850489616,
-0.06412709504365921,
0.059567973017692566,
-0.10940450429916382,
0.006270211189985275,
0.053911395370960236,
0.2409835308790207,
0.04936473071575165,
-0.3347453474998474,
-0.09260307252407074,
0.0008549905032850802,
-0.030357083305716515,
-0.016448110342025757,
0.02764669433236122,
0.10199634730815887,
-0.07446461915969849,
0.032771363854408264,
-0.06351441890001297,
0.0895993784070015,
-0.051659006625413895,
0.04211157560348511,
0.09209524095058441,
0.10736402124166489,
0.011756159365177155,
0.08100524544715881,
-0.295752614736557,
0.27245432138442993,
0.011770316399633884,
0.0669448971748352,
-0.07095900923013687,
0.0009577158489264548,
0.028851330280303955,
0.07063912600278854,
0.05029008910059929,
-0.009693341329693794,
-0.045110732316970825,
-0.2017836570739746,
-0.04619096964597702,
0.03133996203541756,
0.08134137839078903,
-0.015996556729078293,
0.09194726496934891,
-0.03554830327630043,
0.0024318797513842583,
0.08612504601478577,
-0.013313362374901772,
-0.04631362110376358,
-0.09422153234481812,
-0.012098991312086582,
0.041164468973875046,
-0.046671848744153976,
-0.06435327976942062,
-0.10823273658752441,
-0.13041743636131287,
0.16553014516830444,
-0.032856088131666183,
-0.019723957404494286,
-0.10837908834218979,
0.09942068159580231,
0.0798133909702301,
-0.08633769303560257,
0.044759321957826614,
0.009348783642053604,
0.06492535769939423,
0.04814297705888748,
-0.07107188552618027,
0.11571135371923447,
-0.06605902314186096,
-0.15941309928894043,
-0.06312020123004913,
0.08759930729866028,
0.03819728642702103,
0.0649406909942627,
-0.007825211621820927,
0.017871687188744545,
-0.03466162458062172,
-0.08578332513570786,
0.017175188288092613,
-0.006479139905422926,
0.07298905402421951,
0.01780020445585251,
-0.052805930376052856,
0.01486605778336525,
-0.04875309020280838,
-0.03273569419980049,
0.18334197998046875,
0.24098144471645355,
-0.10225995630025864,
-0.00019347851048223674,
0.03242424130439758,
-0.06966043263673782,
-0.1954115480184555,
0.0577823780477047,
0.05198659375309944,
0.0013229796895757318,
0.03417346626520157,
-0.17618754506111145,
0.15298554301261902,
0.12066653370857239,
-0.01150561310350895,
0.10538530349731445,
-0.3134491741657257,
-0.12343229353427887,
0.12103642523288727,
0.1412350833415985,
0.12518452107906342,
-0.14384983479976654,
-0.020024560391902924,
-0.026401665061712265,
-0.13309279084205627,
0.12625420093536377,
-0.10175486654043198,
0.11582014709711075,
-0.02054060436785221,
0.07808545231819153,
-0.001893708948045969,
-0.05664126202464104,
0.12291810661554337,
0.0347423329949379,
0.10570935904979706,
-0.051949985325336456,
-0.04614504426717758,
0.04753964766860008,
-0.03518716245889664,
0.006822499446570873,
-0.07721405476331711,
0.02840799279510975,
-0.08592529594898224,
-0.01732509396970272,
-0.0699511244893074,
0.04421662911772728,
-0.031964950263500214,
-0.06254465132951736,
-0.044674716889858246,
0.02746538072824478,
0.04518526419997215,
-0.01752452366054058,
0.14643694460391998,
0.0386141873896122,
0.13749904930591583,
0.10484424978494644,
0.06722481548786163,
-0.07580152899026871,
-0.05967193469405174,
-0.013146713376045227,
-0.019497208297252655,
0.06672791391611099,
-0.13876216113567352,
0.03275148570537567,
0.14551056921482086,
0.020506685599684715,
0.1345527023077011,
0.08421382308006287,
-0.022791102528572083,
0.00638954434543848,
0.06263887882232666,
-0.15486212074756622,
-0.07062888145446777,
0.00893971137702465,
-0.05281680449843407,
-0.10905542969703674,
0.06676509976387024,
0.10108087211847305,
-0.07013718783855438,
-0.005899602547287941,
-0.00013532170851249248,
-0.001251302077434957,
-0.06601271778345108,
0.20095212757587433,
0.06930000334978104,
0.04091949388384819,
-0.10476028174161911,
0.07337883114814758,
0.05111077427864075,
-0.06330949068069458,
-0.00018613356223795563,
0.054364852607250214,
-0.08397570252418518,
-0.04107973352074623,
0.0889936164021492,
0.1682586520910263,
-0.08112195879220963,
-0.05390221253037453,
-0.13212928175926208,
-0.11158676445484161,
0.07290187478065491,
0.17030343413352966,
0.1221141442656517,
0.024076297879219055,
-0.05042440816760063,
0.013773700222373009,
-0.12078267335891724,
0.07655538618564606,
0.03534853085875511,
0.07899749279022217,
-0.155242457985878,
0.16875681281089783,
0.006143348757177591,
0.039788682013750076,
-0.022326108068227768,
0.03443007916212082,
-0.11194916814565659,
0.007169951219111681,
-0.1315043717622757,
-0.021518388763070107,
-0.0331648625433445,
0.010425065644085407,
0.01085478626191616,
-0.0659133717417717,
-0.06601950526237488,
0.015279054641723633,
-0.11730898171663284,
-0.020313305780291557,
0.03984770178794861,
0.054952412843704224,
-0.12135176360607147,
-0.04123536869883537,
0.019490377977490425,
-0.058615490794181824,
0.06595499813556671,
0.04014794901013374,
0.026666736230254173,
0.06685113161802292,
-0.14212003350257874,
-0.0012877159751951694,
0.07461536675691605,
0.021705742925405502,
0.08524840325117111,
-0.07922150939702988,
-0.010359072126448154,
0.005852946080267429,
0.06529439985752106,
0.014465631917119026,
0.07804476469755173,
-0.12790049612522125,
-0.01223995815962553,
-0.03682886064052582,
-0.07952564209699631,
-0.06363380700349808,
0.028107328340411186,
0.10294515639543533,
0.011514954268932343,
0.20921510457992554,
-0.07460173964500427,
0.02308657206594944,
-0.2041468322277069,
0.005004726815968752,
-0.019553694874048233,
-0.1138690859079361,
-0.13935528695583344,
-0.05796818062663078,
0.06285815685987473,
-0.05855056270956993,
0.14155010879039764,
0.014343592338263988,
0.0414494052529335,
0.027260150760412216,
-0.012646886520087719,
0.00880691409111023,
0.027126288041472435,
0.2160845696926117,
0.04278518259525299,
-0.028723441064357758,
0.05682956799864769,
0.05278225615620613,
0.11078575253486633,
0.10041790455579758,
0.19576521217823029,
0.13739432394504547,
-0.02354806289076805,
0.09785446524620056,
0.039597153663635254,
-0.07677469402551651,
-0.14878302812576294,
0.04638800397515297,
-0.05880903825163841,
0.10244037210941315,
-0.020901119336485863,
0.2171119898557663,
0.056533001363277435,
-0.16667021811008453,
0.026902174577116966,
-0.06317676603794098,
-0.08800452202558517,
-0.11235053092241287,
-0.03423750400543213,
-0.08793626725673676,
-0.13967688381671906,
-0.0030674664303660393,
-0.1118570864200592,
0.011620382778346539,
0.1304549127817154,
0.009692220948636532,
-0.01682460308074951,
0.1471324861049652,
0.011232762597501278,
0.03499092906713486,
0.03642614185810089,
0.010920515283942223,
-0.03352557122707367,
-0.1060585156083107,
-0.0705675259232521,
-0.024992238730192184,
-0.014909205958247185,
0.03408665582537651,
-0.06479837000370026,
-0.03878683224320412,
0.04418225586414337,
-0.021368278190493584,
-0.09640517085790634,
0.013081040233373642,
0.016746344044804573,
0.05434136465191841,
0.02238479070365429,
0.009089512750506401,
0.022526228800415993,
-0.006750399712473154,
0.20894227921962738,
-0.08519155532121658,
-0.07299493998289108,
-0.10506164282560349,
0.26041024923324585,
0.0323081836104393,
-0.004160471726208925,
0.03280150890350342,
-0.0717087984085083,
0.0018761266255751252,
0.23997145891189575,
0.20532968640327454,
-0.09920685738325119,
-0.011637787334620953,
0.007178818807005882,
-0.011495918035507202,
-0.038231488317251205,
0.11398962885141373,
0.12572713196277618,
0.04431365057826042,
-0.09218312054872513,
-0.06017213687300682,
-0.05231416970491409,
-0.0035586063750088215,
-0.034377750009298325,
0.045762185007333755,
0.041459642350673676,
0.00654883636161685,
-0.04186185076832771,
0.0453345887362957,
-0.05525178462266922,
-0.11342426389455795,
0.0759604275226593,
-0.19820290803909302,
-0.1651686131954193,
-0.012040015310049057,
0.11301717907190323,
0.0012872477527707815,
0.0631728246808052,
-0.03606175631284714,
0.002891720272600651,
0.07391440123319626,
-0.019138431176543236,
-0.10750740021467209,
-0.08265088498592377,
0.10456087440252304,
-0.09740858525037766,
0.2092922478914261,
-0.04766298085451126,
0.08167978376150131,
0.1310504972934723,
0.06238691508769989,
-0.07748984545469284,
0.05519040301442146,
0.04702913388609886,
-0.08475689589977264,
0.022645050659775734,
0.0626312717795372,
-0.04078920930624008,
0.08758337050676346,
0.04300964996218681,
-0.12791898846626282,
0.015534044243395329,
-0.059760406613349915,
-0.057023484259843826,
-0.04611048102378845,
-0.05343449115753174,
-0.05706407129764557,
0.1308421939611435,
0.2115286886692047,
-0.03177746757864952,
0.002705509075894952,
-0.07024826854467392,
0.014655858278274536,
0.06330719590187073,
0.027338026091456413,
-0.07128245383501053,
-0.22391752898693085,
0.019995812326669693,
0.03686370328068733,
-0.017581172287464142,
-0.20562438666820526,
-0.08662677556276321,
0.0018912868108600378,
-0.07484220713376999,
-0.09483546763658524,
0.08256039023399353,
0.08196866512298584,
0.05105395242571831,
-0.06332121789455414,
-0.051575250923633575,
-0.08346235752105713,
0.14358577132225037,
-0.14595870673656464,
-0.10018817335367203
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-ner
This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0591
- Precision: 0.9465
- Recall: 0.9568
- F1: 0.9517
- Accuracy: 0.9877
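A hedged inference sketch without the `pipeline` helper, assuming the repository id `andi611/bert-large-uncased-ner-conll2003` listed in this record; the example sentence is illustrative, and the printed tag names come from whatever `id2label` mapping the checkpoint's config carries.
```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

repo = "andi611/bert-large-uncased-ner-conll2003"  # repository id from this record
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForTokenClassification.from_pretrained(repo)

inputs = tokenizer("EU rejects German call to boycott British lamb.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```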
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.1702 | 1.0 | 878 | 0.0578 | 0.9202 | 0.9347 | 0.9274 | 0.9836 |
| 0.0392 | 2.0 | 1756 | 0.0601 | 0.9306 | 0.9448 | 0.9377 | 0.9851 |
| 0.0157 | 3.0 | 2634 | 0.0517 | 0.9405 | 0.9544 | 0.9474 | 0.9875 |
| 0.0057 | 4.0 | 3512 | 0.0591 | 0.9465 | 0.9568 | 0.9517 | 0.9877 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "bert-large-uncased-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.9877039414110284}}]}]}
|
token-classification
|
andi611/bert-large-uncased-ner-conll2003
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
bert-large-uncased-ner
======================
This model is a fine-tuned version of bert-large-uncased on the conll2003 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0591
* Precision: 0.9465
* Recall: 0.9568
* F1: 0.9517
* Accuracy: 0.9877
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 4
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
63,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 4### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.11068462580442429,
0.10301382839679718,
-0.0017267689108848572,
0.12361399084329605,
0.16973860561847687,
0.03632475435733795,
0.11908187717199326,
0.11458680033683777,
-0.10862753540277481,
0.02475389465689659,
0.12799055874347687,
0.1681796759366989,
0.007694697473198175,
0.10657139867544174,
-0.05418211966753006,
-0.24464693665504456,
-0.003387411357834935,
0.04836477339267731,
-0.09088367223739624,
0.12667255103588104,
0.09516332298517227,
-0.13681261241436005,
0.09492246806621552,
0.0010476489551365376,
-0.22413168847560883,
0.012888994999229908,
0.0244833342730999,
-0.05592229589819908,
0.14349272847175598,
0.035098250955343246,
0.1331191509962082,
-0.0028941894415766,
0.09359811246395111,
-0.17608511447906494,
0.007148897275328636,
0.056991953402757645,
-0.0002538793778512627,
0.09378271549940109,
0.054275836795568466,
0.010223201476037502,
0.11681161820888519,
-0.08245661854743958,
0.05086122080683708,
0.01924777217209339,
-0.11820302903652191,
-0.21763379871845245,
-0.08966095000505447,
0.028125643730163574,
0.06881927698850632,
0.096869096159935,
0.005313749425113201,
0.15218369662761688,
-0.08469217270612717,
0.0923420861363411,
0.21881073713302612,
-0.3081938922405243,
-0.06775641441345215,
0.05686924234032631,
0.01087278500199318,
0.05662241950631142,
-0.10600253939628601,
-0.027258604764938354,
0.06224606931209564,
0.04401734098792076,
0.12812189757823944,
-0.04062916338443756,
-0.12589290738105774,
0.024172453209757805,
-0.1439700871706009,
-0.015369714237749577,
0.15502266585826874,
0.04279924929141998,
-0.02954586036503315,
-0.021671822294592857,
-0.05954369902610779,
-0.1551409810781479,
-0.02430909126996994,
-0.011493834666907787,
0.05046941712498665,
-0.03453122451901436,
-0.08237028121948242,
0.004649786278605461,
-0.10655799508094788,
-0.06916757673025131,
-0.08648111671209335,
0.14008399844169617,
0.03651181980967522,
0.021580860018730164,
-0.024553759023547173,
0.09911154955625534,
-0.0009178505861200392,
-0.11743449419736862,
0.019919387996196747,
0.035203807055950165,
-0.015341304242610931,
-0.060124851763248444,
-0.0663645938038826,
-0.04496428743004799,
0.016839412972331047,
0.13033527135849,
-0.04515806958079338,
0.04190560802817345,
0.04846278578042984,
0.039393067359924316,
-0.08801707625389099,
0.18615861237049103,
-0.05548306182026863,
-0.023825211450457573,
0.00915203895419836,
0.03707265481352806,
0.016401030123233795,
-0.004079556558281183,
-0.12248348444700241,
0.0027966138441115618,
0.10510514676570892,
0.006772488355636597,
-0.07420537620782852,
0.07199005782604218,
-0.05764767527580261,
-0.021605234593153,
0.009537429548799992,
-0.0892852321267128,
0.037421341985464096,
-0.0037322731222957373,
-0.08718635886907578,
-0.02210838347673416,
0.015514276921749115,
0.013996608555316925,
-0.0038168560713529587,
0.10725752264261246,
-0.09838133305311203,
0.026346411556005478,
-0.09542343765497208,
-0.11430010944604874,
0.023658862337470055,
-0.08161678910255432,
0.03288741409778595,
-0.09886298328638077,
-0.15996329486370087,
-0.011380041018128395,
0.06187820807099342,
-0.023629117757081985,
-0.05390924960374832,
-0.04350734502077103,
-0.07968014478683472,
0.00745046604424715,
-0.017207881435751915,
0.13104130327701569,
-0.06477335840463638,
0.09308215975761414,
0.035720985382795334,
0.0640452429652214,
-0.06963680684566498,
0.05108654871582985,
-0.09637638181447983,
0.009683783166110516,
-0.15824824571609497,
0.022183213382959366,
-0.05657774582505226,
0.06616663932800293,
-0.09249066561460495,
-0.10049125552177429,
0.0366351380944252,
0.00038971903268247843,
0.0683148205280304,
0.0739067867398262,
-0.17999796569347382,
-0.07175200432538986,
0.1383303850889206,
-0.06429002434015274,
-0.12431800365447998,
0.1098373532295227,
-0.06555428355932236,
0.039280518889427185,
0.07042793184518814,
0.14509454369544983,
0.0684729740023613,
-0.07895850390195847,
-0.0025090165436267853,
0.01614466682076454,
0.051312413066625595,
-0.08472960442304611,
0.0668894499540329,
0.013433746993541718,
0.012132263742387295,
0.02555510587990284,
-0.03736257553100586,
0.05608697980642319,
-0.0978095605969429,
-0.09191099554300308,
-0.03907744958996773,
-0.10500103235244751,
0.03027886338531971,
0.07154019176959991,
0.0741688534617424,
-0.09705284237861633,
-0.07732376456260681,
0.09127016365528107,
0.09201527386903763,
-0.05698143318295479,
0.024210689589381218,
-0.0674094632267952,
0.0629919171333313,
-0.04962409287691116,
-0.031664419919252396,
-0.17819160223007202,
-0.04423306882381439,
0.008477830328047276,
0.005475493613630533,
0.022690443322062492,
0.04072817042469978,
0.0670095682144165,
0.06256872415542603,
-0.05435338243842125,
-0.011391599662601948,
-0.01581558957695961,
0.003897658083587885,
-0.1348932981491089,
-0.21138113737106323,
-0.04109704867005348,
-0.018484903499484062,
0.13717088103294373,
-0.22101861238479614,
0.02918202430009842,
0.0005986356409266591,
0.08061643689870834,
0.015932049602270126,
-0.0049888030625879765,
-0.049114227294921875,
0.08147583901882172,
-0.04900060221552849,
-0.05575931444764137,
0.06448137760162354,
0.007390524726361036,
-0.09310874342918396,
-0.06283488869667053,
-0.10048016905784607,
0.1756993979215622,
0.13820235431194305,
-0.12911304831504822,
-0.0808810442686081,
0.0019171361345797777,
-0.05408531427383423,
-0.032234784215688705,
-0.04311135411262512,
0.03849950432777405,
0.1554471105337143,
-0.017258694395422935,
0.14923502504825592,
-0.06420502066612244,
-0.04557602480053902,
0.019789734855294228,
-0.03199540823698044,
0.02245854213833809,
0.1130327433347702,
0.1333034634590149,
-0.08636125177145004,
0.15133604407310486,
0.15080024302005768,
-0.10726900398731232,
0.11072143912315369,
-0.043515220284461975,
-0.07013668119907379,
-0.024230672046542168,
-0.025407182052731514,
0.0008313285652548075,
0.11197340488433838,
-0.1195647269487381,
-0.006489150691777468,
0.020465176552534103,
0.020836034789681435,
0.019366972148418427,
-0.22982914745807648,
-0.03664770722389221,
0.02897464670240879,
-0.034811101853847504,
-0.002878938801586628,
-0.026166439056396484,
-0.002699202159419656,
0.10297191888093948,
0.0044105867855250835,
-0.10297685116529465,
0.04757806658744812,
0.004515568260103464,
-0.07500479370355606,
0.2149248719215393,
-0.09361431002616882,
-0.12488816678524017,
-0.12289195507764816,
-0.08750052750110626,
-0.051571499556303024,
0.01039567869156599,
0.04867935553193092,
-0.0829840898513794,
-0.038510601967573166,
-0.06070498004555702,
0.004387092776596546,
-0.008378108963370323,
0.04033101350069046,
0.007395036984235048,
-0.000947698310483247,
0.0757170096039772,
-0.1123160570859909,
-0.003627204103395343,
-0.05505536124110222,
-0.08012551814317703,
0.042487733066082,
0.03854821249842644,
0.11408870667219162,
0.15499980747699738,
-0.018121348693966866,
0.004915204830467701,
-0.0307539664208889,
0.23855042457580566,
-0.05899955332279205,
-0.02887560799717903,
0.11635184288024902,
-0.01397525705397129,
0.0402308851480484,
0.11271902173757553,
0.07723133265972137,
-0.09450684487819672,
0.004474742338061333,
0.03746933117508888,
-0.03328109532594681,
-0.2249659299850464,
-0.04405194893479347,
-0.0530007928609848,
-0.016583653166890144,
0.09277471154928207,
0.030131351202726364,
0.04421447217464447,
0.06860223412513733,
0.04800310358405113,
0.10055331140756607,
-0.050077468156814575,
0.05565723404288292,
0.11562114208936691,
0.05105220153927803,
0.12302336096763611,
-0.04495102912187576,
-0.051252592355012894,
0.04503759741783142,
-0.002120518358424306,
0.22644482553005219,
0.009638707153499126,
0.14190730452537537,
0.05391279235482216,
0.18340565264225006,
-0.011559901759028435,
0.07430334389209747,
-0.0055211130529642105,
-0.04120676964521408,
-0.012166192755103111,
-0.038998641073703766,
-0.02854061871767044,
0.026439525187015533,
-0.0649506226181984,
0.05984283983707428,
-0.11026687175035477,
0.0030220241751521826,
0.053705666214227676,
0.241355761885643,
0.04738875851035118,
-0.33579328656196594,
-0.09276358783245087,
-0.0013548106653615832,
-0.029669590294361115,
-0.016917863860726357,
0.027821609750390053,
0.09984540939331055,
-0.07391209155321121,
0.032788991928100586,
-0.06342639774084091,
0.08968621492385864,
-0.05094178393483162,
0.04309684783220291,
0.09365496784448624,
0.10965462774038315,
0.010354731231927872,
0.07998397946357727,
-0.29916954040527344,
0.27151501178741455,
0.012261676602065563,
0.06872420758008957,
-0.07077717781066895,
0.0009845875902101398,
0.02892007865011692,
0.07162213325500488,
0.05056232213973999,
-0.010802186094224453,
-0.04402999207377434,
-0.20167602598667145,
-0.045123547315597534,
0.03232307359576225,
0.08052655309438705,
-0.015201876871287823,
0.0917133316397667,
-0.0345524437725544,
0.0013467378448694944,
0.08568631112575531,
-0.014833447523415089,
-0.04797515273094177,
-0.09484458714723587,
-0.012209081090986729,
0.042638685554265976,
-0.045855987817049026,
-0.06309440732002258,
-0.10874991863965988,
-0.13382011651992798,
0.16220271587371826,
-0.03331026807427406,
-0.019582858309149742,
-0.10892771929502487,
0.09857967495918274,
0.08003824204206467,
-0.08483404666185379,
0.046145565807819366,
0.009362922050058842,
0.062292806804180145,
0.0491083487868309,
-0.07131936401128769,
0.11606968939304352,
-0.06686697900295258,
-0.15789905190467834,
-0.06250183284282684,
0.08652409166097641,
0.037804555147886276,
0.06348294764757156,
-0.008748560212552547,
0.01798548363149166,
-0.03639120236039162,
-0.08606039732694626,
0.017634715884923935,
-0.008388807997107506,
0.07388456910848618,
0.017728587612509727,
-0.05383734405040741,
0.011738973669707775,
-0.04866338521242142,
-0.03302576765418053,
0.18111279606819153,
0.23813794553279877,
-0.10126227885484695,
-0.002258410444483161,
0.03288348391652107,
-0.06980358064174652,
-0.1934918761253357,
0.05945713445544243,
0.05081642046570778,
0.0020798237528651953,
0.0326259583234787,
-0.17715629935264587,
0.15480351448059082,
0.11977881193161011,
-0.010435931384563446,
0.10854217410087585,
-0.310834676027298,
-0.12305592745542526,
0.12226898223161697,
0.1425883024930954,
0.12711884081363678,
-0.14490140974521637,
-0.019341349601745605,
-0.025093961507081985,
-0.13272565603256226,
0.12356175482273102,
-0.10294985771179199,
0.11613926291465759,
-0.021223846822977066,
0.07690686732530594,
-0.002387840300798416,
-0.05736245587468147,
0.12155447155237198,
0.034871749579906464,
0.10783451050519943,
-0.05209323391318321,
-0.04464641958475113,
0.04674021527171135,
-0.03449273109436035,
0.00773633411154151,
-0.07839208841323853,
0.02811797522008419,
-0.08627083897590637,
-0.016732508316636086,
-0.07120015472173691,
0.04443511739373207,
-0.03268476203083992,
-0.0616292767226696,
-0.04369921237230301,
0.027677396312355995,
0.043936822563409805,
-0.017684780061244965,
0.14472584426403046,
0.037898216396570206,
0.13917741179466248,
0.10210171341896057,
0.0671181008219719,
-0.07585148513317108,
-0.06272096931934357,
-0.011952695436775684,
-0.019863687455654144,
0.06740207225084305,
-0.13830964267253876,
0.03034471720457077,
0.14639519155025482,
0.01976567506790161,
0.13410013914108276,
0.08437195420265198,
-0.021750226616859436,
0.004850402008742094,
0.06317532807588577,
-0.15606354176998138,
-0.06889253854751587,
0.008527526631951332,
-0.056525539606809616,
-0.10858277976512909,
0.06787896156311035,
0.10018552094697952,
-0.07049210369586945,
-0.005950533784925938,
-0.0007959029753692448,
-0.001971712801605463,
-0.06499423086643219,
0.20294015109539032,
0.06935474276542664,
0.040993183851242065,
-0.1063472181558609,
0.0730973556637764,
0.05127972364425659,
-0.06629389524459839,
-0.0003834698291029781,
0.05493302270770073,
-0.08542372286319733,
-0.042074356228113174,
0.08877337723970413,
0.1709810197353363,
-0.07892563194036484,
-0.05414261668920517,
-0.13309471309185028,
-0.11190365254878998,
0.07355304062366486,
0.17094005644321442,
0.12164830416440964,
0.022842101752758026,
-0.04980316013097763,
0.014027589932084084,
-0.12187656760215759,
0.07762642949819565,
0.03441714867949486,
0.07958926260471344,
-0.15508471429347992,
0.16698184609413147,
0.006431987509131432,
0.03822002187371254,
-0.022193865850567818,
0.034804876893758774,
-0.11315381526947021,
0.00662217428907752,
-0.1294611394405365,
-0.023106347769498825,
-0.032584112137556076,
0.011074723675847054,
0.01009680051356554,
-0.06460115313529968,
-0.06554240733385086,
0.016309818252921104,
-0.11737474054098129,
-0.01967405341565609,
0.0425274558365345,
0.056309543550014496,
-0.12030480802059174,
-0.040509555488824844,
0.017972039058804512,
-0.05822159722447395,
0.06532997637987137,
0.04019554704427719,
0.02624855935573578,
0.06665865331888199,
-0.14169760048389435,
-0.0006452247034758329,
0.07304209470748901,
0.020559607073664665,
0.08564586192369461,
-0.07845544815063477,
-0.009663110598921776,
0.005570156499743462,
0.06579922139644623,
0.014802655205130577,
0.07598131895065308,
-0.12802861630916595,
-0.012879610992968082,
-0.03492014855146408,
-0.07663201540708542,
-0.06433238834142685,
0.02827191725373268,
0.10375984758138657,
0.010178939439356327,
0.20847418904304504,
-0.07508847117424011,
0.02211684174835682,
-0.20405104756355286,
0.0047548431903123856,
-0.02029257081449032,
-0.11667507141828537,
-0.13929662108421326,
-0.05797114595770836,
0.06440716981887817,
-0.05937216058373451,
0.14376914501190186,
0.014984313398599625,
0.04200584813952446,
0.026921352371573448,
-0.01167474128305912,
0.007150810677558184,
0.02776760794222355,
0.2157951146364212,
0.04201569780707359,
-0.02886234037578106,
0.057228934019804,
0.05535680428147316,
0.10963723063468933,
0.10323577374219894,
0.1935814470052719,
0.13754118978977203,
-0.02330346591770649,
0.09811384975910187,
0.03950092941522598,
-0.07766275852918625,
-0.14649416506290436,
0.047145742923021317,
-0.0588340237736702,
0.10560812056064606,
-0.022751137614250183,
0.2145397663116455,
0.05552418529987335,
-0.16500884294509888,
0.02788231521844864,
-0.06459366530179977,
-0.08802162110805511,
-0.11064722388982773,
-0.03257327899336815,
-0.08680722862482071,
-0.14067792892456055,
-0.002801859052851796,
-0.1125575453042984,
0.011055226437747478,
0.13076554238796234,
0.010371829383075237,
-0.0180977750569582,
0.15026909112930298,
0.014726992696523666,
0.0360974483191967,
0.0388338640332222,
0.010084599256515503,
-0.0351540707051754,
-0.10714858025312424,
-0.06931158900260925,
-0.025883935391902924,
-0.0175209678709507,
0.033014941960573196,
-0.06608741730451584,
-0.04086142033338547,
0.04522838443517685,
-0.019943593069911003,
-0.09599251300096512,
0.013377743773162365,
0.01721283048391342,
0.05378095433115959,
0.02457689680159092,
0.008147052489221096,
0.021912598982453346,
-0.007385289296507835,
0.2087516188621521,
-0.0848378837108612,
-0.07078955322504044,
-0.10511314868927002,
0.25764772295951843,
0.0331428237259388,
-0.005546145141124725,
0.031787704676389694,
-0.07060899585485458,
0.003923140466213226,
0.2422688752412796,
0.20555251836776733,
-0.09674855321645737,
-0.011858795769512653,
0.007382330484688282,
-0.01153191365301609,
-0.03879667446017265,
0.11420461535453796,
0.1267974078655243,
0.04311187192797661,
-0.09238563477993011,
-0.05849878489971161,
-0.05231386050581932,
-0.00426795007660985,
-0.03391709551215172,
0.0438355877995491,
0.04375245049595833,
0.005665211472660303,
-0.040143050253391266,
0.04581458866596222,
-0.05516925826668739,
-0.11203218996524811,
0.07652445137500763,
-0.19699913263320923,
-0.16539321839809418,
-0.014001240953803062,
0.11310721933841705,
0.0025809365324676037,
0.06255871802568436,
-0.035301461815834045,
0.0029852669686079025,
0.07492106407880783,
-0.019685596227645874,
-0.10561972111463547,
-0.08641495555639267,
0.10494919121265411,
-0.09878258407115936,
0.2080308496952057,
-0.04764431715011597,
0.08082011342048645,
0.13163194060325623,
0.06171361356973648,
-0.07792674750089645,
0.05546736717224121,
0.046208642423152924,
-0.08617596328258514,
0.023688608780503273,
0.062452975660562515,
-0.03991987556219101,
0.08779530227184296,
0.040466416627168655,
-0.1253318339586258,
0.016839031130075455,
-0.06032399460673332,
-0.05585119128227234,
-0.04642823338508606,
-0.05153457820415497,
-0.05665866285562515,
0.13139477372169495,
0.2133215367794037,
-0.031571898609399796,
0.0024720521178096533,
-0.07086780667304993,
0.015202036127448082,
0.06361646205186844,
0.026404738426208496,
-0.07216431200504303,
-0.2240443378686905,
0.020994609221816063,
0.038018934428691864,
-0.018097441643476486,
-0.20608720183372498,
-0.08532199263572693,
0.00163992156740278,
-0.07467888295650482,
-0.09555074572563171,
0.08136676251888275,
0.08450563997030258,
0.05208488181233406,
-0.06333828717470169,
-0.04981011524796486,
-0.08341134339570999,
0.14359278976917267,
-0.1465044915676117,
-0.10043717175722122
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-whole-word-masking-ner-conll2003
This model is a fine-tuned version of [bert-large-uncased-whole-word-masking](https://huggingface.co/bert-large-uncased-whole-word-masking) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0592
- Precision: 0.9527
- Recall: 0.9569
- F1: 0.9548
- Accuracy: 0.9887
## Model description
More information needed
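As a hedged illustration only (the card itself does not prescribe a usage pattern), the checkpoint can also be loaded explicitly with the Auto classes; the example sentence below is an assumption.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Minimal sketch: explicit loading of the fine-tuned checkpoint for token classification.
model_id = "andi611/bert-large-uncased-whole-word-masking-ner-conll2003"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

inputs = tokenizer("Angela Merkel met Emmanuel Macron in Berlin.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring label id of each token back to its label string.
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, predictions):
    print(token, model.config.id2label[int(pred)])
```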
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 4
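The listing above maps roughly onto `transformers.TrainingArguments` as sketched below. This is an approximate reconstruction, not the exact training script: `output_dir` and the single-device reading of `train_batch_size: 4` are assumptions.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported configuration; output_dir and the
# single-device assumption are illustrative, not taken from the card.
training_args = TrainingArguments(
    output_dir="bert-large-uncased-whole-word-masking-ner-conll2003",
    learning_rate=2e-5,
    per_device_train_batch_size=4,   # reported train_batch_size: 4
    per_device_eval_batch_size=1,    # reported eval_batch_size: 1
    gradient_accumulation_steps=4,   # effective train batch size: 4 * 4 = 16
    num_train_epochs=4,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```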
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.4071 | 1.0 | 877 | 0.0584 | 0.9306 | 0.9418 | 0.9362 | 0.9851 |
| 0.0482 | 2.0 | 1754 | 0.0594 | 0.9362 | 0.9491 | 0.9426 | 0.9863 |
| 0.0217 | 3.0 | 2631 | 0.0550 | 0.9479 | 0.9584 | 0.9531 | 0.9885 |
| 0.0103 | 4.0 | 3508 | 0.0592 | 0.9527 | 0.9569 | 0.9548 | 0.9887 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "bert-large-uncased-whole-word-masking-ner-conll2003", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.9886888970085945}}]}]}
|
token-classification
|
andi611/bert-large-uncased-whole-word-masking-ner-conll2003
|
[
"transformers",
"pytorch",
"bert",
"token-classification",
"generated_from_trainer",
"en",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bert #token-classification #generated_from_trainer #en #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
bert-large-uncased-whole-word-masking-ner-conll2003
===================================================
This model is a fine-tuned version of bert-large-uncased-whole-word-masking on the conll2003 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0592
* Precision: 0.9527
* Recall: 0.9569
* F1: 0.9548
* Accuracy: 0.9887
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 4
* eval\_batch\_size: 1
* seed: 42
* gradient\_accumulation\_steps: 4
* total\_train\_batch\_size: 16
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 4
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #token-classification #generated_from_trainer #en #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
61,
144,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #bert #token-classification #generated_from_trainer #en #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 1\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 4### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.13086137175559998,
0.12758231163024902,
-0.003223198000341654,
0.10877197980880737,
0.16075709462165833,
0.026853308081626892,
0.132377490401268,
0.11807205528020859,
-0.09700378775596619,
0.061506275087594986,
0.11960465461015701,
0.11477649211883545,
0.03192802891135216,
0.13325561583042145,
-0.029190029948949814,
-0.2831364870071411,
0.00032414039014838636,
0.022182712331414223,
-0.1470089554786682,
0.14213122427463531,
0.07633117586374283,
-0.13276349008083344,
0.06548850983381271,
0.010201343335211277,
-0.17624051868915558,
-0.023117853328585625,
-0.009477581828832626,
-0.05502614751458168,
0.14943857491016388,
0.015480543486773968,
0.1372818648815155,
0.037178706377744675,
0.121534563601017,
-0.16719122231006622,
-0.0018795270007103682,
0.06178533658385277,
0.016966484487056732,
0.10465181618928909,
0.07689473778009415,
0.02213110215961933,
0.09591421484947205,
-0.0833420604467392,
0.06372537463903427,
0.013402346521615982,
-0.11770595610141754,
-0.24509096145629883,
-0.09557090699672699,
0.06012039631605148,
0.09589768946170807,
0.08056923747062683,
0.00023903932014945894,
0.11515229195356369,
-0.10379913449287415,
0.08633562922477722,
0.25603872537612915,
-0.2937174439430237,
-0.0762997642159462,
0.00034506418160162866,
0.019139930605888367,
0.033633459359407425,
-0.1111149862408638,
-0.03875943273305893,
0.037371374666690826,
0.038112346082925797,
0.1386760175228119,
-0.008180846460163593,
-0.03673022985458374,
0.010000207461416721,
-0.15340061485767365,
-0.04990508034825325,
0.0914127379655838,
0.03908054903149605,
-0.030731437727808952,
-0.03971480950713158,
-0.07118509709835052,
-0.22406312823295593,
-0.01696905493736267,
0.011027081869542599,
0.03966672718524933,
-0.06708663702011108,
-0.10344333201646805,
0.04008660838007927,
-0.084748275578022,
-0.07033834606409073,
-0.015687037259340286,
0.1474992334842682,
0.053915198892354965,
0.008965501561760902,
0.010314044542610645,
0.13705205917358398,
0.07065293192863464,
-0.16497214138507843,
0.032097987830638885,
0.03397750109434128,
-0.0609767809510231,
-0.028141383081674576,
-0.041289642453193665,
0.03783440217375755,
-0.00029659015126526356,
0.1549774408340454,
-0.045906729996204376,
0.025613542646169662,
0.06643570214509964,
0.021468279883265495,
-0.09990029036998749,
0.20028598606586456,
-0.09019370377063751,
-0.04023288935422897,
-0.016311852261424065,
0.11136560142040253,
0.007089615799486637,
-0.014410685747861862,
-0.09255272895097733,
0.0010902642970904708,
0.12233668565750122,
0.044098805636167526,
-0.04033353552222252,
0.04154204577207565,
-0.042048629373311996,
-0.025536473840475082,
0.06273015588521957,
-0.099820077419281,
0.03059195540845394,
0.016123134642839432,
-0.0999373272061348,
-0.012810131534934044,
-0.0008829034049995244,
-0.0076839504763484,
-0.015653669834136963,
0.15125098824501038,
-0.09281820803880692,
0.007191911339759827,
-0.08667564392089844,
-0.11267713457345963,
0.01604335755109787,
-0.08480020612478256,
0.0037813959643244743,
-0.07518823444843292,
-0.12350068241357803,
-0.03338078036904335,
0.04236835241317749,
-0.04646700993180275,
-0.07847090810537338,
-0.03247355669736862,
-0.09925222396850586,
0.022721463814377785,
-0.027805695310235023,
0.1467849463224411,
-0.0504358671605587,
0.12415222823619843,
0.03298813849687576,
0.06496748328208923,
0.019375352188944817,
0.05339168757200241,
-0.08788882195949554,
0.03567919135093689,
-0.16049903631210327,
0.03202064707875252,
-0.05000188574194908,
0.046598318964242935,
-0.10437016934156418,
-0.14113622903823853,
0.06398055702447891,
-0.025641581043601036,
0.10262025892734528,
0.09971476346254349,
-0.16162817180156708,
-0.082860067486763,
0.14048852026462555,
-0.07304634898900986,
-0.10634282231330872,
0.12297660857439041,
-0.029743004590272903,
-0.029325788840651512,
0.03649301081895828,
0.10782834887504578,
0.0744946151971817,
-0.07005305588245392,
-0.015240042470395565,
-0.0272978488355875,
0.0995822623372078,
-0.03392275422811508,
0.09656232595443726,
0.016795305535197258,
0.05487842485308647,
0.00889301672577858,
-0.07445861399173737,
0.06000206992030144,
-0.11546313017606735,
-0.09267689287662506,
-0.016674233600497246,
-0.08136676251888275,
0.06874454766511917,
0.07108443975448608,
0.054979629814624786,
-0.07695713639259338,
-0.10730747878551483,
0.05912109464406967,
0.09879684448242188,
-0.07584034651517868,
0.016295751556754112,
-0.05212755128741264,
0.0793527364730835,
-0.04751419648528099,
-0.023549262434244156,
-0.19287385046482086,
-0.06138232350349426,
0.023431194946169853,
-0.02216159552335739,
0.0005113647785037756,
0.009185673668980598,
0.0647844597697258,
0.09324541687965393,
-0.053145669400691986,
-0.05160227045416832,
-0.06808609515428543,
-0.011248236522078514,
-0.11779411882162094,
-0.2220432013273239,
-0.1000971645116806,
-0.009159100241959095,
0.12873560190200806,
-0.20231112837791443,
0.027081547304987907,
-0.0007308508502319455,
0.11291114240884781,
0.008908359333872795,
-0.015476181171834469,
-0.039468131959438324,
0.09051408618688583,
-0.03251963108778,
-0.06846760213375092,
0.060089319944381714,
-0.009396348148584366,
-0.08650219440460205,
-0.03230011835694313,
-0.0967894047498703,
0.17207196354866028,
0.111414335668087,
-0.032090310007333755,
-0.09837299585342407,
-0.008787554688751698,
-0.08247056603431702,
-0.04744845628738403,
-0.038005296140909195,
0.023836597800254822,
0.16039985418319702,
0.029770109802484512,
0.14332912862300873,
-0.06818811595439911,
-0.058442309498786926,
0.03200751170516014,
0.004406333900988102,
0.03330940380692482,
0.13194163143634796,
0.11236974596977234,
-0.06401892006397247,
0.13628464937210083,
0.13462090492248535,
-0.09027255326509476,
0.0995921790599823,
-0.062299810349941254,
-0.08785224705934525,
-0.037123795598745346,
-0.020828453823924065,
0.0157247856259346,
0.12486545741558075,
-0.0864831954240799,
-0.015578744933009148,
0.02742302231490612,
0.014880782924592495,
-0.0006043529137969017,
-0.2275277078151703,
-0.02901996113359928,
0.03356908634305,
-0.04209114611148834,
-0.034561511129140854,
-0.030530868098139763,
0.01251173298805952,
0.11566559970378876,
0.006698284763842821,
-0.10500578582286835,
0.005598769057542086,
0.0015252677258104086,
-0.06259845942258835,
0.21001586318016052,
-0.06492839008569717,
-0.1203097328543663,
-0.12362683564424515,
-0.027735404670238495,
-0.04065953195095062,
-0.010878757573664188,
0.03342917934060097,
-0.08523770421743393,
-0.025277117267251015,
-0.04082631692290306,
0.017027193680405617,
-0.0025117897894233465,
0.04782038554549217,
0.003079197136685252,
-0.0018449152121320367,
0.06105490401387215,
-0.10020236670970917,
0.014321926981210709,
-0.04128677397966385,
-0.06227581202983856,
0.04730284586548805,
0.06112552806735039,
0.10299631208181381,
0.15216103196144104,
-0.011669549159705639,
0.0034807054325938225,
-0.028566090390086174,
0.20525366067886353,
-0.08234965056180954,
-0.02172439731657505,
0.127719447016716,
-0.01251975167542696,
0.060484688729047775,
0.14107200503349304,
0.07513486593961716,
-0.07407249510288239,
-0.0017973955255001783,
0.024090342223644257,
-0.03020578809082508,
-0.2208320051431656,
-0.03291179612278938,
-0.04703453183174133,
-0.0018083045724779367,
0.12791143357753754,
0.015816915780305862,
-0.010503063909709454,
0.04424092546105385,
-0.0009906962513923645,
0.059686169028282166,
-0.03990809619426727,
0.05712781101465225,
0.09291185438632965,
0.052876006811857224,
0.13485169410705566,
-0.01624072529375553,
-0.0636749342083931,
0.02146155573427677,
-0.022241737693548203,
0.23978178203105927,
-0.05203676596283913,
0.14225642383098602,
0.03519663214683533,
0.17819051444530487,
0.00992607045918703,
0.10978960990905762,
0.008426163345575333,
-0.018938589841127396,
0.0016271580243483186,
-0.04429122805595398,
-0.02987690269947052,
0.004142237361520529,
-0.004636471159756184,
0.053894370794296265,
-0.13926316797733307,
-0.00021682905207853764,
0.03329971432685852,
0.30926331877708435,
0.07446588575839996,
-0.33388057351112366,
-0.10581836104393005,
-0.02575162798166275,
-0.036472439765930176,
-0.013735915534198284,
0.009272419847548008,
0.13027586042881012,
-0.10392757505178452,
0.028852423653006554,
-0.07813407480716705,
0.08164714276790619,
-0.0638367235660553,
0.026038477197289467,
0.12740890681743622,
0.09182725846767426,
0.006350736133754253,
0.06579441577196121,
-0.2511981129646301,
0.2975043058395386,
-0.00643105199560523,
0.05250416696071625,
-0.056295305490493774,
0.015423661097884178,
0.02484925650060177,
0.029574736952781677,
0.0398251973092556,
-0.010036949999630451,
-0.02472541853785515,
-0.22453227639198303,
-0.1077534407377243,
0.026011671870946884,
0.09936999529600143,
-0.0717383325099945,
0.1236022487282753,
-0.030431384220719337,
-0.02566649205982685,
0.0496714785695076,
-0.03990743309259415,
-0.02473169006407261,
-0.08574005216360092,
0.021392805501818657,
-0.023771237581968307,
0.0051732417196035385,
-0.0769306868314743,
-0.14265301823616028,
-0.08627667278051376,
0.1531534045934677,
-0.06129780039191246,
-0.04857486113905907,
-0.13028764724731445,
0.11473532021045685,
0.15444965660572052,
-0.08848796039819717,
0.03642354905605316,
0.011817209422588348,
0.08076883107423782,
0.031644273549318314,
-0.042046885937452316,
0.10066771507263184,
-0.06837015599012375,
-0.2423340529203415,
-0.0621693916618824,
0.13234171271324158,
0.03042224608361721,
0.07910323143005371,
-0.03130408748984337,
0.02980968728661537,
-0.016604050993919373,
-0.09003864973783493,
0.019296374171972275,
-0.014587176963686943,
0.0728735402226448,
0.01549119595438242,
-0.03453307971358299,
0.042107999324798584,
-0.06335590779781342,
-0.02485049143433571,
0.14096224308013916,
0.2766227424144745,
-0.1067032665014267,
0.006090708542615175,
0.05179422348737717,
-0.03712186589837074,
-0.1677529662847519,
-0.000557681021746248,
0.11523867398500443,
0.024309560656547546,
0.009765357710421085,
-0.18194511532783508,
0.09151018410921097,
0.1150408685207367,
-0.017862936481833458,
0.0982343927025795,
-0.3279707133769989,
-0.12933790683746338,
0.07545699924230576,
0.11695929616689682,
0.0479874350130558,
-0.15232209861278534,
-0.029979176819324493,
0.011891837231814861,
-0.12152289599180222,
0.10629608482122421,
-0.01663324236869812,
0.1159534826874733,
-0.034592967480421066,
0.04728785902261734,
0.011510324664413929,
-0.06400108337402344,
0.12594613432884216,
0.01737283729016781,
0.07995440065860748,
-0.01724465936422348,
-0.03897339105606079,
0.039162181317806244,
-0.05413748323917389,
-0.007035848684608936,
-0.06238776445388794,
0.032537903636693954,
-0.08358125388622284,
-0.007121722213923931,
-0.10843154042959213,
0.021737853065133095,
-0.05759778246283531,
-0.05261063948273659,
-0.018032710999250412,
0.03796660527586937,
0.04808548465371132,
-0.02356172353029251,
0.12510308623313904,
0.0382123664021492,
0.14690148830413818,
0.10179365426301956,
0.0462469719350338,
-0.032988015562295914,
-0.09376071393489838,
-0.023364447057247162,
-0.0013744481839239597,
0.06042021885514259,
-0.119642473757267,
0.01668158918619156,
0.17547865211963654,
0.05906333029270172,
0.11983213573694229,
0.07683596014976501,
-0.026401283219456673,
-0.00425813440233469,
0.05723775923252106,
-0.13416916131973267,
-0.05599505081772804,
0.002981371246278286,
-0.03880425542593002,
-0.14549659192562103,
0.04099631682038307,
0.10591783374547958,
-0.07759542018175125,
-0.02263695001602173,
0.010589845478534698,
0.007326159626245499,
-0.028534788638353348,
0.25050923228263855,
0.05079340189695358,
0.08993678539991379,
-0.11454179137945175,
0.05395171046257019,
0.06701802462339401,
-0.11022696644067764,
-0.008471623994410038,
0.07855899631977081,
-0.08848802000284195,
-0.013518826104700565,
0.06113484501838684,
0.08347813040018082,
-0.05915674567222595,
-0.026014577597379684,
-0.15337000787258148,
-0.1272629052400589,
0.08262896537780762,
0.13772441446781158,
0.09160108119249344,
0.041849274188280106,
-0.036958176642656326,
0.01888927072286606,
-0.12565739452838898,
0.10340459644794464,
0.06572828441858292,
0.08968115597963333,
-0.14479400217533112,
0.19156311452388763,
0.007576257921755314,
0.04965025559067726,
-0.010775853879749775,
0.03165025636553764,
-0.11363466084003448,
0.0019399344455450773,
-0.09938511252403259,
-0.03695519268512726,
-0.035321999341249466,
-0.005075239576399326,
-0.010452882386744022,
-0.05924508348107338,
-0.045473869889974594,
0.012879735790193081,
-0.10976839065551758,
-0.038221463561058044,
0.008940733037889004,
0.03325049579143524,
-0.12724050879478455,
-0.026138078421354294,
0.033209968358278275,
-0.10697505623102188,
0.09589886665344238,
0.0384289026260376,
0.0411255843937397,
0.03799678757786751,
-0.05839292332530022,
-0.009313645772635937,
0.041547052562236786,
0.009679276496171951,
0.08294909447431564,
-0.1154734343290329,
0.001913006417453289,
-0.03592297062277794,
0.03344763070344925,
0.011153427883982658,
0.05378078296780586,
-0.13997508585453033,
0.001006767968647182,
-0.011331585235893726,
-0.054986581206321716,
-0.059184737503528595,
0.032921623438596725,
0.0756203681230545,
0.01941845938563347,
0.1786886602640152,
-0.07111680507659912,
0.06315258890390396,
-0.2330135703086853,
-0.017880726605653763,
-0.02549012191593647,
-0.09674141556024551,
-0.09889725595712662,
-0.0376080758869648,
0.08656726032495499,
-0.05973995476961136,
0.08606286346912384,
-0.00871890690177679,
0.10018746554851532,
0.02717006951570511,
-0.03647265210747719,
-0.00013766787014901638,
0.05016889423131943,
0.15245544910430908,
0.044183529913425446,
-0.031408458948135376,
0.05910961329936981,
0.03479597344994545,
0.06893709301948547,
0.10561536252498627,
0.20290634036064148,
0.10494104027748108,
0.0305769219994545,
0.07097182422876358,
0.053150564432144165,
-0.10891397297382355,
-0.19730471074581146,
0.025069275870919228,
-0.04902008920907974,
0.12341377884149551,
-0.01942841149866581,
0.20265768468379974,
0.05936870351433754,
-0.18297438323497772,
0.03609949350357056,
-0.045257192105054855,
-0.08595649898052216,
-0.09253261983394623,
-0.006032410077750683,
-0.05907576158642769,
-0.13036786019802094,
0.0006496115238405764,
-0.10703299194574356,
0.01772329956293106,
0.1280590146780014,
0.022490108385682106,
0.010644677095115185,
0.11978116631507874,
0.06819386035203934,
0.037315599620342255,
0.059781841933727264,
0.04333550110459328,
-0.0040489803068339825,
-0.048359859734773636,
-0.06713040173053741,
-0.014589796774089336,
-0.020970603451132774,
0.045106299221515656,
-0.0703667476773262,
-0.07742787897586823,
0.05640881508588791,
0.01203037891536951,
-0.10433872044086456,
0.02769334614276886,
-0.008471556939184666,
0.07974351197481155,
0.0505690760910511,
0.0034206921700388193,
0.027308795601129532,
-0.03542611747980118,
0.2254183441400528,
-0.09448400884866714,
-0.06349202245473862,
-0.12029321491718292,
0.2978479862213135,
0.02010149136185646,
-0.04015207663178444,
0.05208372697234154,
-0.06570378690958023,
-0.027584604918956757,
0.18388842046260834,
0.17635320127010345,
-0.069898821413517,
-0.020001964643597603,
0.029128938913345337,
-0.01724705472588539,
-0.03296085447072983,
0.1140410304069519,
0.1293696016073227,
0.09281585365533829,
-0.09788930416107178,
-0.0476655475795269,
-0.06369034200906754,
-0.032928746193647385,
-0.024490490555763245,
0.053372371941804886,
0.033511705696582794,
0.0036914460361003876,
-0.04671214520931244,
0.05433107540011406,
-0.038604214787483215,
-0.14174702763557434,
0.0864841490983963,
-0.2258460819721222,
-0.2005012184381485,
-0.017167773097753525,
0.08676864951848984,
0.019753802567720413,
0.07447110861539841,
-0.007569951005280018,
-0.020087655633687973,
0.10113400965929031,
-0.01697939820587635,
-0.05902287736535072,
-0.08769740909337997,
0.10491789132356644,
-0.07693088054656982,
0.18835851550102234,
-0.04924768581986427,
0.07011386752128601,
0.12208525836467743,
0.06618648022413254,
-0.10161244124174118,
0.01645801030099392,
0.08181536942720413,
-0.11804841458797455,
0.024252742528915405,
0.12684005498886108,
-0.027726903557777405,
0.0663517490029335,
0.030188968405127525,
-0.12809568643569946,
0.004649425856769085,
-0.05625930055975914,
-0.03278783708810806,
-0.04305766150355339,
-0.035910461097955704,
-0.01618948206305504,
0.13701020181179047,
0.2515277862548828,
-0.049698129296302795,
-0.001725926878862083,
-0.061944372951984406,
-0.004152909852564335,
0.05707891657948494,
0.07610752433538437,
-0.054677899926900864,
-0.24958793818950653,
0.02246621623635292,
0.0115961330011487,
-0.003137777792289853,
-0.19468954205513,
-0.08382895588874817,
0.037122439593076706,
-0.08234450966119766,
-0.10400180518627167,
0.09667276591062546,
0.017254572361707687,
0.06735926121473312,
-0.04423590749502182,
-0.035289913415908813,
-0.0825853943824768,
0.15560074150562286,
-0.18784530460834503,
-0.08526969701051712
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat
This model is a fine-tuned version of [deepset/bert-large-uncased-whole-word-masking-squad2](https://huggingface.co/deepset/bert-large-uncased-whole-word-masking-squad2) on the squad_v2 and the conll2003 datasets.
## Model description
More information needed
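Pending a fuller description, here is a minimal, hedged sketch of extractive question answering with this checkpoint; the question/context pair is an illustrative assumption, not an evaluation example from the card.

```python
from transformers import pipeline

# Minimal inference sketch for extractive QA with this fine-tuned checkpoint.
qa = pipeline(
    "question-answering",
    model="andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat",
)

result = qa(
    question="Which office did Barack Obama hold?",
    context="Barack Obama served as the 44th President of the United States.",
)
print(result)  # dict with "score", "start", "end", and "answer"
```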
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "cc-by-4.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2", "conll2003"], "model_index": [{"name": "bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "args": "conll2003"}}, {"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003"}}]}]}
|
question-answering
|
andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"en",
"dataset:squad_v2",
"dataset:conll2003",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us
|
# bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat
This model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
63,
94,
6,
12,
8,
3,
113,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us \n# bert-large-uncased-whole-word-masking-squad2-with-ner-Pistherea-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08382361382246017,
0.16701960563659668,
-0.004730961285531521,
0.08389342576265335,
0.09339051693677902,
0.028627080842852592,
0.09868642687797546,
0.14797408878803253,
-0.06956807523965836,
0.13014066219329834,
0.0761680155992508,
0.02055191621184349,
0.07977265119552612,
0.11338075250387192,
-0.016616130247712135,
-0.22543011605739594,
0.017314746975898743,
-0.038028616458177567,
-0.07717114686965942,
0.09908067435026169,
0.11016077548265457,
-0.1136094331741333,
0.0428207702934742,
-0.004908315371721983,
-0.11174914985895157,
0.010890097357332706,
-0.04302696883678436,
-0.04565005376935005,
0.05999671667814255,
0.0277227945625782,
0.08507326245307922,
0.013719172216951847,
0.0960039421916008,
-0.20702703297138214,
0.0005321431090123951,
0.06889989227056503,
0.014131838455796242,
0.08339796960353851,
0.08006337285041809,
0.02204078435897827,
0.0064668357372283936,
-0.16859407722949982,
0.08333807438611984,
0.03262710943818092,
-0.10104042291641235,
-0.17746351659297943,
-0.09093139320611954,
0.07395777851343155,
0.1032794937491417,
0.08226925879716873,
-0.026754992082715034,
0.1317594349384308,
-0.08990310877561569,
0.05086486041545868,
0.16951517760753632,
-0.30995455384254456,
-0.06345276534557343,
0.0430026613175869,
0.028167959302663803,
0.03153378888964653,
-0.1148453950881958,
-0.006465986371040344,
0.017846716567873955,
0.010014624334871769,
0.06929201632738113,
-0.0003667984565254301,
-0.024297097697854042,
0.006720595061779022,
-0.1196947917342186,
-0.047434091567993164,
0.11810632795095444,
0.060847021639347076,
-0.024483568966388702,
-0.15838073194026947,
-0.03362349048256874,
-0.07430164515972137,
-0.030910490080714226,
-0.01830037496984005,
0.031848590821027756,
-0.05628595128655434,
-0.06051301211118698,
-0.01899830996990204,
-0.06939244270324707,
-0.030548084527254105,
0.012114440090954304,
0.06918448209762573,
0.04243720695376396,
0.009156309068202972,
-0.021949592977762222,
0.07986710965633392,
-0.015699489042162895,
-0.16042689979076385,
-0.02213374339044094,
-0.01514999195933342,
-0.13323761522769928,
-0.04815071448683739,
-0.03768380358815193,
-0.022412564605474472,
0.02604963257908821,
0.16888082027435303,
0.005684494506567717,
0.06130162253975868,
0.018980171531438828,
-0.0040246304124593735,
0.0038141028489917517,
0.17664964497089386,
-0.05506720766425133,
-0.07527181506156921,
-0.0131431445479393,
0.11287655681371689,
0.0191364623606205,
-0.01879473775625229,
-0.09085060656070709,
-0.024974310770630836,
0.04060874134302139,
0.05821023881435394,
0.004154276568442583,
0.0232858769595623,
-0.06033443287014961,
-0.05470931902527809,
0.03804141283035278,
-0.1373666524887085,
0.06175229698419571,
0.0232833344489336,
-0.050499994307756424,
-0.0031757389660924673,
-0.019705474376678467,
0.0035479553043842316,
-0.05177481845021248,
0.0802253782749176,
-0.06145799532532692,
-0.02938263863325119,
-0.05837244167923927,
-0.07459988445043564,
0.025996167212724686,
-0.052655622363090515,
-0.030712513253092766,
-0.05583011358976364,
-0.12291574478149414,
-0.046988341957330704,
0.030355319380760193,
-0.07802955061197281,
-0.06789587438106537,
-0.0351809486746788,
-0.04415666311979294,
0.026933614164590836,
0.005704643204808235,
0.10536704957485199,
-0.02402595616877079,
0.05149006471037865,
0.009246240369975567,
0.048212990164756775,
0.07150322198867798,
0.04046661779284477,
-0.07305477559566498,
0.0447135791182518,
-0.12383351475000381,
0.0946444496512413,
-0.08689474314451218,
0.03781089186668396,
-0.14194397628307343,
-0.10525445640087128,
0.012860003858804703,
-0.03007081151008606,
0.06188954785466194,
0.11273842304944992,
-0.1641770899295807,
-0.0401708222925663,
0.18143245577812195,
-0.035331498831510544,
-0.09221550077199936,
0.1124785766005516,
-0.041648704558610916,
-0.026322610676288605,
0.04356585070490837,
0.1351703405380249,
0.15581493079662323,
-0.1129373162984848,
-0.02600572630763054,
0.0022967136465013027,
0.08817701786756516,
0.07428709417581558,
0.09260240197181702,
-0.04487938433885574,
0.05540718510746956,
0.009868894703686237,
-0.05850651487708092,
-0.025518188253045082,
-0.0434797927737236,
-0.0987318828701973,
-0.02406957931816578,
-0.06227245554327965,
0.07892824709415436,
0.00445389561355114,
0.022258365526795387,
-0.06443304568529129,
-0.12814386188983917,
0.005568202119320631,
0.10438192635774612,
-0.050723493099212646,
0.011345487087965012,
-0.09152047336101532,
0.06650621443986893,
-0.027076495811343193,
-0.005487477406859398,
-0.17078426480293274,
-0.12735946476459503,
0.06881753355264664,
-0.06925375759601593,
0.03064158372581005,
0.0066145178861916065,
0.056620001792907715,
0.043139323592185974,
-0.039513085037469864,
-0.024326825514435768,
-0.05527689307928085,
-0.03178209438920021,
-0.08297531306743622,
-0.1551150232553482,
-0.08090288192033768,
-0.02990199439227581,
0.14542730152606964,
-0.176949143409729,
0.0027188018430024385,
0.012096595019102097,
0.12313386797904968,
0.013802132569253445,
-0.05478380620479584,
0.020087705925107002,
0.0191364623606205,
0.010454545728862286,
-0.07584448903799057,
0.03395690396428108,
-0.011018021032214165,
-0.10343096405267715,
-0.05791907384991646,
-0.12023117393255234,
0.014572493731975555,
0.060110896825790405,
0.060550790280103683,
-0.07287417352199554,
-0.048483431339263916,
-0.053518038243055344,
-0.02258232608437538,
-0.06488072872161865,
-0.029832160100340843,
0.20306332409381866,
0.0264134481549263,
0.10159839689731598,
-0.07698676735162735,
-0.0871187150478363,
-0.0028080169577151537,
0.033571403473615646,
-0.0165372584015131,
0.0966813862323761,
0.02757960371673107,
-0.12816208600997925,
0.07745049148797989,
0.1353069245815277,
-0.005592626985162497,
0.09064852446317673,
-0.05505356192588806,
-0.09374129772186279,
-0.05576653406023979,
0.02720694988965988,
0.016987264156341553,
0.07046803086996078,
-0.07922035455703735,
0.004101067781448364,
0.06363363564014435,
0.010622150264680386,
-0.00039860763354226947,
-0.12750975787639618,
-0.0025269135367125273,
0.03776967152953148,
-0.04211398586630821,
-0.006747255567461252,
-0.017629921436309814,
0.041982535272836685,
0.08315365016460419,
0.052516769617795944,
-0.0029606365133076906,
0.006207540165632963,
-0.05249224603176117,
-0.07380998879671097,
0.17079000174999237,
-0.10034219920635223,
-0.19250117242336273,
-0.13630016148090363,
-0.012551571242511272,
-0.05671726539731026,
-0.020671142265200615,
0.0035346003714948893,
-0.0807705745100975,
-0.07388928532600403,
-0.08542974293231964,
-0.00915546715259552,
-0.036200761795043945,
0.007982509210705757,
0.060584064573049545,
0.016547566279768944,
0.08692964911460876,
-0.12479612231254578,
0.016190651804208755,
0.0006054967525415123,
-0.07796366512775421,
-0.002069022273644805,
0.06829742342233658,
0.08192914724349976,
0.11105476319789886,
0.01242773700505495,
0.020312415435910225,
-0.03822573274374008,
0.23527652025222778,
-0.09984923154115677,
0.010519578121602535,
0.10843458026647568,
-0.021031975746154785,
0.06894276291131973,
0.15481117367744446,
0.0348837710916996,
-0.07422146946191788,
0.019755909219384193,
0.04753762111067772,
-0.007841857150197029,
-0.24006947875022888,
-0.03630421310663223,
-0.054527461528778076,
-0.03707582876086235,
0.13576345145702362,
0.038868553936481476,
-0.01959853619337082,
0.041435956954956055,
-0.04134887456893921,
0.025522544980049133,
0.009871250949800014,
0.0797174945473671,
0.08781496435403824,
0.04379028081893921,
0.10383019596338272,
-0.02158556692302227,
-0.03059050440788269,
0.057479534298181534,
0.014146318659186363,
0.2136339396238327,
-0.017748957499861717,
0.17462244629859924,
0.023673180490732193,
0.14655235409736633,
-0.013277017511427402,
0.028388692066073418,
0.00810828898102045,
-0.0034110511187464,
0.011149498634040356,
-0.07459040731191635,
0.007791052106767893,
0.0339217372238636,
0.07088743895292282,
0.034456778317689896,
-0.07478035986423492,
0.0037801614962518215,
0.040707238018512726,
0.2468530237674713,
0.11160722374916077,
-0.25105124711990356,
-0.05775338411331177,
0.03464610502123833,
-0.027338435873389244,
-0.05513631924986839,
0.010456645861268044,
0.10801532864570618,
-0.13432297110557556,
0.07103944569826126,
-0.05463121086359024,
0.08590345829725266,
-0.04055396467447281,
0.011601259000599384,
0.07649500668048859,
0.0972234234213829,
0.012406555935740471,
0.09688079357147217,
-0.15137234330177307,
0.18224841356277466,
0.019128138199448586,
0.05621388182044029,
-0.0681556686758995,
0.05721645429730415,
-0.012843440286815166,
-0.009828601032495499,
0.1374296396970749,
0.0009774016216397285,
-0.02687671221792698,
-0.1339506208896637,
-0.11933181434869766,
0.015045171603560448,
0.12832526862621307,
-0.08649888634681702,
0.0980919748544693,
-0.0382893942296505,
-0.03106558881700039,
0.018473466858267784,
0.021049924194812775,
-0.0806753933429718,
-0.16100694239139557,
0.0497257225215435,
-0.0067079514265060425,
-0.04098521173000336,
-0.07606819272041321,
-0.0728951245546341,
-0.12694789469242096,
0.21850427985191345,
-0.043473981320858,
-0.0486564077436924,
-0.13160260021686554,
0.0973014384508133,
0.14868943393230438,
-0.07249375432729721,
0.028544051572680473,
-0.0019113916205242276,
0.1495879739522934,
0.017876271158456802,
-0.07693206518888474,
0.06513195484876633,
-0.04409684240818024,
-0.1658228039741516,
-0.06191543489694595,
0.1820092499256134,
0.00568276597186923,
0.06006007269024849,
0.0062446254305541515,
0.03240557014942169,
0.018982309848070145,
-0.08068285882472992,
0.05082361400127411,
0.07876145839691162,
0.07373784482479095,
0.06126052513718605,
-0.056670255959033966,
0.013829323463141918,
-0.05653126910328865,
0.030557069927453995,
0.1749054193496704,
0.2494068741798401,
-0.09923946112394333,
0.10126814246177673,
0.03990732878446579,
-0.055282603949308395,
-0.1907043606042862,
0.024777058511972427,
0.11270825564861298,
0.04708492383360863,
0.04796666279435158,
-0.16980427503585815,
0.09485284984111786,
0.08589395880699158,
-0.012527015060186386,
0.02114553563296795,
-0.31961292028427124,
-0.1266287863254547,
0.055395644158124924,
0.03152154013514519,
-0.03216782584786415,
-0.1340569108724594,
-0.07003014534711838,
-0.0358339287340641,
-0.11282101273536682,
0.07314327359199524,
-0.035305436700582504,
0.0964503288269043,
0.0013207000447437167,
0.05148915946483612,
0.0404813326895237,
-0.03902853652834892,
0.15109902620315552,
0.0349728986620903,
0.047485146671533585,
-0.04710642620921135,
-0.010900221765041351,
0.08762438595294952,
-0.07136379182338715,
0.03282932564616203,
-0.037261828780174255,
0.06287059187889099,
-0.1491014063358307,
-0.024007383733987808,
-0.06390564143657684,
0.01762213557958603,
-0.06861107051372528,
-0.04005306586623192,
-0.04588496312499046,
0.05793285742402077,
0.1016669049859047,
-0.012268594466149807,
0.06746060401201248,
0.017047086730599403,
0.06305143237113953,
0.08407590538263321,
0.10353629291057587,
0.06479999423027039,
-0.17463469505310059,
-0.012742482125759125,
-0.010840452276170254,
0.047837842255830765,
-0.11102848500013351,
0.04582991451025009,
0.1285569667816162,
0.044943153858184814,
0.12626995146274567,
0.027243562042713165,
-0.06206154450774193,
-0.029977833852171898,
0.04122738912701607,
-0.089301697909832,
-0.1701001673936844,
-0.009994574822485447,
0.014170329086482525,
-0.2092338502407074,
-0.030120985582470894,
0.1013718992471695,
0.00798726174980402,
-0.0315009169280529,
0.016562849283218384,
0.0301834549754858,
-0.003407564479857683,
0.15069036185741425,
0.019198037683963776,
0.08739439398050308,
-0.0915001630783081,
0.10284419357776642,
0.11039657890796661,
-0.06349825859069824,
0.0382765494287014,
0.08154646307229996,
-0.06687147170305252,
-0.021104907616972923,
0.04630311578512192,
0.08974023908376694,
0.047232143580913544,
0.008203885518014431,
-0.039108503609895706,
-0.11128349602222443,
0.06177926063537598,
0.04379228502511978,
0.039442028850317,
-0.017752310261130333,
-0.02240746095776558,
-0.006120382808148861,
-0.09213050454854965,
0.11743227392435074,
0.04622683674097061,
0.05660529062151909,
-0.10559214651584625,
0.08500484377145767,
-0.031200241297483444,
0.02742147631943226,
-0.004313570912927389,
0.007852849550545216,
-0.10087740421295166,
-0.020122796297073364,
-0.12334807962179184,
0.01078328862786293,
-0.016742948442697525,
-0.0009726047283038497,
-0.016934288665652275,
-0.02327827550470829,
-0.02115923911333084,
0.01034946646541357,
-0.07383321225643158,
-0.07802752405405045,
-0.008271260187029839,
0.0644061490893364,
-0.13541218638420105,
-0.01695692352950573,
0.0315159410238266,
-0.11258820444345474,
0.10956232249736786,
0.02356298826634884,
0.023830970749258995,
-0.002365032909438014,
-0.08232978731393814,
-0.05105726420879364,
0.010139606893062592,
0.04958765208721161,
0.07051415741443634,
-0.11789336055517197,
-0.002486696233972907,
-0.0434979610145092,
-0.019678302109241486,
0.021526329219341278,
0.0006236730259843171,
-0.12504851818084717,
0.0036061378195881844,
-0.04096541553735733,
-0.04699602350592613,
-0.05835770070552826,
0.03009607084095478,
0.04688410088419914,
0.013082063756883144,
0.15090052783489227,
-0.0729135051369667,
0.09059220552444458,
-0.2146083116531372,
-0.028193984180688858,
-0.004327031318098307,
-0.015242553316056728,
-0.04834528639912605,
-0.035055164247751236,
0.08913570642471313,
-0.05510846525430679,
0.0913676917552948,
-0.031075064092874527,
0.06457468122243881,
0.0352398119866848,
-0.013765191659331322,
0.022239146754145622,
0.037080004811286926,
0.15152618288993835,
0.07234658300876617,
-0.043852515518665314,
0.08721870183944702,
-0.03593398630619049,
0.04931056872010231,
0.07670585811138153,
0.15198472142219543,
0.17450827360153198,
0.055808551609516144,
0.03468630090355873,
0.07662907242774963,
-0.09802348911762238,
-0.1284894049167633,
0.12535513937473297,
-0.0500028058886528,
0.12006748467683792,
-0.05314765125513077,
0.11798391491174698,
0.05900159850716591,
-0.1921883225440979,
0.06252540647983551,
-0.06201505288481712,
-0.10859473049640656,
-0.09762789309024811,
-0.1259530484676361,
-0.09483799338340759,
-0.06493274867534637,
0.029259704053401947,
-0.12044945359230042,
0.02858785353600979,
0.056779615581035614,
0.020531052723526955,
0.0015921501908451319,
0.13562607765197754,
-0.056873004883527756,
0.012252163141965866,
0.09766567498445511,
0.03006802313029766,
0.014398905448615551,
0.012623691000044346,
-0.026454340666532516,
0.04356788098812103,
0.029814787209033966,
0.08704408258199692,
-0.02497219666838646,
0.024789951741695404,
0.03071259707212448,
-0.024829505011439323,
-0.09870611876249313,
0.01020668726414442,
-0.0036235274747014046,
0.02696932666003704,
0.06324701756238937,
0.057493679225444794,
0.014078271575272083,
-0.05697404220700264,
0.20024608075618744,
-0.07046596705913544,
-0.0744544118642807,
-0.1429322510957718,
0.12463994324207306,
0.0011752963764593005,
0.0008987821056507528,
0.055925022810697556,
-0.1132129430770874,
-0.0230952650308609,
0.1677791327238083,
0.19420088827610016,
-0.07196831703186035,
-0.010098411701619625,
0.020947478711605072,
-0.0033623238559812307,
-0.031071649864315987,
0.09049709141254425,
0.08146768063306808,
0.05687352269887924,
-0.04690088331699371,
0.009941293857991695,
0.02158992365002632,
-0.043897103518247604,
-0.07632943987846375,
0.07655838876962662,
0.025694595649838448,
0.01591521129012108,
-0.03274717554450035,
0.07733027637004852,
-0.009925796650350094,
-0.15673035383224487,
0.049039751291275024,
-0.18153025209903717,
-0.1955389529466629,
-0.023305807262659073,
0.06665756553411484,
0.014489945955574512,
0.07278808951377869,
-0.005193981342017651,
-0.01987791433930397,
0.12650784850120544,
-0.01601080223917961,
-0.033549822866916656,
-0.07042476534843445,
0.09877529740333557,
-0.10106482356786728,
0.20868530869483948,
0.01492700632661581,
0.08069651573896408,
0.1131771132349968,
0.012557986192405224,
-0.12523560225963593,
-0.0050506917759776115,
0.09823722392320633,
-0.08408451825380325,
0.030397959053516388,
0.1623843014240265,
-0.033890850841999054,
0.10599423944950104,
0.07360553741455078,
-0.07803338021039963,
-0.008439007215201855,
-0.04983356595039368,
0.0036625966895371675,
-0.0993524044752121,
0.020861830562353134,
-0.05995779484510422,
0.15316584706306458,
0.2321539968252182,
-0.055721674114465714,
-0.02303030528128147,
-0.030779266729950905,
0.008311798796057701,
0.030362077057361603,
0.1076221764087677,
-0.03784291446208954,
-0.16890080273151398,
0.02809622697532177,
-0.00722943851724267,
0.06328369677066803,
-0.2298295497894287,
-0.09600318223237991,
0.05647728592157364,
-0.02208806946873665,
-0.0663573294878006,
0.13440975546836853,
0.04830644279718399,
0.019543703645467758,
-0.03471202403306961,
-0.1177414208650589,
-0.03482211381196976,
0.11746986210346222,
-0.14899122714996338,
-0.0520961731672287
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat
This model is a fine-tuned version of [deepset/bert-large-uncased-whole-word-masking-squad2](https://huggingface.co/deepset/bert-large-uncased-whole-word-masking-squad2) on the squad_v2 and the conll2003 datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
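The hyperparameters above map directly onto a `transformers.TrainingArguments` object. The sketch below is illustrative only and is not the author's original training script; the output directory is a placeholder and data preparation is omitted.

```python
# Illustrative mapping of the listed hyperparameters onto TrainingArguments.
# Not the original script: output_dir is a placeholder and data prep is omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",            # placeholder path (assumption)
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=4,   # 4 x 4 = total train batch size of 16
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```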
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "cc-by-4.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2", "conll2003"], "model_index": [{"name": "bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "args": "conll2003"}}, {"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003"}}]}]}
|
question-answering
|
andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"en",
"dataset:squad_v2",
"dataset:conll2003",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us
|
# bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat
This model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
63,
94,
6,
12,
8,
3,
113,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us \n# bert-large-uncased-whole-word-masking-squad2-with-ner-Pwhatisthe-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08392086625099182,
0.16476665437221527,
-0.0047838096506893635,
0.08378773927688599,
0.09389138966798782,
0.02902323380112648,
0.09609395265579224,
0.1493975818157196,
-0.06745526194572449,
0.12951746582984924,
0.07545721530914307,
0.021021494641900063,
0.0800667554140091,
0.1115594357252121,
-0.014923932030797005,
-0.22572609782218933,
0.017265981063246727,
-0.037917885929346085,
-0.07899647951126099,
0.09949525445699692,
0.10923869907855988,
-0.11405451595783234,
0.04299692437052727,
-0.005426784511655569,
-0.1107376366853714,
0.009747914969921112,
-0.045428548008203506,
-0.04574453458189964,
0.05942253768444061,
0.02777741849422455,
0.08421570807695389,
0.012293347157537937,
0.09559745341539383,
-0.20938877761363983,
0.0003630137362051755,
0.0693182498216629,
0.012998659163713455,
0.08361692726612091,
0.08198875188827515,
0.020403092727065086,
0.006800874602049589,
-0.16786669194698334,
0.08159520477056503,
0.03260861709713936,
-0.10128364711999893,
-0.17810793220996857,
-0.08995475620031357,
0.0768946036696434,
0.1023813858628273,
0.08264783769845963,
-0.02699834480881691,
0.13355138897895813,
-0.09153713285923004,
0.0525721050798893,
0.172189861536026,
-0.31066372990608215,
-0.06286168843507767,
0.04485069587826729,
0.030532103031873703,
0.029463710263371468,
-0.11616706103086472,
-0.007657675538212061,
0.018476493656635284,
0.010872776620090008,
0.06792894750833511,
-0.001234453171491623,
-0.0224847923964262,
0.007641775067895651,
-0.11988954246044159,
-0.04946482181549072,
0.11892625689506531,
0.06110841780900955,
-0.02373475953936577,
-0.1565927267074585,
-0.03285377472639084,
-0.07533228397369385,
-0.03044574335217476,
-0.016862601041793823,
0.031717438250780106,
-0.05640815570950508,
-0.06080207973718643,
-0.016940845176577568,
-0.07059280574321747,
-0.030876711010932922,
0.012277538888156414,
0.07007637619972229,
0.04204152897000313,
0.009146108292043209,
-0.022794270887970924,
0.0805150642991066,
-0.014700855128467083,
-0.15958023071289062,
-0.02297714352607727,
-0.01495005190372467,
-0.1334076076745987,
-0.04816390946507454,
-0.036815542727708817,
-0.019591933116316795,
0.02655092254281044,
0.167746901512146,
0.0006578990141861141,
0.06089756637811661,
0.02093561552464962,
-0.002259503584355116,
0.005953394807875156,
0.17705444991588593,
-0.05482979118824005,
-0.07482223957777023,
-0.015608790330588818,
0.11278475821018219,
0.01666538044810295,
-0.018620122224092484,
-0.09038287401199341,
-0.024594362825155258,
0.04295153543353081,
0.05845927074551582,
0.005309798754751682,
0.025004487484693527,
-0.06049228087067604,
-0.05524508282542229,
0.03972728177905083,
-0.1372368335723877,
0.06185513734817505,
0.024229010567069054,
-0.05041743814945221,
-0.00506877014413476,
-0.01847807504236698,
0.002389735309407115,
-0.05365126207470894,
0.08136603236198425,
-0.06033647060394287,
-0.029162652790546417,
-0.05912120267748833,
-0.075718954205513,
0.02587568201124668,
-0.050961777567863464,
-0.031095702201128006,
-0.05737287923693657,
-0.12521612644195557,
-0.04800170660018921,
0.029574787244200706,
-0.07585059106349945,
-0.06703374534845352,
-0.034945063292980194,
-0.046125851571559906,
0.024516835808753967,
0.005793862976133823,
0.10643056780099869,
-0.024808648973703384,
0.052176229655742645,
0.008127427659928799,
0.04825187847018242,
0.0723198875784874,
0.040696751326322556,
-0.07455721497535706,
0.04546103999018669,
-0.12115801870822906,
0.09504729509353638,
-0.08740374445915222,
0.038805510848760605,
-0.1416768580675125,
-0.10633745044469833,
0.010882650502026081,
-0.02914852648973465,
0.06111591309309006,
0.11218138039112091,
-0.16420678794384003,
-0.04069045931100845,
0.18313831090927124,
-0.03538275510072708,
-0.09325990825891495,
0.11059045046567917,
-0.04060499370098114,
-0.023781009018421173,
0.04340382665395737,
0.13753755390644073,
0.15709728002548218,
-0.11275812238454819,
-0.027741853147745132,
0.0034498570021241903,
0.0870935395359993,
0.07523559033870697,
0.0910855233669281,
-0.0457226000726223,
0.054179541766643524,
0.010545765049755573,
-0.0595126673579216,
-0.025650519877672195,
-0.04409640282392502,
-0.09860162436962128,
-0.02372354082763195,
-0.06261363625526428,
0.07930529862642288,
0.005589788779616356,
0.02138238027691841,
-0.0639461949467659,
-0.12756145000457764,
0.0066102915443480015,
0.10436899960041046,
-0.05086445435881615,
0.010882209986448288,
-0.09171073138713837,
0.06712660193443298,
-0.0258083064109087,
-0.003403108799830079,
-0.17279452085494995,
-0.12609903514385223,
0.06830132007598877,
-0.0676363930106163,
0.029240187257528305,
0.006328579504042864,
0.057563625276088715,
0.04231050983071327,
-0.038493815809488297,
-0.026182791218161583,
-0.056496936827898026,
-0.03224750608205795,
-0.08333062380552292,
-0.15327811241149902,
-0.08163854479789734,
-0.03031274676322937,
0.14381526410579681,
-0.1771162897348404,
0.003072531195357442,
0.01250454131513834,
0.12298540771007538,
0.013642193749547005,
-0.05465959757566452,
0.02109972946345806,
0.019987422972917557,
0.011062903329730034,
-0.07549896091222763,
0.03469275310635567,
-0.010316061787307262,
-0.10183576494455338,
-0.05698220059275627,
-0.1201435774564743,
0.013007020577788353,
0.060577549040317535,
0.05938237905502319,
-0.07297592610120773,
-0.049485813826322556,
-0.0546722412109375,
-0.022846948355436325,
-0.06343872100114822,
-0.030695151537656784,
0.2024984359741211,
0.026444032788276672,
0.10135260224342346,
-0.07664554566144943,
-0.08741056174039841,
-0.0028173571918159723,
0.03377082943916321,
-0.015256603248417377,
0.09829366207122803,
0.025776969268918037,
-0.12329099327325821,
0.0773712545633316,
0.13511347770690918,
-0.0046914564445614815,
0.09157777577638626,
-0.05589580908417702,
-0.0939437597990036,
-0.05507143586874008,
0.02877841889858246,
0.01672687567770481,
0.06994612514972687,
-0.08090358972549438,
0.0040252250619232655,
0.06418786197900772,
0.010511249303817749,
-0.0003152831341139972,
-0.12798941135406494,
-0.002444497775286436,
0.03757858648896217,
-0.041836172342300415,
-0.009017427451908588,
-0.01840033195912838,
0.0422603115439415,
0.08312929421663284,
0.052044086158275604,
-0.0007927314727567136,
0.005421189125627279,
-0.052996646612882614,
-0.0751500129699707,
0.1724521964788437,
-0.09983125329017639,
-0.1951417475938797,
-0.1347304731607437,
-0.012007426470518112,
-0.05499880388379097,
-0.021455971524119377,
0.004280693829059601,
-0.08104966580867767,
-0.07361150532960892,
-0.0853990837931633,
-0.00876986887305975,
-0.03687857836484909,
0.007014873903244734,
0.06061346083879471,
0.016703380271792412,
0.08636137843132019,
-0.12477035820484161,
0.01637374609708786,
0.0012818514369428158,
-0.07876674085855484,
-0.00013766791380476207,
0.06795059889554977,
0.08303886651992798,
0.1113462820649147,
0.012169091030955315,
0.020816955715417862,
-0.038900867104530334,
0.23616327345371246,
-0.10049320012331009,
0.009607002139091492,
0.10528209805488586,
-0.02050386369228363,
0.06869597733020782,
0.15327121317386627,
0.03447164595127106,
-0.07310578972101212,
0.01994924433529377,
0.04698820412158966,
-0.007024446967989206,
-0.24093545973300934,
-0.03662051260471344,
-0.054362062364816666,
-0.034171659499406815,
0.13665959239006042,
0.039636868983507156,
-0.018229486420750618,
0.040971510112285614,
-0.04144899547100067,
0.026309147477149963,
0.008171365596354008,
0.0795837789773941,
0.08384397625923157,
0.043704237788915634,
0.10227066278457642,
-0.0215153805911541,
-0.029495729133486748,
0.05699790641665459,
0.014530980959534645,
0.21324113011360168,
-0.01987055130302906,
0.17301654815673828,
0.02419336512684822,
0.14784957468509674,
-0.01397672202438116,
0.02917860634624958,
0.008338119834661484,
-0.002783706644549966,
0.012356667779386044,
-0.07467765361070633,
0.0077242921106517315,
0.033421244472265244,
0.07152432203292847,
0.03457896411418915,
-0.0755513608455658,
0.0021953911054879427,
0.041969891637563705,
0.24931657314300537,
0.10910551995038986,
-0.2495512068271637,
-0.05926031991839409,
0.03369869291782379,
-0.02578585408627987,
-0.055044759064912796,
0.010909752920269966,
0.10928492248058319,
-0.13644219934940338,
0.0700596272945404,
-0.05285205692052841,
0.08619418740272522,
-0.04015365615487099,
0.011779041960835457,
0.0774650052189827,
0.0960279181599617,
0.012682687491178513,
0.09813791513442993,
-0.15251831710338593,
0.1808435618877411,
0.01940106973052025,
0.056903593242168427,
-0.06858789175748825,
0.05688709765672684,
-0.013076666742563248,
-0.010037931613624096,
0.1382584422826767,
0.0024767164140939713,
-0.02495286613702774,
-0.13336625695228577,
-0.11662618070840836,
0.014222811907529831,
0.12817740440368652,
-0.08643513917922974,
0.09802339971065521,
-0.03741496801376343,
-0.031554095447063446,
0.01722264103591442,
0.019579874351620674,
-0.0804973840713501,
-0.15843318402767181,
0.04952835664153099,
-0.0070677632465958595,
-0.0391460619866848,
-0.07673242688179016,
-0.07224602997303009,
-0.1288662850856781,
0.21975725889205933,
-0.04193217679858208,
-0.04903511703014374,
-0.12999564409255981,
0.09730691462755203,
0.1474074274301529,
-0.07257653772830963,
0.027191825211048126,
-0.0012490164954215288,
0.14923542737960815,
0.018724925816059113,
-0.07680468261241913,
0.06445561349391937,
-0.04532525688409805,
-0.16713930666446686,
-0.06110053136944771,
0.18278658390045166,
0.005051890853792429,
0.06036404147744179,
0.0065629384480416775,
0.031839560717344284,
0.017619984224438667,
-0.0815853700041771,
0.05156424269080162,
0.07961466163396835,
0.07344217598438263,
0.06314798444509506,
-0.057762179523706436,
0.01194173377007246,
-0.05708365514874458,
0.030835846439003944,
0.1756202131509781,
0.24822667241096497,
-0.0996706411242485,
0.10066124051809311,
0.04158918932080269,
-0.05564696341753006,
-0.19357483088970184,
0.024907050654292107,
0.1141439825296402,
0.04633817449212074,
0.04818475618958473,
-0.16905540227890015,
0.09482915699481964,
0.08554942905902863,
-0.012352660298347473,
0.02339489758014679,
-0.32092374563217163,
-0.12627245485782623,
0.05511407554149628,
0.031284149736166,
-0.03671058267354965,
-0.13562585413455963,
-0.07105114310979843,
-0.035563625395298004,
-0.11512848734855652,
0.07386644184589386,
-0.03452930599451065,
0.09549961984157562,
-0.0003118961176369339,
0.05199694633483887,
0.04070820286870003,
-0.03986719623208046,
0.1505967676639557,
0.03589324653148651,
0.04658167064189911,
-0.048079535365104675,
-0.010712673887610435,
0.08824198693037033,
-0.07085306197404861,
0.03332282230257988,
-0.03767336159944534,
0.06368394196033478,
-0.1494937390089035,
-0.022724002599716187,
-0.06687626987695694,
0.018515128642320633,
-0.0687391459941864,
-0.04032185673713684,
-0.04566097632050514,
0.05823924019932747,
0.10182174295186996,
-0.010847102850675583,
0.06415858119726181,
0.016741234809160233,
0.06423820555210114,
0.08547305315732956,
0.10270677506923676,
0.06416758894920349,
-0.17718736827373505,
-0.013162433169782162,
-0.011114162392914295,
0.04811733588576317,
-0.11152401566505432,
0.04459809884428978,
0.12896513938903809,
0.04676579311490059,
0.1269715577363968,
0.02778487093746662,
-0.06412941217422485,
-0.03135315701365471,
0.04172665998339653,
-0.09007344394922256,
-0.16976484656333923,
-0.010639090090990067,
0.01574619859457016,
-0.20985251665115356,
-0.03037312626838684,
0.09908700734376907,
0.00823144055902958,
-0.031381260603666306,
0.016357824206352234,
0.03001408278942108,
-0.004036851227283478,
0.15121065080165863,
0.020493797957897186,
0.08852788060903549,
-0.09165515005588531,
0.10326578468084335,
0.10999549180269241,
-0.06321893632411957,
0.037744466215372086,
0.08270356059074402,
-0.06711598485708237,
-0.02250785380601883,
0.04604349285364151,
0.08849851042032242,
0.05019406974315643,
0.006953543517738581,
-0.0364539660513401,
-0.11049344390630722,
0.06120377406477928,
0.042397383600473404,
0.03977551311254501,
-0.018416989594697952,
-0.02093719132244587,
-0.005949529819190502,
-0.09266120195388794,
0.11740089952945709,
0.04586830362677574,
0.05652003362774849,
-0.10368142277002335,
0.08641665428876877,
-0.03121718391776085,
0.029822204262018204,
-0.0033740848302841187,
0.006463115569204092,
-0.10176122188568115,
-0.018985051661729813,
-0.12172628939151764,
0.01131315715610981,
-0.01819225586950779,
-0.0018454005476087332,
-0.017108669504523277,
-0.022778062149882317,
-0.02037697099149227,
0.010493692941963673,
-0.07430464029312134,
-0.0792517215013504,
-0.008184969425201416,
0.06460704654455185,
-0.13625790178775787,
-0.017927948385477066,
0.03146781772375107,
-0.1119338795542717,
0.10874705016613007,
0.02438502013683319,
0.023344798013567924,
-0.0009658104972913861,
-0.08187498897314072,
-0.05101398006081581,
0.011187740601599216,
0.04931741952896118,
0.07087069749832153,
-0.11491277813911438,
-0.002627386013045907,
-0.0436708889901638,
-0.021184075623750687,
0.021724462509155273,
0.001399795524775982,
-0.12603983283042908,
0.003754605771973729,
-0.04072842001914978,
-0.04612201452255249,
-0.059509772807359695,
0.03040768764913082,
0.04449747875332832,
0.01323345210403204,
0.15042535960674286,
-0.07304558157920837,
0.09068986773490906,
-0.21505072712898254,
-0.028558718040585518,
-0.004209835547953844,
-0.014578024856746197,
-0.046749331057071686,
-0.03354620933532715,
0.08949942141771317,
-0.056226618587970734,
0.09123051911592484,
-0.031132129952311516,
0.0635971799492836,
0.03375120460987091,
-0.014692457392811775,
0.021364042535424232,
0.03475029021501541,
0.15136152505874634,
0.07147962599992752,
-0.044349655508995056,
0.08635374158620834,
-0.03438606485724449,
0.04819197952747345,
0.07605711370706558,
0.15202732384204865,
0.1734887808561325,
0.05658959969878197,
0.03481091186404228,
0.07630275189876556,
-0.09833703190088272,
-0.12739752233028412,
0.12298385798931122,
-0.050026826560497284,
0.12156324833631516,
-0.05373984947800636,
0.11905618011951447,
0.05916118994355202,
-0.19207781553268433,
0.06313812732696533,
-0.06276766210794449,
-0.1080419048666954,
-0.09604352712631226,
-0.12559178471565247,
-0.09359316527843475,
-0.06432569772005081,
0.02959495596587658,
-0.12131854891777039,
0.027616219595074654,
0.05614764615893364,
0.019713889807462692,
0.002202570205554366,
0.13683457672595978,
-0.05710810422897339,
0.011425203643739223,
0.0975906103849411,
0.028718192130327225,
0.014665813185274601,
0.01199840847402811,
-0.024634813889861107,
0.04334995895624161,
0.02961854450404644,
0.08770830184221268,
-0.024259651079773903,
0.022710995748639107,
0.03166034817695618,
-0.02444012090563774,
-0.09865561127662659,
0.010303611867129803,
-0.0026865017134696245,
0.02467469498515129,
0.06261858344078064,
0.05894418805837631,
0.014568326063454151,
-0.056758731603622437,
0.20037400722503662,
-0.07089719921350479,
-0.07581422477960587,
-0.1424480527639389,
0.1258154958486557,
0.0014693039702251554,
-0.0009388219332322478,
0.05661553889513016,
-0.11460346728563309,
-0.022580118849873543,
0.17024704813957214,
0.1956709921360016,
-0.07072935998439789,
-0.009221983142197132,
0.021832868456840515,
-0.0028610925655812025,
-0.03050798363983631,
0.09054016321897507,
0.08315607160329819,
0.057970114052295685,
-0.047684088349342346,
0.010781259275972843,
0.020564652979373932,
-0.04412335902452469,
-0.07587072253227234,
0.07493432611227036,
0.0253252312541008,
0.014693801291286945,
-0.03282329812645912,
0.07698055356740952,
-0.011539693921804428,
-0.15798979997634888,
0.04918152093887329,
-0.18165308237075806,
-0.19431130588054657,
-0.02348574437201023,
0.06625114381313324,
0.01579209230840206,
0.07377693802118301,
-0.005485639441758394,
-0.01962212845683098,
0.1268387883901596,
-0.016346396878361702,
-0.03374963998794556,
-0.0702206939458847,
0.09781801700592041,
-0.10001227259635925,
0.20785562694072723,
0.014759625308215618,
0.08154677599668503,
0.11228720098733902,
0.01361896563321352,
-0.12487788498401642,
-0.004619354382157326,
0.09872216731309891,
-0.08340168744325638,
0.029103847220540047,
0.16166722774505615,
-0.03391394019126892,
0.10332215577363968,
0.07366202771663666,
-0.07774155586957932,
-0.008745426312088966,
-0.047681402415037155,
0.005564715247601271,
-0.10099884867668152,
0.02015293762087822,
-0.05891245976090431,
0.15247933566570282,
0.23259560763835907,
-0.054932188242673874,
-0.02398047223687172,
-0.03118138574063778,
0.008062656968832016,
0.0293397456407547,
0.1050477921962738,
-0.03951341658830643,
-0.16606321930885315,
0.02876204438507557,
-0.006805299315601587,
0.06433415412902832,
-0.22821703553199768,
-0.09503006190061569,
0.05808800086379051,
-0.02303064428269863,
-0.065370112657547,
0.13458490371704102,
0.046868640929460526,
0.020125269889831543,
-0.034754157066345215,
-0.11497049033641815,
-0.034105848520994186,
0.11859820038080215,
-0.1495378017425537,
-0.05393027141690254
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat
This model is a fine-tuned version of [deepset/bert-large-uncased-whole-word-masking-squad2](https://huggingface.co/deepset/bert-large-uncased-whole-word-masking-squad2) on the squad_v2 and the conll2003 datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
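A fine-tuned checkpoint with this row's model id can in principle be used through the standard question-answering pipeline. The snippet below is a generic usage sketch rather than documented behaviour from this card; the question and context strings are invented for illustration.

```python
# Generic inference sketch (not from the model card). The question and
# context strings are invented; only the model id comes from this row.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat",
)
result = qa(
    question="What is the organization?",
    context="The United Nations announced a new climate initiative on Monday.",
)
print(result["answer"], result["score"])
```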
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "cc-by-4.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2", "conll2003"], "model_index": [{"name": "bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "squad_v2", "type": "squad_v2", "args": "conll2003"}}, {"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003"}}]}]}
|
question-answering
|
andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"en",
"dataset:squad_v2",
"dataset:conll2003",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us
|
# bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat
This model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
63,
89,
6,
12,
8,
3,
113,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-conll2003 #license-cc-by-4.0 #endpoints_compatible #region-us \n# bert-large-uncased-whole-word-masking-squad2-with-ner-conll2003-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the conll2003 datasets.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.07409392297267914,
0.1541951298713684,
-0.004865502938628197,
0.06811883300542831,
0.08745847642421722,
0.04251257702708244,
0.09068093448877335,
0.1479615420103073,
-0.08567247539758682,
0.13084837794303894,
0.06266805529594421,
0.025621922686696053,
0.0870855450630188,
0.10209289193153381,
-0.011484388262033463,
-0.22694559395313263,
0.011150501668453217,
-0.03395001217722893,
-0.08260097354650497,
0.09275900572538376,
0.10763560980558395,
-0.1083013266324997,
0.03757748007774353,
-0.008597290143370628,
-0.0885661169886589,
0.016317181289196014,
-0.06156634911894798,
-0.03646581992506981,
0.056904375553131104,
0.023756524547934532,
0.07514549791812897,
0.010204901918768883,
0.11225298792123795,
-0.2433425486087799,
0.0011963070137426257,
0.0705975741147995,
0.018719738349318504,
0.07900350540876389,
0.0761948674917221,
0.028019823133945465,
0.03952682763338089,
-0.17593388259410858,
0.09292693436145782,
0.028177578002214432,
-0.08468452841043472,
-0.15315186977386475,
-0.08699755370616913,
0.06697297841310501,
0.1207340806722641,
0.0750410333275795,
-0.022009164094924927,
0.12081450968980789,
-0.10281986743211746,
0.036078423261642456,
0.1712692826986313,
-0.31159543991088867,
-0.0639386922121048,
0.058023128658533096,
0.02866826020181179,
0.03320188447833061,
-0.11414410918951035,
-0.011149088852107525,
0.023460596799850464,
0.004712154157459736,
0.06600804626941681,
0.0073311822488904,
-0.012616250663995743,
0.013269606977701187,
-0.12460170686244965,
-0.04019869863986969,
0.12663982808589935,
0.06750094145536423,
-0.009477760642766953,
-0.16343992948532104,
-0.025966960936784744,
-0.08678160607814789,
-0.034431636333465576,
-0.032321229577064514,
0.028737353160977364,
-0.053380291908979416,
-0.05901361629366875,
-0.013554595410823822,
-0.06570655107498169,
-0.03312567621469498,
0.012982496991753578,
0.06379639357328415,
0.05839689075946808,
-0.0008971458300948143,
-0.011469509452581406,
0.08854909241199493,
0.013418987393379211,
-0.1515112817287445,
-0.027850832790136337,
-0.008801273070275784,
-0.15478946268558502,
-0.05459810793399811,
-0.029754146933555603,
0.014314823783934116,
0.026914382353425026,
0.1706041693687439,
0.01628837175667286,
0.059930428862571716,
0.04411023482680321,
-0.011359812691807747,
0.011949632316827774,
0.14807730913162231,
-0.05175723135471344,
-0.08958615362644196,
-0.022433985024690628,
0.10568930953741074,
0.007820905186235905,
-0.012708908878266811,
-0.08058105409145355,
-0.026289453729987144,
0.04366171360015869,
0.06779469549655914,
-0.00478431535884738,
0.017682315781712532,
-0.05836499109864235,
-0.05524437129497528,
0.022304942831397057,
-0.13956891000270844,
0.06340144574642181,
0.028607847169041634,
-0.061126913875341415,
-0.005633939057588577,
-0.03434862941503525,
0.004258232191205025,
-0.05434010177850723,
0.07811418920755386,
-0.052882399410009384,
-0.0309340488165617,
-0.056292176246643066,
-0.05570386350154877,
0.030004151165485382,
-0.05391015484929085,
-0.026890171691775322,
-0.048104170709848404,
-0.13574552536010742,
-0.045969158411026,
0.03218108043074608,
-0.08498771488666534,
-0.06857595592737198,
-0.04427875205874443,
-0.028825711458921432,
0.011977610178291798,
-0.0041923727840185165,
0.09810982644557953,
-0.03363213315606117,
0.04316973686218262,
-0.0025169304572045803,
0.04594305157661438,
0.08999942243099213,
0.0411088652908802,
-0.06945780664682388,
0.03848779946565628,
-0.08525336533784866,
0.09111155569553375,
-0.08823179453611374,
0.023802723735570908,
-0.1458192765712738,
-0.10434535145759583,
-0.00668403459712863,
-0.032997164875268936,
0.05224526673555374,
0.10765881836414337,
-0.17681899666786194,
-0.041203293949365616,
0.1636434942483902,
-0.03500819951295853,
-0.09763550013303757,
0.10320691764354706,
-0.04211762174963951,
-0.020168090239167213,
0.042007800191640854,
0.11886759847402573,
0.1635238081216812,
-0.12435684353113174,
-0.030469903722405434,
-0.0003220336511731148,
0.08817996829748154,
0.07639193534851074,
0.08852863311767578,
-0.027357233688235283,
0.07013066858053207,
0.01402272004634142,
-0.06379697471857071,
-0.027594365179538727,
-0.050932876765728,
-0.10807255655527115,
-0.018035413697361946,
-0.06008263677358627,
0.060031626373529434,
0.009818587452173233,
0.02039424143731594,
-0.060406822711229324,
-0.11968948692083359,
0.02101854421198368,
0.12071165442466736,
-0.05154314637184143,
0.005459178239107132,
-0.08719801157712936,
0.0607762336730957,
-0.019240502268075943,
-0.00425797188654542,
-0.15573136508464813,
-0.14017602801322937,
0.0664266049861908,
-0.07241565734148026,
0.04556037113070488,
0.04024847596883774,
0.053056519478559494,
0.0424504280090332,
-0.03758537769317627,
-0.019741248339414597,
-0.061271097511053085,
-0.032593984156847,
-0.06671294569969177,
-0.16578000783920288,
-0.0768449679017067,
-0.02493678592145443,
0.15099386870861053,
-0.1905106008052826,
0.0003877172421198338,
-0.008852193132042885,
0.13368652760982513,
0.00341892521828413,
-0.056253980845212936,
0.00911691039800644,
0.022770525887608528,
0.0056620147079229355,
-0.07515183836221695,
0.036600738763809204,
-0.017846081405878067,
-0.1082788035273552,
-0.07456962019205093,
-0.11244586855173111,
0.01388547383248806,
0.05804380029439926,
0.07360897958278656,
-0.07248754799365997,
-0.06839172542095184,
-0.06409204006195068,
-0.02746409736573696,
-0.07306739687919617,
-0.01238254550844431,
0.21511723101139069,
0.024092402309179306,
0.09378200024366379,
-0.07514356821775436,
-0.10070707648992538,
-0.009248042479157448,
0.037486378103494644,
-0.020022183656692505,
0.09308918565511703,
0.06618142873048782,
-0.1083940714597702,
0.0701357051730156,
0.12367065250873566,
0.00012500208686105907,
0.10329243540763855,
-0.0622529610991478,
-0.09910540282726288,
-0.053441617637872696,
0.03236374258995056,
0.00864420272409916,
0.07993482053279877,
-0.08354877680540085,
0.006741639692336321,
0.05413609370589256,
0.011590820737183094,
0.014286746270954609,
-0.12757857143878937,
-0.007817517034709454,
0.04019583761692047,
-0.04556785896420479,
0.009762303903698921,
-0.025508427992463112,
0.04526883363723755,
0.08486589789390564,
0.04442805424332619,
-0.007152214180678129,
-0.010218971408903599,
-0.051025282591581345,
-0.0770428404211998,
0.16278176009655,
-0.09801023453474045,
-0.1736954301595688,
-0.11953120678663254,
0.008187184110283852,
-0.06212949380278587,
-0.032452549785375595,
-0.0029526937287300825,
-0.09228027611970901,
-0.06824605911970139,
-0.08728273957967758,
0.004587183240801096,
-0.04065795615315437,
0.006982127204537392,
0.07457383722066879,
0.03793032467365265,
0.07451659440994263,
-0.1271895468235016,
0.025642968714237213,
-0.009785369038581848,
-0.09324709326028824,
-0.013531523756682873,
0.06449269503355026,
0.08420421928167343,
0.12188352644443512,
0.012358716689050198,
0.023264769464731216,
-0.04366203024983406,
0.21106624603271484,
-0.09718705713748932,
0.003365900134667754,
0.09924568980932236,
-0.007688004523515701,
0.05336526408791542,
0.15270599722862244,
0.03561161458492279,
-0.0818173885345459,
0.01783190295100212,
0.058399394154548645,
-0.003249793080613017,
-0.2432658076286316,
-0.04957572743296623,
-0.05493411794304848,
-0.04591624066233635,
0.14349259436130524,
0.044853463768959045,
-0.03524570167064667,
0.04078296199440956,
-0.04474546015262604,
0.01994573511183262,
0.019876010715961456,
0.0800933688879013,
0.08686676621437073,
0.04462185502052307,
0.09385266900062561,
-0.021873824298381805,
-0.037325117737054825,
0.06226613372564316,
0.03152488172054291,
0.22474305331707,
-0.016913706436753273,
0.15903733670711517,
0.02603810466825962,
0.14477157592773438,
-0.028166143223643303,
0.024951057508587837,
0.007416142150759697,
-0.0047153267078101635,
0.01352942269295454,
-0.07236792892217636,
0.002500408561900258,
0.04285139590501785,
0.07388115674257278,
0.02346525341272354,
-0.06812489032745361,
0.021345095708966255,
0.045880239456892014,
0.21616342663764954,
0.1029760092496872,
-0.23865121603012085,
-0.050579983741045,
0.03778396546840668,
-0.03085619956254959,
-0.04100969433784485,
0.01857437752187252,
0.09660293906927109,
-0.11473467200994492,
0.07477455586194992,
-0.04729845002293587,
0.08584664016962051,
-0.0555306039750576,
0.0004838471650145948,
0.07735700160264969,
0.10813874006271362,
0.024235377088189125,
0.1034124568104744,
-0.1308770328760147,
0.17135721445083618,
0.013705874793231487,
0.07030748575925827,
-0.073569156229496,
0.05944744870066643,
-0.024928206577897072,
-0.01966332085430622,
0.1307077556848526,
-0.0031131200958043337,
-0.03819741681218147,
-0.13475097715854645,
-0.11376859992742538,
0.03110305778682232,
0.13004164397716522,
-0.08743475377559662,
0.10164463520050049,
-0.042564064264297485,
-0.021643487736582756,
0.021419575437903404,
-0.010205590166151524,
-0.10246508568525314,
-0.17559932172298431,
0.0495171993970871,
-0.023332888260483742,
-0.04595018923282623,
-0.07366864383220673,
-0.07833503186702728,
-0.13028466701507568,
0.23621384799480438,
-0.039474911987781525,
-0.03653651475906372,
-0.1269715577363968,
0.09968113899230957,
0.15567076206207275,
-0.06828427314758301,
0.018455643206834793,
0.010853578336536884,
0.15075373649597168,
0.01626206375658512,
-0.07941219210624695,
0.04672326147556305,
-0.047473035752773285,
-0.15810222923755646,
-0.06448490172624588,
0.1825326681137085,
0.014773556962609291,
0.061000291258096695,
0.014871963299810886,
0.020658079534769058,
0.02103826403617859,
-0.0900280624628067,
0.04207969829440117,
0.09103991836309433,
0.06881486624479294,
0.0528264045715332,
-0.07364066690206528,
0.02126466855406761,
-0.051158733665943146,
0.018654320389032364,
0.16357427835464478,
0.24215824902057648,
-0.09820299595594406,
0.10393892973661423,
0.033419545739889145,
-0.06268078833818436,
-0.17597843706607819,
0.013731878250837326,
0.1168917864561081,
0.03858450427651405,
0.04350657016038895,
-0.17916104197502136,
0.10288559645414352,
0.09488760679960251,
-0.0046270135790109634,
0.017888102680444717,
-0.3317503333091736,
-0.12319834530353546,
0.05337287113070488,
0.036728404462337494,
-0.04556214436888695,
-0.1292508840560913,
-0.06355428695678711,
-0.027571165934205055,
-0.11930426210165024,
0.0797622799873352,
-0.029857901856303215,
0.09705023467540741,
0.001644919509999454,
0.04545612260699272,
0.0336151048541069,
-0.03541668877005577,
0.15047231316566467,
0.046702269464731216,
0.04576808586716652,
-0.04470113292336464,
0.00870673917233944,
0.07539810985326767,
-0.07003210484981537,
0.038636840879917145,
-0.03995233774185181,
0.060336001217365265,
-0.15627582371234894,
-0.03144460916519165,
-0.04668070375919342,
0.026764966547489166,
-0.06784101575613022,
-0.04503889009356499,
-0.058596715331077576,
0.06559823453426361,
0.10798608511686325,
-0.017329351976513863,
0.07272002846002579,
0.031257595866918564,
0.06705430895090103,
0.04995156079530716,
0.10693852603435516,
0.05587012320756912,
-0.16487576067447662,
-0.018007619306445122,
-0.0027645898517221212,
0.04078541323542595,
-0.10099190473556519,
0.04141996428370476,
0.13792623579502106,
0.043779145926237106,
0.1275443583726883,
0.027335025370121002,
-0.05716471001505852,
-0.027551209554076195,
0.04878140985965729,
-0.07756668329238892,
-0.17530657351016998,
0.000146872567711398,
-0.0012279261136427522,
-0.19265669584274292,
-0.014056045562028885,
0.09237858653068542,
0.0018670944264158607,
-0.028052810579538345,
0.014610846526920795,
0.033276986330747604,
0.0010727948974817991,
0.15874361991882324,
0.008762664161622524,
0.08106369525194168,
-0.08443225920200348,
0.10590022802352905,
0.10143406689167023,
-0.06057608127593994,
0.03715907782316208,
0.07142089307308197,
-0.0651673674583435,
-0.019190827384591103,
0.04855840280652046,
0.09500940144062042,
0.04821949452161789,
0.0024870894849300385,
-0.0412633903324604,
-0.10494466125965118,
0.06796730309724808,
0.018357474356889725,
0.03471202403306961,
-0.017340783029794693,
-0.022022852674126625,
0.010286211967468262,
-0.09417195618152618,
0.1106492429971695,
0.053091492503881454,
0.055400848388671875,
-0.10627467930316925,
0.07246437668800354,
-0.018176427111029625,
0.02274620160460472,
-0.004991211928427219,
0.00761167798191309,
-0.0941057875752449,
-0.02336440421640873,
-0.11177078634500504,
-0.005101276561617851,
-0.04362831264734268,
0.007298656273633242,
-0.011604891158640385,
-0.02091154083609581,
-0.03075389191508293,
0.01346976961940527,
-0.06408370286226273,
-0.07492256164550781,
-0.018372371792793274,
0.08115962147712708,
-0.14217258989810944,
-0.003115977393463254,
0.030804548412561417,
-0.11747356504201889,
0.10366123914718628,
0.015043864026665688,
0.037744224071502686,
0.0021462782751768827,
-0.081902876496315,
-0.04450897499918938,
0.01662350259721279,
0.061076704412698746,
0.06774187833070755,
-0.12858423590660095,
0.0044280909933149815,
-0.0342550165951252,
-0.01785746030509472,
0.017947912216186523,
-0.014024301432073116,
-0.12527944147586823,
-0.00470989802852273,
-0.06397627294063568,
-0.05279592424631119,
-0.049627818167209625,
0.02877074107527733,
0.053686559200286865,
0.00965348444879055,
0.14419780671596527,
-0.06375978887081146,
0.08796379715204239,
-0.2254253625869751,
-0.033704668283462524,
0.0014533810317516327,
0.0012987651862204075,
-0.04049791395664215,
-0.03635329380631447,
0.08631400763988495,
-0.04896955564618111,
0.11242236942052841,
-0.030449453741312027,
0.054707061499357224,
0.03254859149456024,
-0.013717942871153355,
0.024954121559858322,
0.03512394800782204,
0.16570012271404266,
0.08924026787281036,
-0.03441215306520462,
0.0727229192852974,
-0.04203658178448677,
0.04613586142659187,
0.044196922332048416,
0.14854633808135986,
0.16559094190597534,
0.03970131278038025,
0.02692262828350067,
0.08622092753648758,
-0.12316744774580002,
-0.12241984158754349,
0.15358930826187134,
-0.04593899846076965,
0.10657210648059845,
-0.042396385222673416,
0.12146007269620895,
0.07946901023387909,
-0.20092537999153137,
0.05673595145344734,
-0.05483992397785187,
-0.10750780254602432,
-0.10010666400194168,
-0.12281964719295502,
-0.09341920912265778,
-0.08777696639299393,
0.042210038751363754,
-0.12480541318655014,
0.012934853322803974,
0.046577975153923035,
0.03378401696681976,
0.014144191518425941,
0.12263107299804688,
-0.037681397050619125,
0.011867892928421497,
0.08286377042531967,
0.035075005143880844,
0.012730232439935207,
-0.005264212843030691,
-0.024744395166635513,
0.036914706230163574,
0.014119486324489117,
0.0934833362698555,
-0.03258046507835388,
0.024835243821144104,
0.03286164999008179,
-0.01631634309887886,
-0.08404890447854996,
0.007967507466673851,
-0.01726892776787281,
0.01961859129369259,
0.07139621675014496,
0.05671892315149307,
0.012056690640747547,
-0.05799271538853645,
0.20421357452869415,
-0.057496327906847,
-0.07853686809539795,
-0.14755330979824066,
0.11408458650112152,
0.013467183336615562,
0.007949317805469036,
0.06490729749202728,
-0.10459345579147339,
-0.025040199980139732,
0.16737188398838043,
0.18974971771240234,
-0.060514047741889954,
-0.011883093044161797,
0.026448959484696388,
-0.0037165884859859943,
-0.04103650152683258,
0.0828389897942543,
0.08488880097866058,
0.06437541544437408,
-0.04187878966331482,
-0.011662494391202927,
0.013940547592937946,
-0.04296622425317764,
-0.06922441720962524,
0.075626440346241,
0.031972579658031464,
0.025841156020760536,
-0.02813662961125374,
0.08072654902935028,
-0.008936706930398941,
-0.16258515417575836,
0.05232377350330353,
-0.17282964289188385,
-0.20000247657299042,
-0.025907566770911217,
0.08466627448797226,
-0.0024051570799201727,
0.06479749828577042,
-0.0021449660416692495,
-0.027952663600444794,
0.13217048346996307,
-0.01119296159595251,
-0.04739179462194443,
-0.07080885022878647,
0.09700280427932739,
-0.07017570734024048,
0.20531682670116425,
0.009250517003238201,
0.07958552986383438,
0.10532983392477036,
0.012293711304664612,
-0.1283527910709381,
0.004713371861726046,
0.1045674979686737,
-0.07213147729635239,
0.03193537890911102,
0.15308982133865356,
-0.03769105300307274,
0.11584389209747314,
0.08794897794723511,
-0.07240365445613861,
-0.010209600441157818,
-0.04604924097657204,
-0.0033181270118802786,
-0.10055836290121078,
0.039564549922943115,
-0.05091727897524834,
0.15419046580791473,
0.21942460536956787,
-0.053171541541814804,
-0.021843502297997475,
-0.039896417409181595,
0.019835680723190308,
0.0263515692204237,
0.12939733266830444,
-0.02536938339471817,
-0.16862408816814423,
0.019930856302380562,
-0.005360858514904976,
0.06295019388198853,
-0.20214852690696716,
-0.10362869501113892,
0.07747408747673035,
-0.03150463104248047,
-0.061331991106271744,
0.14181959629058838,
0.057228609919548035,
0.020628010854125023,
-0.03900686278939247,
-0.1517868936061859,
-0.02970067597925663,
0.10632959008216858,
-0.15187405049800873,
-0.042919907718896866
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat
This model is a fine-tuned version of [deepset/bert-large-uncased-whole-word-masking-squad2](https://huggingface.co/deepset/bert-large-uncased-whole-word-masking-squad2) on the squad_v2 and the mit_movie datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
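As with the other cards in this dump, training starts from the deepset checkpoint named above; with train_batch_size 4 and gradient_accumulation_steps 4, the effective batch size is 4 x 4 = 16, matching the listed total_train_batch_size. The snippet below is a hedged sketch of that starting point only and omits the squad_v2 / mit_movie preprocessing.

```python
# Hedged sketch: load the base checkpoint named in this card as the
# fine-tuning starting point. Dataset preprocessing is intentionally omitted.
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

base = "deepset/bert-large-uncased-whole-word-masking-squad2"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForQuestionAnswering.from_pretrained(base)
```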
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
{"language": ["en"], "license": "cc-by-4.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2", "mit_movie"], "model_index": [{"name": "bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "squad_v2", "type": "squad_v2"}}, {"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "mit_movie", "type": "mit_movie"}}]}]}
|
question-answering
|
andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"en",
"dataset:squad_v2",
"dataset:mit_movie",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_movie #license-cc-by-4.0 #endpoints_compatible #region-us
|
# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat
This model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_movie datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_movie datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_movie #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_movie datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
63,
89,
6,
12,
8,
3,
113,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_movie #license-cc-by-4.0 #endpoints_compatible #region-us \n# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-movie-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_movie datasets.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 4\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08856895565986633,
0.15482348203659058,
-0.004202490672469139,
0.06460810452699661,
0.10622350871562958,
0.04760512337088585,
0.11063801497220993,
0.17398785054683685,
-0.05831211432814598,
0.1182786151766777,
0.03282558172941208,
0.020227178931236267,
0.09116993844509125,
0.12143826484680176,
0.008984020911157131,
-0.24633513391017914,
-0.012737799435853958,
-0.04360702261328697,
-0.0693482905626297,
0.09423508495092392,
0.11318302154541016,
-0.10517799854278564,
0.040861908346414566,
-0.025327524170279503,
-0.11353884637355804,
0.02060570754110813,
-0.038277704268693924,
-0.03927076235413551,
0.07839228957891464,
0.02322009764611721,
0.04763331636786461,
0.00515914149582386,
0.13430887460708618,
-0.22686807811260223,
0.005511677823960781,
0.06372273713350296,
0.027346044778823853,
0.052139028906822205,
0.09992887824773788,
0.04948849603533745,
0.04942924529314041,
-0.15587644279003143,
0.09622063487768173,
0.024358581751585007,
-0.06644106656312943,
-0.16736528277397156,
-0.05875032767653465,
0.019031833857297897,
0.0827205702662468,
0.07837161421775818,
-0.036223337054252625,
0.1107601746916771,
-0.10857882350683212,
0.04104926809668541,
0.18830308318138123,
-0.2660963833332062,
-0.06769238412380219,
0.046743884682655334,
0.08142019063234329,
0.04847972095012665,
-0.1460689753293991,
-0.0029797942843288183,
0.025447871536016464,
-0.0016620270907878876,
0.06677253544330597,
0.005601358599960804,
0.01419992744922638,
-0.01991395838558674,
-0.11732024699449539,
-0.03402768820524216,
0.12160161137580872,
0.06632289290428162,
-0.02975171059370041,
-0.16630302369594574,
-0.0072689177468419075,
-0.07022736221551895,
-0.05768612399697304,
-0.0024339775554835796,
0.029339751228690147,
-0.0674360916018486,
-0.07638482749462128,
0.012370030395686626,
-0.051165588200092316,
-0.0335208959877491,
0.06920569390058517,
0.05393613502383232,
0.06290491670370102,
0.0075658224523067474,
-0.05604073777794838,
0.08642652630805969,
0.02016034722328186,
-0.16181498765945435,
-0.01641407422721386,
-0.005174245219677687,
-0.1192048192024231,
-0.047426845878362656,
-0.014706766232848167,
0.028260422870516777,
0.005105588585138321,
0.1671856790781021,
0.02238008938729763,
0.06578485667705536,
0.022924652323126793,
0.006763183977454901,
0.01907900534570217,
0.1542501449584961,
-0.06719522178173065,
-0.11748199909925461,
-0.010477826930582523,
0.13087192177772522,
0.00463272538036108,
-0.01282393280416727,
-0.08121660351753235,
-0.006594995968043804,
0.04842374846339226,
0.07266978919506073,
0.015107565559446812,
0.014343790709972382,
-0.06492036581039429,
-0.05771356076002121,
0.01705951802432537,
-0.14034810662269592,
0.0639282837510109,
0.029140086844563484,
-0.0769568532705307,
0.01917164772748947,
-0.05123947188258171,
-0.009702636860311031,
-0.054425496608018875,
0.09794634580612183,
-0.056930214166641235,
-0.00599243026226759,
-0.06422114372253418,
-0.05861382558941841,
0.04903591796755791,
-0.04106185957789421,
-0.027706008404493332,
-0.03819214180111885,
-0.0929795503616333,
-0.03320441395044327,
0.03451315686106682,
-0.09228498488664627,
-0.06289688497781754,
-0.04084618389606476,
0.001409038552083075,
0.030330821871757507,
-0.0017856761114671826,
0.08642205595970154,
-0.035628534853458405,
0.038634542375802994,
0.00035588411265052855,
0.058061935007572174,
0.11536022275686264,
0.053273238241672516,
-0.0593169741332531,
0.0335814505815506,
-0.14367666840553284,
0.08693251013755798,
-0.08572547137737274,
0.016880253329873085,
-0.12095635384321213,
-0.11460445076227188,
0.025195354595780373,
-0.024408243596553802,
0.022083468735218048,
0.130250945687294,
-0.18071559071540833,
-0.05777733027935028,
0.1577984094619751,
-0.03717612475156784,
-0.10177483409643173,
0.11302217096090317,
-0.03700687736272812,
-0.040068842470645905,
0.01656181551516056,
0.14234766364097595,
0.1411619335412979,
-0.14372125267982483,
-0.02228730358183384,
0.03340410441160202,
0.08915548026561737,
0.09262117743492126,
0.11461132019758224,
0.0029327524825930595,
0.07979059964418411,
0.003884068690240383,
-0.09632263332605362,
-0.014826485887169838,
-0.06089067459106445,
-0.09877907484769821,
0.0007274751551449299,
-0.04631159082055092,
0.06452076882123947,
0.009896290488541126,
0.0317503921687603,
-0.05831924453377724,
-0.12106416374444962,
0.0033843449782580137,
0.12586069107055664,
-0.04317522421479225,
0.029273977503180504,
-0.08369001001119614,
0.05912293493747711,
-0.00252407300285995,
-0.0056987968273460865,
-0.15036094188690186,
-0.12900012731552124,
0.06846719235181808,
-0.0999494269490242,
0.027454856783151627,
0.0369584783911705,
0.028296908363699913,
0.08340302109718323,
-0.04894252493977547,
-0.012614810839295387,
-0.06566288322210312,
-0.034228842705488205,
-0.023242469877004623,
-0.1905035376548767,
-0.09218141436576843,
-0.04111740365624428,
0.15332886576652527,
-0.17423571646213531,
-0.010998823679983616,
-0.022220894694328308,
0.1321827620267868,
0.003579025389626622,
-0.07955757528543472,
0.024946393445134163,
0.011464463546872139,
0.014513634145259857,
-0.09671011567115784,
0.03921676427125931,
-0.011823410168290138,
-0.0801980197429657,
-0.051678869873285294,
-0.10764432698488235,
0.03359627723693848,
0.06057542935013771,
0.03301437199115753,
-0.06957875937223434,
-0.01907876692712307,
-0.06466101109981537,
-0.023854706436395645,
-0.081453338265419,
-0.010404831729829311,
0.18019452691078186,
0.03195704519748688,
0.10064924508333206,
-0.06846900284290314,
-0.07778019458055496,
0.0017782392678782344,
0.020980937406420708,
-0.03467043489217758,
0.07497630268335342,
0.07909858226776123,
-0.09312313050031662,
0.07072725147008896,
0.11853611469268799,
0.017538556829094887,
0.1481761634349823,
-0.05042946711182594,
-0.0970623791217804,
-0.04114992171525955,
0.0244572926312685,
0.01181484293192625,
0.08593985438346863,
-0.11856620013713837,
-0.016054553911089897,
0.05147599056363106,
-0.014417160302400589,
-0.005526776425540447,
-0.1254110038280487,
-0.03844152390956879,
0.034403640776872635,
-0.03280751034617424,
-0.005838864482939243,
-0.021905839443206787,
0.02899906225502491,
0.09841875731945038,
0.055360760539770126,
-0.021750299260020256,
-0.003094686195254326,
-0.05889885872602463,
-0.08864758908748627,
0.1500140279531479,
-0.0884408950805664,
-0.1972360610961914,
-0.11073785275220871,
-0.00953826867043972,
-0.03990752249956131,
-0.03230133280158043,
-0.0053982860408723354,
-0.09453605115413666,
-0.052626047283411026,
-0.06754464656114578,
0.007768665440380573,
-0.07169105857610703,
-0.009528426453471184,
0.06564071774482727,
0.04239460825920105,
0.07436216622591019,
-0.10857505351305008,
0.020800229161977768,
0.015199806541204453,
-0.11537979543209076,
-0.0019447017693892121,
0.059929680079221725,
0.08204132318496704,
0.10924242436885834,
-0.007810413837432861,
0.022882046177983284,
-0.051298193633556366,
0.21415147185325623,
-0.12476015090942383,
0.00575281260535121,
0.11462952196598053,
-0.005115531384944916,
0.060221098363399506,
0.1659761667251587,
0.037858836352825165,
-0.06860232353210449,
0.013314088806509972,
0.034101702272892,
-0.009157244116067886,
-0.22613778710365295,
-0.016060413792729378,
-0.05556530877947807,
-0.02089902199804783,
0.144290953874588,
0.03663480281829834,
-0.03495113179087639,
0.03215593844652176,
-0.057483021169900894,
0.011481030844151974,
0.03327345848083496,
0.09101438522338867,
0.05283825471997261,
0.057119980454444885,
0.08460520952939987,
-0.008466878905892372,
-0.030774880200624466,
0.0600019171833992,
0.021095743402838707,
0.22178693115711212,
-0.007913081906735897,
0.14512713253498077,
0.01236689742654562,
0.13463497161865234,
-0.0097757987678051,
0.003919031471014023,
0.031729716807603836,
-0.006921761203557253,
0.006925332359969616,
-0.06618395447731018,
0.006577720865607262,
0.041203469038009644,
0.07106710225343704,
-0.01473117247223854,
-0.05918818712234497,
0.05374300107359886,
0.044430267065763474,
0.19768689572811127,
0.0969097837805748,
-0.23065268993377686,
-0.027013368904590607,
0.028992602601647377,
-0.00876013282686472,
-0.026744084432721138,
0.016856472939252853,
0.10704521834850311,
-0.14749440550804138,
0.1066231057047844,
-0.05036154389381409,
0.06987296044826508,
-0.054888296872377396,
0.0022697646636515856,
0.09064289182424545,
0.07990846782922745,
0.025592418387532234,
0.10802880674600601,
-0.15172821283340454,
0.14209821820259094,
-0.01083935797214508,
0.07246030122041702,
-0.06587176024913788,
0.03860959783196449,
-0.0029096913058310747,
-0.012871747836470604,
0.17433978617191315,
0.009818831458687782,
-0.03296173736453056,
-0.132075697183609,
-0.0883246660232544,
0.031052909791469574,
0.14159564673900604,
-0.08755972981452942,
0.09059568494558334,
-0.04640757665038109,
-0.033617183566093445,
0.002032242715358734,
-0.008417431265115738,
-0.08695175498723984,
-0.15305368602275848,
0.05479675531387329,
-0.060606058686971664,
-0.0645379051566124,
-0.0813981145620346,
-0.08778108656406403,
-0.09118600934743881,
0.2112272083759308,
-0.044271618127822876,
-0.027342917397618294,
-0.1237989142537117,
0.10399407893419266,
0.12514686584472656,
-0.06795641779899597,
0.039259083569049835,
0.011046729050576687,
0.17715394496917725,
-0.010713011026382446,
-0.09023874998092651,
0.061178360134363174,
-0.055198363959789276,
-0.1477183699607849,
-0.07130204141139984,
0.17224016785621643,
-0.013490518555045128,
0.0743289366364479,
-0.006905889604240656,
0.0132325803861022,
0.03773504123091698,
-0.07915618270635605,
0.024467037990689278,
0.061037827283144,
0.05836199223995209,
0.021106339991092682,
-0.07523330301046371,
0.011262253858149052,
-0.06502175331115723,
0.003640288021415472,
0.16060279309749603,
0.2541928291320801,
-0.10129982978105545,
0.06375817209482193,
0.039683811366558075,
-0.05759721249341965,
-0.1916740983724594,
0.013537026010453701,
0.12501265108585358,
0.001366152660921216,
0.058920811861753464,
-0.189969003200531,
0.09934782236814499,
0.0762631744146347,
-0.007556010503321886,
0.03405117616057396,
-0.3330177366733551,
-0.13628192245960236,
0.038775838911533356,
0.06197056174278259,
-0.027936803176999092,
-0.11611612141132355,
-0.03881099075078964,
-0.02086501754820347,
-0.09977513551712036,
0.046246208250522614,
-0.020583385601639748,
0.10551868379116058,
-0.0032243053428828716,
0.03321443125605583,
0.015390630811452866,
-0.04214368015527725,
0.13318918645381927,
0.04760583117604256,
0.04550126940011978,
-0.04385244846343994,
-0.04150007665157318,
0.08217412978410721,
-0.0676693469285965,
0.05357846990227699,
-0.06279902160167694,
0.03134967014193535,
-0.1473664790391922,
-0.02237726002931595,
-0.05316495895385742,
0.017785746604204178,
-0.06458667665719986,
-0.04791806638240814,
-0.0637512058019638,
0.0854300856590271,
0.07347816973924637,
-0.01556089986115694,
0.059175968170166016,
0.015431475825607777,
0.046380627900362015,
0.05784836784005165,
0.08180978149175644,
0.0628650039434433,
-0.1870768517255783,
-0.019524993374943733,
0.005818346980959177,
0.0541551411151886,
-0.07686468213796616,
0.024864353239536285,
0.1357387751340866,
0.04806593433022499,
0.15558698773384094,
0.02148795686662197,
-0.048472803086042404,
-0.013047240674495697,
0.07204359024763107,
-0.047322310507297516,
-0.2075010985136032,
-0.01682763174176216,
0.02615147829055786,
-0.18367670476436615,
-0.05837390571832657,
0.060070887207984924,
-0.02973993681371212,
-0.024541474878787994,
-0.008175062946975231,
0.034823328256607056,
-0.003859816351905465,
0.15379349887371063,
0.020790059119462967,
0.09191884100437164,
-0.09779105335474014,
0.14017802476882935,
0.08685381710529327,
-0.05643424019217491,
0.02855171635746956,
0.11302134394645691,
-0.0677255243062973,
-0.015686867758631706,
0.05330632999539375,
0.10336349904537201,
0.04783083498477936,
0.008507534861564636,
-0.04033425450325012,
-0.09940235316753387,
0.06503774970769882,
-0.02211572602391243,
0.01797737553715706,
-0.017332056537270546,
-0.03595184162259102,
0.013506049290299416,
-0.09179971367120743,
0.11175358295440674,
0.06267218291759491,
0.05469999834895134,
-0.10805616527795792,
0.06693855673074722,
0.004955576732754707,
0.0382210835814476,
-0.01586986519396305,
0.0002219029702246189,
-0.06309824436903,
-0.01704753190279007,
-0.08935340493917465,
0.004410652909427881,
-0.045117538422346115,
0.011244152672588825,
-0.03624175488948822,
-0.014041104353964329,
-0.023355336859822273,
0.016754738986492157,
-0.06455939263105392,
-0.0919843539595604,
-0.03869384154677391,
0.0575173981487751,
-0.14862552285194397,
-0.00554739311337471,
0.03579488396644592,
-0.12881700694561005,
0.08385974913835526,
0.03116733953356743,
0.034874025732278824,
-0.0030688587576150894,
-0.06325077265501022,
-0.09720796346664429,
0.00564703019335866,
0.0513603538274765,
0.05109873041510582,
-0.09614499658346176,
-0.0006330800242722034,
-0.0398075208067894,
-0.013707924634218216,
0.007684431504458189,
-0.03429262712597847,
-0.13466815650463104,
-0.00022845204512123019,
-0.049246225506067276,
-0.06033135578036308,
-0.04200391843914986,
0.0549749881029129,
0.056071121245622635,
-0.005014065187424421,
0.11970467120409012,
-0.07396265864372253,
0.08647620677947998,
-0.2442973405122757,
-0.042253606021404266,
0.004160999786108732,
-0.005123922135680914,
-0.034122250974178314,
-0.03631645813584328,
0.08328135311603546,
-0.03821215033531189,
0.09463755786418915,
-0.008771906606853008,
0.050949033349752426,
0.041193824261426926,
-0.0040166666731238365,
0.021879052743315697,
0.029250968247652054,
0.13567940890789032,
0.08569253981113434,
-0.0368536151945591,
0.04298676922917366,
-0.04706713929772377,
0.048354893922805786,
0.07353663444519043,
0.12940585613250732,
0.18876494467258453,
0.030108654871582985,
0.006009330507367849,
0.08064772188663483,
-0.11142434924840927,
-0.1561744213104248,
0.16707833111286163,
-0.0463106669485569,
0.12236650288105011,
-0.04439828172326088,
0.1110733225941658,
0.1146375983953476,
-0.18594007194042206,
0.06813962012529373,
-0.05605930835008621,
-0.10872194916009903,
-0.08758725225925446,
-0.11861494183540344,
-0.08524716645479202,
-0.10627494007349014,
0.058500245213508606,
-0.1266147941350937,
0.04672420024871826,
0.06002449989318848,
0.06521660834550858,
-0.006131973583251238,
0.16692538559436798,
-0.029740752652287483,
-0.018666794523596764,
0.12405598908662796,
0.04874206334352493,
0.013167569413781166,
0.012296263128519058,
-0.01222930382937193,
0.050675492733716965,
0.00017113539797719568,
0.10245709866285324,
-0.02493450790643692,
-0.0048585254698991776,
0.05774974077939987,
-0.004181650001555681,
-0.09949838370084763,
0.02794811874628067,
-0.02309819869697094,
0.007776795420795679,
0.08916185051202774,
0.035125065594911575,
0.03392064571380615,
-0.04678570106625557,
0.19985376298427582,
-0.06357305496931076,
-0.07572630047798157,
-0.1276557296514511,
0.11209604144096375,
0.009531588293612003,
0.01265567447990179,
0.045437924563884735,
-0.0967402383685112,
-0.03846835345029831,
0.1302981972694397,
0.20613537728786469,
-0.07780821621417999,
-0.035409025847911835,
0.019491320475935936,
-0.0008749224944040179,
-0.048046424984931946,
0.09580978006124496,
0.0762370303273201,
0.03838612884283066,
-0.056759972125291824,
0.02026292122900486,
0.014200231991708279,
-0.051547374576330185,
-0.044218309223651886,
0.05165541172027588,
0.04334118217229843,
0.0412549190223217,
-0.06780564039945602,
0.07528609037399292,
0.0054901037365198135,
-0.17371828854084015,
0.05072355270385742,
-0.1645526885986328,
-0.18589572608470917,
-0.021497339010238647,
0.07607491314411163,
0.014658206142485142,
0.07068821042776108,
0.01369224488735199,
-0.019003579393029213,
0.10209821164608002,
0.005579956341534853,
-0.047608207911252975,
-0.09809775650501251,
0.11112572997808456,
-0.08441796898841858,
0.2020510882139206,
-0.008851551450788975,
0.08033689856529236,
0.07695753127336502,
0.014291767962276936,
-0.12105771899223328,
-0.0005435370840132236,
0.09607312828302383,
-0.05106939375400543,
0.004993849899619818,
0.1983303427696228,
-0.0538848415017128,
0.1115822046995163,
0.07550578564405441,
-0.04980119690299034,
-0.012695173732936382,
-0.06324594467878342,
-0.005430223420262337,
-0.10446643829345703,
0.03127168118953705,
-0.052164189517498016,
0.160514235496521,
0.21621626615524292,
-0.05283751338720322,
-0.014816281385719776,
-0.05286366865038872,
0.003100936533883214,
0.06522101163864136,
0.12731410562992096,
-0.011303952895104885,
-0.17456719279289246,
0.01230405829846859,
-0.034425683319568634,
0.07102981209754944,
-0.19077058136463165,
-0.08952691406011581,
0.0597543865442276,
-0.025359295308589935,
-0.054733503609895706,
0.12664230167865753,
0.03372721001505852,
0.027097506448626518,
-0.04829021915793419,
-0.18473917245864868,
-0.010344121605157852,
0.11068013310432434,
-0.15248969197273254,
-0.024127699434757233
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat
This model is a fine-tuned version of [deepset/bert-large-uncased-whole-word-masking-squad2](https://huggingface.co/deepset/bert-large-uncased-whole-word-masking-squad2) on the squad_v2 and the mit_restaurant datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
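For reference, the hyperparameters listed above map roughly onto the `transformers` Trainer API as sketched below; the output directory and everything omitted (dataset loading, tokenization, the `Trainer` call itself) are assumptions rather than the authors' actual training script:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./bert-large-squad2-ner-mit-restaurant",  # assumed path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # effective train batch size: 1 x 16 = 16
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

The gradient accumulation of 16 over a per-device batch of 1 reproduces the total train batch size of 16 reported above.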
|
{"language": ["en"], "license": "cc-by-4.0", "tags": ["generated_from_trainer"], "datasets": ["squad_v2", "mit_restaurant"], "model_index": [{"name": "bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "squad_v2", "type": "squad_v2"}}, {"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "mit_restaurant", "type": "mit_restaurant"}}]}]}
|
question-answering
|
andi611/bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"en",
"dataset:squad_v2",
"dataset:mit_restaurant",
"license:cc-by-4.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_restaurant #license-cc-by-4.0 #endpoints_compatible #region-us
|
# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat
This model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_restaurant datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_restaurant datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_restaurant #license-cc-by-4.0 #endpoints_compatible #region-us \n",
"# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_restaurant datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
63,
89,
6,
12,
8,
3,
113,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_restaurant #license-cc-by-4.0 #endpoints_compatible #region-us \n# bert-large-uncased-whole-word-masking-squad2-with-ner-mit-restaurant-with-neg-with-repeat\n\nThis model is a fine-tuned version of deepset/bert-large-uncased-whole-word-masking-squad2 on the squad_v2 and the mit_restaurant datasets.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- gradient_accumulation_steps: 16\n- total_train_batch_size: 16\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08050256967544556,
0.16289767622947693,
-0.003942172508686781,
0.05541098490357399,
0.09142646193504333,
0.04232155904173851,
0.07957147061824799,
0.16862936317920685,
-0.04881356656551361,
0.1271510124206543,
0.03958143666386604,
0.03481750190258026,
0.08692291378974915,
0.11974336951971054,
0.007793438620865345,
-0.2043645828962326,
0.005429116543382406,
-0.019719071686267853,
-0.08922240883111954,
0.10806586593389511,
0.10327175259590149,
-0.10363250970840454,
0.04086330160498619,
-0.004596997983753681,
-0.09293261170387268,
0.01562024001032114,
-0.05736176297068596,
-0.04137427359819412,
0.06985872983932495,
0.012030720710754395,
0.0712529867887497,
0.008816265501081944,
0.10234794020652771,
-0.23628449440002441,
0.0016225834842771292,
0.07492198050022125,
0.018432093784213066,
0.07168274372816086,
0.0809795930981636,
0.03692595660686493,
0.022096771746873856,
-0.16956458985805511,
0.0914895161986351,
0.04558545723557472,
-0.07815449684858322,
-0.1697007268667221,
-0.08884056657552719,
0.06232493743300438,
0.09662236273288727,
0.0661369040608406,
-0.02572675608098507,
0.0982043668627739,
-0.12720079720020294,
0.03565595671534538,
0.1583985835313797,
-0.2917262017726898,
-0.06158291548490524,
0.028257522732019424,
0.019106201827526093,
0.04001353681087494,
-0.10545282065868378,
-0.008118248544633389,
0.03504014015197754,
0.008784507401287556,
0.040934838354587555,
0.014188447967171669,
0.028694773092865944,
0.0071032606065273285,
-0.10291095823049545,
-0.04686669632792473,
0.12469638884067535,
0.05539379641413689,
-0.019737547263503075,
-0.1663554310798645,
-0.02915969304740429,
-0.06893114745616913,
-0.04467850923538208,
-0.04701623693108559,
0.030779080465435982,
-0.049201276153326035,
-0.062046222388744354,
-0.003452684497460723,
-0.05583206191658974,
-0.038170453161001205,
0.03325209766626358,
0.06371406465768814,
0.06636010855436325,
-0.01897505857050419,
-0.020290808752179146,
0.08042819052934647,
0.016193509101867676,
-0.1499655693769455,
-0.02504107728600502,
0.00022774811077397317,
-0.1511881798505783,
-0.05022570118308067,
-0.004746627062559128,
0.021507002413272858,
0.04388480260968208,
0.15431992709636688,
0.010796592570841312,
0.06098512187600136,
0.03247452527284622,
-0.031211938709020615,
0.008655340410768986,
0.14086265861988068,
-0.03911859542131424,
-0.11361617594957352,
-0.051112476736307144,
0.0991881713271141,
-0.007800735533237457,
-0.0049325586296617985,
-0.06026449799537659,
-0.01822088658809662,
0.047668639570474625,
0.06164036691188812,
0.02168254926800728,
0.010310904122889042,
-0.05199284106492996,
-0.05662841722369194,
0.035940054804086685,
-0.125856414437294,
0.06293601542711258,
0.04584796354174614,
-0.06647077947854996,
-0.006199055816978216,
-0.0629984438419342,
0.003000350669026375,
-0.05918032303452492,
0.089051753282547,
-0.05906403809785843,
-0.03820629417896271,
-0.05353603884577751,
-0.03876861184835434,
0.03427709639072418,
-0.043746013194322586,
-0.038097307085990906,
-0.039631523191928864,
-0.11990489810705185,
-0.05197243019938469,
0.035119976848363876,
-0.09469092637300491,
-0.07415558397769928,
-0.042277321219444275,
-0.0013952956069260836,
0.018478915095329285,
-0.00029872593586333096,
0.09860854595899582,
-0.026694869622588158,
0.05447082221508026,
-0.010113181546330452,
0.031882159411907196,
0.12374093383550644,
0.06378879398107529,
-0.07471689581871033,
0.04145285487174988,
-0.12239419668912888,
0.09511838853359222,
-0.08874483406543732,
0.023166067898273468,
-0.16220895946025848,
-0.09778250008821487,
-0.003698105691000819,
-0.03440335765480995,
0.04177567735314369,
0.13548213243484497,
-0.1722857803106308,
-0.04368970915675163,
0.16012708842754364,
-0.03992261737585068,
-0.09657657891511917,
0.12214592844247818,
-0.02734353207051754,
-0.011319411918520927,
0.06444182246923447,
0.14848114550113678,
0.1584503948688507,
-0.09923896193504333,
-0.03227052465081215,
0.013865587301552296,
0.09625475108623505,
0.07498276233673096,
0.08989942073822021,
-0.032541703432798386,
0.04024920612573624,
0.013587698340415955,
-0.06562758982181549,
-0.02184627577662468,
-0.04742958024144173,
-0.10254113376140594,
-0.014733023010194302,
-0.052336957305669785,
0.09507544338703156,
0.007489514071494341,
0.029379654675722122,
-0.054828424006700516,
-0.11522486805915833,
0.012783330865204334,
0.11737149953842163,
-0.0439932718873024,
0.0024852317292243242,
-0.06635767966508865,
0.0719204992055893,
0.0032098533120006323,
-0.0077646947465837,
-0.14812928438186646,
-0.10965637117624283,
0.07213850319385529,
-0.1202491968870163,
0.03596797585487366,
0.055284228175878525,
0.03830580785870552,
0.03891121223568916,
-0.024992292746901512,
-0.021731026470661163,
-0.05117867887020111,
-0.020218681544065475,
-0.06261849403381348,
-0.16205430030822754,
-0.06003805622458458,
-0.025915706530213356,
0.15698663890361786,
-0.18975551426410675,
-0.003154468024149537,
-0.005324546713382006,
0.12027604877948761,
0.0020922243129462004,
-0.05062055587768555,
0.00246487557888031,
0.01781962625682354,
0.017313441261649132,
-0.06716067343950272,
0.03132602944970131,
-0.010664176195859909,
-0.10099268704652786,
-0.05153129622340202,
-0.08765576779842377,
0.01497652381658554,
0.05340242758393288,
0.08995275944471359,
-0.07318846881389618,
-0.0695258229970932,
-0.0564596988260746,
-0.03199565038084984,
-0.08162396401166916,
-0.001143930945545435,
0.20446451008319855,
0.027602652087807655,
0.08791130781173706,
-0.06568128615617752,
-0.08601538091897964,
-0.0030643062200397253,
0.04240530729293823,
-0.02757946029305458,
0.1042339876294136,
0.05538833886384964,
-0.12571971118450165,
0.07876896858215332,
0.13567779958248138,
0.041543636471033096,
0.10246729850769043,
-0.05446937680244446,
-0.0856378823518753,
-0.04996791109442711,
0.05483067035675049,
-0.01437793392688036,
0.08258076757192612,
-0.13077083230018616,
0.011310778558254242,
0.048192914575338364,
0.009364409372210503,
0.017340615391731262,
-0.1215878501534462,
-0.00742372265085578,
0.04231946915388107,
-0.050463635474443436,
0.005428905598819256,
-0.049501579254865646,
0.041119158267974854,
0.08072004467248917,
0.05453675612807274,
-0.014446462504565716,
-0.006127756554633379,
-0.0405460000038147,
-0.07688509672880173,
0.15686386823654175,
-0.10587000846862793,
-0.20029474794864655,
-0.09263236075639725,
-0.009553439915180206,
-0.04750348627567291,
-0.036652617156505585,
0.01218942366540432,
-0.11422424763441086,
-0.061459001153707504,
-0.0614832304418087,
0.012825872749090195,
-0.03896264731884003,
-0.005406433250755072,
0.0540158711373806,
0.04117182269692421,
0.05548786371946335,
-0.12571516633033752,
0.026105405762791634,
-0.012317820452153683,
-0.08831651508808136,
-0.02105671353638172,
0.04198039695620537,
0.07999369502067566,
0.11727643758058548,
0.013335258699953556,
0.008914275094866753,
-0.049291208386421204,
0.1954936534166336,
-0.114176444709301,
0.003182688495144248,
0.08882486075162888,
0.029266484081745148,
0.06137882545590401,
0.17077644169330597,
0.041964273899793625,
-0.06802676618099213,
0.00861305370926857,
0.046221327036619186,
0.010084380395710468,
-0.24134278297424316,
-0.05353354662656784,
-0.0528191477060318,
-0.04130074381828308,
0.14958485960960388,
0.0573778934776783,
-0.027217112481594086,
0.035413991659879684,
-0.05815671384334564,
0.02511250041425228,
0.012094898149371147,
0.06624133139848709,
0.09526412934064865,
0.03686531260609627,
0.07741988450288773,
-0.014230960048735142,
-0.036778468638658524,
0.0733788013458252,
0.04602592810988426,
0.2306368201971054,
-0.007088099606335163,
0.1821717619895935,
0.03324633464217186,
0.13459521532058716,
-0.024875890463590622,
0.00937596894800663,
0.006449626758694649,
-0.001196858356706798,
0.0020453883334994316,
-0.07670649886131287,
-0.004221306648105383,
0.03324897959828377,
0.0827925056219101,
0.0016909680562093854,
-0.04567157104611397,
0.020038291811943054,
0.050390519201755524,
0.20655612647533417,
0.09457504004240036,
-0.20780456066131592,
-0.03478134423494339,
0.034483589231967926,
-0.0402621366083622,
-0.04398879036307335,
0.023959476500749588,
0.09047875553369522,
-0.12453079223632812,
0.06689939647912979,
-0.048158492892980576,
0.08999297767877579,
-0.04786711558699608,
0.007951599545776844,
0.09559986740350723,
0.08323296904563904,
0.02189631760120392,
0.11083047091960907,
-0.16333073377609253,
0.17862267792224884,
0.0069584972225129604,
0.06480467319488525,
-0.09526573866605759,
0.05523013323545456,
-0.030411023646593094,
-0.008880378678441048,
0.14149191975593567,
0.005866951774805784,
-0.05498433858156204,
-0.130057230591774,
-0.11377041786909103,
0.03738345205783844,
0.11536133289337158,
-0.07573816180229187,
0.07459045201539993,
-0.04121293127536774,
-0.022433176636695862,
0.013473921455442905,
-0.009844515472650528,
-0.11106988787651062,
-0.16919346153736115,
0.041645266115665436,
-0.06439927220344543,
-0.06148838251829147,
-0.0708107054233551,
-0.07416905462741852,
-0.0863867923617363,
0.20602066814899445,
-0.042597681283950806,
-0.03354978561401367,
-0.13693992793560028,
0.0806439071893692,
0.13312797248363495,
-0.07660616934299469,
0.02371468022465706,
0.010665185749530792,
0.1366541087627411,
0.005102297756820917,
-0.06744315475225449,
0.053452182561159134,
-0.043929729610681534,
-0.14660364389419556,
-0.06374439597129822,
0.15790624916553497,
0.02660500817000866,
0.07128864526748657,
0.02402978017926216,
0.020298972725868225,
0.01927533559501171,
-0.0911412462592125,
-0.0031685992144048214,
0.05930083990097046,
0.06921619921922684,
0.05936048552393913,
-0.06577568501234055,
-0.010403590276837349,
-0.06340344995260239,
0.017085619270801544,
0.13858580589294434,
0.23268458247184753,
-0.09234069287776947,
0.1374373435974121,
0.053920574486255646,
-0.058677010238170624,
-0.15823964774608612,
-0.0035350648686289787,
0.09328862279653549,
0.02133043110370636,
0.03453277796506882,
-0.19065359234809875,
0.12399604171514511,
0.09474862366914749,
-0.016040002927184105,
0.027913497760891914,
-0.3392232656478882,
-0.12091618776321411,
0.048905663192272186,
0.05379810929298401,
-0.04041927307844162,
-0.11316943168640137,
-0.07188288867473602,
-0.025302525609731674,
-0.10727930814027786,
0.05362631380558014,
-0.02671121247112751,
0.10111822932958603,
-0.0023485722485929728,
0.03505808487534523,
0.025749191641807556,
-0.03432426601648331,
0.1428874433040619,
0.06453786045312881,
0.03730503469705582,
-0.04126447066664696,
0.0013914582086727023,
0.06430128216743469,
-0.049772925674915314,
0.03349537402391434,
-0.0647430270910263,
0.04567659646272659,
-0.1778302639722824,
-0.0296216681599617,
-0.05194753035902977,
0.026166316121816635,
-0.08787813782691956,
-0.05112272873520851,
-0.06494290381669998,
0.08641496300697327,
0.11302898079156876,
-0.01926993764936924,
0.03878328576683998,
0.03437104448676109,
0.07674937695264816,
0.04649290069937706,
0.08595620840787888,
0.04549987614154816,
-0.19013544917106628,
-0.02397647686302662,
0.0031798069830983877,
0.042856328189373016,
-0.09601894021034241,
0.04155821353197098,
0.12493277341127396,
0.056019436568021774,
0.13211175799369812,
0.011639469303190708,
-0.05931529775261879,
-0.03197688236832619,
0.04673893377184868,
-0.05925861746072769,
-0.2005452662706375,
0.0015286157140508294,
-0.009208211675286293,
-0.18598733842372894,
-0.03811786323785782,
0.08867521584033966,
-0.0019059183541685343,
-0.034495916217565536,
-0.00023292026889976114,
0.04082107916474342,
0.001998740015551448,
0.18111594021320343,
0.024361616000533104,
0.08497918397188187,
-0.08710895478725433,
0.09135232865810394,
0.11855261027812958,
-0.06502419710159302,
0.026210300624370575,
0.08289119601249695,
-0.06267305463552475,
-0.01971537619829178,
0.07587824761867523,
0.10701464116573334,
0.057457372546195984,
0.0008335782331414521,
-0.05514105409383774,
-0.10037943720817566,
0.08312950283288956,
-0.01744256727397442,
0.04570453613996506,
-0.007086384575814009,
-0.03145299851894379,
-0.003049330785870552,
-0.07480650395154953,
0.1134638786315918,
0.05521713197231293,
0.05107880383729935,
-0.09032125771045685,
0.05130579322576523,
-0.009394858963787556,
0.03542570397257805,
-0.01074265781790018,
0.0036303065717220306,
-0.09347054362297058,
-0.025522718206048012,
-0.1159859299659729,
0.01687740348279476,
-0.0477832667529583,
0.01454770565032959,
-0.030192242935299873,
-0.017329588532447815,
-0.03124331869184971,
0.007667661644518375,
-0.06371349096298218,
-0.06927451491355896,
-0.019743913784623146,
0.08663970977067947,
-0.17163929343223572,
0.014908190816640854,
0.03738106042146683,
-0.11444550007581711,
0.0928693488240242,
0.0025027648080140352,
0.025087809190154076,
-0.01306566596031189,
-0.06801047921180725,
-0.06952573359012604,
-0.011204221285879612,
0.06025504320859909,
0.0707300677895546,
-0.14864854514598846,
-0.0008995431126095355,
-0.0329466238617897,
-0.0041221557185053825,
0.018941769376397133,
-0.014931906946003437,
-0.12738382816314697,
-0.0028736209496855736,
-0.07359902560710907,
-0.06346782296895981,
-0.04854623228311539,
0.0420699417591095,
0.06444978713989258,
-0.017409009858965874,
0.1347673088312149,
-0.0572812557220459,
0.08292839676141739,
-0.222071573138237,
-0.043089210987091064,
0.006815837696194649,
-0.00028187409043312073,
-0.03063255175948143,
-0.03232697397470474,
0.07590870559215546,
-0.016986899077892303,
0.10633917897939682,
-0.0042627062648534775,
0.054111555218696594,
0.04601685330271721,
-0.019920064136385918,
0.0108706159517169,
0.02097110077738762,
0.16496536135673523,
0.07204197347164154,
-0.02249639481306076,
0.07216352224349976,
-0.04032472148537636,
0.021180108189582825,
0.06479208171367645,
0.14572221040725708,
0.21340054273605347,
0.04661224037408829,
0.0024284289684146643,
0.08837129175662994,
-0.11421855539083481,
-0.17550091445446014,
0.15698547661304474,
-0.0668809562921524,
0.1204972118139267,
-0.040589120239019394,
0.10767992585897446,
0.07277035713195801,
-0.21818678081035614,
0.061489004641771317,
-0.06103358417749405,
-0.11254113912582397,
-0.0901891365647316,
-0.13807938992977142,
-0.0908263772726059,
-0.1121920794248581,
0.03894958645105362,
-0.1114019975066185,
0.02066759578883648,
0.059720415621995926,
0.039562925696372986,
0.018844813108444214,
0.12354866415262222,
-0.035930320620536804,
0.014782595448195934,
0.09393763542175293,
0.03917974978685379,
0.024964315816760063,
-0.025980757549405098,
-0.022666068747639656,
0.03035673312842846,
0.0307158175855875,
0.09384015947580338,
-0.032214075326919556,
0.005073057021945715,
0.03811532258987427,
-0.0030778225045651197,
-0.07443469017744064,
0.014550200663506985,
-0.02783726342022419,
0.027931857854127884,
0.08531375974416733,
0.05449649319052696,
0.015535365790128708,
-0.04967309907078743,
0.22643928229808807,
-0.046826109290122986,
-0.08816222846508026,
-0.14398348331451416,
0.12705132365226746,
0.023909350857138634,
0.003721268381923437,
0.07551722228527069,
-0.09767995029687881,
-0.02160334587097168,
0.13911356031894684,
0.19672994315624237,
-0.05425279587507248,
-0.021781431511044502,
0.019546417519450188,
-0.0002798762288875878,
-0.026365019381046295,
0.09641854465007782,
0.08349969238042831,
0.04749675095081329,
-0.05175478383898735,
0.0021425297018140554,
0.024144724011421204,
-0.06836839020252228,
-0.08392375707626343,
0.07891219109296799,
0.04360084980726242,
0.017862221226096153,
-0.03983605280518532,
0.11466674506664276,
0.020579447969794273,
-0.17561057209968567,
0.03797674551606178,
-0.1540263593196869,
-0.2079005092382431,
-0.030001426115632057,
0.10509176552295685,
0.00861271470785141,
0.06790386140346527,
0.01192160788923502,
-0.019155263900756836,
0.12222834676504135,
-0.0019408856751397252,
-0.06573671102523804,
-0.07249688357114792,
0.11222951114177704,
-0.08427814394235611,
0.23465022444725037,
0.018788957968354225,
0.04699983820319176,
0.10129010677337646,
0.010319402441382408,
-0.12887226045131683,
-0.014482271857559681,
0.1055699959397316,
-0.06111312657594681,
0.04049127176403999,
0.14821584522724152,
-0.03657873347401619,
0.08392085880041122,
0.08715108036994934,
-0.08791044354438782,
0.0030647865496575832,
-0.004974877927452326,
0.007471160497516394,
-0.11355917155742645,
0.0439179427921772,
-0.05243518576025963,
0.1617204248905182,
0.21808885037899017,
-0.047630079090595245,
-0.010239088907837868,
-0.03861911967396736,
0.01563073694705963,
0.02606814354658127,
0.12481999397277832,
-0.024051394313573837,
-0.17727969586849213,
0.004637506790459156,
-0.02364436164498329,
0.056700997054576874,
-0.19339995086193085,
-0.11321264505386353,
0.06433417648077011,
-0.029031073674559593,
-0.06236148625612259,
0.14189791679382324,
0.0472661629319191,
0.013139544986188412,
-0.04081234708428383,
-0.19963768124580383,
-0.048293884843587875,
0.1079871878027916,
-0.1411956548690796,
-0.03172323852777481
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-agnews
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the ag_news dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1652
- Accuracy: 0.9474
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 2.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1916 | 1.0 | 3375 | 0.1741 | 0.9412 |
| 0.123 | 2.0 | 6750 | 0.1631 | 0.9483 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
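As a minimal usage sketch (the example headline is made up, and the returned label strings depend on the `id2label` mapping saved with the checkpoint, so both are assumptions), the model stored under this row's repository id could be called as a standard text-classification pipeline:

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="andi611/distilbert-base-uncased-ner-agnews",
)

# AG News has four categories (World, Sports, Business, Sci/Tech); the exact
# label names printed here come from the model's saved config.
print(classifier("Stocks rally as the central bank signals a pause in rate hikes."))
```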
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["ag_news"], "metrics": ["accuracy"], "model_index": [{"name": "distilbert-base-uncased-agnews", "results": [{"dataset": {"name": "ag_news", "type": "ag_news", "args": "default"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.9473684210526315}}]}]}
|
text-classification
|
andi611/distilbert-base-uncased-ner-agnews
|
[
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:ag_news",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #distilbert #text-classification #generated_from_trainer #en #dataset-ag_news #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-agnews
==============================
This model is a fine-tuned version of distilbert-base-uncased on the ag\_news dataset.
It achieves the following results on the evaluation set:
* Loss: 0.1652
* Accuracy: 0.9474
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 8
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 2.0
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 2.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #en #dataset-ag_news #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 2.0",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
62,
116,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #en #dataset-ag_news #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 2.0### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.09304375946521759,
0.09620014578104019,
-0.0029356940649449825,
0.12128165364265442,
0.16041199862957,
0.024828525260090828,
0.10672727972269058,
0.13477472960948944,
-0.10227645188570023,
0.014363955706357956,
0.12663090229034424,
0.16197285056114197,
0.010059846565127373,
0.14191508293151855,
-0.053239937871694565,
-0.2935856580734253,
0.0018364095594733953,
0.014509083703160286,
-0.05081219971179962,
0.1412666141986847,
0.09285223484039307,
-0.12129712104797363,
0.08782388269901276,
-0.013141931965947151,
-0.15069027245044708,
0.01184981781989336,
-0.006819065660238266,
-0.060559626668691635,
0.14197072386741638,
0.025446515530347824,
0.0800529271364212,
0.017592983320355415,
0.10002561658620834,
-0.20026502013206482,
0.008161869831383228,
0.05081147328019142,
0.006146619096398354,
0.084788978099823,
0.053409673273563385,
-0.006344460882246494,
0.139922097325325,
-0.10482703894376755,
0.052695222198963165,
0.019936110824346542,
-0.1318974494934082,
-0.22737278044223785,
-0.0885954350233078,
0.04007218778133392,
0.08518759906291962,
0.11996716260910034,
-0.006261890754103661,
0.09914570301771164,
-0.10079102963209152,
0.10421401262283325,
0.24027077853679657,
-0.2784375846385956,
-0.05667869746685028,
0.01764542981982231,
0.0058356355875730515,
0.06773929297924042,
-0.1000734269618988,
-0.02571103535592556,
0.03349652141332626,
0.04620920121669769,
0.11432468146085739,
-0.02200865000486374,
-0.1143449991941452,
0.014763636514544487,
-0.13877107203006744,
-0.040977347642183304,
0.14063863456249237,
0.03230739384889603,
-0.02703862637281418,
-0.05701809376478195,
-0.060400430113077164,
-0.15662376582622528,
-0.04270324856042862,
0.009004195220768452,
0.04465939849615097,
-0.029048489406704903,
-0.04653025045990944,
-0.00890776515007019,
-0.08877123147249222,
-0.0682494193315506,
-0.0645180344581604,
0.15578238666057587,
0.04247032850980759,
0.008188901469111443,
-0.01722905784845352,
0.1255541443824768,
0.034161169081926346,
-0.14912185072898865,
0.0034011558163911104,
0.02406858652830124,
-0.0026222807355225086,
-0.026906277984380722,
-0.05260951817035675,
0.007144564762711525,
0.0036108018830418587,
0.14362512528896332,
-0.07484577596187592,
0.04667652025818825,
0.041080277413129807,
0.027972029522061348,
-0.09008422493934631,
0.17364974319934845,
-0.04909047856926918,
-0.038273219019174576,
-0.002750605344772339,
0.07471142709255219,
0.024692097678780556,
-0.01975218765437603,
-0.1106804758310318,
0.008504113182425499,
0.07713580131530762,
0.030196789652109146,
-0.05063282698392868,
0.08162157982587814,
-0.05631410703063011,
-0.02076788991689682,
0.027288319543004036,
-0.11173115670681,
0.02560221031308174,
0.006823495961725712,
-0.09483011066913605,
-0.02639147825539112,
0.02728637121617794,
0.011311610229313374,
-0.02414284273982048,
0.13011033833026886,
-0.07984288036823273,
0.027202485129237175,
-0.08076369762420654,
-0.10382670909166336,
0.010769697837531567,
-0.08599542081356049,
0.0076627470552921295,
-0.08748263865709305,
-0.21007172763347626,
-0.01909853331744671,
0.046727247536182404,
-0.04014938324689865,
-0.06961049884557724,
-0.06955385953187943,
-0.0835360437631607,
0.027159428223967552,
-0.021323073655366898,
0.12893351912498474,
-0.08395511656999588,
0.11104843765497208,
0.01774025894701481,
0.05788698419928551,
-0.031186213716864586,
0.06459437310695648,
-0.11004582792520523,
0.006440747994929552,
-0.15297728776931763,
0.07423743605613708,
-0.060455575585365295,
0.06005608290433884,
-0.09050916135311127,
-0.10834786295890808,
0.02982953190803528,
0.0045703924261033535,
0.06440000236034393,
0.11842235922813416,
-0.18754811584949493,
-0.0734502300620079,
0.12925240397453308,
-0.06333116441965103,
-0.09445462375879288,
0.10706540197134018,
-0.06171907112002373,
0.037315309047698975,
0.07780932635068893,
0.16475434601306915,
0.07861839234828949,
-0.06579409539699554,
0.015997357666492462,
0.008876802399754524,
0.052145931869745255,
-0.02390705794095993,
0.07147536426782608,
0.007620620541274548,
0.0038367165252566338,
0.032775167375802994,
-0.04752260446548462,
0.05851665884256363,
-0.09461914747953415,
-0.09209436178207397,
-0.027468794956803322,
-0.07573463767766953,
0.04888618364930153,
0.06530959904193878,
0.054921429604291916,
-0.09235547482967377,
-0.09294842183589935,
0.04810696467757225,
0.09730982780456543,
-0.04632546380162239,
0.031973931938409805,
-0.059226393699645996,
0.051862381398677826,
0.009635386988520622,
-0.004493179731070995,
-0.18539446592330933,
-0.021456148475408554,
0.00954070407897234,
0.017123086377978325,
0.020320482552051544,
0.011561146937310696,
0.05806531757116318,
0.05564777925610542,
-0.06011062487959862,
-0.03467492759227753,
-0.03978896513581276,
0.0010630569886416197,
-0.1195698082447052,
-0.2091377079486847,
-0.02156534604728222,
-0.019054947420954704,
0.11975281685590744,
-0.20615388453006744,
0.033893246203660965,
-0.005411003716289997,
0.06852928549051285,
0.01503942534327507,
-0.0013657076051458716,
-0.03245620056986809,
0.07266071438789368,
-0.04356828331947327,
-0.050644394010305405,
0.07083238661289215,
0.0017066659638658166,
-0.08441106230020523,
-0.024708416312932968,
-0.0975821241736412,
0.15049195289611816,
0.11403798311948776,
-0.08931748569011688,
-0.07409656047821045,
0.013027248904109001,
-0.06536134332418442,
-0.033918287605047226,
-0.05394705757498741,
0.0337117463350296,
0.1742681860923767,
0.0004998202784918249,
0.15163785219192505,
-0.06879961490631104,
-0.04595618695020676,
0.011945361271500587,
-0.016863899305462837,
0.030330907553434372,
0.13684320449829102,
0.10597702860832214,
-0.06911767274141312,
0.14097626507282257,
0.14680597186088562,
-0.06612197309732437,
0.13471545279026031,
-0.03655075654387474,
-0.0664520338177681,
-0.008385390974581242,
-0.03101762942969799,
-0.021512173116207123,
0.07681774348020554,
-0.10607334226369858,
0.012885188683867455,
0.027304647490382195,
0.031894225627183914,
0.012428516522049904,
-0.20741809904575348,
-0.04069263115525246,
0.023684561252593994,
-0.06953981518745422,
-0.03584778308868408,
-0.0070330919697880745,
0.009760532528162003,
0.11615827679634094,
0.013449206948280334,
-0.09818156808614731,
0.035380296409130096,
0.00032533015473745763,
-0.06869254261255264,
0.20768414437770844,
-0.09812953323125839,
-0.18789535760879517,
-0.11972875148057938,
-0.08373663574457169,
-0.0572274848818779,
0.004424803424626589,
0.07085686922073364,
-0.09051618725061417,
-0.036620840430259705,
-0.08070702850818634,
0.022164005786180496,
0.010139248333871365,
0.023824365809559822,
0.00562083488330245,
0.0010336507111787796,
0.061844103038311005,
-0.10843398422002792,
-0.012833532877266407,
-0.05460793524980545,
-0.05565179884433746,
0.040914010256528854,
0.041607774794101715,
0.09927842020988464,
0.13664476573467255,
0.005856007803231478,
0.016890116035938263,
-0.03343404084444046,
0.23688775300979614,
-0.07384392619132996,
-0.01122202631086111,
0.1391211748123169,
-0.005289889872074127,
0.05740305781364441,
0.1460125893354416,
0.06282637268304825,
-0.09609301388263702,
0.017882216721773148,
0.044834062457084656,
-0.027758488431572914,
-0.22318391501903534,
-0.05818414315581322,
-0.04819286987185478,
0.0032264403998851776,
0.09497928619384766,
0.04202505946159363,
0.01390439085662365,
0.0423554852604866,
0.018735604360699654,
0.04776791110634804,
-0.007059517782181501,
0.05390118062496185,
0.13518476486206055,
0.04148389771580696,
0.13061204552650452,
-0.04259749501943588,
-0.05821669474244118,
0.05470576137304306,
-0.02513025514781475,
0.20699191093444824,
-0.014361931011080742,
0.11245480179786682,
0.049477145075798035,
0.14582882821559906,
-0.0019790446385741234,
0.070774145424366,
0.006923025008291006,
-0.028213953599333763,
-0.021539773792028427,
-0.039465080946683884,
-0.031156616285443306,
0.020327698439359665,
-0.05673275887966156,
0.04798572137951851,
-0.11882109940052032,
0.006636021658778191,
0.056421782821416855,
0.28855907917022705,
0.04328383132815361,
-0.32222849130630493,
-0.08991760015487671,
0.0023686771746724844,
-0.05040239915251732,
-0.020433146506547928,
0.03595375642180443,
0.08314269781112671,
-0.09416136890649796,
0.05995320901274681,
-0.04859716072678566,
0.09670396149158478,
-0.053605105727910995,
0.05268825590610504,
0.08942542225122452,
0.0784498006105423,
0.0005466343136504292,
0.08974827826023102,
-0.3083435297012329,
0.27282994985580444,
-0.0014519041869789362,
0.061287350952625275,
-0.06829816848039627,
0.002997164148837328,
0.04586651176214218,
0.08549149334430695,
0.07746932655572891,
-0.01513440441340208,
-0.021215025335550308,
-0.1954498440027237,
-0.08436164259910583,
0.020659267902374268,
0.0835474506020546,
-0.056183282285928726,
0.09914517402648926,
-0.04492644593119621,
-0.0033520914148539305,
0.06179017573595047,
-0.04891762137413025,
-0.06445353478193283,
-0.10044695436954498,
0.004949296824634075,
0.023198934271931648,
-0.0030609669629484415,
-0.060666557401418686,
-0.12139218300580978,
-0.09524907171726227,
0.1404019594192505,
-0.03946879878640175,
-0.04816878214478493,
-0.11775809526443481,
0.07762610912322998,
0.08436807245016098,
-0.08960853517055511,
0.02790946699678898,
0.01552615687251091,
0.05993396416306496,
0.043738268315792084,
-0.07480939477682114,
0.10892438143491745,
-0.07135085016489029,
-0.21143698692321777,
-0.04320032149553299,
0.11746836453676224,
0.046525727957487106,
0.0633394718170166,
-0.020477348938584328,
0.02948160655796528,
-0.0399242527782917,
-0.09023045748472214,
0.010932980105280876,
0.016879484057426453,
0.07334919273853302,
0.04663066193461418,
-0.06290629506111145,
-0.0016919458284974098,
-0.06853500008583069,
-0.0349527969956398,
0.1780133843421936,
0.25949397683143616,
-0.09962738305330276,
0.06380404531955719,
0.04757758975028992,
-0.06536484509706497,
-0.21630556881427765,
0.005061350297182798,
0.061940502375364304,
-0.007153760176151991,
0.04814966395497322,
-0.19791409373283386,
0.10802061855792999,
0.10847712308168411,
-0.012909363955259323,
0.09715858101844788,
-0.34106749296188354,
-0.12564215064048767,
0.1149054542183876,
0.09814609587192535,
0.11587465554475784,
-0.1341918259859085,
-0.019581787288188934,
-0.02048957720398903,
-0.08835356682538986,
0.12482938915491104,
-0.056770533323287964,
0.12943042814731598,
-0.035701535642147064,
0.07112782448530197,
0.009206142276525497,
-0.04193572327494621,
0.11805569380521774,
0.03238226845860481,
0.09470514953136444,
-0.07075773924589157,
-0.027884934097528458,
0.023056184872984886,
-0.048222851008176804,
0.03412288427352905,
-0.1003323569893837,
0.041811827570199966,
-0.11616313457489014,
-0.021764567121863365,
-0.08933409303426743,
0.03208053484559059,
-0.0376385897397995,
-0.05866633728146553,
-0.03800616413354874,
0.03233632072806358,
0.06476499140262604,
-0.01215091347694397,
0.1641705483198166,
0.007743700873106718,
0.12287954986095428,
0.08561353385448456,
0.07771724462509155,
-0.059384819120168686,
-0.05986449494957924,
-0.02614278718829155,
-0.009929199703037739,
0.050997234880924225,
-0.14570659399032593,
0.03194180503487587,
0.14512166380882263,
0.017762567847967148,
0.13843309879302979,
0.08158053457736969,
-0.015318149700760841,
-0.0005103519069962204,
0.06442739069461823,
-0.17735512554645538,
-0.07308335602283478,
-0.01301504485309124,
-0.05958404764533043,
-0.10300096124410629,
0.04131104797124863,
0.10566125065088272,
-0.062074609100818634,
-0.006456418894231319,
0.006775288842618465,
0.03784332796931267,
-0.03680609166622162,
0.2012059986591339,
0.049246493726968765,
0.05084298923611641,
-0.11833256483078003,
0.0951104462146759,
0.055079542100429535,
-0.07730397582054138,
0.009587360545992851,
0.1167670264840126,
-0.09786129742860794,
-0.04967300221323967,
0.058507196605205536,
0.14490899443626404,
-0.05419238656759262,
-0.04415317252278328,
-0.1385582685470581,
-0.12444993853569031,
0.10533040761947632,
0.13536575436592102,
0.10392849892377853,
0.020100882276892662,
-0.0670023262500763,
0.006453210487961769,
-0.09571701288223267,
0.1060243472456932,
0.056564196944236755,
0.05839190259575844,
-0.13912338018417358,
0.13312087953090668,
0.01129111461341381,
0.06221367418766022,
-0.025003761053085327,
0.01619582250714302,
-0.09783780574798584,
0.017586661502718925,
-0.12129666656255722,
-0.018068283796310425,
-0.03116449899971485,
0.01621393673121929,
-0.016299603506922722,
-0.060485824942588806,
-0.0529521144926548,
0.00885004922747612,
-0.11358160525560379,
-0.02869308926165104,
0.016799651086330414,
0.062166616320610046,
-0.1247488409280777,
-0.04600972309708595,
0.018808165565133095,
-0.07310094684362411,
0.09277989715337753,
0.0459553487598896,
0.0025514932349324226,
0.052564822137355804,
-0.11077852547168732,
-0.00727061927318573,
0.06228511407971382,
0.018891341984272003,
0.0606655515730381,
-0.08388613909482956,
-0.003687905613332987,
-0.00970155093818903,
0.04325396567583084,
0.023066028952598572,
0.0776350200176239,
-0.14114613831043243,
0.02718339115381241,
-0.01936212182044983,
-0.07857812941074371,
-0.06541771441698074,
0.03024759702384472,
0.08048564195632935,
0.028498616069555283,
0.21920040249824524,
-0.08828262984752655,
0.03903079777956009,
-0.1972806453704834,
0.0010018943576142192,
-0.015986520797014236,
-0.12482620030641556,
-0.13158154487609863,
-0.06699343025684357,
0.07922818511724472,
-0.056421227753162384,
0.1302114576101303,
0.032445624470710754,
0.04351352900266647,
0.02991766482591629,
-0.021433105692267418,
0.00839054211974144,
0.01808553747832775,
0.17736123502254486,
0.0445675365626812,
-0.036621689796447754,
0.06595399230718613,
0.02780519798398018,
0.10027021169662476,
0.10584163665771484,
0.22005972266197205,
0.13828721642494202,
0.03556032106280327,
0.08935406804084778,
0.036587465554475784,
-0.09169379621744156,
-0.16152530908584595,
0.03327496349811554,
-0.055170945823192596,
0.11263271421194077,
-0.03230105713009834,
0.20382489264011383,
0.05389487370848656,
-0.16097359359264374,
0.04850563779473305,
-0.04743834584951401,
-0.077738456428051,
-0.11703794449567795,
-0.04184100031852722,
-0.08225864171981812,
-0.1496899425983429,
-0.0032309654634445906,
-0.11738613247871399,
0.04146653413772583,
0.11859798431396484,
0.011069485917687416,
-0.016624020412564278,
0.12960472702980042,
0.008653546683490276,
0.005688495468348265,
0.07222598046064377,
0.002026789588853717,
-0.027530740946531296,
-0.09518201649188995,
-0.07461987435817719,
-0.008936089463531971,
0.0020077198278158903,
0.024525830522179604,
-0.045106761157512665,
-0.06389382481575012,
0.03950436785817146,
-0.039511412382125854,
-0.0957624688744545,
0.01930663175880909,
0.022846154868602753,
0.06768862903118134,
0.05024021491408348,
0.02308797836303711,
0.009200540371239185,
0.00649327551946044,
0.24609850347042084,
-0.08123745769262314,
-0.09457102417945862,
-0.09264760464429855,
0.2692146301269531,
0.0544854961335659,
-0.004913522396236658,
0.031184948980808258,
-0.06925829499959946,
-0.026005607098340988,
0.22552382946014404,
0.21037155389785767,
-0.09574777632951736,
-0.008250987157225609,
-0.0044027152471244335,
-0.000005739032076235162,
-0.004022403620183468,
0.11691966652870178,
0.1421191394329071,
0.07113001495599747,
-0.08421652764081955,
-0.04592859372496605,
-0.052685610949993134,
-0.018193185329437256,
-0.04589454084634781,
0.07432396709918976,
0.02908828854560852,
-0.00138613092713058,
-0.04025131091475487,
0.06771959364414215,
-0.07939858734607697,
-0.0995761826634407,
0.03051609732210636,
-0.20389343798160553,
-0.16967451572418213,
-0.024830279871821404,
0.09670645743608475,
0.01583838276565075,
0.052089542150497437,
-0.010508229956030846,
-0.004842934664338827,
0.08079080283641815,
-0.019669825211167336,
-0.07906021177768707,
-0.0901455208659172,
0.10217557102441788,
-0.10362870991230011,
0.205000102519989,
-0.04385513439774513,
0.04882242903113365,
0.1120545044541359,
0.05669834092259407,
-0.0871119573712349,
0.08042874187231064,
0.044351786375045776,
-0.061959948390722275,
0.036527518182992935,
0.08052059262990952,
-0.033160250633955,
0.06091141328215599,
0.05591949447989464,
-0.12451163679361343,
0.015713131055235863,
-0.05440833419561386,
-0.07676967978477478,
-0.03994416445493698,
-0.03870091587305069,
-0.03903334215283394,
0.12979599833488464,
0.22061756253242493,
-0.03993972763419151,
0.009256713092327118,
-0.07536937296390533,
0.008049887605011463,
0.03563191741704941,
0.02269732765853405,
-0.052810803055763245,
-0.22973395884037018,
0.01045290194451809,
0.04942389205098152,
0.005311143584549427,
-0.21100766956806183,
-0.08661364018917084,
0.005995279178023338,
-0.05699320510029793,
-0.0984952375292778,
0.0967397689819336,
0.0548073835670948,
0.03230090066790581,
-0.04600255936384201,
-0.06765136122703552,
-0.06697617471218109,
0.16541291773319244,
-0.16931584477424622,
-0.07995930314064026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0664
- Precision: 0.9332
- Recall: 0.9423
- F1: 0.9377
- Accuracy: 0.9852
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2042 | 1.0 | 878 | 0.0636 | 0.9230 | 0.9253 | 0.9241 | 0.9822 |
| 0.0428 | 2.0 | 1756 | 0.0577 | 0.9286 | 0.9370 | 0.9328 | 0.9841 |
| 0.0199 | 3.0 | 2634 | 0.0606 | 0.9364 | 0.9401 | 0.9383 | 0.9851 |
| 0.0121 | 4.0 | 3512 | 0.0641 | 0.9339 | 0.9380 | 0.9360 | 0.9847 |
| 0.0079 | 5.0 | 4390 | 0.0664 | 0.9332 | 0.9423 | 0.9377 | 0.9852 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
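
As a quick sanity check of a token-classification checkpoint like this one, the sketch below shows how it could be loaded with the `transformers` pipeline API. The repository id `andi611/distilbert-base-uncased-ner-conll2003` is the one listed for this row; the input sentence is illustrative only, not taken from the training data.

```python
# Minimal sketch: run the fine-tuned NER checkpoint through the pipeline API.
# The repository id comes from this card; the example sentence is arbitrary.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="andi611/distilbert-base-uncased-ner-conll2003",
    aggregation_strategy="simple",  # merge word-piece tokens into whole entities
)

for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

With `aggregation_strategy="simple"` the pipeline groups sub-word pieces back into whole entity spans, which is usually what you want when eyeballing CoNLL-2003-style predictions.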
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "distilbert-base-uncased-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.985193893275295}}]}]}
|
token-classification
|
andi611/distilbert-base-uncased-ner-conll2003
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-ner
===========================
This model is a fine-tuned version of distilbert-base-uncased on the conll2003 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0664
* Precision: 0.9332
* Recall: 0.9423
* F1: 0.9377
* Accuracy: 0.9852
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 16
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
65,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #token-classification #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.1076073870062828,
0.10513663291931152,
-0.002198984147980809,
0.11938350647687912,
0.1612469106912613,
0.03563809394836426,
0.11242253333330154,
0.11787554621696472,
-0.11206692457199097,
0.031097255647182465,
0.12465070933103561,
0.16895097494125366,
0.011764971539378166,
0.1181146651506424,
-0.055588092654943466,
-0.24734260141849518,
-0.004848663229495287,
0.05362362042069435,
-0.07395922392606735,
0.12722736597061157,
0.09592223167419434,
-0.138313427567482,
0.0890091136097908,
0.007542874198406935,
-0.21967755258083344,
0.00594168808311224,
0.010875247418880463,
-0.05600154027342796,
0.13196563720703125,
0.030218428000807762,
0.13743732869625092,
-0.0027231976855546236,
0.09000472724437714,
-0.1753266304731369,
0.006341206841170788,
0.05195651948451996,
0.003020689357072115,
0.09048646688461304,
0.04959925636649132,
0.012804215773940086,
0.12239964306354523,
-0.07412251085042953,
0.05553184077143669,
0.019116045907139778,
-0.11375485360622406,
-0.21716178953647614,
-0.0922015830874443,
0.04420822113752365,
0.07545895129442215,
0.09799593687057495,
0.005002251360565424,
0.14178387820720673,
-0.09012165665626526,
0.09143839031457901,
0.21518105268478394,
-0.30111774802207947,
-0.06766077131032944,
0.04930804297327995,
0.008314761333167553,
0.03638077154755592,
-0.10650372505187988,
-0.041916634887456894,
0.05576767027378082,
0.04716026782989502,
0.12713073194026947,
-0.035139698535203934,
-0.12242778390645981,
0.014530899934470654,
-0.1434623897075653,
-0.024591051042079926,
0.16128134727478027,
0.03939983248710632,
-0.03143124654889107,
-0.03972930088639259,
-0.057301707565784454,
-0.16961769759655,
-0.026889771223068237,
-0.009194016456604004,
0.045290861278772354,
-0.03700029477477074,
-0.06484095752239227,
0.010688294656574726,
-0.10694543272256851,
-0.06928358227014542,
-0.08828943222761154,
0.14211688935756683,
0.03859066963195801,
0.017247699201107025,
-0.020721416920423508,
0.10686831921339035,
0.005272739566862583,
-0.11999869346618652,
0.01755138859152794,
0.026397354900836945,
-0.010128836147487164,
-0.05522899329662323,
-0.05797212943434715,
-0.04886199161410332,
0.010077579878270626,
0.13924521207809448,
-0.05526990815997124,
0.041509371250867844,
0.050863925367593765,
0.041902486234903336,
-0.08500851690769196,
0.1805059313774109,
-0.048915937542915344,
-0.019272571429610252,
0.003873433917760849,
0.04066985100507736,
0.019273286685347557,
-0.0015730691375210881,
-0.11869989335536957,
0.0033236858434975147,
0.0979384183883667,
0.0033092934172600508,
-0.0683649480342865,
0.069512739777565,
-0.0634038895368576,
-0.022378316149115562,
0.021969756111502647,
-0.08789990097284317,
0.033683471381664276,
-0.0027671423740684986,
-0.08308728039264679,
-0.019124343991279602,
0.019254568964242935,
0.01608639769256115,
-0.008631453849375248,
0.12029533088207245,
-0.0946977362036705,
0.02056374028325081,
-0.09746742248535156,
-0.10941711068153381,
0.026491057127714157,
-0.09583111107349396,
0.03093912824988365,
-0.099617138504982,
-0.16677825152873993,
-0.013064525090157986,
0.058841772377491,
-0.02399609424173832,
-0.05999923124909401,
-0.042178861796855927,
-0.079273521900177,
0.012289567850530148,
-0.015569733455777168,
0.12468989938497543,
-0.06444535404443741,
0.09785138815641403,
0.033289920538663864,
0.06341839581727982,
-0.05969324707984924,
0.05379883199930191,
-0.09598755836486816,
0.013327586464583874,
-0.15332205593585968,
0.020791025832295418,
-0.05782254785299301,
0.06714814156293869,
-0.08710399270057678,
-0.1057487353682518,
0.01722114160656929,
-0.004660223610699177,
0.06899739056825638,
0.07550425827503204,
-0.17102397978305817,
-0.07026377320289612,
0.13667286932468414,
-0.07122193276882172,
-0.12201110273599625,
0.1113618016242981,
-0.05601189658045769,
0.033961012959480286,
0.05784005671739578,
0.14860473573207855,
0.08345253765583038,
-0.08566517382860184,
-0.013237923383712769,
0.01519695669412613,
0.04839423671364784,
-0.08615140616893768,
0.0694858506321907,
0.009787951596081257,
0.027247179299592972,
0.028364868834614754,
-0.0269267950206995,
0.05479460582137108,
-0.09419265389442444,
-0.09217438846826553,
-0.037932995706796646,
-0.09938852488994598,
0.03452424705028534,
0.079277902841568,
0.07117821276187897,
-0.09231209754943848,
-0.08002957701683044,
0.07848982512950897,
0.09352562576532364,
-0.05412560701370239,
0.024595648050308228,
-0.06708186119794846,
0.07588386535644531,
-0.04389036074280739,
-0.03295448049902916,
-0.18351975083351135,
-0.03943654149770737,
0.010161043144762516,
0.006816448178142309,
0.014031711034476757,
0.03128878027200699,
0.06336448341608047,
0.06478404998779297,
-0.053382452577352524,
-0.02281554415822029,
-0.02759203128516674,
0.0010844708885997534,
-0.13118596374988556,
-0.1999122053384781,
-0.043353237211704254,
-0.022343996912240982,
0.1327085793018341,
-0.20355287194252014,
0.03229957073926926,
0.005411786027252674,
0.0877397209405899,
0.020335905253887177,
-0.009532843716442585,
-0.04357614368200302,
0.08556750416755676,
-0.04855324700474739,
-0.05199933797121048,
0.06643614917993546,
0.011306684464216232,
-0.08663387596607208,
-0.06299934536218643,
-0.09501966834068298,
0.1713522970676422,
0.13490022718906403,
-0.11440056562423706,
-0.08005697280168533,
-0.014047321863472462,
-0.06534023582935333,
-0.03579240292310715,
-0.0397597998380661,
0.03365975618362427,
0.15459010004997253,
-0.016101785004138947,
0.14628881216049194,
-0.06603416055440903,
-0.051174454391002655,
0.02097051776945591,
-0.027250206097960472,
0.013241143897175789,
0.11228148639202118,
0.13831232488155365,
-0.07260540127754211,
0.15550021827220917,
0.14654305577278137,
-0.10409477353096008,
0.1245267391204834,
-0.0473686121404171,
-0.07318548113107681,
-0.021463172510266304,
-0.0209116879850626,
-0.0003847929765470326,
0.10893765091896057,
-0.1288032978773117,
0.0008595709223300219,
0.027947822585701942,
0.01856303960084915,
0.022304123267531395,
-0.2254379391670227,
-0.03289319574832916,
0.027972618117928505,
-0.033183686435222626,
-0.0002915832737926394,
-0.013878337107598782,
0.001768632442690432,
0.10239053517580032,
0.0043759653344750404,
-0.09393318742513657,
0.04668498784303665,
0.0047585503198206425,
-0.07225003093481064,
0.21192607283592224,
-0.09443992376327515,
-0.1361413598060608,
-0.12580442428588867,
-0.0797615498304367,
-0.047133877873420715,
0.009648756124079227,
0.054429374635219574,
-0.07827164977788925,
-0.03769773244857788,
-0.06261102110147476,
0.00028135417960584164,
-0.002811729209497571,
0.03967275843024254,
0.011393156833946705,
0.004439393058419228,
0.06587400287389755,
-0.10977514088153839,
-0.0066919997334480286,
-0.05626662075519562,
-0.06020158529281616,
0.04615163058042526,
0.03869457542896271,
0.11615059524774551,
0.15702944993972778,
-0.015806660056114197,
0.008102092891931534,
-0.032823529094457626,
0.23073172569274902,
-0.0630875751376152,
-0.029108993709087372,
0.12762047350406647,
-0.010333388112485409,
0.044983409345149994,
0.1185576319694519,
0.07289332896471024,
-0.08696767687797546,
0.005688404198735952,
0.03767053037881851,
-0.030900143086910248,
-0.2219444364309311,
-0.05262552946805954,
-0.05667540058493614,
-0.007401083130389452,
0.08993732929229736,
0.02881161868572235,
0.04070110619068146,
0.0698264092206955,
0.040084633976221085,
0.09194222092628479,
-0.04679902642965317,
0.059020884335041046,
0.11880846321582794,
0.047954682260751724,
0.12462294846773148,
-0.045579489320516586,
-0.05193481594324112,
0.043646566569805145,
0.0014209445798769593,
0.22964885830879211,
-0.0005110073834657669,
0.13128761947155,
0.05933918431401253,
0.17934919893741608,
-0.008657144382596016,
0.07512476295232773,
-0.006802459247410297,
-0.03990505635738373,
-0.009401495568454266,
-0.03720758855342865,
-0.03346845135092735,
0.025232834741473198,
-0.054554421454668045,
0.06624618917703629,
-0.11579775810241699,
0.01030728593468666,
0.05299283191561699,
0.249282106757164,
0.04521128535270691,
-0.3302909731864929,
-0.09601487964391708,
0.00003525306601659395,
-0.03025868348777294,
-0.022846153005957603,
0.028701232746243477,
0.09686454385519028,
-0.08039477467536926,
0.03064670041203499,
-0.06491164118051529,
0.08572512865066528,
-0.051187556236982346,
0.0374395027756691,
0.0955391526222229,
0.09682763367891312,
0.012832601554691792,
0.07991021126508713,
-0.2923164665699005,
0.26806753873825073,
0.008694425225257874,
0.07135801017284393,
-0.07323331385850906,
0.009760202839970589,
0.024949723854660988,
0.06730460375547409,
0.0645306333899498,
-0.016186589375138283,
-0.039956290274858475,
-0.1974363625049591,
-0.048683296889066696,
0.021986935287714005,
0.07755471020936966,
-0.015944892540574074,
0.09214916080236435,
-0.03110036440193653,
0.005000619683414698,
0.07664600759744644,
-0.01727386564016342,
-0.04386385902762413,
-0.10033489018678665,
-0.005092066712677479,
0.03699468821287155,
-0.051518764346838,
-0.06591551750898361,
-0.11178892105817795,
-0.12304237484931946,
0.16076022386550903,
-0.05534893646836281,
-0.031749941408634186,
-0.11110914498567581,
0.09153956919908524,
0.08018918335437775,
-0.08376748859882355,
0.05132223665714264,
0.005018382333219051,
0.06862886250019073,
0.04063551127910614,
-0.06763971596956253,
0.10569702088832855,
-0.07259467244148254,
-0.16437004506587982,
-0.06489422917366028,
0.09622326493263245,
0.031733714044094086,
0.06495131552219391,
-0.014171576127409935,
0.015284893102943897,
-0.03807907924056053,
-0.08719412982463837,
0.01875888928771019,
0.0004987449501641095,
0.09201269596815109,
0.021595189347863197,
-0.0610826350748539,
0.00712607940658927,
-0.04136079549789429,
-0.029505610466003418,
0.18300846219062805,
0.23428308963775635,
-0.10049451887607574,
0.008862261660397053,
0.025829745456576347,
-0.06838201731443405,
-0.20117758214473724,
0.04477375000715256,
0.0597800649702549,
0.007067815400660038,
0.027329707518219948,
-0.17403896152973175,
0.14552798867225647,
0.11773774027824402,
-0.01325774472206831,
0.10570386797189713,
-0.31797847151756287,
-0.11987729370594025,
0.12254009395837784,
0.13839900493621826,
0.10827513039112091,
-0.1413266807794571,
-0.020460473373532295,
-0.017596730962395668,
-0.1468520164489746,
0.11765188723802567,
-0.0857839286327362,
0.11668470501899719,
-0.030764076858758926,
0.07137646526098251,
0.0005678002489730716,
-0.06288914382457733,
0.11513124406337738,
0.03002362698316574,
0.10120277106761932,
-0.05444519221782684,
-0.04095597192645073,
0.04474782571196556,
-0.035227902233600616,
0.017136355862021446,
-0.07007727026939392,
0.028946049511432648,
-0.09863274544477463,
-0.020405840128660202,
-0.07310023158788681,
0.046056777238845825,
-0.03519085422158241,
-0.06451907753944397,
-0.045564111322164536,
0.032532598823308945,
0.04390597715973854,
-0.016117092221975327,
0.14353710412979126,
0.0375022254884243,
0.14037056267261505,
0.10261031240224838,
0.07294642925262451,
-0.07114697247743607,
-0.07132519781589508,
-0.019167345017194748,
-0.01889003999531269,
0.06173458322882652,
-0.13308332860469818,
0.025541890412569046,
0.15206046402454376,
0.022239362820982933,
0.13187378644943237,
0.08638618886470795,
-0.021033883094787598,
0.0058138407766819,
0.0670572817325592,
-0.15903478860855103,
-0.06361298263072968,
0.0024999005254358053,
-0.05242021754384041,
-0.11409051716327667,
0.0631243959069252,
0.09146551042795181,
-0.06584148854017258,
-0.005897151306271553,
0.00020780858176294714,
0.0023847478441894054,
-0.05690974369645119,
0.20092368125915527,
0.06393599510192871,
0.04321441426873207,
-0.1024925634264946,
0.07627405226230621,
0.050095219165086746,
-0.07408455014228821,
0.0003610361891333014,
0.055590905249118805,
-0.08640364557504654,
-0.044707417488098145,
0.07537467032670975,
0.15484470129013062,
-0.06019914895296097,
-0.05420789495110512,
-0.13284794986248016,
-0.11693155020475388,
0.08165765553712845,
0.15960997343063354,
0.119594044983387,
0.02531096339225769,
-0.053055815398693085,
0.013325943611562252,
-0.12418671697378159,
0.08743160218000412,
0.035285741090774536,
0.07795166969299316,
-0.1580650806427002,
0.1604331135749817,
0.003312465036287904,
0.0404709093272686,
-0.01882511004805565,
0.03602766990661621,
-0.10866308212280273,
0.006106832530349493,
-0.10814034193754196,
-0.026382356882095337,
-0.038533568382263184,
0.00889658834785223,
0.00634733634069562,
-0.05559166893362999,
-0.06189225986599922,
0.0181900504976511,
-0.1095781996846199,
-0.01697133667767048,
0.03900127857923508,
0.05949300527572632,
-0.11618325859308243,
-0.03906461223959923,
0.01810373179614544,
-0.05881402641534805,
0.06825470924377441,
0.03987310454249382,
0.03165002539753914,
0.05455288663506508,
-0.12807106971740723,
0.00524490512907505,
0.07691537588834763,
0.021141598001122475,
0.07743103802204132,
-0.08579732477664948,
-0.008330726996064186,
0.0050088465213775635,
0.04863749444484711,
0.015080061741173267,
0.07400210946798325,
-0.13535155355930328,
-0.008367255330085754,
-0.03347816690802574,
-0.0788029208779335,
-0.07073642313480377,
0.02812747284770012,
0.10869055241346359,
0.01014740951359272,
0.20828838646411896,
-0.07000979036092758,
0.026251889765262604,
-0.19943229854106903,
0.005642743315547705,
-0.013958584517240524,
-0.10782438516616821,
-0.13039298355579376,
-0.05401451140642166,
0.0606338195502758,
-0.05505683645606041,
0.1366691291332245,
0.017976101487874985,
0.03283514454960823,
0.02825980819761753,
-0.017562348395586014,
0.014394653029739857,
0.024737432599067688,
0.20589514076709747,
0.03586859256029129,
-0.03398741036653519,
0.059271350502967834,
0.04859277233481407,
0.10362850874662399,
0.1074921265244484,
0.19402384757995605,
0.13896389305591583,
-0.018646936863660812,
0.09636865556240082,
0.03770878165960312,
-0.07496269047260284,
-0.15884451568126678,
0.04626081511378288,
-0.05561337247490883,
0.1032300516963005,
-0.021349221467971802,
0.21768906712532043,
0.058014847338199615,
-0.16635145246982574,
0.03043595515191555,
-0.05643855780363083,
-0.08757201582193375,
-0.10523062944412231,
-0.042806051671504974,
-0.08219410479068756,
-0.13157591223716736,
0.0035863465163856745,
-0.11756861209869385,
0.007027503103017807,
0.1198902428150177,
0.008179628290235996,
-0.019413108006119728,
0.156028613448143,
0.008553615771234035,
0.03843332827091217,
0.04382907226681709,
0.01368470024317503,
-0.0346563495695591,
-0.11066200584173203,
-0.06676571816205978,
-0.02588251605629921,
-0.02035604417324066,
0.036015480756759644,
-0.06315228343009949,
-0.047107137739658356,
0.03940960019826889,
-0.011299668811261654,
-0.09134268760681152,
0.010107416659593582,
0.016515564173460007,
0.05728939175605774,
0.03584010526537895,
0.005982487462460995,
0.024028312414884567,
-0.010417236015200615,
0.19815343618392944,
-0.08169329166412354,
-0.06646012514829636,
-0.10989249497652054,
0.24216103553771973,
0.04222467169165611,
-0.008581580594182014,
0.03859928995370865,
-0.07080496102571487,
-0.000011368631930963602,
0.23837029933929443,
0.19439274072647095,
-0.08108504861593246,
-0.01156459841877222,
0.018373429775238037,
-0.011035578325390816,
-0.03819417580962181,
0.10738808661699295,
0.13674227893352509,
0.061294879764318466,
-0.08846057206392288,
-0.059125401079654694,
-0.05591624975204468,
-0.0066296691074967384,
-0.034781575202941895,
0.04781313240528107,
0.042121946811676025,
0.002453041961416602,
-0.038026921451091766,
0.04360378533601761,
-0.06083684042096138,
-0.09507039934396744,
0.08363980054855347,
-0.20202091336250305,
-0.16307908296585083,
-0.009928598999977112,
0.10735698789358139,
-0.0010914831655099988,
0.06032969057559967,
-0.03488588705658913,
-0.00293950317427516,
0.08477321267127991,
-0.020107055082917213,
-0.10389745980501175,
-0.07580144703388214,
0.09531654417514801,
-0.09158367663621902,
0.21323344111442566,
-0.047048263251781464,
0.08128208667039871,
0.13047008216381073,
0.0641089603304863,
-0.08086171001195908,
0.05735881254076958,
0.04762261360883713,
-0.0814509391784668,
0.02551138959825039,
0.07037127017974854,
-0.03555307164788246,
0.08633238822221756,
0.040000658482313156,
-0.13251648843288422,
0.016538256779313087,
-0.0611925832927227,
-0.05308819189667702,
-0.047737594693899155,
-0.033292099833488464,
-0.058221474289894104,
0.13523294031620026,
0.21314223110675812,
-0.02809472195804119,
-0.0063493927009403706,
-0.06802180409431458,
0.019221622496843338,
0.057636331766843796,
0.032789431512355804,
-0.06263890117406845,
-0.22152477502822876,
0.024024588987231255,
0.03306620195508003,
-0.020183339715003967,
-0.20841239392757416,
-0.09204568713903427,
0.0024311039596796036,
-0.0755922719836235,
-0.10170083492994308,
0.07628828287124634,
0.08091872185468674,
0.0480458065867424,
-0.06336529552936554,
-0.04137992858886719,
-0.0825699120759964,
0.1415712833404541,
-0.14768193662166595,
-0.09798780828714371
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-ner-mit-restaurant
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the mit_restaurant dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3097
- Precision: 0.7874
- Recall: 0.8104
- F1: 0.7988
- Accuracy: 0.9119
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 431 | 0.4575 | 0.6220 | 0.6856 | 0.6523 | 0.8650 |
| 1.1705 | 2.0 | 862 | 0.3183 | 0.7747 | 0.7953 | 0.7848 | 0.9071 |
| 0.3254 | 3.0 | 1293 | 0.3163 | 0.7668 | 0.8021 | 0.7841 | 0.9058 |
| 0.2287 | 4.0 | 1724 | 0.3097 | 0.7874 | 0.8104 | 0.7988 | 0.9119 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
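
The hyperparameters listed above map directly onto `TrainingArguments` in the `transformers` Trainer API. The sketch below is one plausible way to express them; it is not the exact script used to produce this checkpoint, and the model, tokenizer, and dataset preparation are intentionally omitted.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# Only the arguments are shown; data and model setup are left out on purpose.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-ner-mit-restaurant",
    learning_rate=2e-5,              # learning_rate: 2e-05
    per_device_train_batch_size=16,  # train_batch_size: 16
    per_device_eval_batch_size=32,   # eval_batch_size: 32
    seed=42,                         # seed: 42
    lr_scheduler_type="linear",      # lr_scheduler_type: linear
    warmup_steps=500,                # lr_scheduler_warmup_steps: 500
    num_train_epochs=4,              # num_epochs: 4
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # epsilon: 1e-08
)
```

These arguments would then be passed to a `Trainer` together with a token-classification model and the mit_restaurant splits.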
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["mit_restaurant"], "metrics": ["precision", "recall", "f1", "accuracy"], "model_index": [{"name": "distilbert-base-uncased-ner-mit-restaurant", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "mit_restaurant", "type": "mit_restaurant"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.9118988661540467}}]}]}
|
token-classification
|
andi611/distilbert-base-uncased-ner-mit-restaurant
|
[
"transformers",
"pytorch",
"distilbert",
"token-classification",
"generated_from_trainer",
"en",
"dataset:mit_restaurant",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #distilbert #token-classification #generated_from_trainer #en #dataset-mit_restaurant #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-ner-mit-restaurant
==========================================
This model is a fine-tuned version of distilbert-base-uncased on the mit\_restaurant dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3097
* Precision: 0.7874
* Recall: 0.8104
* F1: 0.7988
* Accuracy: 0.9119
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 4
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #token-classification #generated_from_trainer #en #dataset-mit_restaurant #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 4",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
63,
116,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #token-classification #generated_from_trainer #en #dataset-mit_restaurant #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 4### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.10181722790002823,
0.12177235633134842,
-0.002629888476803899,
0.1251061111688614,
0.15225067734718323,
0.023133570328354836,
0.09132829308509827,
0.1513907015323639,
-0.09386350959539413,
0.023969950154423714,
0.11697497963905334,
0.17440630495548248,
0.01058671809732914,
0.14962489902973175,
-0.0457165464758873,
-0.271986186504364,
-0.000782923074439168,
0.02704126574099064,
-0.09166022390127182,
0.13729339838027954,
0.08599930256605148,
-0.12201137095689774,
0.10224752873182297,
-0.0024761392269283533,
-0.16432558000087738,
0.011197204701602459,
0.005257849115878344,
-0.05598961561918259,
0.14598827064037323,
0.017553746700286865,
0.10213784873485565,
0.01403120532631874,
0.11100607365369797,
-0.1873047649860382,
-0.00020992422651033849,
0.05389033257961273,
0.01745704375207424,
0.08375432342290878,
0.05386795476078987,
0.006855651270598173,
0.11130683124065399,
-0.0914265587925911,
0.06576593965291977,
0.0280850138515234,
-0.123501718044281,
-0.24197235703468323,
-0.09538387507200241,
0.0328797847032547,
0.07550521194934845,
0.0888882577419281,
0.0026141032576560974,
0.09015354514122009,
-0.10010212659835815,
0.10767369717359543,
0.23222163319587708,
-0.2785135805606842,
-0.06314734369516373,
0.023732738569378853,
0.01496485061943531,
0.051869362592697144,
-0.10054805129766464,
-0.02943762019276619,
0.038370490074157715,
0.049144335091114044,
0.1220092847943306,
-0.036982737481594086,
-0.0971299335360527,
0.012547989375889301,
-0.12639310956001282,
-0.028838103637099266,
0.13715018332004547,
0.02849535457789898,
-0.04132474586367607,
-0.040611352771520615,
-0.07206525653600693,
-0.16643111407756805,
-0.04282819479703903,
-0.023098701611161232,
0.057129062712192535,
-0.03184490278363228,
-0.06802129745483398,
-0.004038466140627861,
-0.07743756473064423,
-0.0777493491768837,
-0.04512237757444382,
0.1505207121372223,
0.046371426433324814,
-0.0023786844685673714,
-0.00763054471462965,
0.11344239860773087,
0.018773609772324562,
-0.13578765094280243,
-0.0029400684870779514,
0.03064117021858692,
-0.02792595885694027,
-0.04612822085618973,
-0.03186549246311188,
0.019326984882354736,
0.013322204351425171,
0.16977547109127045,
-0.061828065663576126,
0.058663032948970795,
0.042140062898397446,
0.005597410723567009,
-0.10220406949520111,
0.18567988276481628,
-0.04365154728293419,
-0.05909494683146477,
-0.013513087294995785,
0.06191712245345116,
0.010846292600035667,
-0.007704973220825195,
-0.09884583204984665,
0.0004981013480573893,
0.0923500582575798,
0.028868118301033974,
-0.052718114107847214,
0.06946495175361633,
-0.04935450106859207,
-0.01964702643454075,
0.03824459761381149,
-0.10225658863782883,
0.03613422438502312,
0.006826553028076887,
-0.09568940103054047,
-0.030855366960167885,
0.012737519107758999,
0.007212227210402489,
-0.009239770472049713,
0.13004109263420105,
-0.0887894555926323,
0.017456810921430588,
-0.08198654651641846,
-0.10270469635725021,
0.010946564376354218,
-0.07837781310081482,
0.011310459114611149,
-0.08197592198848724,
-0.18539874255657196,
-0.02432692050933838,
0.06960923969745636,
-0.04680031165480614,
-0.05238565430045128,
-0.05609025061130524,
-0.07430567592382431,
0.020673485472798347,
-0.015859464183449745,
0.12426509708166122,
-0.06984497606754303,
0.10592415183782578,
0.027905385941267014,
0.058692365884780884,
-0.019531145691871643,
0.0694471001625061,
-0.10858976095914841,
0.027268458157777786,
-0.18210141360759735,
0.04720906913280487,
-0.05897212773561478,
0.05617620050907135,
-0.11279837042093277,
-0.1112205758690834,
0.03793270140886307,
-0.006417635828256607,
0.06822992116212845,
0.11580155789852142,
-0.18702678382396698,
-0.06247808039188385,
0.13381820917129517,
-0.06778156012296677,
-0.10621833056211472,
0.11360079050064087,
-0.05111626908183098,
0.04739631712436676,
0.07190759479999542,
0.16707341372966766,
0.08595050871372223,
-0.07345585525035858,
0.0051195924170315266,
0.013799655251204967,
0.049940332770347595,
-0.029744137078523636,
0.06769628077745438,
0.006555686239153147,
0.013948258012533188,
0.025737158954143524,
-0.04068716615438461,
0.049080878496170044,
-0.08813664317131042,
-0.09097399562597275,
-0.0331876315176487,
-0.08182047307491302,
0.0572873093187809,
0.05878105387091637,
0.05775752663612366,
-0.09092440456151962,
-0.0898001417517662,
0.058317191898822784,
0.1022050529718399,
-0.05495878681540489,
0.030166789889335632,
-0.0580977126955986,
0.057333480566740036,
0.00038600267725996673,
-0.023613659664988518,
-0.17544758319854736,
-0.02766602486371994,
0.01619395799934864,
-0.016754066571593285,
0.015470276586711407,
0.02586670033633709,
0.06754294782876968,
0.06706346571445465,
-0.05325727537274361,
-0.037939902395009995,
-0.017684899270534515,
0.013494963757693768,
-0.12317857891321182,
-0.2042473703622818,
-0.0384320504963398,
-0.02038734219968319,
0.11729767918586731,
-0.20239794254302979,
0.025925051420927048,
0.0016153439646586776,
0.08215075731277466,
0.024743404239416122,
-0.0072026001289486885,
-0.03989803045988083,
0.06655241549015045,
-0.03986096754670143,
-0.06741488724946976,
0.06115792319178581,
-0.0025598632637411356,
-0.08130289614200592,
-0.04163428023457527,
-0.09384877234697342,
0.1502622663974762,
0.10938612371683121,
-0.06579410284757614,
-0.09374287724494934,
-0.0006477161077782512,
-0.06007468327879906,
-0.03909413143992424,
-0.04166886955499649,
0.039418332278728485,
0.15205560624599457,
0.0036865798756480217,
0.14701588451862335,
-0.055588092654943466,
-0.03932317718863487,
0.01799878478050232,
-0.01710176281630993,
0.023829594254493713,
0.12799042463302612,
0.10500292479991913,
-0.08257482945919037,
0.1410731077194214,
0.15679767727851868,
-0.06093674153089523,
0.1138809323310852,
-0.034057240933179855,
-0.056656766682863235,
-0.02843649499118328,
-0.004433324560523033,
-0.01462448202073574,
0.09547176957130432,
-0.10714489966630936,
0.007416325155645609,
0.017046093940734863,
0.03169059380888939,
0.0015807755989953876,
-0.21066440641880035,
-0.024638520553708076,
0.03693949058651924,
-0.06590636074542999,
-0.013981443829834461,
-0.01834266260266304,
-0.0026267985813319683,
0.09860163927078247,
0.021402573212981224,
-0.1176464632153511,
0.03672586753964424,
0.0055872369557619095,
-0.0583505854010582,
0.19993388652801514,
-0.10710611939430237,
-0.15961742401123047,
-0.10391677916049957,
-0.09446573257446289,
-0.046498458832502365,
0.007287344429641962,
0.06641688197851181,
-0.09546041488647461,
-0.041310351341962814,
-0.05793525278568268,
0.01220650877803564,
-0.005033330991864204,
0.027664845809340477,
-0.02150125615298748,
0.0035955694038420916,
0.06000461056828499,
-0.10709888488054276,
-0.012176385149359703,
-0.044936057180166245,
-0.06305871158838272,
0.03488018363714218,
0.033902961760759354,
0.1014745905995369,
0.13407638669013977,
-0.005793527700006962,
0.006270915735512972,
-0.03239206224679947,
0.23123130202293396,
-0.07133583724498749,
-0.004847121424973011,
0.1347416788339615,
0.015885934233665466,
0.05719447508454323,
0.1515340358018875,
0.0689205676317215,
-0.098207488656044,
0.006855227518826723,
0.04329584538936615,
-0.023280953988432884,
-0.2053815722465515,
-0.0462346188724041,
-0.05071301758289337,
0.0033392508048564196,
0.1142847090959549,
0.037624381482601166,
0.018215054646134377,
0.04704464226961136,
0.02622862160205841,
0.07086356729269028,
-0.026844989508390427,
0.059546031057834625,
0.14060616493225098,
0.03821208328008652,
0.12039490044116974,
-0.02846253104507923,
-0.05450315773487091,
0.053440336138010025,
0.007954095490276814,
0.2171146720647812,
0.0009247413254342973,
0.13365355134010315,
0.04784739762544632,
0.16279423236846924,
-0.01762661710381508,
0.06438013911247253,
0.006313438061624765,
-0.02442861534655094,
-0.02454007789492607,
-0.04631120339035988,
-0.043722737580537796,
0.027084974572062492,
-0.04265061020851135,
0.036055900156497955,
-0.10718575865030289,
0.024591848254203796,
0.05033354461193085,
0.26987534761428833,
0.04502866044640541,
-0.31575408577919006,
-0.08420289307832718,
0.00023896821949165314,
-0.05194908753037453,
-0.026847440749406815,
0.03608765825629234,
0.08806382119655609,
-0.09146872162818909,
0.04324111342430115,
-0.06812933832406998,
0.08939357101917267,
-0.05199380964040756,
0.04846826195716858,
0.11097769439220428,
0.09412192553281784,
0.008792934007942677,
0.0758914053440094,
-0.3050517141819,
0.27390891313552856,
0.007924884557723999,
0.06159967929124832,
-0.07242871820926666,
0.016489000990986824,
0.03817654401063919,
0.08364678919315338,
0.08792464435100555,
-0.011797448620200157,
-0.0825042799115181,
-0.20946884155273438,
-0.06907785683870316,
0.027634968981146812,
0.0899350568652153,
-0.031481094658374786,
0.09074003994464874,
-0.05292949080467224,
-0.009187278337776661,
0.06555911153554916,
-0.05002477392554283,
-0.06306566298007965,
-0.08657685667276382,
-0.0008355265017598867,
0.009342891164124012,
-0.01799856685101986,
-0.05517838895320892,
-0.10134866088628769,
-0.07206320017576218,
0.15351013839244843,
-0.03989313170313835,
-0.03774470090866089,
-0.13636909425258636,
0.06263057887554169,
0.09912917762994766,
-0.09178026020526886,
0.04463937506079674,
0.008185895159840584,
0.04833919554948807,
0.04619327187538147,
-0.062438685446977615,
0.11986168473958969,
-0.07101644575595856,
-0.18933652341365814,
-0.05551805719733238,
0.09923873096704483,
0.03326652571558952,
0.070563405752182,
-0.014291920699179173,
0.034538738429546356,
-0.03030240908265114,
-0.08972761034965515,
-0.00797269120812416,
-0.010917723178863525,
0.06642686575651169,
0.03190447390079498,
-0.06655587255954742,
0.027453172951936722,
-0.05299341678619385,
-0.020546535030007362,
0.1454668641090393,
0.26187238097190857,
-0.1005372405052185,
0.057769689708948135,
0.03513359650969505,
-0.06033559516072273,
-0.18900291621685028,
0.006265787873417139,
0.044584207236766815,
-0.010894231498241425,
0.0390670970082283,
-0.20192402601242065,
0.12812945246696472,
0.11227273941040039,
-0.019601156935095787,
0.09971388429403305,
-0.32413774728775024,
-0.12593460083007812,
0.1085047796368599,
0.13001567125320435,
0.0905771479010582,
-0.12285643815994263,
-0.022872863337397575,
-0.01223661843687296,
-0.10361973941326141,
0.09418173134326935,
-0.053248077630996704,
0.13157474994659424,
-0.03231697902083397,
0.070499949157238,
0.005477479659020901,
-0.044910382479429245,
0.11278297007083893,
0.040285367518663406,
0.10183477401733398,
-0.04963561147451401,
-0.03181876242160797,
0.012923567555844784,
-0.044831033796072006,
0.03051474690437317,
-0.09699168801307678,
0.03690820559859276,
-0.09605934470891953,
-0.012275204062461853,
-0.08132130652666092,
0.04032529518008232,
-0.04818572849035263,
-0.07307227700948715,
-0.04437992349267006,
0.050249647349119186,
0.06712350994348526,
-0.01993754133582115,
0.13932287693023682,
0.022195259109139442,
0.11903714388608932,
0.10162749141454697,
0.05759214237332344,
-0.06156817823648453,
-0.07713840156793594,
-0.010432791896164417,
-0.0071978215128183365,
0.05332155153155327,
-0.13464954495429993,
0.04287850111722946,
0.14626716077327728,
0.023720307275652885,
0.13740552961826324,
0.07229889184236526,
-0.008045949973165989,
-0.011491612531244755,
0.05677010864019394,
-0.15460485219955444,
-0.07252411544322968,
0.0062185549177229404,
-0.08371057361364365,
-0.108035147190094,
0.03652309998869896,
0.11401593685150146,
-0.0664864033460617,
-0.003916048910468817,
0.00032733738771639764,
0.03077121265232563,
-0.044926635921001434,
0.22441276907920837,
0.058692313730716705,
0.050055600702762604,
-0.11209253966808319,
0.07226622849702835,
0.055450234562158585,
-0.0700574740767479,
0.0035355230793356895,
0.09583895653486252,
-0.09192164242267609,
-0.03720709681510925,
0.07941135764122009,
0.14393649995326996,
-0.0679764449596405,
-0.03905767947435379,
-0.14801986515522003,
-0.10592227429151535,
0.0943308100104332,
0.12369388341903687,
0.11067884415388107,
0.03650008514523506,
-0.06441207975149155,
0.008501442149281502,
-0.10248877108097076,
0.10345102101564407,
0.04418657720088959,
0.07213018089532852,
-0.15100646018981934,
0.151418074965477,
0.004166224040091038,
0.05135105922818184,
-0.023485418409109116,
0.028821364045143127,
-0.10878913849592209,
0.0029046405106782913,
-0.11364457756280899,
-0.019130360335111618,
-0.0410974845290184,
0.01799076236784458,
-0.008219605311751366,
-0.05344841256737709,
-0.05628383159637451,
0.012867288663983345,
-0.11187811940908432,
-0.030310891568660736,
0.020720530301332474,
0.06180952489376068,
-0.14107629656791687,
-0.03844134882092476,
0.02406063675880432,
-0.07395031303167343,
0.07270447164773941,
0.024757925420999527,
0.00934139359742403,
0.04377789422869682,
-0.1022205799818039,
-0.017848966643214226,
0.05262333154678345,
0.018016939982771873,
0.07737576216459274,
-0.11365340650081635,
-0.015718387439846992,
-0.013679614290595055,
0.0568675734102726,
0.01535965409129858,
0.08030907809734344,
-0.13928040862083435,
0.005886643659323454,
-0.04006022587418556,
-0.08175528794527054,
-0.06174466013908386,
0.04115486890077591,
0.09440657496452332,
0.005197713151574135,
0.20262062549591064,
-0.077694833278656,
0.03186802193522453,
-0.19873207807540894,
-0.009707117453217506,
-0.01319949235767126,
-0.12171507626771927,
-0.1228509396314621,
-0.06440260261297226,
0.06879942119121552,
-0.042584292590618134,
0.11576130986213684,
0.034953903406858444,
0.06271681189537048,
0.03926277533173561,
-0.04458673298358917,
-0.00048351078294217587,
0.014055930078029633,
0.17440751194953918,
0.03529391065239906,
-0.031061477959156036,
0.07455015182495117,
0.04312463849782944,
0.08820250630378723,
0.10588755458593369,
0.21022917330265045,
0.1526683270931244,
0.008496946655213833,
0.0797777771949768,
0.03624172881245613,
-0.09680625051259995,
-0.20847992599010468,
0.04397161304950714,
-0.05773332715034485,
0.12127763777971268,
-0.02926853857934475,
0.19067096710205078,
0.04055701941251755,
-0.18131567537784576,
0.04937287047505379,
-0.050730012357234955,
-0.08825846761465073,
-0.12201420962810516,
-0.03898301348090172,
-0.0851845070719719,
-0.15423890948295593,
-0.008392720483243465,
-0.10479230433702469,
0.04488343372941017,
0.10673248022794724,
0.01326453872025013,
0.00047811036347411573,
0.1433461457490921,
-0.0048312838189303875,
0.030024658888578415,
0.04180561751127243,
0.01636274717748165,
-0.02211732603609562,
-0.0933784693479538,
-0.0787702351808548,
-0.025002459064126015,
-0.0062232655473053455,
0.027797255665063858,
-0.05968514457345009,
-0.0625602975487709,
0.04079832136631012,
-0.017297370359301567,
-0.08438843488693237,
0.014005376026034355,
0.019011354073882103,
0.06551848351955414,
0.047470301389694214,
0.018970632925629616,
0.010603699833154678,
-0.00000982564870355418,
0.24585427343845367,
-0.09191372245550156,
-0.07697495073080063,
-0.1092909649014473,
0.29653459787368774,
0.05192390829324722,
-0.007858490571379662,
0.03587633743882179,
-0.06043117120862007,
-0.013169586658477783,
0.2120385766029358,
0.19250640273094177,
-0.0911845713853836,
-0.016158994287252426,
-0.006151366047561169,
-0.007152572274208069,
-0.01850944198668003,
0.11482930183410645,
0.12313073128461838,
0.042159080505371094,
-0.08429531008005142,
-0.04969033971428871,
-0.038603734225034714,
-0.037291985005140305,
-0.05676255747675896,
0.060678593814373016,
0.032216865569353104,
0.004914567340165377,
-0.03747829049825668,
0.06733982264995575,
-0.04151498153805733,
-0.12172221392393112,
0.06432345509529114,
-0.19182641804218292,
-0.17244121432304382,
-0.023086123168468475,
0.1004405990242958,
0.021592089906334877,
0.060921210795640945,
-0.018590491265058517,
-0.00993921048939228,
0.09335004538297653,
-0.015187734737992287,
-0.092563197016716,
-0.09257441759109497,
0.11265069991350174,
-0.10004175454378128,
0.2324138581752777,
-0.03552505373954773,
0.03112824261188507,
0.12150689214468002,
0.05476047843694687,
-0.0930866077542305,
0.054933685809373856,
0.05662081018090248,
-0.08393172919750214,
0.035435814410448074,
0.09004133194684982,
-0.04212656244635582,
0.08384162932634354,
0.03994379937648773,
-0.13918203115463257,
0.014952040277421474,
-0.015792876482009888,
-0.0639125406742096,
-0.05127827078104019,
-0.04701989144086838,
-0.04478137940168381,
0.1413038969039917,
0.23205721378326416,
-0.030398763716220856,
0.015205995179712772,
-0.07435452938079834,
0.015479658730328083,
0.05357103794813156,
0.022505640983581543,
-0.06551160663366318,
-0.2252892106771469,
0.011660783551633358,
0.04019005969166756,
-0.01611902564764023,
-0.17932751774787903,
-0.10667135566473007,
0.004140018485486507,
-0.05901787057518959,
-0.10002245008945465,
0.10210227966308594,
0.04641449451446533,
0.04287358745932579,
-0.047792594879865646,
-0.07438290119171143,
-0.08301979303359985,
0.15628063678741455,
-0.16355927288532257,
-0.08044788986444473
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-boolq
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the boolq dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2071
- Accuracy: 0.7315
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6506 | 1.0 | 531 | 0.6075 | 0.6681 |
| 0.575 | 2.0 | 1062 | 0.5816 | 0.6978 |
| 0.4397 | 3.0 | 1593 | 0.6137 | 0.7253 |
| 0.2524 | 4.0 | 2124 | 0.8124 | 0.7466 |
| 0.126 | 5.0 | 2655 | 1.1437 | 0.7370 |
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
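
Since this checkpoint treats BoolQ as binary sequence classification (the pipeline tag for this row is text-classification), a question/passage pair can be scored as in the sketch below. The repository id is taken from this row; the label-index-to-answer mapping (0 = no, 1 = yes) is an assumption and should be checked against the model's `id2label` config before relying on it.

```python
# Sketch: score a BoolQ-style (question, passage) pair with the classifier.
# The 0 -> "no", 1 -> "yes" label mapping is assumed, not confirmed.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "andi611/distilbert-base-uncased-qa-boolq"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

question = "is the sky blue on a clear day"
passage = "On a clear day the sky appears blue because air scatters blue light."

inputs = tokenizer(question, passage, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = logits.softmax(dim=-1).squeeze()

print({"no": round(probs[0].item(), 3), "yes": round(probs[1].item(), 3)})
```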
|
{"language": ["en"], "license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["boolq"], "metrics": ["accuracy"], "model_index": [{"name": "distilbert-base-uncased-boolq", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "boolq", "type": "boolq", "args": "default"}, "metric": {"name": "Accuracy", "type": "accuracy", "value": 0.7314984709480122}}]}]}
|
text-classification
|
andi611/distilbert-base-uncased-qa-boolq
|
[
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:boolq",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #distilbert #text-classification #generated_from_trainer #en #dataset-boolq #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-boolq
=============================
This model is a fine-tuned version of distilbert-base-uncased on the boolq dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2071
* Accuracy: 0.7315
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 5e-05
* train\_batch\_size: 16
* eval\_batch\_size: 32
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 1000
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.8.2
* Pytorch 1.8.1+cu111
* Datasets 1.8.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #en #dataset-boolq #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
61,
116,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #en #dataset-boolq #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 32\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 1000\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.8.2\n* Pytorch 1.8.1+cu111\n* Datasets 1.8.0\n* Tokenizers 0.10.3"
] |
[
-0.10042570531368256,
0.08829322457313538,
-0.003070604521781206,
0.13538631796836853,
0.16683033108711243,
0.026101969182491302,
0.11089257895946503,
0.12985548377037048,
-0.09479644894599915,
0.013659849762916565,
0.12149586528539658,
0.17460936307907104,
0.010477909818291664,
0.12575513124465942,
-0.05521492287516594,
-0.2861056625843048,
-0.0108097018674016,
0.01747419312596321,
-0.06873250752687454,
0.14129170775413513,
0.0884840115904808,
-0.1273217648267746,
0.08984839171171188,
-0.006277976091951132,
-0.15991750359535217,
0.007488840259611607,
-0.006449763663113117,
-0.055015239864587784,
0.15248332917690277,
0.017279714345932007,
0.0989684984087944,
0.01202397421002388,
0.1042972132563591,
-0.19573268294334412,
0.005842694081366062,
0.04925251379609108,
0.013806699775159359,
0.08414213359355927,
0.05432559922337532,
-0.01308626588433981,
0.14381785690784454,
-0.09289257973432541,
0.06345908343791962,
0.022222571074962616,
-0.12454789876937866,
-0.22316280007362366,
-0.08079832792282104,
0.018408963456749916,
0.07987653464078903,
0.10306835919618607,
-0.0006788243190385401,
0.104586660861969,
-0.10320962220430374,
0.11300421506166458,
0.24157877266407013,
-0.2762509286403656,
-0.06243458390235901,
0.01251326035708189,
0.01655055209994316,
0.07418778538703918,
-0.10222812741994858,
-0.034683093428611755,
0.02522682212293148,
0.04876118153333664,
0.1369674652814865,
-0.034049857407808304,
-0.12446465343236923,
0.018430162221193314,
-0.14117853343486786,
-0.03666084632277489,
0.13293233513832092,
0.027420878410339355,
-0.03352674841880798,
-0.04227815940976143,
-0.0734235942363739,
-0.15400567650794983,
-0.04236537590622902,
0.001737012411467731,
0.04767240211367607,
-0.032247405499219894,
-0.05932847782969475,
-0.00947954598814249,
-0.08294379711151123,
-0.07088010013103485,
-0.05695720016956329,
0.1452055424451828,
0.04257899150252342,
0.007867362350225449,
-0.022341381758451462,
0.11522133648395538,
0.029657352715730667,
-0.13807573914527893,
0.008024921640753746,
0.02134276181459427,
-0.013220456428825855,
-0.034827958792448044,
-0.05242127552628517,
0.004887272138148546,
0.011998024769127369,
0.1482498049736023,
-0.05050359293818474,
0.054833054542541504,
0.03972535580396652,
0.021219659596681595,
-0.10085892677307129,
0.19498345255851746,
-0.03363952413201332,
-0.044574838131666183,
0.003434954211115837,
0.06233753263950348,
0.021175222471356392,
-0.020957596600055695,
-0.11559660732746124,
-0.0005216153804212809,
0.08910957723855972,
0.03197499364614487,
-0.06806966662406921,
0.0735284835100174,
-0.048030123114585876,
-0.02063068002462387,
0.02040884830057621,
-0.10555922240018845,
0.035573866218328476,
0.010044694878160954,
-0.09352067857980728,
-0.045680053532123566,
0.026083361357450485,
0.006201098207384348,
-0.02432025969028473,
0.1203465610742569,
-0.07591826468706131,
0.036400169134140015,
-0.08481906354427338,
-0.12138556689023972,
0.0043991925194859505,
-0.10372123122215271,
0.020213309675455093,
-0.0893319621682167,
-0.19738495349884033,
-0.008895755745470524,
0.05274184048175812,
-0.03163619711995125,
-0.05299342796206474,
-0.06908738613128662,
-0.0793309360742569,
0.01923132687807083,
-0.014971848577260971,
0.13226838409900665,
-0.07497905939817429,
0.10988300293684006,
0.028803352266550064,
0.06085953488945961,
-0.0342426560819149,
0.06144198775291443,
-0.10908761620521545,
0.011737063527107239,
-0.16744858026504517,
0.06644830107688904,
-0.05280664935708046,
0.057861775159835815,
-0.0905952900648117,
-0.11333155632019043,
0.043641019612550735,
-0.005556141957640648,
0.07707865536212921,
0.10787797719240189,
-0.1937810629606247,
-0.06933286786079407,
0.13880981504917145,
-0.057596445083618164,
-0.10649261623620987,
0.11783740669488907,
-0.06622222065925598,
0.034158606082201004,
0.06791672855615616,
0.16925722360610962,
0.09016861766576767,
-0.06676330417394638,
0.022766366600990295,
0.0166002344340086,
0.05335564166307449,
-0.03931596502661705,
0.06443136930465698,
0.006832206156104803,
0.0059731523506343365,
0.030170602723956108,
-0.03160982206463814,
0.06092483922839165,
-0.09326158463954926,
-0.09721764177083969,
-0.03754248470067978,
-0.08643119782209396,
0.053264424204826355,
0.06495800614356995,
0.051032472401857376,
-0.1005207747220993,
-0.08130475133657455,
0.04715968668460846,
0.09244741499423981,
-0.05494413897395134,
0.02897556498646736,
-0.059319138526916504,
0.0482841432094574,
0.00008594069367973134,
-0.01203866582363844,
-0.19201816618442535,
-0.01335589773952961,
0.012399867177009583,
0.01072659995406866,
0.019132493063807487,
0.0012383123394101858,
0.07038971036672592,
0.07214972376823425,
-0.049500007182359695,
-0.03190023824572563,
-0.026786990463733673,
-0.0002576745464466512,
-0.12803618609905243,
-0.20423108339309692,
-0.03697687014937401,
-0.019025254994630814,
0.11731816828250885,
-0.1944635659456253,
0.03563284873962402,
-0.016432765871286392,
0.05665979161858559,
0.008002830669283867,
-0.0034205708652734756,
-0.036456551402807236,
0.08092096447944641,
-0.04292053356766701,
-0.0568864569067955,
0.07434965670108795,
0.0001170081741292961,
-0.09053365886211395,
-0.048598356544971466,
-0.10329505801200867,
0.14285226166248322,
0.11710895597934723,
-0.08405745774507523,
-0.07172118127346039,
0.0014606958720833063,
-0.059033799916505814,
-0.02907843515276909,
-0.0409805104136467,
0.047638583928346634,
0.177193745970726,
0.0004585504357237369,
0.15190933644771576,
-0.06824253499507904,
-0.04361317306756973,
0.017580239102244377,
-0.020314021036028862,
0.0406133271753788,
0.14416547119617462,
0.1168818399310112,
-0.06863971799612045,
0.12898749113082886,
0.14926129579544067,
-0.08090082556009293,
0.12275328487157822,
-0.040215712040662766,
-0.058364901691675186,
-0.014572378247976303,
-0.02216840535402298,
-0.011834516189992428,
0.08754377067089081,
-0.13048075139522552,
0.003381149610504508,
0.021772772073745728,
0.02455597184598446,
0.0008318829350173473,
-0.21560049057006836,
-0.033955540508031845,
0.03211408108472824,
-0.06107652559876442,
-0.048917606472969055,
-0.01446465216577053,
0.014278517104685307,
0.10909087210893631,
0.004592329729348421,
-0.10727622359991074,
0.030071832239627838,
0.002210058504715562,
-0.0654720887541771,
0.2119348794221878,
-0.10501258075237274,
-0.15819025039672852,
-0.10450484603643417,
-0.09581881761550903,
-0.06354866921901703,
0.00901910848915577,
0.07399383932352066,
-0.09329742193222046,
-0.02916816435754299,
-0.07055651396512985,
0.025592157617211342,
-0.00046916937571950257,
0.030230293050408363,
-0.011186275631189346,
-0.004852196667343378,
0.06325769424438477,
-0.11538367718458176,
-0.011664774268865585,
-0.051771555095911026,
-0.06416378915309906,
0.05106009542942047,
0.04849352315068245,
0.11135683208703995,
0.13604851067066193,
-0.009728983975946903,
0.01278348546475172,
-0.031557515263557434,
0.2468695044517517,
-0.061938442289829254,
-0.011307002045214176,
0.1416674554347992,
-0.003233194351196289,
0.058980345726013184,
0.13176435232162476,
0.06558588147163391,
-0.09495842456817627,
0.011093378998339176,
0.03980064019560814,
-0.0369902066886425,
-0.2114402949810028,
-0.050446316599845886,
-0.056501422077417374,
0.004435248207300901,
0.10152498632669449,
0.03236357122659683,
0.004748173989355564,
0.05337951332330704,
0.03281080722808838,
0.05661886930465698,
-0.013156098313629627,
0.05534382537007332,
0.12244408577680588,
0.043902914971113205,
0.12990786135196686,
-0.03504638001322746,
-0.06315039098262787,
0.043041959404945374,
-0.015570923686027527,
0.21577072143554688,
-0.017266588285565376,
0.12099564075469971,
0.04556875303387642,
0.16902701556682587,
-0.014152772724628448,
0.08582364767789841,
0.006359385792165995,
-0.022193143144249916,
-0.02106364071369171,
-0.037034034729003906,
-0.043905556201934814,
0.014042261987924576,
-0.05773874744772911,
0.05796188488602638,
-0.1248636394739151,
0.02438967302441597,
0.05755974352359772,
0.2821565270423889,
0.03197836875915527,
-0.3213941156864166,
-0.0912114828824997,
0.000927834480535239,
-0.04438742250204086,
-0.02567586675286293,
0.03269535303115845,
0.08915580064058304,
-0.0965321734547615,
0.04284593090415001,
-0.060490019619464874,
0.09764263778924942,
-0.04528811573982239,
0.052206892520189285,
0.0778178796172142,
0.10822296142578125,
0.00514005683362484,
0.08126380294561386,
-0.32207629084587097,
0.2628813087940216,
0.004376648925244808,
0.0693279281258583,
-0.07431869953870773,
0.008893675170838833,
0.04964495822787285,
0.07077238708734512,
0.0698080062866211,
-0.013702131807804108,
-0.021111853420734406,
-0.20362000167369843,
-0.06517443805932999,
0.03217345103621483,
0.09279871731996536,
-0.04004792869091034,
0.09800279140472412,
-0.042580608278512955,
-0.0012712860479950905,
0.06553813815116882,
-0.04360188543796539,
-0.05161628499627113,
-0.10026624798774719,
-0.014101455919444561,
0.019503753632307053,
-0.029053328558802605,
-0.0501769557595253,
-0.10557360202074051,
-0.09635549783706665,
0.1301037073135376,
-0.03138718008995056,
-0.04601549357175827,
-0.1172538623213768,
0.06118776276707649,
0.0915011316537857,
-0.09313778579235077,
0.0373593308031559,
0.007339800707995892,
0.053249791264534,
0.035880427807569504,
-0.07505710422992706,
0.11155327409505844,
-0.0760636031627655,
-0.18677355349063873,
-0.04740637168288231,
0.10880737751722336,
0.04120166599750519,
0.06610754132270813,
-0.020766785368323326,
0.023180268704891205,
-0.05092623457312584,
-0.09410034865140915,
0.0130557119846344,
-0.0004071999865118414,
0.06561257690191269,
0.037893615663051605,
-0.06348860263824463,
0.028720345348119736,
-0.06564708799123764,
-0.021362779662013054,
0.17691926658153534,
0.24500322341918945,
-0.09974445402622223,
0.05202268436551094,
0.03372479975223541,
-0.06377382576465607,
-0.1968432366847992,
0.011532756499946117,
0.058948367834091187,
-0.004943398758769035,
0.04340945556759834,
-0.19976797699928284,
0.11293163895606995,
0.0986240953207016,
-0.0070526436902582645,
0.09712830185890198,
-0.3375553488731384,
-0.12488669157028198,
0.12677474319934845,
0.12693198025226593,
0.0988016203045845,
-0.1295420378446579,
-0.011384385637938976,
-0.022445805370807648,
-0.09605232626199722,
0.10770446807146072,
-0.06736888736486435,
0.1282462477684021,
-0.03473936393857002,
0.08262968063354492,
0.007671239320188761,
-0.0447370782494545,
0.1106385588645935,
0.029911408200860023,
0.10452160984277725,
-0.058028656989336014,
-0.019526826217770576,
0.014819586649537086,
-0.04746921360492706,
0.02455810457468033,
-0.11067452281713486,
0.043638043105602264,
-0.09927524626255035,
-0.014762727543711662,
-0.0848042443394661,
0.03919734060764313,
-0.037216994911432266,
-0.06580957770347595,
-0.03169599175453186,
0.021723942831158638,
0.06477266550064087,
-0.013590439222753048,
0.13716766238212585,
0.018804682418704033,
0.11890217661857605,
0.09221003204584122,
0.07724545896053314,
-0.06355315446853638,
-0.05477699637413025,
-0.012581008486449718,
-0.004512421321123838,
0.05158260464668274,
-0.13375622034072876,
0.037907350808382034,
0.14437055587768555,
0.023164013400673866,
0.1343163698911667,
0.08489786833524704,
-0.0006251682061702013,
0.0000919875456020236,
0.05083012208342552,
-0.16572368144989014,
-0.0852072611451149,
-0.0038595523219555616,
-0.07421059161424637,
-0.10416163504123688,
0.048219289630651474,
0.09608939290046692,
-0.060827936977148056,
-0.009697962552309036,
-0.004241861868649721,
0.02701704017817974,
-0.03904545679688454,
0.20441877841949463,
0.04999846965074539,
0.05484466627240181,
-0.12161877006292343,
0.08367069065570831,
0.04761511832475662,
-0.06275476515293121,
0.010485819540917873,
0.1041632816195488,
-0.09709817916154861,
-0.043481361120939255,
0.07941402494907379,
0.16647496819496155,
-0.05719920992851257,
-0.033872924745082855,
-0.13963311910629272,
-0.12382672727108002,
0.09184779971837997,
0.15338748693466187,
0.10577290505170822,
0.014231933280825615,
-0.07505911588668823,
0.013860701583325863,
-0.11535225808620453,
0.10544496774673462,
0.048871733248233795,
0.06431657075881958,
-0.13555780053138733,
0.1516464650630951,
0.004828221630305052,
0.054074015468358994,
-0.017032258212566376,
0.01827801764011383,
-0.10091038048267365,
0.014662881381809711,
-0.13220661878585815,
-0.0232127346098423,
-0.03473958745598793,
0.024787474423646927,
-0.013441120274364948,
-0.05219665914773941,
-0.04638903588056564,
0.011423190124332905,
-0.11178242415189743,
-0.0310920812189579,
0.022081878036260605,
0.0593608133494854,
-0.12920191884040833,
-0.04675625264644623,
0.019877832382917404,
-0.07458556443452835,
0.08609730750322342,
0.05419919639825821,
0.001248556305654347,
0.05307785049080849,
-0.11621755361557007,
-0.007040712051093578,
0.06171872466802597,
0.02124190703034401,
0.07071112841367722,
-0.08948983252048492,
-0.004203661344945431,
-0.011066339910030365,
0.04515308886766434,
0.015061767771840096,
0.09030506759881973,
-0.13562269508838654,
0.021966630592942238,
-0.025087734684348106,
-0.08302278071641922,
-0.06774438917636871,
0.03843942657113075,
0.08421406894922256,
0.024283070117235184,
0.20685526728630066,
-0.0835314691066742,
0.04331773892045021,
-0.20454858243465424,
-0.009411925449967384,
-0.011078075505793095,
-0.12733854353427887,
-0.14622481167316437,
-0.07532059401273727,
0.07728758454322815,
-0.06002482399344444,
0.11811275035142899,
0.03741546347737312,
0.062275223433971405,
0.0182692538946867,
-0.016893485561013222,
0.005785814020782709,
0.013087958097457886,
0.18471260368824005,
0.037168070673942566,
-0.04338657483458519,
0.08293434977531433,
0.04334820806980133,
0.10348743945360184,
0.12485811859369278,
0.20928172767162323,
0.1366647332906723,
0.021386295557022095,
0.08341138064861298,
0.03520805761218071,
-0.08519187569618225,
-0.17271709442138672,
0.0271554347127676,
-0.03219546750187874,
0.1117081567645073,
-0.032054103910923004,
0.2147694230079651,
0.04470248147845268,
-0.16620466113090515,
0.05540676414966583,
-0.04910147190093994,
-0.09076441079378128,
-0.12286786735057831,
-0.03007522225379944,
-0.08173014223575592,
-0.13920480012893677,
-0.00685109943151474,
-0.11374858766794205,
0.03496869280934334,
0.10420972108840942,
0.007270267698913813,
-0.014061512425541878,
0.1331171989440918,
0.018623145297169685,
0.022702926769852638,
0.052194882184267044,
0.00566009571775794,
-0.026605181396007538,
-0.08103381097316742,
-0.08011750131845474,
-0.011687246151268482,
-0.020573778077960014,
0.030576955527067184,
-0.05465559661388397,
-0.06761351972818375,
0.0399683378636837,
-0.04189472272992134,
-0.09469277411699295,
0.0205787755548954,
0.021491030231118202,
0.06488718092441559,
0.06226317584514618,
0.023517288267612457,
0.002039717510342598,
-0.0005489964969456196,
0.2443861961364746,
-0.08767733722925186,
-0.08783994615077972,
-0.0938764214515686,
0.28669601678848267,
0.0488559827208519,
-0.012208596803247929,
0.036117035895586014,
-0.05949166417121887,
-0.01787673868238926,
0.22579151391983032,
0.2042403668165207,
-0.0990406721830368,
-0.009211569093167782,
-0.0095356535166502,
-0.007349525112658739,
-0.010607247240841389,
0.11862505227327347,
0.14716671407222748,
0.052274759858846664,
-0.08887570351362228,
-0.036588963121175766,
-0.05176278576254845,
-0.020745815709233284,
-0.04903433471918106,
0.08224855363368988,
0.03310590237379074,
-0.0021638928446918726,
-0.031927626579999924,
0.0546642541885376,
-0.07638311386108398,
-0.09758196771144867,
0.04468484967947006,
-0.19752970337867737,
-0.1648245006799698,
-0.02359660156071186,
0.09154585748910904,
0.02226129360496998,
0.05794632434844971,
-0.018454331904649734,
-0.006302374880760908,
0.08047284185886383,
-0.020235512405633926,
-0.08368033915758133,
-0.08680910617113113,
0.10193166136741638,
-0.11000428348779678,
0.19898313283920288,
-0.04128385707736015,
0.04323089122772217,
0.12069834023714066,
0.0712035596370697,
-0.08090908825397491,
0.07536090910434723,
0.04864748939871788,
-0.06759776920080185,
0.03467009961605072,
0.08215057104825974,
-0.03928188979625702,
0.06299863755702972,
0.044782351702451706,
-0.1145930290222168,
0.019571460783481598,
-0.04935045540332794,
-0.06817349046468735,
-0.037355076521635056,
-0.043912824243307114,
-0.04448692873120308,
0.12459743767976761,
0.22347943484783173,
-0.030494477599859238,
0.01733223721385002,
-0.08248230069875717,
0.004506018478423357,
0.04341118782758713,
0.02227293699979782,
-0.06883995980024338,
-0.21910180151462555,
0.008414557203650475,
0.0667489543557167,
-0.009777101688086987,
-0.21125715970993042,
-0.09273532032966614,
0.003324436489492655,
-0.06236953288316727,
-0.09898421913385391,
0.09184330701828003,
0.05655656009912491,
0.04528789222240448,
-0.048904262483119965,
-0.05143677443265915,
-0.07927731424570084,
0.16899926960468292,
-0.16247977316379547,
-0.07975635677576065
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-qa-with-ner
This model is a fine-tuned version of [andi611/distilbert-base-uncased-qa](https://huggingface.co/andi611/distilbert-base-uncased-qa) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
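As a hedged illustration only (the question and context below are invented, not drawn from conll2003), the checkpoint can be queried through the standard `question-answering` pipeline:

```python
# Inference sketch: extractive QA with this checkpoint via the transformers pipeline.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="andi611/distilbert-base-uncased-qa-with-ner",
)

result = qa(
    question="Which organization published the report?",
    context="The annual report was published by the European Commission in Brussels.",
)
# Returns a dict such as {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
print(result)
```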
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "distilbert-base-uncased-qa-with-ner", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-qa-with-ner
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #endpoints_compatible #region-us
|
# distilbert-base-uncased-qa-with-ner
This model is a fine-tuned version of andi611/distilbert-base-uncased-qa on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-qa-with-ner\n\nThis model is a fine-tuned version of andi611/distilbert-base-uncased-qa on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-qa-with-ner\n\nThis model is a fine-tuned version of andi611/distilbert-base-uncased-qa on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
53,
49,
6,
12,
8,
3,
90,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #license-apache-2.0 #endpoints_compatible #region-us \n# distilbert-base-uncased-qa-with-ner\n\nThis model is a fine-tuned version of andi611/distilbert-base-uncased-qa on the conll2003 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.09433707594871521,
0.1258833408355713,
-0.0016772758681327105,
0.09201648831367493,
0.15141506493091583,
0.03206643462181091,
0.08466017246246338,
0.12123577296733856,
-0.12087543308734894,
0.03974694758653641,
0.0761139839887619,
0.08066491037607193,
0.0350324921309948,
0.08044939488172531,
-0.026119204238057137,
-0.24064496159553528,
0.007365181110799313,
0.016892749816179276,
-0.08944078534841537,
0.11528459191322327,
0.09953659772872925,
-0.1148003563284874,
0.05517548322677612,
0.016464337706565857,
-0.1756826788187027,
0.020240897312760353,
-0.04326823726296425,
-0.03720821440219879,
0.09985991567373276,
0.00709439255297184,
0.109153613448143,
0.010507814586162567,
0.127557635307312,
-0.24248144030570984,
0.0033234497532248497,
0.06167558580636978,
0.024185528978705406,
0.06538999080657959,
0.03087528981268406,
0.012880341149866581,
0.0961897224187851,
-0.1145210936665535,
0.08730163425207138,
0.029857216402888298,
-0.06920468807220459,
-0.14275850355625153,
-0.09026502072811127,
0.09133467078208923,
0.09851451963186264,
0.12067695707082748,
-0.0008894266211427748,
0.12344960123300552,
-0.14029163122177124,
0.07497648894786835,
0.15422454476356506,
-0.2832260727882385,
-0.07635226845741272,
0.060644857585430145,
0.03307013958692551,
0.05597568675875664,
-0.11048365384340286,
-0.04043972119688988,
0.0499727725982666,
0.03727053478360176,
0.07957744598388672,
-0.014108158648014069,
-0.09002269804477692,
0.006622293498367071,
-0.14906902611255646,
-0.01766970381140709,
0.1564023494720459,
0.06734336167573929,
-0.0445755235850811,
-0.0650923028588295,
-0.03210185468196869,
-0.10283685475587845,
-0.007970265112817287,
-0.04408060759305954,
0.03324516862630844,
-0.04623907431960106,
-0.07389959692955017,
-0.03547397255897522,
-0.0775240808725357,
-0.06403880566358566,
-0.005369925405830145,
0.08524391055107117,
0.0651896595954895,
0.002060227794572711,
-0.029376298189163208,
0.1167866662144661,
0.02070792391896248,
-0.12048844248056412,
-0.02477250248193741,
-0.00983111746609211,
-0.06509014219045639,
-0.057412855327129364,
-0.0568440817296505,
0.011518548242747784,
0.0036537249106913805,
0.15278756618499756,
-0.05481913685798645,
0.056772373616695404,
0.033679015934467316,
0.009890392422676086,
-0.039845049381256104,
0.15556704998016357,
-0.04715634137392044,
-0.031971678137779236,
-0.008016851730644703,
0.10457819700241089,
-0.025798460468649864,
0.0008004952687770128,
-0.08631382882595062,
-0.017501354217529297,
0.07942086458206177,
0.05471555143594742,
-0.06308098137378693,
0.03848804533481598,
-0.028445685282349586,
-0.04137520119547844,
-0.010886180214583874,
-0.11285246908664703,
0.027823373675346375,
0.005804997403174639,
-0.0912257507443428,
0.000049442813178757206,
0.01724158599972725,
0.02834797278046608,
-0.025754988193511963,
0.08864409476518631,
-0.08285772800445557,
0.006266670301556587,
-0.09504107385873795,
-0.05710989236831665,
0.011079434305429459,
-0.06816836446523666,
0.0007511216681450605,
-0.08657612651586533,
-0.1823163479566574,
-0.036656349897384644,
0.046532515436410904,
-0.028742559254169464,
-0.047964464873075485,
-0.0645855963230133,
-0.06113352254033089,
-0.014708264730870724,
-0.0029538122471421957,
0.11773316562175751,
-0.048558272421360016,
0.08876244723796844,
0.010077507235109806,
0.024391863495111465,
0.011891860514879227,
0.05207663029432297,
-0.08902701735496521,
0.01869143173098564,
-0.06367535889148712,
0.07014806568622589,
-0.08570747822523117,
0.03467991575598717,
-0.09504437446594238,
-0.1370362788438797,
0.007272681221365929,
-0.020462429150938988,
0.06065727770328522,
0.10744660347700119,
-0.17120850086212158,
-0.03130170330405235,
0.14768821001052856,
-0.0511573888361454,
-0.10743220150470734,
0.10209202766418457,
-0.05860232934355736,
0.03817762807011604,
0.047793544828891754,
0.1467011272907257,
0.1251334697008133,
-0.10437749326229095,
-0.018788045272231102,
-0.004599434323608875,
0.08646260201931,
-0.0018833947833627462,
0.0568678043782711,
0.005685100331902504,
0.021717693656682968,
0.017474181950092316,
-0.09072381258010864,
0.00860241986811161,
-0.09355242550373077,
-0.11418437212705612,
-0.05160560458898544,
-0.1091010570526123,
0.0448814257979393,
0.04727071151137352,
0.05732695385813713,
-0.05042190104722977,
-0.09919719398021698,
0.10068175196647644,
0.1277817338705063,
-0.06248888000845909,
0.01763489842414856,
-0.0762428566813469,
0.054583095014095306,
-0.017671527341008186,
-0.03235974907875061,
-0.189421147108078,
-0.09282641112804413,
0.022921770811080933,
-0.03629749268293381,
0.03383228927850723,
0.038276512175798416,
0.05068844556808472,
0.07071220874786377,
-0.053369540721178055,
-0.03316197916865349,
-0.11293487995862961,
0.0030139260925352573,
-0.09094017744064331,
-0.17681454122066498,
-0.04944968968629837,
-0.02451896481215954,
0.1828855276107788,
-0.21172749996185303,
0.024745820090174675,
-0.041058193892240524,
0.110504649579525,
-0.008133740164339542,
-0.038614317774772644,
-0.023948924615979195,
0.07511274516582489,
-0.0066324821673333645,
-0.06419245898723602,
0.06021798029541969,
0.0002849710581358522,
-0.09761886298656464,
-0.11006070673465729,
-0.08538439869880676,
0.06363489478826523,
0.081990085542202,
0.014483096078038216,
-0.07641196250915527,
-0.028372744098305702,
-0.08654395490884781,
-0.046702876687049866,
-0.05501662567257881,
0.024796482175588608,
0.21646423637866974,
-0.002646815264597535,
0.12242574244737625,
-0.05984332039952278,
-0.06564508378505707,
-0.015298393554985523,
0.005437132902443409,
0.002814272651448846,
0.08611360937356949,
0.11291737109422684,
-0.08665111660957336,
0.09620745480060577,
0.11061600595712662,
-0.08373492956161499,
0.14125852286815643,
-0.06307516992092133,
-0.08436065912246704,
-0.013248187489807606,
0.012149788439273834,
-0.018431873992085457,
0.1165744885802269,
-0.11087583005428314,
0.00992937944829464,
0.02482587844133377,
0.02718828059732914,
0.04390746355056763,
-0.18916630744934082,
-0.032783906906843185,
0.023486975580453873,
-0.041408881545066833,
-0.056001927703619,
-0.017504235729575157,
0.02222450263798237,
0.08531035482883453,
0.02762158028781414,
-0.030346639454364777,
0.024286912754178047,
-0.011430462822318077,
-0.08838647603988647,
0.19504043459892273,
-0.1217549666762352,
-0.11564356833696365,
-0.09632838517427444,
0.021674709394574165,
-0.06609896570444107,
-0.04001409560441971,
0.03241724520921707,
-0.11686315387487411,
-0.028028827160596848,
-0.0635359063744545,
-0.019261928275227547,
-0.03766106069087982,
-0.01021351758390665,
0.05715954676270485,
0.006801150273531675,
0.0821908488869667,
-0.13355375826358795,
0.01234257873147726,
-0.035413313657045364,
-0.11209562420845032,
0.008724447339773178,
0.03561177849769592,
0.12806475162506104,
0.14184130728244781,
-0.006572892889380455,
0.022470271214842796,
-0.02904031053185463,
0.252521812915802,
-0.06778868287801743,
-0.025740712881088257,
0.11020658165216446,
0.004533759783953428,
0.04891861230134964,
0.10735984891653061,
0.046204566955566406,
-0.09899578243494034,
0.029421718791127205,
0.07440198212862015,
-0.01174637209624052,
-0.23853865265846252,
-0.04498593881726265,
-0.04077969864010811,
-0.07924310117959976,
0.07992526143789291,
0.03497076779603958,
0.036116745322942734,
0.05199187994003296,
-0.00037382938899099827,
0.027150245383381844,
-0.03787821903824806,
0.07126322388648987,
0.092072032392025,
0.039542462676763535,
0.10745864361524582,
-0.02488526701927185,
-0.026557737961411476,
0.04822617769241333,
0.013035200536251068,
0.2853182256221771,
-0.023161502555012703,
0.05186215415596962,
0.08083248883485794,
0.19184404611587524,
-0.03301745653152466,
0.032583896070718765,
-0.008815093897283077,
-0.018918531015515327,
0.011403566226363182,
-0.05043499171733856,
-0.03241969645023346,
0.023044338449835777,
-0.009239614941179752,
0.07405231148004532,
-0.1161462813615799,
0.033879879862070084,
0.045232534408569336,
0.25807613134384155,
0.022282510995864868,
-0.25631052255630493,
-0.10113473981618881,
0.003732919692993164,
-0.030780892819166183,
-0.026731427758932114,
0.024723460897803307,
0.09448154270648956,
-0.13465848565101624,
0.02151642180979252,
-0.05448765307664871,
0.096007339656353,
-0.012677336111664772,
0.015156463719904423,
0.08558528870344162,
0.10698683559894562,
0.015999820083379745,
0.09570572525262833,
-0.25596126914024353,
0.22614717483520508,
0.0015657237963750958,
0.12608632445335388,
-0.051024530082941055,
0.01957685314118862,
0.020337579771876335,
0.08414699137210846,
0.09119634330272675,
0.0005622446187771857,
0.013500788249075413,
-0.17675107717514038,
-0.04093330353498459,
0.05613081902265549,
0.10204335302114487,
-0.028376873582601547,
0.08657174557447433,
-0.03717435896396637,
0.017519919201731682,
0.05744323134422302,
-0.018423888832330704,
-0.16294069588184357,
-0.13141889870166779,
0.010570275597274303,
-0.012962764129042625,
-0.01395778451114893,
-0.08241528272628784,
-0.10391724109649658,
-0.07688389718532562,
0.16296271979808807,
-0.021435845643281937,
-0.03905352205038071,
-0.11634056270122528,
0.08332783728837967,
0.10045318305492401,
-0.065400630235672,
0.01650303415954113,
0.025783797726035118,
0.08598342537879944,
0.037746258080005646,
-0.07560079544782639,
0.02705804631114006,
-0.08354859799146652,
-0.1560395061969757,
-0.04551088437438011,
0.11761105060577393,
0.06810557097196579,
0.06625603884458542,
0.005900253541767597,
0.0015925501938909292,
-0.006568034645169973,
-0.10508754104375839,
-0.005270631052553654,
0.07375980168581009,
0.10885841399431229,
0.03937751427292824,
-0.11892157793045044,
0.05036647617816925,
-0.07158789783716202,
-0.013757879845798016,
0.1567823886871338,
0.18381349742412567,
-0.09014102071523666,
0.049227114766836166,
0.0503196120262146,
-0.07989007234573364,
-0.15039469301700592,
0.06881286948919296,
0.1168905720114708,
0.015337430872023106,
0.03119717538356781,
-0.1965559422969818,
0.11937661468982697,
0.12808336317539215,
-0.00265951338224113,
0.030920732766389847,
-0.3638768196105957,
-0.11067306995391846,
0.07011149078607559,
0.128363236784935,
0.02584751322865486,
-0.12971605360507965,
-0.016051368787884712,
0.0011254671262577176,
-0.1699378937482834,
0.09573649615049362,
-0.07488081604242325,
0.09998585283756256,
-0.01868410035967827,
0.09848813712596893,
0.011020202189683914,
-0.040584031492471695,
0.14177942276000977,
0.051854103803634644,
0.07156246900558472,
-0.04612914100289345,
0.002490211511030793,
0.053491368889808655,
-0.05698881670832634,
0.024317864328622818,
-0.027024084702134132,
0.0783277228474617,
-0.14914092421531677,
-0.01986723393201828,
-0.09024418145418167,
0.04150829464197159,
-0.06837276369333267,
-0.07602797448635101,
-0.03871263563632965,
0.07014785706996918,
0.0940532386302948,
-0.022507110610604286,
0.043843742460012436,
0.01829618774354458,
0.12480251491069794,
0.032386232167482376,
0.10099437087774277,
-0.015038464218378067,
-0.11356398463249207,
-0.027168812230229378,
-0.008740493096411228,
0.0648367628455162,
-0.12028416246175766,
0.01972617208957672,
0.1558448225259781,
0.047007910907268524,
0.14639273285865784,
0.05883440002799034,
-0.03125881031155586,
0.00463233282789588,
0.036752499639987946,
-0.10218358784914017,
-0.13443975150585175,
-0.0046408167108893394,
-0.0632544457912445,
-0.13895276188850403,
0.018763285130262375,
0.06596023589372635,
-0.05965671315789223,
-0.011604593135416508,
-0.0037444964982569218,
0.007494669407606125,
-0.045962605625391006,
0.18070951104164124,
0.05472315847873688,
0.06378278136253357,
-0.08120497316122055,
0.10181240737438202,
0.05099468678236008,
-0.07184164226055145,
0.028399398550391197,
0.04497839882969856,
-0.10000482946634293,
-0.03430555388331413,
0.047445084899663925,
0.14181305468082428,
-0.053335804492235184,
-0.04254327341914177,
-0.09214647859334946,
-0.08993848413228989,
0.05736379325389862,
0.09366564452648163,
0.0645444318652153,
-0.01974092423915863,
-0.05474335327744484,
0.03855893760919571,
-0.14952844381332397,
0.08821644634008408,
0.03426600992679596,
0.05823865160346031,
-0.14750179648399353,
0.1195506677031517,
0.011644656769931316,
0.06154259666800499,
-0.006531962193548679,
0.017029685899615288,
-0.09122371673583984,
-0.00035788401146419346,
-0.16116997599601746,
-0.04210871830582619,
-0.034528475254774094,
0.012815737165510654,
-0.020264344289898872,
-0.05545533820986748,
-0.050505105406045914,
0.052495479583740234,
-0.07928206771612167,
-0.05021369084715843,
0.02687099575996399,
0.056376803666353226,
-0.13177311420440674,
-0.005701773334294558,
0.025430670008063316,
-0.08486063033342361,
0.07228559255599976,
0.05766622722148895,
0.019536159932613373,
0.05416545644402504,
-0.10107298940420151,
-0.010472658090293407,
0.02580881677567959,
0.04539884254336357,
0.07228753715753555,
-0.0808045044541359,
-0.0036123667377978563,
-0.01930859684944153,
0.03468786180019379,
0.024966225028038025,
0.05955671891570091,
-0.12010239064693451,
-0.021153010427951813,
-0.049417030066251755,
-0.045278966426849365,
-0.06747326254844666,
0.03843463212251663,
0.09746582806110382,
0.04122072085738182,
0.1811121702194214,
-0.07834301888942719,
0.06675133854150772,
-0.18945002555847168,
-0.03742314130067825,
0.006185952574014664,
-0.018116861581802368,
-0.07725530117750168,
-0.051191166043281555,
0.06231158599257469,
-0.06888189911842346,
0.10354170203208923,
-0.04944077506661415,
0.0945499986410141,
0.03627553954720497,
-0.052187371999025345,
0.0004279880376998335,
-0.002332843141630292,
0.22010935842990875,
0.061281949281692505,
-0.01335294172167778,
0.08992283046245575,
0.0032990274485200644,
0.04133985936641693,
0.055062443017959595,
0.20186862349510193,
0.159558966755867,
-0.023565853014588356,
0.0504235178232193,
0.06733383238315582,
-0.09524892270565033,
-0.1348371058702469,
0.08594081550836563,
-0.022931579500436783,
0.08824219554662704,
-0.044921234250068665,
0.17227905988693237,
0.09424673765897751,
-0.18254467844963074,
0.05184752866625786,
-0.05755504593253136,
-0.11347677558660507,
-0.09868437796831131,
-0.02034841664135456,
-0.06477595120668411,
-0.12602803111076355,
0.03804738074541092,
-0.1380731761455536,
0.013529774732887745,
0.0796278789639473,
0.024324942380189896,
-0.007872706279158592,
0.17384962737560272,
-0.023827137425541878,
0.02312646247446537,
0.05668151006102562,
-0.0012477516429498792,
-0.012965825386345387,
-0.07835546880960464,
-0.03885848447680473,
0.03075464256107807,
-0.022157413884997368,
0.07598086446523666,
-0.054354555904865265,
-0.03351413086056709,
0.01696210354566574,
-0.02027120627462864,
-0.05212102457880974,
0.025126662105321884,
0.024272294715046883,
0.05286406725645065,
0.06704152375459671,
0.04478909447789192,
0.001046779565513134,
-0.03364372253417969,
0.2630433142185211,
-0.0680806040763855,
-0.09338140487670898,
-0.1460730880498886,
0.2023874670267105,
0.05700143799185753,
-0.027326589450240135,
0.07562559843063354,
-0.10796625912189484,
-0.0010265440214425325,
0.20947232842445374,
0.16210629045963287,
-0.08964288234710693,
-0.016434594988822937,
0.010587873868644238,
-0.011464055627584457,
-0.059804659336805344,
0.10817406326532364,
0.1403564065694809,
0.042562905699014664,
-0.06147635355591774,
-0.043900277465581894,
-0.027758050709962845,
-0.028308097273111343,
-0.058855872601270676,
0.07402903586626053,
0.03882259503006935,
-0.01015161070972681,
-0.036693036556243896,
0.06909631937742233,
-0.03984961286187172,
-0.13661494851112366,
0.05145386978983879,
-0.15807399153709412,
-0.1842789351940155,
-0.032818350940942764,
0.08328522741794586,
-0.012584459036588669,
0.05179779604077339,
-0.018463358283042908,
-0.02149994671344757,
0.14782898128032684,
-0.015218119136989117,
-0.03770527243614197,
-0.09598147869110107,
0.10595419257879257,
-0.03044046461582184,
0.2071610391139984,
-0.006461804732680321,
0.08235248178243637,
0.11004684120416641,
0.049238283187150955,
-0.09483221918344498,
0.050877176225185394,
0.08681455254554749,
-0.08111730217933655,
0.007441497873514891,
0.12204369902610779,
-0.035768598318099976,
0.08699864894151688,
0.048400722444057465,
-0.13675352931022644,
0.005072943866252899,
-0.04306665062904358,
-0.033558666706085205,
-0.08059106767177582,
0.026030410081148148,
-0.06151232495903969,
0.15704716742038727,
0.23414230346679688,
-0.039006300270557404,
0.006558314431458712,
-0.07450856268405914,
0.04227214306592941,
0.04589500278234482,
0.09984473884105682,
-0.044194724410772324,
-0.20064960420131683,
0.016806619241833687,
-0.027110343798995018,
0.0032211823854595423,
-0.22341270744800568,
-0.10472851991653442,
0.06944173574447632,
-0.06138349324464798,
-0.05939844623208046,
0.1154114305973053,
0.07956157624721527,
0.04094082489609718,
-0.040022511035203934,
-0.10599823296070099,
-0.06854353100061417,
0.13919124007225037,
-0.1528342068195343,
-0.05313125625252724
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-qa
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
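For reference, a hedged sketch of `TrainingArguments` that mirrors the hyperparameters listed above is shown below; the output directory is a placeholder and the surrounding data loading and `Trainer` wiring are omitted, so this is not the author's actual training script.

```python
# TrainingArguments mirroring the hyperparameters reported in this card.
# Adam betas (0.9, 0.999) and epsilon 1e-08 are the library defaults, so they
# are not set explicitly here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./distilbert-base-uncased-qa",  # placeholder path (assumption)
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=3.0,
)
```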
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model_index": [{"name": "distilbert-base-uncased-qa", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "squad", "type": "squad", "args": "plain_text"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# distilbert-base-uncased-qa
This model is a fine-tuned version of distilbert-base-uncased on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1925
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-qa\n\nThis model is a fine-tuned version of distilbert-base-uncased on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 1.1925",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-qa\n\nThis model is a fine-tuned version of distilbert-base-uncased on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 1.1925",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 3.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
52,
56,
6,
12,
8,
3,
105,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# distilbert-base-uncased-qa\n\nThis model is a fine-tuned version of distilbert-base-uncased on the squad dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 1.1925## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 1000\n- num_epochs: 3.0### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.10978585481643677,
0.15697795152664185,
-0.002723450306802988,
0.11620127409696579,
0.1394302099943161,
0.04254036024212837,
0.07407311350107193,
0.16795817017555237,
-0.04368206486105919,
0.06897994130849838,
0.0698416456580162,
0.07807139307260513,
0.04805368185043335,
0.12392843514680862,
-0.040313128381967545,
-0.22163721919059753,
0.008516478352248669,
-0.036955371499061584,
-0.0647139772772789,
0.09862080961465836,
0.08348526805639267,
-0.09339907020330429,
0.07219716906547546,
-0.031863972544670105,
-0.11474579572677612,
0.012225329875946045,
-0.027162989601492882,
-0.02255047671496868,
0.09924159944057465,
-0.012029103003442287,
0.0670255720615387,
0.01219444815069437,
0.128321573138237,
-0.23151904344558716,
-0.005509914364665747,
0.07400167733430862,
0.03845871239900589,
0.07271479070186615,
0.03447062149643898,
-0.002164536854252219,
0.08587827533483505,
-0.16499511897563934,
0.08147498220205307,
0.030335593968629837,
-0.07929500937461853,
-0.13626889884471893,
-0.09365362673997879,
0.04980757087469101,
0.08806885033845901,
0.09920458495616913,
0.007872242480516434,
0.1176425963640213,
-0.0979931578040123,
0.09145739674568176,
0.21175400912761688,
-0.26438891887664795,
-0.043944068253040314,
0.03839189186692238,
0.037870410829782486,
0.06475254148244858,
-0.10669606178998947,
-0.02280403859913349,
0.015173924155533314,
0.030358746647834778,
0.07632957398891449,
-0.03629722446203232,
-0.1353149116039276,
0.014126193709671497,
-0.115714892745018,
-0.024951353669166565,
0.15694871544837952,
0.033013273030519485,
-0.04147791862487793,
-0.08804747462272644,
-0.06488362699747086,
-0.09845472872257233,
0.00016034157306421548,
-0.021856172010302544,
0.04128195717930794,
-0.05315183848142624,
-0.03468700870871544,
-0.03481004387140274,
-0.0602792426943779,
-0.07298801839351654,
0.002211513463407755,
0.0739450454711914,
0.047898393124341965,
0.02923552505671978,
-0.016745056957006454,
0.11906901746988297,
-0.00401142705231905,
-0.1377677470445633,
-0.029858144000172615,
-0.012994598597288132,
-0.1186618059873581,
-0.029146920889616013,
-0.027537155896425247,
0.01211147103458643,
0.009429561905562878,
0.1537317931652069,
-0.0439145490527153,
0.07181607186794281,
0.043587423861026764,
-0.015815310180187225,
-0.007743991911411285,
0.13938412070274353,
-0.04218010976910591,
-0.06614269316196442,
-0.011187910102307796,
0.09999547153711319,
0.0002782527881208807,
-0.015797801315784454,
-0.06038019806146622,
-0.020643260329961777,
0.07772020250558853,
0.08139865845441818,
-0.02315763756632805,
0.033770106732845306,
-0.04143436625599861,
-0.026433903723955154,
0.012047898955643177,
-0.13908876478672028,
0.04567641392350197,
0.008072014898061752,
-0.1037035882472992,
-0.019857801496982574,
0.03715356066823006,
-0.01905481331050396,
-0.052365973591804504,
0.09684539586305618,
-0.06980681419372559,
-0.0017258807783946395,
-0.07075215131044388,
-0.06994029134511948,
0.011227920651435852,
-0.11656718701124191,
-0.021413523703813553,
-0.047334637492895126,
-0.23162458837032318,
-0.05254027992486954,
0.05103761702775955,
-0.06808407604694366,
-0.030580677092075348,
-0.05490041896700859,
-0.06933087110519409,
0.015515216626226902,
-0.01148685161024332,
0.10843811184167862,
-0.05863606557250023,
0.08994043618440628,
-0.017025373876094818,
0.04685874283313751,
0.03245536610484123,
0.05468665808439255,
-0.09351608902215958,
0.02624109573662281,
-0.11330546438694,
0.0836276188492775,
-0.09926910698413849,
0.01845012977719307,
-0.11172502487897873,
-0.10337793081998825,
0.04698290303349495,
-0.022899258881807327,
0.06213168054819107,
0.1514655351638794,
-0.21937772631645203,
-0.0016306849429383874,
0.11816494911909103,
-0.06539617478847504,
-0.0633830577135086,
0.0775136724114418,
-0.04856004938483238,
0.027789117768406868,
0.056273315101861954,
0.18019185960292816,
0.11746428906917572,
-0.13299039006233215,
-0.04119071736931801,
0.021548228338360786,
0.0456160269677639,
0.01145939901471138,
0.03952588140964508,
0.001125949900597334,
0.061116717755794525,
0.02142505533993244,
-0.09596408903598785,
-0.01974390633404255,
-0.07849787175655365,
-0.09303655475378036,
-0.06428509950637817,
-0.08455503731966019,
0.02914297580718994,
0.039713941514492035,
0.022984227165579796,
-0.04303043708205223,
-0.09405536204576492,
0.09073566645383835,
0.1369021087884903,
-0.052089136093854904,
0.011809730902314186,
-0.057971417903900146,
0.03950544819235802,
0.013896851800382137,
-0.022572927176952362,
-0.20356746017932892,
-0.12330573797225952,
0.044003404676914215,
-0.07890217751264572,
0.021869158372282982,
0.0229856725782156,
0.05600595474243164,
0.04744346812367439,
-0.03978684917092323,
-0.0313323549926281,
-0.0642896220088005,
-0.0063717905431985855,
-0.09052491933107376,
-0.19358129799365997,
-0.050608616322278976,
-0.024050895124673843,
0.12564268708229065,
-0.1951645165681839,
0.007441873662173748,
-0.03538838401436806,
0.1190686970949173,
0.019368771463632584,
-0.05380666255950928,
0.011415843851864338,
0.0343981608748436,
0.002855486935004592,
-0.09456739574670792,
0.04271111264824867,
-0.008901032619178295,
-0.07380633056163788,
-0.06917529553174973,
-0.10781966149806976,
-0.014417803846299648,
0.054149795323610306,
0.0825670138001442,
-0.09868836402893066,
-0.001328971702605486,
-0.059780724346637726,
-0.054364342242479324,
-0.08510817587375641,
-0.0012195713352411985,
0.1650933474302292,
0.03371358662843704,
0.10617726296186447,
-0.04932595044374466,
-0.0709446594119072,
-0.00882794614881277,
0.01886313036084175,
0.017447402700781822,
0.09686299413442612,
0.09912268817424774,
-0.06385543197393417,
0.08406295627355576,
0.07357760518789291,
-0.04176364466547966,
0.12471804767847061,
-0.05740983039140701,
-0.08346811681985855,
-0.025371281430125237,
0.022459544241428375,
-0.019626403227448463,
0.1434049904346466,
-0.06836759299039841,
0.009933949448168278,
0.04341290146112442,
0.03481360152363777,
0.014136852696537971,
-0.16371658444404602,
-0.013515789061784744,
0.004193851258605719,
-0.061564695090055466,
-0.054391857236623764,
-0.009127398021519184,
0.03611740469932556,
0.09326629340648651,
0.01191560085862875,
-0.028982464224100113,
0.011711575090885162,
-0.023998286575078964,
-0.07844308763742447,
0.18329572677612305,
-0.13254618644714355,
-0.13190537691116333,
-0.09705211222171783,
0.0611288920044899,
-0.0646296963095665,
-0.042554207146167755,
0.02596324123442173,
-0.07124540954828262,
-0.05537937209010124,
-0.0882132425904274,
-0.018150407820940018,
-0.011289936490356922,
-0.006121004465967417,
0.004846971016377211,
0.006833527237176895,
0.07050130516290665,
-0.13928242027759552,
0.0036774936597794294,
-0.026811450719833374,
-0.0843002200126648,
0.011096128262579441,
0.059698741883039474,
0.09329620748758316,
0.10414087772369385,
0.008752112276852131,
0.01906684786081314,
-0.021410111337900162,
0.22993133962154388,
-0.06931310147047043,
0.027914050966501236,
0.1072731465101242,
0.017179204151034355,
0.0640057846903801,
0.1478249579668045,
0.02556794136762619,
-0.10019897669553757,
0.02355412021279335,
0.08042032271623611,
-0.01445969007909298,
-0.2488812953233719,
-0.03826363757252693,
-0.03925962373614311,
-0.0329679436981678,
0.0780278742313385,
0.05882367491722107,
-0.024706291034817696,
0.0309157557785511,
0.008221447467803955,
-0.005251898895949125,
-0.03225397691130638,
0.0655435174703598,
0.09370787441730499,
0.0364755280315876,
0.08993388712406158,
-0.027964793145656586,
-0.0006501057650893927,
0.07507628202438354,
0.02180624194443226,
0.3032052218914032,
-0.05818868428468704,
0.10189826041460037,
0.04224341735243797,
0.15321072936058044,
-0.0518091581761837,
0.05822765454649925,
0.0029701509047299623,
0.010815606452524662,
0.006692366674542427,
-0.055373046547174454,
-0.020455140620470047,
0.022591961547732353,
-0.020251577720046043,
0.03915201872587204,
-0.0851924866437912,
0.05311312526464462,
0.043526168912649155,
0.2893036901950836,
0.05174875259399414,
-0.23942576348781586,
-0.06509321928024292,
0.012096450664103031,
-0.040540292859077454,
-0.061960794031620026,
0.008505183272063732,
0.13421021401882172,
-0.1371179074048996,
0.04740285500884056,
-0.05113464221358299,
0.0839063748717308,
-0.04751700535416603,
0.0029340991750359535,
0.07336930185556412,
0.0948772057890892,
0.0006936685531400144,
0.08253355324268341,
-0.223690927028656,
0.21039022505283356,
0.021047750487923622,
0.10447734594345093,
-0.06229676306247711,
0.027966950088739395,
0.014806046150624752,
0.05062783136963844,
0.13589997589588165,
0.013859973289072514,
-0.035928208380937576,
-0.15731598436832428,
-0.09344466030597687,
0.04587215185165405,
0.11362174898386002,
-0.02350698597729206,
0.08311090618371964,
-0.0468834824860096,
-0.0019173381151631474,
0.0291205495595932,
-0.07026606798171997,
-0.12766045331954956,
-0.11275137960910797,
0.027875185012817383,
0.0011526258895173669,
-0.04489394649863243,
-0.04739626869559288,
-0.08693423867225647,
-0.0068093566223979,
0.1472870111465454,
-0.0360717698931694,
-0.07040993869304657,
-0.13930776715278625,
0.03250344470143318,
0.15427246689796448,
-0.0667041540145874,
0.010960793122649193,
0.02116304449737072,
0.061094071716070175,
0.056743521243333817,
-0.08560345321893692,
0.04777972400188446,
-0.06975292414426804,
-0.17925085127353668,
-0.05622263625264168,
0.1199173629283905,
0.06635293364524841,
0.05278845503926277,
-0.01733590103685856,
0.028601422905921936,
-0.013903231360018253,
-0.10707491636276245,
0.0022013592533767223,
0.08738253265619278,
0.07945433259010315,
0.07032632827758789,
-0.10368996113538742,
0.043145425617694855,
-0.03592536598443985,
-0.01548322569578886,
0.11134227365255356,
0.20671215653419495,
-0.09788889437913895,
0.11928395181894302,
0.05376717075705528,
-0.07933463901281357,
-0.17535091936588287,
0.03623080626130104,
0.10375025868415833,
0.014277618378400803,
0.06594907492399216,
-0.18479807674884796,
0.12280229479074478,
0.11030790209770203,
-0.015044399537146091,
0.07352711260318756,
-0.3451770842075348,
-0.11986549943685532,
0.05764725059270859,
0.09357945621013641,
0.044881585985422134,
-0.1481836587190628,
-0.03064621053636074,
-0.005426047835499048,
-0.14778433740139008,
0.11316833645105362,
-0.05739229544997215,
0.11958840489387512,
-0.027771420776844025,
0.11231681704521179,
0.029195213690400124,
-0.038571931421756744,
0.143147811293602,
0.09696966409683228,
0.07960715889930725,
-0.04601693153381348,
-0.006783219985663891,
0.03742172569036484,
-0.0775819793343544,
0.07824066281318665,
-0.05628766492009163,
0.0816970095038414,
-0.1892007291316986,
0.002993704052641988,
-0.08067524433135986,
0.05923519283533096,
-0.06116725131869316,
-0.06232475861907005,
-0.0378720685839653,
0.06864214688539505,
0.08837103843688965,
-0.033795252442359924,
0.033600375056266785,
0.014849057421088219,
0.059260960668325424,
0.08714815974235535,
0.07231544703245163,
0.0063869161531329155,
-0.1281830221414566,
0.004518372472375631,
0.0008220873423852026,
0.04731351137161255,
-0.11641091853380203,
0.03177136182785034,
0.15552778542041779,
0.05715421214699745,
0.13121435046195984,
0.028247086331248283,
-0.029700178653001785,
-0.01186858769506216,
0.030004791915416718,
-0.12988430261611938,
-0.11929793655872345,
0.025617675855755806,
-0.09092574566602707,
-0.13584275543689728,
0.01826408877968788,
0.09405367821455002,
-0.029889775440096855,
-0.008850050158798695,
-0.006860097870230675,
0.03724895417690277,
-0.006690148264169693,
0.20636741816997528,
0.04110006242990494,
0.06467055529356003,
-0.0975659042596817,
0.13360922038555145,
0.05367659404873848,
-0.05635659024119377,
0.042580705136060715,
0.09512939304113388,
-0.10626111924648285,
-0.02409886009991169,
0.06077592447400093,
0.13169386982917786,
-0.05103502795100212,
-0.0268865954130888,
-0.08931569755077362,
-0.07229010760784149,
0.05454348772764206,
0.10032670944929123,
0.04140461981296539,
0.005014955066144466,
-0.04699815809726715,
0.008829121477901936,
-0.12776590883731842,
0.09466792643070221,
0.06753041595220566,
0.05219124257564545,
-0.1284913718700409,
0.09133729338645935,
0.0020710937678813934,
0.06411601603031158,
-0.011473487131297588,
0.022899745032191277,
-0.10259882360696793,
-0.015218072570860386,
-0.11333668977022171,
-0.0085771968588233,
-0.03055313229560852,
0.007046959828585386,
-0.012229898944497108,
-0.05628449469804764,
-0.033566512167453766,
0.044498905539512634,
-0.0756044015288353,
-0.0642329677939415,
0.008527415804564953,
0.05281555652618408,
-0.1638156622648239,
-0.02739928662776947,
0.016378993168473244,
-0.08890742808580399,
0.09220948070287704,
0.06623868644237518,
0.00805373303592205,
0.028682103380560875,
-0.09197225421667099,
-0.022987334057688713,
0.00266600982286036,
0.037323713302612305,
0.07257025688886642,
-0.09105727076530457,
-0.010588167235255241,
-0.03449887037277222,
0.03420308232307434,
0.02480022981762886,
0.051519837230443954,
-0.12916263937950134,
0.015846315771341324,
-0.06771118193864822,
-0.034829940646886826,
-0.07218879461288452,
0.03986406698822975,
0.107783243060112,
0.041915617883205414,
0.16616258025169373,
-0.06976623088121414,
0.057565268129110336,
-0.19124597311019897,
-0.03975316509604454,
0.013959653675556183,
-0.03105119802057743,
-0.06800741702318192,
-0.033060476183891296,
0.09323753416538239,
-0.05732814222574234,
0.10274426639080048,
-0.014039919711649418,
0.10049712657928467,
0.026698175817728043,
-0.028173841536045074,
-0.04029642418026924,
-0.011271584779024124,
0.13258956372737885,
0.037786614149808884,
-0.01624460145831108,
0.09302971512079239,
-0.010471919551491737,
0.02063138037919998,
0.054879192262887955,
0.22387850284576416,
0.1458054929971695,
-0.0007655407534912229,
0.04509229212999344,
0.056101541966199875,
-0.1264875829219818,
-0.14736729860305786,
0.09192080795764923,
-0.04420659691095352,
0.12044630944728851,
-0.06352777779102325,
0.1937357485294342,
0.04001373425126076,
-0.18411456048488617,
0.0823969691991806,
-0.05420771986246109,
-0.12583142518997192,
-0.1200433000922203,
-0.03566497191786766,
-0.06062363088130951,
-0.11355550587177277,
0.02044181153178215,
-0.1232154592871666,
0.06499530375003815,
0.09623928368091583,
0.016793183982372284,
0.00520485220476985,
0.1336447149515152,
-0.05092114582657814,
0.02112651616334915,
0.05749187245965004,
0.024895528331398964,
-0.0050494857132434845,
-0.036707814782857895,
-0.03603528439998627,
0.02639198489487171,
0.007385154254734516,
0.06903774291276932,
-0.029198700562119484,
0.005285837687551975,
0.0163229051977396,
-0.03263043239712715,
-0.05650701746344566,
0.010965843684971333,
0.022210003808140755,
0.025279482826590538,
0.08368639647960663,
0.0697738379240036,
-0.0008194411639124155,
-0.03647240251302719,
0.2986714839935303,
-0.08162151277065277,
-0.09107496589422226,
-0.14119619131088257,
0.20434479415416718,
0.041491784155368805,
-0.031091412529349327,
0.07141956686973572,
-0.12647050619125366,
-0.030046606436371803,
0.16278086602687836,
0.14297553896903992,
-0.08186127245426178,
-0.017510931938886642,
-0.01772494986653328,
-0.003914618398994207,
-0.03705122321844101,
0.10462221503257751,
0.10388155281543732,
0.08744306117296219,
-0.058559268712997437,
-0.004869963973760605,
-0.01768924482166767,
-0.04018709808588028,
-0.09119603782892227,
0.08048181235790253,
0.01908174157142639,
0.0026322773192077875,
-0.03744533658027649,
0.06410231441259384,
-0.02153763920068741,
-0.2003650665283203,
0.049430105835199356,
-0.17402353882789612,
-0.17638030648231506,
-0.033694107085466385,
0.06897787004709244,
-0.009407421573996544,
0.03595186397433281,
-0.012388849630951881,
-0.0022575482726097107,
0.16458213329315186,
-0.0224473774433136,
-0.06862773001194,
-0.10925395041704178,
0.0855826810002327,
-0.08794045448303223,
0.1988396793603897,
0.0063232057727873325,
0.06758587807416916,
0.09560475498437881,
0.022938741371035576,
-0.1355457901954651,
0.04729798063635826,
0.08404779434204102,
-0.10189797729253769,
0.015259940177202225,
0.15980926156044006,
-0.043011266738176346,
0.08141712844371796,
0.03766495734453201,
-0.08036230504512787,
-0.020195672288537025,
-0.004543384071439505,
-0.02040310949087143,
-0.09500950574874878,
-0.008597108535468578,
-0.050760816782712936,
0.15830042958259583,
0.21966831386089325,
-0.026376141235232353,
0.020490722730755806,
-0.08990523219108582,
0.02203851006925106,
0.034933239221572876,
0.0636613667011261,
-0.0349639430642128,
-0.18140822649002075,
0.04381444677710533,
0.0238150954246521,
0.03928397595882416,
-0.18307045102119446,
-0.10290191322565079,
0.04524023085832596,
-0.0543728731572628,
-0.051407188177108765,
0.11371041089296341,
0.027181880548596382,
0.03546065464615822,
-0.030286846682429314,
-0.10773470997810364,
-0.03936576843261719,
0.15159322321414948,
-0.16147595643997192,
-0.05660416558384895
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat
This model is a fine-tuned version of [twmkn9/distilbert-base-uncased-squad2](https://huggingface.co/twmkn9/distilbert-base-uncased-squad2) on the squad_v2 and the mit_restaurant datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
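## How to use

The card itself does not show an inference snippet, so the following is a minimal, hedged sketch: it loads the checkpoint `andi611/distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat` with the standard Transformers question-answering pipeline (the repo's pipeline tag). The question and context strings are illustrative placeholders, not taken from the training data.

```python
# Minimal usage sketch (not part of the original training code).
# Assumes the checkpoint exposes a standard extractive QA head.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="andi611/distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat",
)

result = qa(
    question="What cuisine is mentioned?",  # illustrative only
    context="We had dinner at a small Italian restaurant near the station.",
)
print(result)  # dict with 'score', 'start', 'end', 'answer'
```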
|
{"language": ["en"], "tags": ["generated_from_trainer"], "datasets": ["squad_v2", "mit_restaurant"], "model_index": [{"name": "distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "squad_v2", "type": "squad_v2"}}, {"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "mit_restaurant", "type": "mit_restaurant"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"en",
"dataset:squad_v2",
"dataset:mit_restaurant",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_restaurant #endpoints_compatible #region-us
|
# distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat
This model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the squad_v2 and the mit_restaurant datasets.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the squad_v2 and the mit_restaurant datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_restaurant #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the squad_v2 and the mit_restaurant datasets.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
56,
75,
6,
12,
8,
3,
90,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #en #dataset-squad_v2 #dataset-mit_restaurant #endpoints_compatible #region-us \n# distilbert-base-uncased-squad2-with-ner-mit-restaurant-with-neg-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the squad_v2 and the mit_restaurant datasets.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.10736114531755447,
0.22061698138713837,
-0.0021042730659246445,
0.07081130892038345,
0.11480206251144409,
0.029228180646896362,
0.06804236024618149,
0.16537511348724365,
-0.04765252396464348,
0.07912977784872055,
0.07595542818307877,
0.06399242579936981,
0.05728776380419731,
0.13082513213157654,
-0.02314796857535839,
-0.1680869609117508,
0.007032749243080616,
0.023427940905094147,
-0.012756217271089554,
0.13458703458309174,
0.10057380050420761,
-0.09178154170513153,
0.06890425086021423,
0.006671129260212183,
-0.12489128857851028,
0.0056547136045992374,
-0.028056558221578598,
-0.036099571734666824,
0.09621332585811615,
0.0005316960741765797,
0.09343216568231583,
0.01888740062713623,
0.09044180810451508,
-0.20472460985183716,
-0.004813592415302992,
0.06275428086519241,
0.01628207601606846,
0.07222458720207214,
0.01831178553402424,
0.030134161934256554,
0.04063839092850685,
-0.1357422024011612,
0.0954776257276535,
0.03415175527334213,
-0.09283361583948135,
-0.1364220678806305,
-0.10805141180753708,
0.07168462127447128,
0.04994085058569908,
0.09578528255224228,
-0.009297181852161884,
0.15688456594944,
-0.09190548211336136,
0.06359723210334778,
0.13247212767601013,
-0.2642923891544342,
-0.05858776718378067,
0.0022070298437029123,
0.030009962618350983,
0.049601078033447266,
-0.09305638074874878,
-0.017737161368131638,
0.04198390990495682,
0.0404328927397728,
0.036021966487169266,
-0.004002667497843504,
-0.02913545072078705,
-0.01123251486569643,
-0.08877839148044586,
-0.06926339119672775,
0.19395442306995392,
0.04367593303322792,
-0.057866841554641724,
-0.11333335191011429,
-0.053071774542331696,
-0.06148356571793556,
-0.006239334587007761,
-0.07315045595169067,
0.014375397004187107,
-0.053013917058706284,
-0.07728416472673416,
-0.047906145453453064,
-0.07499472051858902,
-0.03726968541741371,
0.023228071630001068,
0.07763933390378952,
0.03776297718286514,
-0.004762799479067326,
-0.02856745384633541,
0.08883851021528244,
-0.03697417303919792,
-0.1426810473203659,
-0.022077463567256927,
-0.010726029053330421,
-0.09250480681657791,
-0.0599115788936615,
-0.009682332165539265,
-0.037629760801792145,
0.000613320036791265,
0.16263347864151,
-0.02444264106452465,
0.04323991760611534,
0.0076065706089138985,
-0.03927743807435036,
0.004423638340085745,
0.1356665939092636,
-0.03754657506942749,
-0.09829212725162506,
-0.03583523631095886,
0.1242046132683754,
-0.005939997732639313,
-0.02048877626657486,
-0.059095192700624466,
-0.0323634073138237,
0.1063031256198883,
0.036422308534383774,
0.03463888168334961,
0.023645423352718353,
-0.0427483506500721,
-0.05852403864264488,
0.09901322424411774,
-0.11166871339082718,
0.03368406742811203,
0.004988210741430521,
-0.07934458553791046,
-0.045266035944223404,
-0.020088190212845802,
0.011500949040055275,
-0.04970547929406166,
0.07440707832574844,
-0.08795934915542603,
-0.040883228182792664,
-0.053287241607904434,
-0.049605149775743484,
0.019796350970864296,
-0.05322447791695595,
-0.01802786998450756,
-0.06862592697143555,
-0.160569429397583,
-0.05144191160798073,
0.051971063017845154,
-0.07572635263204575,
-0.06515012681484222,
-0.041867583990097046,
-0.0254278015345335,
0.0162818506360054,
-0.010790708474814892,
0.08076751232147217,
-0.02923731692135334,
0.08570065349340439,
0.016691071912646294,
-0.006156804971396923,
0.02717786468565464,
0.06332031637430191,
-0.09919866919517517,
0.04299993813037872,
-0.07900683581829071,
0.07567013800144196,
-0.07987870275974274,
0.013813012279570103,
-0.14188042283058167,
-0.10847057402133942,
0.0059760757721960545,
-0.052472058683633804,
0.07271923869848251,
0.1457301527261734,
-0.1705399453639984,
-0.0065126726403832436,
0.12131435424089432,
-0.05759771168231964,
-0.1201447919011116,
0.1253337860107422,
-0.034450285136699677,
0.03675695136189461,
0.0757855772972107,
0.17348970472812653,
0.15751631557941437,
-0.1184905394911766,
-0.054209694266319275,
0.01586765982210636,
0.08327425271272659,
0.02626636065542698,
0.0716235339641571,
-0.0037279885727912188,
0.038510508835315704,
0.00943570677191019,
-0.07398175448179245,
-0.01988133229315281,
-0.056221313774585724,
-0.09806719422340393,
-0.032260287553071976,
-0.0801413357257843,
0.12173748016357422,
0.044470321387052536,
0.023467527702450752,
-0.056376419961452484,
-0.11616936326026917,
0.05744768679141998,
0.11159588396549225,
-0.03859705477952957,
-0.0003423251328058541,
-0.06836878508329391,
0.09242341667413712,
-0.02795618772506714,
-0.03762910142540932,
-0.18320511281490326,
-0.10618267953395844,
0.04983404651284218,
-0.050140805542469025,
0.0247312281280756,
0.05407222732901573,
0.05593249574303627,
0.05025982856750488,
-0.03927433863282204,
-0.03208702802658081,
-0.08574690669775009,
0.020177530124783516,
-0.09300793707370758,
-0.11231015622615814,
-0.057153891772031784,
-0.038774631917476654,
0.20081213116645813,
-0.20776835083961487,
-0.002624640241265297,
0.022581716999411583,
0.11714515089988708,
0.01323781255632639,
-0.06747622042894363,
0.0002603679895401001,
0.02603430114686489,
0.0008607867057435215,
-0.0728607177734375,
0.03055517189204693,
0.015809915959835052,
-0.09609616547822952,
-0.0621316023170948,
-0.1146109402179718,
0.024269482120871544,
0.07236611843109131,
0.09683343768119812,
-0.08341116458177567,
-0.03122805990278721,
-0.06027527526021004,
-0.033616915345191956,
-0.06181641295552254,
-0.01716815121471882,
0.18750976026058197,
0.03904423117637634,
0.10374009609222412,
-0.045933373272418976,
-0.058502502739429474,
0.010803943499922752,
0.02892284281551838,
-0.04719436168670654,
0.07357116788625717,
0.012441718019545078,
-0.1831163614988327,
0.09573634713888168,
0.11708192527294159,
0.027104074135422707,
0.10088051855564117,
-0.026668483391404152,
-0.073483407497406,
-0.04602229967713356,
0.045395296066999435,
-0.02237103134393692,
0.12644970417022705,
-0.122380331158638,
0.018936794251203537,
0.05172617733478546,
0.008387700654566288,
0.01339135505259037,
-0.1490909308195114,
-0.014607451856136322,
0.05262492969632149,
-0.03779178857803345,
-0.044080767780542374,
-0.042413778603076935,
0.012622716836631298,
0.07024028897285461,
0.05542540177702904,
-0.03132009878754616,
0.024307716637849808,
-0.01819448731839657,
-0.07266578823328018,
0.14913904666900635,
-0.10648450255393982,
-0.20358799397945404,
-0.08020056039094925,
-0.0017729360843077302,
-0.05374506860971451,
-0.02287857048213482,
0.023970704525709152,
-0.07777907699346542,
-0.045838989317417145,
-0.06276634335517883,
-0.06694625318050385,
-0.049662698060274124,
-0.0349714569747448,
0.029239505529403687,
0.007689944934099913,
0.05909108743071556,
-0.1213105246424675,
0.009417416527867317,
-0.00843288004398346,
-0.05423580855131149,
-0.013848028145730495,
0.01403828151524067,
0.11631891131401062,
0.0944913923740387,
-0.02166527323424816,
0.015214405953884125,
-0.03991715982556343,
0.2581406831741333,
-0.09879842400550842,
-0.007055726833641529,
0.1147441416978836,
0.02665547840297222,
0.07061497122049332,
0.15558838844299316,
0.03134346753358841,
-0.07374249398708344,
0.010974908247590065,
0.03329278901219368,
-0.0020527923479676247,
-0.22235842049121857,
-0.04899891838431358,
-0.05176512151956558,
-0.0730729028582573,
0.1410466879606247,
0.03954267129302025,
0.037287645041942596,
0.05580167844891548,
-0.0402635894715786,
0.051208846271038055,
-0.02027248777449131,
0.0720207467675209,
0.11673320084810257,
0.0320746935904026,
0.07454481720924377,
-0.012716980651021004,
-0.024896616116166115,
0.07484529167413712,
0.05330228805541992,
0.21558603644371033,
-0.035166289657354355,
0.1682482659816742,
0.03456789627671242,
0.15849560499191284,
-0.04927457123994827,
0.0229886956512928,
-0.014060268178582191,
0.0136942770332098,
-0.011898872442543507,
-0.07408569753170013,
-0.07274145632982254,
0.02633672207593918,
0.06194034591317177,
0.004789974074810743,
-0.04689820110797882,
0.00649389298632741,
0.017082884907722473,
0.16935011744499207,
0.08616851270198822,
-0.26719868183135986,
-0.0706486701965332,
0.015592332929372787,
-0.012093842029571533,
-0.06442002952098846,
-0.00013256713282316923,
0.11307154595851898,
-0.13082949817180634,
0.061769191175699234,
-0.0553341768682003,
0.09352613985538483,
-0.04209486395120621,
0.013171497732400894,
0.07505089044570923,
0.0392274484038353,
0.009623976424336433,
0.10960215330123901,
-0.19720974564552307,
0.2086496651172638,
0.02228729799389839,
0.08830016106367111,
-0.1097865030169487,
0.0248708538711071,
-0.02576792798936367,
0.05434301123023033,
0.13765500485897064,
0.014458120800554752,
-0.02243203856050968,
-0.13253332674503326,
-0.09340608865022659,
0.025651924312114716,
0.06999627500772476,
-0.025670522823929787,
0.07076743990182877,
-0.047233060002326965,
-0.00654917536303401,
0.028837470337748528,
0.010618055239319801,
-0.1338084638118744,
-0.16489580273628235,
0.05276450142264366,
-0.03812579810619354,
-0.051497481763362885,
-0.05834075063467026,
-0.0837797075510025,
0.01926431804895401,
0.1876792162656784,
0.03409496322274208,
-0.051590293645858765,
-0.15003791451454163,
0.06497903168201447,
0.12520574033260345,
-0.0781785324215889,
0.005513888783752918,
0.017190653830766678,
0.10405183583498001,
0.003209321526810527,
-0.05894678458571434,
0.03852730244398117,
-0.039698127657175064,
-0.13245384395122528,
-0.04925760626792908,
0.1321968138217926,
0.031668566167354584,
0.07973317801952362,
0.03092498891055584,
0.04108889028429985,
-0.002851777011528611,
-0.07971269637346268,
-0.024108977988362312,
0.011375855654478073,
0.10653630644083023,
0.03650084510445595,
-0.02991431951522827,
0.005815782584249973,
-0.07439898699522018,
0.020380765199661255,
0.12897734344005585,
0.22052811086177826,
-0.07677971571683884,
0.10844991356134415,
0.06836998462677002,
-0.05101707577705383,
-0.13478000462055206,
-0.01332867331802845,
0.06674579530954361,
0.0015460584545508027,
0.05148867890238762,
-0.15674234926700592,
0.06633259356021881,
0.08462611585855484,
-0.022063465788960457,
0.04424137994647026,
-0.3177047371864319,
-0.11742153763771057,
0.04569302499294281,
0.11692235618829727,
0.026242852210998535,
-0.09658512473106384,
-0.06259489804506302,
-0.016449514776468277,
-0.18615399301052094,
0.08824263513088226,
-0.034962985664606094,
0.09238849580287933,
0.0012227381812408566,
0.08613573014736176,
0.03462217003107071,
-0.05304774269461632,
0.1486952006816864,
0.059686433523893356,
0.02511146292090416,
-0.06661410629749298,
-0.03286481648683548,
0.10717920958995819,
-0.050862688571214676,
0.0670970231294632,
-0.0351606123149395,
0.06358327716588974,
-0.20946133136749268,
-0.01941436156630516,
-0.08271227031946182,
0.026360459625720978,
-0.08153911679983139,
-0.07493455708026886,
-0.04121759906411171,
0.09570591896772385,
0.10581571608781815,
-0.02817809395492077,
0.04125162586569786,
0.035551488399505615,
0.10190096497535706,
0.0998801663517952,
0.0908060073852539,
0.009936484508216381,
-0.1577848196029663,
-0.02141927182674408,
-0.010335290804505348,
0.03917164355516434,
-0.07387365400791168,
0.02909959852695465,
0.13444389402866364,
0.07481051981449127,
0.14259935915470123,
0.006667873356491327,
-0.06057824566960335,
-0.008250180631875992,
0.025003479793667793,
-0.11378151923418045,
-0.17354723811149597,
-0.012085392139852047,
-0.027324628084897995,
-0.18197664618492126,
-0.047573741525411606,
0.11887338757514954,
-0.037900298833847046,
-0.026251064613461494,
-0.02001100592315197,
0.027777545154094696,
-0.005312792025506496,
0.19826120138168335,
0.0676894411444664,
0.06755885481834412,
-0.0751618817448616,
0.06193481385707855,
0.12449201941490173,
-0.07676959782838821,
0.03771153837442398,
0.03418334573507309,
-0.09227809309959412,
-0.03275643289089203,
0.05276750400662422,
0.0845104530453682,
-0.014249823987483978,
-0.01814742386341095,
-0.0802292749285698,
-0.03469400107860565,
0.05654517188668251,
0.003737394232302904,
0.05332101881504059,
0.016167042776942253,
-0.050372444093227386,
-0.015082001686096191,
-0.109123095870018,
0.10894990712404251,
0.03458623215556145,
0.06615608185529709,
-0.11794284731149673,
0.04828699678182602,
-0.007847034372389317,
0.059429142624139786,
-0.01920381188392639,
0.007195539306849241,
-0.06929786503314972,
-0.014442779123783112,
-0.12092684209346771,
0.015649186447262764,
-0.03952918201684952,
0.01521338988095522,
-0.04377063363790512,
-0.06465929746627808,
-0.044496603310108185,
0.026518402621150017,
-0.0643252432346344,
-0.06225500628352165,
0.018797997385263443,
0.03692317008972168,
-0.19090604782104492,
-0.004181358031928539,
0.040441855788230896,
-0.09526710212230682,
0.0711279958486557,
0.023427357897162437,
0.0003076006832998246,
-0.004776478745043278,
-0.026244858279824257,
-0.06352932751178741,
-0.033247388899326324,
0.0599842444062233,
0.09186738729476929,
-0.11999804526567459,
-0.014087975025177002,
-0.009164828807115555,
0.02461402676999569,
0.03431214392185211,
0.07884158939123154,
-0.11730904877185822,
-0.03696572035551071,
-0.0364329032599926,
-0.06479988992214203,
-0.0658123642206192,
0.0736284926533699,
0.09546095132827759,
-0.00035676470724865794,
0.1590755730867386,
-0.046121470630168915,
0.05930182710289955,
-0.19403019547462463,
-0.05170348286628723,
0.012638294138014317,
-0.03228000923991203,
-0.01380818709731102,
-0.03139650449156761,
0.07151348888874054,
-0.023666908964514732,
0.07826732099056244,
-0.016905784606933594,
0.12367959320545197,
0.053946446627378464,
-0.023287316784262657,
-0.006929532624781132,
-0.02611912041902542,
0.1648578941822052,
0.04443533718585968,
-0.021392740309238434,
0.07491718232631683,
-0.025747789070010185,
0.0025002441834658384,
0.060867343097925186,
0.16498272120952606,
0.17633862793445587,
0.03479459509253502,
0.00999605655670166,
0.06428173184394836,
-0.07400792092084885,
-0.21413670480251312,
0.059750840067863464,
-0.05154520645737648,
0.12330980598926544,
-0.0359952449798584,
0.126267209649086,
0.06110900267958641,
-0.20035411417484283,
0.06719301640987396,
-0.08096110075712204,
-0.11997479200363159,
-0.06283092498779297,
-0.11866269260644913,
-0.07442895323038101,
-0.09591709822416306,
0.026788849383592606,
-0.11343451589345932,
0.03572941944003105,
0.085904560983181,
0.013023714534938335,
0.0012811539927497506,
0.1541445404291153,
-0.028863398358225822,
0.012225814163684845,
0.052648454904556274,
0.017793983221054077,
0.015774782747030258,
-0.059990495443344116,
-0.01454625092446804,
0.04343411698937416,
0.04979377239942551,
0.10496171563863754,
-0.03845403715968132,
0.01070988830178976,
0.01837329752743244,
0.027091223746538162,
-0.07495374232530594,
0.0002556617255322635,
0.013379438780248165,
0.05437162518501282,
0.06736718863248825,
0.05055881291627884,
0.03547336533665657,
-0.037597186863422394,
0.2536960244178772,
-0.041740044951438904,
-0.09879244863986969,
-0.14420560002326965,
0.17450696229934692,
0.045663733035326004,
-0.02226695790886879,
0.09437102824449539,
-0.11668539047241211,
0.02396034076809883,
0.12543603777885437,
0.16301174461841583,
-0.07696984708309174,
-0.025087157264351845,
0.005698798689991236,
0.005762245040386915,
-0.016357168555259705,
0.09729049354791641,
0.07671086490154266,
0.027758654206991196,
-0.06914402544498444,
-0.009266607463359833,
-0.0010734873358160257,
-0.056411657482385635,
-0.06334389746189117,
0.08367061614990234,
0.03371693193912506,
0.011485296301543713,
-0.057445596903562546,
0.10914258658885956,
0.05138953775167465,
-0.1935967355966568,
0.011527053080499172,
-0.17474527657032013,
-0.20474380254745483,
-0.023985616862773895,
0.1187414899468422,
0.0010011923732236028,
0.06996921449899673,
0.0036910169292241335,
0.008749822154641151,
0.11660224944353104,
0.0001302371674682945,
-0.0771997720003128,
-0.0634361058473587,
0.13041925430297852,
-0.06852802634239197,
0.2606651782989502,
0.008121434599161148,
0.0536004938185215,
0.10657814890146255,
0.006237613968551159,
-0.12660782039165497,
0.009971477091312408,
0.11366019397974014,
-0.02297799289226532,
0.04631780833005905,
0.15175586938858032,
-0.031872570514678955,
0.07001899927854538,
0.05979296937584877,
-0.13674448430538177,
-0.010270042344927788,
0.04119471460580826,
0.030072951689362526,
-0.09766703844070435,
0.01717955246567726,
-0.05966085568070412,
0.18309850990772247,
0.19332106411457062,
-0.047888852655887604,
0.012290127575397491,
-0.06605011224746704,
0.0234098881483078,
0.04975145682692528,
0.09945768862962723,
-0.04187319800257683,
-0.1761123538017273,
-0.016927657648921013,
-0.021466068923473358,
0.03214038535952568,
-0.22648635506629944,
-0.10129924863576889,
0.05443626269698143,
-0.043688736855983734,
-0.03754924237728119,
0.1260896772146225,
0.02449745126068592,
0.016455605626106262,
-0.04451175406575203,
-0.10630597174167633,
-0.08175157010555267,
0.11311762034893036,
-0.13283206522464752,
-0.03464566543698311
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat
This model is a fine-tuned version of [twmkn9/distilbert-base-uncased-squad2](https://huggingface.co/twmkn9/distilbert-base-uncased-squad2) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
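### Reproducing the hyperparameters (sketch)

The original training script is not included in this card; the snippet below is only an assumption-laden sketch of how the hyperparameters listed above would typically be expressed with `transformers.TrainingArguments`. The output directory is hypothetical, and dataset preprocessing, the model head, and the `Trainer` call are omitted.

```python
# Hedged reconstruction of the listed hyperparameters; not the author's script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./distilbert-squad2-ner-conll2003",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,        # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```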
|
{"tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:conll2003",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us
|
# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat
This model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
45,
67,
6,
12,
8,
3,
90,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.0936797708272934,
0.1558576375246048,
-0.0022049895487725735,
0.10963279008865356,
0.12776020169258118,
0.033177100121974945,
0.08204490691423416,
0.15214477479457855,
-0.11274250596761703,
0.05445998162031174,
0.0752950981259346,
0.05464876815676689,
0.021484944969415665,
0.10035602748394012,
-0.03429994732141495,
-0.2351084202528,
-0.004170737694948912,
0.00555950915440917,
-0.03644676133990288,
0.11082634329795837,
0.0906350389122963,
-0.09373944252729416,
0.07476499676704407,
-0.016785213723778725,
-0.15778742730617523,
0.031373508274555206,
-0.03839004784822464,
-0.027213232591748238,
0.08899655193090439,
0.011090642772614956,
0.09746678173542023,
0.009400768205523491,
0.1221674457192421,
-0.20812317728996277,
-0.0008702778723090887,
0.0709543228149414,
0.028452463448047638,
0.07009486854076385,
0.017763102427124977,
0.0012355041690170765,
0.0978558361530304,
-0.1600927710533142,
0.0947103500366211,
0.024508655071258545,
-0.08603311330080032,
-0.11403603106737137,
-0.08466997742652893,
0.07340017706155777,
0.08400276303291321,
0.09913196414709091,
0.007613977417349815,
0.16704246401786804,
-0.0946422889828682,
0.1016269251704216,
0.17657743394374847,
-0.25859203934669495,
-0.06855523586273193,
0.06887006759643555,
0.045684684067964554,
0.06738577783107758,
-0.12548527121543884,
-0.0276230089366436,
0.052085377275943756,
0.028076183050870895,
0.06374615430831909,
-0.037389449775218964,
-0.14391040802001953,
0.014834391884505749,
-0.14162731170654297,
-0.021905817091464996,
0.2003604620695114,
0.05334843322634697,
-0.0400206632912159,
-0.06834766268730164,
-0.058268580585718155,
-0.09052888303995132,
-0.0024788561277091503,
-0.061927083879709244,
0.032017774879932404,
-0.06304815411567688,
-0.08133098483085632,
-0.0712808296084404,
-0.07189729064702988,
-0.07882275432348251,
-0.013836998492479324,
0.10149849951267242,
0.051806870847940445,
0.010671847499907017,
-0.0360938124358654,
0.1187531128525734,
-0.007177362684160471,
-0.11959975212812424,
-0.04418410360813141,
-0.0012157884193584323,
-0.11424694955348969,
-0.07612613588571548,
-0.0365612767636776,
-0.018007269129157066,
-0.010861414484679699,
0.17731603980064392,
-0.03720300272107124,
0.047100283205509186,
0.007007155101746321,
-0.011742085218429565,
-0.0025886795483529568,
0.1347898244857788,
-0.06383736431598663,
-0.0337129570543766,
-0.009989762678742409,
0.09111381322145462,
-0.0196137186139822,
-0.0001795359858078882,
-0.07936916500329971,
-0.04652620851993561,
0.10103845596313477,
0.06546009331941605,
-0.033305417746305466,
0.059402674436569214,
-0.018890727311372757,
-0.04878129065036774,
0.014587156474590302,
-0.13439863920211792,
0.035097572952508926,
-0.015475664287805557,
-0.09902061522006989,
-0.013448386453092098,
0.03573811799287796,
-0.011983334086835384,
-0.01980391889810562,
0.05547924339771271,
-0.0936957597732544,
-0.005484012421220541,
-0.06545672565698624,
-0.07379406690597534,
0.02156108245253563,
-0.0678924098610878,
-0.007622497156262398,
-0.07466941326856613,
-0.2246389389038086,
-0.04399775341153145,
0.02396652102470398,
-0.04665025323629379,
-0.019988134503364563,
-0.06931205838918686,
-0.06738391518592834,
-0.00977944303303957,
-0.004987826105207205,
0.08064799755811691,
-0.04433959349989891,
0.0850600078701973,
0.008722075261175632,
0.029622334986925125,
0.014931196346879005,
0.04468879476189613,
-0.1084139347076416,
0.014566282741725445,
-0.07980893552303314,
0.08049692958593369,
-0.09044238924980164,
0.027157645672559738,
-0.116459421813488,
-0.12010813504457474,
-0.0059129102155566216,
-0.028767652809619904,
0.04655628651380539,
0.13306453824043274,
-0.1784369796514511,
-0.013755339197814465,
0.14622142910957336,
-0.05477825179696083,
-0.08449370414018631,
0.10323645919561386,
-0.0661768764257431,
0.07302583009004593,
0.07350528985261917,
0.16134600341320038,
0.1606297343969345,
-0.1354302018880844,
-0.02249070443212986,
0.010080775246024132,
0.03539052978157997,
0.040434010326862335,
0.04567277804017067,
0.01683090068399906,
0.0275275781750679,
0.010790486820042133,
-0.08177363127470016,
-0.02109411731362343,
-0.08829274028539658,
-0.09805746376514435,
-0.04995037987828255,
-0.09037687629461288,
0.07668235898017883,
0.04386916384100914,
0.03420119360089302,
-0.06614235043525696,
-0.10165504366159439,
0.13932926952838898,
0.14890000224113464,
-0.052827563136816025,
0.014283334836363792,
-0.08132556080818176,
0.021456386893987656,
-0.03795532509684563,
-0.03366820141673088,
-0.19196514785289764,
-0.14628992974758148,
0.04057758301496506,
-0.01998954266309738,
0.0511227585375309,
0.07096019387245178,
0.073954276740551,
0.05568055808544159,
-0.06108921766281128,
-0.027761848643422127,
-0.09116028994321823,
-0.0012912460369989276,
-0.08556818217039108,
-0.16165117919445038,
-0.05102994292974472,
-0.03579878807067871,
0.1334923803806305,
-0.22877244651317596,
0.02187179960310459,
-0.0036381848622113466,
0.1145668476819992,
0.025271037593483925,
-0.05562504008412361,
0.02115754224359989,
0.0433414950966835,
-0.011190487071871758,
-0.09020529687404633,
0.03735062852501869,
-0.012201204895973206,
-0.08504816144704819,
-0.09726462513208389,
-0.11164739727973938,
0.029381901025772095,
0.054552461951971054,
0.05931243672966957,
-0.11527290940284729,
-0.011845048516988754,
-0.06797458231449127,
-0.0544513501226902,
-0.06280936300754547,
-0.01631447859108448,
0.19492773711681366,
0.006476975046098232,
0.10855584591627121,
-0.055227190256118774,
-0.07229895144701004,
-0.019381485879421234,
0.01029012631624937,
-0.00657914811745286,
0.07914631068706512,
0.0518316775560379,
-0.13342639803886414,
0.0868125930428505,
0.07489161193370819,
-0.05596813187003136,
0.15113826096057892,
-0.04582333564758301,
-0.09309271723031998,
-0.041003286838531494,
0.037499237805604935,
-0.019373977556824684,
0.1054813340306282,
-0.09977403283119202,
-0.0006613809382542968,
0.028547445312142372,
0.024395884945988655,
0.027453070506453514,
-0.16368897259235382,
-0.02154652029275894,
0.034009747207164764,
-0.05554422736167908,
-0.028181729838252068,
0.0053305733017623425,
0.028041759505867958,
0.0869147926568985,
0.01333528757095337,
-0.03211342543363571,
0.01764257624745369,
-0.008624124340713024,
-0.08339057862758636,
0.1807437539100647,
-0.10529245436191559,
-0.14088577032089233,
-0.11564213782548904,
0.03471312299370766,
-0.0772814005613327,
-0.03215760737657547,
0.02039456181228161,
-0.06983713805675507,
-0.02906661294400692,
-0.08906730264425278,
-0.012232457287609577,
-0.04925677552819252,
-0.021600276231765747,
0.017756430432200432,
0.009296354837715626,
0.0708865076303482,
-0.14734108746051788,
0.025217443704605103,
-0.013779859989881516,
-0.13443385064601898,
-0.003215731354430318,
0.020236346870660782,
0.14009471237659454,
0.12048843502998352,
-0.02210761420428753,
0.022337011992931366,
-0.04038817435503006,
0.2222174108028412,
-0.0707215666770935,
0.0010205392027273774,
0.09533270448446274,
0.004812290426343679,
0.049206677824258804,
0.11873438954353333,
0.026414508000016212,
-0.09859777987003326,
0.03775686025619507,
0.0805734246969223,
-0.011646092869341373,
-0.2400171309709549,
-0.03408670425415039,
-0.032730311155319214,
-0.03989603742957115,
0.0967455506324768,
0.04868641123175621,
0.05187363922595978,
0.038196761161088943,
-0.008566569536924362,
0.02785070613026619,
-0.027168279513716698,
0.08319316059350967,
0.096263088285923,
0.023316746577620506,
0.09267660975456238,
-0.02857079543173313,
-0.03907281160354614,
0.06305365264415741,
0.026020178571343422,
0.2705204486846924,
-0.03405598923563957,
0.11201309412717819,
0.04871316999197006,
0.14395567774772644,
-0.05115512013435364,
0.03599221259355545,
0.014902383089065552,
0.0036445308942347765,
0.008707853965461254,
-0.05869192257523537,
-0.017411259934306145,
0.03192387893795967,
0.004659540485590696,
0.051960647106170654,
-0.09438756853342056,
0.02039833925664425,
0.039292726665735245,
0.23083817958831787,
0.05701874941587448,
-0.280874103307724,
-0.08853165805339813,
0.00230906018987298,
-0.03283340111374855,
-0.04717204347252846,
0.014364046044647694,
0.14077261090278625,
-0.1244824007153511,
0.04107467457652092,
-0.04826738312840462,
0.08723582327365875,
-0.052449822425842285,
0.002938752295449376,
0.04810672998428345,
0.09591613709926605,
-0.00796128623187542,
0.10236307233572006,
-0.20306189358234406,
0.21293507516384125,
0.027091603726148605,
0.10118954628705978,
-0.053099628537893295,
0.010734736919403076,
0.0074492874555289745,
0.09939126670360565,
0.12456244975328445,
-0.0018363932613283396,
-0.004384793806821108,
-0.1704183965921402,
-0.06886108964681625,
0.03615977615118027,
0.11275925487279892,
-0.031419530510902405,
0.0939885824918747,
-0.04131198301911354,
0.0041406466625630856,
0.055393703281879425,
-0.06583409011363983,
-0.1609538346529007,
-0.105197474360466,
0.03350658714771271,
0.0019595506601035595,
-0.03554689511656761,
-0.05918937176465988,
-0.10033577680587769,
-0.018690474331378937,
0.18051184713840485,
0.017504863440990448,
-0.037469420582056046,
-0.1304449737071991,
0.0823691114783287,
0.13303586840629578,
-0.07565349340438843,
-0.007579749450087547,
0.036646243184804916,
0.08377186208963394,
0.05165715888142586,
-0.07998625189065933,
0.035707026720047,
-0.048558153212070465,
-0.13816417753696442,
-0.043892670422792435,
0.120078444480896,
0.05958534777164459,
0.05821777507662773,
-0.00332492939196527,
0.008433565497398376,
0.01790166273713112,
-0.09026628732681274,
-0.0020044029224663973,
0.08496097475290298,
0.0820862427353859,
0.06985727697610855,
-0.10803034901618958,
0.058383483439683914,
-0.06119827553629875,
0.018458174541592598,
0.15027861297130585,
0.19480693340301514,
-0.0886356309056282,
0.06141069531440735,
0.058744918555021286,
-0.09186507016420364,
-0.17782729864120483,
0.0771806463599205,
0.08706532418727875,
0.007302515674382448,
0.054355647414922714,
-0.18544861674308777,
0.08354893326759338,
0.12044724076986313,
0.004032164812088013,
0.06160688027739525,
-0.356411874294281,
-0.11502712219953537,
0.06625118851661682,
0.09528017789125443,
0.009306749328970909,
-0.1339748203754425,
-0.02833487093448639,
0.013121514581143856,
-0.14663711190223694,
0.08239219337701797,
-0.06193428486585617,
0.11416678875684738,
-0.010235518217086792,
0.11792133003473282,
0.03284196928143501,
-0.03142205625772476,
0.12894898653030396,
0.08625709265470505,
0.09537740796804428,
-0.05826756730675697,
-0.02069699950516224,
0.10776516050100327,
-0.07424124330282211,
0.08824630826711655,
0.0021981969475746155,
0.09118053317070007,
-0.16928507387638092,
-0.017083216458559036,
-0.08568732440471649,
0.06589353829622269,
-0.05483927205204964,
-0.07589874416589737,
-0.05595790594816208,
0.07424773275852203,
0.07274853438138962,
-0.03306501731276512,
0.05110854655504227,
0.021687472239136696,
0.08173591643571854,
0.05971584841609001,
0.10103030502796173,
0.012839196249842644,
-0.12441476434469223,
-0.01136734802275896,
-0.0055868919007480145,
0.05401153862476349,
-0.1288166642189026,
0.029896171763539314,
0.15012526512145996,
0.06982045620679855,
0.15205205976963043,
0.032439298927783966,
-0.03631158918142319,
-0.010819299146533012,
0.023556537926197052,
-0.12089326977729797,
-0.1451321393251419,
-0.014649923890829086,
-0.10199461132287979,
-0.1573459506034851,
0.03164786845445633,
0.10281910002231598,
-0.061087001115083694,
-0.0020435440819710493,
-0.0056172641925513744,
0.012809502892196178,
-0.022860433906316757,
0.1875590980052948,
0.055848732590675354,
0.053279370069503784,
-0.07183634489774704,
0.10860643535852432,
0.07111188769340515,
-0.052665386348962784,
0.04292850196361542,
0.039710234850645065,
-0.08329950273036957,
-0.03665090352296829,
0.03871936351060867,
0.140919491648674,
-0.04721890389919281,
-0.01560293696820736,
-0.07759525626897812,
-0.02567264623939991,
0.04008680582046509,
0.1292557567358017,
0.05266395956277847,
0.0085499482229352,
-0.04644513130187988,
0.02766392007470131,
-0.1462705135345459,
0.10367158055305481,
0.04661504179239273,
0.07390798628330231,
-0.15065661072731018,
0.13624393939971924,
-0.00978178158402443,
0.05919511988759041,
-0.0196247361600399,
0.006645240355283022,
-0.08554597944021225,
-0.011274542659521103,
-0.12292900681495667,
-0.031000085175037384,
-0.027248157188296318,
0.020568089559674263,
-0.007947302423417568,
-0.06592265516519547,
-0.048164743930101395,
0.0433957502245903,
-0.07116713374853134,
-0.05735240876674652,
0.03842000290751457,
0.06613216549158096,
-0.15045176446437836,
-0.02795075997710228,
0.02800077199935913,
-0.08448322862386703,
0.07113368809223175,
0.055417709052562714,
0.012714657932519913,
0.0354405976831913,
-0.07390586286783218,
-0.021677713841199875,
0.023115379735827446,
0.04512143135070801,
0.06763028353452682,
-0.08052650839090347,
-0.00796305201947689,
-0.03338475152850151,
0.05590968579053879,
0.03160823509097099,
0.04611619561910629,
-0.11494144052267075,
-0.020057139918208122,
-0.07374757528305054,
-0.04573024436831474,
-0.07032541930675507,
0.043249499052762985,
0.09790880233049393,
0.04386154189705849,
0.16187036037445068,
-0.06481503695249557,
0.06255258619785309,
-0.20201070606708527,
-0.046278152614831924,
0.011212076991796494,
-0.0341801680624485,
-0.04172145575284958,
-0.04172052443027496,
0.07738780230283737,
-0.0666443407535553,
0.08882024139165878,
-0.04033287242054939,
0.10267136991024017,
0.02527306228876114,
-0.047287601977586746,
-0.012643096037209034,
-0.01784854382276535,
0.17103274166584015,
0.05818182975053787,
-0.031542953103780746,
0.08722484111785889,
-0.006766168866306543,
0.05163514241576195,
0.034286390990018845,
0.18617607653141022,
0.14142519235610962,
-0.05615759268403053,
0.04746270924806595,
0.08037304133176804,
-0.10061360150575638,
-0.13379083573818207,
0.07146485894918442,
0.003184284083545208,
0.09439445286989212,
-0.047295019030570984,
0.14764931797981262,
0.09116818755865097,
-0.1615091860294342,
0.05986301228404045,
-0.036510542035102844,
-0.1297408491373062,
-0.10333249717950821,
-0.042090803384780884,
-0.08136920630931854,
-0.11064599454402924,
0.030137354508042336,
-0.13714058697223663,
0.03241372108459473,
0.05798095837235451,
0.021128756925463676,
-0.002468815306201577,
0.1863657832145691,
-0.052914056926965714,
0.0120618287473917,
0.05610935017466545,
-0.0057271672412753105,
-0.0021841127891093493,
-0.06926091760396957,
-0.034982919692993164,
0.057389069348573685,
0.01785425655543804,
0.07583096623420715,
-0.0543086938560009,
0.031941503286361694,
0.01756211370229721,
-0.04102940484881401,
-0.06739315390586853,
-0.0025759530253708363,
0.04353508725762367,
0.03054226189851761,
0.032405417412519455,
0.0718672126531601,
-0.012446478009223938,
-0.035495590418577194,
0.26626384258270264,
-0.07048352062702179,
-0.08542486280202866,
-0.14713987708091736,
0.20367400348186493,
0.05040423944592476,
-0.012124688364565372,
0.06909912079572678,
-0.12317673861980438,
0.011791283264756203,
0.17472577095031738,
0.1467847228050232,
-0.049949511885643005,
-0.01006540097296238,
-0.004701677709817886,
-0.0044451383873820305,
-0.040123216807842255,
0.08573959767818451,
0.08520745486021042,
0.013125370256602764,
-0.039743244647979736,
-0.051855139434337616,
-0.005526817869395018,
-0.029508566483855247,
-0.05373920127749443,
0.058867692947387695,
0.02835293859243393,
0.018107132986187935,
-0.04394892230629921,
0.08581068366765976,
-0.011060712859034538,
-0.18385685980319977,
0.04301327094435692,
-0.15769784152507782,
-0.16850458085536957,
-0.024469472467899323,
0.07245940715074539,
-0.016527695581316948,
0.052616752684116364,
-0.027578080072999,
-0.0068796235136687756,
0.1427699476480484,
-0.010557947680354118,
-0.08477587252855301,
-0.11013021320104599,
0.09958361089229584,
-0.04713853821158409,
0.2028806060552597,
-0.004012823570519686,
0.07694704085588455,
0.09923572093248367,
0.00338123831897974,
-0.12879447638988495,
0.05245096981525421,
0.08088873326778412,
-0.054503004997968674,
0.0217300895601511,
0.16358113288879395,
-0.04277164489030838,
0.12513695657253265,
0.0488121472299099,
-0.12740637362003326,
-0.019062863662838936,
-0.010783673264086246,
-0.004942445084452629,
-0.110896036028862,
0.006111595779657364,
-0.052943307906389236,
0.1604195237159729,
0.23216642439365387,
-0.031081968918442726,
0.0214610006660223,
-0.08292635530233383,
0.02247638814151287,
0.06237802654504776,
0.07566479593515396,
-0.03561599180102348,
-0.1766723394393921,
0.011074485257267952,
-0.025905568152666092,
0.029859233647584915,
-0.2158776819705963,
-0.10210727900266647,
0.08609896153211594,
-0.05152461305260658,
-0.030103402212262154,
0.12874647974967957,
0.05162128806114197,
0.03982663154602051,
-0.02316376566886902,
-0.12254847586154938,
-0.0465647391974926,
0.13581393659114838,
-0.17713633179664612,
-0.04967159405350685
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi
This model is a fine-tuned version of [twmkn9/distilbert-base-uncased-squad2](https://huggingface.co/twmkn9/distilbert-base-uncased-squad2) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
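## How to use

As a rough usage sketch (not part of the original card), the checkpoint can be loaded with the generic auto classes and queried like any extractive QA model; this assumes a standard question-answering head, as the repo's pipeline tag indicates. The example question and context are made up for illustration.

```python
# Minimal inference sketch for the fine-tuned checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

model_id = "andi611/distilbert-base-uncased-squad2-with-ner-with-neg-with-multi"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "Which organization is mentioned?"           # illustrative only
context = "The European Commission met in Brussels."    # illustrative only

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end token positions and decode the answer span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```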
|
{"tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "distilbert-base-uncased-squad2-with-ner-with-neg-with-multi", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad2-with-ner-with-neg-with-multi
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:conll2003",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us
|
# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi
This model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
45,
61,
6,
12,
8,
3,
90,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n# distilbert-base-uncased-squad2-with-ner-with-neg-with-multi\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5.0### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08377667516469955,
0.12624859809875488,
-0.0021613389253616333,
0.09186580777168274,
0.1449122577905655,
0.040838029235601425,
0.09639338403940201,
0.10942788422107697,
-0.1271301507949829,
0.061844222247600555,
0.07529900223016739,
0.07232343405485153,
0.022225704044103622,
0.09373793005943298,
-0.03053124248981476,
-0.2554844319820404,
0.007175937760621309,
0.010372807271778584,
-0.0722125768661499,
0.11133471131324768,
0.09282150864601135,
-0.10498446226119995,
0.0618441216647625,
0.004670311696827412,
-0.18565034866333008,
0.026826277375221252,
-0.03275768831372261,
-0.03223852068185806,
0.08990702033042908,
0.010172669775784016,
0.10860294848680496,
0.004330153577029705,
0.1181945651769638,
-0.2157648652791977,
0.0015717742498964071,
0.07568271458148956,
0.023452023044228554,
0.06237158179283142,
0.046730462461709976,
0.006044123321771622,
0.12920604646205902,
-0.13558056950569153,
0.09809619188308716,
0.0326494462788105,
-0.08050595223903656,
-0.11301612108945847,
-0.07736490666866302,
0.09475117921829224,
0.0954323261976242,
0.10757695883512497,
-0.0028026667423546314,
0.14621424674987793,
-0.13478751480579376,
0.0911637470126152,
0.1631198674440384,
-0.2573602795600891,
-0.08282943814992905,
0.06019771099090576,
0.029367320239543915,
0.06317082792520523,
-0.10898672789335251,
-0.035240538418293,
0.04664184898138046,
0.03492534160614014,
0.06382229924201965,
-0.021329794079065323,
-0.10119985044002533,
0.014609803445637226,
-0.15821205079555511,
-0.02061784267425537,
0.20203940570354462,
0.06217571347951889,
-0.03835582360625267,
-0.052264172583818436,
-0.05959264934062958,
-0.0721736028790474,
0.00384121248498559,
-0.05815475806593895,
0.02154475823044777,
-0.05471031740307808,
-0.09384952485561371,
-0.04210931435227394,
-0.07101068645715714,
-0.0571773424744606,
-0.020689766854047775,
0.11504604667425156,
0.06536809355020523,
0.004465870559215546,
-0.040989264845848083,
0.11639092862606049,
0.017784785479307175,
-0.10730978101491928,
-0.02368597500026226,
-0.009808950126171112,
-0.09923869371414185,
-0.06679881364107132,
-0.0520271360874176,
-0.008934248238801956,
-0.004620331339538097,
0.17212042212486267,
-0.031970541924238205,
0.05571991577744484,
0.020821163430809975,
-0.004064141307026148,
-0.01677756942808628,
0.13164804875850677,
-0.05941001698374748,
-0.028805721551179886,
-0.019472181797027588,
0.07851308584213257,
-0.021060572937130928,
-0.0007325918995775282,
-0.0793260782957077,
-0.01960020512342453,
0.09291069209575653,
0.05155570060014725,
-0.050099730491638184,
0.05137626826763153,
-0.02746344357728958,
-0.05956471338868141,
-0.02607775665819645,
-0.1221759021282196,
0.03365917131304741,
-0.0008886920986697078,
-0.07891757041215897,
0.029600616544485092,
0.016584495082497597,
-0.006215689238160849,
-0.03822212293744087,
0.09082042425870895,
-0.10512220859527588,
-0.0002136374096153304,
-0.07986310869455338,
-0.07377830892801285,
0.01614830642938614,
-0.08786530792713165,
-0.014181948266923428,
-0.08341027796268463,
-0.18348027765750885,
-0.03629335016012192,
0.03463231027126312,
-0.02831321395933628,
-0.03792010247707367,
-0.06455048173666,
-0.06181197986006737,
-0.012807458639144897,
0.005535210948437452,
0.07999560981988907,
-0.039199624210596085,
0.08083228021860123,
0.004366551525890827,
0.026907656341791153,
0.01915123499929905,
0.04338867962360382,
-0.08488199859857559,
0.019937774166464806,
-0.07520244270563126,
0.07554740458726883,
-0.08129511028528214,
0.024423006922006607,
-0.10451212525367737,
-0.13351589441299438,
-0.005435519386082888,
-0.02020541951060295,
0.046794719994068146,
0.11897987872362137,
-0.1615172028541565,
-0.036134134978055954,
0.1593552976846695,
-0.06215120479464531,
-0.07526346296072006,
0.11458737403154373,
-0.06236143410205841,
0.03723597154021263,
0.060913678258657455,
0.14823463559150696,
0.13839122653007507,
-0.11458370834589005,
-0.004004909191280603,
-0.004575103521347046,
0.05217823013663292,
0.03926827386021614,
0.04856858029961586,
-0.0008764241356402636,
-0.005004735663533211,
0.01560179702937603,
-0.08035963773727417,
-0.0020685074850916862,
-0.09984558820724487,
-0.09876085072755814,
-0.05028127506375313,
-0.08462323993444443,
0.07493258267641068,
0.03977863863110542,
0.04882432520389557,
-0.060923121869564056,
-0.0871017649769783,
0.141851007938385,
0.13077344000339508,
-0.06058323383331299,
0.01143672689795494,
-0.07872550189495087,
0.024093976244330406,
-0.02976089157164097,
-0.025203805416822433,
-0.19610366225242615,
-0.13601137697696686,
0.026592181995511055,
-0.040487855672836304,
0.053118348121643066,
0.0585772730410099,
0.05418943986296654,
0.07135365903377533,
-0.05665181577205658,
-0.026042301207780838,
-0.10582292824983597,
0.0003296805080026388,
-0.07691001147031784,
-0.17756162583827972,
-0.05441988632082939,
-0.02021368034183979,
0.16004498302936554,
-0.2237398624420166,
0.02937646210193634,
-0.023103896528482437,
0.12476339191198349,
0.012186587788164616,
-0.05498357117176056,
-0.010308757424354553,
0.0765904113650322,
-0.018929701298475266,
-0.07705385237932205,
0.043047793209552765,
-0.00967330764979124,
-0.09528273344039917,
-0.11383543163537979,
-0.10494083166122437,
0.05288400501012802,
0.06722217053174973,
0.01879963092505932,
-0.10500248521566391,
-0.0032366812229156494,
-0.0787280946969986,
-0.05515052005648613,
-0.07437129318714142,
0.003341027069836855,
0.17672888934612274,
-0.014624303206801414,
0.11167684197425842,
-0.05355839431285858,
-0.06340960413217545,
-0.013418869115412235,
0.0030932212248444557,
-0.0051588942296803,
0.08323247730731964,
0.1079496368765831,
-0.12858246266841888,
0.0950816348195076,
0.08140034228563309,
-0.0922466591000557,
0.15489083528518677,
-0.06033986434340477,
-0.086883544921875,
-0.03780997917056084,
0.02319520153105259,
-0.008488951250910759,
0.09737466275691986,
-0.12766337394714355,
0.007496931590139866,
0.02629171870648861,
0.02825772948563099,
0.038203977048397064,
-0.1722986102104187,
-0.019800525158643723,
0.03143543377518654,
-0.040164101868867874,
-0.0522383414208889,
-0.0070139942690730095,
0.03177869692444801,
0.08696123212575912,
0.014189762994647026,
-0.008701225742697716,
0.025506095960736275,
0.002400388941168785,
-0.09581082314252853,
0.1957044005393982,
-0.1223868727684021,
-0.12343120574951172,
-0.11881009489297867,
0.0473516508936882,
-0.1003466472029686,
-0.03199477866292,
0.028797997161746025,
-0.09911321848630905,
-0.031855471432209015,
-0.07590498775243759,
0.00219420762732625,
-0.06381598860025406,
-0.005206212401390076,
0.026604129001498222,
0.004541876260191202,
0.0712743028998375,
-0.13853490352630615,
0.019972017034888268,
-0.018757540732622147,
-0.11996300518512726,
0.011967950500547886,
0.023630771785974503,
0.13843262195587158,
0.14109674096107483,
-0.0058516766875982285,
0.0224084984511137,
-0.028233662247657776,
0.2207602560520172,
-0.06493882089853287,
-0.01626885123550892,
0.09236111491918564,
0.004080017097294331,
0.04955664277076721,
0.09117674827575684,
0.03912811353802681,
-0.09097272902727127,
0.03286164626479149,
0.08532378077507019,
-0.018057536333799362,
-0.24200795590877533,
-0.05142565444111824,
-0.051896337419748306,
-0.05080626532435417,
0.09969540685415268,
0.036505524069070816,
0.04856199398636818,
0.04709145799279213,
-0.010219710879027843,
0.039504677057266235,
-0.030516894534230232,
0.08757677674293518,
0.11011574417352676,
0.03509389981627464,
0.09891965985298157,
-0.030829792842268944,
-0.04984692484140396,
0.06323838233947754,
-0.011240135878324509,
0.3021204471588135,
-0.014027449302375317,
0.0743478536605835,
0.07284847646951675,
0.15527652204036713,
-0.03184390440583229,
0.04824664071202278,
0.018073003739118576,
-0.01035032607614994,
0.015898074954748154,
-0.05268806219100952,
-0.015249400399625301,
0.02580692619085312,
0.02717278152704239,
0.04903332144021988,
-0.10004495829343796,
0.026447193697094917,
0.04659079387784004,
0.2414924055337906,
0.031622663140296936,
-0.2745228409767151,
-0.09237301349639893,
-0.0021471551153808832,
-0.036425039172172546,
-0.036905717104673386,
0.019300082698464394,
0.1287136673927307,
-0.13120920956134796,
0.048619844019412994,
-0.05274137854576111,
0.09210575371980667,
-0.025464249774813652,
0.0013099733041599393,
0.056331101804971695,
0.11770199239253998,
0.0013867696980014443,
0.09434927999973297,
-0.22362516820430756,
0.21739105880260468,
0.007359122857451439,
0.09781872481107712,
-0.051345955580472946,
0.01377839408814907,
0.010462690144777298,
0.0982847809791565,
0.08416643738746643,
0.0109717333689332,
-0.004338584840297699,
-0.15978701412677765,
-0.03934456408023834,
0.04209894686937332,
0.12759354710578918,
-0.040637094527482986,
0.09334369003772736,
-0.036694224923849106,
0.011875000782310963,
0.05541178956627846,
-0.04659934714436531,
-0.16931185126304626,
-0.12934452295303345,
0.020349761471152306,
-0.013392079621553421,
-0.0319201685488224,
-0.0704197958111763,
-0.10157058387994766,
-0.027398884296417236,
0.18764911592006683,
-0.02268761396408081,
-0.02633231319487095,
-0.13490258157253265,
0.08063655346632004,
0.10938140004873276,
-0.06805942207574844,
0.010623234324157238,
0.029936200007796288,
0.0832098200917244,
0.046255484223365784,
-0.08621608465909958,
0.03914792090654373,
-0.06808901578187943,
-0.14686866104602814,
-0.05087799206376076,
0.11853531748056412,
0.07100459188222885,
0.054745499044656754,
-0.011814083904027939,
0.0012952762190252542,
0.017677640542387962,
-0.10550180822610855,
-0.0008405100088566542,
0.11196605116128922,
0.08666128665208817,
0.07927245646715164,
-0.1126406341791153,
0.0427699014544487,
-0.05698303505778313,
-0.0023641453590244055,
0.1449265331029892,
0.17096161842346191,
-0.08182696253061295,
0.04234197363257408,
0.05606943741440773,
-0.08206376433372498,
-0.15832065045833588,
0.0738261491060257,
0.11083970218896866,
0.007794979959726334,
0.034513264894485474,
-0.20026759803295135,
0.11835825443267822,
0.13592319190502167,
0.01504850946366787,
0.04794280603528023,
-0.368386447429657,
-0.11837788671255112,
0.06396598368883133,
0.10686437040567398,
0.03424318507313728,
-0.13500595092773438,
-0.021738145500421524,
0.0017259607557207346,
-0.15726450085639954,
0.09831748902797699,
-0.07973433285951614,
0.10769084095954895,
-0.010316668078303337,
0.09633243083953857,
0.01921910233795643,
-0.03897184878587723,
0.14086592197418213,
0.08923830091953278,
0.07913744449615479,
-0.05218437314033508,
-0.013661312870681286,
0.10309872776269913,
-0.05822976306080818,
0.05282514542341232,
-0.004507495556026697,
0.08053087443113327,
-0.1519233137369156,
-0.025034956634044647,
-0.08599194139242172,
0.05760277435183525,
-0.05721097066998482,
-0.07258886843919754,
-0.05035385116934776,
0.059705741703510284,
0.08505330979824066,
-0.03211640939116478,
0.04455069452524185,
0.02675214596092701,
0.10934973508119583,
0.04092007875442505,
0.09676937758922577,
0.010732094757258892,
-0.13385160267353058,
-0.03887234255671501,
-0.008308831602334976,
0.06366455554962158,
-0.10138818621635437,
0.02078029327094555,
0.14065256714820862,
0.06835818290710449,
0.15879088640213013,
0.056770309805870056,
-0.03308720141649246,
0.002099572913721204,
0.03211298957467079,
-0.12904328107833862,
-0.159125417470932,
-0.014214510098099709,
-0.10092085599899292,
-0.1446443796157837,
0.0396902821958065,
0.08920038491487503,
-0.05070686340332031,
0.0006468362989835441,
-0.007993083447217941,
0.005912691354751587,
-0.03725213557481766,
0.18299774825572968,
0.04579377546906471,
0.05785590037703514,
-0.0763898715376854,
0.10226201266050339,
0.06250814348459244,
-0.08126145601272583,
0.031004955992102623,
0.04873352125287056,
-0.09402041137218475,
-0.028758930042386055,
0.045074645429849625,
0.145476296544075,
-0.0634005069732666,
-0.025500843301415443,
-0.08638904988765717,
-0.0798848494887352,
0.05966413766145706,
0.12303241342306137,
0.056140534579753876,
0.0028625549748539925,
-0.06570226699113846,
0.0443015955388546,
-0.14843758940696716,
0.07832807302474976,
0.05701643228530884,
0.08206168562173843,
-0.14984607696533203,
0.15589942038059235,
0.004213047679513693,
0.04321407526731491,
-0.01097043976187706,
0.0025751322973519564,
-0.10701853781938553,
-0.012561521492898464,
-0.13261209428310394,
-0.04390840232372284,
-0.03738756477832794,
0.01774238608777523,
0.00041124640847556293,
-0.05063590407371521,
-0.04962274059653282,
0.04661766067147255,
-0.07336513698101044,
-0.05603829026222229,
0.032044023275375366,
0.0616932138800621,
-0.1354467123746872,
0.003514524782076478,
0.025695962831377983,
-0.09856827557086945,
0.07169638574123383,
0.07297024875879288,
0.012738410383462906,
0.042230527848005295,
-0.0944146141409874,
-0.017247365787625313,
0.03539817035198212,
0.040310993790626526,
0.0659184455871582,
-0.07339923083782196,
-0.000746925245039165,
-0.02440587803721428,
0.05377393215894699,
0.03068874217569828,
0.04171079024672508,
-0.11203499883413315,
-0.009793572127819061,
-0.060748904943466187,
-0.07198581844568253,
-0.08189361542463303,
0.03141573816537857,
0.09460382163524628,
0.03280258551239967,
0.17378737032413483,
-0.07225225120782852,
0.06227044388651848,
-0.19113905727863312,
-0.04421956092119217,
0.013611922971904278,
-0.028731239959597588,
-0.0375613197684288,
-0.045746318995952606,
0.06891466677188873,
-0.0626981258392334,
0.08917099982500076,
-0.048595719039440155,
0.0760456845164299,
0.026095248758792877,
-0.04349757730960846,
-0.00930511113256216,
-0.005293113179504871,
0.20300908386707306,
0.05820660665631294,
-0.03148038312792778,
0.05690549314022064,
-0.009369807317852974,
0.046295151114463806,
0.04824686795473099,
0.21344292163848877,
0.1426687091588974,
-0.08463440835475922,
0.04531287029385567,
0.0735735073685646,
-0.08971810340881348,
-0.15163138508796692,
0.07166227698326111,
-0.017086366191506386,
0.10600858181715012,
-0.03337101265788078,
0.1547233760356903,
0.09830540418624878,
-0.17789432406425476,
0.048373643308877945,
-0.028413796797394753,
-0.12102438509464264,
-0.10220802575349808,
-0.046631064265966415,
-0.07950141280889511,
-0.12698878347873688,
0.0366644561290741,
-0.1394302248954773,
0.03311457112431526,
0.07421048730611801,
0.03257622942328453,
-0.0005061675328761339,
0.18422628939151764,
-0.030747761949896812,
0.008539403788745403,
0.05508320778608322,
-0.0015583140775561333,
-0.019313834607601166,
-0.05508655309677124,
-0.04409271106123924,
0.03587548807263374,
-0.012735691852867603,
0.08041418343782425,
-0.056131813675165176,
-0.005206833127886057,
0.02153245359659195,
-0.038136184215545654,
-0.05295773223042488,
0.01701461896300316,
0.01781618222594261,
0.028736354783177376,
0.0519535094499588,
0.052562158554792404,
-0.024767696857452393,
-0.038357798010110855,
0.24050447344779968,
-0.07121256738901138,
-0.09680203348398209,
-0.1545623391866684,
0.2342175543308258,
0.06129840761423111,
-0.01739351823925972,
0.08342106640338898,
-0.11134551465511322,
0.0033488404005765915,
0.17731519043445587,
0.1520920693874359,
-0.04464757815003395,
-0.012951920740306377,
0.0085277259349823,
-0.012783779762685299,
-0.0406552292406559,
0.11162073165178299,
0.09399691969156265,
0.057457100600004196,
-0.029141515493392944,
-0.03785303607583046,
-0.02189778722822666,
-0.026530789211392403,
-0.0665297731757164,
0.06615389883518219,
0.05012352764606476,
-0.010416153818368912,
-0.03826012462377548,
0.07984071224927902,
-0.03176279738545418,
-0.14187496900558472,
0.07060074806213379,
-0.14044469594955444,
-0.17044030129909515,
-0.032098885625600815,
0.07541300356388092,
-0.023870110511779785,
0.05984106287360191,
-0.027660313993692398,
-0.028874661773443222,
0.14759795367717743,
0.00008901089313440025,
-0.07742289453744888,
-0.10843171924352646,
0.10371953248977661,
-0.04244426637887955,
0.1821063756942749,
-0.006365304347127676,
0.07941899448633194,
0.10795927047729492,
0.03269209340214729,
-0.09378638863563538,
0.058543942868709564,
0.07458137720823288,
-0.06589244306087494,
0.015107615850865841,
0.14858804643154144,
-0.04507753625512123,
0.10055025666952133,
0.04714122414588928,
-0.1380874663591385,
-0.011903694830834866,
-0.01645311899483204,
-0.009406225755810738,
-0.0952218621969223,
0.020390592515468597,
-0.07010222226381302,
0.1496729999780655,
0.2486104518175125,
-0.03290712833404541,
0.008885546587407589,
-0.07999779284000397,
0.036298967897892,
0.05386994406580925,
0.09265796840190887,
-0.034434858709573746,
-0.19632911682128906,
0.011673693545162678,
-0.05192546918988228,
0.023318786174058914,
-0.2267792671918869,
-0.11008300632238388,
0.06859547644853592,
-0.051678840070962906,
-0.05049583688378334,
0.11992605775594711,
0.07032237946987152,
0.040122535079717636,
-0.0339965857565403,
-0.13109759986400604,
-0.05551880598068237,
0.14034754037857056,
-0.15343163907527924,
-0.050491124391555786
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat
This model is a fine-tuned version of [twmkn9/distilbert-base-uncased-squad2](https://huggingface.co/twmkn9/distilbert-base-uncased-squad2) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
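The card above does not include a usage example, so the following is a minimal, illustrative sketch (not part of the original card) of querying this checkpoint with the `transformers` question-answering pipeline; the question/context pair is invented, and the checkpoint id is simply this model's repository name.

```python
# Minimal sketch: extractive QA with this fine-tuned checkpoint.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="andi611/distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat",
)

# Hypothetical inputs; any question/context pair is handled the same way.
result = qa(
    question="Which organization is mentioned?",
    context="The European Commission met with representatives of Germany on Thursday.",
)
print(result)  # dict with 'score', 'start', 'end', and 'answer'
```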
|
{"tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:conll2003",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us
|
# distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat
This model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
45,
63,
6,
12,
8,
3,
90,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n# distilbert-base-uncased-squad2-with-ner-with-neg-with-repeat\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08582551032304764,
0.14222602546215057,
-0.0021311738528311253,
0.10169481486082077,
0.14611761271953583,
0.036837272346019745,
0.07338898628950119,
0.12399827688932419,
-0.12155384570360184,
0.05251523107290268,
0.0778205394744873,
0.06972306966781616,
0.02489965595304966,
0.0992971733212471,
-0.03290172666311264,
-0.24822862446308136,
-0.00019010488176718354,
0.007933249697089195,
-0.059574246406555176,
0.11123473197221756,
0.09041738510131836,
-0.1055879294872284,
0.06706506013870239,
-0.005110539961606264,
-0.1744271069765091,
0.03037690743803978,
-0.03636103868484497,
-0.026336299255490303,
0.09096716344356537,
0.00890294648706913,
0.1035921722650528,
-0.0015932876849547029,
0.12528079748153687,
-0.21141213178634644,
-0.001290134503506124,
0.07427042722702026,
0.030224941670894623,
0.06782156229019165,
0.030173655599355698,
0.008106782101094723,
0.10022404044866562,
-0.14865803718566895,
0.09254875779151917,
0.027165226638317108,
-0.08324173092842102,
-0.10783975571393967,
-0.07804962247610092,
0.08636066317558289,
0.09150237590074539,
0.10933171212673187,
-0.0005063741118647158,
0.1521303355693817,
-0.11721751093864441,
0.08959654718637466,
0.180610790848732,
-0.2585587799549103,
-0.07370678335428238,
0.053965602070093155,
0.03163880854845047,
0.0645298957824707,
-0.11207607388496399,
-0.036159295588731766,
0.05092022195458412,
0.031238913536071777,
0.06963451951742172,
-0.024498026818037033,
-0.11864491552114487,
0.018698055297136307,
-0.15116403996944427,
-0.023708121851086617,
0.2147594690322876,
0.05825494974851608,
-0.03374185040593147,
-0.05740009620785713,
-0.05571296438574791,
-0.08863331377506256,
0.005623535253107548,
-0.0609453059732914,
0.029532216489315033,
-0.05869554355740547,
-0.08727947622537613,
-0.051463186740875244,
-0.06677725911140442,
-0.06548396497964859,
-0.015019381418824196,
0.09098395705223083,
0.06304755061864853,
0.00043628603452816606,
-0.03632810711860657,
0.11850631237030029,
0.002736251801252365,
-0.10566497594118118,
-0.024333300068974495,
-0.012723131105303764,
-0.09520772844552994,
-0.06792127341032028,
-0.04523969441652298,
-0.011702168732881546,
-0.010034378618001938,
0.15923219919204712,
-0.03920808434486389,
0.04600772261619568,
0.01889946311712265,
-0.0006731775356456637,
-0.012844596058130264,
0.1342727690935135,
-0.06686960160732269,
-0.028863567858934402,
-0.01546292845159769,
0.08358285576105118,
-0.02703813835978508,
0.004556413274258375,
-0.08097774535417557,
-0.0345052033662796,
0.09697303920984268,
0.05831844359636307,
-0.04990333691239357,
0.06263160705566406,
-0.027149394154548645,
-0.053835585713386536,
-0.008668872527778149,
-0.1286386400461197,
0.03049072064459324,
-0.013936081901192665,
-0.09835554659366608,
-0.0033809440210461617,
0.0329374223947525,
-0.008235159330070019,
-0.03773711249232292,
0.07272075116634369,
-0.10108727961778641,
-0.006438500247895718,
-0.07835612446069717,
-0.0770992711186409,
0.006956484634429216,
-0.08286815136671066,
-0.004016163758933544,
-0.08459333330392838,
-0.20677366852760315,
-0.03969695046544075,
0.025112489238381386,
-0.03350386768579483,
-0.039697736501693726,
-0.07399182766675949,
-0.06704369187355042,
-0.019334832206368446,
0.0032310066744685173,
0.08156327158212662,
-0.04029001295566559,
0.08605413138866425,
-0.001278347335755825,
0.026588039472699165,
0.018508022651076317,
0.04717973992228508,
-0.10041145235300064,
0.02181130461394787,
-0.0745663046836853,
0.07071664184331894,
-0.08063077926635742,
0.033245451748371124,
-0.10649138689041138,
-0.13020601868629456,
-0.006621450185775757,
-0.023089565336704254,
0.05310799553990364,
0.12059962004423141,
-0.17163383960723877,
-0.023970497772097588,
0.15769805014133453,
-0.04813086614012718,
-0.0813513696193695,
0.10621935874223709,
-0.06267648935317993,
0.053203366696834564,
0.060927461832761765,
0.16176751255989075,
0.14501871168613434,
-0.12100585550069809,
-0.004518276546150446,
-0.007145735435187817,
0.04008849337697029,
0.0384976863861084,
0.03447580710053444,
0.004891096148639917,
0.006532794330269098,
0.017134515568614006,
-0.08171617984771729,
-0.011530800722539425,
-0.09921684861183167,
-0.09730249643325806,
-0.05004177242517471,
-0.0992090106010437,
0.08128798007965088,
0.04033990949392319,
0.045147914439439774,
-0.06369107216596603,
-0.09113294631242752,
0.14333122968673706,
0.13380694389343262,
-0.05649638548493385,
0.0063326857052743435,
-0.08002230525016785,
0.0252387598156929,
-0.021933147683739662,
-0.02996988035738468,
-0.18784888088703156,
-0.12995073199272156,
0.03064780682325363,
-0.018346738070249557,
0.05544363334774971,
0.06626905500888824,
0.062705397605896,
0.05995917692780495,
-0.05634712055325508,
-0.029232485219836235,
-0.09453660994768143,
0.0021290674339979887,
-0.08731627464294434,
-0.1664196401834488,
-0.045053429901599884,
-0.018905777484178543,
0.14355367422103882,
-0.2387954741716385,
0.024129407480359077,
-0.03878391906619072,
0.1218489333987236,
0.014613817445933819,
-0.053681813180446625,
0.0008711017435416579,
0.062354348599910736,
-0.011427663266658783,
-0.08177479356527328,
0.041782550513744354,
-0.010171961039304733,
-0.08355654031038284,
-0.10828390717506409,
-0.11317245662212372,
0.035728130489587784,
0.06202660873532295,
0.02778332121670246,
-0.11651266366243362,
-0.018756810575723648,
-0.0768996924161911,
-0.05281509831547737,
-0.06658133119344711,
-0.013653811067342758,
0.1888495236635208,
-0.013990487903356552,
0.11734601855278015,
-0.052276864647865295,
-0.0730028823018074,
-0.019704537466168404,
0.001978221582248807,
-0.004886580631136894,
0.08512167632579803,
0.09222047030925751,
-0.12511949241161346,
0.09239747375249863,
0.07833807170391083,
-0.07583560049533844,
0.15486232936382294,
-0.04811419919133186,
-0.08758848905563354,
-0.031018879264593124,
0.02327396720647812,
-0.015740031376481056,
0.10033764690160751,
-0.11098508536815643,
0.0016833358677104115,
0.02831409126520157,
0.025417175143957138,
0.036522556096315384,
-0.16780255734920502,
-0.026409504935145378,
0.03693471848964691,
-0.04205184057354927,
-0.046408407390117645,
-0.009083387441933155,
0.03525524213910103,
0.08955679833889008,
0.015111292712390423,
-0.01498142909258604,
0.020703746005892754,
-0.0005554659874178469,
-0.08996261656284332,
0.17961101233959198,
-0.12573182582855225,
-0.12945088744163513,
-0.11489639431238174,
0.058017902076244354,
-0.07593442499637604,
-0.028012054041028023,
0.02309194765985012,
-0.09154298156499863,
-0.032013654708862305,
-0.08011419326066971,
-0.008681954815983772,
-0.05826030671596527,
-0.015484675765037537,
0.026647299528121948,
0.010311972349882126,
0.07514641433954239,
-0.13985605537891388,
0.02082251012325287,
-0.01490557100623846,
-0.12370527535676956,
0.009628847241401672,
0.015419629402458668,
0.14186562597751617,
0.12946026027202606,
-0.017652912065386772,
0.023890716955065727,
-0.032000862061977386,
0.2337644100189209,
-0.06535207480192184,
-0.007112112361937761,
0.09125908464193344,
0.006565948016941547,
0.04898809641599655,
0.10201243311166763,
0.02523273229598999,
-0.09595222026109695,
0.040496595203876495,
0.0833660140633583,
-0.013973064720630646,
-0.23686812818050385,
-0.04402777552604675,
-0.04058808088302612,
-0.051016297191381454,
0.10273633152246475,
0.03812796622514725,
0.05208465829491615,
0.046201735734939575,
-0.005047956015914679,
0.03964037820696831,
-0.029718805104494095,
0.08196999132633209,
0.09422213584184647,
0.027509931474924088,
0.09230320155620575,
-0.02307504042983055,
-0.04714604467153549,
0.05819033086299896,
0.0019568661227822304,
0.2935417592525482,
-0.030534928664565086,
0.08213755488395691,
0.05947774276137352,
0.1620016098022461,
-0.040912602096796036,
0.04178539291024208,
0.014423922635614872,
-0.00425003794953227,
0.010601853020489216,
-0.04971485957503319,
-0.02010984905064106,
0.02946416288614273,
0.019229117780923843,
0.05544579029083252,
-0.10181838274002075,
0.028331631794571877,
0.04549887776374817,
0.238694429397583,
0.03911292180418968,
-0.28179335594177246,
-0.08829517662525177,
-0.002569210948422551,
-0.029316818341612816,
-0.045459602028131485,
0.02125760354101658,
0.13632316887378693,
-0.13057628273963928,
0.03194727376103401,
-0.04436063766479492,
0.09235930442810059,
-0.033144112676382065,
0.007882790639996529,
0.05156955122947693,
0.11455351859331131,
-0.0031638608779758215,
0.09912144392728806,
-0.21150001883506775,
0.21405711770057678,
0.011900782585144043,
0.09972836822271347,
-0.05137651413679123,
0.0069177718833088875,
0.00592920184135437,
0.08982561528682709,
0.10814127326011658,
0.009516846388578415,
0.0066283768974244595,
-0.17068110406398773,
-0.04012700915336609,
0.038471002131700516,
0.12204797565937042,
-0.035581428557634354,
0.09296676516532898,
-0.04575048387050629,
0.010381272993981838,
0.05678851529955864,
-0.05543462187051773,
-0.1631467044353485,
-0.11682498455047607,
0.017061827704310417,
-0.01138462033122778,
-0.022892067208886147,
-0.06272783130407333,
-0.09042108058929443,
-0.03872506320476532,
0.19014081358909607,
-0.009816951118409634,
-0.03556614741683006,
-0.12610448896884918,
0.08235260099172592,
0.11899842321872711,
-0.06769903004169464,
0.0031537774484604597,
0.041608892381191254,
0.08464035391807556,
0.05229194089770317,
-0.07964885234832764,
0.03975222632288933,
-0.060814905911684036,
-0.14298133552074432,
-0.05172228813171387,
0.11714453250169754,
0.07115641981363297,
0.059894852340221405,
-0.0048003350384533405,
0.004904287401586771,
0.01328063290566206,
-0.09271416068077087,
0.005010698456317186,
0.10395453870296478,
0.088038370013237,
0.07284563034772873,
-0.1186039000749588,
0.06980060786008835,
-0.06657880544662476,
0.0020288543310016394,
0.14861392974853516,
0.18467094004154205,
-0.0898786261677742,
0.04487668722867966,
0.046755120158195496,
-0.09046491980552673,
-0.16450147330760956,
0.07781995832920074,
0.10184477269649506,
0.007487035822123289,
0.04476313665509224,
-0.19005024433135986,
0.10387372970581055,
0.1336791217327118,
0.014545687474310398,
0.05437680333852768,
-0.3625195622444153,
-0.1175193041563034,
0.07225566357374191,
0.10818769782781601,
0.01777925342321396,
-0.12817800045013428,
-0.018936477601528168,
0.0041217440739274025,
-0.15225303173065186,
0.0902915745973587,
-0.062093380838632584,
0.10520479083061218,
-0.007408461067825556,
0.11709577590227127,
0.029744533821940422,
-0.03490792587399483,
0.13254277408123016,
0.0776091068983078,
0.09015877544879913,
-0.061919912695884705,
-0.014702572487294674,
0.09136118739843369,
-0.06418321281671524,
0.06645683944225311,
0.0038907642010599375,
0.08726786077022552,
-0.15948912501335144,
-0.021161096170544624,
-0.08604801446199417,
0.056682758033275604,
-0.052842069417238235,
-0.07973387837409973,
-0.0526120625436306,
0.06848819553852081,
0.0842437669634819,
-0.03219841048121452,
0.03222073242068291,
0.02142276056110859,
0.09786424040794373,
0.043236520141363144,
0.10012015700340271,
0.014589055441319942,
-0.11793706566095352,
-0.020040888339281082,
-0.004549815319478512,
0.05856122449040413,
-0.0991322323679924,
0.025507677346467972,
0.15072156488895416,
0.06717102974653244,
0.15986423194408417,
0.046809613704681396,
-0.032830823212862015,
-0.007956989109516144,
0.026183143258094788,
-0.12914806604385376,
-0.15715265274047852,
-0.01665477454662323,
-0.0961865708231926,
-0.14668022096157074,
0.023757323622703552,
0.09041330963373184,
-0.0599999874830246,
-0.005072290543466806,
-0.008637790568172932,
0.012356477789580822,
-0.029280826449394226,
0.18739208579063416,
0.05700133368372917,
0.05522819608449936,
-0.07310307025909424,
0.10621598362922668,
0.06277631968259811,
-0.060618992894887924,
0.04328467696905136,
0.042147211730480194,
-0.08924876898527145,
-0.029531117528676987,
0.049748826771974564,
0.1426488161087036,
-0.057286638766527176,
-0.019813118502497673,
-0.08313161879777908,
-0.06046918407082558,
0.052175372838974,
0.11489862203598022,
0.048270199447870255,
-0.002670952118933201,
-0.050248462706804276,
0.03761586919426918,
-0.15391382575035095,
0.08787363022565842,
0.05937821418046951,
0.07735142856836319,
-0.14989402890205383,
0.15731322765350342,
-0.009762094356119633,
0.051736440509557724,
-0.01332930102944374,
0.011709350161254406,
-0.10300599038600922,
-0.008421738632023335,
-0.12955349683761597,
-0.040816500782966614,
-0.029100747779011726,
0.014691566117107868,
-0.002767286729067564,
-0.0534026063978672,
-0.04663921147584915,
0.04475417360663414,
-0.0743437111377716,
-0.05632305145263672,
0.03991900011897087,
0.05858349800109863,
-0.13954225182533264,
-0.010032422840595245,
0.03217608854174614,
-0.08669009804725647,
0.06641735136508942,
0.06453127413988113,
0.01166035607457161,
0.04110970348119736,
-0.09087848663330078,
-0.018029535189270973,
0.031462423503398895,
0.04101477935910225,
0.07276761531829834,
-0.07347923517227173,
-0.007729028817266226,
-0.03126151114702225,
0.04889872670173645,
0.027559375390410423,
0.03645298257470131,
-0.11715645343065262,
-0.014475539326667786,
-0.05906136706471443,
-0.056289371103048325,
-0.08290383964776993,
0.03330874443054199,
0.09265273809432983,
0.03924577683210373,
0.16424678266048431,
-0.07034199684858322,
0.06006985530257225,
-0.19091004133224487,
-0.046873874962329865,
0.012306461110711098,
-0.029978571459650993,
-0.01981820911169052,
-0.03820300102233887,
0.07404347509145737,
-0.06517256051301956,
0.0883142426609993,
-0.03768423572182655,
0.08299940824508667,
0.0236879400908947,
-0.04802120849490166,
-0.021916363388299942,
-0.00824918132275343,
0.18316610157489777,
0.04914722591638565,
-0.036200787872076035,
0.05984644591808319,
-0.0058812848292291164,
0.04989597573876381,
0.04806295409798622,
0.2039537876844406,
0.14253465831279755,
-0.07473994046449661,
0.03541134297847748,
0.07278000563383102,
-0.10069311410188675,
-0.1403992921113968,
0.07079451531171799,
-0.0020656620617955923,
0.10352922230958939,
-0.038750309497117996,
0.16248390078544617,
0.0967789962887764,
-0.17194655537605286,
0.057665176689624786,
-0.042447227984666824,
-0.12966929376125336,
-0.09723160415887833,
-0.051550909876823425,
-0.07820869237184525,
-0.11693648248910904,
0.0341065376996994,
-0.1414414346218109,
0.032159268856048584,
0.06448694318532944,
0.02437625825405121,
-0.00669151870533824,
0.1813955307006836,
-0.03499910235404968,
0.008321749046444893,
0.051797714084386826,
-0.003476453712210059,
-0.016621602699160576,
-0.07654288411140442,
-0.04175518453121185,
0.048884980380535126,
0.003122597699984908,
0.07660851627588272,
-0.05510802939534187,
0.004904521629214287,
0.01063646748661995,
-0.03677443414926529,
-0.051519714295864105,
0.008262665010988712,
0.031182963401079178,
0.02047138847410679,
0.03279852867126465,
0.06099211424589157,
-0.020603125914931297,
-0.04066691920161247,
0.26530027389526367,
-0.07562411576509476,
-0.09820602089166641,
-0.1524038314819336,
0.22049547731876373,
0.06243821606040001,
-0.014766070991754532,
0.08099794387817383,
-0.11887293308973312,
0.014760857447981834,
0.19140729308128357,
0.15825368463993073,
-0.05674535781145096,
-0.010448083281517029,
-0.007871227338910103,
-0.007514892611652613,
-0.042562007904052734,
0.09772759675979614,
0.10681135952472687,
0.03234599158167839,
-0.03527820482850075,
-0.028295207768678665,
-0.02205432765185833,
-0.02026863396167755,
-0.06345081329345703,
0.06611768156290054,
0.04881525784730911,
0.0019610051531344652,
-0.04214205592870712,
0.0757095143198967,
-0.033045198768377304,
-0.15692506730556488,
0.04145057871937752,
-0.1444685459136963,
-0.16614282131195068,
-0.032947711646556854,
0.06270325928926468,
-0.020693853497505188,
0.057236652821302414,
-0.026742611080408096,
-0.015871359035372734,
0.1481669545173645,
-0.004055749159306288,
-0.07653551548719406,
-0.10651955008506775,
0.09508480876684189,
-0.042799800634384155,
0.19884741306304932,
-0.006027340888977051,
0.06838610768318176,
0.09935923665761948,
0.02293138951063156,
-0.10770627111196518,
0.05645374953746796,
0.08009903877973557,
-0.05516968294978142,
0.020081911236047745,
0.15535223484039307,
-0.04209888353943825,
0.1049845814704895,
0.04784242436289787,
-0.13500374555587769,
-0.01801261119544506,
-0.02227938361465931,
-0.004939413629472256,
-0.09550273418426514,
0.01448024157434702,
-0.06271038204431534,
0.15605410933494568,
0.24739910662174225,
-0.0333867110311985,
0.020143786445260048,
-0.08735236525535583,
0.025004027411341667,
0.05482039600610733,
0.08302000910043716,
-0.03187571093440056,
-0.18003855645656586,
0.004541716538369656,
-0.025692347437143326,
0.020924808457493782,
-0.22545894980430603,
-0.10336926579475403,
0.06113845854997635,
-0.054087188094854355,
-0.0343872494995594,
0.12579049170017242,
0.0680900439620018,
0.042999330908060074,
-0.032924313098192215,
-0.0991155207157135,
-0.052907273173332214,
0.141610249876976,
-0.15455755591392517,
-0.04987258091568947
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-squad2-with-ner-with-neg
This model is a fine-tuned version of [twmkn9/distilbert-base-uncased-squad2](https://huggingface.co/twmkn9/distilbert-base-uncased-squad2) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
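The hyperparameters listed above map directly onto `transformers` `TrainingArguments`. The sketch below is an illustrative reconstruction under that assumption, not the author's actual training script; converting conll2003 into SQuAD-style QA features is assumed and omitted.

```python
# Illustrative reconstruction of the listed configuration (not the original script).
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "twmkn9/distilbert-base-uncased-squad2"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForQuestionAnswering.from_pretrained(base)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-squad2-with-ner-with-neg",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=2.0,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_features,  # hypothetical prepared QA features
#                   eval_dataset=eval_features)    # hypothetical
# trainer.train()
```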
|
{"tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "distilbert-base-uncased-squad2-with-ner-with-neg", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad2-with-ner-with-neg
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:conll2003",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us
|
# distilbert-base-uncased-squad2-with-ner-with-neg
This model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-squad2-with-ner-with-neg\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-squad2-with-ner-with-neg\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
45,
57,
6,
12,
8,
3,
105,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n# distilbert-base-uncased-squad2-with-ner-with-neg\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 500\n- num_epochs: 2.0### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.09400216490030289,
0.1302441507577896,
-0.002753168111667037,
0.10021708160638809,
0.1205512136220932,
0.047394897788763046,
0.09530285000801086,
0.14212486147880554,
-0.06550721079111099,
0.055976368486881256,
0.07985403388738632,
0.08141762763261795,
0.03096115030348301,
0.10838901251554489,
-0.031155839562416077,
-0.25752872228622437,
0.0063161407597362995,
-0.016515793278813362,
-0.08573589473962784,
0.12313346564769745,
0.08594115078449249,
-0.09702097624540329,
0.07742370665073395,
-0.0012396842939779162,
-0.13757602870464325,
0.008767828345298767,
-0.0436696782708168,
-0.03878587856888771,
0.1127917692065239,
-0.025408519431948662,
0.08648767322301865,
0.027221916243433952,
0.1410539597272873,
-0.22013741731643677,
0.0018001638818532228,
0.0811704695224762,
0.028065800666809082,
0.08382255584001541,
0.03180718421936035,
-0.015889806672930717,
0.11522447317838669,
-0.15070337057113647,
0.1084824800491333,
0.02374626137316227,
-0.0827406644821167,
-0.09592361003160477,
-0.0719841867685318,
0.05636974796652794,
0.07792440801858902,
0.08862532675266266,
0.020336506888270378,
0.14136265218257904,
-0.11697161942720413,
0.09120223671197891,
0.22092853486537933,
-0.2650813162326813,
-0.06667641550302505,
0.04737086221575737,
0.042638469487428665,
0.044165510684251785,
-0.11893725395202637,
-0.024003298953175545,
0.012594605796039104,
0.022448010742664337,
0.09884858876466751,
-0.021334389224648476,
-0.10772226005792618,
0.0037623734679073095,
-0.12586624920368195,
-0.007396006025373936,
0.10470520704984665,
0.03589814156293869,
-0.033394958823919296,
-0.08281006664037704,
-0.07612684369087219,
-0.0978085920214653,
0.002289207186549902,
-0.05038293078541756,
0.03589068725705147,
-0.0653337612748146,
-0.08122481405735016,
-0.04096893221139908,
-0.05223051458597183,
-0.08199222385883331,
-0.00501272501423955,
0.19472424685955048,
0.043002624064683914,
0.019736872985959053,
-0.02412121184170246,
0.1329420506954193,
0.04196351021528244,
-0.13177989423274994,
-0.024324536323547363,
-0.010312040336430073,
-0.12878793478012085,
-0.050178103148937225,
-0.035252414643764496,
-0.004494790453463793,
-0.016103196889162064,
0.1730145364999771,
-0.03188231587409973,
0.07805370539426804,
0.05550934001803398,
-0.02575647085905075,
-0.014409791678190231,
0.13644079864025116,
-0.06342349946498871,
-0.049080103635787964,
-0.02371273562312126,
0.09539908915758133,
-0.009002897888422012,
-0.01393832266330719,
-0.06219189986586571,
-0.03793555870652199,
0.06875565648078918,
0.06056014448404312,
-0.055976688861846924,
0.056153178215026855,
-0.033464789390563965,
-0.05010610446333885,
0.007145486306399107,
-0.12732857465744019,
0.03085979074239731,
0.008035886101424694,
-0.09766623377799988,
-0.005652419291436672,
0.03001030534505844,
-0.01221047155559063,
-0.01764248125255108,
0.12713904678821564,
-0.08862503618001938,
-0.008046570234000683,
-0.06437356770038605,
-0.06329908221960068,
0.0024027489125728607,
-0.14135916531085968,
-0.018376434221863747,
-0.05877161771059036,
-0.20072366297245026,
-0.027038920670747757,
0.06391329318284988,
-0.060685351490974426,
-0.0117101538926363,
-0.06752386689186096,
-0.060580093413591385,
0.012050773948431015,
-0.00023012723249848932,
0.1241026297211647,
-0.058586299419403076,
0.08246959000825882,
-0.0249829962849617,
0.05882697179913521,
0.02945876307785511,
0.03861286863684654,
-0.1004648208618164,
0.02162793278694153,
-0.08917586505413055,
0.07938853651285172,
-0.07433045655488968,
-0.00011306488158879802,
-0.10898129642009735,
-0.11776641756296158,
-0.002570908982306719,
-0.025311265140771866,
0.05439191684126854,
0.14507919549942017,
-0.19637586176395416,
-0.022186119109392166,
0.12540294229984283,
-0.06696968525648117,
-0.06588034331798553,
0.06789866089820862,
-0.06120286509394646,
0.06476014107465744,
0.04437502473592758,
0.15426813066005707,
0.1241173967719078,
-0.14887069165706635,
-0.01322813518345356,
0.004081082064658403,
0.02422342076897621,
0.04178530350327492,
0.028839193284511566,
0.016487298533320427,
0.04637410491704941,
0.010763145051896572,
-0.08534518629312515,
-0.01779448240995407,
-0.0816560685634613,
-0.08423710614442825,
-0.05263301357626915,
-0.08289647847414017,
0.07783380895853043,
0.02582559548318386,
0.02527502551674843,
-0.05702616274356842,
-0.10413729399442673,
0.10821329057216644,
0.12928332388401031,
-0.06172917038202286,
0.01832762360572815,
-0.07282626628875732,
-0.01507657766342163,
0.013226822018623352,
-0.0362703837454319,
-0.19123822450637817,
-0.13703836500644684,
0.03402923047542572,
-0.04646166041493416,
0.03550170734524727,
0.030475424602627754,
0.09003939479589462,
0.061500612646341324,
-0.05629640817642212,
-0.02648923546075821,
-0.09092015027999878,
0.007924568839371204,
-0.07475227117538452,
-0.190307155251503,
-0.06015031039714813,
-0.029601873829960823,
0.14117130637168884,
-0.16813622415065765,
0.0040915957652032375,
-0.015292048454284668,
0.1406342089176178,
0.036432672291994095,
-0.06618960946798325,
0.014549433253705502,
0.030686238780617714,
0.006049131974577904,
-0.09854967892169952,
0.039897892624139786,
-0.013993822038173676,
-0.09794516116380692,
-0.07014092057943344,
-0.1203785315155983,
0.003847740823403001,
0.048794377595186234,
0.08470543473958969,
-0.10181403160095215,
-0.025711096823215485,
-0.07533881813287735,
-0.055863551795482635,
-0.06550660729408264,
0.015739018097519875,
0.19811135530471802,
0.03346233814954758,
0.09488020092248917,
-0.056925032287836075,
-0.08387459069490433,
-0.013013320975005627,
0.025363577529788017,
0.011813362129032612,
0.07735348492860794,
0.10166683048009872,
-0.09066048264503479,
0.06646652519702911,
0.07904012501239777,
-0.054221585392951965,
0.14314837753772736,
-0.03241504728794098,
-0.08220274746417999,
-0.022259388118982315,
-0.00012868797057308257,
-0.022061897441744804,
0.12422862648963928,
-0.06527864187955856,
0.028863325715065002,
0.03263426572084427,
0.03217898681759834,
0.017999490723013878,
-0.16648879647254944,
-0.017805295065045357,
0.023932088166475296,
-0.06019723042845726,
-0.060373369604349136,
0.004736610688269138,
0.035305205732584,
0.08801895380020142,
0.018553225323557854,
-0.02575027383863926,
0.016355959698557854,
-0.00949916522949934,
-0.07138870656490326,
0.18287275731563568,
-0.11111094057559967,
-0.1266006976366043,
-0.085132896900177,
0.04183776676654816,
-0.06554124504327774,
-0.041202180087566376,
0.00336630386300385,
-0.08438881486654282,
-0.044994525611400604,
-0.08083785325288773,
-0.018024176359176636,
-0.008985153399407864,
0.012259777635335922,
0.02182197943329811,
0.00042053021024912596,
0.05784405395388603,
-0.1398235559463501,
0.014836261980235577,
-0.03807620331645012,
-0.0946696549654007,
0.02542116492986679,
0.05169529840350151,
0.09948369860649109,
0.11730747669935226,
-0.014619223773479462,
0.02694057486951351,
-0.024817155674099922,
0.20383094251155853,
-0.07277321070432663,
0.019943006336688995,
0.08658279478549957,
-0.010317995212972164,
0.04932783171534538,
0.11937500536441803,
0.03358780965209007,
-0.10215547680854797,
0.034204915165901184,
0.08568041026592255,
-0.028088020160794258,
-0.23475885391235352,
-0.034915223717689514,
-0.030526721850037575,
-0.058662641793489456,
0.11019345372915268,
0.043588120490312576,
-0.02041308395564556,
0.039009738713502884,
0.011039421893656254,
-0.030099906027317047,
-0.018560223281383514,
0.070632703602314,
0.07314750552177429,
0.054318301379680634,
0.08886582404375076,
-0.009143686853349209,
-0.026411516591906548,
0.06209797412157059,
0.016051001846790314,
0.2623107135295868,
-0.05730908736586571,
0.09957534819841385,
0.030280007049441338,
0.13074366748332977,
-0.049977365881204605,
0.07987294346094131,
0.013186079449951649,
-0.006593094207346439,
0.0017681869212538004,
-0.05800473317503929,
-0.03077141009271145,
0.02889159508049488,
-0.00018472377269063145,
0.03694945201277733,
-0.07310321927070618,
0.07077668607234955,
0.04242263361811638,
0.27880433201789856,
0.028607400134205818,
-0.2704269587993622,
-0.07341263443231583,
-0.01967398263514042,
-0.03916061297059059,
-0.04947652667760849,
0.01312989927828312,
0.14063256978988647,
-0.1301051527261734,
0.06573561578989029,
-0.07413613051176071,
0.0694318413734436,
-0.04919610545039177,
-0.0015665895771235228,
0.08124814927577972,
0.1436644047498703,
-0.013712516985833645,
0.06857587397098541,
-0.20459584891796112,
0.21859371662139893,
0.014541535638272762,
0.10776624828577042,
-0.07508422434329987,
0.007078062742948532,
0.011725733987987041,
0.03525285795331001,
0.11065703630447388,
0.006868366152048111,
-0.013871999457478523,
-0.12548398971557617,
-0.096284419298172,
0.05010508745908737,
0.13977403938770294,
-0.034991901367902756,
0.08721833676099777,
-0.042572326958179474,
-0.007979479618370533,
0.04782309755682945,
-0.08845419436693192,
-0.13393664360046387,
-0.11309480667114258,
0.03421938046813011,
-0.007805973757058382,
-0.04279814288020134,
-0.05074309930205345,
-0.10460956394672394,
0.004840695299208164,
0.15760108828544617,
0.01039617508649826,
-0.050456464290618896,
-0.15120846033096313,
0.023919465020298958,
0.16278964281082153,
-0.05982629209756851,
0.015508133918046951,
0.029659811407327652,
0.08253981918096542,
0.044388916343450546,
-0.08783453702926636,
0.055550843477249146,
-0.0767812505364418,
-0.1704981029033661,
-0.054098088294267654,
0.13702493906021118,
0.08442769199609756,
0.051952704787254333,
-0.01344100758433342,
0.02880495972931385,
0.011185752227902412,
-0.09741264581680298,
0.02030215971171856,
0.0776628851890564,
0.043975580483675,
0.04516023397445679,
-0.09851653128862381,
0.09358161687850952,
-0.03585504740476608,
0.013432210311293602,
0.12806415557861328,
0.19669727981090546,
-0.08791681379079819,
0.11475950479507446,
0.048085737973451614,
-0.061345528811216354,
-0.16253496706485748,
0.06084636226296425,
0.11768656969070435,
0.011184735223650932,
0.061448026448488235,
-0.20164726674556732,
0.10704821348190308,
0.11302631348371506,
0.004454656504094601,
0.06661463528871536,
-0.3438087999820709,
-0.12687741219997406,
0.04909436032176018,
0.10364416241645813,
0.03380269557237625,
-0.097081258893013,
-0.015115723945200443,
0.0069103180430829525,
-0.15674392879009247,
0.13561470806598663,
-0.049353767186403275,
0.12168072164058685,
-0.025164809077978134,
0.1133381798863411,
0.032378945499658585,
-0.037462569773197174,
0.11440052092075348,
0.08385796844959259,
0.07941064238548279,
-0.042191941291093826,
-0.002943329745903611,
0.02635747566819191,
-0.07011458277702332,
0.048333290964365005,
-0.03827601671218872,
0.07659155875444412,
-0.12702113389968872,
-0.007737337611615658,
-0.09633605927228928,
0.06042841449379921,
-0.04766198247671127,
-0.0792492926120758,
-0.029903391376137733,
0.059773292392492294,
0.09577671438455582,
-0.04030285030603409,
0.03622904047369957,
0.01759388856589794,
0.068299300968647,
0.07168396562337875,
0.10946208238601685,
-0.020072314888238907,
-0.10315808653831482,
-0.01049590203911066,
0.007642440032213926,
0.04869251325726509,
-0.09704602509737015,
0.03653005510568619,
0.14563584327697754,
0.07608570903539658,
0.126254141330719,
0.04014734551310539,
-0.025832822546362877,
-0.025402862578630447,
0.03247079625725746,
-0.14480073750019073,
-0.10193315148353577,
0.02083970233798027,
-0.09912644326686859,
-0.14002281427383423,
0.023858726024627686,
0.10540999472141266,
-0.039616771042346954,
-0.0006523468182422221,
0.001481762737967074,
0.0160911176353693,
-0.012070702388882637,
0.1954088658094406,
0.0380953811109066,
0.05513893440365791,
-0.08822838217020035,
0.1367628425359726,
0.025778882205486298,
-0.04517229646444321,
0.03642963618040085,
0.06902574002742767,
-0.10840560495853424,
-0.0026825652457773685,
0.05061245709657669,
0.0949593260884285,
-0.055673204362392426,
-0.01027199812233448,
-0.09544849395751953,
-0.06078343465924263,
0.04396187886595726,
0.13503752648830414,
0.04089682549238205,
0.006670627277344465,
-0.06754868477582932,
0.04073500260710716,
-0.1411488801240921,
0.0757237896323204,
0.06454131752252579,
0.07325522601604462,
-0.11895719170570374,
0.12895610928535461,
0.0008867387077771127,
0.0367065966129303,
-0.0195370614528656,
0.018818505108356476,
-0.08942930400371552,
-0.020742949098348618,
-0.09241176396608353,
-0.03499269485473633,
-0.032219450920820236,
0.014429708011448383,
-0.004957661032676697,
-0.05892106890678406,
-0.0415884293615818,
0.03136788308620453,
-0.07803770899772644,
-0.051052361726760864,
0.016906287521123886,
0.028606334701180458,
-0.15697762370109558,
-0.01536808256059885,
0.03584260493516922,
-0.10179131478071213,
0.091781847178936,
0.07587652653455734,
-0.0015445137396454811,
0.03762752562761307,
-0.1335785835981369,
-0.03562457486987114,
0.025502361357212067,
0.023315709084272385,
0.07079034298658371,
-0.09709113836288452,
-0.012534447945654392,
-0.04405050724744797,
0.03953848034143448,
0.019939439371228218,
0.062413014471530914,
-0.1163400337100029,
0.020310504361987114,
-0.06600774079561234,
-0.04999103769659996,
-0.06890930235385895,
0.028946558013558388,
0.10079432278871536,
0.032969068735837936,
0.16749778389930725,
-0.06658747047185898,
0.06090817600488663,
-0.17774654924869537,
-0.041622523218393326,
0.020492257550358772,
-0.026031387969851494,
-0.06234216317534447,
-0.031061319634318352,
0.0839870348572731,
-0.058396704494953156,
0.08649589866399765,
-0.036959338933229446,
0.067306287586689,
0.02729102224111557,
-0.04004739597439766,
-0.05170045047998428,
-0.014374610036611557,
0.16021886467933655,
0.050443384796381,
-0.021265115588903427,
0.0992632657289505,
-0.010139968246221542,
0.03366639092564583,
0.027771709486842155,
0.25223854184150696,
0.13269056379795074,
-0.04071178287267685,
0.05697198957204819,
0.06335660070180893,
-0.11032639443874359,
-0.13869042694568634,
0.1106167584657669,
-0.028518984094262123,
0.10172811150550842,
-0.042627524584531784,
0.1932307779788971,
0.05337033048272133,
-0.17798015475273132,
0.05083238333463669,
-0.019204914569854736,
-0.11802781373262405,
-0.11690812557935715,
-0.01737639121711254,
-0.06979739665985107,
-0.12180578708648682,
0.03776851296424866,
-0.12154465168714523,
0.054305147379636765,
0.07246777415275574,
0.031860388815402985,
0.02576831541955471,
0.1844795048236847,
-0.05216199904680252,
0.01264782715588808,
0.05628029257059097,
0.02846050076186657,
-0.008672833442687988,
-0.04165050759911537,
-0.05994155630469322,
0.016712406650185585,
0.012834299355745316,
0.07075410336256027,
-0.06430050730705261,
0.0008503616554662585,
0.01229163073003292,
-0.017403921112418175,
-0.06201037019491196,
0.010795402340590954,
0.023367267102003098,
0.03617166727781296,
0.04368239641189575,
0.06743888556957245,
0.0010779405711218715,
-0.045467399060726166,
0.2778385579586029,
-0.07652998715639114,
-0.0815667062997818,
-0.1420253962278366,
0.21015189588069916,
0.041647784411907196,
-0.020859386771917343,
0.07703899592161179,
-0.10199082642793655,
-0.02993522584438324,
0.15422169864177704,
0.14454443752765656,
-0.08870825171470642,
-0.013122023083269596,
-0.02005704678595066,
-0.00028395216213539243,
-0.02612515166401863,
0.10245469212532043,
0.07901443541049957,
0.023497672751545906,
-0.04733196645975113,
-0.03644658997654915,
-0.013400229625403881,
-0.04143825173377991,
-0.04872104525566101,
0.06305151432752609,
0.01811438798904419,
-0.011346776969730854,
-0.05064520612359047,
0.059446465224027634,
-0.02188839018344879,
-0.20362409949302673,
0.05639653280377388,
-0.17293059825897217,
-0.17322424054145813,
-0.03398006781935692,
0.05250620096921921,
-0.008202103897929192,
0.06390401721000671,
-0.007860854268074036,
-0.02982681430876255,
0.14049828052520752,
0.0019211076432839036,
-0.06727470457553864,
-0.10904820263385773,
0.09415442496538162,
-0.05240887030959129,
0.20288591086864471,
-0.0056631797924637794,
0.0645485520362854,
0.08252128213644028,
0.033663492649793625,
-0.11767897009849548,
0.0334710069000721,
0.08316431939601898,
-0.11794093996286392,
0.0005527316243387759,
0.15700764954090118,
-0.05301482230424881,
0.09125833958387375,
0.041128117591142654,
-0.12480662763118744,
-0.012643394991755486,
-0.0025648875162005424,
-0.028231097385287285,
-0.08304546028375626,
-0.0015855955425649881,
-0.03583023324608803,
0.16208815574645996,
0.2440314143896103,
-0.01872611977159977,
0.017830083146691322,
-0.09227441996335983,
0.0349770188331604,
0.0517246276140213,
0.04817106947302818,
-0.049344006925821304,
-0.19658394157886505,
0.017503589391708374,
0.017790254205465317,
0.016118424013257027,
-0.1697896122932434,
-0.10829341411590576,
0.0563586950302124,
-0.05792001262307167,
-0.03915208950638771,
0.11043139547109604,
0.016957687214016914,
0.05014672130346298,
-0.0156776774674654,
-0.08292742073535919,
-0.02864518016576767,
0.1431574672460556,
-0.18932220339775085,
-0.04145378991961479
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-squad2-with-ner
This model is a fine-tuned version of [twmkn9/distilbert-base-uncased-squad2](https://huggingface.co/twmkn9/distilbert-base-uncased-squad2) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
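A hedged sketch of how the hyperparameters above might be expressed as `transformers.TrainingArguments`; the `output_dir` is a placeholder and treating the batch sizes as per-device values is an assumption, since the card states neither:
```python
from transformers import TrainingArguments

# Sketch only: output_dir is a placeholder, and the batch sizes are assumed
# to be per-device values, which the card does not confirm.
training_args = TrainingArguments(
    output_dir="./distilbert-base-uncased-squad2-with-ner",  # assumed path
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
)
```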
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
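The card does not include a usage example; a minimal inference sketch using the standard `transformers` question-answering pipeline (the question and context strings are illustrative only, not taken from the card):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint through the standard QA pipeline.
qa = pipeline(
    "question-answering",
    model="andi611/distilbert-base-uncased-squad2-with-ner",
)

# Illustrative inputs; the card does not document an expected input format.
result = qa(
    question="Where is the European Union headquartered?",
    context="The European Union has its headquarters in Brussels, Belgium.",
)
print(result["answer"], result["score"])
```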
|
{"tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "distilbert-base-uncased-squad2-with-ner", "results": [{"task": {"name": "Question Answering", "type": "question-answering"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
question-answering
|
andi611/distilbert-base-uncased-squad2-with-ner
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:conll2003",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us
|
# distilbert-base-uncased-squad2-with-ner
This model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
### Training results
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# distilbert-base-uncased-squad2-with-ner\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n",
"# distilbert-base-uncased-squad2-with-ner\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2.0",
"### Training results",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
45,
52,
6,
12,
8,
3,
90,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #question-answering #generated_from_trainer #dataset-conll2003 #endpoints_compatible #region-us \n# distilbert-base-uncased-squad2-with-ner\n\nThis model is a fine-tuned version of twmkn9/distilbert-base-uncased-squad2 on the conll2003 dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 3e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 2.0### Training results### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.09720499813556671,
0.07295864075422287,
-0.0019112036097794771,
0.0802915021777153,
0.15992502868175507,
0.033200953155756,
0.09989998489618301,
0.1124257817864418,
-0.14413636922836304,
0.04061801731586456,
0.06599125266075134,
0.09147419780492783,
0.019672902300953865,
0.08332689106464386,
-0.036918193101882935,
-0.259628027677536,
0.004957047291100025,
0.01939406618475914,
-0.1178784891963005,
0.11346596479415894,
0.10640537738800049,
-0.11082547158002853,
0.05374990031123161,
0.0231920275837183,
-0.19855017960071564,
0.024851959198713303,
-0.024700874462723732,
-0.03637903928756714,
0.10482246428728104,
0.019749043509364128,
0.1315188705921173,
0.010920841246843338,
0.12401136010885239,
-0.21633578836917877,
0.005259618628770113,
0.08500813692808151,
0.024150071665644646,
0.06426956504583359,
0.01080372091382742,
-0.007717467378824949,
0.11922026425600052,
-0.10785474628210068,
0.09235736727714539,
0.037818048149347305,
-0.09167526662349701,
-0.17509661614894867,
-0.09198236465454102,
0.08430447429418564,
0.08650466054677963,
0.10513899475336075,
-0.0008129997295327485,
0.15566208958625793,
-0.12331493943929672,
0.08448890596628189,
0.19280198216438293,
-0.27179479598999023,
-0.08545024693012238,
0.0618000328540802,
0.029333701357245445,
0.0315164178609848,
-0.10722140222787857,
-0.025321917608380318,
0.0602772980928421,
0.04120377078652382,
0.06940117478370667,
-0.020793408155441284,
-0.1375172734260559,
0.0040028090588748455,
-0.13803575932979584,
-0.011332673951983452,
0.16987355053424835,
0.04941598325967789,
-0.04755591228604317,
-0.043215274810791016,
-0.05825270712375641,
-0.08705160766839981,
0.000758385518565774,
-0.0698375254869461,
0.03834156692028046,
-0.04819144681096077,
-0.11938901245594025,
-0.045300811529159546,
-0.08585681766271591,
-0.06513377279043198,
-0.018721453845500946,
0.15745733678340912,
0.04619154334068298,
0.01863132230937481,
-0.053190503269433975,
0.12323939800262451,
-0.006961447652429342,
-0.12301541864871979,
-0.032212626188993454,
-0.010568015277385712,
-0.07615737617015839,
-0.07079675793647766,
-0.06870466470718384,
-0.015168671496212482,
-0.009556123055517673,
0.1915893703699112,
-0.05161910131573677,
0.0523495078086853,
0.018054494634270668,
0.005262007471174002,
-0.03481176495552063,
0.14227794110774994,
-0.06115027144551277,
-0.011041761375963688,
-0.012005171738564968,
0.06365513801574707,
-0.021744655445218086,
-0.007033155299723148,
-0.08083552867174149,
-0.013753648847341537,
0.07526633143424988,
0.036071304231882095,
-0.06596476584672928,
0.05469413474202156,
-0.0209148321300745,
-0.04718375951051712,
-0.02023852802813053,
-0.10672640055418015,
0.026483656838536263,
0.0008289492689073086,
-0.0938565582036972,
0.025556717067956924,
0.03790123760700226,
0.022738346830010414,
-0.008070888929069042,
0.1073259562253952,
-0.09893634170293808,
0.0032521672546863556,
-0.0928722470998764,
-0.08231065422296524,
0.00909377634525299,
-0.06692132353782654,
-0.003840739605948329,
-0.09648963063955307,
-0.1800209879875183,
-0.033002376556396484,
0.05466475337743759,
-0.022122878581285477,
-0.018302591517567635,
-0.05917735397815704,
-0.0793214812874794,
-0.001186918467283249,
-0.0039971498772501945,
0.10523390769958496,
-0.04112362861633301,
0.08053560554981232,
0.0220502819865942,
0.028636978939175606,
-0.014685866422951221,
0.03631702810525894,
-0.08771875500679016,
0.019751161336898804,
-0.10026784241199493,
0.05835866555571556,
-0.07544157654047012,
0.039266686886548996,
-0.08821777254343033,
-0.12405790388584137,
-0.015969930216670036,
-0.013234885409474373,
0.07004817575216293,
0.10540348291397095,
-0.18121999502182007,
-0.04883602634072304,
0.16306188702583313,
-0.07369770854711533,
-0.10925594717264175,
0.08790264278650284,
-0.06719440966844559,
0.08533098548650742,
0.06034516543149948,
0.11562050133943558,
0.12451836466789246,
-0.11970971524715424,
-0.019683126360177994,
-0.01715647615492344,
0.06754308938980103,
0.0398353673517704,
0.036574967205524445,
-0.004373218864202499,
0.027285533025860786,
0.016023647040128708,
-0.07274463027715683,
-0.011796676553785801,
-0.11019597202539444,
-0.0931008979678154,
-0.051481593400239944,
-0.09371157735586166,
0.07179559022188187,
0.04550231993198395,
0.062421299517154694,
-0.07067383080720901,
-0.10425299406051636,
0.15783530473709106,
0.12501183152198792,
-0.05913050100207329,
0.033715926110744476,
-0.08326725661754608,
0.03236480429768562,
-0.007837099954485893,
-0.03338784724473953,
-0.1987222135066986,
-0.13504822552204132,
0.01417329628020525,
0.0017863946268334985,
0.033196646720170975,
0.0450219064950943,
0.06693818420171738,
0.06517677009105682,
-0.06217673048377037,
-0.004239043220877647,
-0.12319979071617126,
-0.0019961923826485872,
-0.08493312448263168,
-0.17861174046993256,
-0.06485843658447266,
-0.022312011569738388,
0.17559242248535156,
-0.22143065929412842,
0.033107370138168335,
-0.002571636810898781,
0.1410849392414093,
0.02336711995303631,
-0.050774410367012024,
-0.04641331359744072,
0.06705430150032043,
-0.02658156305551529,
-0.07352856546640396,
0.0509919635951519,
0.006592504680156708,
-0.09043464064598083,
-0.10874539613723755,
-0.11237116158008575,
0.04368156939744949,
0.08809679746627808,
-0.025437435135245323,
-0.0979272648692131,
-0.010905332863330841,
-0.08330997079610825,
-0.04076801985502243,
-0.05793621018528938,
-0.010660850442945957,
0.18108618259429932,
-0.004996867850422859,
0.12263533473014832,
-0.058632489293813705,
-0.07057454437017441,
-0.006365615408867598,
-0.0073423138819634914,
0.025045759975910187,
0.05207221210002899,
0.09678515791893005,
-0.08425188064575195,
0.09726131707429886,
0.08301006257534027,
-0.08103398233652115,
0.1556047946214676,
-0.04545649513602257,
-0.07986026257276535,
-0.02870197594165802,
0.003574179718270898,
-0.0020187371410429478,
0.10921242088079453,
-0.1322236955165863,
0.007404644042253494,
0.022012870758771896,
0.03677947819232941,
0.04840032383799553,
-0.1950930505990982,
-0.01687372475862503,
0.03435301408171654,
-0.032095011323690414,
-0.05127213895320892,
-0.012970956042408943,
0.02089248038828373,
0.08337718993425369,
0.024185899645090103,
-0.0036187150981277227,
0.024211302399635315,
-0.004981265403330326,
-0.09156715124845505,
0.20303820073604584,
-0.11443541944026947,
-0.10480774194002151,
-0.10209855437278748,
0.011547178030014038,
-0.06915397942066193,
-0.01979076862335205,
0.034172218292951584,
-0.0979277640581131,
-0.02759174443781376,
-0.050683725625276566,
0.019355814903974533,
-0.021364344283938408,
-0.004983681254088879,
0.04785076528787613,
0.014533968642354012,
0.07315798848867416,
-0.15147002041339874,
0.01795913651585579,
-0.038648903369903564,
-0.13305573165416718,
0.011158832348883152,
0.02609081007540226,
0.11924521625041962,
0.14868396520614624,
-0.016762737184762955,
0.026648741215467453,
-0.030531348660588264,
0.2376657873392105,
-0.07693974673748016,
-0.025683483108878136,
0.10196445137262344,
-0.003697979496791959,
0.0426657609641552,
0.09263139218091965,
0.04885547608137131,
-0.10647110641002655,
0.03970293700695038,
0.09298773854970932,
-0.03396354243159294,
-0.24055717885494232,
-0.03516840934753418,
-0.04917445778846741,
-0.06894057244062424,
0.0782318115234375,
0.023565217852592468,
0.054618995636701584,
0.06313136219978333,
-0.0013320129364728928,
0.04516833275556564,
-0.017398430034518242,
0.08512133359909058,
0.08174044638872147,
0.04700859636068344,
0.1107354611158371,
-0.02504028007388115,
-0.03980078920722008,
0.059419214725494385,
-0.01536176260560751,
0.29991772770881653,
-0.013350543566048145,
0.05751809477806091,
0.08641098439693451,
0.14643941819667816,
-0.04343070834875107,
0.06662450730800629,
0.001185241504572332,
-0.03033359907567501,
0.012051258236169815,
-0.056904759258031845,
-0.018689045682549477,
0.03162507712841034,
-0.021212587133049965,
0.07201766967773438,
-0.10980700701475143,
0.028508443385362625,
0.05991443246603012,
0.27389779686927795,
0.048961859196424484,
-0.27508312463760376,
-0.10699764639139175,
-0.0038017865736037493,
-0.026391038671135902,
-0.036917656660079956,
0.017675090581178665,
0.11454284936189651,
-0.13229182362556458,
0.05453329160809517,
-0.05973482131958008,
0.09661441296339035,
0.0007380038732662797,
0.010601680725812912,
0.07498722523450851,
0.1096307560801506,
-0.002036136342212558,
0.07801283150911331,
-0.22721907496452332,
0.22976845502853394,
0.004971166141331196,
0.11100385338068008,
-0.05009210854768753,
0.01783289574086666,
0.022286804392933846,
0.09478838741779327,
0.06868498772382736,
0.0026684184558689594,
-0.002689031884074211,
-0.1508031189441681,
-0.01764589175581932,
0.046577390283346176,
0.13866731524467468,
-0.04857266694307327,
0.11478199809789658,
-0.038079600781202316,
0.019952891394495964,
0.06876224279403687,
-0.0059104179963469505,
-0.16526004672050476,
-0.11631335318088531,
0.009922297671437263,
-0.022538183256983757,
-0.021390441805124283,
-0.07873135805130005,
-0.10733462125062943,
-0.0378078892827034,
0.18739095330238342,
-0.005072679370641708,
-0.034622084349393845,
-0.1330171376466751,
0.08872457593679428,
0.10463925451040268,
-0.059742361307144165,
0.02299925871193409,
0.03390846401453018,
0.1032288521528244,
0.03864069655537605,
-0.07733531296253204,
0.0540202297270298,
-0.07381025701761246,
-0.1736001819372177,
-0.0489438995718956,
0.1179632917046547,
0.07179468870162964,
0.04570373147726059,
-0.0041835494339466095,
0.013743448071181774,
0.019730107858777046,
-0.09801116585731506,
-0.0064035141840577126,
0.06975937634706497,
0.07227031141519547,
0.05596970394253731,
-0.09501400589942932,
0.061738889664411545,
-0.053052373230457306,
0.007352541666477919,
0.1408047378063202,
0.20236530900001526,
-0.08772851526737213,
0.021533921360969543,
0.034264422953128815,
-0.07902391254901886,
-0.1646650731563568,
0.09823650866746902,
0.11470213532447815,
0.0053006443195044994,
0.032156892120838165,
-0.19051434099674225,
0.12462196499109268,
0.12944796681404114,
0.002965507796034217,
0.08032141625881195,
-0.3204555809497833,
-0.13445284962654114,
0.058765921741724014,
0.10511545836925507,
0.051986295729875565,
-0.12914563715457916,
-0.021695798262953758,
-0.018419167026877403,
-0.19209866225719452,
0.1226721927523613,
-0.13294844329357147,
0.10026334971189499,
0.004266035743057728,
0.09863527119159698,
0.01990383490920067,
-0.0327482707798481,
0.1333334594964981,
0.07038720697164536,
0.08925671130418777,
-0.04916645586490631,
-0.011098834685981274,
0.10168717801570892,
-0.05761057138442993,
0.013040823861956596,
0.01721738465130329,
0.0698222815990448,
-0.12000399082899094,
-0.01777726225554943,
-0.09673474729061127,
0.05333370715379715,
-0.06472046673297882,
-0.08202648162841797,
-0.05153435468673706,
0.05456055700778961,
0.07479061931371689,
-0.028080672025680542,
0.08340824395418167,
0.013324555940926075,
0.16445158421993256,
0.058815307915210724,
0.09486639499664307,
-0.02491353265941143,
-0.07983895391225815,
-0.012527617625892162,
-0.009611541405320168,
0.06150438264012337,
-0.13364779949188232,
0.03991502523422241,
0.14325957000255585,
0.07356172055006027,
0.14724023640155792,
0.07699976116418839,
-0.041797708719968796,
0.012688440270721912,
0.04979091137647629,
-0.10970848798751831,
-0.1218441054224968,
-0.005828274879604578,
-0.08513046056032181,
-0.13252517580986023,
0.05938193202018738,
0.1092231497168541,
-0.056060709059238434,
0.000460008712252602,
-0.007863909937441349,
-0.015360265970230103,
-0.06184374541044235,
0.17980779707431793,
0.05630108714103699,
0.05695191025733948,
-0.08208172023296356,
0.09750743210315704,
0.03795364126563072,
-0.04894562065601349,
0.013048382475972176,
0.03591048717498779,
-0.0920381247997284,
-0.014581299386918545,
0.028559355065226555,
0.1336354911327362,
-0.07461435347795486,
-0.03510250896215439,
-0.11071604490280151,
-0.07757823169231415,
0.04237871244549751,
0.1502121239900589,
0.06684604287147522,
0.0017972784116864204,
-0.05080989748239517,
0.06997367739677429,
-0.14938032627105713,
0.07007714360952377,
0.0480571910738945,
0.07979784160852432,
-0.14338280260562897,
0.17716433107852936,
0.015138810500502586,
0.04071194678544998,
-0.015504159964621067,
0.005519283004105091,
-0.10340308398008347,
0.0004739078285638243,
-0.1588662713766098,
-0.05026858299970627,
-0.02929997444152832,
0.004308436997234821,
0.001584466197527945,
-0.055134765803813934,
-0.07119842618703842,
0.04654600843787193,
-0.0876423716545105,
-0.046081554144620895,
0.03500089421868324,
0.03143874555826187,
-0.13765409588813782,
-0.0006699207006022334,
0.03557099401950836,
-0.10125721246004105,
0.07550093531608582,
0.07060135900974274,
0.018539050593972206,
0.06726997345685959,
-0.122517891228199,
-0.025666380301117897,
0.034279726445674896,
0.031907521188259125,
0.07806308567523956,
-0.07315771281719208,
-0.006961824372410774,
-0.030489081516861916,
0.0789395123720169,
0.023153280839323997,
0.0549212209880352,
-0.11386837810277939,
-0.017469508573412895,
-0.04248567670583725,
-0.06379248201847076,
-0.058899763971567154,
0.026626188308000565,
0.07806582748889923,
0.05217486619949341,
0.17488797008991241,
-0.07192429900169373,
0.05312428995966911,
-0.20047183334827423,
-0.03897145017981529,
0.011011063121259212,
-0.024079984053969383,
-0.043316539376974106,
-0.04260125383734703,
0.06404437124729156,
-0.06864476948976517,
0.09627388417720795,
-0.05760772526264191,
0.10879119485616684,
0.03537161275744438,
-0.06803380697965622,
-0.021514713764190674,
-0.0009723163093440235,
0.20467902719974518,
0.06696765124797821,
-0.016104543581604958,
0.054503414779901505,
0.011305706575512886,
0.05238273739814758,
0.03305468335747719,
0.2417614907026291,
0.1337377279996872,
-0.06721420586109161,
0.06389925628900528,
0.06964732706546783,
-0.08737608045339584,
-0.13875798881053925,
0.04990093782544136,
-0.012161045335233212,
0.0834452286362648,
-0.04709421098232269,
0.13486330211162567,
0.1121818870306015,
-0.17766407132148743,
0.0529228150844574,
-0.05466138944029808,
-0.10707169026136398,
-0.1224537342786789,
-0.005600664298981428,
-0.07221636921167374,
-0.15472431480884552,
0.03897270932793617,
-0.14318396151065826,
0.028826139867305756,
0.10316644608974457,
0.021165024489164352,
0.01171597745269537,
0.18491443991661072,
-0.0524444542825222,
0.007816181518137455,
0.04739896208047867,
-0.007964171469211578,
-0.00792670901864767,
-0.07876036316156387,
-0.05252612382173538,
0.02888340689241886,
-0.004272662103176117,
0.06851434707641602,
-0.060910314321517944,
-0.023094305768609047,
0.01789606548845768,
-0.015794025734066963,
-0.05761418864130974,
0.01687261462211609,
0.03116755001246929,
0.04254472255706787,
0.0254985298961401,
0.03924637660384178,
0.003955223131924868,
-0.03557118400931358,
0.2643986940383911,
-0.08922756463289261,
-0.1128101572394371,
-0.17546077072620392,
0.21497561037540436,
0.055666446685791016,
-0.006562006659805775,
0.06680072844028473,
-0.11702421307563782,
-0.015489314682781696,
0.1963413804769516,
0.15647664666175842,
-0.06614163517951965,
-0.007298900745809078,
0.013358294032514095,
-0.010581095702946186,
-0.052648477256298065,
0.11264285445213318,
0.11055928468704224,
0.05374301224946976,
-0.040434498339891434,
-0.06744140386581421,
-0.010175127536058426,
-0.02168237790465355,
-0.036395635455846786,
0.08697406947612762,
0.027511900290846825,
-0.004423871636390686,
-0.04779244214296341,
0.07101383060216904,
-0.03272034972906113,
-0.16361890733242035,
0.0746399536728859,
-0.1642964780330658,
-0.16989555954933167,
-0.031928449869155884,
0.061093300580978394,
-0.017129722982645035,
0.08361054956912994,
-0.02354004606604576,
-0.04326387494802475,
0.16034430265426636,
-0.0008042867411859334,
-0.05561847239732742,
-0.11779578775167465,
0.11442229896783829,
-0.06472214311361313,
0.20742186903953552,
-0.028548849746584892,
0.07376138865947723,
0.11554064601659775,
0.039879750460386276,
-0.07772764563560486,
0.036938417702913284,
0.07598244398832321,
-0.08605966717004776,
-0.009046300314366817,
0.12922127544879913,
-0.05645159259438515,
0.09311328828334808,
0.05228226259350777,
-0.19560858607292175,
0.008929952047765255,
0.013268663547933102,
-0.030240310356020927,
-0.0976858139038086,
-0.0017774720909073949,
-0.07152620702981949,
0.13148219883441925,
0.244767427444458,
-0.03564208373427391,
0.021819129586219788,
-0.07896038144826889,
0.04415837675333023,
0.05777362361550331,
0.09436391294002533,
-0.05526658147573471,
-0.230444073677063,
0.021441403776407242,
0.005974383559077978,
-0.012170053087174892,
-0.23380571603775024,
-0.09623157232999802,
0.07724146544933319,
-0.06860233098268509,
-0.03538496792316437,
0.1113894134759903,
0.06864137202501297,
0.048301417380571365,
-0.03678641468286514,
-0.1122969537973404,
-0.06808706372976303,
0.15562023222446442,
-0.15137861669063568,
-0.0665368139743805
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-ner
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.0814
- eval_precision: 0.9101
- eval_recall: 0.9336
- eval_f1: 0.9217
- eval_accuracy: 0.9799
- eval_runtime: 10.2964
- eval_samples_per_second: 315.646
- eval_steps_per_second: 39.529
- epoch: 1.14
- step: 500
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4.0
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
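The card does not include a usage example; a minimal inference sketch using the `transformers` token-classification pipeline (the sample sentence is illustrative, and `aggregation_strategy="simple"` is an assumption about how entities should be grouped):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a token-classification (NER) pipeline.
ner = pipeline(
    "token-classification",
    model="andi611/roberta-base-ner-conll2003",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)

# Illustrative input; conll2003 covers PER, ORG, LOC and MISC entity types.
for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```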
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["conll2003"], "model_index": [{"name": "roberta-base-ner", "results": [{"task": {"name": "Token Classification", "type": "token-classification"}, "dataset": {"name": "conll2003", "type": "conll2003", "args": "conll2003"}}]}]}
|
token-classification
|
andi611/roberta-base-ner-conll2003
|
[
"transformers",
"pytorch",
"roberta",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #roberta #token-classification #generated_from_trainer #dataset-conll2003 #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
# roberta-base-ner
This model is a fine-tuned version of roberta-base on the conll2003 dataset.
It achieves the following results on the evaluation set:
- eval_loss: 0.0814
- eval_precision: 0.9101
- eval_recall: 0.9336
- eval_f1: 0.9217
- eval_accuracy: 0.9799
- eval_runtime: 10.2964
- eval_samples_per_second: 315.646
- eval_steps_per_second: 39.529
- epoch: 1.14
- step: 500
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4.0
### Framework versions
- Transformers 4.8.2
- Pytorch 1.8.1+cu111
- Datasets 1.8.0
- Tokenizers 0.10.3
|
[
"# roberta-base-ner\n\nThis model is a fine-tuned version of roberta-base on the conll2003 dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 0.0814\n- eval_precision: 0.9101\n- eval_recall: 0.9336\n- eval_f1: 0.9217\n- eval_accuracy: 0.9799\n- eval_runtime: 10.2964\n- eval_samples_per_second: 315.646\n- eval_steps_per_second: 39.529\n- epoch: 1.14\n- step: 500",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 4.0",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #roberta #token-classification #generated_from_trainer #dataset-conll2003 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"# roberta-base-ner\n\nThis model is a fine-tuned version of roberta-base on the conll2003 dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 0.0814\n- eval_precision: 0.9101\n- eval_recall: 0.9336\n- eval_f1: 0.9217\n- eval_accuracy: 0.9799\n- eval_runtime: 10.2964\n- eval_samples_per_second: 315.646\n- eval_steps_per_second: 39.529\n- epoch: 1.14\n- step: 500",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 4.0",
"### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
57,
141,
6,
12,
8,
3,
90,
34
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #token-classification #generated_from_trainer #dataset-conll2003 #license-mit #autotrain_compatible #endpoints_compatible #region-us \n# roberta-base-ner\n\nThis model is a fine-tuned version of roberta-base on the conll2003 dataset.\nIt achieves the following results on the evaluation set:\n- eval_loss: 0.0814\n- eval_precision: 0.9101\n- eval_recall: 0.9336\n- eval_f1: 0.9217\n- eval_accuracy: 0.9799\n- eval_runtime: 10.2964\n- eval_samples_per_second: 315.646\n- eval_steps_per_second: 39.529\n- epoch: 1.14\n- step: 500## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 32\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 4.0### Framework versions\n\n- Transformers 4.8.2\n- Pytorch 1.8.1+cu111\n- Datasets 1.8.0\n- Tokenizers 0.10.3"
] |
[
-0.08119548857212067,
0.2262132614850998,
-0.004273960366845131,
0.08334942907094955,
0.1312549114227295,
0.008605102077126503,
0.04453509673476219,
0.16617810726165771,
-0.11959061026573181,
0.11348707228899002,
0.0858413428068161,
0.03972220793366432,
0.07206880301237106,
0.141978457570076,
-0.022415289655327797,
-0.15478581190109253,
-0.00808895193040371,
-0.05295068025588989,
-0.0057584913447499275,
0.0904054194688797,
0.1120329275727272,
-0.09528848528862,
0.06590253859758377,
-0.027894990518689156,
-0.09966907650232315,
0.047353655099868774,
-0.05418180301785469,
-0.0693805143237114,
0.06560855358839035,
0.02493872307240963,
0.05577252060174942,
-0.009273232892155647,
0.0931403785943985,
-0.2669813334941864,
-0.017741873860359192,
0.06827651709318161,
0.04432407766580582,
0.07499273866415024,
0.037618596106767654,
-0.01919860765337944,
0.058096203953027725,
-0.19757330417633057,
0.09750907868146896,
0.036306366324424744,
-0.10100400447845459,
-0.16222472488880157,
-0.11702600121498108,
0.08585721254348755,
0.06128516420722008,
0.09561590105295181,
-0.012490051798522472,
0.18161220848560333,
-0.03960542753338814,
0.07140237092971802,
0.2071796953678131,
-0.255245178937912,
-0.048926472663879395,
0.02967972308397293,
0.06239751726388931,
0.05136168748140335,
-0.1074841246008873,
-0.002012957353144884,
0.053444646298885345,
0.007943456061184406,
0.07860232889652252,
-0.0017918695230036974,
-0.07597550004720688,
0.007255783304572105,
-0.09667614102363586,
-0.07949754595756531,
0.21213488280773163,
0.08788351714611053,
-0.06382076442241669,
-0.11257412284612656,
-0.019757075235247612,
-0.15067380666732788,
-0.0016810871893540025,
-0.04772902652621269,
0.03223536163568497,
-0.05458259955048561,
-0.08598007261753082,
-0.019718172028660774,
-0.07461860030889511,
-0.012293455190956593,
0.02450849488377571,
0.09720344096422195,
0.03444541618227959,
-0.0015940169105306268,
0.005796791985630989,
0.08733755350112915,
-0.022169064730405807,
-0.1506062150001526,
-0.07069596648216248,
-0.004983467981219292,
-0.10766599327325821,
-0.07083132863044739,
-0.028641732409596443,
-0.006675133015960455,
0.006836209446191788,
0.22579357028007507,
-0.00845668837428093,
0.06804044544696808,
0.021871814504265785,
-0.02080244943499565,
0.0010302006267011166,
0.16834908723831177,
-0.03464970365166664,
-0.10091537237167358,
-0.018394991755485535,
0.07857350260019302,
-0.0030054538510739803,
-0.009990059770643711,
-0.04219865798950195,
-0.021731574088335037,
0.10422403365373611,
0.0785512700676918,
-0.00910137128084898,
0.015166700817644596,
-0.049143560230731964,
-0.0028280357364565134,
0.004493904300034046,
-0.1503184288740158,
0.03751394525170326,
-0.015376432798802853,
-0.10210954397916794,
-0.06385554373264313,
0.054042790085077286,
-0.003761044703423977,
-0.02382674068212509,
0.010696381330490112,
-0.05282646790146828,
-0.04182140901684761,
-0.04566928744316101,
-0.07466859370470047,
0.02417626976966858,
-0.07299286127090454,
0.027362588793039322,
-0.08984820544719696,
-0.16997000575065613,
-0.05264529585838318,
0.012684816494584084,
-0.06970654428005219,
-0.042587291449308395,
-0.03627970069646835,
-0.06979833543300629,
0.0042637065052986145,
-0.02655908092856407,
0.11712481826543808,
-0.03473920375108719,
0.06993914395570755,
0.029271403327584267,
0.026362432166934013,
0.08122186362743378,
0.04098141938447952,
-0.08448738604784012,
0.03655611723661423,
-0.0822625607252121,
0.1169559434056282,
-0.09634293615818024,
0.021793343126773834,
-0.16752876341342926,
-0.08691873401403427,
0.0030827897135168314,
-0.03709142655134201,
0.09287311136722565,
0.11466576159000397,
-0.1248871311545372,
-0.004133433569222689,
0.11335351318120956,
-0.009213247336447239,
-0.09314781427383423,
0.07598506659269333,
-0.05754878371953964,
0.05918541178107262,
0.07291468232870102,
0.11365177482366562,
0.12665612995624542,
-0.12930235266685486,
-0.0832589864730835,
0.0024669780395925045,
0.05527693033218384,
0.08659882843494415,
0.07579495757818222,
-0.009471911936998367,
0.08847027271986008,
0.018024735152721405,
-0.060932453721761703,
-0.018444327637553215,
-0.08038942515850067,
-0.09310270100831985,
-0.03405633941292763,
-0.07278459519147873,
0.006155448034405708,
0.014328551478683949,
0.01735854521393776,
-0.0839618667960167,
-0.13909809291362762,
0.059752456843853,
0.1426095962524414,
-0.04086686298251152,
0.004955251235514879,
-0.09444494545459747,
0.003807926783338189,
-0.03129120171070099,
-0.023730820044875145,
-0.18690608441829681,
-0.08060471713542938,
0.0493578165769577,
-0.052585750818252563,
0.014985376968979836,
0.025296198204159737,
0.07021691650152206,
0.034774888306856155,
-0.02411096729338169,
-0.008328475058078766,
-0.09286396205425262,
-0.013656760565936565,
-0.07618112862110138,
-0.13868895173072815,
-0.06218020245432854,
-0.031903307884931564,
0.22323651611804962,
-0.20525753498077393,
0.011043179780244827,
0.013345364481210709,
0.1238008365035057,
0.007473309990018606,
-0.08363736420869827,
0.004367237910628319,
-0.004489927086979151,
-0.01926073431968689,
-0.11897371709346771,
-0.00021916523110121489,
0.0013317145640030503,
-0.0891048014163971,
-0.05831054970622063,
-0.16500161588191986,
-0.009920231997966766,
0.08027303963899612,
0.12993448972702026,
-0.11720852553844452,
0.004353510681539774,
-0.060010652989149094,
-0.03508469834923744,
-0.055812884122133255,
-0.02058587409555912,
0.21281227469444275,
0.03072396107017994,
0.11132784932851791,
-0.043463170528411865,
-0.0858803242444992,
0.013708507642149925,
0.01306043192744255,
-0.014834891073405743,
0.11432244628667831,
0.007204086985439062,
-0.13681542873382568,
0.06279861181974411,
0.04999275133013725,
0.038177914917469025,
0.07977630198001862,
-0.03208262473344803,
-0.10454235225915909,
-0.05349806696176529,
0.04120630770921707,
0.02133585512638092,
0.09536981582641602,
-0.06105869263410568,
-0.0001793288829503581,
0.056340716779232025,
-0.01283189095556736,
-0.006918075028806925,
-0.10974671691656113,
0.0008421095553785563,
0.07378682494163513,
-0.02760329283773899,
0.017908232286572456,
-0.040064796805381775,
0.013076050207018852,
0.06715509295463562,
0.03790390491485596,
0.000595522637013346,
-0.004841422662138939,
-0.016041835770010948,
-0.06759117543697357,
0.15299734473228455,
-0.08210944384336472,
-0.15801136195659637,
-0.13012027740478516,
-0.004117162432521582,
-0.02904384769499302,
-0.016276683658361435,
0.023794421926140785,
-0.07444135844707489,
-0.07605089992284775,
-0.10030460357666016,
-0.02703729458153248,
-0.07639486342668533,
-0.03129399195313454,
0.08153974264860153,
0.039063889533281326,
0.10560765862464905,
-0.14418098330497742,
0.014222818426787853,
0.00752458069473505,
-0.0823773443698883,
-0.00865181814879179,
0.0518726110458374,
0.12068046629428864,
0.051694683730602264,
-0.03178238123655319,
0.013160972855985165,
-0.036946769803762436,
0.21703925728797913,
-0.08661708235740662,
-0.03639882802963257,
0.10785175859928131,
0.009614966809749603,
0.04944698140025139,
0.11187475174665451,
-0.0042525362223386765,
-0.08908327668905258,
0.04690288379788399,
0.05523784086108208,
-0.027158668264746666,
-0.2544236183166504,
-0.0020939279347658157,
-0.014006164856255054,
-0.0668501928448677,
0.14517244696617126,
0.04852937161922455,
0.01961742527782917,
0.05498742312192917,
-0.04777051880955696,
0.08217982202768326,
-0.01530213188380003,
0.10245425999164581,
0.07592169940471649,
0.036598969250917435,
0.09334327280521393,
-0.03651053085923195,
-0.01515469141304493,
0.05161016806960106,
0.0008322776993736625,
0.2411300539970398,
-0.029355764389038086,
0.1486544907093048,
0.020887942984700203,
0.14551693201065063,
-0.07331611961126328,
0.005948128644376993,
0.05935606360435486,
0.026597123593091965,
-0.004108520224690437,
-0.0774105116724968,
-0.06125304847955704,
0.04807914420962334,
-0.0035688732750713825,
0.053838592022657394,
-0.07738212496042252,
0.060499511659145355,
0.02119239792227745,
0.21458804607391357,
0.0640687346458435,
-0.31001415848731995,
-0.08668292313814163,
0.02729163132607937,
-0.026144718751311302,
-0.0703171044588089,
-0.04003088176250458,
0.09498382359743118,
-0.14463627338409424,
0.06187991052865982,
-0.010140092112123966,
0.07841911911964417,
-0.06324408203363419,
-0.0018576323054730892,
0.0012052098754793406,
0.08002803474664688,
0.00949933659285307,
0.09639114141464233,
-0.17971044778823853,
0.1764608919620514,
0.026908647269010544,
0.11399427801370621,
-0.05666766315698624,
0.06727690994739532,
-0.013676146045327187,
-0.031734418123960495,
0.15008142590522766,
-0.010805179364979267,
-0.03460092842578888,
-0.20592081546783447,
-0.12164094299077988,
0.028174646198749542,
0.10871387273073196,
-0.1018744483590126,
0.1252438724040985,
-0.031641293317079544,
-0.0061995186842978,
0.025932110846042633,
-0.037222858518362045,
-0.1542394459247589,
-0.14975236356258392,
0.026755934581160545,
-0.009601626545190811,
-0.021344227716326714,
-0.0719761773943901,
-0.07534460723400116,
-0.08662319928407669,
0.22082510590553284,
-0.0056022293865680695,
-0.03866272792220116,
-0.13364526629447937,
0.14142687618732452,
0.15353447198867798,
-0.08063370734453201,
0.014327213168144226,
0.0233355313539505,
0.12047647684812546,
0.045783307403326035,
-0.03662277013063431,
0.024879779666662216,
-0.029350563883781433,
-0.13501118123531342,
-0.0603574775159359,
0.12186774611473083,
0.04093421995639801,
0.06918541342020035,
0.009642783552408218,
0.031000422313809395,
0.016781702637672424,
-0.062042202800512314,
0.005825004540383816,
0.06810683757066727,
0.0714888796210289,
0.0591265894472599,
-0.03479832783341408,
-0.0032359235920011997,
-0.09270764887332916,
-0.011638069525361061,
0.13768796622753143,
0.2678297758102417,
-0.08958641439676285,
0.06143274903297424,
0.0163107980042696,
-0.08816848695278168,
-0.15043212473392487,
0.03619791567325592,
0.10502011328935623,
0.03582833707332611,
0.08606605231761932,
-0.15705324709415436,
0.06609770655632019,
0.1168094053864479,
-0.004783916752785444,
0.008913972415030003,
-0.2896529734134674,
-0.12163937836885452,
0.05202042683959007,
0.06688442081212997,
-0.020632479339838028,
-0.13421869277954102,
-0.05970923975110054,
-0.02916014939546585,
-0.17291373014450073,
0.016727808862924576,
-0.04113335162401199,
0.10343967378139496,
0.010140297003090382,
0.03994559869170189,
0.049970950931310654,
-0.035924557596445084,
0.15774616599082947,
0.05817101150751114,
0.06520623713731766,
-0.06711018085479736,
0.02365349978208542,
0.1317431777715683,
-0.09244342893362045,
0.09615326672792435,
-0.02195613458752632,
0.06684038043022156,
-0.17428070306777954,
-0.019705433398485184,
-0.03617853671312332,
0.06292080134153366,
-0.05655444785952568,
-0.04840873181819916,
-0.05768672749400139,
0.03482794389128685,
0.08495470136404037,
-0.018416475504636765,
0.05407065153121948,
0.029476016759872437,
0.05281049758195877,
0.09183972328901291,
0.035855233669281006,
0.08237036317586899,
-0.12667670845985413,
0.005981443915516138,
-0.010381252504885197,
0.024912001565098763,
-0.17598703503608704,
0.02684822678565979,
0.12282748520374298,
0.05139239877462387,
0.14064815640449524,
-0.004394615534693003,
-0.09496672451496124,
0.02299196645617485,
0.01864670217037201,
-0.0706225261092186,
-0.12318608164787292,
0.009016449563205242,
-0.023700276389718056,
-0.16487811505794525,
-0.032529231160879135,
0.13248828053474426,
-0.054190412163734436,
-0.021640844643115997,
-0.03699571266770363,
0.020901231095194817,
0.006106166169047356,
0.1787622720003128,
0.017310528084635735,
0.0734739676117897,
-0.07214893400669098,
0.1173366978764534,
0.11685547977685928,
-0.04473316669464111,
0.0920286551117897,
0.0005660928436554968,
-0.06780929118394852,
-0.00971912033855915,
0.04465378448367119,
0.08967617899179459,
0.01396053284406662,
-0.0020378802437335253,
-0.06304536014795303,
-0.04405269771814346,
0.05741804465651512,
0.02140987664461136,
0.03185737133026123,
0.003430067328736186,
0.007784249261021614,
0.0048517161048948765,
-0.14720851182937622,
0.09601457417011261,
0.07132168859243393,
0.050247903913259506,
-0.11221598833799362,
0.1008879691362381,
0.001822130405344069,
0.010794702917337418,
0.007015452720224857,
-0.007096792105585337,
-0.0456870011985302,
-0.004052453208714724,
-0.08685211092233658,
0.011170177720487118,
-0.010483494028449059,
0.003279611235484481,
-0.02229621633887291,
-0.041911378502845764,
-0.04664858803153038,
0.06531030684709549,
-0.058947596698999405,
-0.11463197320699692,
0.00914782378822565,
0.09000792354345322,
-0.14042487740516663,
-0.04474193602800369,
0.0363122820854187,
-0.12317007780075073,
0.07751885801553726,
0.05542365089058876,
0.03831951320171356,
-0.009451723657548428,
-0.053133945912122726,
-0.020256461575627327,
0.03019251674413681,
0.04384369030594826,
0.05984319746494293,
-0.11223360896110535,
0.008396136574447155,
-0.03766006603837013,
0.026982706040143967,
0.015661321580410004,
0.00269096321426332,
-0.12816181778907776,
-0.07166960090398788,
-0.06327653676271439,
-0.00785044115036726,
-0.050807684659957886,
0.06293565779924393,
0.10026545822620392,
0.03224784880876541,
0.13842801749706268,
-0.04373420774936676,
0.044125769287347794,
-0.2302417904138565,
-0.038231052458286285,
-0.02798285521566868,
-0.027145015075802803,
-0.06450023502111435,
-0.025483710691332817,
0.08933795243501663,
-0.038109831511974335,
0.09079609811306,
-0.03499087318778038,
0.14683978259563446,
0.040682658553123474,
-0.047728974372148514,
-0.018099574372172356,
-0.000523312424775213,
0.16982664167881012,
0.0977061316370964,
-0.01550288125872612,
0.09853202849626541,
-0.02777063101530075,
0.09988594800233841,
-0.03223158046603203,
0.060706835240125656,
0.1828833818435669,
0.021659867838025093,
0.05406505614519119,
0.048597659915685654,
-0.12266232818365097,
-0.10645092278718948,
0.1356901228427887,
-0.045106299221515656,
0.09685499966144562,
-0.03420409932732582,
0.08186451345682144,
0.08879835903644562,
-0.16102948784828186,
0.05024350434541702,
-0.07601910084486008,
-0.08892831951379776,
-0.08124012500047684,
-0.01501450128853321,
-0.09838013350963593,
-0.05270831286907196,
0.05040444806218147,
-0.10161250829696655,
0.03613191097974777,
0.09576817601919174,
0.00006039501022314653,
0.004351505544036627,
0.10418237745761871,
-0.05082855373620987,
-0.019102856516838074,
0.05397305265069008,
-0.011716069653630257,
-0.014541341923177242,
-0.05739440768957138,
-0.02345310151576996,
0.09318913519382477,
0.06241235136985779,
0.1269753873348236,
-0.01642007939517498,
0.06837012618780136,
0.04066874831914902,
-0.017928393557667732,
-0.1039547324180603,
-0.002160474192351103,
0.027189765125513077,
0.02037227340042591,
0.030541015788912773,
0.06618330627679825,
0.03107038140296936,
-0.043963778764009476,
0.26019784808158875,
-0.0398043617606163,
-0.049775995314121246,
-0.14014849066734314,
0.10352526605129242,
0.09842202812433243,
0.009718194603919983,
0.05492511764168739,
-0.13223513960838318,
0.01611979678273201,
0.11225625872612,
0.09842195361852646,
0.001205972395837307,
-0.023189667612314224,
-0.001599168754182756,
-0.015360201708972454,
-0.06275683641433716,
0.04897516593337059,
0.0862959772348404,
-0.07760277390480042,
-0.05214281752705574,
0.02412448450922966,
0.021803423762321472,
-0.04524409770965576,
-0.05459405109286308,
0.05322388559579849,
-0.014635525643825531,
0.04249542951583862,
-0.022876232862472534,
0.06001274287700653,
0.03987320512533188,
-0.2776928246021271,
0.06395485252141953,
-0.1810896396636963,
-0.17656192183494568,
0.003174686571583152,
0.08182691037654877,
0.013937183655798435,
0.06636925786733627,
0.0033244311343878508,
-0.0017276345752179623,
0.161542147397995,
-0.0074610584415495396,
-0.07333982735872269,
-0.08522292971611023,
0.0698765367269516,
-0.037582654505968094,
0.25103265047073364,
-0.005946434568613768,
0.05492573231458664,
0.11020933836698532,
-0.009548412635922432,
-0.1709594577550888,
0.011054201051592827,
0.09503503888845444,
-0.007703213952481747,
0.06411256641149521,
0.16373828053474426,
-0.0544753260910511,
0.1399986296892166,
0.07956409454345703,
-0.0959118977189064,
-0.036872681230306625,
-0.03663386031985283,
0.034614644944667816,
-0.09638635069131851,
-0.008860450237989426,
-0.03779887408018112,
0.14084771275520325,
0.18294741213321686,
-0.05156267434358597,
0.0011020565871149302,
-0.08648873120546341,
0.007463410496711731,
0.04546285793185234,
0.10903923958539963,
-0.01130974106490612,
-0.15010525286197662,
0.02691849134862423,
-0.01272362656891346,
0.062406621873378754,
-0.2604256570339203,
-0.10227829217910767,
0.08467723429203033,
-0.05965704470872879,
-0.02536117099225521,
0.1332615315914154,
0.0559406541287899,
0.017929820343852043,
-0.0447239987552166,
-0.15265865623950958,
-0.036155372858047485,
0.13667388260364532,
-0.14137932658195496,
-0.04622560366988182
] |
null | null |
transformers
|
# My Awesome Model
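The card itself gives no usage details; a minimal single-turn chat sketch, assuming this checkpoint follows the usual DialoGPT convention of terminating each turn with the tokenizer's EOS token (the prompt text is illustrative):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("andikarachman/DialoGPT-small-sheldon")
model = AutoModelForCausalLM.from_pretrained("andikarachman/DialoGPT-small-sheldon")

# Encode one user turn terminated by EOS, then generate a reply.
prompt = "Hello, how are you?" + tokenizer.eos_token
input_ids = tokenizer.encode(prompt, return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)
# Decode only the newly generated tokens (the model's reply).
reply = tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```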
|
{"tags": ["conversational"]}
|
text-generation
|
andikarachman/DialoGPT-small-sheldon
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# My Awesome Model
|
[
"# My Awesome Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# My Awesome Model"
] |
[
51,
4
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# My Awesome Model"
] |
[
-0.05259015038609505,
0.05521034821867943,
-0.005910294596105814,
0.017722278833389282,
0.15250112116336823,
0.02286236733198166,
0.07657632976770401,
0.09513414651155472,
-0.025391526520252228,
-0.047348517924547195,
0.15119488537311554,
0.19781284034252167,
-0.020334534347057343,
0.101333387196064,
-0.04688440263271332,
-0.3143521845340729,
0.06439975649118423,
0.05463787540793419,
-0.015605635941028595,
0.12023304402828217,
0.09468326717615128,
-0.0530015267431736,
0.08742043375968933,
-0.012155864387750626,
-0.1293085366487503,
-0.0027921805158257484,
-0.002384399762377143,
-0.10180269181728363,
0.11194873601198196,
0.033712033182382584,
0.05166437849402428,
0.0182647667825222,
-0.05843055993318558,
-0.139859139919281,
0.03845210000872612,
-0.015005595050752163,
-0.05602653697133064,
0.05648263916373253,
0.059830192476511,
-0.07164353132247925,
0.1669619083404541,
0.13275989890098572,
-0.04237370565533638,
0.056127581745386124,
-0.17620700597763062,
0.017941240221261978,
0.01800798624753952,
0.019184142351150513,
0.05306641012430191,
0.10830496996641159,
-0.03932326287031174,
0.09217294305562973,
-0.11410652846097946,
0.08313368260860443,
0.07800983637571335,
-0.29151955246925354,
-0.025312699377536774,
0.10440942645072937,
0.06437138468027115,
0.048375632613897324,
-0.013386772945523262,
0.0621674507856369,
0.02149512618780136,
0.008602659218013287,
0.02225899137556553,
-0.06727100163698196,
-0.05789240449666977,
0.032748885452747345,
-0.0967593789100647,
-0.03634428232908249,
0.19753605127334595,
-0.024647634476423264,
0.053590498864650726,
-0.06265407055616379,
-0.11300963163375854,
-0.039751436561346054,
-0.050429005175828934,
-0.029761891812086105,
-0.05090925097465515,
0.09489558637142181,
0.004352911841124296,
-0.09534718841314316,
-0.13405443727970123,
-0.01370926946401596,
-0.1618979275226593,
0.15892250835895538,
0.012579603120684624,
0.046201955527067184,
-0.19210097193717957,
0.11465331166982651,
-0.03857925534248352,
-0.08259090781211853,
0.030513519421219826,
-0.12010065466165543,
0.03160654753446579,
-0.008132083341479301,
-0.019599268212914467,
-0.049325279891490936,
0.061037879437208176,
0.08101806789636612,
0.018783701583743095,
0.005755073390901089,
0.018167443573474884,
0.05343452841043472,
0.05891622602939606,
0.10033947974443436,
-0.02891627699136734,
-0.0625043511390686,
0.0025436533614993095,
-0.12051084637641907,
-0.01122665498405695,
-0.05357983708381653,
-0.18095199763774872,
0.002246231772005558,
0.02455340512096882,
0.05192234739661217,
0.011778532527387142,
0.09955989569425583,
-0.028496338054537773,
-0.026898741722106934,
0.06898727267980576,
0.002862759632989764,
-0.015707949176430702,
-0.005368964280933142,
-0.010934269987046719,
0.11485416442155838,
-0.023099146783351898,
0.04774846136569977,
-0.12022071331739426,
0.020393015816807747,
-0.07851235568523407,
-0.0019349842332303524,
-0.06214260309934616,
-0.04864754155278206,
-0.0019346009939908981,
-0.06985589861869812,
0.021118074655532837,
-0.14833110570907593,
-0.17990200221538544,
-0.005064866971224546,
0.021302316337823868,
-0.052403319627046585,
-0.09162671118974686,
-0.0982397273182869,
-0.02586611732840538,
0.03574685752391815,
-0.05873546749353409,
0.013170980848371983,
-0.06884536147117615,
0.06542801111936569,
0.0029820678755640984,
0.05682007595896721,
-0.14085575938224792,
0.08719147741794586,
-0.12582023441791534,
-0.023288866505026817,
-0.061977192759513855,
0.1109607070684433,
0.024780582636594772,
0.1267160177230835,
0.004311583004891872,
-0.0033308975398540497,
-0.08729329705238342,
0.08271238207817078,
-0.04243258014321327,
0.22770646214485168,
-0.10479787737131119,
-0.08809807151556015,
0.2632525563240051,
-0.05423165112733841,
-0.16432519257068634,
0.10179096460342407,
-0.014350244775414467,
0.12198644131422043,
0.13850919902324677,
0.16080057621002197,
0.007628654129803181,
0.03313867375254631,
0.10115300863981247,
0.08631709218025208,
-0.08573295921087265,
-0.0611947737634182,
0.023627014830708504,
-0.011463395319879055,
-0.10670105367898941,
0.046802595257759094,
0.04794782027602196,
0.08188598603010178,
-0.04982871189713478,
-0.028600862249732018,
-0.01972118206322193,
-0.044152840971946716,
0.05264130234718323,
0.007675500120967627,
0.13217447698116302,
-0.03674980252981186,
-0.03692879155278206,
-0.023745311424136162,
0.01699630729854107,
-0.03115241602063179,
0.007061392068862915,
-0.05687357112765312,
0.11091547459363937,
-0.03406180441379547,
0.051789235323667526,
-0.16953988373279572,
-0.04873261600732803,
-0.02087729424238205,
0.1402055323123932,
0.04973345249891281,
0.1329866498708725,
0.06287940591573715,
-0.010758201591670513,
0.00859389640390873,
0.007998145185410976,
0.13181665539741516,
0.007865442894399166,
-0.07660657912492752,
-0.047718439251184464,
0.09176599979400635,
-0.05973208695650101,
0.06147782504558563,
-0.098741315305233,
-0.004747362341731787,
-0.01433002483099699,
0.08674649894237518,
0.006352655589580536,
0.029382232576608658,
-0.006192679051309824,
0.003654100699350238,
-0.06161240115761757,
0.017873648554086685,
0.12492607533931732,
-0.01421504095196724,
-0.07439801841974258,
0.22084392607212067,
-0.15798072516918182,
0.18006981909275055,
0.18165533244609833,
-0.3081994652748108,
0.024602634832262993,
-0.08860466629266739,
-0.036338552832603455,
0.03426366671919823,
0.0491504967212677,
-0.034147560596466064,
0.16587987542152405,
-0.016766328364610672,
0.201018825173378,
-0.03547777235507965,
-0.01287798210978508,
-0.010399105958640575,
-0.03656993433833122,
-0.010632630437612534,
0.09065473079681396,
0.15122920274734497,
-0.1677125245332718,
0.18270380795001984,
0.1660280078649521,
0.06873020529747009,
0.17776396870613098,
0.034313347190618515,
-0.006856906693428755,
0.07112615555524826,
-0.022670727223157883,
-0.07675548642873764,
-0.049287427216768265,
-0.26302891969680786,
-0.027947327122092247,
0.06471601128578186,
0.04510856419801712,
0.11924877762794495,
-0.10971947014331818,
-0.037208184599876404,
0.010892451740801334,
-0.013165894895792007,
0.02132410928606987,
0.09682225435972214,
0.01171150617301464,
0.11804302036762238,
-0.021027036011219025,
-0.05209195241332054,
0.0898953229188919,
0.02727191150188446,
-0.0787680521607399,
0.19168277084827423,
-0.10074768215417862,
-0.3233809769153595,
-0.11354339867830276,
-0.18166927993297577,
-0.017843691632151604,
0.05878754332661629,
0.08049646019935608,
-0.09228580445051193,
-0.02625267766416073,
-0.01639235019683838,
0.0758359357714653,
-0.09145816415548325,
-0.015880629420280457,
-0.09367848187685013,
0.034986745566129684,
-0.10827737301588058,
-0.07011983543634415,
-0.05141967162489891,
-0.03368452936410904,
-0.04457031562924385,
0.13157756626605988,
-0.12242637574672699,
0.06396433711051941,
0.2076517641544342,
0.06227295100688934,
0.05622440204024315,
-0.0229496993124485,
0.23288212716579437,
-0.10842552781105042,
0.02383521944284439,
0.1717897206544876,
-0.03566030040383339,
0.0727933868765831,
0.13435456156730652,
0.006721907295286655,
-0.08144525438547134,
0.03465581312775612,
-0.04592517390847206,
-0.08630958944559097,
-0.20441576838493347,
-0.14156180620193481,
-0.12814727425575256,
0.07913564145565033,
0.03285396471619606,
0.05478321388363838,
0.15024253726005554,
0.11386489123106003,
0.007987297140061855,
0.00976672861725092,
-0.006888182368129492,
0.05438044294714928,
0.17482298612594604,
-0.05838097631931305,
0.10041683167219162,
-0.037591226398944855,
-0.1924494504928589,
0.08022978901863098,
0.04309763014316559,
0.08280511945486069,
0.07474655658006668,
0.0856199786067009,
0.013537914492189884,
0.03723837807774544,
0.10897084325551987,
0.1165735274553299,
0.031679023057222366,
-0.038079675287008286,
-0.04882059991359711,
-0.026300756260752678,
-0.03285675123333931,
0.05745977535843849,
0.07790146768093109,
-0.1608346849679947,
-0.06348084658384323,
-0.06350091099739075,
0.07662643492221832,
0.09017108380794525,
0.11811108142137527,
-0.21219493448734283,
0.01579318381845951,
0.092556893825531,
-0.0494147390127182,
-0.1304239183664322,
0.07402537018060684,
-0.00466050673276186,
-0.1397053301334381,
0.037663187831640244,
-0.014095795340836048,
0.1359514445066452,
-0.0778401643037796,
0.10336452722549438,
-0.08307972550392151,
-0.06147889420390129,
0.03632286190986633,
0.1355396956205368,
-0.30774354934692383,
0.2137020230293274,
-0.022472934797406197,
-0.05296783149242401,
-0.10508129745721817,
-0.011727629229426384,
0.020913105458021164,
0.09079049527645111,
0.10090240091085434,
-0.0025442070327699184,
0.0061299679800868034,
-0.0345483273267746,
-0.053218815475702286,
0.024456629529595375,
0.07957815378904343,
-0.08542889356613159,
0.0017540202243253589,
-0.02361489273607731,
-0.004407065454870462,
-0.032844748347997665,
-0.01189463958144188,
-0.011617658659815788,
-0.16786961257457733,
0.06556065380573273,
-0.002625665394589305,
0.11129079759120941,
0.03491498529911041,
0.0024013579823076725,
-0.1009332686662674,
0.19977013766765594,
0.01796281896531582,
-0.08052749931812286,
-0.08830537647008896,
-0.03254766762256622,
0.03660419583320618,
-0.06121435388922691,
0.027481911703944206,
-0.06916457414627075,
0.033381566405296326,
-0.06441576033830643,
-0.18325145542621613,
0.1268530637025833,
-0.10945470631122589,
-0.03609596937894821,
-0.04321056231856346,
0.18323224782943726,
-0.00929707009345293,
-0.0011623724130913615,
0.05866571143269539,
0.0032208464108407497,
-0.1347510665655136,
-0.10740556567907333,
0.020214511081576347,
-0.015275230631232262,
0.009142245166003704,
0.05559912323951721,
-0.009665844030678272,
0.00045268211397342384,
-0.039558928459882736,
-0.023234419524669647,
0.32348164916038513,
0.10732097923755646,
-0.04944206401705742,
0.17007054388523102,
0.13087597489356995,
-0.0827672928571701,
-0.30699312686920166,
-0.10971353948116302,
-0.10529600828886032,
-0.026918673887848854,
-0.037983208894729614,
-0.19617970287799835,
0.09504909813404083,
-0.03528566658496857,
-0.022136637941002846,
0.11253651231527328,
-0.2759084105491638,
-0.0770430713891983,
0.1826775223016739,
0.003314757253974676,
0.3998824954032898,
-0.10265109688043594,
-0.08777514100074768,
-0.06741699576377869,
-0.1120782196521759,
0.2033512443304062,
-0.05560711398720741,
0.08663415163755417,
-0.00517998356372118,
0.15513743460178375,
0.055607251822948456,
-0.02176513522863388,
0.08932057023048401,
-0.005811662413179874,
-0.0546204075217247,
-0.1219351515173912,
-0.03444604203104973,
-0.009159418754279613,
0.007239421829581261,
0.03589896112680435,
-0.04242607578635216,
0.01279151439666748,
-0.1399589478969574,
-0.045490626245737076,
-0.0764620453119278,
0.024699507281184196,
0.021008269861340523,
-0.0652410089969635,
-0.01643640361726284,
-0.03945036977529526,
-0.012804778292775154,
0.03164318576455116,
0.15236099064350128,
-0.06478006392717361,
0.1476556956768036,
0.04904455319046974,
0.15412139892578125,
-0.14745712280273438,
-0.02258288487792015,
-0.06896031647920609,
-0.05498642474412918,
0.04900865629315376,
-0.10053684562444687,
0.050061121582984924,
0.1202658861875534,
-0.0742902010679245,
0.0987328365445137,
0.0922594666481018,
-0.01938629150390625,
0.0012483424507081509,
0.1226617842912674,
-0.2489612102508545,
-0.07742628455162048,
-0.10509459674358368,
0.013337249867618084,
0.10138551890850067,
0.06995654851198196,
0.17304721474647522,
-0.0037713919300585985,
-0.036284226924180984,
-0.0064643872901797295,
0.025414984673261642,
-0.03540204465389252,
0.05724727362394333,
-0.002706433180719614,
0.016663886606693268,
-0.15213344991207123,
0.060368724167346954,
-0.00024176653823815286,
-0.1438901126384735,
-0.013603870756924152,
0.16073721647262573,
-0.11208858340978622,
-0.15145981311798096,
-0.007263668347150087,
0.13685113191604614,
-0.13171035051345825,
-0.03302847594022751,
-0.03708777576684952,
-0.170182466506958,
0.07439173012971878,
0.1024777740240097,
0.08549231290817261,
0.08025266975164413,
-0.06620611250400543,
-0.00807863101363182,
-0.011656313203275204,
-0.026087598875164986,
0.031810320913791656,
-0.023377234116196632,
-0.09044221043586731,
0.03872343525290489,
-0.026654237881302834,
0.13591371476650238,
-0.09607382118701935,
-0.09331836551427841,
-0.135749951004982,
0.039314381778240204,
-0.12405620515346527,
-0.08138058334589005,
-0.12200927734375,
-0.0591500885784626,
0.00224387738853693,
-0.0001289021165575832,
-0.035674065351486206,
-0.06687422841787338,
-0.13582271337509155,
0.04366770386695862,
-0.04484611004590988,
0.0013091047294437885,
-0.040241483598947525,
0.04561002552509308,
0.06766383349895477,
-0.03493715822696686,
0.13722217082977295,
0.11722734570503235,
-0.07864081114530563,
0.08946478366851807,
-0.16657429933547974,
-0.0683990865945816,
0.08854512125253677,
0.008173754438757896,
0.06165994703769684,
0.06743349134922028,
0.033807408064603806,
0.06109451875090599,
0.04151686280965805,
0.03488299250602722,
0.01739438995718956,
-0.09271225333213806,
0.015541021712124348,
0.022296719253063202,
-0.1294609159231186,
-0.04801803454756737,
-0.029226921498775482,
0.00939185917377472,
0.008117396384477615,
0.11003357172012329,
-0.0426274873316288,
0.09439733624458313,
-0.05888751894235611,
0.036728594452142715,
0.016222506761550903,
-0.16461637616157532,
-0.020102784037590027,
-0.11915475130081177,
0.028684545308351517,
-0.0033096212428063154,
0.25625869631767273,
0.06346847862005234,
0.020517030730843544,
0.01250078622251749,
0.08567021042108536,
0.07241600006818771,
0.02562166005373001,
0.1956365555524826,
0.10854171961545944,
-0.05020022392272949,
-0.12334850430488586,
0.09686340391635895,
0.034720368683338165,
0.06432123482227325,
0.13385434448719025,
-0.026959087699651718,
0.002498799469321966,
0.11019360274076462,
0.011678861454129219,
0.04961980879306793,
-0.09859088063240051,
-0.16400282084941864,
-0.00994415208697319,
0.061864156275987625,
-0.04559077322483063,
0.12240655720233917,
0.11382720619440079,
-0.020697353407740593,
0.03180128335952759,
-0.010503606870770454,
-0.05694027617573738,
-0.16998925805091858,
-0.1630837321281433,
-0.08357038348913193,
-0.11794789135456085,
-0.0027763545513153076,
-0.11386270076036453,
0.013879159465432167,
0.06452289968729019,
0.0604364387691021,
-0.09019444137811661,
0.08891061693429947,
0.0687386617064476,
-0.11843101680278778,
0.08828350901603699,
-0.033263903111219406,
0.07249268144369125,
0.0015160300536081195,
0.003872724948450923,
-0.13800905644893646,
0.032393742352724075,
-0.008493867702782154,
0.04159298539161682,
-0.09244006127119064,
0.022458361461758614,
-0.11297028511762619,
-0.07659684121608734,
-0.07971972227096558,
0.05093973129987717,
-0.03541257977485657,
0.1390930563211441,
0.001295371213927865,
-0.035233911126852036,
0.024190181866288185,
0.22729112207889557,
-0.06350252777338028,
-0.030667411163449287,
-0.0618741400539875,
0.21414142847061157,
0.024466563016176224,
0.10703565180301666,
-0.016775688156485558,
0.019240234047174454,
-0.0764411985874176,
0.3689337372779846,
0.344390869140625,
-0.1225387305021286,
-0.0015968306688591838,
0.031062176451086998,
0.036916591227054596,
0.11621878296136856,
0.12602226436138153,
0.057955991476774216,
0.2995031177997589,
-0.08396036922931671,
-0.002026971662417054,
-0.02688612788915634,
-0.03624163940548897,
-0.04409930482506752,
0.10547586530447006,
0.06835740804672241,
-0.03330419585108757,
-0.027012333273887634,
0.1376710683107376,
-0.2966996431350708,
0.12323499470949173,
-0.15714547038078308,
-0.1487535685300827,
-0.06873904913663864,
-0.005042468197643757,
0.08589684963226318,
0.04748665541410446,
0.1069009080529213,
-0.019124338403344154,
-0.08203735202550888,
0.05766449123620987,
0.0320524163544178,
-0.22844897210597992,
0.011852608993649483,
0.08361081779003143,
-0.06153005734086037,
0.011767351068556309,
-0.017906347289681435,
0.038472190499305725,
0.07790610194206238,
0.025976579636335373,
-0.032770540565252304,
0.06325861811637878,
-0.005814229138195515,
-0.05033424496650696,
0.04302205145359039,
0.05059972032904625,
0.017107632011175156,
-0.1511564701795578,
0.07320158183574677,
-0.1762860119342804,
0.0566408596932888,
-0.005331212189048529,
-0.04948166385293007,
0.000018263708625454456,
0.01998119056224823,
-0.06808236241340637,
0.05880929157137871,
0.0952666699886322,
-0.012173139490187168,
-0.002317852806299925,
-0.056667573750019073,
0.007662574760615826,
-0.0679154172539711,
-0.0747012197971344,
-0.10497893393039703,
-0.1338900774717331,
-0.11392296850681305,
0.10846775025129318,
-0.011928223073482513,
-0.19833622872829437,
0.02906924858689308,
-0.11258108913898468,
0.04933213070034981,
-0.13360801339149475,
0.08599711954593658,
0.1282832771539688,
0.021543797105550766,
-0.01265349704772234,
0.04020093381404877,
0.01591683179140091,
0.08550478518009186,
-0.09200563281774521,
-0.10515180230140686
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-marc-en
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the amazon_reviews_multi dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8885
- Mae: 0.4390
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
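The bullet list above maps directly onto a `TrainingArguments` object; a hedged sketch is shown below. Only the listed values are taken from the card — the output directory and anything not listed (warmup, weight decay, and so on) are placeholders or library defaults, not documented choices.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder; Adam betas=(0.9, 0.999) and epsilon=1e-08
# are the library defaults, so they are not set explicitly.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-marc-en",  # placeholder path
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=2,
    lr_scheduler_type="linear",
    seed=42,
)
```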
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mae |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.1089 | 1.0 | 235 | 0.9027 | 0.4756 |
| 0.9674 | 2.0 | 470 | 0.8885 | 0.4390 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.0+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
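The card stops at the framework versions, so a minimal usage sketch is added here. The repo id is the one recorded for this row; the task name and the label names in the output are assumptions that depend on how the classification head and config were saved.

```python
# Hedged inference sketch (not part of the original card): score one review
# with the published checkpoint. Assumes the tokenizer was pushed alongside
# the model; the returned label names depend on the saved config.
from transformers import pipeline

star_rater = pipeline(
    "text-classification",
    model="anditya/xlm-roberta-base-finetuned-marc-en",
)
print(star_rater("The headphones stopped working after two days."))
```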
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["amazon_reviews_multi"], "model-index": [{"name": "xlm-roberta-base-finetuned-marc-en", "results": []}]}
|
text-classification
|
anditya/xlm-roberta-base-finetuned-marc-en
|
[
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"dataset:amazon_reviews_multi",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #xlm-roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
xlm-roberta-base-finetuned-marc-en
==================================
This model is a fine-tuned version of xlm-roberta-base on the amazon\_reviews\_multi dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8885
* Mae: 0.4390
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 2
### Training results
### Framework versions
* Transformers 4.11.3
* Pytorch 1.9.0+cu111
* Datasets 1.14.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #xlm-roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-mit #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
34
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #xlm-roberta #text-classification #generated_from_trainer #dataset-amazon_reviews_multi #license-mit #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.11.3\n* Pytorch 1.9.0+cu111\n* Datasets 1.14.0\n* Tokenizers 0.10.3"
] |
[
-0.09092789888381958,
0.08008227497339249,
-0.0020140453707426786,
0.11630697548389435,
0.18312716484069824,
0.042973749339580536,
0.15040470659732819,
0.11954569816589355,
-0.09022784978151321,
-0.0003494977136142552,
0.11352355778217316,
0.17042438685894012,
0.007949714548885822,
0.1317906379699707,
-0.06562875211238861,
-0.25790008902549744,
-0.012251557782292366,
0.05035068839788437,
-0.04488401114940643,
0.1443592607975006,
0.10154645889997482,
-0.1380293369293213,
0.09442190825939178,
-0.0014341471251100302,
-0.19770415127277374,
-0.006765956524759531,
0.029228247702121735,
-0.06890206784009933,
0.13384534418582916,
0.03764583170413971,
0.13645893335342407,
0.008102459833025932,
0.07276447862386703,
-0.19063866138458252,
0.020796533674001694,
0.040146905928850174,
0.00358709879219532,
0.0915832370519638,
0.030548246577382088,
-0.01468250248581171,
0.1342829167842865,
-0.060973599553108215,
0.07154899835586548,
0.018368558958172798,
-0.11795462667942047,
-0.2320529818534851,
-0.08308214694261551,
0.035912688821554184,
0.056772612035274506,
0.09991798549890518,
-0.010324102826416492,
0.15634198486804962,
-0.07674280554056168,
0.10339420288801193,
0.23605166375637054,
-0.2893300950527191,
-0.07612571865320206,
0.032290682196617126,
0.043305903673172,
0.08403892815113068,
-0.10349797457456589,
-0.023395158350467682,
0.05919168144464493,
0.05649252235889435,
0.12055753171443939,
-0.0452197901904583,
-0.0962030366063118,
0.01583736389875412,
-0.1441667675971985,
-0.02332693338394165,
0.2023565173149109,
0.03447432816028595,
-0.0476268008351326,
-0.051082272082567215,
-0.032434288412332535,
-0.15748977661132812,
-0.03979404643177986,
-0.0009673985186964273,
0.050246383994817734,
-0.06319781392812729,
-0.08705104142427444,
-0.013781961984932423,
-0.11613631248474121,
-0.05173107236623764,
-0.06630995124578476,
0.1457367241382599,
0.04109196364879608,
0.01682303659617901,
-0.03500403091311455,
0.10437536239624023,
0.021311579272150993,
-0.10318823158740997,
0.012504742480814457,
0.007507571950554848,
-0.010289235971868038,
-0.047606464475393295,
-0.05751515179872513,
-0.07956288009881973,
0.002544892020523548,
0.11920338124036789,
-0.04774501919746399,
0.03242870792746544,
0.03772571310400963,
0.057246528565883636,
-0.07498431205749512,
0.19655898213386536,
-0.028955459594726562,
-0.005452427081763744,
-0.004732458386570215,
0.04949004575610161,
0.015602247789502144,
-0.010551849380135536,
-0.12953022122383118,
0.007022026460617781,
0.08074092119932175,
0.013663754798471928,
-0.07587581127882004,
0.06431995332241058,
-0.06985332071781158,
-0.04672382026910782,
-0.007498918566852808,
-0.07484535127878189,
0.031198130920529366,
-0.008710284717381,
-0.06582239270210266,
-0.02350885048508644,
0.023388126865029335,
0.017721518874168396,
-0.011746599338948727,
0.13322429358959198,
-0.08970562368631363,
0.0364038459956646,
-0.09379757940769196,
-0.10690733790397644,
0.021213319152593613,
-0.07686057686805725,
0.0376054085791111,
-0.10856878012418747,
-0.16822496056556702,
-0.03304174169898033,
0.0522976890206337,
-0.018100610002875328,
-0.060430899262428284,
-0.03577180206775665,
-0.06308238208293915,
0.01012183167040348,
-0.014289181679487228,
0.1470746546983719,
-0.07050348073244095,
0.11098764836788177,
0.03432513028383255,
0.05846457928419113,
-0.04605408012866974,
0.04961748793721199,
-0.09303298592567444,
-0.008509560488164425,
-0.15352317690849304,
0.03393903747200966,
-0.04447499290108681,
0.058807726949453354,
-0.07169647514820099,
-0.11825202405452728,
0.013603618368506432,
0.019700555130839348,
0.04256633669137955,
0.07442475855350494,
-0.1713005006313324,
-0.07580258697271347,
0.14970633387565613,
-0.06509901583194733,
-0.12265316396951675,
0.11653491109609604,
-0.08050192892551422,
0.06815876066684723,
0.07918455451726913,
0.16007547080516815,
0.07368943095207214,
-0.07665113359689713,
0.02364281751215458,
-0.009748673066496849,
0.030511032789945602,
-0.06656751781702042,
0.07645123451948166,
0.023808009922504425,
-0.011088239029049873,
0.031931594014167786,
-0.03572938218712807,
0.036782167851924896,
-0.09431610256433487,
-0.08854455500841141,
-0.03681464493274689,
-0.09542662650346756,
0.05960068479180336,
0.07206001877784729,
0.07265763729810715,
-0.11765731126070023,
-0.07257198542356491,
0.07150136679410934,
0.0861012265086174,
-0.055003076791763306,
0.018849531188607216,
-0.05219917744398117,
0.06374433636665344,
-0.034731317311525345,
-0.022515803575515747,
-0.17951369285583496,
-0.029770378023386,
0.014603286981582642,
0.005661679431796074,
0.032073505222797394,
0.040834296494722366,
0.05372710898518562,
0.04150041192770004,
-0.07131427526473999,
-0.011015200987458229,
-0.050375696271657944,
-0.00942130945622921,
-0.1230582743883133,
-0.19584792852401733,
-0.018969720229506493,
-0.023339437320828438,
0.11454646289348602,
-0.224257692694664,
0.03413281589746475,
-0.04092243313789368,
0.05761338770389557,
0.041867028921842575,
-0.010956901125609875,
-0.02053735964000225,
0.0860079899430275,
-0.03713130205869675,
-0.0327489897608757,
0.07592474669218063,
0.012195399962365627,
-0.10368473827838898,
-0.007822113111615181,
-0.09257585555315018,
0.19031088054180145,
0.1289455145597458,
-0.09699749946594238,
-0.0888260006904602,
0.010719056241214275,
-0.054551877081394196,
-0.03350850194692612,
-0.08110085129737854,
0.03831710293889046,
0.1832561194896698,
-0.00408615218475461,
0.1422782838344574,
-0.08589011430740356,
-0.04746617004275322,
0.027460463345050812,
-0.04416185989975929,
0.026127975434064865,
0.14056192338466644,
0.12522448599338531,
-0.0920635238289833,
0.1394202560186386,
0.14817063510417938,
-0.07915978133678436,
0.1658279448747635,
-0.03801234811544418,
-0.059139613062143326,
-0.024806562811136246,
-0.03590410575270653,
-0.011826027184724808,
0.1085469201207161,
-0.12760300934314728,
0.00472189811989665,
0.03235438093543053,
0.009446932934224606,
0.01708807982504368,
-0.23087909817695618,
-0.04802200570702553,
0.035222526639699936,
-0.040130965411663055,
-0.011457022279500961,
0.006225543096661568,
0.01636500284075737,
0.11100597679615021,
-0.00038215177482925355,
-0.061102356761693954,
0.04150799661874771,
0.007206903304904699,
-0.09109006822109222,
0.21807080507278442,
-0.0752849280834198,
-0.18252205848693848,
-0.13199250400066376,
-0.0493457093834877,
-0.04442271217703819,
-0.00279906764626503,
0.06433742493391037,
-0.07138606905937195,
-0.02895044907927513,
-0.06548784673213959,
0.00514746131375432,
-0.006640486419200897,
0.016602864488959312,
-0.018567554652690887,
0.023830769583582878,
0.03936237096786499,
-0.10331819206476212,
-0.012889090925455093,
-0.061911795288324356,
-0.040967509150505066,
0.053883109241724014,
0.04405555874109268,
0.10898144543170929,
0.14961715042591095,
-0.025291262194514275,
-0.003893762594088912,
-0.03315175324678421,
0.21485087275505066,
-0.08689753711223602,
-0.04712153226137161,
0.13125620782375336,
-0.009326517581939697,
0.03263324499130249,
0.1212800070643425,
0.0720895454287529,
-0.09237991273403168,
0.017520809546113014,
0.02917098067700863,
-0.03997639939188957,
-0.27003076672554016,
-0.03821174427866936,
-0.053288307040929794,
0.0005041555850766599,
0.07316083461046219,
0.026278546079993248,
0.005705300718545914,
0.06592023372650146,
0.04250522330403328,
0.0648341029882431,
-0.02982121892273426,
0.06391338258981705,
0.1108853667974472,
0.03844940662384033,
0.13148561120033264,
-0.05558411031961441,
-0.06147214397788048,
0.05758168175816536,
-0.00863972119987011,
0.24782785773277283,
0.011279144324362278,
0.1309511810541153,
0.07623305916786194,
0.12350870668888092,
0.017918558791279793,
0.05768585205078125,
0.018591217696666718,
-0.03858204931020737,
-0.019616344943642616,
-0.025811797007918358,
-0.029816756024956703,
0.0286216102540493,
-0.04727308079600334,
0.048704832792282104,
-0.13749583065509796,
-0.01498402375727892,
0.06358642131090164,
0.23906491696834564,
0.016769928857684135,
-0.30908310413360596,
-0.10424860566854477,
0.010606772266328335,
-0.05240930989384651,
-0.009383879601955414,
0.026137301698327065,
0.10281414538621902,
-0.12598705291748047,
0.03643062710762024,
-0.08053163439035416,
0.09221653640270233,
-0.0863085463643074,
0.04050378501415253,
0.0738224908709526,
0.0681130588054657,
-0.003933573141694069,
0.07893651723861694,
-0.307219922542572,
0.2819614112377167,
-0.005618869327008724,
0.060745105147361755,
-0.06372545659542084,
-0.025851668789982796,
0.023402828723192215,
0.05463678762316704,
0.06036457046866417,
-0.005185297690331936,
-0.05821243301033974,
-0.17296744883060455,
-0.029245417565107346,
0.025523608550429344,
0.07566779851913452,
-0.01468990370631218,
0.08854345232248306,
-0.0285579115152359,
0.004089497961103916,
0.05787508934736252,
-0.027434229850769043,
-0.05153360217809677,
-0.09460210800170898,
-0.004334294702857733,
0.020693570375442505,
-0.05909181386232376,
-0.06367843598127365,
-0.13336031138896942,
-0.08024092018604279,
0.13815522193908691,
-0.014427115209400654,
-0.04591428115963936,
-0.09696020931005478,
0.07496039569377899,
0.06935662031173706,
-0.0799306333065033,
0.03762155771255493,
0.014699560590088367,
0.0846717432141304,
0.024481261149048805,
-0.047440964728593826,
0.09554848819971085,
-0.05173030123114586,
-0.1872195154428482,
-0.0632166862487793,
0.11352117359638214,
0.028094131499528885,
0.06719598174095154,
-0.023858340457081795,
0.0004107730055693537,
-0.04823746904730797,
-0.08825484663248062,
0.02258949913084507,
0.007237046025693417,
0.08538832515478134,
0.04420587047934532,
-0.06016400828957558,
0.003088439116254449,
-0.0743371769785881,
-0.05789945647120476,
0.20305874943733215,
0.20633313059806824,
-0.09303376823663712,
0.032080233097076416,
0.01414012722671032,
-0.08177021145820618,
-0.17220793664455414,
0.03629900887608528,
0.07108122855424881,
0.012489903718233109,
0.05826587229967117,
-0.15110467374324799,
0.11386826634407043,
0.09753286093473434,
-0.008590045385062695,
0.13361698389053345,
-0.323248952627182,
-0.13557180762290955,
0.09210297465324402,
0.15564033389091492,
0.12722596526145935,
-0.13530485332012177,
-0.012024758383631706,
-0.029694128781557083,
-0.12655147910118103,
0.13825254142284393,
-0.08200353384017944,
0.14067378640174866,
-0.03298668563365936,
0.10618506371974945,
0.0052995807491242886,
-0.05460384488105774,
0.11506109684705734,
0.01607188954949379,
0.10979824513196945,
-0.05073171481490135,
-0.046968698501586914,
0.018168210983276367,
-0.03173650801181793,
0.017488637939095497,
-0.07388205081224442,
0.019537346437573433,
-0.09553373605012894,
-0.037904515862464905,
-0.07616972178220749,
0.03510139882564545,
-0.04053482040762901,
-0.05432239547371864,
-0.04073890298604965,
0.035612355917692184,
0.02205091342329979,
-0.017490994185209274,
0.14471615850925446,
0.005916844122111797,
0.14710642397403717,
0.06948163360357285,
0.09639938920736313,
-0.05343913659453392,
-0.09279846400022507,
-0.03582580387592316,
-0.021688245236873627,
0.049793485552072525,
-0.15473158657550812,
0.02326696179807186,
0.14285890758037567,
0.012413830496370792,
0.15901656448841095,
0.07501823455095291,
-0.028941627591848373,
0.015591477043926716,
0.06824849545955658,
-0.15109407901763916,
-0.0993746891617775,
-0.015658222138881683,
-0.09098188579082489,
-0.11272766441106796,
0.04547811672091484,
0.11424396187067032,
-0.06779132783412933,
-0.027168378233909607,
-0.013252581469714642,
0.009434499777853489,
-0.04961276799440384,
0.19228704273700714,
0.0712907612323761,
0.049355633556842804,
-0.10086462646722794,
0.08726470172405243,
0.05299781262874603,
-0.07277260720729828,
0.009131514467298985,
0.07398980855941772,
-0.0851946696639061,
-0.06054844334721565,
0.06302937865257263,
0.1840636432170868,
-0.06436847895383835,
-0.05052271485328674,
-0.14428043365478516,
-0.12239868193864822,
0.08020304143428802,
0.15456198155879974,
0.1154261901974678,
0.01174027007073164,
-0.04472504183650017,
-0.009678967297077179,
-0.10332822054624557,
0.10373563319444656,
0.06035935878753662,
0.06799294799566269,
-0.15564770996570587,
0.11893093585968018,
0.0298626646399498,
0.0544048435986042,
-0.021874960511922836,
0.03503105044364929,
-0.11320466548204422,
0.016281502321362495,
-0.11635188013315201,
-0.004599275998771191,
-0.01955498568713665,
0.0156586654484272,
0.00008569054625695571,
-0.056630246341228485,
-0.06948243826627731,
0.011811119504272938,
-0.12271115183830261,
-0.015396937727928162,
0.041357602924108505,
0.07619098573923111,
-0.08720040321350098,
-0.03770965710282326,
0.024497678503394127,
-0.04467649757862091,
0.07077261805534363,
0.04765259474515915,
0.00999519880861044,
0.0638277679681778,
-0.1326751559972763,
0.03493008390069008,
0.05847730115056038,
0.016229216009378433,
0.048695411533117294,
-0.1218823567032814,
0.00844301376491785,
0.004147431813180447,
0.07234194129705429,
0.02527628093957901,
0.06878162175416946,
-0.1595860719680786,
-0.003925286699086428,
-0.011753080412745476,
-0.08088759332895279,
-0.0604778528213501,
0.02060185931622982,
0.06034849211573601,
0.033461686223745346,
0.21250495314598083,
-0.08307280391454697,
0.04318675398826599,
-0.19975832104682922,
0.00521842809394002,
-0.01949070766568184,
-0.1242818534374237,
-0.12428144365549088,
-0.0736192986369133,
0.05655497685074806,
-0.0671464130282402,
0.1680191457271576,
0.04778936877846718,
0.05581874027848244,
0.02484714426100254,
-0.020287757739424706,
-0.0074821035377681255,
0.016732243821024895,
0.17049984633922577,
0.007073113229125738,
-0.04048845171928406,
0.0606084018945694,
0.047959793359041214,
0.1063975840806961,
0.10674457252025604,
0.20010076463222504,
0.1684790700674057,
0.009575174190104008,
0.08692093193531036,
0.03743763640522957,
-0.03279959410429001,
-0.13300663232803345,
0.03713468834757805,
-0.025708554312586784,
0.11290872097015381,
-0.026694100350141525,
0.20042958855628967,
0.07072245329618454,
-0.16473351418972015,
0.04714856669306755,
-0.05892984941601753,
-0.08779802173376083,
-0.11389470845460892,
-0.055804088711738586,
-0.09887007623910904,
-0.1443217545747757,
0.005623009521514177,
-0.130331888794899,
-0.001939242472872138,
0.09170602262020111,
0.007379705086350441,
-0.04041507467627525,
0.11972035467624664,
0.02042819932103157,
0.011828257702291012,
0.08732693642377853,
0.013573730364441872,
-0.03270769864320755,
-0.10997237265110016,
-0.04921284690499306,
-0.03101533092558384,
-0.025611599907279015,
0.023357538506388664,
-0.05341451242566109,
-0.06802772730588913,
0.024218278005719185,
-0.026913153007626534,
-0.10152031481266022,
0.014489524997770786,
0.02225584164261818,
0.07951844483613968,
0.03816826641559601,
0.015252734534442425,
0.008539740927517414,
-0.0018916655099019408,
0.2537987232208252,
-0.06090321019291878,
-0.059095606207847595,
-0.12073633074760437,
0.23759934306144714,
0.04082411155104637,
-0.027152735739946365,
0.0369359627366066,
-0.0620994009077549,
0.004789397120475769,
0.250545471906662,
0.23370525240898132,
-0.07233811914920807,
-0.008881565183401108,
0.016480514779686928,
-0.005681920796632767,
-0.014903892762959003,
0.12409383058547974,
0.11327847838401794,
0.043661732226610184,
-0.07554518431425095,
-0.03618474677205086,
-0.053929403424263,
0.002410672837868333,
-0.017594728618860245,
0.06780397146940231,
0.05220600590109825,
0.005234327167272568,
-0.041317231953144073,
0.0750744640827179,
-0.08238773792982101,
-0.11706630140542984,
0.04748406261205673,
-0.2140689343214035,
-0.17265373468399048,
-0.01564285345375538,
0.09141164273023605,
-0.0005080309347249568,
0.06623675674200058,
-0.025556398555636406,
-0.014778113923966885,
0.07295584678649902,
-0.016154099255800247,
-0.1069135069847107,
-0.08071832358837128,
0.09760671108961105,
-0.1033845841884613,
0.18947070837020874,
-0.05197722837328911,
0.05551624298095703,
0.12156101316213608,
0.06087696552276611,
-0.06552910804748535,
0.07936710119247437,
0.036825064569711685,
-0.040335942059755325,
0.04746859520673752,
0.10013407468795776,
-0.03197331726551056,
0.07261445373296738,
0.05393337458372116,
-0.12573927640914917,
0.016867447644472122,
-0.0939512848854065,
-0.04653635248541832,
-0.056750234216451645,
-0.011542480438947678,
-0.07443743944168091,
0.12872548401355743,
0.23667973279953003,
-0.03721931204199791,
-0.007397593930363655,
-0.05932502821087837,
0.02578439563512802,
0.06336025893688202,
0.041056301444768906,
-0.047882936894893646,
-0.22828209400177002,
0.009885349310934544,
0.07289337366819382,
-0.015281859785318375,
-0.26788604259490967,
-0.070579893887043,
0.0017346341628581285,
-0.07060904800891876,
-0.07644132524728775,
0.08083239942789078,
0.07705751806497574,
0.044927142560482025,
-0.06221795082092285,
-0.06259375810623169,
-0.06772700697183609,
0.1547669768333435,
-0.15244202315807343,
-0.0954475924372673
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# andreiliphdpr/bert-base-multilingual-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-multilingual-uncased](https://huggingface.co/bert-base-multilingual-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0423
- Train Accuracy: 0.9869
- Validation Loss: 0.0303
- Validation Accuracy: 0.9913
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 43750, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
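The serialized optimizer dictionary above is hard to read at a glance, so the sketch below rebuilds the same schedule and optimizer with the TensorFlow 2.x Keras API. Only the values listed in the dictionary are used; how the optimizer is attached to a model is left out.

```python
# Rebuild the optimizer described by the config dict above (TF 2.6-style API).
import tensorflow as tf

lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=43750,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,  # decays linearly to 0 over 43750 steps (power=1.0)
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```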
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0423 | 0.9869 | 0.0303 | 0.9913 | 0 |
### Framework versions
- Transformers 4.15.0.dev0
- TensorFlow 2.6.2
- Datasets 1.15.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "andreiliphdpr/bert-base-multilingual-uncased-finetuned-cola", "results": []}]}
|
text-classification
|
andreiliphdpr/bert-base-multilingual-uncased-finetuned-cola
|
[
"transformers",
"tf",
"bert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #tf #bert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
andreiliphdpr/bert-base-multilingual-uncased-finetuned-cola
===========================================================
This model is a fine-tuned version of bert-base-multilingual-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.0423
* Train Accuracy: 0.9869
* Validation Loss: 0.0303
* Validation Accuracy: 0.9913
* Epoch: 0
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 43750, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.15.0.dev0
* TensorFlow 2.6.2
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 43750, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.2\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #tf #bert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 43750, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.2\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
54,
178,
4,
34
] |
[
"passage: TAGS\n#transformers #tf #bert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 43750, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.2\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.07604354619979858,
0.04004450514912605,
-0.004855964332818985,
0.07535058259963989,
0.1570156365633011,
0.05533894896507263,
0.12335391342639923,
0.11806740611791611,
-0.08734147250652313,
0.1044764295220375,
0.15147076547145844,
0.14114727079868317,
0.06838317215442657,
0.11619718372821808,
-0.07617706805467606,
-0.14641806483268738,
0.07352844625711441,
-0.024033086374402046,
-0.08165863156318665,
0.08267858624458313,
0.0854148268699646,
-0.06551729142665863,
0.08561220020055771,
-0.016603903844952583,
-0.10714992880821228,
0.03358760103583336,
0.08991540223360062,
-0.06877151876688004,
0.12221787869930267,
0.07386856526136398,
0.08186998218297958,
-0.021304475143551826,
0.013845540583133698,
-0.19608968496322632,
0.00993469450622797,
0.1105644553899765,
-0.002310708863660693,
0.06789813935756683,
0.017976053059101105,
-0.000811175093986094,
0.09949468821287155,
-0.09758460521697998,
0.06006024777889252,
0.04393984004855156,
-0.12966233491897583,
-0.2620150148868561,
-0.12073546648025513,
0.023061545565724373,
0.07650593668222427,
0.0965787023305893,
0.0005667535006068647,
0.1389661431312561,
-0.04527023807168007,
0.08789300918579102,
0.14940953254699707,
-0.2773582637310028,
-0.037163328379392624,
0.033368080854415894,
0.012309801764786243,
0.06438956409692764,
-0.04722139239311218,
0.008968153037130833,
0.03151431679725647,
0.042871128767728806,
0.030156370252370834,
-0.014962472021579742,
0.013098192401230335,
-0.037350933998823166,
-0.07487091422080994,
-0.07249516993761063,
0.12531481683254242,
0.03929222747683525,
-0.0769432932138443,
-0.06376904249191284,
-0.023607194423675537,
-0.14878904819488525,
0.01048215851187706,
-0.025156881660223007,
0.0012753907358273864,
0.01968318596482277,
-0.033500444144010544,
-0.03265063464641571,
-0.055018048733472824,
-0.06558734178543091,
-0.007083766628056765,
0.1311214566230774,
0.02973947860300541,
0.04461245238780975,
-0.011519726365804672,
0.06716479361057281,
-0.030777888372540474,
-0.10383189469575882,
-0.02179054357111454,
-0.0018984859343618155,
-0.04864130914211273,
-0.021537721157073975,
-0.07961749285459518,
-0.02030433528125286,
0.05572369694709778,
0.15041546523571014,
-0.06215686723589897,
0.11803188174962997,
-0.02017299085855484,
0.02294073812663555,
-0.11985484510660172,
0.12224483489990234,
-0.02155117131769657,
0.006739196367561817,
0.00025449253735132515,
0.08153312653303146,
0.03593133017420769,
-0.047000445425510406,
-0.050760816782712936,
0.006930394098162651,
0.10291526466608047,
0.03423026204109192,
-0.05823007971048355,
0.08447454869747162,
-0.0783843919634819,
-0.00318393437191844,
-0.03349801525473595,
-0.11749120056629181,
0.04289548844099045,
0.020504450425505638,
-0.09205406904220581,
0.004519828595221043,
0.0795428454875946,
0.007584364153444767,
-0.05304846540093422,
0.03990470990538597,
-0.06104334816336632,
-0.04470815137028694,
-0.10958245396614075,
-0.11801827698945999,
0.0228402279317379,
-0.07066377252340317,
-0.012699710205197334,
-0.07444752007722855,
-0.17709225416183472,
-0.04004163667559624,
0.0747729018330574,
-0.03169970214366913,
-0.019439740106463432,
-0.06678032875061035,
-0.15521982312202454,
0.062485624104738235,
-0.00016420263273175806,
0.12851972877979279,
-0.045807916671037674,
0.08256816118955612,
0.004181011114269495,
0.04009903594851494,
-0.051806360483169556,
0.0393509678542614,
-0.03185616806149483,
0.032680049538612366,
-0.1812210977077484,
0.07962477207183838,
-0.07581908255815506,
0.03391289338469505,
-0.15205132961273193,
-0.07603359967470169,
0.05062207207083702,
0.022793469950556755,
0.12553763389587402,
0.10212786495685577,
-0.13661326467990875,
-0.06499016284942627,
0.09658318012952805,
-0.08781414479017258,
-0.0917821079492569,
0.08562012016773224,
-0.05564966797828674,
0.031541675329208374,
0.07928842306137085,
0.07400143146514893,
0.028364745900034904,
-0.10869817435741425,
0.016962526366114616,
-0.055974215269088745,
0.01353627722710371,
0.033990003168582916,
0.03366114944219589,
-0.04538455232977867,
-0.11502443253993988,
0.013602074235677719,
-0.024410175159573555,
0.019882366061210632,
-0.05728639289736748,
-0.058350373059511185,
-0.03289623185992241,
-0.055530451238155365,
0.03004360944032669,
0.025030098855495453,
0.03254194185137749,
-0.09996996074914932,
-0.13924244046211243,
0.05663926899433136,
0.053059015423059464,
-0.058960072696208954,
0.025531398132443428,
-0.08154994249343872,
0.02591938152909279,
0.04913562536239624,
0.015427066944539547,
-0.16006267070770264,
-0.03935461863875389,
0.01941581256687641,
-0.004872950725257397,
0.021090464666485786,
-0.047839198261499405,
0.06828907132148743,
0.019643524661660194,
-0.05093427747488022,
-0.0071347919292747974,
-0.03339146450161934,
0.01618598774075508,
-0.07973432540893555,
-0.22785580158233643,
-0.01467589009553194,
-0.00978896114975214,
0.0779854953289032,
-0.2852429151535034,
0.01430017501115799,
0.06817178428173065,
0.10258768498897552,
0.02091982401907444,
-0.019715122878551483,
-0.040324702858924866,
0.05349591001868248,
-0.01918378844857216,
-0.06199502944946289,
0.028030848130583763,
0.02346423827111721,
-0.13109242916107178,
-0.04098575562238693,
-0.17703606188297272,
0.09348520636558533,
0.12643803656101227,
-0.09328935295343399,
-0.1376393437385559,
0.04795146733522415,
-0.018609685823321342,
-0.03306030109524727,
-0.013089261949062347,
-0.025638021528720856,
0.17456184327602386,
0.02942054718732834,
0.1393793821334839,
-0.04565827175974846,
-0.004311702214181423,
0.03429282829165459,
-0.02152232453227043,
-0.0334579162299633,
0.1278318613767624,
-0.00773880397900939,
-0.08859597146511078,
0.08552662283182144,
0.09926414489746094,
-0.10175111144781113,
0.10548476129770279,
-0.044203490018844604,
-0.050576478242874146,
-0.0709543526172638,
0.05299225449562073,
0.06234383583068848,
0.08092468976974487,
-0.09684323519468307,
0.009629899635910988,
0.010028651915490627,
0.02824103832244873,
-0.021737627685070038,
-0.19314059615135193,
0.0017620291328057647,
0.010956687852740288,
-0.05595461651682854,
0.01113477349281311,
-0.008739102631807327,
0.01690738834440708,
0.12105909734964371,
0.03409339860081673,
-0.03740118443965912,
0.07985645532608032,
-0.028255870565772057,
-0.09688255190849304,
0.22597360610961914,
-0.14466844499111176,
-0.12804804742336273,
-0.14245419204235077,
-0.02479696087539196,
-0.042438164353370667,
0.003238470759242773,
-0.003104757284745574,
-0.0993850976228714,
-0.06183692440390587,
-0.06579182296991348,
-0.019721506163477898,
-0.039739638566970825,
0.033648017793893814,
0.03955758363008499,
-0.0030981528107076883,
0.14093324542045593,
-0.10852077603340149,
-0.04173103719949722,
-0.0012204537633806467,
-0.08123774826526642,
0.015432767570018768,
-0.0052390568889677525,
0.0018036658875644207,
0.10852131247520447,
0.0015193764120340347,
0.023525295779109,
-0.046731479465961456,
0.2320813238620758,
-0.052623599767684937,
-0.017823103815317154,
0.14541587233543396,
-0.017839614301919937,
0.06662338972091675,
0.12415116280317307,
0.0437413789331913,
-0.12193699181079865,
0.058340709656476974,
0.06165708601474762,
-0.017190590500831604,
-0.24715325236320496,
-0.0022587503772228956,
-0.031605061143636703,
-0.08990567922592163,
0.06897079199552536,
0.029301302507519722,
0.14301197230815887,
0.017973776906728745,
-0.0006617592298425734,
0.11366930603981018,
0.05007120966911316,
0.06911671906709671,
0.14969824254512787,
0.06113835796713829,
0.0900401622056961,
-0.03361506760120392,
0.010958344675600529,
0.034093890339136124,
-0.017776748165488243,
0.21014562249183655,
0.021159987896680832,
0.07082808017730713,
0.06302106380462646,
0.08605454117059708,
-0.03692258521914482,
0.0031125580426305532,
0.0031827744096517563,
-0.0028676032088696957,
0.006934436038136482,
-0.06275937706232071,
-0.060208819806575775,
0.05629853904247284,
-0.04254794120788574,
0.08222100138664246,
-0.12384926527738571,
0.019079964607954025,
0.04554875195026398,
0.24227647483348846,
0.08282168209552765,
-0.2956824004650116,
-0.11736009269952774,
0.004843544214963913,
-0.029744576662778854,
-0.060369379818439484,
-0.005246786400675774,
0.07941120862960815,
-0.08834081888198853,
0.07253380864858627,
-0.06083134934306145,
0.05257635936141014,
-0.03811278194189072,
0.04617715999484062,
0.11397749930620193,
0.09906598925590515,
0.002472387393936515,
0.01978406496345997,
-0.35736697912216187,
0.2866699695587158,
0.039590202271938324,
0.15513330698013306,
-0.08148738741874695,
0.04790038987994194,
0.03899963200092316,
-0.03520670533180237,
0.05696055665612221,
-0.006444950122386217,
-0.14155583083629608,
-0.23135673999786377,
-0.03230297565460205,
-0.0006347219459712505,
0.14735685288906097,
0.014698777347803116,
0.10569225996732712,
-0.056716933846473694,
0.022161945700645447,
0.07745598256587982,
-0.021547719836235046,
-0.16496066749095917,
-0.06693603098392487,
0.04970187321305275,
0.051850803196430206,
-0.009873712435364723,
-0.0871572345495224,
-0.07617473602294922,
-0.06076623499393463,
0.17425237596035004,
-0.1623634546995163,
-0.0421607568860054,
-0.13316962122917175,
0.07992509007453918,
0.087840735912323,
-0.06326038390398026,
0.0381888709962368,
-0.0037632673047482967,
0.062020160257816315,
0.04024217650294304,
-0.081662118434906,
0.14022096991539001,
-0.025918830186128616,
-0.23128339648246765,
-0.06286680698394775,
0.09907642751932144,
0.05452476069331169,
0.04203936830163002,
-0.009594287723302841,
0.0712294727563858,
0.03997508063912392,
-0.10522294789552689,
0.08644029498100281,
0.02772989682853222,
0.0509389191865921,
0.07440590858459473,
-0.019436225295066833,
-0.006554835941642523,
-0.046093590557575226,
-0.01799275353550911,
0.09876672178506851,
0.29941943287849426,
-0.06878924369812012,
0.01758437603712082,
0.020324714481830597,
-0.09031124413013458,
-0.20032501220703125,
0.0804174542427063,
0.10268229246139526,
0.01674913801252842,
-0.056862834841012955,
-0.19149935245513916,
0.045758478343486786,
0.09229099750518799,
-0.014819069765508175,
0.05975012108683586,
-0.29032275080680847,
-0.15027214586734772,
0.06828600913286209,
0.1343095302581787,
0.1306988000869751,
-0.17117461562156677,
-0.04722800850868225,
-0.057670578360557556,
-0.03981974720954895,
0.14164814352989197,
-0.058595411479473114,
0.10821007937192917,
0.02737741358578205,
0.05150434374809265,
0.008796297013759613,
-0.040786344558000565,
0.143533393740654,
-0.044377218931913376,
0.10017498582601547,
-0.04893263429403305,
-0.05357201397418976,
0.07636035978794098,
-0.08093875646591187,
0.03014446794986725,
-0.049008823931217194,
0.0289746206253767,
-0.1430375874042511,
0.0010429826797917485,
-0.0809791311621666,
0.042545001953840256,
-0.06484068185091019,
-0.01659516990184784,
-0.011311136186122894,
0.06147895008325577,
0.06692781299352646,
-0.020382743328809738,
0.11076144129037857,
-0.019618671387434006,
0.16946589946746826,
0.1451271027326584,
0.08137113600969315,
0.013960320502519608,
-0.01361884642392397,
0.08526451140642166,
-0.02879086323082447,
0.07549339532852173,
-0.15650013089179993,
0.051679279655218124,
0.1316744089126587,
-0.007664077449589968,
0.16223137080669403,
0.06536892056465149,
-0.07587295025587082,
0.028754981234669685,
0.05481419712305069,
-0.13030821084976196,
-0.08956366032361984,
0.021236008033156395,
0.05078340321779251,
-0.0933580994606018,
0.026722464710474014,
0.1500921994447708,
-0.040318284183740616,
0.021252553910017014,
0.0035019416827708483,
0.044411368668079376,
-0.0713832750916481,
0.1372288167476654,
0.02534930221736431,
0.0636586993932724,
-0.07022670656442642,
0.13201501965522766,
0.05655007064342499,
-0.11085642129182816,
0.11452359706163406,
0.023382075130939484,
-0.06500349193811417,
-0.001460923464037478,
0.03190351650118828,
0.0887800082564354,
0.03537564352154732,
-0.06877660751342773,
-0.11764845997095108,
-0.16664175689220428,
0.07353414595127106,
0.19814041256904602,
0.04301408305764198,
0.06643804162740707,
-0.042590029537677765,
-0.0050694746896624565,
-0.08865378051996231,
0.0545399971306324,
0.0472671315073967,
0.030537355691194534,
-0.14012844860553741,
0.18376798927783966,
0.0014576775720342994,
-0.013497693464159966,
-0.0068137929774820805,
0.007690319325774908,
-0.20128782093524933,
0.00697154738008976,
-0.1484145075082779,
0.008897939696907997,
0.02339988574385643,
-0.012696045450866222,
0.03781593590974808,
-0.06229887902736664,
-0.05453860014677048,
0.03882545977830887,
-0.09069465100765228,
-0.045810356736183167,
0.055744342505931854,
0.06372568756341934,
-0.11707762628793716,
-0.07781826704740524,
0.0391683466732502,
-0.10311362147331238,
0.03648018464446068,
0.05854661390185356,
0.0063970135524868965,
0.03242797404527664,
-0.1079181432723999,
0.037150442600250244,
0.03911622613668442,
-0.002500956179574132,
0.03771447762846947,
-0.16728538274765015,
0.01545899361371994,
-0.031750332564115524,
0.039258431643247604,
0.0388539582490921,
0.10375171154737473,
-0.09528869390487671,
-0.03314268961548805,
-0.010203775018453598,
-0.02763419784605503,
-0.05496792495250702,
0.06598405539989471,
0.14409580826759338,
-0.014908294193446636,
0.17377246916294098,
-0.13017050921916962,
0.008282207883894444,
-0.17972441017627716,
0.009430967271327972,
-0.002881365828216076,
-0.09129374474287033,
-0.10544459521770477,
-0.02111542597413063,
0.11030951887369156,
-0.08180011808872223,
0.08670371770858765,
-0.0612356960773468,
0.08899831771850586,
0.04370829463005066,
-0.09266878664493561,
-0.05633258447051048,
0.06392328441143036,
0.17221295833587646,
0.046912819147109985,
-0.01405514869838953,
0.06268346309661865,
-0.0221415963023901,
0.08953895419836044,
0.10983467102050781,
0.20544156432151794,
0.13792462646961212,
0.03000873327255249,
0.11926544457674026,
0.05926317349076271,
-0.07751438766717911,
-0.07550813257694244,
0.10657522082328796,
-0.07727477699518204,
0.15325725078582764,
-0.06083892285823822,
0.12520934641361237,
0.05765163525938988,
-0.18485531210899353,
0.02267231047153473,
-0.08716076612472534,
-0.10419241338968277,
-0.12494392693042755,
-0.09887450933456421,
-0.07673215866088867,
-0.09468719363212585,
0.00007798698061378673,
-0.09947533905506134,
0.0341622419655323,
0.07214158028364182,
0.027663975954055786,
-0.0005471697659231722,
0.08125882595777512,
-0.05879838392138481,
0.028440158814191818,
0.10627990961074829,
-0.00818169116973877,
-0.01357510406523943,
-0.03966832160949707,
-0.0943431407213211,
0.059168845415115356,
-0.013878033496439457,
0.05007651448249817,
0.005999011918902397,
-0.01785774901509285,
0.042905330657958984,
-0.00926223024725914,
-0.09862003475427628,
0.05371581017971039,
0.022506097331643105,
0.02219892479479313,
0.07271252572536469,
0.044033706188201904,
-0.004984842613339424,
-0.02071317285299301,
0.14431239664554596,
-0.10384654998779297,
-0.026291539892554283,
-0.14573313295841217,
0.2697571814060211,
-0.0202921312302351,
0.03845958784222603,
0.0007889542612247169,
-0.07214446365833282,
-0.03653344139456749,
0.1888655424118042,
0.13509882986545563,
-0.07830430567264557,
-0.026165079325437546,
0.053102046251297,
-0.013163874857127666,
-0.03859217092394829,
0.14458584785461426,
0.07542961835861206,
-0.03592032194137573,
-0.04859170317649841,
-0.025086645036935806,
-0.016383150592446327,
-0.015484845265746117,
-0.031502123922109604,
0.07781076431274414,
0.01073560118675232,
-0.008578506298363209,
-0.014210938476026058,
0.05834640935063362,
-0.05445961281657219,
-0.14369596540927887,
0.08545885235071182,
-0.1904020607471466,
-0.15636901557445526,
-0.00423542782664299,
0.006500741932541132,
-0.006211619358509779,
0.06757977604866028,
-0.010124830529093742,
0.0025311571080237627,
0.10057304799556732,
-0.03877801448106766,
-0.009855229407548904,
-0.12466248124837875,
0.06023101881146431,
-0.029032675549387932,
0.18866881728172302,
-0.012192712165415287,
0.058833807706832886,
0.13803908228874207,
0.022784994915127754,
-0.08534558117389679,
0.03560633957386017,
0.07614405453205109,
-0.09851831197738647,
-0.009074695408344269,
0.08036595582962036,
-0.03000980243086815,
0.1475282907485962,
0.059249211102724075,
-0.08920064568519592,
0.044772177934646606,
-0.09731946140527725,
-0.097465880215168,
-0.037283338606357574,
-0.047667764127254486,
-0.07447955012321472,
0.13273115456104279,
0.245256707072258,
-0.043245043605566025,
0.010304230265319347,
-0.03388388454914093,
-0.0030875850934535265,
0.04923320561647415,
0.027333499863743782,
-0.05678637698292732,
-0.23703518509864807,
0.071747787296772,
0.051359664648771286,
0.04386620223522186,
-0.17634208500385284,
-0.09500159323215485,
0.012889081612229347,
-0.01783173903822899,
-0.0966467559337616,
0.10669451951980591,
0.031704455614089966,
0.049503568559885025,
-0.05349695309996605,
-0.14301671087741852,
-0.04321422427892685,
0.1946382373571396,
-0.09130127727985382,
-0.08506778627634048
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# andreiliphdpr/distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0015
- Train Accuracy: 0.9995
- Validation Loss: 0.0570
- Validation Accuracy: 0.9915
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 43750, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.0399 | 0.9870 | 0.0281 | 0.9908 | 0 |
| 0.0182 | 0.9944 | 0.0326 | 0.9901 | 1 |
| 0.0089 | 0.9971 | 0.0396 | 0.9912 | 2 |
| 0.0040 | 0.9987 | 0.0486 | 0.9918 | 3 |
| 0.0015 | 0.9995 | 0.0570 | 0.9915 | 4 |
### Framework versions
- Transformers 4.15.0.dev0
- TensorFlow 2.6.2
- Datasets 1.15.1
- Tokenizers 0.10.3
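Since this checkpoint was trained through the Keras callback, a TensorFlow-side inference sketch is the most natural complement to the card. It is an assumption that the tokenizer was uploaded with the model and that the head is a standard two-class classifier; neither is documented above.

```python
# Hedged TensorFlow inference sketch (not from the original card). Assumes the
# repo contains TF weights and a tokenizer; label meanings are undocumented.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

model_id = "andreiliphdpr/distilbert-base-uncased-finetuned-cola"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("This is an example sentence.", return_tensors="tf")
probs = tf.nn.softmax(model(**inputs).logits, axis=-1)
print(probs.numpy())
```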
|
{"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "model-index": [{"name": "andreiliphdpr/distilbert-base-uncased-finetuned-cola", "results": []}]}
|
text-classification
|
andreiliphdpr/distilbert-base-uncased-finetuned-cola
|
[
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
andreiliphdpr/distilbert-base-uncased-finetuned-cola
====================================================
This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset.
It achieves the following results on the evaluation set:
* Train Loss: 0.0015
* Train Accuracy: 0.9995
* Validation Loss: 0.0570
* Validation Accuracy: 0.9915
* Epoch: 4
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* optimizer: {'name': 'Adam', 'learning\_rate': {'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 43750, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
* training\_precision: float32
### Training results
### Framework versions
* Transformers 4.15.0.dev0
* TensorFlow 2.6.2
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 43750, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.2\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 43750, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.2\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
56,
178,
4,
34
] |
[
"passage: TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'learning\\_rate': {'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 43750, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.15.0.dev0\n* TensorFlow 2.6.2\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.07470399886369705,
0.05361977219581604,
-0.005306803621351719,
0.07277624309062958,
0.14869673550128937,
0.0520809143781662,
0.12685774266719818,
0.12341102212667465,
-0.0877026617527008,
0.10561789572238922,
0.14896973967552185,
0.1407383531332016,
0.059846702963113785,
0.11960646510124207,
-0.0792069062590599,
-0.15124525129795074,
0.07255341112613678,
-0.023611722514033318,
-0.07530978322029114,
0.08256996423006058,
0.08681702613830566,
-0.06593990325927734,
0.0884184017777443,
-0.016957109794020653,
-0.11077683418989182,
0.034442342817783356,
0.0867237076163292,
-0.07540891319513321,
0.12110964953899384,
0.06966500729322433,
0.08458662778139114,
-0.01586451195180416,
0.014952250756323338,
-0.1916072964668274,
0.009581568650901318,
0.11546767503023148,
-0.00036362968967296183,
0.06955467909574509,
0.014438637532293797,
-0.01241745613515377,
0.09523700177669525,
-0.10263975709676743,
0.060586679726839066,
0.04251614212989807,
-0.1332281231880188,
-0.26822173595428467,
-0.11771702021360397,
0.024430684745311737,
0.0842369794845581,
0.09278234094381332,
0.004848302807658911,
0.13687501847743988,
-0.03981255739927292,
0.0884346291422844,
0.14555121958255768,
-0.2738858461380005,
-0.034452520310878754,
0.03416474536061287,
0.013926437124609947,
0.058111242949962616,
-0.047397688031196594,
0.00691781984642148,
0.030384600162506104,
0.03970735892653465,
0.03739050775766373,
-0.019983217120170593,
0.030011814087629318,
-0.03914690017700195,
-0.07764989882707596,
-0.07687047868967056,
0.12825843691825867,
0.03655103221535683,
-0.07299347966909409,
-0.06900004297494888,
-0.03395720198750496,
-0.1418086290359497,
0.011736389249563217,
-0.02582949958741665,
0.0016849162057042122,
0.02256271056830883,
-0.037987153977155685,
-0.03802269697189331,
-0.05272819474339485,
-0.05741916969418526,
-0.008332407101988792,
0.12447479367256165,
0.026494888588786125,
0.04287102445960045,
-0.013190149329602718,
0.06585720926523209,
-0.02092578634619713,
-0.10607708990573883,
-0.0237022303044796,
-0.0017837040359154344,
-0.05249103531241417,
-0.022975316271185875,
-0.07427537441253662,
-0.023590853437781334,
0.05292816832661629,
0.16681520640850067,
-0.06531964987516403,
0.11712696403265,
-0.021078215911984444,
0.022657716646790504,
-0.11327260732650757,
0.12071269750595093,
-0.0154820391908288,
0.006470149848610163,
0.0031570931896567345,
0.07693713903427124,
0.044994182884693146,
-0.049638211727142334,
-0.051078397780656815,
0.011873441748321056,
0.10143395513296127,
0.035319335758686066,
-0.04623895138502121,
0.08105254173278809,
-0.07703428715467453,
-0.00580536387860775,
-0.019341493025422096,
-0.12322185188531876,
0.042903151363134384,
0.016658028587698936,
-0.08736948668956757,
0.008906030096113682,
0.0728420689702034,
0.012042413465678692,
-0.055216506123542786,
0.045471932739019394,
-0.06259874999523163,
-0.04626310244202614,
-0.10631345957517624,
-0.11727175861597061,
0.026091350242495537,
-0.07535578310489655,
-0.016712000593543053,
-0.07847655564546585,
-0.17995628714561462,
-0.038367561995983124,
0.0793062373995781,
-0.028959274291992188,
-0.023574691265821457,
-0.06825947016477585,
-0.1529788225889206,
0.06691034138202667,
-0.0034543261863291264,
0.11102095246315002,
-0.04437801614403725,
0.07595702260732651,
0.0019252069760113955,
0.040962476283311844,
-0.05445663258433342,
0.03860224783420563,
-0.037586476653814316,
0.03907375782728195,
-0.18594776093959808,
0.08819770067930222,
-0.07177485525608063,
0.029934478923678398,
-0.15741749107837677,
-0.07200537621974945,
0.048972778022289276,
0.017455745488405228,
0.12069489061832428,
0.107505202293396,
-0.13432937860488892,
-0.06637315452098846,
0.09505344182252884,
-0.09318510442972183,
-0.08014117926359177,
0.08841389417648315,
-0.05584327504038811,
0.018519068136811256,
0.07886271178722382,
0.07920504361391068,
0.0313180536031723,
-0.10768459737300873,
0.0057704453356564045,
-0.05392792820930481,
0.0039240652695298195,
0.032304830849170685,
0.03541715815663338,
-0.05100487545132637,
-0.09116818755865097,
0.007845135405659676,
-0.018310492858290672,
0.019226331263780594,
-0.049266405403614044,
-0.05844658613204956,
-0.03630014881491661,
-0.04860241711139679,
0.03930848836898804,
0.026956729590892792,
0.029794244095683098,
-0.09643591195344925,
-0.13980914652347565,
0.0699000358581543,
0.052125681191682816,
-0.061373814940452576,
0.030814258381724358,
-0.08851709216833115,
0.02305331453680992,
0.043922096490859985,
0.013623317703604698,
-0.16381892561912537,
-0.04922002553939819,
0.021752020344138145,
-0.0071260761469602585,
0.016990436241030693,
-0.05358535051345825,
0.06948816031217575,
0.016778625547885895,
-0.04976808652281761,
-0.014856056310236454,
-0.03214205428957939,
0.015284254215657711,
-0.07513412088155746,
-0.22388005256652832,
-0.018653707578778267,
-0.013843334279954433,
0.0705270916223526,
-0.26588407158851624,
0.015956318005919456,
0.0820138156414032,
0.1145436018705368,
0.029676245525479317,
-0.02296854369342327,
-0.0345798097550869,
0.057070620357990265,
-0.02250470034778118,
-0.0695258229970932,
0.030098807066679,
0.020958665758371353,
-0.1339581310749054,
-0.030531562864780426,
-0.1816764920949936,
0.08577407151460648,
0.12482917308807373,
-0.08067051321268082,
-0.12857426702976227,
0.04960300773382187,
-0.022468915209174156,
-0.03473777323961258,
-0.014467455446720123,
-0.02878977544605732,
0.17462344467639923,
0.031882066279649734,
0.13596704602241516,
-0.04383954405784607,
-0.010934408754110336,
0.03081069327890873,
-0.020190929993987083,
-0.04349585250020027,
0.11904219537973404,
-0.014149222522974014,
-0.08476387709379196,
0.08542450517416,
0.10587642341852188,
-0.09850117564201355,
0.10682107508182526,
-0.04420847445726395,
-0.049390219151973724,
-0.07283845543861389,
0.05539027974009514,
0.06632329523563385,
0.07806932181119919,
-0.08973389118909836,
0.011044230312108994,
0.007914125919342041,
0.03204205259680748,
-0.023866189643740654,
-0.18586090207099915,
0.008051440119743347,
0.008508898317813873,
-0.05776697024703026,
0.020605390891432762,
-0.0028834000695496798,
0.014435943216085434,
0.11426574736833572,
0.03532696142792702,
-0.04020709544420242,
0.08557377755641937,
-0.025411482900381088,
-0.09395240247249603,
0.21635910868644714,
-0.14792829751968384,
-0.1334538459777832,
-0.14168986678123474,
-0.02744264528155327,
-0.04044148698449135,
0.002549397526308894,
-0.0010371898533776402,
-0.09982231259346008,
-0.06202317774295807,
-0.0666702389717102,
-0.02827809378504753,
-0.04048402979969978,
0.030339457094669342,
0.04342583194375038,
-0.0016094360034912825,
0.13494190573692322,
-0.11078377068042755,
-0.040824178606271744,
0.003483556443825364,
-0.0768749788403511,
0.013200541958212852,
-0.002470871666446328,
0.000011841963896586094,
0.11864005029201508,
0.0035226449836045504,
0.024603841826319695,
-0.05016912519931793,
0.22194796800613403,
-0.05384225770831108,
-0.006748410407453775,
0.15269117057323456,
-0.022731713950634003,
0.07248924672603607,
0.12890911102294922,
0.04062303528189659,
-0.1191830039024353,
0.05788680538535118,
0.06460372358560562,
-0.02202097699046135,
-0.2431444525718689,
-0.006772556342184544,
-0.03512098267674446,
-0.07929934561252594,
0.06865032017230988,
0.030157193541526794,
0.13986219465732574,
0.012901149690151215,
-0.0037007536739110947,
0.1104208379983902,
0.050562307238578796,
0.07502562552690506,
0.16149631142616272,
0.06223171204328537,
0.09112004935741425,
-0.03574196621775627,
0.006429768167436123,
0.034164708107709885,
-0.012941332533955574,
0.21021287143230438,
0.020496858283877373,
0.0825275331735611,
0.06018068641424179,
0.07927829027175903,
-0.03176766633987427,
0.0006712687900289893,
0.0026130937039852142,
0.004335401579737663,
0.007416818290948868,
-0.06372585892677307,
-0.05586541071534157,
0.05695090815424919,
-0.03674844652414322,
0.0764390081167221,
-0.11770518869161606,
0.040365107357501984,
0.045542240142822266,
0.2550595700740814,
0.08858844637870789,
-0.29706570506095886,
-0.11443059146404266,
0.008149862289428711,
-0.028692396357655525,
-0.0555114783346653,
-0.006489580497145653,
0.08434716612100601,
-0.08653908967971802,
0.07692396640777588,
-0.06872174143791199,
0.04922477528452873,
-0.049755752086639404,
0.04100138321518898,
0.10732614248991013,
0.10199808329343796,
0.0032377426978200674,
0.009747734293341637,
-0.3569236397743225,
0.27973300218582153,
0.042484987527132034,
0.15807968378067017,
-0.08048976957798004,
0.053418636322021484,
0.03671650588512421,
-0.03521270304918289,
0.06201725825667381,
-0.0025779108982533216,
-0.16417457163333893,
-0.21158988773822784,
-0.0434279628098011,
-0.0038803676143288612,
0.1470792442560196,
0.018993113189935684,
0.10265400260686874,
-0.05437634512782097,
0.016008952632546425,
0.07497530430555344,
-0.02747572585940361,
-0.16333524882793427,
-0.06888190656900406,
0.05355699732899666,
0.06049298867583275,
-0.016933629289269447,
-0.08843731135129929,
-0.0717928409576416,
-0.04474788159132004,
0.17584006488323212,
-0.1662386655807495,
-0.04868176206946373,
-0.1368337869644165,
0.07250618934631348,
0.08879949152469635,
-0.06478063017129898,
0.04214506223797798,
-0.0042030359618365765,
0.06422670185565948,
0.037994250655174255,
-0.08699340373277664,
0.1379442662000656,
-0.027079988270998,
-0.22694522142410278,
-0.06243942305445671,
0.09691522270441055,
0.046188075095415115,
0.037244997918605804,
-0.010595145635306835,
0.07482173293828964,
0.04661024361848831,
-0.10103253275156021,
0.09363020211458206,
0.03668047860264778,
0.04975797235965729,
0.07086612284183502,
-0.015589576214551926,
-0.015771951526403427,
-0.040646664798259735,
-0.013807742856442928,
0.10040310025215149,
0.28697651624679565,
-0.06812883913516998,
0.028564760461449623,
0.02287452295422554,
-0.09124338626861572,
-0.20213046669960022,
0.07599052041769028,
0.09917053580284119,
0.014774146489799023,
-0.05898820981383324,
-0.1936713457107544,
0.04204229265451431,
0.09452756494283676,
-0.012514741159975529,
0.05513065308332443,
-0.3046952188014984,
-0.15049663186073303,
0.05973222106695175,
0.12305096536874771,
0.12522928416728973,
-0.16621999442577362,
-0.05035490170121193,
-0.05864250659942627,
-0.043964844197034836,
0.13380885124206543,
-0.056867584586143494,
0.10412587225437164,
0.02956126630306244,
0.04451797530055046,
0.005975096020847559,
-0.043379057198762894,
0.13215085864067078,
-0.034750547260046005,
0.09712710231542587,
-0.050351474434137344,
-0.05714113637804985,
0.08565225452184677,
-0.08667898178100586,
0.028139250352978706,
-0.04589414224028587,
0.030237819999456406,
-0.14241033792495728,
0.002439948497340083,
-0.07574139535427094,
0.04787975549697876,
-0.061678510159254074,
-0.02146933600306511,
-0.008925356902182102,
0.0590754896402359,
0.07042202353477478,
-0.017356516793370247,
0.12351624667644501,
-0.014873418025672436,
0.16451837122440338,
0.15097112953662872,
0.08582185953855515,
0.009000878781080246,
-0.021508391946554184,
0.08139361441135406,
-0.03140616416931152,
0.0744212418794632,
-0.16338899731636047,
0.05592624470591545,
0.13084757328033447,
-0.0034663507249206305,
0.15918661653995514,
0.0635872408747673,
-0.07435398548841476,
0.03747468441724777,
0.054709553718566895,
-0.1281919628381729,
-0.09895379096269608,
0.021536294370889664,
0.04465784877538681,
-0.08962661772966385,
0.03388670086860657,
0.15774516761302948,
-0.033754028379917145,
0.02335265465080738,
0.006116052158176899,
0.042012400925159454,
-0.0650874674320221,
0.13298054039478302,
0.021087469533085823,
0.06586389243602753,
-0.0689850002527237,
0.14106637239456177,
0.06104908511042595,
-0.11851464956998825,
0.11830557882785797,
0.026889078319072723,
-0.0674038752913475,
0.00043469519005157053,
0.015234103426337242,
0.08239377290010452,
0.026891741901636124,
-0.06555021554231644,
-0.11975576728582382,
-0.1561119556427002,
0.0760340690612793,
0.199196919798851,
0.03865707665681839,
0.07076361775398254,
-0.04323003068566322,
-0.007575625088065863,
-0.09283273667097092,
0.058713801205158234,
0.04555721953511238,
0.0329708494246006,
-0.14505454897880554,
0.18000103533267975,
0.0015220254426822066,
-0.013496868312358856,
-0.0057295034639537334,
0.005342561285942793,
-0.1960735321044922,
0.004930602852255106,
-0.1425158679485321,
0.005810409784317017,
0.013860070146620274,
-0.01519657950848341,
0.036711759865283966,
-0.05491475388407707,
-0.0567319430410862,
0.044923219829797745,
-0.08744239062070847,
-0.04714645445346832,
0.0496358685195446,
0.060490209609270096,
-0.12014807015657425,
-0.07800089567899704,
0.03297564759850502,
-0.10443845391273499,
0.04533843696117401,
0.05265389755368233,
0.0020731131080538034,
0.02317558042705059,
-0.09451019018888474,
0.03137215971946716,
0.042051613330841064,
-0.005533685442060232,
0.036840010434389114,
-0.16479426622390747,
0.013912614434957504,
-0.03191789239645004,
0.037004254758358,
0.03777093440294266,
0.10250185430049896,
-0.09466473013162613,
-0.02403504028916359,
-0.009090549312531948,
-0.02709837630391121,
-0.05721138417720795,
0.06853204220533371,
0.14372390508651733,
-0.01665787398815155,
0.17619192600250244,
-0.12759016454219818,
0.01238891389220953,
-0.1721036434173584,
0.009366003796458244,
-0.0006810732302255929,
-0.08683519065380096,
-0.10530301928520203,
-0.012317449785768986,
0.10674533993005753,
-0.08573559671640396,
0.08107852190732956,
-0.06796438246965408,
0.08799213171005249,
0.05140446498990059,
-0.10050179064273834,
-0.06332789361476898,
0.06291480362415314,
0.1680346429347992,
0.04657205566763878,
-0.018845435231924057,
0.069976806640625,
-0.02356942929327488,
0.08959560841321945,
0.10864482074975967,
0.20048978924751282,
0.14161305129528046,
0.03491447493433952,
0.1222112625837326,
0.0539688877761364,
-0.07443657517433167,
-0.09077032655477524,
0.1108388602733612,
-0.0754433199763298,
0.1600208282470703,
-0.05818016827106476,
0.11350005120038986,
0.06775310635566711,
-0.1782953143119812,
0.02158825285732746,
-0.0772351548075676,
-0.10010237991809845,
-0.1261867731809616,
-0.10292021185159683,
-0.07644340395927429,
-0.09618759900331497,
0.00046067184302955866,
-0.0993729680776596,
0.03658146411180496,
0.06041216477751732,
0.02511672116816044,
0.0016374364495277405,
0.07765016704797745,
-0.058628689497709274,
0.02839116007089615,
0.10692355036735535,
-0.010404616594314575,
-0.012650714255869389,
-0.027512764558196068,
-0.09229219704866409,
0.05605260655283928,
-0.013462955132126808,
0.04849214106798172,
0.008289008401334286,
-0.011895943433046341,
0.046314630657434464,
-0.012510925531387329,
-0.09879443794488907,
0.0530565045773983,
0.02970070019364357,
0.017654873430728912,
0.07048051804304123,
0.0504986010491848,
-0.009886769577860832,
-0.022072352468967438,
0.14046630263328552,
-0.10376813262701035,
-0.027750948444008827,
-0.15251228213310242,
0.27048754692077637,
-0.016845835372805595,
0.03680714592337608,
0.000940284167882055,
-0.07372693717479706,
-0.03631637617945671,
0.17129714787006378,
0.13169951736927032,
-0.06579599529504776,
-0.026044584810733795,
0.04989323392510414,
-0.011504542082548141,
-0.038649268448352814,
0.14434602856636047,
0.06820856779813766,
-0.02836086042225361,
-0.046369053423404694,
-0.03504912555217743,
-0.015449612401425838,
-0.01857980713248253,
-0.038815174251794815,
0.0734090730547905,
0.00584804592654109,
-0.012597383931279182,
-0.007886042818427086,
0.06025306135416031,
-0.058998554944992065,
-0.12030995637178421,
0.08613227307796478,
-0.1917705237865448,
-0.15255361795425415,
0.0022666254080832005,
0.010207136161625385,
-0.007118977140635252,
0.07088325917720795,
-0.0063158064149320126,
-0.005386526696383953,
0.10323347896337509,
-0.036686841398477554,
-0.018060674890875816,
-0.1170710101723671,
0.05244822055101395,
-0.03435426577925682,
0.18685798346996307,
-0.014667244628071785,
0.0574505589902401,
0.1412942260503769,
0.025017455220222473,
-0.09437333792448044,
0.03202136978507042,
0.07401280105113983,
-0.09868666529655457,
-0.008371513336896896,
0.08270182460546494,
-0.0291419867426157,
0.16052906215190887,
0.06135657802224159,
-0.08805018663406372,
0.041760675609111786,
-0.08272396773099899,
-0.08755508810281754,
-0.04133656248450279,
-0.049257755279541016,
-0.0732451006770134,
0.1316535323858261,
0.2371564507484436,
-0.040724240243434906,
0.009743427857756615,
-0.029541784897446632,
-0.0028112600557506084,
0.05063784122467041,
0.02546480856835842,
-0.05871054530143738,
-0.23552989959716797,
0.07311996072530746,
0.05145171657204628,
0.040630754083395004,
-0.16394832730293274,
-0.09813342988491058,
0.014776842668652534,
-0.01703745499253273,
-0.09968898445367813,
0.1056937649846077,
0.0339590422809124,
0.048317618668079376,
-0.05067063122987747,
-0.1356552541255951,
-0.041048210114240646,
0.19174274802207947,
-0.09890907257795334,
-0.0807732418179512
] |
null | null |
transformers
|
# SimCLS
SimCLS is a framework for abstractive summarization presented in [SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization](https://arxiv.org/abs/2106.01890).
It is a two-stage approach consisting of a *generator* and a *scorer*. In the first stage, a large pre-trained model for abstractive summarization (the *generator*) is used to generate candidate summaries, whereas, in the second stage, the *scorer* assigns a score to each candidate given the source document. The final summary is the highest-scoring candidate.
This model is the *scorer* trained for summarization of BillSum ([paper](https://arxiv.org/abs/1910.00523), [datasets](https://huggingface.co/datasets/billsum)). It should be used in conjunction with [google/pegasus-billsum](https://huggingface.co/google/pegasus-billsum). See [our Github repository](https://github.com/andrejmiscic/simcls-pytorch) for details on training, evaluation, and usage.
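For intuition, the first stage simply asks the generator for several summaries instead of one. The snippet below is an illustrative sketch using the standard `transformers` generation API; the decoding strategy, number of candidates (8 here), and length limits are assumptions for illustration and may differ from the setup used to train this scorer (see the linked repository for the authoritative pipeline).
```python
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

# Stage 1: the generator proposes several candidate summaries for one document.
model_name = "google/pegasus-billsum"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
generator = PegasusForConditionalGeneration.from_pretrained(model_name)

document = "This is a legal document."
inputs = tokenizer(document, return_tensors="pt", truncation=True)

# Beam search with multiple returned sequences yields a pool of candidates
# for the scorer to rank; 8 is an illustrative choice, not the authors' setting.
candidate_ids = generator.generate(**inputs,
                                   num_beams=8,
                                   num_return_sequences=8,
                                   max_length=256)
candidates = tokenizer.batch_decode(candidate_ids, skip_special_tokens=True)
```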
## Usage
```bash
git clone https://github.com/andrejmiscic/simcls-pytorch.git
cd simcls-pytorch
pip3 install torch torchvision torchaudio transformers sentencepiece
```
```python
from src.model import SimCLS, GeneratorType

# Pegasus generates the candidate summaries; the scorer picks the best one.
summarizer = SimCLS(generator_type=GeneratorType.Pegasus,
                    generator_path="google/pegasus-billsum",
                    scorer_path="andrejmiscic/simcls-scorer-billsum")

document = "This is a legal document."
summary = summarizer(document)  # returns the highest-scoring candidate summary
print(summary)
```
### Results
All of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See [SimCLS paper](https://arxiv.org/abs/2106.01890) for a description of baselines.
We believe the discrepancies in Rouge-L scores between the original Pegasus work and our evaluation are due to differences in how the metric is computed; namely, we use a summary-level Rouge-L score.
| System | Rouge-1 | Rouge-2 | Rouge-L\* |
|-----------------|----------------------:|----------------------:|----------------------:|
| Pegasus | 57.31 | 40.19 | 45.82 |
| **Our results** | --- | --- | --- |
| Origin | 56.24, [55.74, 56.74] | 37.46, [36.89, 38.03] | 50.71, [50.19, 51.22] |
| Min | 44.37, [43.85, 44.89] | 25.75, [25.30, 26.22] | 38.68, [38.18, 39.16] |
| Max | 62.88, [62.42, 63.33] | 43.96, [43.39, 44.54] | 57.50, [57.01, 58.00] |
| Random | 54.93, [54.43, 55.43] | 35.42, [34.85, 35.97] | 49.19, [48.68, 49.70] |
| **SimCLS** | 57.49, [57.01, 58.00] | 38.54, [37.98, 39.10] | 51.91, [51.39, 52.43] |
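The confidence intervals above come from bootstrap resampling of per-example scores. Below is a minimal sketch of such a procedure using the `rouge-score` package and its summary-level `rougeLsum` variant; it illustrates the method only and is not the authors' exact evaluation code, and the `references`/`predictions` lists are placeholders.
```python
import numpy as np
from rouge_score import rouge_scorer

# Placeholder data: in practice these are the test-set references and system outputs.
references = ["The bill amends the tax code.\nIt takes effect in 2022."]
predictions = ["The bill changes the tax code.\nIt is effective from 2022."]

# rougeLsum is the summary-level Rouge-L variant (sentences separated by newlines).
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeLsum"], use_stemmer=True)
per_example = [scorer.score(ref, pred)["rougeLsum"].fmeasure
               for ref, pred in zip(references, predictions)]

# 95% confidence interval from 10000 bootstrap resamples of the per-example scores.
rng = np.random.default_rng(0)
scores = np.array(per_example)
means = [rng.choice(scores, size=len(scores), replace=True).mean()
         for _ in range(10000)]
low, high = np.percentile(means, [2.5, 97.5])
print(f"rougeLsum: {scores.mean():.2%}, 95% CI [{low:.2%}, {high:.2%}]")
```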
### Citation of the original work
```bibtex
@inproceedings{liu-liu-2021-simcls,
title = "{S}im{CLS}: A Simple Framework for Contrastive Learning of Abstractive Summarization",
author = "Liu, Yixin and
Liu, Pengfei",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-short.135",
doi = "10.18653/v1/2021.acl-short.135",
pages = "1065--1072",
}
```
|
{"language": ["en"], "tags": ["simcls"], "datasets": ["billsum"]}
|
feature-extraction
|
andrejmiscic/simcls-scorer-billsum
|
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"simcls",
"en",
"dataset:billsum",
"arxiv:2106.01890",
"arxiv:1910.00523",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.01890",
"1910.00523"
] |
[
"en"
] |
TAGS
#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-billsum #arxiv-2106.01890 #arxiv-1910.00523 #endpoints_compatible #region-us
|
SimCLS
======
SimCLS is a framework for abstractive summarization presented in SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization.
It is a two-stage approach consisting of a *generator* and a *scorer*. In the first stage, a large pre-trained model for abstractive summarization (the *generator*) is used to generate candidate summaries, whereas, in the second stage, the *scorer* assigns a score to each candidate given the source document. The final summary is the highest-scoring candidate.
This model is the *scorer* trained for summarization of BillSum (paper, datasets). It should be used in conjunction with google/pegasus-billsum. See our Github repository for details on training, evaluation, and usage.
Usage
-----
### Results
All of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.
We believe the discrepancies of Rouge-L scores between the original Pegasus work and our evaluation are due to the computation of the metric. Namely, we use a summary level Rouge-L score.
of the original work
|
[
"### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\nWe believe the discrepancies of Rouge-L scores between the original Pegasus work and our evaluation are due to the computation of the metric. Namely, we use a summary level Rouge-L score.\n\n\n\nof the original work"
] |
[
"TAGS\n#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-billsum #arxiv-2106.01890 #arxiv-1910.00523 #endpoints_compatible #region-us \n",
"### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\nWe believe the discrepancies of Rouge-L scores between the original Pegasus work and our evaluation are due to the computation of the metric. Namely, we use a summary level Rouge-L score.\n\n\n\nof the original work"
] |
[
60,
89
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-billsum #arxiv-2106.01890 #arxiv-1910.00523 #endpoints_compatible #region-us \n### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\nWe believe the discrepancies of Rouge-L scores between the original Pegasus work and our evaluation are due to the computation of the metric. Namely, we use a summary level Rouge-L score.\n\n\n\nof the original work"
] |
[
-0.15926522016525269,
0.07968520373106003,
-0.0021207602694630623,
0.05473798140883446,
0.06995599716901779,
0.02604193240404129,
-0.04203996807336807,
0.06937102228403091,
0.025845305994153023,
0.04364890232682228,
0.11982541531324387,
0.07049716264009476,
0.077130526304245,
0.07378271967172623,
-0.10441885143518448,
-0.08017254620790482,
0.030166666954755783,
0.10307329148054123,
-0.16006846725940704,
0.13522271811962128,
0.06989214569330215,
-0.07717011123895645,
0.11457011848688126,
0.0341247096657753,
-0.0835317000746727,
0.04955895617604256,
0.0984494611620903,
-0.037130918353796005,
0.10146303474903107,
0.03565714880824089,
0.12059003859758377,
0.027188239619135857,
0.07417288422584534,
-0.11627886444330215,
0.028102871030569077,
-0.020698023959994316,
-0.009133387356996536,
0.0577719509601593,
0.015812478959560394,
-0.11197131127119064,
0.20780491828918457,
0.0411461666226387,
0.05655638501048088,
0.02136859856545925,
-0.1349782794713974,
-0.15040703117847443,
-0.14826121926307678,
-0.00010790622036438435,
0.08924731612205505,
0.02696358598768711,
0.037923142313957214,
0.16789786517620087,
-0.14242731034755707,
0.0471508763730526,
0.23837165534496307,
-0.2577868402004242,
-0.008740637451410294,
0.1914757490158081,
0.03331027179956436,
-0.1445411592721939,
-0.03354404494166374,
0.04185478389263153,
0.09076639264822006,
0.003950286190956831,
-0.014405368827283382,
-0.0764300525188446,
0.004523357376456261,
0.06831573694944382,
-0.09939758479595184,
-0.08759328722953796,
0.2593784034252167,
-0.009567534551024437,
-0.09695369750261307,
0.06064602732658386,
-0.04177665337920189,
-0.05444394424557686,
-0.014166882261633873,
-0.05673663318157196,
-0.007060614414513111,
-0.04429803043603897,
0.053662482649087906,
0.11176370084285736,
-0.0532374270260334,
-0.09648454934358597,
-0.09739208966493607,
0.11528809368610382,
0.008584798313677311,
0.09443794190883636,
-0.10234413295984268,
0.14045745134353638,
-0.1919976770877838,
-0.0605158768594265,
-0.027052786201238632,
-0.07384268194437027,
-0.05309033393859863,
0.021663784980773926,
-0.05463603138923645,
0.07578655332326889,
-0.009196240454912186,
0.2699434757232666,
0.03648179769515991,
0.001714917947538197,
0.13744862377643585,
0.0009309912566095591,
0.0479348786175251,
0.1341477781534195,
-0.061298519372940063,
-0.11979669332504272,
0.0338873565196991,
-0.059177231043577194,
0.04683113843202591,
-0.034004829823970795,
0.017794853076338768,
-0.14440570771694183,
-0.03630802780389786,
0.07323481887578964,
-0.059369903057813644,
-0.05110316351056099,
-0.10031133145093918,
-0.01389318984001875,
0.021715102717280388,
-0.11413171887397766,
-0.01821608655154705,
-0.0797765925526619,
-0.08116500079631805,
0.12044606357812881,
0.12285634130239487,
-0.04091527312994003,
-0.023569416254758835,
0.05835261195898056,
-0.1109180673956871,
-0.02703937515616417,
-0.059369977563619614,
-0.09562990814447403,
0.021500779315829277,
-0.09041443467140198,
0.02814108319580555,
-0.03475487604737282,
-0.06056847795844078,
0.0046679237857460976,
0.048400212079286575,
-0.06587434560060501,
-0.004651450552046299,
0.007662358693778515,
-0.02205279842019081,
0.025123190134763718,
0.03280926123261452,
-0.015492742881178856,
-0.044870518147945404,
0.06466997414827347,
0.04165731370449066,
0.06517889350652695,
-0.12704774737358093,
0.012510773725807667,
-0.07367633283138275,
0.06337642669677734,
-0.12722422182559967,
-0.09103914350271225,
-0.044491834938526154,
-0.03939931094646454,
-0.04124780371785164,
-0.10471253097057343,
-0.05523005500435829,
-0.03090597689151764,
0.08051443099975586,
0.15390080213546753,
-0.10875070095062256,
-0.022936612367630005,
0.2111208587884903,
-0.14524270594120026,
-0.14113616943359375,
0.14717985689640045,
-0.05841192230582237,
-0.02116698957979679,
0.04350887984037399,
0.024008408188819885,
0.09051597118377686,
-0.1745421439409256,
0.014630782417953014,
0.05924434959888458,
0.10909926891326904,
-0.15671579539775848,
0.11314146220684052,
-0.020605172961950302,
-0.10463476181030273,
0.05836676433682442,
-0.04100152477622032,
0.03136635199189186,
-0.08280003815889359,
-0.014908299781382084,
-0.059434179216623306,
-0.03593948110938072,
0.030465995892882347,
0.022112805396318436,
0.005111579783260822,
-0.06707361340522766,
-0.04574943333864212,
-0.19785282015800476,
0.11208232492208481,
-0.029187364503741264,
0.019806955009698868,
-0.08856277912855148,
0.19441498816013336,
-0.06020860746502876,
-0.059117771685123444,
-0.1465032994747162,
0.03684712573885918,
-0.017500590533018112,
-0.02558196149766445,
-0.05264172703027725,
0.09714561700820923,
0.009073288179934025,
-0.05484534800052643,
-0.04660654813051224,
0.08522559702396393,
0.007089863996952772,
-0.03783734142780304,
-0.057127490639686584,
-0.08716531842947006,
-0.0066985166631639,
-0.05285666510462761,
0.02832474745810032,
-0.11791002005338669,
-0.022197052836418152,
-0.04735526815056801,
0.10794587433338165,
0.0019492078572511673,
-0.021107440814375877,
0.0008761333301663399,
0.036156415939331055,
-0.0833543986082077,
0.0137112932279706,
-0.008548958227038383,
-0.0004510296566877514,
-0.059443194419145584,
0.02281052991747856,
-0.09375963360071182,
0.14506827294826508,
0.0853051245212555,
0.004228634759783745,
-0.05616316571831703,
0.0004337560385465622,
0.03709184378385544,
0.023310571908950806,
-0.060278575867414474,
0.030802439898252487,
-0.08036494255065918,
-0.0320550911128521,
0.11848858743906021,
-0.07699728012084961,
-0.02855212613940239,
0.03894227370619774,
-0.0901002511382103,
0.0119878388941288,
0.15743087232112885,
0.19211366772651672,
0.030800161883234978,
0.04982142522931099,
0.16655772924423218,
-0.03166762739419937,
0.02653626911342144,
-0.07240667939186096,
-0.07014905661344528,
-0.05861872062087059,
-0.03265145421028137,
0.004876245278865099,
0.21739956736564636,
-0.15455007553100586,
-0.023568231612443924,
0.04454420879483223,
-0.017712511122226715,
0.008354179561138153,
-0.12910954654216766,
-0.05322372168302536,
0.05213271453976631,
0.005330475512892008,
-0.14788874983787537,
0.10677666962146759,
-0.022191263735294342,
0.14965900778770447,
-0.0826013907790184,
-0.09348287433385849,
-0.03909669443964958,
-0.02113068476319313,
-0.062094930559396744,
0.21018756926059723,
-0.02999158576130867,
-0.06759607791900635,
-0.12312253564596176,
-0.026494642719626427,
-0.08416049927473068,
0.011385465040802956,
0.0014002074021846056,
-0.028569327667355537,
-0.054923705756664276,
-0.028977131471037865,
0.06336762756109238,
-0.1159428134560585,
0.03159281611442566,
0.03772955387830734,
-0.018891368061304092,
-0.052975378930568695,
-0.11780206859111786,
-0.054973479360342026,
-0.10536540299654007,
0.01622307486832142,
0.042200807482004166,
-0.11614633351564407,
0.11141522973775864,
0.20740146934986115,
-0.05667302384972572,
0.04204270616173744,
-0.0027349686715751886,
0.1983829289674759,
-0.009916015900671482,
-0.04971618577837944,
0.2276880294084549,
0.0393572673201561,
0.025372633710503578,
0.0735878124833107,
0.016251318156719208,
-0.09081079810857773,
-0.014437360689043999,
-0.03349770978093147,
-0.118062324821949,
-0.22590525448322296,
-0.07542234659194946,
-0.09156038612127304,
0.036973729729652405,
-0.028686106204986572,
-0.03536099195480347,
-0.07327257096767426,
0.05531942471861839,
-0.014612512663006783,
-0.005862759426236153,
-0.02074282430112362,
-0.0025091548450291157,
0.13700389862060547,
0.008058672770857811,
0.10821317881345749,
-0.08055073767900467,
-0.02031427063047886,
0.08429986238479614,
-0.023264991119503975,
0.1712440401315689,
-0.012747811153531075,
-0.0747816264629364,
0.02329784817993641,
0.19611400365829468,
0.021698689088225365,
0.1570630520582199,
0.048860494047403336,
-0.027414409443736076,
-0.027374865487217903,
0.020487627014517784,
-0.11994490027427673,
-0.03550015389919281,
-0.015932699665427208,
0.026767544448375702,
-0.16035453975200653,
-0.011050261557102203,
-0.04565859213471413,
0.026381131261587143,
0.20179110765457153,
-0.21612629294395447,
-0.09269285202026367,
0.05030025914311409,
0.028992829844355583,
-0.0861205980181694,
0.08446165174245834,
0.01619764417409897,
-0.11404553800821304,
0.04600740969181061,
-0.053971149027347565,
0.06820754706859589,
0.08963269740343094,
-0.022211221978068352,
-0.03661586344242096,
-0.07251613587141037,
-0.034341923892498016,
0.06959950178861618,
-0.14528797566890717,
0.24268211424350739,
-0.025619300082325935,
-0.04355545714497566,
-0.011622928082942963,
0.026439165696501732,
0.03955714404582977,
0.10415012389421463,
0.18921954929828644,
0.034055303782224655,
-0.14521226286888123,
-0.12072239071130753,
-0.033668871968984604,
0.07279766350984573,
0.059621717780828476,
-0.023572761565446854,
0.08925671130418777,
-0.05925048887729645,
0.026706691831350327,
0.028050370514392853,
0.06315198540687561,
-0.014216996729373932,
-0.08165787905454636,
0.04847870022058487,
-0.07129563391208649,
-0.1763128787279129,
-0.020906152203679085,
-0.0709252655506134,
0.008749851956963539,
0.05102122575044632,
-0.010803534649312496,
-0.03167719021439552,
-0.056689806282520294,
0.014693506062030792,
0.15526321530342102,
-0.10623887926340103,
0.06602100282907486,
-0.06626153737306595,
0.033894967287778854,
0.03605514392256737,
-0.12923869490623474,
0.03000614047050476,
-0.10495644807815552,
-0.04833897575736046,
-0.026648955419659615,
0.1453256458044052,
0.014653877355158329,
0.028439821675419807,
0.039677832275629044,
0.05054543539881706,
-0.06769830733537674,
-0.10921063274145126,
-0.001360860769636929,
0.011242212727665901,
0.02529955469071865,
0.15387581288814545,
0.030876396223902702,
-0.0482158325612545,
0.016231419518589973,
-0.005154033191502094,
0.0841909646987915,
0.22053782641887665,
-0.11970038712024689,
0.06995666772127151,
0.030741529539227486,
-0.07549052685499191,
-0.3197937607765198,
-0.058258168399333954,
-0.08374308049678802,
0.08215832710266113,
0.0669369325041771,
0.011644169688224792,
0.1259760856628418,
0.03798731416463852,
-0.018288707360625267,
-0.02828490547835827,
-0.33365732431411743,
-0.06285129487514496,
0.1928086280822754,
0.033368583768606186,
0.4620630741119385,
-0.09087101370096207,
0.01308098528534174,
-0.013239651918411255,
-0.24943101406097412,
0.09346675872802734,
0.046051669865846634,
0.02764894813299179,
-0.1219257265329361,
0.03434866666793823,
0.04063110053539276,
-0.02125224471092224,
0.19026805460453033,
0.021836483851075172,
0.10961055010557175,
0.008957048878073692,
-0.020998291671276093,
0.058576297014951706,
-0.06410988420248032,
0.04865574091672897,
0.07672517001628876,
0.06335092335939407,
-0.1919700801372528,
-0.014054955914616585,
-0.06154913082718849,
0.06355394423007965,
-0.0464201495051384,
-0.004254649858921766,
-0.12286444008350372,
-0.00029338389867916703,
-0.0009882465237751603,
-0.04125538840889931,
0.10922941565513611,
0.010995977558195591,
0.13400648534297943,
0.01953214593231678,
0.0985754132270813,
-0.03755339980125427,
-0.03137899935245514,
0.09390353411436081,
0.010403941385447979,
0.012473449110984802,
-0.17345035076141357,
0.0724642351269722,
0.16147632896900177,
0.10977280139923096,
0.014621410518884659,
0.06463789939880371,
-0.02720300853252411,
0.0022277934476733208,
0.07422373443841934,
-0.19700652360916138,
-0.07648961991071701,
-0.000045493732613977045,
-0.12504611909389496,
-0.08868859708309174,
0.11932478845119476,
0.057411182671785355,
-0.03218460455536842,
-0.02733032777905464,
-0.04633031785488129,
0.04837219417095184,
-0.053602609783411026,
0.18543726205825806,
0.02465858682990074,
0.00470344303175807,
-0.11955834925174713,
0.08704019337892532,
-0.015748750418424606,
-0.16435052454471588,
-0.05553001910448074,
-0.12044573575258255,
-0.07225069403648376,
-0.00470329262316227,
-0.08527456223964691,
0.050728973001241684,
-0.2008455991744995,
-0.041178520768880844,
-0.15060123801231384,
-0.1267828792333603,
0.006372510455548763,
0.20171239972114563,
0.14436693489551544,
0.0769650936126709,
-0.0009082251926884055,
-0.1361365169286728,
-0.09674785286188126,
0.04702208563685417,
0.2773990035057068,
0.007736893370747566,
-0.07034578919410706,
-0.026120642200112343,
-0.05155152454972267,
0.0970592051744461,
-0.037549298256635666,
-0.010434439405798912,
-0.08633039146661758,
0.04973089322447777,
-0.14132677018642426,
-0.009748432785272598,
-0.058582231402397156,
-0.011364759877324104,
0.010506696999073029,
-0.04814937338232994,
-0.06944317370653152,
0.023443065583705902,
-0.10011336207389832,
0.06790061295032501,
0.008132737129926682,
0.050924863666296005,
-0.04752260446548462,
-0.024534640833735466,
0.12933750450611115,
-0.05786353349685669,
0.015040980651974678,
0.10778849571943283,
0.02545640431344509,
0.09423346072435379,
-0.1903625875711441,
-0.0022208597511053085,
-0.0033616244327276945,
0.03177230805158615,
0.038000673055648804,
-0.1457098424434662,
0.08131731301546097,
0.05372822657227516,
-0.024793611839413643,
0.006145427003502846,
-0.10583486407995224,
-0.08562223613262177,
-0.09610608965158463,
-0.008140316233038902,
-0.0713144913315773,
-0.01837516389787197,
0.0031951172277331352,
0.11423251777887344,
0.18123185634613037,
0.025197990238666534,
0.02135646715760231,
-0.001167955226264894,
-0.11888492852449417,
-0.008083630353212357,
-0.043664589524269104,
-0.019675495103001595,
-0.12552084028720856,
-0.03590422496199608,
0.05301821604371071,
0.093498595058918,
0.187718465924263,
0.03324102982878685,
0.07586584240198135,
-0.02314727008342743,
-0.015609271824359894,
0.1478785276412964,
-0.014526906423270702,
0.08976678550243378,
0.008711968548595905,
-0.017223967239260674,
0.013118047267198563,
0.06783601641654968,
0.05685197189450264,
0.13442687690258026,
0.20514455437660217,
0.0790472999215126,
0.018715975806117058,
0.05570781230926514,
-0.12884046137332916,
-0.05659232288599014,
0.10485304147005081,
-0.09760891646146774,
-0.0009485800401307642,
0.024858104065060616,
0.03134828433394432,
0.10622277855873108,
0.08955138921737671,
-0.09803376346826553,
0.026211174204945564,
-0.10559628158807755,
-0.07968121021986008,
-0.21750421822071075,
-0.06400266289710999,
-0.038783106952905655,
-0.1126546785235405,
0.009049002081155777,
-0.10878286510705948,
0.04607628658413887,
0.2602236568927765,
0.057096436619758606,
-0.02026546373963356,
0.0007347243372350931,
-0.19368381798267365,
-0.0432068295776844,
0.05892268195748329,
0.05603577569127083,
-0.0029146173037588596,
0.045106396079063416,
0.09137368947267532,
-0.01883392035961151,
-0.049993786960840225,
0.00746936583891511,
0.0002047582238446921,
0.025456711649894714,
-0.0030616282019764185,
-0.04276732727885246,
-0.021081162616610527,
-0.05093761906027794,
0.024122217670083046,
0.05277619883418083,
0.13770948350429535,
0.03549730405211449,
0.02300148271024227,
-0.00994077604264021,
0.24217724800109863,
-0.1119956448674202,
0.06197402626276016,
-0.1367904245853424,
0.21009205281734467,
0.0641840398311615,
0.032487086951732635,
0.03906542435288429,
-0.030007222667336464,
0.04817715659737587,
0.13621006906032562,
0.05811777338385582,
0.10235435515642166,
-0.00692236190661788,
-0.015898138284683228,
0.031195057556033134,
0.051102254539728165,
0.01629449427127838,
0.05886543542146683,
0.24344056844711304,
-0.086883544921875,
-0.05607622489333153,
-0.05948187783360481,
0.0048467679880559444,
0.015847379341721535,
0.1078638955950737,
-0.02417171746492386,
-0.05692474544048309,
-0.05030465126037598,
0.12186603993177414,
-0.0017377176554873586,
-0.17252697050571442,
0.16052915155887604,
-0.16405965387821198,
-0.051594432443380356,
-0.009339902549982071,
-0.06600134074687958,
-0.020122095942497253,
0.07913000136613846,
-0.12458708137273788,
-0.026967104524374008,
0.17580842971801758,
0.05086667835712433,
-0.17062750458717346,
-0.15635405480861664,
0.13512368500232697,
0.054001759737730026,
0.1285114884376526,
0.036969155073165894,
0.11595749855041504,
0.051193080842494965,
0.032136160880327225,
0.0019044291693717241,
-0.04074995964765549,
0.06350942701101303,
0.10479521006345749,
-0.02695312350988388,
0.05217289924621582,
0.025754831731319427,
0.01677826978266239,
-0.04077361524105072,
-0.018410570919513702,
0.019926240667700768,
-0.09405354410409927,
-0.04142777621746063,
-0.08473558723926544,
0.05851518735289574,
-0.0724262073636055,
0.12613506615161896,
0.1444041132926941,
-0.04537690803408623,
0.04062635824084282,
-0.04258442297577858,
0.012966741807758808,
0.04999401420354843,
-0.006293836515396833,
0.045636676251888275,
-0.08940889686346054,
0.08087697625160217,
-0.02714276872575283,
-0.019373182207345963,
-0.11890862882137299,
-0.021440675482153893,
-0.05058731138706207,
-0.006722956895828247,
0.006160476244986057,
0.08420293778181076,
-0.015444133430719376,
0.04146274924278259,
-0.04370046406984329,
0.04268397390842438,
-0.0060537392273545265,
0.10815122723579407,
-0.006428778171539307,
-0.11403557658195496
] |
null | null |
transformers
|
# SimCLS
SimCLS is a framework for abstractive summarization presented in [SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization](https://arxiv.org/abs/2106.01890).
It is a two-stage approach consisting of a *generator* and a *scorer*. In the first stage, a large pre-trained model for abstractive summarization (the *generator*) is used to generate candidate summaries, whereas, in the second stage, the *scorer* assigns a score to each candidate given the source document. The final summary is the highest-scoring candidate.
This model is the *scorer* trained for summarization of CNN/DailyMail ([paper](https://arxiv.org/abs/1602.06023), [datasets](https://huggingface.co/datasets/cnn_dailymail)). It should be used in conjunction with [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn). See [our Github repository](https://github.com/andrejmiscic/simcls-pytorch) for details on training, evaluation, and usage.
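To make the second stage concrete, the sketch below shows one way a RoBERTa-based scorer can rank candidates: encode the document and each candidate, then score candidates by the cosine similarity of their pooled representations, roughly following the SimCLS paper's formulation. It is an illustration of the idea only; the tokenizer choice, first-token pooling, and function names are assumptions and may differ from the implementation in the linked repository.
```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative scorer: pooling and tokenizer choice are assumptions, not the repo's code.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
encoder = AutoModel.from_pretrained("andrejmiscic/simcls-scorer-cnndm")

def encode(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state
    return hidden[:, 0]  # first-token ("<s>") representation

article = "This is a news article."
candidates = ["Candidate summary one.", "Candidate summary two."]

doc_vec = encode(article)
scores = [torch.cosine_similarity(doc_vec, encode(c)).item() for c in candidates]
best = candidates[int(torch.tensor(scores).argmax())]
print(best, scores)
```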
## Usage
```bash
git clone https://github.com/andrejmiscic/simcls-pytorch.git
cd simcls-pytorch
pip3 install torch torchvision torchaudio transformers sentencepiece
```
```python
from src.model import SimCLS, GeneratorType

# BART generates the candidate summaries; the scorer picks the best one.
summarizer = SimCLS(generator_type=GeneratorType.Bart,
                    generator_path="facebook/bart-large-cnn",
                    scorer_path="andrejmiscic/simcls-scorer-cnndm")

article = "This is a news article."
summary = summarizer(article)  # returns the highest-scoring candidate summary
print(summary)
```
### Results
All of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See [SimCLS paper](https://arxiv.org/abs/2106.01890) for a description of baselines.
| System | Rouge-1 | Rouge-2 | Rouge-L |
|------------------|----------------------:|----------------------:|----------------------:|
| BART | 44.16 | 21.28 | 40.90 |
| **SimCLS paper** | --- | --- | --- |
| Origin | 44.39 | 21.21 | 41.28 |
| Min | 33.17 | 11.67 | 30.77 |
| Max | 54.36 | 28.73 | 50.77 |
| Random | 43.98 | 20.06 | 40.94 |
| **SimCLS** | 46.67 | 22.15 | 43.54 |
| **Our results** | --- | --- | --- |
| Origin | 44.41, [44.18, 44.63] | 21.05, [20.80, 21.29] | 41.53, [41.30, 41.75] |
| Min | 33.43, [33.25, 33.62] | 10.97, [10.82, 11.12] | 30.57, [30.40, 30.74] |
| Max | 53.87, [53.67, 54.08] | 29.72, [29.47, 29.98] | 51.13, [50.92, 51.34] |
| Random | 43.94, [43.73, 44.16] | 20.09, [19.86, 20.31] | 41.06, [40.85, 41.27] |
| **SimCLS** | 46.53, [46.32, 46.75] | 22.14, [21.91, 22.37] | 43.56, [43.34, 43.78] |
### Citation of the original work
```bibtex
@inproceedings{liu-liu-2021-simcls,
title = "{S}im{CLS}: A Simple Framework for Contrastive Learning of Abstractive Summarization",
author = "Liu, Yixin and
Liu, Pengfei",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-short.135",
doi = "10.18653/v1/2021.acl-short.135",
pages = "1065--1072",
}
```
|
{"language": ["en"], "tags": ["simcls"], "datasets": ["cnn_dailymail"]}
|
feature-extraction
|
andrejmiscic/simcls-scorer-cnndm
|
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"simcls",
"en",
"dataset:cnn_dailymail",
"arxiv:2106.01890",
"arxiv:1602.06023",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.01890",
"1602.06023"
] |
[
"en"
] |
TAGS
#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-cnn_dailymail #arxiv-2106.01890 #arxiv-1602.06023 #endpoints_compatible #region-us
|
SimCLS
======
SimCLS is a framework for abstractive summarization presented in SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization.
It is a two-stage approach consisting of a *generator* and a *scorer*. In the first stage, a large pre-trained model for abstractive summarization (the *generator*) is used to generate candidate summaries, whereas, in the second stage, the *scorer* assigns a score to each candidate given the source document. The final summary is the highest-scoring candidate.
This model is the *scorer* trained for summarization of CNN/DailyMail (paper, datasets). It should be used in conjunction with facebook/bart-large-cnn. See our Github repository for details on training, evaluation, and usage.
Usage
-----
### Results
All of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.
of the original work
|
[
"### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\n\n\n\nof the original work"
] |
[
"TAGS\n#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-cnn_dailymail #arxiv-2106.01890 #arxiv-1602.06023 #endpoints_compatible #region-us \n",
"### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\n\n\n\nof the original work"
] |
[
61,
44
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-cnn_dailymail #arxiv-2106.01890 #arxiv-1602.06023 #endpoints_compatible #region-us \n### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\n\n\n\nof the original work"
] |
[
-0.15719319880008698,
0.03445946052670479,
-0.0017400895012542605,
-0.028454462066292763,
0.06328556686639786,
0.009245865978300571,
-0.0003628362901508808,
0.05016249790787697,
0.012165267951786518,
0.04516688361763954,
0.15623894333839417,
0.21202486753463745,
0.032098062336444855,
0.1747933179140091,
-0.09338419884443283,
-0.0743536651134491,
0.030721241608262062,
0.07718624174594879,
-0.11040627956390381,
0.16915439069271088,
0.05696974694728851,
-0.1511692851781845,
0.07577814161777496,
0.0018566574435681105,
-0.1275155246257782,
0.024919690564274788,
0.042208075523376465,
-0.09585338830947876,
0.07567992806434631,
-0.014903238043189049,
0.10878261178731918,
0.05589890107512474,
0.04779575765132904,
-0.08217024058103561,
0.02760029025375843,
0.009538177400827408,
-0.02335287816822529,
0.08188305795192719,
0.029652422294020653,
-0.043082769960165024,
0.1967773288488388,
0.03910225257277489,
0.025485090911388397,
0.027782147750258446,
-0.1407131552696228,
-0.07529860734939575,
-0.06515887379646301,
0.054319124668836594,
0.12770624458789825,
0.04751409590244293,
0.01021465566009283,
0.2022610604763031,
-0.1222635880112648,
0.09115529805421829,
0.10163728892803192,
-0.20316296815872192,
-0.02080547623336315,
0.10562719404697418,
-0.0056084031239151955,
-0.05917052552103996,
-0.014338760636746883,
0.0452410951256752,
0.0812624990940094,
-0.03032061830163002,
-0.11285831034183502,
-0.06115657463669777,
-0.0639353096485138,
0.03108930215239525,
-0.11374212801456451,
-0.09587899595499039,
0.24295976758003235,
0.028489409014582634,
-0.04822497442364693,
0.033560361713171005,
-0.08846233785152435,
-0.04147997125983238,
-0.026685867458581924,
-0.0015442029107362032,
-0.025182237848639488,
-0.01691320911049843,
-0.10022863000631332,
0.13438841700553894,
-0.10672567784786224,
-0.05058141425251961,
-0.10669289529323578,
0.16897474229335785,
0.0018630821723490953,
0.12187141180038452,
-0.10951726883649826,
0.13708163797855377,
0.03102714568376541,
-0.11002591997385025,
-0.0075196935795247555,
-0.0719727873802185,
0.030762959271669388,
0.014947566203773022,
-0.03190157935023308,
0.07191144675016403,
0.013861330226063728,
0.16854354739189148,
0.09492900222539902,
0.018165383487939835,
0.13937833905220032,
0.03901784494519234,
0.05778668075799942,
0.04000091552734375,
-0.07817819714546204,
-0.10571455955505371,
0.023355873301625252,
-0.05100230872631073,
0.022064825519919395,
-0.027423659339547157,
-0.0013013887219130993,
-0.004861053079366684,
-0.0030039723496884108,
0.05514879897236824,
-0.025661053135991096,
0.0020511650945991278,
-0.07537851482629776,
-0.0077355727553367615,
0.0019306562608107924,
-0.08344712108373642,
-0.052838124334812164,
-0.07188455015420914,
-0.06677940487861633,
0.11247680336236954,
0.050044137984514236,
0.0015218798071146011,
0.004260673187673092,
0.052389245480298996,
-0.13999943435192108,
-0.01482227724045515,
-0.044833820313215256,
-0.0627216175198555,
0.014509792439639568,
-0.13477492332458496,
0.04521555453538895,
-0.09187109023332596,
-0.1438509076833725,
-0.018560398370027542,
0.030843069776892662,
-0.029514778405427933,
0.01306593045592308,
-0.034679144620895386,
-0.01367775909602642,
-0.0067505366168916225,
-0.0057996660470962524,
0.07084328681230545,
-0.06853403151035309,
0.08465348184108734,
0.07471174001693726,
0.0520019568502903,
-0.0949339047074318,
-0.008565661497414112,
-0.13167811930179596,
0.02001170441508293,
-0.03919713944196701,
-0.02091958560049534,
-0.016499517485499382,
0.12730064988136292,
-0.05454442650079727,
-0.06658469885587692,
-0.038413193076848984,
-0.00983763299882412,
0.04912364482879639,
0.15920712053775787,
-0.12931151688098907,
-0.035203173756599426,
0.15392515063285828,
-0.12369140982627869,
-0.19708481431007385,
0.05102832615375519,
-0.06798670440912247,
0.11795084178447723,
0.055257879197597504,
0.03815743699669838,
-0.01984979212284088,
-0.02654287777841091,
0.022474724799394608,
-0.003737494582310319,
0.09038881212472916,
-0.1725989580154419,
0.03031904809176922,
-0.04885387793183327,
-0.05079164355993271,
0.07164491713047028,
0.03050747700035572,
0.04759848862886429,
-0.1096702367067337,
-0.030673285946249962,
-0.03496406599879265,
-0.07627259194850922,
-0.04234450310468674,
0.0497983917593956,
0.061668913811445236,
-0.06543094664812088,
0.006286395248025656,
-0.14693085849285126,
0.07461418956518173,
-0.021595731377601624,
-0.03461736440658569,
-0.09429629892110825,
0.16226957738399506,
-0.17700082063674927,
-0.07093062251806259,
-0.16674815118312836,
0.05878318473696709,
-0.04671379551291466,
0.10520458966493607,
-0.04283168166875839,
0.03855469450354576,
0.06434723734855652,
-0.07078856229782104,
0.017451640218496323,
0.025162655860185623,
0.04491717740893364,
0.019689345732331276,
-0.030478093773126602,
-0.058717869222164154,
-0.0007777701248414814,
-0.05914924666285515,
0.006334354169666767,
-0.0888248085975647,
-0.03767358139157295,
-0.02614671178162098,
0.1524331271648407,
-0.005839102901518345,
-0.06224750354886055,
0.019249778240919113,
0.03720828890800476,
-0.07383063435554504,
0.021425535902380943,
0.051546137779951096,
-0.008350093849003315,
-0.08266816288232803,
0.017475752159953117,
-0.0800514742732048,
0.12921778857707977,
0.1217830628156662,
-0.13022325932979584,
-0.01896936073899269,
-0.0015800443943589926,
-0.023722384124994278,
0.015633823350071907,
-0.003569988999515772,
0.03379517421126366,
-0.026705818250775337,
-0.04392293468117714,
0.0724220871925354,
-0.0717400461435318,
-0.0413799062371254,
0.04097571596503258,
-0.05603044107556343,
0.015324077568948269,
0.12748269736766815,
0.18489396572113037,
-0.10985273122787476,
0.07489252835512161,
0.19384482502937317,
0.04684668779373169,
0.04067215323448181,
-0.09005798399448395,
-0.1124257892370224,
-0.012758861295878887,
-0.04665407910943031,
-0.023017127066850662,
0.16235458850860596,
-0.23131121695041656,
-0.03256668522953987,
0.06893617659807205,
-0.021152492612600327,
0.04449539631605148,
-0.13326583802700043,
-0.03546932339668274,
0.032938383519649506,
0.03037925437092781,
-0.14881955087184906,
0.07888167351484299,
-0.027916409075260162,
0.10927126556634903,
-0.09940025955438614,
-0.049416664987802505,
0.0021526829805225134,
-0.02203655242919922,
-0.05456266552209854,
0.18721719086170197,
-0.019338691607117653,
-0.10051988065242767,
-0.09523875266313553,
-0.04397385194897652,
-0.007566827815026045,
0.0022191880270838737,
-0.004686566069722176,
-0.04032687097787857,
-0.08229093253612518,
-0.01924380473792553,
0.007641298696398735,
-0.11694590002298355,
0.05603835731744766,
0.08139674365520477,
0.017924783751368523,
-0.04002959653735161,
-0.09368480741977692,
-0.03755345568060875,
-0.10264552384614944,
-0.008368088863790035,
0.06873616576194763,
-0.05402856320142746,
0.11475428193807602,
0.13658489286899567,
-0.024303600192070007,
0.06149837002158165,
0.025399627164006233,
0.19116482138633728,
-0.02489352412521839,
-0.06221972033381462,
0.21352405846118927,
-0.0197378471493721,
-0.010542673990130424,
0.10881618410348892,
0.033716995269060135,
-0.11281149089336395,
-0.0043699308298528194,
-0.06605464220046997,
-0.10304195433855057,
-0.1676645129919052,
-0.08496494591236115,
-0.09183737635612488,
0.02083965577185154,
0.026025352999567986,
-0.014619488269090652,
-0.01475081779062748,
0.11657647788524628,
-0.004361145664006472,
-0.09420770406723022,
-0.06919063627719879,
-0.01510484516620636,
0.028467252850532532,
-0.013190875761210918,
0.10275907069444656,
-0.043289199471473694,
-0.05613509938120842,
0.043429479002952576,
-0.08389152586460114,
0.16390477120876312,
0.01941457949578762,
-0.035212159156799316,
0.013663792051374912,
0.19494234025478363,
0.04466849938035011,
0.2547432482242584,
0.03371391072869301,
-0.05703965574502945,
0.01252517756074667,
0.016427893191576004,
-0.14680331945419312,
0.019348375499248505,
0.042581405490636826,
-0.024511389434337616,
-0.08518187701702118,
-0.05969124287366867,
0.02492528036236763,
0.03240560367703438,
0.20800548791885376,
-0.3033592998981476,
-0.06410138309001923,
0.020614733919501305,
-0.006747696548700333,
-0.07985150068998337,
0.06387858092784882,
0.01059176865965128,
-0.07088840007781982,
0.03578553348779678,
-0.028906889259815216,
0.07290533185005188,
0.00109545246232301,
-0.025948388502001762,
-0.07390052080154419,
-0.08551555871963501,
0.0022843852639198303,
0.0705682560801506,
-0.18458183109760284,
0.2629737854003906,
-0.017758028581738472,
0.013171610422432423,
-0.05320430174469948,
-0.009334985166788101,
-0.011569485068321228,
-0.05122437700629234,
0.16464859247207642,
0.01370195485651493,
-0.13604886829853058,
-0.11764374375343323,
-0.11446301639080048,
0.06723169982433319,
0.05946045368909836,
-0.06454190611839294,
0.08148284256458282,
0.0007561817183159292,
0.014601465314626694,
0.0314970426261425,
-0.0228608176112175,
-0.05454195663332939,
-0.10979657620191574,
0.007643822114914656,
-0.0030690578278154135,
-0.08213277906179428,
-0.02908615581691265,
-0.06431618332862854,
-0.008680089376866817,
0.09892971068620682,
0.020389648154377937,
-0.07741124927997589,
-0.06981337815523148,
0.06816867738962173,
0.16786792874336243,
-0.04803545027971268,
0.06811827421188354,
-0.044750530272722244,
-0.05216148495674133,
0.08190646022558212,
-0.09911773353815079,
0.0821533054113388,
-0.1190008744597435,
-0.0803326889872551,
-0.05947139859199524,
0.1492116004228592,
-0.006935656536370516,
-0.02427426166832447,
0.018252519890666008,
0.06561808288097382,
-0.09175474941730499,
-0.09696339815855026,
-0.0034650471061468124,
-0.001363875693641603,
0.01065068133175373,
0.12932494282722473,
0.015930606052279472,
0.05405030399560928,
-0.004547981545329094,
-0.0059397039003670216,
0.13831570744514465,
0.24766747653484344,
-0.059074681252241135,
0.028059186413884163,
0.09411416947841644,
-0.015936754643917084,
-0.27820056676864624,
-0.062012869864702225,
-0.0811314731836319,
0.06003759428858757,
-0.034191884100437164,
0.040241632610559464,
0.14513720571994781,
0.058818936347961426,
-0.005309772212058306,
0.07908650487661362,
-0.32624396681785583,
-0.10378065705299377,
0.19863156974315643,
0.06424389779567719,
0.4301724135875702,
-0.09055387228727341,
-0.019902175292372704,
0.0352666936814785,
-0.19661137461662292,
0.07586758583784103,
0.026564517989754677,
0.014754004776477814,
-0.061416953802108765,
0.03500862792134285,
0.02034367248415947,
-0.0844949409365654,
0.15884672105312347,
0.02818654663860798,
0.051337093114852905,
-0.03821355476975441,
-0.00909993052482605,
0.06552053242921829,
0.0030314845498651266,
-0.00891666579991579,
0.07555466145277023,
0.08094070106744766,
-0.1725296974182129,
-0.009399556554853916,
-0.06778229773044586,
0.031159834936261177,
-0.0161079503595829,
-0.03457273170351982,
-0.020657286047935486,
-0.03385040909051895,
-0.02872975543141365,
-0.03544505313038826,
0.17023396492004395,
0.004541994538158178,
0.1576979160308838,
0.15138360857963562,
0.043847426772117615,
-0.1674213409423828,
-0.01681431196630001,
0.024286149069666862,
-0.02162018232047558,
0.007418166380375624,
-0.09215331822633743,
0.024116545915603638,
0.16622111201286316,
0.12629862129688263,
0.0035997938830405474,
0.09929228574037552,
-0.0026000766083598137,
0.010702395811676979,
0.10972476750612259,
-0.23683172464370728,
-0.057098738849163055,
0.02751384675502777,
-0.11174195259809494,
-0.06661436706781387,
0.10889655351638794,
0.06302490830421448,
-0.02089323289692402,
-0.017458949238061905,
-0.046080704778432846,
0.02571752481162548,
-0.051459308713674545,
0.2535494565963745,
0.022887252271175385,
0.01433184091001749,
-0.16334690153598785,
0.09271185100078583,
-0.04091900587081909,
-0.14772555232048035,
-0.08037339895963669,
-0.08573652058839798,
-0.13467107713222504,
-0.01789928413927555,
-0.028244128450751305,
0.03824613615870476,
-0.13991805911064148,
-0.04056427627801895,
-0.13927061855793,
-0.09313974529504776,
0.04531196877360344,
0.1771961897611618,
0.11739204823970795,
0.08706678450107574,
-0.012631661258637905,
-0.09743837267160416,
-0.05756203085184097,
0.009969856590032578,
0.22675594687461853,
0.016633639112114906,
-0.0701402798295021,
0.06615041196346283,
-0.0640973150730133,
0.04811665788292885,
-0.052628107368946075,
-0.005320153199136257,
-0.047598402947187424,
0.0711871013045311,
-0.04617636650800705,
-0.051489707082509995,
-0.10494334250688553,
-0.0725308433175087,
0.06548065692186356,
-0.03904712200164795,
-0.07415153831243515,
0.002573880599811673,
-0.09924305230379105,
0.051289528608322144,
0.018385788425803185,
0.007924611680209637,
-0.019362835213541985,
-0.011780282482504845,
0.08003303408622742,
-0.04139718785881996,
0.016074564307928085,
0.15259777009487152,
-0.029300397261977196,
0.08134294301271439,
-0.2003977745771408,
-0.032085955142974854,
0.09301657229661942,
0.01597362570464611,
0.03717060759663582,
-0.03520895540714264,
0.0728556290268898,
0.10154286026954651,
-0.050593577325344086,
0.0006050254451110959,
-0.06513647735118866,
-0.099769726395607,
-0.08992079645395279,
-0.0049751317128539085,
-0.049332551658153534,
-0.04345668852329254,
-0.06515942513942719,
0.0743519514799118,
0.14185336232185364,
0.08113061636686325,
-0.01161254197359085,
0.01882867142558098,
-0.08708145469427109,
0.01194673590362072,
-0.021464897319674492,
-0.07911906391382217,
-0.07028447836637497,
-0.03290059417486191,
0.06270110607147217,
0.010458434000611305,
0.1986897736787796,
-0.06979885697364807,
0.03604145720601082,
-0.024518344551324844,
-0.023459110409021378,
0.11830489337444305,
-0.0025853149127215147,
0.2556924521923065,
0.0638907253742218,
-0.01796453446149826,
-0.06073625013232231,
0.0833335667848587,
0.049854353070259094,
-0.0028115466702729464,
0.1279279738664627,
0.08166924118995667,
-0.0072715445421636105,
0.07651711255311966,
-0.06327440589666367,
-0.03782271593809128,
0.14785727858543396,
-0.11081711947917938,
-0.029044413939118385,
0.04027824103832245,
0.04812553524971008,
0.12589949369430542,
0.10803831368684769,
-0.030241310596466064,
-0.04587503895163536,
-0.01750597544014454,
-0.049378901720047,
-0.1200370192527771,
-0.013454345054924488,
-0.051793936640024185,
-0.11041088402271271,
0.05013412609696388,
-0.08230432122945786,
0.01023286022245884,
0.20684674382209778,
0.07118279486894608,
-0.025046922266483307,
0.047976065427064896,
-0.04642205685377121,
-0.048180971294641495,
0.07010974735021591,
0.041719529777765274,
-0.0008890393655747175,
-0.0061182077042758465,
0.007143013644963503,
-0.08311603963375092,
-0.03531806543469429,
0.003473206888884306,
-0.016140740364789963,
-0.07644281536340714,
0.011641353368759155,
-0.04003588482737541,
-0.0581478513777256,
-0.054085440933704376,
0.009029936045408249,
0.05568632110953331,
0.10397897660732269,
0.029301011934876442,
0.04997393116354942,
-0.01265010703355074,
0.18654783070087433,
-0.10940445214509964,
0.011288418434560299,
-0.0970403179526329,
0.14481648802757263,
0.06400774419307709,
0.0530218780040741,
0.0584535114467144,
-0.05105865001678467,
0.025411920621991158,
0.21332287788391113,
0.13822472095489502,
0.05754322186112404,
0.01299198903143406,
-0.004150835797190666,
0.01789262890815735,
0.04143219813704491,
-0.007250935770571232,
0.049349602311849594,
0.21912474930286407,
-0.054095543920993805,
-0.07622291147708893,
-0.04161660000681877,
-0.019523458555340767,
-0.0015339883975684643,
0.01876966655254364,
0.009057899937033653,
-0.07746359705924988,
-0.0801364928483963,
0.1203140988945961,
-0.11182467639446259,
-0.09730657190084457,
0.1623132824897766,
-0.17307023704051971,
-0.06329777836799622,
-0.029037630185484886,
0.04068838059902191,
0.04399683699011803,
0.07793872058391571,
-0.08786927908658981,
-0.01487633679062128,
0.07537916302680969,
0.06405283510684967,
-0.1716928780078888,
-0.11172817647457123,
0.09316468983888626,
0.016178272664546967,
0.06946146488189697,
-0.008838155306875706,
0.09375279396772385,
0.0827920213341713,
0.03343794867396355,
-0.025664905086159706,
0.013485175557434559,
0.0583796352148056,
0.018913788720965385,
0.019719919189810753,
0.0489928312599659,
0.05104328319430351,
-0.03735487908124924,
0.011884710751473904,
-0.1351632922887802,
0.03391186147928238,
-0.11095632612705231,
0.028630243614315987,
-0.08643683046102524,
0.04436907172203064,
-0.0014501909026876092,
0.12626466155052185,
0.086760975420475,
-0.06227096915245056,
0.04090794175863266,
-0.03305055946111679,
0.04323345050215721,
0.03344625234603882,
-0.02669144608080387,
-0.02031221240758896,
-0.1324554681777954,
0.00623807217925787,
-0.03239875286817551,
-0.03836248070001602,
-0.11256541311740875,
0.013342341408133507,
-0.06552452594041824,
-0.04513450339436531,
-0.018278779461979866,
0.08738042414188385,
0.0939357653260231,
0.051321838051080704,
-0.027941156178712845,
0.08329601585865021,
0.024279695004224777,
0.1069621741771698,
-0.06987731158733368,
-0.12098768353462219
] |
null | null |
transformers
|
# SimCLS
SimCLS is a framework for abstractive summarization presented in [SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization](https://arxiv.org/abs/2106.01890).
It is a two-stage approach consisting of a *generator* and a *scorer*. In the first stage, a large pre-trained abstractive summarization model (the *generator*) produces candidate summaries; in the second stage, the *scorer* assigns a score to each candidate given the source document. The final summary is the highest-scoring candidate.
This model is the *scorer* trained for summarization of XSum ([paper](https://arxiv.org/abs/1808.08745), [datasets](https://huggingface.co/datasets/xsum)). It should be used in conjunction with [google/pegasus-xsum](https://huggingface.co/google/pegasus-xsum). See [our Github repository](https://github.com/andrejmiscic/simcls-pytorch) for details on training, evaluation, and usage.
## Usage
```bash
git clone https://github.com/andrejmiscic/simcls-pytorch.git
cd simcls-pytorch
pip3 install torch torchvision torchaudio transformers sentencepiece
```
```python
from src.model import SimCLS, GeneratorType
summarizer = SimCLS(generator_type=GeneratorType.Pegasus,
generator_path="google/pegasus-xsum",
scorer_path="andrejmiscic/simcls-scorer-xsum")
article = "This is a news article."
summary = summarizer(article)
print(summary)
```
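Under the hood, the scorer is a RoBERTa encoder and, following the SimCLS paper, candidates are ranked by the cosine similarity between their encoding and that of the source document. The sketch below spells out that two-stage flow explicitly; it is not the repository's internal code, it assumes the checkpoint loads as a plain RoBERTa encoder via `AutoModel`, and the beam-search settings for candidate generation are illustrative.
```python
# Illustrative two-stage flow: generate candidates with Pegasus, then rerank them
# with the RoBERTa-based scorer using cosine similarity of first-token encodings.
import torch
from transformers import (AutoModel, AutoTokenizer,
                          PegasusForConditionalGeneration, PegasusTokenizer)

gen_tok = PegasusTokenizer.from_pretrained("google/pegasus-xsum")
generator = PegasusForConditionalGeneration.from_pretrained("google/pegasus-xsum")
scorer_tok = AutoTokenizer.from_pretrained("andrejmiscic/simcls-scorer-xsum")
scorer = AutoModel.from_pretrained("andrejmiscic/simcls-scorer-xsum").eval()

def encode(text):
    """Return the L2-normalized first-token representation of `text`."""
    inputs = scorer_tok(text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        hidden = scorer(**inputs).last_hidden_state[:, 0]
    return torch.nn.functional.normalize(hidden, dim=-1)

def summarize(article, num_candidates=8):
    # Stage 1: candidate summaries via beam search (decoding settings are assumptions).
    batch = gen_tok(article, truncation=True, return_tensors="pt")
    ids = generator.generate(**batch, num_beams=num_candidates,
                             num_return_sequences=num_candidates)
    candidates = gen_tok.batch_decode(ids, skip_special_tokens=True)
    # Stage 2: pick the candidate whose encoding is most similar to the document's.
    doc_emb = encode(article)
    scores = [float((doc_emb * encode(c)).sum()) for c in candidates]
    return candidates[max(range(num_candidates), key=scores.__getitem__)]
```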
### Results
All of our results are reported together with 95% confidence intervals computed using 10,000 bootstrap iterations; a sketch of the procedure follows the table below. See [SimCLS paper](https://arxiv.org/abs/2106.01890) for a description of baselines.
| System | Rouge-1 | Rouge-2 | Rouge-L |
|------------------|----------------------:|----------------------:|----------------------:|
| Pegasus | 47.21 | 24.56 | 39.25 |
| **SimCLS paper** | --- | --- | --- |
| Origin | 47.10 | 24.53 | 39.23 |
| Min | 40.97 | 19.18 | 33.68 |
| Max | 52.45 | 28.28 | 43.36 |
| Random | 46.72 | 23.64 | 38.55 |
| **SimCLS** | 47.61 | 24.57 | 39.44 |
| **Our results** | --- | --- | --- |
| Origin | 47.16, [46.85, 47.48] | 24.59, [24.25, 24.92] | 39.30, [38.96, 39.62] |
| Min | 41.06, [40.76, 41.34] | 18.30, [18.03, 18.56] | 32.70, [32.42, 32.97] |
| Max | 51.83, [51.53, 52.14] | 28.92, [28.57, 29.26] | 44.02, [43.69, 44.36] |
| Random | 46.47, [46.17, 46.78] | 23.45, [23.13, 23.77] | 38.28, [37.96, 38.60] |
| **SimCLS** | 47.17, [46.87, 47.46] | 23.90, [23.59, 24.23] | 38.96, [38.64, 39.29] |
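The intervals above follow a standard nonparametric (percentile) bootstrap over test-set documents. A minimal sketch of that procedure, assuming the per-document ROUGE values have already been collected into an array, looks like this:
```python
# Percentile bootstrap for the mean of per-document ROUGE scores.
import numpy as np

def bootstrap_ci(per_example_scores, n_iterations=10000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    scores = np.asarray(per_example_scores, dtype=float)
    n = len(scores)
    means = np.empty(n_iterations)
    for i in range(n_iterations):
        resample = rng.choice(scores, size=n, replace=True)  # sample with replacement
        means[i] = resample.mean()
    lower, upper = np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return scores.mean(), (lower, upper)
```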
### Citation of the original work
```bibtex
@inproceedings{liu-liu-2021-simcls,
title = "{S}im{CLS}: A Simple Framework for Contrastive Learning of Abstractive Summarization",
author = "Liu, Yixin and
Liu, Pengfei",
booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.acl-short.135",
doi = "10.18653/v1/2021.acl-short.135",
pages = "1065--1072",
}
```
|
{"language": ["en"], "tags": ["simcls"], "datasets": ["xsum"]}
|
feature-extraction
|
andrejmiscic/simcls-scorer-xsum
|
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"simcls",
"en",
"dataset:xsum",
"arxiv:2106.01890",
"arxiv:1808.08745",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2106.01890",
"1808.08745"
] |
[
"en"
] |
TAGS
#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-xsum #arxiv-2106.01890 #arxiv-1808.08745 #endpoints_compatible #region-us
|
SimCLS
======
SimCLS is a framework for abstractive summarization presented in SimCLS: A Simple Framework for Contrastive Learning of Abstractive Summarization.
It is a two-stage approach consisting of a *generator* and a *scorer*. In the first stage, a large pre-trained model for abstractive summarization (the *generator*) is used to generate candidate summaries, whereas, in the second stage, the *scorer* assigns a score to each candidate given the source document. The final summary is the highest-scoring candidate.
This model is the *scorer* trained for summarization of XSum (paper, datasets). It should be used in conjunction with google/pegasus-xsum. See our Github repository for details on training, evaluation, and usage.
Usage
-----
### Results
All of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.
of the original work
|
[
"### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\n\n\n\nof the original work"
] |
[
"TAGS\n#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-xsum #arxiv-2106.01890 #arxiv-1808.08745 #endpoints_compatible #region-us \n",
"### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\n\n\n\nof the original work"
] |
[
58,
44
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #feature-extraction #simcls #en #dataset-xsum #arxiv-2106.01890 #arxiv-1808.08745 #endpoints_compatible #region-us \n### Results\n\n\nAll of our results are reported together with 95% confidence intervals computed using 10000 iterations of bootstrap. See SimCLS paper for a description of baselines.\n\n\n\nof the original work"
] |
[
-0.17033986747264862,
0.032596196979284286,
-0.00137401616666466,
-0.01656252145767212,
0.07892412692308426,
0.0014636045088991523,
-0.02782597206532955,
0.06548700481653214,
0.05138135328888893,
0.059431616216897964,
0.1426418274641037,
0.2276240885257721,
0.04077858477830887,
0.16872833669185638,
-0.10592474043369293,
-0.08768867701292038,
0.030237112194299698,
0.07422394305467606,
-0.11356596648693085,
0.15813028812408447,
0.06845787912607193,
-0.1501152515411377,
0.0755477100610733,
-0.0004210730839986354,
-0.14258110523223877,
0.016900675371289253,
0.05069328099489212,
-0.08904584497213364,
0.06929250806570053,
-0.010480990633368492,
0.11684293299913406,
0.046326588839292526,
0.041786372661590576,
-0.085638627409935,
0.014977885410189629,
-0.0011842661770060658,
-0.004373420961201191,
0.07829391956329346,
0.04689507558941841,
-0.053145796060562134,
0.16984400153160095,
0.05918406695127487,
0.027513908222317696,
0.03314860537648201,
-0.1326644867658615,
-0.10469689220190048,
-0.06504940241575241,
0.044493742287158966,
0.12667739391326904,
0.032075054943561554,
0.009824451059103012,
0.1875077188014984,
-0.1271631121635437,
0.08655014634132385,
0.1309409886598587,
-0.22196339070796967,
-0.02254421077668667,
0.0961168184876442,
0.029235344380140305,
-0.0706084817647934,
-0.03584892675280571,
0.03207802027463913,
0.06970151513814926,
-0.03290509805083275,
-0.08359815180301666,
-0.07365730404853821,
-0.055056069046258926,
0.027962524443864822,
-0.11504506319761276,
-0.08389844000339508,
0.25491246581077576,
0.023656662553548813,
-0.054772306233644485,
0.03010229393839836,
-0.09669376164674759,
-0.041749030351638794,
-0.021170061081647873,
0.0053969440050423145,
-0.021300649270415306,
-0.016214244067668915,
-0.06859125941991806,
0.15786299109458923,
-0.09329491853713989,
-0.058857034891843796,
-0.07683394104242325,
0.181528240442276,
0.0010219793766736984,
0.11497951298952103,
-0.08190333098173141,
0.13195329904556274,
-0.01737235300242901,
-0.09998886287212372,
-0.012533346191048622,
-0.07907847315073013,
-0.007600685115903616,
0.02386908233165741,
-0.03820694610476494,
0.06534937769174576,
0.022855553776025772,
0.19928671419620514,
0.0981292575597763,
0.030429961159825325,
0.16210919618606567,
0.03729822114109993,
0.05393294617533684,
0.0642903745174408,
-0.05511246249079704,
-0.13477712869644165,
0.027844415977597237,
-0.0722721740603447,
0.019410260021686554,
-0.026352187618613243,
-0.003528282977640629,
-0.024688659235835075,
0.008734876289963722,
0.06061287969350815,
-0.03569939360022545,
-0.013143707066774368,
-0.08099009096622467,
-0.025009412318468094,
-0.019462812691926956,
-0.0863352119922638,
-0.05044523999094963,
-0.0870976373553276,
-0.07126020640134811,
0.08939982205629349,
0.05648760497570038,
-0.003544834442436695,
-0.0007266458705998957,
0.050365470349788666,
-0.12317685037851334,
-0.016966721042990685,
-0.05915717035531998,
-0.06916500627994537,
0.0032260813750326633,
-0.16535334289073944,
0.0585358589887619,
-0.08829948306083679,
-0.14956848323345184,
-0.018316777423024178,
0.052563153207302094,
-0.03925682604312897,
0.004043601453304291,
-0.013008585199713707,
0.002255883999168873,
-0.0009045713231898844,
-0.012185974977910519,
0.0597563311457634,
-0.05682352930307388,
0.08283524215221405,
0.0796792209148407,
0.06773926317691803,
-0.10995319485664368,
0.008288000710308552,
-0.12205548584461212,
0.037628646939992905,
-0.07478608936071396,
-0.03561064600944519,
-0.023717934265732765,
0.13870453834533691,
-0.05199039727449417,
-0.06698356568813324,
-0.039140790700912476,
-0.013361864723265171,
0.06058894470334053,
0.1300232708454132,
-0.11409500986337662,
-0.028209920972585678,
0.1587233990430832,
-0.1314585953950882,
-0.22214645147323608,
0.06111803278326988,
-0.0488983690738678,
0.11653892695903778,
0.029959972947835922,
0.06937933713197708,
0.004045739769935608,
-0.024084679782390594,
0.027151327580213547,
0.0016905020456761122,
0.09599389135837555,
-0.19931861758232117,
0.04807938635349274,
-0.0655735433101654,
-0.028343014419078827,
0.08315590023994446,
0.04480111971497536,
0.03740681707859039,
-0.10027015209197998,
-0.030937300994992256,
-0.05708222836256027,
-0.08388806134462357,
-0.045979782938957214,
0.05687585845589638,
0.05845247209072113,
-0.07449787110090256,
0.00889640860259533,
-0.1503889560699463,
0.08379945158958435,
-0.032018255442380905,
-0.035625897347927094,
-0.08007379621267319,
0.20103377103805542,
-0.17469489574432373,
-0.08133406937122345,
-0.18443916738033295,
0.07150335609912872,
-0.030551142990589142,
0.10310836136341095,
-0.06723552197217941,
0.03173661604523659,
0.060213930904865265,
-0.05752203240990639,
0.016507908701896667,
0.013789888471364975,
0.05712522193789482,
0.009883258491754532,
-0.05131080001592636,
-0.049329232424497604,
-0.021229950711131096,
-0.0559815876185894,
0.013472042046487331,
-0.07983489334583282,
-0.03827409818768501,
-0.04397260770201683,
0.13976235687732697,
-0.012682647444307804,
-0.061667732894420624,
0.0062295603565871716,
0.027682365849614143,
-0.08285349607467651,
0.010461447760462761,
0.056724779307842255,
-0.007256396114826202,
-0.07310129702091217,
0.0006285262061282992,
-0.06995996087789536,
0.14224474132061005,
0.1175316870212555,
-0.1266728639602661,
-0.026333758607506752,
-0.017609739676117897,
-0.023367300629615784,
0.011693871580064297,
0.0036155374255031347,
0.02762693725526333,
-0.04723431169986725,
-0.038622595369815826,
0.06596755236387253,
-0.06861719489097595,
-0.03166138380765915,
0.045718275010585785,
-0.06722472608089447,
0.030031001195311546,
0.12841728329658508,
0.19883954524993896,
-0.11338592320680618,
0.0781468078494072,
0.21233049035072327,
0.03523726388812065,
0.032420460134744644,
-0.09245310723781586,
-0.1122826337814331,
-0.015715470537543297,
-0.018640749156475067,
-0.01356123760342598,
0.17763611674308777,
-0.23800985515117645,
-0.03281397372484207,
0.06922699511051178,
-0.02208888903260231,
0.04291417449712753,
-0.1264614760875702,
-0.01783199980854988,
0.028977321460843086,
0.040092598646879196,
-0.12379900366067886,
0.08264141529798508,
-0.030432485044002533,
0.10038869082927704,
-0.09417425841093063,
-0.052956294268369675,
-0.01047324389219284,
-0.022535273805260658,
-0.04929293692111969,
0.19245445728302002,
-0.023483505472540855,
-0.09237624704837799,
-0.09973781555891037,
-0.012709764763712883,
-0.008375677280128002,
0.012881739996373653,
0.005980115849524736,
-0.009837044402956963,
-0.09054344147443771,
-0.0027794395573437214,
0.008402116596698761,
-0.14331728219985962,
0.06019573286175728,
0.05841720849275589,
0.019646788015961647,
-0.03206156939268112,
-0.08909523487091064,
-0.046883028000593185,
-0.09637375921010971,
0.004393670707941055,
0.07439330220222473,
-0.0673932433128357,
0.12655675411224365,
0.1419239491224289,
-0.026318151503801346,
0.05255875363945961,
0.017356326803565025,
0.1797640025615692,
-0.011594324372708797,
-0.05630922690033913,
0.21402180194854736,
-0.027821464464068413,
-0.0037347162142395973,
0.10480912774801254,
0.01609962247312069,
-0.12853915989398956,
-0.001790348207578063,
-0.06703293323516846,
-0.10825764387845993,
-0.16406512260437012,
-0.08404744416475296,
-0.11005755513906479,
0.019910501316189766,
0.03587969020009041,
-0.024517374113202095,
-0.038407132029533386,
0.1296999603509903,
0.012110370211303234,
-0.0914887860417366,
-0.07752879709005356,
-0.01680523343384266,
0.01583651639521122,
-0.01919408328831196,
0.12407194077968597,
-0.036668356508016586,
-0.033450573682785034,
0.04279443621635437,
-0.06895646452903748,
0.2142299860715866,
0.015930503606796265,
-0.02658648043870926,
0.03284704312682152,
0.22589750587940216,
0.04445120692253113,
0.2633066475391388,
0.03130362555384636,
-0.04774574935436249,
-0.0030637236777693033,
0.017664914950728416,
-0.16970136761665344,
0.022112922742962837,
0.05523483827710152,
-0.0042044976726174355,
-0.10456326603889465,
-0.05586814135313034,
0.027771102264523506,
0.044425565749406815,
0.2033413052558899,
-0.28701314330101013,
-0.07350952923297882,
0.0348363071680069,
-0.000369446468539536,
-0.09071711450815201,
0.059119679033756256,
0.007949368096888065,
-0.08580829203128815,
0.014461871236562729,
-0.042684439569711685,
0.0791856199502945,
0.017277147620916367,
-0.03413941711187363,
-0.0683630183339119,
-0.07309725135564804,
0.008592606522142887,
0.06459001451730728,
-0.17601259052753448,
0.24971750378608704,
-0.01962977834045887,
0.006381772458553314,
-0.04946626350283623,
0.022276869043707848,
-0.015484381467103958,
-0.04666262865066528,
0.17324991524219513,
0.020543958991765976,
-0.18280856311321259,
-0.13088665902614594,
-0.07897698879241943,
0.0566490963101387,
0.06510020047426224,
-0.06246047466993332,
0.08208218216896057,
-0.018278753384947777,
0.02605654112994671,
0.03554448485374451,
0.009631505236029625,
-0.03003835678100586,
-0.1076524555683136,
0.012695045210421085,
-0.024328751489520073,
-0.08836954832077026,
-0.03841309994459152,
-0.04749462008476257,
-0.00479783583432436,
0.107761912047863,
0.0024978125002235174,
-0.07465660572052002,
-0.08509667217731476,
0.04646232724189758,
0.1825942099094391,
-0.056598179042339325,
0.0918358862400055,
-0.058359429240226746,
-0.05381025746464729,
0.07734429836273193,
-0.0934622660279274,
0.10108328610658646,
-0.128027081489563,
-0.0575871542096138,
-0.06429298222064972,
0.14475108683109283,
-0.029135452583432198,
-0.02707635425031185,
0.015165071003139019,
0.06682609021663666,
-0.09680206328630447,
-0.10289694368839264,
-0.002849164418876171,
0.022443905472755432,
0.000040085811633616686,
0.13201628625392914,
-0.01057349145412445,
0.07902713119983673,
0.019323056563735008,
-0.02188100852072239,
0.15218603610992432,
0.21504366397857666,
-0.07280752062797546,
0.031723231077194214,
0.06440962105989456,
-0.034952349960803986,
-0.27903807163238525,
-0.049082186073064804,
-0.06421651691198349,
0.07053426653146744,
-0.014943563379347324,
0.049454279243946075,
0.14641515910625458,
0.054563600569963455,
-0.0010072588920593262,
0.07224146276712418,
-0.3049624562263489,
-0.1058667004108429,
0.20854605734348297,
0.08248686045408249,
0.4063519835472107,
-0.10386794805526733,
-0.01895551197230816,
0.015629133209586143,
-0.19500890374183655,
0.04897569119930267,
0.04921991005539894,
0.025493059307336807,
-0.06979332864284515,
0.011020862497389317,
0.026019956916570663,
-0.08267813920974731,
0.1596723049879074,
0.02403266727924347,
0.05348466336727142,
-0.03451864793896675,
0.019943270832300186,
0.0322488397359848,
-0.007838485762476921,
-0.0010101345833390951,
0.07245858758687973,
0.06867944449186325,
-0.19024144113063812,
-0.008648311719298363,
-0.05505216494202614,
0.055962447077035904,
-0.015861881896853447,
-0.04613687843084335,
-0.014205113053321838,
-0.027455559000372887,
-0.019730094820261,
-0.034995898604393005,
0.15351857244968414,
0.004709779750555754,
0.16404232382774353,
0.17283256351947784,
0.025281118229031563,
-0.1711474359035492,
-0.047750819474458694,
0.03135145828127861,
-0.004793752450495958,
0.0034062161576002836,
-0.08959285914897919,
0.03369096666574478,
0.1772278994321823,
0.12475312501192093,
0.012918135151267052,
0.09742603451013565,
0.00457617687061429,
0.0019190587336197495,
0.10610637813806534,
-0.21245376765727997,
-0.05033394694328308,
0.028777362778782845,
-0.12428699433803558,
-0.07151169329881668,
0.10627752542495728,
0.052936602383852005,
-0.021719301119446754,
-0.013231324963271618,
-0.059976644814014435,
0.03723137825727463,
-0.0511322095990181,
0.26510268449783325,
0.01917334459722042,
0.019998442381620407,
-0.1516113579273224,
0.09780928492546082,
-0.03609480708837509,
-0.1397797018289566,
-0.08625272661447525,
-0.1029009222984314,
-0.12281019240617752,
-0.021963119506835938,
-0.015288397669792175,
0.017932023853063583,
-0.13792209327220917,
-0.04349435865879059,
-0.1463964879512787,
-0.10369092226028442,
0.03346475586295128,
0.14804372191429138,
0.12548893690109253,
0.07942179590463638,
-0.0029444906394928694,
-0.08696580678224564,
-0.06587004661560059,
0.01801510713994503,
0.22693975269794464,
0.03298996016383171,
-0.09333976358175278,
0.04929160699248314,
-0.06583619862794876,
0.02810964547097683,
-0.03980579599738121,
-0.003593274625018239,
-0.06927190721035004,
0.06518518924713135,
-0.05120013654232025,
-0.04540308564901352,
-0.11309609562158585,
-0.06724657118320465,
0.06699125468730927,
-0.020352480933070183,
-0.06485744565725327,
0.014956679195165634,
-0.10383052378892899,
0.03921779617667198,
0.014458865858614445,
0.012893548235297203,
-0.006341245956718922,
-0.021430395543575287,
0.08214415609836578,
-0.04452437162399292,
0.017655685544013977,
0.14773567020893097,
-0.016020193696022034,
0.072006456553936,
-0.18412388861179352,
-0.028296789154410362,
0.09347962588071823,
0.021362019702792168,
0.040851891040802,
-0.0697430819272995,
0.07563845813274384,
0.10469024628400803,
-0.040983106940984726,
-0.007456863299012184,
-0.06940222531557083,
-0.10908836871385574,
-0.1079913079738617,
-0.013599223457276821,
-0.05507954955101013,
-0.03841259330511093,
-0.05705070495605469,
0.07818513363599777,
0.1483212262392044,
0.06388247013092041,
-0.0073476023972034454,
0.029681529849767685,
-0.10253629088401794,
0.009720874950289726,
-0.020635688677430153,
-0.06994135677814484,
-0.06514330208301544,
-0.040639035403728485,
0.054347701370716095,
0.014353103935718536,
0.18456119298934937,
-0.04312661290168762,
0.06521130353212357,
-0.027869675308465958,
-0.030911214649677277,
0.14155711233615875,
0.00887906365096569,
0.260456919670105,
0.04735705628991127,
-0.017451873049139977,
-0.04471805319190025,
0.09584258496761322,
0.05529391020536423,
0.014954326674342155,
0.12783856689929962,
0.09434032440185547,
-0.029891202226281166,
0.07025478035211563,
-0.06573225557804108,
-0.047160085290670395,
0.12758295238018036,
-0.08015289157629013,
-0.006375532131642103,
0.04343641549348831,
0.046130936592817307,
0.1270512491464615,
0.07849520444869995,
-0.06352059543132782,
-0.03264281153678894,
-0.03808426856994629,
-0.058413758873939514,
-0.1337827444076538,
0.00038635110831819475,
-0.04941854998469353,
-0.11082861572504044,
0.03688051179051399,
-0.09049437195062637,
0.00346165569499135,
0.20133966207504272,
0.05640028417110443,
-0.025192389264702797,
0.0448429100215435,
-0.06425823271274567,
-0.04583892226219177,
0.05299844220280647,
0.04662533849477768,
-0.007911461405456066,
0.000168134574778378,
0.013492417521774769,
-0.09080105274915695,
-0.03120291233062744,
0.006157929077744484,
-0.027343356981873512,
-0.08709462732076645,
0.004579006694257259,
-0.02724584937095642,
-0.03955414891242981,
-0.06072620302438736,
0.0070447251200675964,
0.04580603539943695,
0.09764990955591202,
0.023203371092677116,
0.036830585449934006,
-0.021405750885605812,
0.20329317450523376,
-0.1232471764087677,
0.020304476842284203,
-0.08623959124088287,
0.15120218694210052,
0.07095712423324585,
0.04737887158989906,
0.05898401513695717,
-0.03958536684513092,
0.0371311716735363,
0.20011934638023376,
0.11748959869146347,
0.05639271438121796,
0.014362422749400139,
0.0016634106868878007,
0.013098586350679398,
0.0279996395111084,
0.011427119374275208,
0.05089270696043968,
0.2167370319366455,
-0.04724057391285896,
-0.05114222317934036,
-0.040487006306648254,
-0.029303882271051407,
-0.020879806950688362,
0.03886975720524788,
0.003015996655449271,
-0.06474745273590088,
-0.06870205700397491,
0.11407551914453506,
-0.1010264977812767,
-0.10743387043476105,
0.17858374118804932,
-0.18386638164520264,
-0.06839169561862946,
-0.03718537464737892,
0.039120957255363464,
0.041083358228206635,
0.07615530490875244,
-0.10328010469675064,
-0.00781124597415328,
0.09323064982891083,
0.07010321319103241,
-0.17295974493026733,
-0.10855081677436829,
0.08599012345075607,
0.00675625167787075,
0.06249985098838806,
-0.014720636419951916,
0.07985933125019073,
0.0869014784693718,
0.046820636838674545,
-0.041142284870147705,
0.0032941116951406,
0.06004184111952782,
0.010223878547549248,
0.03519386798143387,
0.058477070182561874,
0.0349838063120842,
-0.00045993959065526724,
0.0013062068028375506,
-0.13117605447769165,
0.02658946067094803,
-0.09628740698099136,
0.03859374299645424,
-0.0884980857372284,
0.03372865170240402,
-0.030241921544075012,
0.13453912734985352,
0.08335530757904053,
-0.0636061280965805,
0.02708525024354458,
-0.04290761053562164,
0.04903392121195793,
0.02683679573237896,
-0.008533379063010216,
-0.026436718180775642,
-0.14406418800354004,
0.016365166753530502,
-0.007402108516544104,
-0.0449465811252594,
-0.11744791269302368,
0.0009520488092675805,
-0.051317013800144196,
-0.04632692039012909,
-0.028862211853265762,
0.0941256582736969,
0.08709832280874252,
0.05021527037024498,
-0.03518448770046234,
0.05652095749974251,
0.00033175284625031054,
0.11223793774843216,
-0.04840332642197609,
-0.11615771055221558
] |
null | null |
transformers
|
{"language": false, "license": "cc-by-4.0", "tags": ["translation"], "widget": [{"text": "Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua."}]}
|
translation
|
andrek/LAT2NOB
|
[
"transformers",
"pytorch",
"jax",
"t5",
"text2text-generation",
"translation",
"no",
"license:cc-by-4.0",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"no"
] |
TAGS
#transformers #pytorch #jax #t5 #text2text-generation #translation #no #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
[] |
[
"TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation #no #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
65
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation #no #license-cc-by-4.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.011113007552921772,
0.0444910041987896,
-0.006725424900650978,
0.04868190735578537,
0.1441434770822525,
0.0225673820823431,
0.15364035964012146,
0.1192670464515686,
-0.02185981534421444,
-0.04449192062020302,
0.133904367685318,
0.21237300336360931,
0.015930255874991417,
0.022381654009222984,
-0.07557264715433121,
-0.25125652551651,
0.031183775514364243,
0.05665701627731323,
0.03046879544854164,
0.11474844813346863,
0.09892088919878006,
-0.039891328662633896,
0.10526151955127716,
-0.024646297097206116,
-0.1372194141149521,
0.028298718854784966,
0.07734295725822449,
-0.1247144490480423,
0.09504100680351257,
0.0750681459903717,
0.058729350566864014,
0.07457824796438217,
-0.03500743582844734,
-0.14295151829719543,
0.01840752735733986,
0.0041891420260071754,
-0.1098027303814888,
0.055370308458805084,
0.09555273503065109,
-0.0546380840241909,
0.12248706817626953,
0.020266473293304443,
-0.03351413086056709,
0.08315533399581909,
-0.13922445476055145,
-0.0634009838104248,
-0.0591031089425087,
0.055120594799518585,
0.059239014983177185,
0.09379560500383377,
0.0031805788166821003,
0.12495805323123932,
-0.0720442682504654,
0.1007494255900383,
0.1400282233953476,
-0.41574135422706604,
-0.0027897797990590334,
0.07664203643798828,
0.08444889634847641,
0.1186809241771698,
-0.025724882259964943,
0.06956783682107925,
0.03622916340827942,
0.021658185869455338,
0.015671391040086746,
-0.09599242359399796,
-0.10279922932386398,
0.019122999161481857,
-0.06805050373077393,
-0.049216143786907196,
0.2523804008960724,
-0.05024898424744606,
0.022593967616558075,
-0.017738426104187965,
-0.07390379905700684,
-0.06299538910388947,
-0.01074872724711895,
0.007588647305965424,
-0.013874282129108906,
0.07563455402851105,
0.025622177869081497,
-0.05847162380814552,
-0.14651791751384735,
-0.005830046720802784,
-0.1969510316848755,
0.05860942229628563,
0.019109318032860756,
0.05306593328714371,
-0.17752112448215485,
0.07945820689201355,
0.01084638200700283,
-0.1005430668592453,
0.017875468358397484,
-0.09456072747707367,
0.08898701518774033,
0.0052075511775910854,
-0.04600628465414047,
-0.05956481769680977,
0.059943318367004395,
0.08096954971551895,
-0.03695524483919144,
0.023686468601226807,
-0.09471098333597183,
0.11408829689025879,
-0.024743888527154922,
0.04358319938182831,
-0.00757765956223011,
-0.0016036310698837042,
0.07933644950389862,
-0.12149379402399063,
0.03653033450245857,
-0.038682714104652405,
-0.1993517130613327,
-0.08437510579824448,
0.002213523956015706,
0.12303926050662994,
0.013721284456551075,
0.08592192083597183,
-0.020236391574144363,
-0.01607886329293251,
0.06775805354118347,
-0.05327562987804413,
0.003940509166568518,
0.009807093068957329,
0.03829089179635048,
0.13033674657344818,
0.04386330023407936,
0.008699266240000725,
-0.14496679604053497,
0.04152277857065201,
-0.07053548097610474,
0.00838262215256691,
-0.031111232936382294,
-0.07808646559715271,
0.05340171977877617,
-0.07578657567501068,
0.019716819748282433,
-0.1648884117603302,
-0.12312082201242447,
0.03597413748502731,
-0.0011947351740673184,
-0.026250818744301796,
-0.06282250583171844,
-0.03575941175222397,
-0.03907537832856178,
0.05210418999195099,
-0.0837392508983612,
-0.019527345895767212,
-0.05387004464864731,
0.1074889525771141,
-0.05668121576309204,
0.054520364850759506,
-0.15912403166294098,
0.06114409118890762,
-0.1324215829372406,
-0.030319994315505028,
-0.09094533324241638,
0.05213010311126709,
-0.010165941901504993,
0.0939265713095665,
-0.05777271091938019,
-0.06026608124375343,
-0.019539525732398033,
0.036783527582883835,
-0.04181807115674019,
0.195048987865448,
-0.1430751532316208,
-0.07642987370491028,
0.23339439928531647,
-0.08778176456689835,
-0.17422491312026978,
0.09775354713201523,
0.020199080929160118,
0.02980584278702736,
0.077785924077034,
0.1516215056180954,
0.013917520642280579,
-0.05903894454240799,
0.0824209600687027,
0.1163051426410675,
-0.06555566191673279,
-0.11019253730773926,
0.03816595673561096,
-0.05289965867996216,
-0.09281410276889801,
0.04161139577627182,
0.0028247854206711054,
0.058115266263484955,
-0.027293188497424126,
-0.03573846444487572,
-0.03868086636066437,
0.012415177188813686,
0.04886641725897789,
-0.007373135071247816,
0.09410540759563446,
-0.06718412786722183,
-0.03256889060139656,
0.008256076835095882,
-0.029471086338162422,
-0.014398663304746151,
0.07533005625009537,
-0.04379396513104439,
0.1042388305068016,
-0.02822532132267952,
0.03870847076177597,
-0.12422272562980652,
-0.009596963413059711,
-0.004014921374619007,
0.1539497822523117,
0.06428924947977066,
0.07435291260480881,
0.0373535230755806,
-0.013490435667335987,
-0.042694296687841415,
0.007658120710402727,
0.13246536254882812,
-0.0009241015650331974,
-0.05646327883005142,
-0.11772220581769943,
0.0595390610396862,
-0.020699944347143173,
-0.03374334052205086,
-0.07796188443899155,
0.013212001882493496,
0.06344299763441086,
0.09112092852592468,
-0.01981229893863201,
0.07552590221166611,
-0.03416912257671356,
0.028391290456056595,
-0.07877770811319351,
0.01052823942154646,
0.11356494575738907,
0.00402103690430522,
-0.10445092618465424,
0.276447594165802,
-0.17561176419258118,
0.2070440649986267,
0.22299110889434814,
-0.2679833173751831,
0.02294953539967537,
-0.0449334979057312,
-0.002654636511579156,
0.004939118400216103,
0.07782039046287537,
0.01119379885494709,
0.09118811786174774,
-0.0019129087449982762,
0.2044692039489746,
-0.09790759533643723,
-0.0229465551674366,
0.010956903919577599,
-0.05728158727288246,
-0.030235832557082176,
0.08587720990180969,
0.12192200869321823,
-0.1927311271429062,
0.1791437715291977,
0.2754870057106018,
0.022192664444446564,
0.1902310997247696,
-0.01941496692597866,
-0.018166683614253998,
0.03263641893863678,
0.011337777599692345,
-0.007957383058965206,
-0.05610177665948868,
-0.13199608027935028,
-0.014164399355649948,
0.08527962863445282,
0.04635424166917801,
0.06488056480884552,
-0.11386717855930328,
-0.0476672425866127,
-0.01644853502511978,
-0.03173566982150078,
-0.04437801614403725,
0.08786970376968384,
0.037833236157894135,
0.12990769743919373,
-0.0402056947350502,
-0.016991984099149704,
0.13044877350330353,
0.010955232195556164,
-0.12880641222000122,
0.16839773952960968,
-0.17402897775173187,
-0.2655417323112488,
-0.1806153953075409,
-0.15118521451950073,
-0.054953064769506454,
0.02347889542579651,
0.14331310987472534,
-0.05233421549201012,
-0.02672019600868225,
-0.025752827525138855,
0.00266033667139709,
-0.1306256353855133,
-0.022913789376616478,
-0.11339065432548523,
0.047709908336400986,
-0.06349597126245499,
-0.10260391980409622,
-0.045431576669216156,
0.021087273955345154,
-0.09442763030529022,
0.12558773159980774,
-0.11740933358669281,
0.04563438892364502,
0.14074258506298065,
-0.011553877964615822,
0.041546594351530075,
-0.07685361802577972,
0.16243161261081696,
-0.04460256174206734,
-0.0043192291632294655,
0.20879553258419037,
-0.015169919468462467,
0.06295886635780334,
0.11955270916223526,
0.013497285544872284,
-0.051065169274806976,
-0.0008094933000393212,
-0.0605677105486393,
-0.07511682063341141,
-0.2807048261165619,
-0.07786435633897781,
-0.14245404303073883,
0.06748868525028229,
0.03844721242785454,
0.05324888974428177,
0.14953431487083435,
0.06607185304164886,
-0.040120966732501984,
0.07254981249570847,
0.045204538851976395,
0.11202462017536163,
0.2553665339946747,
-0.001803274150006473,
0.11441762000322342,
-0.08096662163734436,
-0.0537777878344059,
0.0975322350859642,
0.07781092822551727,
0.10862527042627335,
0.07260147482156754,
0.11881745606660843,
0.049781691282987595,
0.14745785295963287,
0.14198097586631775,
0.11705681681632996,
0.036436777561903,
0.007874852046370506,
-0.04993272200226784,
-0.06749666482210159,
-0.024707654491066933,
0.027248984202742577,
0.01880635693669319,
-0.11857786029577255,
-0.07640586048364639,
-0.07009441405534744,
0.05168179050087929,
0.12255754321813583,
0.03296005725860596,
-0.1958189308643341,
0.03801657259464264,
0.07339153438806534,
0.0007719259592704475,
-0.1131349429488182,
0.10632379353046417,
-0.003532401053234935,
-0.12016323953866959,
0.10473303496837616,
-0.04845462366938591,
0.12050292640924454,
-0.012980960309505463,
0.06679729372262955,
-0.040614619851112366,
-0.09479960054159164,
0.032451990991830826,
0.11306624114513397,
-0.3829692304134369,
0.17978684604167938,
0.014846445992588997,
-0.027739018201828003,
-0.0726892352104187,
-0.021351953968405724,
-0.005108516663312912,
0.18909025192260742,
0.13894306123256683,
0.00713957566767931,
-0.06502575427293777,
-0.015687299892306328,
-0.006442282348871231,
0.014109350740909576,
0.10443465411663055,
-0.010756013914942741,
-0.017677202820777893,
-0.05091188848018646,
-0.0021837332751601934,
-0.03352053090929985,
0.05488860607147217,
-0.04548939689993858,
-0.17161165177822113,
0.07414260506629944,
0.057454995810985565,
0.06860263645648956,
0.005700501147657633,
-0.022221049293875694,
-0.08441650867462158,
0.19395318627357483,
-0.17907029390335083,
-0.10012736171483994,
-0.11793211102485657,
-0.11539284139871597,
0.06477576494216919,
-0.06488651037216187,
0.05204414948821068,
-0.08538004010915756,
-0.021020304411649704,
-0.06129448488354683,
-0.20758579671382904,
0.1386503279209137,
-0.08449172228574753,
-0.06103429198265076,
-0.03646478429436684,
0.17354625463485718,
-0.12017735093832016,
0.031554628163576126,
0.021636703982949257,
0.003052959218621254,
-0.08177172392606735,
-0.0944179967045784,
-0.011861798353493214,
-0.0005066675366833806,
0.05515472963452339,
-0.005885514430701733,
-0.11499712616205215,
-0.06940623372793198,
-0.03181900456547737,
-0.04206151142716408,
0.27633970975875854,
0.19781582057476044,
-0.06500622630119324,
0.18190674483776093,
0.15823014080524445,
-0.13285474479198456,
-0.26057180762290955,
-0.11664874106645584,
-0.11960123479366302,
-0.03403230383992195,
-0.009882810525596142,
-0.13592703640460968,
0.005400790832936764,
0.0076415748335421085,
-0.011725804768502712,
0.12301356345415115,
-0.2310335338115692,
-0.1026141494512558,
0.10043886303901672,
0.014357764273881912,
0.33392006158828735,
-0.13605807721614838,
-0.11734752357006073,
-0.04087470471858978,
-0.2147815227508545,
0.19651192426681519,
-0.06620751321315765,
0.09528548270463943,
-0.025823839008808136,
0.08775286376476288,
0.018187837675213814,
-0.030151180922985077,
0.07204817235469818,
-0.0219814945012331,
0.018111493438482285,
-0.10374269634485245,
-0.05039626359939575,
0.1028854250907898,
0.009597412310540676,
0.014844047836959362,
-0.1298929750919342,
0.018013961613178253,
-0.11266180127859116,
-0.023237863555550575,
-0.08895048499107361,
0.05398623272776604,
-0.022029254585504532,
-0.0594581738114357,
-0.0038023556116968393,
-0.05500961095094681,
0.02953554317355156,
-0.015909744426608086,
0.23524044454097748,
-0.06160188093781471,
0.1317899227142334,
0.18225470185279846,
0.16154658794403076,
-0.08765915036201477,
0.03156700357794762,
-0.0841277614235878,
-0.0827488899230957,
0.07423359900712967,
-0.11429943889379501,
0.043635305017232895,
0.12553353607654572,
-0.04624899476766586,
0.06719360500574112,
0.09402789175510406,
0.0036176249850541353,
0.0027748593129217625,
0.11436799168586731,
-0.19859202206134796,
-0.023199593648314476,
-0.062429096549749374,
0.01671871356666088,
0.08789700269699097,
0.051457714289426804,
0.19403210282325745,
-0.017049752175807953,
-0.03991992399096489,
0.006250880192965269,
0.008870009332895279,
-0.06980738043785095,
0.032322339713573456,
0.0046806479804217815,
0.004193649627268314,
-0.1282404512166977,
0.11171748489141464,
0.034870695322752,
-0.15442152321338654,
0.009935024194419384,
0.18278701603412628,
-0.1408742070198059,
-0.12739591300487518,
-0.024828452616930008,
0.07055606693029404,
-0.18903575837612152,
-0.04342970252037048,
-0.037991274148225784,
-0.1559569388628006,
0.06144837290048599,
0.14878317713737488,
0.022554170340299606,
0.0702887773513794,
-0.03976375237107277,
-0.07178101688623428,
-0.010616782121360302,
0.0038255401886999607,
-0.06244109198451042,
0.02271546795964241,
-0.08821810036897659,
0.08448692411184311,
-0.014428729191422462,
0.11479716002941132,
-0.06420392543077469,
-0.01084406953305006,
-0.12124704569578171,
0.024183940142393112,
-0.1414751261472702,
-0.011322847567498684,
-0.06953700631856918,
-0.03867913782596588,
-0.007284826133400202,
-0.02137904427945614,
-0.0514959990978241,
-0.029393037781119347,
-0.11784622818231583,
0.012062003836035728,
-0.034387700259685516,
0.07273639738559723,
-0.055987078696489334,
-0.01750750094652176,
0.040794067084789276,
-0.021624630317091942,
0.12831872701644897,
0.1029701679944992,
-0.11162696033716202,
0.10538159310817719,
-0.1409258097410202,
-0.05448806658387184,
0.07477200031280518,
0.02639753557741642,
0.044499896466732025,
0.08450733870267868,
0.013028199784457684,
0.09869986772537231,
0.003409312106668949,
0.04033893346786499,
-0.0016710308846086264,
-0.12895502150058746,
-0.024845417588949203,
-0.019918126985430717,
-0.11913180351257324,
-0.05621066316962242,
-0.003430143231526017,
0.06229740008711815,
-0.0008467131410725415,
0.1366741806268692,
-0.06432726979255676,
0.07726593315601349,
-0.05709131062030792,
0.01991286315023899,
-0.004203334450721741,
-0.1441156566143036,
-0.11722081154584885,
-0.10425948351621628,
0.019262349233031273,
-0.003935534972697496,
0.1978766918182373,
0.028951287269592285,
0.004263665527105331,
0.04130031168460846,
0.04965388774871826,
-0.021064555272459984,
0.0066953618079423904,
0.2581484019756317,
0.04240664467215538,
-0.03947542980313301,
-0.1304943561553955,
0.03736669942736626,
-0.01026113796979189,
-0.004231574013829231,
0.12696674466133118,
0.06628357619047165,
0.02619437873363495,
0.110025554895401,
0.02392006292939186,
0.01964910700917244,
-0.08033084124326706,
-0.11417447775602341,
0.06193380802869797,
0.09693148732185364,
-0.03332780674099922,
0.08308830857276917,
0.18776163458824158,
-0.03854767605662346,
0.019761445000767708,
-0.027128908783197403,
-0.04301066696643829,
-0.18424849212169647,
-0.16417522728443146,
-0.06485016644001007,
-0.11863076686859131,
0.007389910984784365,
-0.09548148512840271,
0.08490397781133652,
-0.002620008774101734,
0.07798907905817032,
-0.09658244997262955,
0.01512572355568409,
0.03763238340616226,
-0.13460683822631836,
0.08761437982320786,
-0.011467987671494484,
0.08557022362947464,
-0.03700621426105499,
-0.0103765819221735,
-0.04972028732299805,
-0.05769762024283409,
-0.02384006232023239,
0.0721486434340477,
0.002262681722640991,
0.026357345283031464,
-0.11986155062913895,
-0.09131824970245361,
-0.024180222302675247,
0.07656846940517426,
0.004077413119375706,
0.20594137907028198,
0.01085876114666462,
-0.034453134983778,
0.03986063227057457,
0.1668778508901596,
-0.06708694994449615,
-0.08822906762361526,
0.002742933575063944,
0.2464081346988678,
0.04807261377573013,
0.09266326576471329,
0.003671094076707959,
0.000530242279637605,
-0.04503447934985161,
0.31903138756752014,
0.2885515093803406,
-0.08761239796876907,
-0.001634150859899819,
0.02448836900293827,
0.03223051503300667,
0.10125432163476944,
0.13673095405101776,
0.07781486958265305,
0.24329347908496857,
-0.07340242713689804,
0.025958461686968803,
-0.034678541123867035,
0.03430356830358505,
-0.10844395309686661,
0.11306065320968628,
-0.009907981380820274,
-0.0991881713271141,
0.010477469302713871,
0.08459296077489853,
-0.20985408127307892,
0.09112150967121124,
-0.03215159475803375,
-0.11081407219171524,
-0.018246060237288475,
-0.02096456103026867,
0.13527916371822357,
0.03761613368988037,
0.042578715831041336,
-0.02514502964913845,
-0.06290861964225769,
0.08215594291687012,
0.014004964381456375,
-0.2033291906118393,
0.025655120611190796,
0.06201848387718201,
-0.08995717018842697,
0.031116371974349022,
-0.002184450626373291,
0.07623296231031418,
0.0790623128414154,
0.09270888566970825,
-0.05003160238265991,
0.059993088245391846,
0.006408952176570892,
-0.009650450199842453,
0.03546195104718208,
-0.011725964024662971,
0.011993605643510818,
-0.05167074501514435,
0.04647251218557358,
-0.09172306954860687,
0.07201354205608368,
0.011836569756269455,
-0.0621035136282444,
-0.013986346311867237,
0.027640430256724358,
-0.057357389479875565,
0.052104685455560684,
0.07152314484119415,
-0.015867087990045547,
-0.032512012869119644,
-0.08771568536758423,
-0.03940077871084213,
0.02492530457675457,
-0.10863379389047623,
-0.039347801357507706,
-0.06803732365369797,
-0.0714278370141983,
0.13234227895736694,
0.035548146814107895,
-0.20501312613487244,
0.02124202623963356,
-0.08670545369386673,
0.03556486964225769,
-0.22441457211971283,
0.07981348782777786,
0.09468921273946762,
-0.007422801572829485,
0.006081585772335529,
-0.08300436288118362,
0.05625935643911362,
0.07104276120662689,
-0.11060250550508499,
-0.06391263008117676
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-cased-finetuned-squad
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
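The card leaves this section open; purely as an illustration (not author-provided documentation), a SQuAD-style extractive QA checkpoint such as this one is typically queried through the `question-answering` pipeline:
```python
# Hedged usage sketch: extractive question answering with the transformers pipeline.
from transformers import pipeline

qa = pipeline("question-answering", model="andresestevez/bert-base-cased-finetuned-squad")

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="This model is a fine-tuned version of bert-base-cased on the SQuAD dataset.",
)
print(result["answer"], result["score"])  # predicted answer span and its confidence
```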
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
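The training script itself is not included with the card; purely as an illustration, the listed values map roughly onto the following `TrainingArguments` (the `output_dir` and anything not listed above are assumptions):
```python
# Illustrative mapping of the reported hyperparameters onto a Trainer configuration.
# This is not the author's script; SQuAD preprocessing and the Trainer call are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased-finetuned-squad",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # "Native AMP" mixed-precision training
)
```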
### Training results
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.2
- Datasets 1.13.3
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["squad"], "model-index": [{"name": "bert-base-cased-finetuned-squad", "results": []}]}
|
question-answering
|
andresestevez/bert-base-cased-finetuned-squad
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us
|
# bert-base-cased-finetuned-squad
This model is a fine-tuned version of bert-base-cased on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.2
- Datasets 1.13.3
- Tokenizers 0.10.3
|
[
"# bert-base-cased-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 2\n- eval_batch_size: 2\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.11.3\n- Pytorch 1.10.2\n- Datasets 1.13.3\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n",
"# bert-base-cased-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on the squad dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 2\n- eval_batch_size: 2\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP",
"### Training results",
"### Framework versions\n\n- Transformers 4.11.3\n- Pytorch 1.10.2\n- Datasets 1.13.3\n- Tokenizers 0.10.3"
] |
[
50,
39,
6,
12,
8,
3,
103,
4,
30
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-squad #license-apache-2.0 #endpoints_compatible #region-us \n# bert-base-cased-finetuned-squad\n\nThis model is a fine-tuned version of bert-base-cased on the squad dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 2\n- eval_batch_size: 2\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- Transformers 4.11.3\n- Pytorch 1.10.2\n- Datasets 1.13.3\n- Tokenizers 0.10.3"
] |
[
-0.07310211658477783,
0.07519020140171051,
-0.0026218167040497065,
0.04544032737612724,
0.1473884880542755,
0.017622200772166252,
0.10186264663934708,
0.11956318467855453,
-0.10235770046710968,
0.03354891762137413,
0.04702116549015045,
0.045705102384090424,
0.06058087572455406,
0.10552949458360672,
-0.011767732910811901,
-0.2790325880050659,
0.02617131918668747,
0.03254890441894531,
-0.0870722085237503,
0.08843248337507248,
0.13016383349895477,
-0.11203014105558395,
0.03293382376432419,
0.03549519181251526,
-0.14039508998394012,
0.04482436552643776,
-0.04288868606090546,
-0.05371396243572235,
0.1106339767575264,
0.02025868371129036,
0.136409729719162,
-0.0037951290141791105,
0.11838746070861816,
-0.24191589653491974,
0.014822295866906643,
0.0737561360001564,
0.05029568448662758,
0.07866828888654709,
0.07075197994709015,
0.029315464198589325,
0.09065611660480499,
-0.08038343489170074,
0.12473272532224655,
0.028835833072662354,
-0.06614851951599121,
-0.23783278465270996,
-0.08661472052335739,
0.04955505579710007,
0.11377781629562378,
0.08410441875457764,
0.0008628295036032796,
0.11720561981201172,
-0.11979664862155914,
0.05127653107047081,
0.16459596157073975,
-0.2692326307296753,
-0.0807492658495903,
0.008251454681158066,
0.05542946606874466,
0.04902320355176926,
-0.10780028253793716,
-0.048564568161964417,
0.033937968313694,
0.057848166674375534,
0.09666039049625397,
-0.005617523565888405,
-0.10800240933895111,
0.009761844761669636,
-0.1483624130487442,
-0.025234417989850044,
0.15793080627918243,
0.049457769840955734,
-0.05812624841928482,
-0.0809684470295906,
-0.018884509801864624,
-0.03603323921561241,
-0.05118635296821594,
-0.03179795295000076,
0.031018059700727463,
-0.02030671201646328,
-0.06480666995048523,
-0.053012870252132416,
-0.08953505009412766,
-0.08624651283025742,
-0.004902176093310118,
0.11208656430244446,
0.07262532413005829,
-0.008120246231555939,
-0.04235050082206726,
0.09588274359703064,
0.0025340148713439703,
-0.0947701707482338,
-0.0006801308481954038,
-0.005929028615355492,
-0.0746634379029274,
-0.059723176062107086,
-0.05601823329925537,
-0.02393595688045025,
0.0232477355748415,
0.12804514169692993,
-0.03193894028663635,
0.07178197801113129,
-0.021570676937699318,
0.018193857744336128,
-0.006359318736940622,
0.12518949806690216,
-0.04289201274514198,
0.016433198004961014,
-0.009989215061068535,
0.08223078399896622,
-0.034236542880535126,
0.002685061888769269,
-0.07413117587566376,
-0.00016555417096242309,
0.08386946469545364,
0.0594598762691021,
-0.045460354536771774,
0.019645672291517258,
-0.047939736396074295,
-0.027207637205719948,
-0.02177521400153637,
-0.11850163340568542,
0.0347260981798172,
-0.007411295082420111,
-0.0683484897017479,
-0.00025286234449595213,
0.0014023258117958903,
0.027854926884174347,
-0.038367561995983124,
0.062389176338911057,
-0.06623174250125885,
-0.010183321312069893,
-0.10408168286085129,
-0.0820247232913971,
0.024944722652435303,
-0.04476282745599747,
0.002096879994496703,
-0.08844193071126938,
-0.1624724119901657,
-0.018240315839648247,
0.05183450132608414,
-0.04530501738190651,
-0.0626022070646286,
-0.03686758130788803,
-0.03489494323730469,
0.00969554390758276,
-0.02350982464849949,
0.1805509775876999,
-0.052859827876091,
0.0772586464881897,
-0.01885729469358921,
0.006976368371397257,
-0.008210813626646996,
0.05980439484119415,
-0.0732961967587471,
0.03097810037434101,
-0.12324443459510803,
0.05677112191915512,
-0.11386904865503311,
0.019642174243927002,
-0.13868345320224762,
-0.11061619967222214,
0.004440566059201956,
-0.006430528126657009,
0.09455956518650055,
0.08483622968196869,
-0.1504441350698471,
-0.03208598494529724,
0.1172347217798233,
-0.07674961537122726,
-0.07723220437765121,
0.10229763388633728,
-0.056448739022016525,
0.053832340985536575,
0.044115230441093445,
0.1508265733718872,
0.099319688975811,
-0.11226879060268402,
0.032153621315956116,
0.01868327707052231,
0.08868458122015,
0.02574945241212845,
0.07844860106706619,
-0.01959727145731449,
-0.05976943299174309,
0.013545180670917034,
-0.0506502166390419,
0.04130377992987633,
-0.09334097802639008,
-0.07581967860460281,
-0.03150692209601402,
-0.08143699169158936,
0.0395851768553257,
0.018472004681825638,
0.03760416805744171,
-0.07031293958425522,
-0.09819567203521729,
0.1557316929101944,
0.13706015050411224,
-0.0554516576230526,
0.014667858369648457,
-0.09412796050310135,
0.039779819548130035,
-0.04013081267476082,
-0.008553468622267246,
-0.1966560333967209,
-0.11581117659807205,
0.04449905827641487,
-0.05150962620973587,
0.03254540264606476,
0.046999454498291016,
0.056457776576280594,
0.05077849328517914,
-0.045339521020650864,
-0.039792463183403015,
-0.11833100020885468,
0.00011697028821799904,
-0.10680209845304489,
-0.19133464992046356,
-0.07847069203853607,
-0.0461711660027504,
0.14171504974365234,
-0.20237882435321808,
0.016432588919997215,
-0.035754796117544174,
0.12919442355632782,
0.011057612486183643,
-0.03254709020256996,
-0.026480570435523987,
0.0740211233496666,
0.002261731307953596,
-0.06600043922662735,
0.053936950862407684,
0.022902272641658783,
-0.07809140533208847,
-0.025532424449920654,
-0.0623900331556797,
0.04740113392472267,
0.07838759571313858,
0.0023092543706297874,
-0.07651878893375397,
-0.07307083159685135,
-0.06824726611375809,
-0.04450047388672829,
-0.07756222784519196,
-0.0034998853225260973,
0.23527304828166962,
0.01369581650942564,
0.12998148798942566,
-0.07000395655632019,
-0.05684733763337135,
-0.004275200888514519,
-0.0010704582091420889,
-0.004949708469212055,
0.07874992489814758,
0.06474270671606064,
-0.07413111627101898,
0.07973147928714752,
0.11417742818593979,
-0.060539111495018005,
0.12753157317638397,
-0.06937503814697266,
-0.11276540905237198,
-0.0006787683814764023,
0.016586506739258766,
-0.018338631838560104,
0.12125468254089355,
-0.16253861784934998,
0.01142224669456482,
0.03610963746905327,
0.03436223417520523,
0.05245901644229889,
-0.1864532232284546,
0.0005694167921319604,
0.014958256855607033,
-0.030102333053946495,
-0.051301389932632446,
-0.03860959783196449,
0.04250384867191315,
0.09518881887197495,
0.03716467320919037,
-0.030164919793605804,
0.021840739995241165,
-0.015125931240618229,
-0.08034001290798187,
0.1908901035785675,
-0.1324881911277771,
-0.12856033444404602,
-0.12148351967334747,
0.00015128713857848197,
-0.0254858136177063,
-0.029293419793248177,
0.03550676628947258,
-0.09901752322912216,
-0.05043232440948486,
-0.07415402680635452,
0.03282884135842323,
-0.07134199142456055,
-0.00963262002915144,
0.02614055946469307,
0.01216580718755722,
0.09696053713560104,
-0.14022301137447357,
0.014231768436729908,
-0.020114000886678696,
-0.10795304924249649,
0.00046857722918502986,
0.05075548589229584,
0.07880605757236481,
0.12307365238666534,
-0.01696772873401642,
0.0030787738505750895,
-0.051084306091070175,
0.19352315366268158,
-0.04496939107775688,
-0.03591359034180641,
0.12044426053762436,
0.003909100778400898,
0.049083296209573746,
0.10564859211444855,
0.04520828649401665,
-0.08230122923851013,
0.041686829179525375,
0.07704610377550125,
0.000721001997590065,
-0.26383936405181885,
-0.03238716721534729,
-0.04119617119431496,
-0.08929120004177094,
0.11138646304607391,
0.050734903663396835,
-0.05829019472002983,
0.06288868188858032,
-0.01628093607723713,
0.03671697527170181,
-0.016295485198497772,
0.09170717746019363,
0.11514239758253098,
0.009640425443649292,
0.09586024284362793,
-0.02373471111059189,
-0.047708310186862946,
0.059699416160583496,
0.031625665724277496,
0.2705502510070801,
-0.00869592186063528,
0.062116410583257675,
0.06649477034807205,
0.17288430035114288,
-0.009670712985098362,
0.0457271970808506,
-0.0009160085464827716,
0.00397301884368062,
-0.0072226631455123425,
-0.04222199693322182,
-0.03221246972680092,
0.02075476199388504,
0.014111759141087532,
0.033325109630823135,
-0.11191218346357346,
-0.041785627603530884,
0.002801924478262663,
0.3241698741912842,
0.01491275243461132,
-0.25257062911987305,
-0.07559370994567871,
0.021928289905190468,
-0.0749865248799324,
-0.09666483849287033,
0.02509135752916336,
0.10241227596998215,
-0.14145630598068237,
0.01093236356973648,
-0.04986856132745743,
0.1097206100821495,
-0.023716771975159645,
-0.0066580623388290405,
0.04748931899666786,
0.12396705150604248,
0.005842194426804781,
0.10572648048400879,
-0.24013514816761017,
0.21485719084739685,
-0.004783282522112131,
0.10030100494623184,
-0.04745323956012726,
0.048519354313611984,
0.021167593076825142,
0.03315474092960358,
0.06167538836598396,
-0.0016744992462918162,
-0.0414171926677227,
-0.20727378129959106,
-0.057792045176029205,
0.0653148740530014,
0.0992489755153656,
-0.0274864062666893,
0.08935840427875519,
-0.044123727828264236,
0.03527959808707237,
0.052937328815460205,
-0.06389523297548294,
-0.17927652597427368,
-0.12134747952222824,
-0.005365092772990465,
-0.0018573201959952712,
0.028428224846720695,
-0.12421585619449615,
-0.10857558995485306,
-0.02960844524204731,
0.18104934692382812,
0.01833694986999035,
-0.03418063744902611,
-0.13048586249351501,
0.07671196013689041,
0.11464950442314148,
-0.04077784717082977,
0.03009266033768654,
0.011208400130271912,
0.15118078887462616,
0.026160314679145813,
-0.06725680083036423,
0.06536093354225159,
-0.08078523725271225,
-0.1324775665998459,
-0.06293877959251404,
0.12117446213960648,
0.06519711017608643,
0.05297011509537697,
0.014016415923833847,
-0.004017441999167204,
-0.0051412503235042095,
-0.08045289665460587,
-0.004137571435421705,
0.07464905083179474,
0.07117148488759995,
0.08412623405456543,
-0.12287957966327667,
0.03468277305364609,
-0.04041452333331108,
-0.010780551470816135,
0.1575365960597992,
0.20820482075214386,
-0.0653819665312767,
0.06156110018491745,
0.10594207048416138,
-0.082233726978302,
-0.17126810550689697,
0.06723427027463913,
0.1361558735370636,
0.0043397280387580395,
0.022359997034072876,
-0.25990772247314453,
0.12072494626045227,
0.11718311160802841,
-0.018318409100174904,
-0.0005956310196779668,
-0.2999231517314911,
-0.10507256537675858,
0.13290265202522278,
0.13889220356941223,
0.015844721347093582,
-0.12468943744897842,
-0.02430085465312004,
-0.028119714930653572,
-0.14443029463291168,
0.1016199067234993,
-0.09677650779485703,
0.0937068834900856,
-0.004997361917048693,
0.09386728703975677,
0.022497987374663353,
-0.030788656324148178,
0.15057595074176788,
0.0038168500177562237,
0.08263036608695984,
-0.0325874388217926,
0.0746065080165863,
0.0614064559340477,
-0.04044148698449135,
0.0395275354385376,
-0.011908918619155884,
0.04762399196624756,
-0.18432481586933136,
-0.02943846583366394,
-0.061849888414144516,
0.059294264763593674,
-0.03528105840086937,
-0.07217365503311157,
-0.02394714392721653,
0.05670783296227455,
0.050430700182914734,
-0.022176474332809448,
0.06299407035112381,
0.001940066576935351,
0.12619028985500336,
0.03810673579573631,
0.12624435126781464,
-0.02677796594798565,
-0.10718145221471786,
-0.004835671279579401,
-0.030720897018909454,
0.07822316884994507,
-0.09252993762493134,
0.026173731312155724,
0.11996570229530334,
0.030133306980133057,
0.14811360836029053,
0.05498243123292923,
-0.06160109490156174,
0.01715281419456005,
0.04012531414628029,
-0.07638978213071823,
-0.15401606261730194,
-0.005923622753471136,
0.09565616399049759,
-0.16203412413597107,
0.007923649623990059,
0.09069594740867615,
-0.055866338312625885,
-0.032044973224401474,
-0.007561061531305313,
0.007627077866345644,
-0.04901086911559105,
0.1694217324256897,
0.028689438477158546,
0.06334877759218216,
-0.08143237233161926,
0.12475217878818512,
0.08517846465110779,
-0.09991303086280823,
0.04761946573853493,
0.0437500886619091,
-0.05738961696624756,
-0.024508442729711533,
0.06724254041910172,
0.17921793460845947,
-0.0128521379083395,
-0.05763588473200798,
-0.0639520063996315,
-0.13872769474983215,
0.05705694109201431,
0.07282283157110214,
0.03957574442028999,
-0.015160116367042065,
-0.04972149059176445,
0.05164264515042305,
-0.11889948695898056,
0.06847600638866425,
0.044554974883794785,
0.07573585212230682,
-0.10213466733694077,
0.10573548823595047,
0.011697018519043922,
0.04142151027917862,
-0.010690459981560707,
-0.014705139212310314,
-0.08968019485473633,
0.005107716657221317,
-0.18270324170589447,
-0.014176595956087112,
-0.042864106595516205,
0.018549788743257523,
0.0031300627160817385,
-0.05079859495162964,
-0.034860458225011826,
0.03818417340517044,
-0.09077577292919159,
-0.04728880897164345,
0.018917087465524673,
0.08496546745300293,
-0.12288924306631088,
-0.002995795803144574,
0.03996988758444786,
-0.10157611221075058,
0.08002031594514847,
0.06060019135475159,
0.03216645494103432,
0.05511367321014404,
-0.12154088914394379,
-0.027683721855282784,
0.01758434809744358,
0.03842030093073845,
0.0590987429022789,
-0.11770866811275482,
-0.006390826310962439,
-0.011294623836874962,
0.030888520181179047,
0.000523440889082849,
0.053964387625455856,
-0.13202200829982758,
-0.0726751759648323,
-0.032108157873153687,
-0.07476893812417984,
-0.06704507023096085,
0.029116475954651833,
0.092470183968544,
0.07225479185581207,
0.16075879335403442,
-0.07610701769590378,
0.031798433512449265,
-0.18607674539089203,
-0.02013997547328472,
-0.026609310880303383,
-0.024061691015958786,
-0.054015107452869415,
-0.06361769139766693,
0.059149038046598434,
-0.052474524825811386,
0.13013271987438202,
-0.03207867965102196,
0.0811874270439148,
0.03529796749353409,
-0.0732961967587471,
0.041366927325725555,
0.019655423238873482,
0.23846818506717682,
0.06956440955400467,
-0.014046362601220608,
0.06051216647028923,
-0.0020782393403351307,
0.038998305797576904,
0.10478312522172928,
0.1413315236568451,
0.18010582029819489,
0.03199207782745361,
0.04891693964600563,
0.09022995084524155,
-0.08534074574708939,
-0.10790067166090012,
0.11410443484783173,
0.01954987272620201,
0.09959099441766739,
-0.052025504410266876,
0.24893249571323395,
0.08770394325256348,
-0.18804436922073364,
0.053760871291160583,
-0.06049637123942375,
-0.09261585026979446,
-0.09273695200681686,
-0.029483722522854805,
-0.06356817483901978,
-0.14952732622623444,
0.009461354464292526,
-0.13636505603790283,
0.012872438877820969,
0.09817831218242645,
0.012681129388511181,
0.014498652890324593,
0.12078690528869629,
-0.015008763410151005,
0.007374714594334364,
0.06880630552768707,
-0.005921431817114353,
0.014498664997518063,
-0.08449635654687881,
-0.09090983867645264,
0.05276632308959961,
0.008245560340583324,
0.06498667597770691,
-0.035340823233127594,
-0.01943233795464039,
0.026224011555314064,
-0.007027538027614355,
-0.06225883215665817,
0.025139158591628075,
-0.004567497409880161,
0.030086467042565346,
0.08301541954278946,
0.052938807755708694,
-0.004197944886982441,
-0.03568344563245773,
0.29014575481414795,
-0.07305193692445755,
-0.08975284546613693,
-0.1526060253381729,
0.2250816822052002,
0.01971849426627159,
-0.014182101003825665,
0.07031465321779251,
-0.08692019432783127,
-0.024370960891246796,
0.15635231137275696,
0.1114964634180069,
-0.05588001012802124,
-0.016972215846180916,
-0.0002831934834830463,
-0.02916558086872101,
-0.08267229795455933,
0.13822659850120544,
0.14411737024784088,
-0.00044296120177023113,
-0.07535011321306229,
-0.020119212567806244,
-0.03200218454003334,
-0.011759737506508827,
-0.08507055044174194,
0.054664239287376404,
0.039478421211242676,
-0.013364016078412533,
-0.0426470972597599,
0.08039505034685135,
-0.00048565957695245743,
-0.17419636249542236,
-0.013422299176454544,
-0.08245939016342163,
-0.17606449127197266,
-0.04956676438450813,
0.061111729592084885,
-0.004225386306643486,
0.04548970237374306,
-0.0334879532456398,
0.02349098213016987,
0.14522892236709595,
-0.010020596906542778,
-0.005942608695477247,
-0.13940954208374023,
0.1515444666147232,
-0.055403441190719604,
0.210720956325531,
-0.00014434308104682714,
0.060618311166763306,
0.11052931100130081,
0.04417882859706879,
-0.11693279445171356,
0.04858940467238426,
0.06615652143955231,
-0.08203615248203278,
0.024323761463165283,
0.14965234696865082,
-0.03702927008271217,
0.11440040171146393,
0.035145457834005356,
-0.11754557490348816,
0.010521617718040943,
-0.10030481219291687,
-0.03150486573576927,
-0.06456893682479858,
0.011672206223011017,
-0.08132817596197128,
0.13671763241291046,
0.20749954879283905,
-0.04234354570508003,
-0.016239838674664497,
-0.0905156210064888,
0.021039607003331184,
0.03463936224579811,
0.08758780360221863,
-0.039430901408195496,
-0.20824626088142395,
-0.0005795739707536995,
0.03858274593949318,
0.019481778144836426,
-0.2518274486064911,
-0.09588677436113358,
0.04559881240129471,
-0.03390764445066452,
-0.04812713712453842,
0.10730338096618652,
0.07319614291191101,
0.0381324328482151,
-0.04926621913909912,
-0.1694556474685669,
-0.056992191821336746,
0.15144358575344086,
-0.12125390768051147,
-0.03337099030613899
] |
null | null |
transformers
|
# Rick and Morty DialoGPT Model
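The card gives no usage snippet, so below is a minimal sketch of the generic multi-turn DialoGPT chat loop with `transformers`. The repo id `anduush/DialoGPT-small-Rick` is taken from this record's id field, and the generation settings are illustrative assumptions, not anything the card prescribes.

```python
# Minimal sketch of the generic DialoGPT chat loop, assuming the standard
# transformers API; repo id and generation settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("anduush/DialoGPT-small-Rick")
model = AutoModelForCausalLM.from_pretrained("anduush/DialoGPT-small-Rick")

chat_history_ids = None
for _ in range(3):  # three user turns, purely for illustration
    user_input = input(">> User: ")
    # Encode the new user turn and terminate it with the end-of-sequence token.
    new_ids = tokenizer.encode(user_input + tokenizer.eos_token, return_tensors="pt")
    # Prepend the running conversation so the model sees earlier turns.
    bot_input_ids = (
        torch.cat([chat_history_ids, new_ids], dim=-1)
        if chat_history_ids is not None
        else new_ids
    )
    chat_history_ids = model.generate(
        bot_input_ids,
        max_length=1000,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens (the bot's reply).
    reply = tokenizer.decode(
        chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True
    )
    print("Bot:", reply)
```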
|
{"tags": ["conversational"]}
|
text-generation
|
anduush/DialoGPT-small-Rick
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Rick and Morty DialoGPT Model
|
[
"# Rick and Morty DialoGPT Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Rick and Morty DialoGPT Model"
] |
[
51,
10
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Rick and Morty DialoGPT Model"
] |
[
-0.01990443281829357,
0.10367733240127563,
-0.006012056488543749,
0.013662099838256836,
0.1287931650876999,
0.004103946499526501,
0.13405320048332214,
0.13470496237277985,
-0.029608309268951416,
-0.0377325713634491,
0.1409052610397339,
0.2081032246351242,
-0.009616929106414318,
0.025026321411132812,
-0.08027864247560501,
-0.33285143971443176,
0.04419311136007309,
0.04611847549676895,
-0.04805411398410797,
0.11171722412109375,
0.09962809830904007,
-0.03511058911681175,
0.07650627940893173,
0.012189619243144989,
-0.11959464848041534,
0.014523470774292946,
0.01571112684905529,
-0.09889741986989975,
0.11399844288825989,
0.07783890515565872,
0.031239205971360207,
0.033389654010534286,
-0.042143791913986206,
-0.13308840990066528,
0.04855761677026749,
-0.0014628645731136203,
-0.03996938467025757,
0.06519230455160141,
0.0068825362250208855,
-0.09896008670330048,
0.13105708360671997,
0.11774895340204239,
-0.001342291128821671,
0.030811335891485214,
-0.1546017825603485,
-0.03095608949661255,
-0.013916928321123123,
0.04583658277988434,
0.05571185424923897,
0.1092928797006607,
-0.03970988467335701,
0.11546611040830612,
-0.046847838908433914,
0.11656361073255539,
0.13404695689678192,
-0.27711591124534607,
-0.013774634338915348,
0.14150507748126984,
0.03755388408899307,
0.031246060505509377,
-0.03764049708843231,
0.09234841167926788,
0.010574371553957462,
-0.009135077707469463,
-0.054559025913476944,
-0.07839421927928925,
-0.06956472247838974,
0.03881034255027771,
-0.08538595587015152,
-0.0028573249001055956,
0.22309143841266632,
-0.029777048155665398,
0.0931403860449791,
-0.061110686510801315,
-0.083645299077034,
0.0022445949725806713,
-0.04396601766347885,
-0.031562261283397675,
-0.0995510146021843,
0.08443354815244675,
-0.04024428874254227,
-0.08693728595972061,
-0.10731299221515656,
-0.022938303649425507,
-0.15873323380947113,
0.16214832663536072,
0.03501884266734123,
0.03956814110279083,
-0.21219894289970398,
0.07603893429040909,
-0.04213596507906914,
-0.10128775984048843,
0.025763655081391335,
-0.0809730738401413,
0.0031352867372334003,
0.01420458871871233,
-0.034850042313337326,
-0.01257789321243763,
0.09354974329471588,
0.11913833022117615,
-0.002085368847474456,
0.028482265770435333,
-0.03459439426660538,
0.04555915296077728,
0.04445279389619827,
0.04635937884449959,
-0.030874032527208328,
-0.005519113503396511,
0.024999095126986504,
-0.0903957337141037,
-0.010871811769902706,
-0.060442280024290085,
-0.1946737915277481,
0.013364237733185291,
0.05735969915986061,
0.055262304842472076,
0.030765585601329803,
0.13551434874534607,
0.0010974886827170849,
-0.0475224107503891,
0.03023342229425907,
-0.020769428461790085,
-0.016528211534023285,
0.029149476438760757,
-0.0072809201665222645,
0.1526104062795639,
0.022983204573392868,
0.05690442770719528,
-0.11451500654220581,
0.012773441150784492,
-0.03330712020397186,
-0.006917042192071676,
-0.03216493874788284,
-0.061537809669971466,
0.003289242973551154,
0.0014469954185187817,
0.013694697991013527,
-0.12761977314949036,
-0.15719962120056152,
-0.003717299085110426,
0.00613630935549736,
-0.05369097366929054,
-0.10004933178424835,
-0.10542158782482147,
-0.03153182193636894,
0.046352777630090714,
-0.053748197853565216,
0.03198752924799919,
-0.039340607821941376,
0.09383489936590195,
-0.03441528603434563,
0.0691300630569458,
-0.0863635316491127,
0.0905333161354065,
-0.06098577380180359,
-0.04111234471201897,
-0.0643690675497055,
0.12356391549110413,
0.011561519466340542,
0.04442533850669861,
-0.03781363368034363,
-0.01636880449950695,
-0.11087207496166229,
0.06495212018489838,
-0.03516015037894249,
0.22487092018127441,
-0.08996163308620453,
-0.09683383256196976,
0.22284504771232605,
-0.04562665522098541,
-0.12769415974617004,
0.12243670970201492,
-0.03600937873125076,
0.09682484716176987,
0.11536505818367004,
0.16257616877555847,
0.03866875544190407,
-0.0002237519365735352,
0.10846788436174393,
0.10610917955636978,
-0.07603283226490021,
0.006744202226400375,
0.0250004380941391,
-0.02382737584412098,
-0.09139634668827057,
0.015165179036557674,
0.07776524871587753,
0.04803644120693207,
-0.05478836968541145,
-0.015317765064537525,
0.015090391971170902,
-0.003627530997619033,
0.06564177572727203,
-0.017049036920070648,
0.11691898107528687,
-0.03955721855163574,
-0.07620245963335037,
-0.014626736752688885,
0.028113901615142822,
-0.06986767798662186,
0.026787258684635162,
-0.07962338626384735,
0.02948051132261753,
-0.01967560686171055,
0.06687499582767487,
-0.16950036585330963,
-0.09430424869060516,
-0.06010226905345917,
0.23349159955978394,
0.07496993243694305,
0.11698364466428757,
0.06350064277648926,
-0.056928664445877075,
0.0006459777359850705,
0.037900060415267944,
0.19767099618911743,
-0.006904584355652332,
-0.07503941655158997,
-0.11777795851230621,
0.10312607139348984,
-0.07375676929950714,
0.06138577312231064,
-0.0416308231651783,
0.007855354808270931,
0.019795136526226997,
0.11127804219722748,
-0.04220014438033104,
0.039965033531188965,
0.012499134056270123,
-0.03696384280920029,
-0.05908297002315521,
0.0004571304307319224,
0.09440597146749496,
-0.0005542659782804549,
-0.10514124482870102,
0.2379530370235443,
-0.21215155720710754,
0.12180843949317932,
0.1799643337726593,
-0.2256188690662384,
0.008836638182401657,
-0.10462760180234909,
-0.016665222123265266,
0.01030759233981371,
0.03996801748871803,
-0.040312353521585464,
0.24249082803726196,
-0.014560520648956299,
0.17035135626792908,
-0.04880015179514885,
-0.05010494217276573,
-0.0440804697573185,
-0.05291803553700447,
0.0003277618088759482,
0.12486644089221954,
0.09157522767782211,
-0.18372175097465515,
0.17465431988239288,
0.06325390189886093,
0.03004654310643673,
0.1566917598247528,
0.022896459326148033,
0.020663797855377197,
0.05599488690495491,
-0.0012882096925750375,
-0.03033529780805111,
-0.07880529016256332,
-0.20945574343204498,
-0.012111871503293514,
0.07547834515571594,
0.04618273675441742,
0.10363037884235382,
-0.1018955409526825,
-0.030724551528692245,
-0.006948297843337059,
-0.030821966007351875,
0.03848150745034218,
0.13554143905639648,
0.015318007208406925,
0.12024796009063721,
-0.019162237644195557,
-0.06668011844158173,
0.0741129145026207,
0.01461794413626194,
-0.09263674914836884,
0.18050695955753326,
-0.1221487745642662,
-0.3382752537727356,
-0.10329627990722656,
-0.20327065885066986,
-0.04040617123246193,
0.0422586165368557,
0.11002974957227707,
-0.1460546851158142,
-0.029720865190029144,
0.0010455691954120994,
0.08435780555009842,
-0.1366978883743286,
0.006720550823956728,
-0.017843635752797127,
-0.01294276025146246,
-0.1374056041240692,
-0.09384968876838684,
-0.04747654125094414,
-0.060003772377967834,
-0.03218422830104828,
0.10381519794464111,
-0.1596987098455429,
0.007801016326993704,
0.230968177318573,
0.04797196388244629,
0.07053504139184952,
-0.036995481699705124,
0.17910921573638916,
-0.08220451325178146,
0.016473548486828804,
0.24478016793727875,
-0.05610832944512367,
0.0740312784910202,
0.10560029745101929,
-0.005553957540541887,
-0.052998270839452744,
0.03756273165345192,
0.00788428820669651,
-0.0785532221198082,
-0.21784749627113342,
-0.1030275970697403,
-0.11046822369098663,
0.04284128174185753,
0.05120398849248886,
0.04543844982981682,
0.1585974246263504,
0.06446543335914612,
-0.05187172442674637,
-0.011306295171380043,
0.08315242826938629,
0.08576013147830963,
0.24794787168502808,
-0.06311704963445663,
0.1473274976015091,
-0.020790869370102882,
-0.16434483230113983,
0.07334780693054199,
0.06416254490613937,
0.07227631658315659,
0.06913222372531891,
0.11215730756521225,
0.0020037174690514803,
0.017364054918289185,
0.12614323198795319,
0.05889604985713959,
-0.011050567030906677,
-0.031410302966833115,
-0.04586650803685188,
-0.04347039759159088,
-0.020151739940047264,
0.041160233318805695,
0.05188119783997536,
-0.1600257307291031,
-0.02415069006383419,
0.022831739857792854,
0.046689603477716446,
-0.003216250566765666,
0.08608495444059372,
-0.19217506051063538,
-0.018159521743655205,
0.06477150321006775,
-0.0016290671192109585,
-0.09313707798719406,
0.08108778297901154,
-0.009849769994616508,
-0.09697907418012619,
0.03780587762594223,
-0.03585495799779892,
0.1301390826702118,
-0.0750122219324112,
0.07286842167377472,
-0.1119815781712532,
-0.02080838568508625,
-0.0087605444714427,
0.11860883235931396,
-0.3024371266365051,
0.1707288920879364,
-0.0030656929593533278,
-0.04842326417565346,
-0.11293680220842361,
-0.015061003156006336,
0.03821004554629326,
0.08916047215461731,
0.10371578484773636,
-0.030773809179663658,
-0.06436607241630554,
0.0791664570569992,
-0.050910793244838715,
0.03525971621274948,
0.10187692940235138,
-0.04662879928946495,
-0.014911266043782234,
-0.05685164034366608,
0.0027524156030267477,
0.02270045317709446,
-0.10804066807031631,
0.014929873868823051,
-0.19113284349441528,
0.07794220000505447,
0.0811065286397934,
0.0722472071647644,
0.04095001146197319,
-0.029467018321156502,
-0.1261810064315796,
0.2744207978248596,
0.007417048793286085,
-0.09985779225826263,
-0.11269644647836685,
0.04465123638510704,
0.05646880716085434,
-0.07145541161298752,
-0.028514720499515533,
-0.07924950867891312,
0.052012015134096146,
-0.07113154232501984,
-0.1981293261051178,
0.11338871717453003,
-0.09873685240745544,
-0.04736494645476341,
-0.03962721675634384,
0.2276533544063568,
-0.027753405272960663,
0.02130931057035923,
0.0393831804394722,
-0.001616212772205472,
-0.12734149396419525,
-0.09492160379886627,
0.004517016001045704,
-0.0013660878175869584,
0.02586340345442295,
0.022777099162340164,
-0.04388801380991936,
0.0049570053815841675,
-0.06949588656425476,
-0.0037953434512019157,
0.3158918023109436,
0.10998717695474625,
-0.04474896565079689,
0.1561327874660492,
0.10242960602045059,
-0.06360200047492981,
-0.28859275579452515,
-0.11298105865716934,
-0.07240703701972961,
-0.05466444417834282,
-0.0838940367102623,
-0.18133240938186646,
0.08497140556573868,
-0.042584747076034546,
-0.00881777424365282,
0.042027126997709274,
-0.2644155025482178,
-0.09412363916635513,
0.18815293908119202,
-0.01533579919487238,
0.4300551414489746,
-0.11307147145271301,
-0.07450833916664124,
-0.05387028306722641,
-0.13561248779296875,
0.18766070902347565,
-0.018648525699973106,
0.0966244488954544,
0.00443116994574666,
0.20654869079589844,
0.05815155804157257,
-0.0008219819865189493,
0.0747876986861229,
0.011587066575884819,
-0.0452013723552227,
-0.09014920890331268,
-0.09217863529920578,
-0.020688166841864586,
0.005974666681140661,
0.034957773983478546,
-0.0941787138581276,
0.05258546397089958,
-0.11336535215377808,
-0.05589618906378746,
-0.07209338247776031,
0.026715638116002083,
0.02418643794953823,
-0.06410122662782669,
-0.006407043896615505,
-0.048794936388731,
-0.0010418962920084596,
0.00979152973741293,
0.21295785903930664,
-0.11305148899555206,
0.12096642702817917,
0.04414689913392067,
0.1508360654115677,
-0.08366664499044418,
-0.03614836558699608,
-0.04910365119576454,
-0.05565084517002106,
0.0676501989364624,
-0.1319035291671753,
0.04462771117687225,
0.10053624957799911,
-0.030742639675736427,
0.0898696631193161,
0.11227817088365555,
-0.02972952462732792,
0.0016581144882366061,
0.07279330492019653,
-0.23832836747169495,
-0.08509121090173721,
-0.07718803733587265,
0.05435929819941521,
0.057659514248371124,
0.09007556736469269,
0.21964938938617706,
0.011087107472121716,
-0.023847850039601326,
0.027587326243519783,
0.029717741534113884,
-0.01658647321164608,
0.05797221511602402,
0.008770608343183994,
0.031205764040350914,
-0.14632299542427063,
0.04562913626432419,
-0.010501107200980186,
-0.07197817414999008,
0.03429242596030235,
0.16717956960201263,
-0.10209374874830246,
-0.12234743684530258,
-0.04288604483008385,
0.17517046630382538,
-0.13247300684452057,
-0.017495078966021538,
-0.05478521063923836,
-0.1241658553481102,
0.07977617532014847,
0.11423204839229584,
0.05072414129972458,
0.042339734733104706,
-0.09691346436738968,
-0.03881148621439934,
-0.05552472919225693,
0.01957569271326065,
0.018891409039497375,
-0.030404040589928627,
-0.037885911762714386,
0.025801094248890877,
-0.04172535613179207,
0.11203933507204056,
-0.087384894490242,
-0.09792038798332214,
-0.16838693618774414,
0.03925701230764389,
-0.049022991210222244,
-0.07899222522974014,
-0.09344983100891113,
-0.03523614630103111,
0.014231358654797077,
-0.03348008170723915,
-0.018664700910449028,
-0.02225758694112301,
-0.0958842933177948,
0.03419994190335274,
-0.048781368881464005,
-0.005008503329008818,
-0.08496184647083282,
0.017331385985016823,
0.04781922325491905,
-0.023604100570082664,
0.1431105136871338,
0.12453559041023254,
-0.11789791285991669,
0.10031480342149734,
-0.16611437499523163,
-0.06820093840360641,
0.09455996751785278,
0.02471991442143917,
0.043245621025562286,
0.028927266597747803,
0.005174829158931971,
0.04808570072054863,
0.05950818210840225,
0.03694291412830353,
0.041101954877376556,
-0.07111897319555283,
0.061451081186532974,
-0.06278520077466965,
-0.11226452142000198,
-0.04257739707827568,
-0.005422866903245449,
0.00011432790051912889,
0.07346735894680023,
0.11052975058555603,
-0.05098198726773262,
0.09580544382333755,
-0.050767768174409866,
0.046003878116607666,
0.0289035402238369,
-0.16526201367378235,
0.008764104917645454,
-0.08482556790113449,
0.05248309671878815,
0.0030253108125180006,
0.15688744187355042,
0.028536081314086914,
-0.03175791725516319,
0.02630779519677162,
0.05105529725551605,
0.06318540126085281,
-0.00840448122471571,
0.19050461053848267,
0.09726009517908096,
-0.04487645998597145,
-0.09418396651744843,
0.08849480748176575,
0.05022666975855827,
0.05143674090504646,
0.1403687596321106,
-0.020687401294708252,
0.012512898072600365,
0.07724163681268692,
0.014415515586733818,
0.017872430384159088,
-0.07756411284208298,
-0.09487451612949371,
-0.011494439095258713,
0.025514457374811172,
-0.02882363088428974,
0.1138797178864479,
0.16729387640953064,
-0.0008394720498472452,
0.013234704732894897,
-0.01801590994000435,
-0.05735309422016144,
-0.20129387080669403,
-0.1959676295518875,
-0.09400797635316849,
-0.13690303266048431,
-0.0009418319095857441,
-0.13835963606834412,
0.03616710752248764,
0.042394787073135376,
0.09917435795068741,
-0.039446551352739334,
0.019261397421360016,
0.026794444769620895,
-0.10323353111743927,
0.039175424724817276,
-0.04838612675666809,
0.09421038627624512,
-0.007761404849588871,
0.005773975048214197,
-0.046786144375801086,
0.02436385303735733,
0.02127891033887863,
0.038409680128097534,
-0.012736459262669086,
0.024856114760041237,
-0.11602245271205902,
-0.09478921443223953,
-0.058010075241327286,
0.0558818019926548,
0.0046934462152421474,
0.18179026246070862,
0.02449701726436615,
-0.03384847193956375,
0.0275272186845541,
0.19317778944969177,
-0.06196035072207451,
-0.09709009528160095,
-0.08241496980190277,
0.2182236760854721,
-0.018931716680526733,
0.09253086894750595,
-0.035876765847206116,
0.012440751306712627,
-0.07121489197015762,
0.33243879675865173,
0.29320472478866577,
-0.10524016618728638,
0.010426074266433716,
-0.0019151283195242286,
0.0405552051961422,
0.1290767937898636,
0.07575080543756485,
0.11663594841957092,
0.256552129983902,
-0.06501701474189758,
-0.057690393179655075,
-0.014668738469481468,
-0.027142031118273735,
-0.06502988189458847,
0.04214107245206833,
0.04939494654536247,
-0.07117093354463577,
-0.00912293791770935,
0.12242040783166885,
-0.24606983363628387,
0.04577518254518509,
-0.13518153131008148,
-0.14807558059692383,
-0.0726354643702507,
0.002261551097035408,
0.09914402663707733,
0.010166509076952934,
0.08546656370162964,
-0.014570544473826885,
-0.0710548534989357,
0.03896206244826317,
0.021210450679063797,
-0.2144380509853363,
0.021960165351629257,
0.07259857654571533,
-0.028754761442542076,
-0.07154250144958496,
-0.013138728216290474,
0.08338925242424011,
0.09720319509506226,
0.03173141926527023,
-0.009079075418412685,
0.04570826143026352,
-0.0000614441087236628,
-0.06747788935899734,
0.035688117146492004,
0.022403022274374962,
0.01331246830523014,
-0.05491582676768303,
0.07895619422197342,
-0.17176033556461334,
0.020258452743291855,
-0.03599786013364792,
-0.06506339460611343,
-0.006352625321596861,
0.02872123196721077,
-0.06236473098397255,
0.0810769721865654,
0.08681372553110123,
-0.010693355463445187,
-0.015406738966703415,
-0.019259916618466377,
-0.012411676347255707,
-0.028850549831986427,
-0.07069326192140579,
-0.09390060603618622,
-0.15529757738113403,
-0.12466321885585785,
0.08110006153583527,
-0.008061634376645088,
-0.2096063792705536,
0.012769150547683239,
-0.13104628026485443,
0.04622570425271988,
-0.10809949785470963,
0.09371429681777954,
0.08394473046064377,
0.020185640081763268,
-0.007141938898712397,
0.003890183288604021,
0.036074474453926086,
0.07894916087388992,
-0.13067346811294556,
-0.08049263805150986
] |
null | null |
transformers
|
# Medical History Model based on ruGPT2 by @sberbank-ai
A simple model that helps medical staff complete patients' medical histories.
The model is based on the pretrained [sberbank-ai/rugpt3small_based_on_gpt2](https://huggingface.co/sberbank-ai/rugpt3small_based_on_gpt2).
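The card documents no code, so the following is a minimal sketch of running such a completion with `transformers`; the repo id `anechaev/ru_med_gpt3sm_based_on_gpt2` comes from this record's id field, and the Russian prompt and sampling settings are invented illustrations.

```python
# Minimal sketch, assuming standard causal-LM generation with transformers.
# The repo id is taken from this record's id field; the prompt is a
# hypothetical start of a medical history, not an example from the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "anechaev/ru_med_gpt3sm_based_on_gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Жалобы: головная боль, слабость."  # hypothetical history fragment
inputs = tokenizer(prompt, return_tensors="pt")
# Continue the history; the sampling settings below are illustrative defaults.
output_ids = model.generate(**inputs, max_new_tokens=60, do_sample=True, top_p=0.95)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```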
|
{"language": ["ru"], "license": "mit", "tags": ["PyTorch", "Transformers"]}
|
text-generation
|
anechaev/ru_med_gpt3sm_based_on_gpt2
|
[
"transformers",
"pytorch",
"safetensors",
"gpt2",
"text-generation",
"PyTorch",
"Transformers",
"ru",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"ru"
] |
TAGS
#transformers #pytorch #safetensors #gpt2 #text-generation #PyTorch #Transformers #ru #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Medical History Model based on ruGPT2 by @sberbank-ai
A simple model that helps medical staff complete patients' medical histories.
The model is based on the pretrained sberbank-ai/rugpt3small_based_on_gpt2.
|
[
"# Medical History Model based on ruGPT2 by @sberbank-ai\n\nA simple model for helping medical staff to complete patient's medical histories.\nModel used pretrained sberbank-ai/rugpt3small_based_on_gpt2"
] |
[
"TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #PyTorch #Transformers #ru #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# Medical History Model based on ruGPT2 by @sberbank-ai\n\nA simple model for helping medical staff to complete patient's medical histories.\nModel used pretrained sberbank-ai/rugpt3small_based_on_gpt2"
] |
[
68,
57
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #gpt2 #text-generation #PyTorch #Transformers #ru #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Medical History Model based on ruGPT2 by @sberbank-ai\n\nA simple model for helping medical staff to complete patient's medical histories.\nModel used pretrained sberbank-ai/rugpt3small_based_on_gpt2"
] |
[
-0.03622813522815704,
0.042556457221508026,
-0.005221223458647728,
-0.06183047965168953,
0.0620671883225441,
0.05883243307471275,
0.16856946051120758,
0.1413961797952652,
-0.000020800634956685826,
0.05130678042769432,
0.2202240377664566,
0.025663968175649643,
0.04905559495091438,
0.08175023645162582,
0.003930386155843735,
-0.2796454131603241,
0.024299101904034615,
0.127010777592659,
0.06426301598548889,
0.15510806441307068,
0.042881809175014496,
-0.027045229449868202,
0.04581443965435028,
0.07889457046985626,
-0.10959097743034363,
-0.005062061361968517,
0.09975732862949371,
-0.06948095560073853,
0.1590237319469452,
-0.027392080053687096,
0.027618175372481346,
0.005268234293907881,
0.09517653286457062,
-0.08378401398658752,
0.0361233651638031,
-0.05204040929675102,
-0.04070327430963516,
0.07354631274938583,
-0.05154532566666603,
-0.12738028168678284,
0.19980527460575104,
0.011641398072242737,
0.006463685538619757,
-0.012688965536653996,
-0.11215443164110184,
-0.1149243637919426,
-0.075428806245327,
0.1605091392993927,
0.009521213360130787,
0.12179727107286453,
0.00638124393299222,
0.14664968848228455,
-0.09609805047512054,
0.00033574458211660385,
0.19984963536262512,
-0.3336467742919922,
-0.025647077709436417,
0.07661084085702896,
0.10028184950351715,
-0.08803113549947739,
-0.022898053750395775,
0.13313990831375122,
-0.034968070685863495,
0.06174095347523689,
0.017788009718060493,
-0.06393998861312866,
0.06771119683980942,
0.07109009474515915,
-0.09195362776517868,
-0.026919953525066376,
0.16036711633205414,
-0.1664959043264389,
0.03140956163406372,
-0.07071179151535034,
-0.028090480715036392,
-0.01970086805522442,
-0.0031309188343584538,
-0.08771545439958572,
-0.04885181784629822,
-0.006535997148603201,
0.0020846526604145765,
0.004742532968521118,
-0.06821764260530472,
-0.08142184466123581,
-0.143644779920578,
0.005164532922208309,
0.005628267768770456,
0.08913324773311615,
-0.1127999946475029,
0.12772753834724426,
0.036449577659368515,
-0.01274983398616314,
0.0043721385300159454,
-0.04031519591808319,
-0.01575479283928871,
-0.01670742966234684,
-0.016161585226655006,
0.05648139864206314,
0.1320410966873169,
0.17464947700500488,
-0.043248500674963,
-0.07326090335845947,
0.087818443775177,
0.025171462446451187,
-0.0031734141521155834,
-0.004750356078147888,
-0.15540693700313568,
-0.0030979267321527004,
-0.012116041965782642,
-0.0046013519167900085,
0.011598261073231697,
0.04669758677482605,
-0.0501638799905777,
-0.016967620700597763,
-0.011391292326152325,
0.06873925030231476,
-0.10164312273263931,
0.048100799322128296,
-0.014322309754788876,
0.0011317296884953976,
0.08363267034292221,
0.0072302743792533875,
-0.08124620467424393,
0.0022280756384134293,
0.014596624299883842,
0.020979145541787148,
0.08446786552667618,
0.02830727957189083,
-0.07358209043741226,
0.126780167222023,
-0.0174570232629776,
-0.02098924107849598,
0.007662578020244837,
0.0200835969299078,
0.024375218898057938,
-0.0708920881152153,
0.06328085064888,
-0.15243534743785858,
-0.18445315957069397,
-0.011879541911184788,
-0.0125888055190444,
-0.05095118284225464,
0.048511095345020294,
-0.0373636856675148,
0.022964730858802795,
-0.009042711928486824,
-0.02455691620707512,
-0.03017248585820198,
-0.062233954668045044,
0.04315702244639397,
-0.18204884231090546,
0.03280890733003616,
-0.16302086412906647,
-0.02052072435617447,
-0.1301296055316925,
0.03883973881602287,
-0.017013056203722954,
-0.10355135053396225,
0.0561407133936882,
-0.0635949894785881,
-0.0696338564157486,
-0.01706809736788273,
-0.08409092575311661,
0.0016930202255025506,
0.047894056886434555,
0.09895874559879303,
-0.016513360664248466,
-0.08150216937065125,
0.25702527165412903,
-0.028984785079956055,
-0.25397583842277527,
0.10791859030723572,
-0.007632010150700808,
0.16235274076461792,
0.08450967818498611,
0.18024776875972748,
0.08601551502943039,
-0.0992787554860115,
-0.06351160258054733,
-0.13924747705459595,
-0.11838050931692123,
-0.05135713145136833,
0.06891223788261414,
0.014477670192718506,
-0.027619704604148865,
0.005282358266413212,
-0.12849308550357819,
0.009181024506688118,
-0.11832874268293381,
0.022318413481116295,
-0.002541421679779887,
-0.12105752527713776,
0.013434834778308868,
-0.008531435392796993,
0.08667111396789551,
-0.04868528991937637,
-0.019657721742987633,
0.049596138298511505,
0.07894256711006165,
0.00983403716236353,
-0.033241670578718185,
-0.17134490609169006,
0.03162013739347458,
0.06679221242666245,
0.05492304638028145,
-0.1095457449555397,
-0.030202189460396767,
-0.0704970434308052,
0.04584667086601257,
-0.04751512408256531,
0.058459483087062836,
0.03176712617278099,
0.03641867637634277,
-0.045512907207012177,
0.023506663739681244,
0.13135354220867157,
-0.015807054936885834,
-0.05140594393014908,
-0.041228875517845154,
0.0933295264840126,
-0.059961751103401184,
0.07785110175609589,
-0.008860625326633453,
0.05552057921886444,
-0.11966755241155624,
0.1075117290019989,
-0.022227156907320023,
-0.00823533721268177,
0.02395198494195938,
-0.006838734727352858,
0.01672418601810932,
0.0104605033993721,
0.06436007469892502,
-0.02945137210190296,
-0.055841706693172455,
0.2172788381576538,
-0.14957402646541595,
0.189424529671669,
0.0532122366130352,
-0.07215624302625656,
-0.023614320904016495,
-0.06789339333772659,
-0.06760728359222412,
0.031168103218078613,
-0.19732899963855743,
-0.03245049715042114,
0.19169248640537262,
-0.005928827915340662,
0.08887511491775513,
0.05071433633565903,
-0.054454561322927475,
-0.0295686237514019,
-0.06084177643060684,
0.06044537201523781,
0.037673674523830414,
0.08170198649168015,
-0.21096724271774292,
0.00838739238679409,
0.19858476519584656,
0.08448570221662521,
0.13574299216270447,
0.042878657579422,
0.005633647553622723,
-0.0803956687450409,
-0.11452548205852509,
0.013590759597718716,
0.020471248775720596,
-0.169459268450737,
-0.013442524708807468,
0.0664375051856041,
-0.05217878893017769,
0.02828047052025795,
-0.03610369935631752,
-0.07715363800525665,
-0.04788578301668167,
-0.020236222073435783,
-0.05050826072692871,
0.1641024798154831,
-0.045019812881946564,
0.1635541170835495,
0.01517251692712307,
-0.010589966550469398,
0.02658817730844021,
0.06092590466141701,
-0.13978423178195953,
0.13028836250305176,
-0.03034244477748871,
-0.2363644540309906,
-0.03373038023710251,
-0.10732954740524292,
0.10907686501741409,
0.02767397277057171,
0.0392162948846817,
-0.07132041454315186,
0.040276795625686646,
-0.008881616406142712,
0.050698600709438324,
0.093093641102314,
-0.016089240089058876,
-0.007026086561381817,
0.01747818849980831,
-0.037429966032505035,
0.01910235360264778,
-0.07380358129739761,
-0.14780879020690918,
-0.08720982074737549,
0.1750914305448532,
-0.11190550029277802,
0.01204358134418726,
0.1464066356420517,
0.04317546635866165,
0.006167809944599867,
-0.035485077649354935,
0.03604472428560257,
-0.08417385816574097,
0.05964157357811928,
0.22160299122333527,
0.013334522023797035,
-0.03449904918670654,
0.08529692888259888,
0.0773094892501831,
-0.05197152495384216,
0.06978684663772583,
0.000427800725447014,
-0.017601197585463524,
-0.2474881261587143,
-0.09301433712244034,
-0.03568512201309204,
0.05998505279421806,
-0.000987377017736435,
0.053606826812028885,
0.026345424354076385,
0.13297854363918304,
0.03039061278104782,
-0.025618670508265495,
-0.009285458363592625,
0.03889760002493858,
0.11125296354293823,
-0.03336144611239433,
0.16728626191616058,
0.037771306931972504,
-0.11787574738264084,
0.07905080914497375,
-0.08191512525081635,
0.1218249574303627,
0.03329380974173546,
-0.08512217551469803,
0.13825127482414246,
0.0461416020989418,
0.10540078580379486,
0.06620002537965775,
0.14407701790332794,
-0.02645755745470524,
-0.0032279309816658497,
-0.0036249086260795593,
-0.08424244076013565,
-0.02242099493741989,
-0.10651770979166031,
-0.07163543999195099,
-0.005044817458838224,
0.031943436712026596,
0.0753142461180687,
0.1585291028022766,
0.13692833483219147,
-0.14652679860591888,
-0.08292632550001144,
0.020203836262226105,
-0.05099740996956825,
-0.1056513786315918,
0.09834443032741547,
-0.00003482776810415089,
-0.1311863511800766,
0.0690932422876358,
-0.06784965842962265,
0.0967184379696846,
-0.005704554263502359,
0.1383890062570572,
0.008526669815182686,
-0.18432804942131042,
-0.04958265274763107,
0.08322539180517197,
-0.2362082153558731,
0.18365880846977234,
-0.017880335450172424,
0.006570244673639536,
-0.08448763936758041,
-0.05309027060866356,
0.05655785650014877,
0.20352721214294434,
0.1771170198917389,
0.02524408884346485,
0.005146249197423458,
0.02591482363641262,
-0.05464887246489525,
0.0645986869931221,
0.0731148049235344,
-0.11544250696897507,
0.00779705448076129,
-0.04150800034403801,
0.026591157540678978,
-0.07054334878921509,
-0.18173642456531525,
-0.11403576284646988,
0.004182123113423586,
0.09096553921699524,
-0.025831418111920357,
0.11978907883167267,
-0.037216026335954666,
-0.04088151827454567,
-0.05977177619934082,
0.07325396686792374,
0.04567928984761238,
-0.02364754118025303,
-0.12310885637998581,
0.13147872686386108,
0.03699752688407898,
-0.08097638934850693,
-0.009670192375779152,
0.015724098309874535,
-0.020870385691523552,
-0.012939001433551311,
-0.12757058441638947,
0.012307573109865189,
-0.15168800950050354,
-0.12802527844905853,
-0.047565657645463943,
0.10796226561069489,
0.0499122329056263,
0.03352701663970947,
0.031122323125600815,
0.05614041909575462,
-0.08519133180379868,
-0.057934921234846115,
0.2101282924413681,
0.004485801327973604,
0.03721201419830322,
0.032553546130657196,
-0.06936195492744446,
-0.0774175301194191,
-0.0056121679954230785,
-0.09177547693252563,
0.11112061142921448,
0.12622861564159393,
-0.05086784437298775,
0.13087907433509827,
0.12249297648668289,
-0.04884438216686249,
-0.328509122133255,
-0.11467605829238892,
-0.059739213436841965,
-0.007713387720286846,
0.03764086216688156,
-0.11386007815599442,
0.15669044852256775,
0.07844292372465134,
-0.014794299378991127,
-0.05617385357618332,
-0.17944853007793427,
-0.04508507624268532,
0.18939171731472015,
-0.002872707787901163,
0.3338836133480072,
-0.06916893273591995,
-0.045211393386125565,
0.06304911524057388,
-0.11580657958984375,
0.07938061654567719,
-0.1894848644733429,
0.0383453331887722,
-0.13941235840320587,
0.0496358685195446,
0.03335469961166382,
-0.031038103625178337,
0.03198965638875961,
-0.053501587361097336,
-0.03119404800236225,
-0.10397748649120331,
-0.032023124396800995,
0.1401626318693161,
0.02500493824481964,
0.03416900709271431,
0.03854740411043167,
-0.013463972136378288,
-0.10723336786031723,
-0.028670813888311386,
-0.06548837572336197,
-0.0020226717460900545,
0.046991072595119476,
-0.1238429993391037,
-0.07412601262331009,
0.06245773658156395,
-0.04271269589662552,
-0.00040213874308392406,
0.023976454511284828,
-0.12028402090072632,
0.06312324106693268,
-0.07098826766014099,
0.13371974229812622,
-0.03116658702492714,
0.07425788044929504,
-0.025104189291596413,
-0.11866742372512817,
0.009918338619172573,
-0.09337082505226135,
-0.01010008342564106,
0.12468154728412628,
0.0027743116952478886,
0.06517700105905533,
0.06561212986707687,
-0.03711545839905739,
0.034030888229608536,
0.0829930305480957,
-0.22714775800704956,
-0.09803496301174164,
-0.05348977446556091,
0.012734729796648026,
0.0051620700396597385,
0.07140365988016129,
0.18056431412696838,
-0.06859228014945984,
-0.040022652596235275,
-0.020775744691491127,
0.03530076891183853,
-0.08628705143928528,
0.19965310394763947,
-0.03385763615369797,
-0.0017905286513268948,
-0.06435322016477585,
-0.03756605461239815,
0.050840891897678375,
0.008505258709192276,
0.009364032186567783,
0.030974827706813812,
-0.1559506505727768,
-0.14121964573860168,
-0.03920932114124298,
0.14998912811279297,
-0.17943722009658813,
-0.08957741409540176,
-0.08286283165216446,
-0.1485915184020996,
0.08329376578330994,
0.1601911336183548,
0.009215629659593105,
0.01138351671397686,
-0.08398323506116867,
-0.016829535365104675,
-0.04265579953789711,
0.10143937915563583,
0.04079563543200493,
0.03487483412027359,
0.05995873734354973,
0.035703420639038086,
-0.009123441763222218,
0.14544740319252014,
-0.11416241526603699,
0.051249612122774124,
-0.10008420050144196,
0.014648105017840862,
-0.19958476722240448,
-0.05163038522005081,
-0.008632810786366463,
-0.10425849258899689,
-0.030124494805932045,
-0.02007594332098961,
-0.06261206418275833,
-0.07518541812896729,
-0.02121841348707676,
0.0311332568526268,
0.033854011446237564,
0.002254686551168561,
-0.07332088053226471,
0.00011068409366998821,
0.10307303816080093,
0.0050740959122776985,
0.19398410618305206,
0.06741855293512344,
-0.012995634227991104,
0.04489480331540108,
-0.13571298122406006,
-0.016208218410611153,
0.06646883487701416,
0.07596433907747269,
-0.00020360334019642323,
-0.10053321719169617,
0.006374315824359655,
0.044856902211904526,
-0.1052270159125328,
0.133968323469162,
0.025866147130727768,
0.01231048908084631,
0.03885453939437866,
-0.0637645572423935,
0.03815087676048279,
-0.0697033703327179,
-0.0488264299929142,
-0.01412177924066782,
0.05856253579258919,
0.056862663477659225,
-0.09462834894657135,
0.023921016603708267,
-0.08917315304279327,
-0.005155148915946484,
0.002616296289488673,
-0.07725300639867783,
-0.10642967373132706,
-0.00447419099509716,
0.053591158241033554,
0.07632878422737122,
0.225823774933815,
-0.0628114640712738,
-0.13266244530677795,
0.027121908962726593,
0.19100570678710938,
0.037765536457300186,
-0.05424461141228676,
0.08142078667879105,
0.06923297792673111,
-0.05761715769767761,
-0.16706202924251556,
0.01583283208310604,
-0.014201984740793705,
-0.11961027979850769,
0.1616271585226059,
-0.052546191960573196,
0.08859720081090927,
-0.10416897386312485,
-0.06201315298676491,
0.04597583785653114,
-0.08641806244850159,
-0.16586995124816895,
-0.07249454408884048,
-0.07968732714653015,
0.020548759028315544,
0.05003415793180466,
0.22247810661792755,
-0.01468265987932682,
-0.008026186376810074,
-0.061247777193784714,
-0.019589265808463097,
-0.12712660431861877,
-0.10256332159042358,
-0.013334281742572784,
-0.08331552147865295,
-0.014166864566504955,
-0.09738864749670029,
0.019722606986761093,
0.11622896790504456,
0.07574866712093353,
-0.005984662566334009,
0.09507763385772705,
-0.05173918604850769,
0.05463268607854843,
0.06836447864770889,
-0.03511550650000572,
0.02055521309375763,
-0.0785331204533577,
0.07856804877519608,
-0.07085390388965607,
0.02809523418545723,
0.06637360900640488,
-0.0022953881416469812,
0.005894429050385952,
-0.03534940630197525,
-0.07352543622255325,
-0.008040422573685646,
0.032670214772224426,
-0.0347026102244854,
-0.040425557643175125,
0.040082745254039764,
0.11081530153751373,
0.06278092414140701,
0.05802393704652786,
0.2224443107843399,
0.00933571346104145,
-0.11167910695075989,
-0.10512803494930267,
0.02425011433660984,
-0.004170891363173723,
0.022574061527848244,
-0.012253087013959885,
0.03741884231567383,
0.04013071954250336,
0.34588563442230225,
0.2446148544549942,
-0.04506867378950119,
0.00962132215499878,
0.055694349110126495,
0.03478991612792015,
0.1082707941532135,
0.10172830522060394,
0.11615382879972458,
0.1351083517074585,
-0.07154113054275513,
-0.03909166157245636,
-0.08287058025598526,
-0.08262399584054947,
-0.08814011514186859,
0.02682492695748806,
0.0945851281285286,
-0.02882315404713154,
0.01996767893433571,
0.08749993145465851,
-0.12126117944717407,
-0.0016262581339105964,
-0.07062175869941711,
-0.08524968475103378,
-0.021246161311864853,
-0.03627645596861839,
0.07203122228384018,
0.010006069205701351,
0.04955485090613365,
-0.01622900739312172,
0.024733588099479675,
0.1246008351445198,
0.04059983044862747,
-0.2016901671886444,
-0.001864572404883802,
0.07980965822935104,
-0.014496730640530586,
0.12148625403642654,
-0.044840991497039795,
0.025037039071321487,
0.06561805307865143,
-0.01887684501707554,
-0.04892687499523163,
0.15543793141841888,
-0.0627671480178833,
-0.005023750010877848,
0.05338858440518379,
0.011490190401673317,
0.08468350768089294,
-0.078297920525074,
0.05765734612941742,
-0.10207950323820114,
0.0037182930391281843,
0.01235282514244318,
-0.04247443005442619,
-0.06931665539741516,
0.20675207674503326,
0.006130095571279526,
0.06514526158571243,
0.0295877605676651,
-0.03283000737428665,
0.11385083198547363,
-0.11026758700609207,
0.07203289121389389,
-0.006394772790372372,
0.05111505836248398,
0.021160388365387917,
-0.08946546912193298,
0.05218387022614479,
-0.04008948430418968,
-0.010920282453298569,
-0.2598887085914612,
-0.0359145849943161,
-0.12029516696929932,
0.02997552789747715,
-0.07899942994117737,
0.07406998425722122,
0.06359703838825226,
0.04931323230266571,
-0.023546269163489342,
0.035689596086740494,
0.010798678733408451,
0.04039807245135307,
-0.08368559181690216,
-0.08468353748321533
] |
null | null |
transformers
|
# Model Trained Using AutoNLP
- Problem type: Summarization
- Model ID: 583416409
- CO2 Emissions (in grams): 72.26141764997115
## Validation Metrics
- Loss: 1.4701834917068481
- Rouge1: 47.7785
- Rouge2: 24.8518
- RougeL: 40.2231
- RougeLsum: 43.9487
- Gen Len: 18.8029
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/anegi/autonlp-dialogue-summariztion-583416409
```
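As an alternative to the hosted-API call above, here is a minimal local-inference sketch with the `transformers` summarization pipeline; the dialogue snippet and the length limits are invented illustrations, and only the repo id comes from the card.

```python
# Minimal local-inference sketch using the transformers summarization pipeline;
# the dialogue and length limits are illustrative, not taken from the card.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="anegi/autonlp-dialogue-summariztion-583416409",
)

dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow."
)
print(summarizer(dialogue, max_length=60, min_length=10)[0]["summary_text"])
```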
|
{"language": "en", "tags": "autonlp", "datasets": ["anegi/autonlp-data-dialogue-summariztion"], "widget": [{"text": "I love AutoNLP \ud83e\udd17"}], "co2_eq_emissions": 72.26141764997115}
|
text2text-generation
|
anegi/autonlp-dialogue-summariztion-583416409
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"autonlp",
"en",
"dataset:anegi/autonlp-data-dialogue-summariztion",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #bart #text2text-generation #autonlp #en #dataset-anegi/autonlp-data-dialogue-summariztion #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us
|
# Model Trained Using AutoNLP
- Problem type: Summarization
- Model ID: 583416409
- CO2 Emissions (in grams): 72.26141764997115
## Validation Metrics
- Loss: 1.4701834917068481
- Rouge1: 47.7785
- Rouge2: 24.8518
- RougeL: 40.2231
- RougeLsum: 43.9487
- Gen Len: 18.8029
## Usage
You can use cURL to access this model:
|
[
"# Model Trained Using AutoNLP\n\n- Problem type: Summarization\n- Model ID: 583416409\n- CO2 Emissions (in grams): 72.26141764997115",
"## Validation Metrics\n\n- Loss: 1.4701834917068481\n- Rouge1: 47.7785\n- Rouge2: 24.8518\n- RougeL: 40.2231\n- RougeLsum: 43.9487\n- Gen Len: 18.8029",
"## Usage\n\nYou can use cURL to access this model:"
] |
[
"TAGS\n#transformers #pytorch #bart #text2text-generation #autonlp #en #dataset-anegi/autonlp-data-dialogue-summariztion #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Trained Using AutoNLP\n\n- Problem type: Summarization\n- Model ID: 583416409\n- CO2 Emissions (in grams): 72.26141764997115",
"## Validation Metrics\n\n- Loss: 1.4701834917068481\n- Rouge1: 47.7785\n- Rouge2: 24.8518\n- RougeL: 40.2231\n- RougeLsum: 43.9487\n- Gen Len: 18.8029",
"## Usage\n\nYou can use cURL to access this model:"
] |
[
73,
41,
54,
13
] |
[
"passage: TAGS\n#transformers #pytorch #bart #text2text-generation #autonlp #en #dataset-anegi/autonlp-data-dialogue-summariztion #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoNLP\n\n- Problem type: Summarization\n- Model ID: 583416409\n- CO2 Emissions (in grams): 72.26141764997115## Validation Metrics\n\n- Loss: 1.4701834917068481\n- Rouge1: 47.7785\n- Rouge2: 24.8518\n- RougeL: 40.2231\n- RougeLsum: 43.9487\n- Gen Len: 18.8029## Usage\n\nYou can use cURL to access this model:"
] |
[
-0.1870366930961609,
0.12556131184101105,
-0.0015887677436694503,
0.06795724481344223,
0.04877448081970215,
0.008806940168142319,
0.11767777055501938,
0.06759350001811981,
0.04215208441019058,
0.01804986223578453,
0.12763690948486328,
0.11940869688987732,
-0.004427247680723667,
0.20136862993240356,
-0.05939555540680885,
-0.15732890367507935,
0.07373552769422531,
0.04381376504898071,
0.039959926158189774,
0.11685851961374283,
0.12017955631017685,
-0.08591421693563461,
0.12891602516174316,
0.08579981327056885,
-0.14019496738910675,
-0.013903859071433544,
0.039628639817237854,
-0.07513321191072464,
0.13121309876441956,
0.10954245924949646,
0.1569463312625885,
0.0798877403140068,
0.1233903095126152,
-0.09445350617170334,
0.007651573978364468,
-0.04645640775561333,
-0.04827269911766052,
0.11770806461572647,
0.03747403249144554,
-0.06276635080575943,
0.011967171914875507,
0.03994715213775635,
0.02625562623143196,
0.046761926263570786,
-0.08829747140407562,
-0.034861620515584946,
-0.05955180153250694,
-0.05593518540263176,
0.129425510764122,
0.08457989245653152,
-0.01901906356215477,
0.2553984820842743,
-0.19227732717990875,
0.02629212476313114,
0.13040538132190704,
-0.1899430751800537,
0.002246327232569456,
0.13607993721961975,
0.026864521205425262,
-0.12374471873044968,
-0.06119566410779953,
0.07413773238658905,
0.10372458398342133,
-0.033036161214113235,
0.05838411673903465,
-0.08496315777301788,
-0.01390030700713396,
0.009717551060020924,
-0.11299978196620941,
-0.013839587569236755,
0.23967741429805756,
0.08419078588485718,
-0.08642591536045074,
0.019826414063572884,
-0.07030592858791351,
-0.10443983227014542,
-0.048147059977054596,
-0.11887337267398834,
-0.00007829261448932812,
-0.0629548653960228,
-0.053802721202373505,
0.03727087378501892,
-0.16055655479431152,
-0.04563439264893532,
-0.12547388672828674,
0.0694584846496582,
-0.04330943524837494,
0.006088416092097759,
-0.03069334663450718,
0.13195790350437164,
-0.19476871192455292,
-0.061538487672805786,
-0.03628591448068619,
-0.065697580575943,
-0.09276153147220612,
-0.01758127100765705,
-0.02520461566746235,
0.10050283372402191,
-0.002797264838591218,
0.2277870774269104,
0.024210289120674133,
-0.03840743377804756,
0.10885141789913177,
0.020367499440908432,
0.009516666643321514,
0.176579087972641,
-0.06924036145210266,
-0.11405713111162186,
0.05771234631538391,
-0.0655985027551651,
0.026238489896059036,
-0.06345585733652115,
-0.13108965754508972,
-0.12125337868928909,
0.055449653416872025,
0.02808966115117073,
0.02189333736896515,
-0.033119745552539825,
-0.1390409767627716,
-0.020042575895786285,
0.1796305924654007,
0.007944216951727867,
0.03517860919237137,
-0.015585977584123611,
-0.016289643943309784,
0.0511983186006546,
0.11228868365287781,
0.06507687270641327,
-0.02134723775088787,
0.07276497036218643,
-0.1351063847541809,
-0.023340368643403053,
-0.04150623455643654,
-0.07682842761278152,
0.0443890206515789,
-0.04899369552731514,
0.03724543750286102,
-0.2065383344888687,
-0.06688535958528519,
0.03007648140192032,
-0.025930894538760185,
-0.036997005343437195,
-0.06914003938436508,
-0.045340608805418015,
-0.007991504855453968,
0.02700653485953808,
0.009431375190615654,
-0.026518480852246284,
-0.042739588767290115,
-0.032549332827329636,
0.024239687249064445,
0.05261052027344704,
-0.1522074192762375,
0.023536929860711098,
-0.06543044000864029,
0.008375142700970173,
-0.12319163233041763,
0.012958312407135963,
0.015749545767903328,
-0.042760904878377914,
-0.1251462996006012,
-0.06319905072450638,
0.016982248052954674,
-0.032806139439344406,
0.11246921867132187,
0.21010558307170868,
-0.0891207903623581,
-0.07261351495981216,
0.04885944724082947,
-0.06131824105978012,
-0.09747251123189926,
0.10952095687389374,
-0.03135162964463234,
0.01474340632557869,
0.026329156011343002,
-0.03099312074482441,
0.06985048949718475,
-0.12577442824840546,
-0.018242457881569862,
0.07859720289707184,
-0.015624270774424076,
-0.15159650146961212,
0.1103176400065422,
-0.023445384576916695,
-0.14510156214237213,
-0.018984856083989143,
0.05503536015748978,
0.035945430397987366,
-0.11258844286203384,
-0.11333395540714264,
-0.05038568750023842,
0.0006387017201632261,
0.050545599311590195,
-0.06563127785921097,
0.03262345865368843,
-0.01198448333889246,
-0.09049785882234573,
-0.09009797871112823,
0.10395810753107071,
0.007901553995907307,
0.03011929988861084,
-0.09092237800359726,
0.10293040424585342,
-0.1428758203983307,
-0.053969383239746094,
-0.13638141751289368,
-0.04923606663942337,
-0.011582274921238422,
-0.016613079234957695,
-0.017413415014743805,
0.07478100061416626,
0.012013601139187813,
0.07421812415122986,
-0.0013434631982818246,
0.012505524791777134,
0.014969509094953537,
-0.008956468664109707,
-0.13553385436534882,
-0.14073725044727325,
-0.033672068268060684,
-0.009931123815476894,
0.2599031925201416,
-0.09980522841215134,
-0.01939108967781067,
-0.028748231008648872,
0.10320515185594559,
-0.0352880023419857,
0.041714858263731,
-0.012165379710495472,
0.015577315352857113,
-0.08665388077497482,
0.01961534470319748,
0.010215194895863533,
-0.006614474579691887,
-0.18208463490009308,
0.10622864961624146,
-0.14412827789783478,
0.13965043425559998,
0.14746959507465363,
-0.030003948137164116,
-0.07697132229804993,
-0.04050933197140694,
-0.00008888661977835,
-0.0019879762548953295,
-0.06940246373414993,
-0.0285895187407732,
0.04716011881828308,
0.003917728550732136,
0.1052483320236206,
-0.06258407235145569,
-0.007250874303281307,
0.08117605000734329,
-0.0709766298532486,
0.02517002262175083,
0.1395743489265442,
0.18270714581012726,
-0.11873295903205872,
0.07335663586854935,
0.12667560577392578,
-0.1069871336221695,
0.006582766305655241,
0.07599128037691116,
-0.07616253942251205,
-0.06398044526576996,
-0.09700905531644821,
0.026895469054579735,
0.11601938307285309,
-0.06698160618543625,
0.0740446075797081,
0.10546182096004486,
-0.035611141473054886,
-0.006795637309551239,
-0.15068252384662628,
-0.032603781670331955,
0.03524406999349594,
0.03765159100294113,
-0.09197518974542618,
0.06016895920038223,
-0.012019125744700432,
0.14189797639846802,
-0.012878550216555595,
-0.170961394906044,
0.00997348502278328,
0.032760754227638245,
-0.14806640148162842,
0.2747958302497864,
-0.07078889012336731,
-0.27387863397598267,
-0.11449169367551804,
-0.017126521095633507,
-0.004780590999871492,
0.04131260886788368,
0.08477985858917236,
-0.0706278383731842,
-0.11531531065702438,
-0.009463262744247913,
0.0027311472222208977,
0.01899793930351734,
0.09510153532028198,
-0.04048501327633858,
-0.07049834728240967,
-0.04064760357141495,
-0.11525005847215652,
-0.023706955835223198,
-0.035200074315071106,
-0.011901813559234142,
0.13203158974647522,
-0.07905406504869461,
0.1311579942703247,
0.18438521027565002,
-0.022133473306894302,
-0.035633351653814316,
0.04698013514280319,
0.26389235258102417,
-0.06417997926473618,
0.03794487193226814,
0.11804038286209106,
0.040631767362356186,
0.03719185292720795,
0.10877037793397903,
0.03217029199004173,
-0.0586814284324646,
0.005441874731332064,
-0.02099783346056938,
-0.07194960862398148,
-0.21643680334091187,
-0.13876406848430634,
-0.020717810839414597,
0.004879136569797993,
0.032889172434806824,
-0.00843715202063322,
0.16933833062648773,
0.14570046961307526,
-0.023492883890867233,
0.025773074477910995,
-0.08412446826696396,
0.09130000323057175,
0.11179855465888977,
-0.016254838556051254,
0.16718536615371704,
-0.05290863662958145,
-0.09789475798606873,
0.12321851402521133,
-0.04278172180056572,
0.11805898696184158,
0.12917903065681458,
0.02577587589621544,
-0.007991589605808258,
0.13167895376682281,
0.0877789780497551,
0.1433541625738144,
0.12533625960350037,
-0.06660643219947815,
-0.036071114242076874,
-0.052702706307172775,
-0.015780603513121605,
0.08918731659650803,
0.11018681526184082,
-0.013842783868312836,
-0.11301665008068085,
0.056725915521383286,
0.019673366099596024,
0.018650513142347336,
0.18586747348308563,
-0.38553130626678467,
-0.08472535759210587,
-0.002314519602805376,
0.0102224824950099,
-0.06855914741754532,
-0.0485093854367733,
-0.040990546345710754,
-0.17391785979270935,
0.005380586721003056,
0.009507116861641407,
0.08776962757110596,
0.022172892466187477,
0.015325242653489113,
-0.1409122198820114,
0.0449993722140789,
-0.027944041416049004,
0.07091684639453888,
-0.2221163809299469,
0.28557509183883667,
0.045973338186740875,
-0.04140695929527283,
-0.05699741095304489,
0.008781833574175835,
0.0019541815854609013,
0.19653263688087463,
0.16363860666751862,
0.03164563328027725,
0.06939979642629623,
-0.11364257335662842,
-0.20522989332675934,
0.08094406872987747,
-0.008554251864552498,
-0.07481718063354492,
0.03555701673030853,
0.033068522810935974,
-0.079239122569561,
0.033292606472969055,
0.0030486444011330605,
-0.1467607170343399,
-0.056162912398576736,
0.10246768593788147,
0.11875271052122116,
-0.05430230870842934,
0.005811134818941355,
-0.11841912567615509,
0.02830728143453598,
0.2210516482591629,
-0.012167578563094139,
-0.021296242251992226,
-0.13683922588825226,
0.004519931972026825,
0.139864981174469,
-0.1018168181180954,
0.08793549239635468,
-0.054837167263031006,
0.09994202852249146,
-0.04694780707359314,
-0.03871120139956474,
0.1297958940267563,
-0.12318746745586395,
-0.06943630427122116,
-0.022128285840153694,
0.13224467635154724,
0.04474350064992905,
0.09557940065860748,
0.07259666174650192,
0.00006117126031313092,
-0.10975254327058792,
-0.15133748948574066,
-0.0021588951349258423,
0.006290218327194452,
0.052638035267591476,
-0.014197110198438168,
0.02943485602736473,
-0.08548634499311447,
-0.012600711546838284,
0.027690667659044266,
0.15078425407409668,
0.21767385303974152,
-0.09003604203462601,
0.04538692533969879,
0.19754618406295776,
-0.014645957387983799,
-0.2179892659187317,
-0.026840034872293472,
-0.009176772087812424,
0.08240270614624023,
-0.08561474084854126,
-0.08161046355962753,
0.10029276460409164,
0.1053541973233223,
-0.04476054757833481,
0.012358367443084717,
-0.20588397979736328,
-0.16210684180259705,
0.22766312956809998,
-0.01603328436613083,
0.271081805229187,
-0.0038571199402213097,
-0.0035480624064803123,
-0.09191688150167465,
-0.2693941295146942,
0.17231805622577667,
-0.00792308896780014,
0.07914519309997559,
-0.011073652654886246,
0.08961085230112076,
0.04218081012368202,
-0.030962461605668068,
0.21345295011997223,
0.04956628382205963,
-0.010538297705352306,
0.020923350006341934,
-0.08794780820608139,
0.015606299042701721,
-0.04954142868518829,
0.09864075481891632,
0.027124403044581413,
0.04619099572300911,
-0.10988642275333405,
-0.04229756072163582,
0.0012327672448009253,
0.13613900542259216,
-0.046186890453100204,
-0.07688803970813751,
-0.020840464159846306,
-0.014043762348592281,
-0.024672528728842735,
-0.03297065570950508,
0.06629011780023575,
0.008719916455447674,
0.017040081322193146,
0.08228873461484909,
0.17983302474021912,
-0.08559879660606384,
-0.03045467473566532,
0.02111206017434597,
-0.08042851835489273,
0.1010521799325943,
-0.1433231681585312,
0.07227910310029984,
0.1328490525484085,
-0.007421485148370266,
0.045509643852710724,
0.032538555562496185,
-0.07668063789606094,
-0.01783902570605278,
0.10804591327905655,
-0.15034787356853485,
0.026945294812321663,
-0.03480517864227295,
0.018356630578637123,
-0.03361691161990166,
0.10256327688694,
0.13027669489383698,
-0.03644142672419548,
-0.05920659005641937,
0.004061822779476643,
-0.02762337028980255,
-0.04873498156666756,
0.17414362728595734,
0.05407004803419113,
0.0783412978053093,
-0.12447313219308853,
0.00924971979111433,
0.009628505446016788,
-0.029206572100520134,
-0.030606381595134735,
-0.0026841757353395224,
-0.11815188080072403,
-0.10081282258033752,
-0.030226562172174454,
0.1199755147099495,
-0.3727124333381653,
-0.053302545100450516,
-0.037031810730695724,
-0.05330673232674599,
0.042170342057943344,
0.17501461505889893,
0.11491596698760986,
0.04266957938671112,
0.009521217085421085,
-0.11364135891199112,
-0.1092766523361206,
-0.025674674659967422,
0.09039882570505142,
0.04438050091266632,
-0.012045889161527157,
0.020435722544789314,
-0.024586601182818413,
0.146748349070549,
-0.045561082661151886,
-0.012673893012106419,
-0.11003602296113968,
-0.02212548442184925,
-0.11032582819461823,
-0.014794698916375637,
-0.06574070453643799,
-0.024059994146227837,
-0.024815088137984276,
-0.08677081018686295,
-0.07421454787254333,
0.036396559327840805,
-0.07191553711891174,
0.0005962528521195054,
-0.009586974047124386,
0.03263142332434654,
-0.06307350844144821,
-0.021705003455281258,
0.06436993926763535,
-0.021124430000782013,
0.08678461611270905,
0.11886615306138992,
0.054163627326488495,
0.06498762965202332,
-0.17062650620937347,
0.013215736486017704,
0.1036614328622818,
0.032820675522089005,
0.13846661150455475,
-0.16578206419944763,
0.03114491142332554,
0.037407808005809784,
0.0587335079908371,
-0.017511244863271713,
0.03056642971932888,
-0.11306513100862503,
0.0254774522036314,
-0.059574805200099945,
-0.14461924135684967,
-0.06576774269342422,
-0.01184750534594059,
0.06278304010629654,
0.05402318760752678,
0.04021403193473816,
0.0225503109395504,
0.0723387748003006,
-0.10028435289859772,
0.052910998463630676,
-0.08052849769592285,
-0.029579782858490944,
-0.10379156470298767,
-0.06710323691368103,
0.04951959475874901,
-0.0009761503897607327,
0.13611559569835663,
-0.057524945586919785,
0.17375421524047852,
-0.0152394138276577,
0.041759271174669266,
0.0697738453745842,
-0.012183175422251225,
0.09238757193088531,
0.133841872215271,
0.036551956087350845,
0.030448168516159058,
0.13938982784748077,
0.10358979552984238,
-0.014019506983458996,
0.14289787411689758,
-0.0749320238828659,
0.03378472104668617,
0.18378876149654388,
-0.03347829729318619,
-0.12671148777008057,
-0.0688752830028534,
-0.09171710163354874,
-0.09092588722705841,
-0.0028694935608655214,
0.01648791693150997,
0.04529838263988495,
0.10616134107112885,
-0.06913323700428009,
-0.007918104529380798,
0.012127917259931564,
-0.049767035990953445,
-0.24163536727428436,
-0.06986240297555923,
-0.13662494719028473,
-0.08294374495744705,
-0.03079543448984623,
-0.11698999255895615,
-0.041769590228796005,
0.035927291959524155,
0.06872621178627014,
-0.027028106153011322,
0.054982028901576996,
-0.05888642743229866,
-0.016756515949964523,
-0.017017599195241928,
0.03571339324116707,
0.06109369173645973,
-0.05381070077419281,
-0.009672132320702076,
0.0025016821455210447,
0.0492071732878685,
0.009311717003583908,
-0.03940162435173988,
0.03544975817203522,
0.11597796529531479,
0.026038430631160736,
-0.0957394391298294,
-0.057878993451595306,
-0.007320299278944731,
0.05839047580957413,
0.03821147605776787,
0.0037725907750427723,
0.0524829626083374,
0.01569843478500843,
0.1659683734178543,
-0.06201281398534775,
-0.010584763251245022,
-0.15936213731765747,
0.25801289081573486,
-0.04185103625059128,
0.05825776234269142,
0.018826914951205254,
-0.03756004571914673,
-0.004825791344046593,
0.175334170460701,
0.15585030615329742,
-0.0029086689464747906,
0.03408470004796982,
-0.0019579213112592697,
0.011510041542351246,
0.019888248294591904,
-0.04153181612491608,
0.07026401907205582,
0.18139329552650452,
-0.12692449986934662,
-0.028182949870824814,
-0.007632516790181398,
-0.013028828427195549,
0.032033443450927734,
0.041157934814691544,
-0.010985207743942738,
-0.04253434017300606,
-0.04188239574432373,
0.07210458815097809,
-0.041103724390268326,
0.03786121681332588,
0.06369294971227646,
-0.1277441829442978,
-0.13609114289283752,
0.0142092015594244,
-0.05274582281708717,
0.019396478310227394,
0.12338952720165253,
-0.1145983412861824,
-0.0901213064789772,
0.14969675242900848,
0.04139045998454094,
-0.18251799046993256,
-0.1076449304819107,
0.053526587784290314,
0.10403203964233398,
0.14820004999637604,
0.018969377502799034,
0.16761159896850586,
0.10921932011842728,
0.07533948868513107,
-0.11244411766529083,
0.08369141817092896,
0.058942314237356186,
-0.07747757434844971,
0.10733094811439514,
0.007630618289113045,
-0.023380860686302185,
0.06690478324890137,
0.00620084535330534,
-0.16639837622642517,
0.05539475008845329,
-0.05555718392133713,
-0.0019053566502407193,
-0.041939400136470795,
0.014029075391590595,
-0.10467024147510529,
0.11516875773668289,
0.07801761478185654,
-0.0805373266339302,
-0.08371859043836594,
-0.01904708333313465,
0.10497237741947174,
0.07384705543518066,
-0.15036451816558838,
-0.024535560980439186,
-0.10509750992059708,
0.1077326312661171,
0.006265583448112011,
0.04254470020532608,
-0.09195774793624878,
-0.02661922574043274,
-0.05161314457654953,
-0.05608963593840599,
-0.03476804494857788,
0.03414274752140045,
-0.015753408893942833,
0.02004065550863743,
-0.02093280293047428,
-0.08307062089443207,
-0.0009785882430151105,
0.05893693119287491,
-0.07750964909791946,
-0.18969595432281494
] |
null | null |
transformers
|
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 412010597
- CO2 Emissions (in grams): 10.411685187181709
## Validation Metrics
- Loss: 0.12585781514644623
- Accuracy: 0.9475446428571429
- Precision: 0.9454660748256183
- Recall: 0.964424320827943
- AUC: 0.990229573862156
- F1: 0.9548511047070125
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/anel/autonlp-cml-412010597
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned classifier and its tokenizer (use_auth_token is needed for private repos)
model = AutoModelForSequenceClassification.from_pretrained("anel/autonlp-cml-412010597", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("anel/autonlp-cml-412010597", use_auth_token=True)

# Tokenize the input text and run a forward pass to get classification logits
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
```
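Continuing from the snippet above, the forward pass returns raw logits. A minimal sketch for turning them into class probabilities (the mapping from class index to label name is an assumption, since the card does not list the label names):
```
import torch

# outputs.logits has shape (batch_size, 2) for this binary classifier
probabilities = torch.softmax(outputs.logits, dim=-1)
predicted_class = probabilities.argmax(dim=-1).item()  # 0 or 1; the label-name mapping depends on the training data
print(probabilities, predicted_class)
```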
|
{"language": "en", "tags": "autonlp", "datasets": ["anel/autonlp-data-cml"], "widget": [{"text": "I love AutoNLP \ud83e\udd17"}], "co2_eq_emissions": 10.411685187181709}
|
text-classification
|
anel/autonlp-cml-412010597
|
[
"transformers",
"pytorch",
"roberta",
"text-classification",
"autonlp",
"en",
"dataset:anel/autonlp-data-cml",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #roberta #text-classification #autonlp #en #dataset-anel/autonlp-data-cml #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us
|
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 412010597
- CO2 Emissions (in grams): 10.411685187181709
## Validation Metrics
- Loss: 0.12585781514644623
- Accuracy: 0.9475446428571429
- Precision: 0.9454660748256183
- Recall: 0.964424320827943
- AUC: 0.990229573862156
- F1: 0.9548511047070125
## Usage
You can use cURL to access this model:
Or Python API:
|
[
"# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 412010597\n- CO2 Emissions (in grams): 10.411685187181709",
"## Validation Metrics\n\n- Loss: 0.12585781514644623\n- Accuracy: 0.9475446428571429\n- Precision: 0.9454660748256183\n- Recall: 0.964424320827943\n- AUC: 0.990229573862156\n- F1: 0.9548511047070125",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
"TAGS\n#transformers #pytorch #roberta #text-classification #autonlp #en #dataset-anel/autonlp-data-cml #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 412010597\n- CO2 Emissions (in grams): 10.411685187181709",
"## Validation Metrics\n\n- Loss: 0.12585781514644623\n- Accuracy: 0.9475446428571429\n- Precision: 0.9454660748256183\n- Recall: 0.964424320827943\n- AUC: 0.990229573862156\n- F1: 0.9548511047070125",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
67,
42,
80,
17
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #text-classification #autonlp #en #dataset-anel/autonlp-data-cml #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 412010597\n- CO2 Emissions (in grams): 10.411685187181709## Validation Metrics\n\n- Loss: 0.12585781514644623\n- Accuracy: 0.9475446428571429\n- Precision: 0.9454660748256183\n- Recall: 0.964424320827943\n- AUC: 0.990229573862156\n- F1: 0.9548511047070125## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
-0.14829126000404358,
0.13050992786884308,
-0.0008166239713318646,
0.07318787276744843,
0.11990471929311752,
0.032161176204681396,
0.036206141114234924,
0.09389826655387878,
0.00878667738288641,
0.06382101029157639,
0.1510053426027298,
0.19947928190231323,
0.015326591208577156,
0.11443499475717545,
-0.15171268582344055,
-0.14437153935432434,
0.04260392487049103,
0.08147598803043365,
0.09956836700439453,
0.12121161073446274,
0.09578512609004974,
-0.10261750966310501,
0.13568611443042755,
0.050183314830064774,
-0.15994201600551605,
-0.010805320926010609,
0.08073961734771729,
-0.11157629638910294,
0.08752943575382233,
0.08373056352138519,
0.16875207424163818,
0.025261979550123215,
0.08743996173143387,
-0.11174236983060837,
-0.021789327263832092,
0.0029246946796774864,
-0.012402675114572048,
0.09083763509988785,
0.040642328560352325,
-0.06335341930389404,
-0.011017335578799248,
0.005673951003700495,
0.0811346098780632,
0.045510075986385345,
-0.08324988186359406,
-0.06802733242511749,
-0.060603559017181396,
0.05714980140328407,
0.13293147087097168,
0.11666695028543472,
0.011711565777659416,
0.2524005174636841,
-0.084551602602005,
0.08503817766904831,
0.0650254338979721,
-0.28106603026390076,
-0.013772530481219292,
0.11862688511610031,
-0.01775064691901207,
-0.10456305742263794,
-0.025465697050094604,
0.010630282573401928,
0.09437470138072968,
0.025924809277057648,
0.06359849125146866,
-0.06424666941165924,
-0.058318283408880234,
0.006098214071244001,
-0.10350295901298523,
-0.06659876555204391,
0.18418912589550018,
0.015404317528009415,
-0.08550664782524109,
-0.0133893433958292,
-0.0924895852804184,
-0.12586992979049683,
-0.07035792618989944,
-0.03310932219028473,
-0.02387041039764881,
-0.03611796349287033,
-0.05304548144340515,
0.08262193202972412,
-0.11946085095405579,
-0.04742923006415367,
-0.18574540317058563,
0.12176764756441116,
0.007185015827417374,
0.058125950396060944,
-0.041725024580955505,
0.10883297771215439,
-0.074915811419487,
-0.08711790293455124,
-0.0262365210801363,
-0.030839573591947556,
-0.047320656478405,
-0.0670214518904686,
-0.03230069577693939,
0.02432013303041458,
-0.014175549149513245,
0.20363810658454895,
0.04372938722372055,
0.038923561573028564,
0.03469054773449898,
0.0027654345612972975,
-0.01771172694861889,
0.18544048070907593,
-0.08836611360311508,
-0.015108139254152775,
0.06849922984838486,
-0.07656976580619812,
0.020613310858607292,
-0.034976642578840256,
-0.0839197114109993,
-0.12656280398368835,
0.1287752389907837,
0.02055312693119049,
0.019102128222584724,
0.0642324760556221,
-0.07150367647409439,
-0.03919284790754318,
0.07776694744825363,
-0.04877042770385742,
0.043120305985212326,
-0.019211897626519203,
-0.06032102555036545,
0.0723293125629425,
0.11086517572402954,
0.046771090477705,
-0.07975878566503525,
0.0967869684100151,
-0.11253786087036133,
0.02057473547756672,
-0.04500322788953781,
-0.10366681963205338,
0.0566377155482769,
-0.07062310725450516,
0.03766750916838646,
-0.21226489543914795,
-0.1579974889755249,
-0.0033977238927036524,
0.011026313528418541,
-0.04075419157743454,
-0.027639100328087807,
-0.011558130383491516,
-0.03327959403395653,
0.05500435456633568,
-0.024891283363103867,
-0.03113635815680027,
-0.03580861538648605,
0.03047836944460869,
0.04997523874044418,
0.040493983775377274,
-0.11443280428647995,
0.034410037100315094,
-0.07727626711130142,
0.006227824836969376,
-0.1238541230559349,
0.025021258741617203,
-0.019575880840420723,
0.062174346297979355,
-0.1473524570465088,
-0.0705549567937851,
0.0792483240365982,
-0.0342630110681057,
0.07278935611248016,
0.129432812333107,
-0.038068145513534546,
-0.0033181391190737486,
0.060491736978292465,
-0.0464506559073925,
-0.11607389897108078,
0.10009320825338364,
-0.05043439939618111,
-0.01513058040291071,
0.06667131930589676,
-0.05103937163949013,
0.14856192469596863,
-0.13102489709854126,
-0.06724952906370163,
0.030815863981842995,
-0.05843652784824371,
-0.13896936178207397,
0.053052645176649094,
0.028048165142536163,
-0.1727866381406784,
0.028548642992973328,
0.06709357351064682,
0.010938465595245361,
-0.05222621187567711,
-0.08531925082206726,
-0.07546167820692062,
-0.026893965899944305,
0.0278092622756958,
-0.007270548492670059,
0.07212625443935394,
-0.029407262802124023,
-0.0772673711180687,
-0.025883667171001434,
0.11617670208215714,
-0.021473059430718422,
-0.0010753528913483024,
-0.1548132449388504,
0.11303536593914032,
-0.20171627402305603,
-0.05375389754772186,
-0.19299593567848206,
-0.010679949074983597,
-0.0186174176633358,
0.03573977202177048,
-0.03011307679116726,
-0.04626627638936043,
0.055000025779008865,
0.007561295293271542,
0.001288564526475966,
-0.0037998936604708433,
0.13588491082191467,
-0.0014100682456046343,
-0.1441437005996704,
-0.049077004194259644,
-0.030394265428185463,
-0.010682692751288414,
0.19781170785427094,
-0.12279069423675537,
-0.019294997677206993,
0.007044519297778606,
0.10474537312984467,
-0.017734820023179054,
0.03240249305963516,
-0.03995180130004883,
0.045231033116579056,
-0.06010863184928894,
0.0016122108791023493,
0.019914526492357254,
-0.0067677791230380535,
-0.11438004672527313,
0.04228644445538521,
-0.16899999976158142,
0.19522014260292053,
0.16693374514579773,
-0.0901605486869812,
-0.07524322718381882,
0.0021110880188643932,
0.02601795829832554,
-0.013737744651734829,
-0.026995841413736343,
0.01659047231078148,
0.12220693379640579,
0.005254930350929499,
0.13216833770275116,
-0.07469332218170166,
-0.04842305928468704,
0.06229221075773239,
-0.08341174572706223,
-0.024388710036873817,
0.13447178900241852,
0.06837031245231628,
-0.21432086825370789,
0.08138789981603622,
0.03827786445617676,
-0.12028630822896957,
-0.006426936015486717,
0.047359440475702286,
-0.05277642235159874,
-0.03374676778912544,
-0.04924435913562775,
0.036440614610910416,
0.04590360075235367,
-0.019609929993748665,
0.052783526480197906,
0.08898178488016129,
-0.01716465689241886,
0.017767807468771935,
-0.1293654590845108,
0.022477006539702415,
0.03181764855980873,
0.026829037815332413,
-0.08406245708465576,
0.015477964654564857,
0.04640558362007141,
0.11549805104732513,
0.038860492408275604,
-0.10094618052244186,
0.05922788381576538,
0.0348939374089241,
-0.13651520013809204,
0.24865351617336273,
-0.08784772455692291,
-0.2043866217136383,
-0.16669709980487823,
-0.14470818638801575,
-0.022855427116155624,
0.0030272165313363075,
0.0268528014421463,
-0.02960003726184368,
-0.1301194727420807,
-0.028509290888905525,
-0.09462487697601318,
-0.04191257432103157,
0.024209871888160706,
-0.01209721714258194,
-0.019738154485821724,
0.04191895201802254,
-0.07083361595869064,
-0.05181976407766342,
-0.017556844279170036,
-0.010992844589054585,
0.1366373747587204,
-0.06045062467455864,
0.10557923465967178,
0.17502179741859436,
-0.04298420622944832,
0.004403233993798494,
0.021899288520216942,
0.19707442820072174,
-0.029725542291998863,
-0.02168034203350544,
0.158016175031662,
-0.0032497397623956203,
0.03971845656633377,
0.13928964734077454,
0.01941591314971447,
-0.059960540384054184,
-0.006793082691729069,
-0.0177567470818758,
-0.04590770974755287,
-0.1990499496459961,
-0.16475893557071686,
-0.017337828874588013,
-0.01517219003289938,
0.12520691752433777,
0.009049156680703163,
0.11764813214540482,
0.15445390343666077,
0.0032542389817535877,
0.09095504879951477,
-0.06685055792331696,
0.11601023375988007,
0.16651707887649536,
0.03685441613197327,
0.1541493982076645,
-0.0685235932469368,
-0.0732007548213005,
0.06440421938896179,
-0.0006236726185306907,
0.09428870677947998,
0.04957076907157898,
-0.03224927932024002,
-0.026726773008704185,
0.13489463925361633,
0.07748553156852722,
0.13899774849414825,
0.055680591613054276,
-0.04029811918735504,
0.026761727407574654,
-0.021354179829359055,
-0.0993238091468811,
0.04206777364015579,
0.054697684943675995,
0.013977150432765484,
-0.1116199791431427,
-0.013539264909923077,
-0.007260837592184544,
0.0947447419166565,
0.15853697061538696,
-0.4944573640823364,
-0.0876128152012825,
0.016649028286337852,
-0.04406573995947838,
-0.12768054008483887,
-0.015034063719213009,
-0.10867336392402649,
-0.1506519615650177,
0.05356866866350174,
-0.04381101578474045,
0.11561145633459091,
-0.04651132971048355,
-0.012472013011574745,
-0.0720336064696312,
0.005202747881412506,
-0.020886801183223724,
0.069678895175457,
-0.2351701259613037,
0.22640089690685272,
0.05538056790828705,
0.03875303640961647,
-0.08390958607196808,
-0.000669703702442348,
0.008886925876140594,
0.1106002926826477,
0.12445990741252899,
-0.007743517402559519,
0.013108160346746445,
-0.25516438484191895,
-0.14137916266918182,
0.037598107010126114,
-0.014885257929563522,
0.0262939240783453,
0.10318208485841751,
-0.0022024260833859444,
-0.030463671311736107,
0.008263512514531612,
-0.04314253479242325,
-0.06711667776107788,
-0.053688742220401764,
0.04433925077319145,
0.12099529802799225,
-0.033848147839307785,
0.009675804525613785,
-0.06167168170213699,
-0.022449186071753502,
0.1446307897567749,
-0.07463550567626953,
-0.07016682624816895,
-0.1473631113767624,
-0.0059305219911038876,
0.12289243936538696,
-0.11772183328866959,
0.08900151401758194,
-0.04794399067759514,
0.06299713999032974,
0.002787770237773657,
-0.12983529269695282,
0.10464860498905182,
-0.087710902094841,
-0.035432811826467514,
0.01213237177580595,
0.07270872592926025,
0.002557693747803569,
0.03619243577122688,
0.08037255704402924,
0.028436725959181786,
-0.10392901301383972,
-0.11090564727783203,
-0.009454308077692986,
0.04792182892560959,
0.16693115234375,
0.0711725503206253,
0.040858589112758636,
-0.12876881659030914,
-0.05096026137471199,
0.08204798400402069,
0.16760236024856567,
0.15217474102973938,
-0.07915028929710388,
-0.019648214802145958,
0.11765290051698685,
-0.009870023466646671,
-0.2188677042722702,
-0.018622448667883873,
-0.011789347976446152,
0.05003874748945236,
-0.1476992517709732,
-0.013513483107089996,
0.12158636748790741,
0.06087716668844223,
-0.045992832630872726,
-0.012503840029239655,
-0.15867556631565094,
-0.12050072848796844,
0.26820361614227295,
0.03630776330828667,
0.2036009430885315,
-0.05731077119708061,
-0.028373226523399353,
-0.14095552265644073,
-0.3267349600791931,
0.12489490956068039,
-0.00385266006924212,
0.0772317573428154,
-0.06050243601202965,
0.13445332646369934,
0.04369863495230675,
-0.08082924783229828,
0.13245047628879547,
0.02292884700000286,
0.037184227257966995,
-0.04543817788362503,
-0.052010782063007355,
-0.05086452513933182,
-0.07817555963993073,
0.14588405191898346,
0.04373655095696449,
0.06614265590906143,
-0.1884118914604187,
-0.054376039654016495,
-0.020989563316106796,
0.09587865322828293,
-0.010238777846097946,
-0.048155780881643295,
-0.02823137305676937,
-0.032470785081386566,
0.004791119135916233,
-0.05989634245634079,
0.04668385535478592,
-0.009899317286908627,
0.04473608732223511,
0.13985471427440643,
0.16719582676887512,
-0.09764168411493301,
-0.022565269842743874,
0.03722735866904259,
-0.08750063180923462,
0.10811891406774521,
-0.1556635946035385,
0.092319056391716,
0.12794259190559387,
-0.024040061980485916,
0.07714972645044327,
0.06125618889927864,
-0.041869860142469406,
-0.022064348682761192,
0.05949557572603226,
-0.14269636571407318,
0.08154388517141342,
0.0010647771414369345,
0.0028402814641594887,
-0.047193743288517,
0.06896520406007767,
0.1511022448539734,
-0.05387767404317856,
-0.042160920798778534,
0.00416143424808979,
-0.012341814115643501,
-0.032634422183036804,
0.1983155608177185,
0.04608045145869255,
0.05465935170650482,
-0.12364614009857178,
0.03750596567988396,
0.025933334603905678,
-0.05538846552371979,
0.023702148348093033,
-0.031292594969272614,
-0.13358773291110992,
-0.09503944963216782,
-0.0053751966916024685,
0.10637158900499344,
-0.2726069688796997,
-0.061798371374607086,
-0.021618453785777092,
-0.08711857348680496,
0.086231529712677,
0.2211495339870453,
0.09977705776691437,
0.050255145877599716,
-0.0303172804415226,
-0.08521939069032669,
-0.14033518731594086,
0.008934198878705502,
0.1139485090970993,
0.06322254985570908,
-0.14767199754714966,
0.1321520060300827,
-0.04655101150274277,
0.06377474963665009,
-0.04238526150584221,
0.017507372424006462,
-0.146836519241333,
0.021504152566194534,
-0.1764572560787201,
0.044937893748283386,
-0.07475557923316956,
0.018495498225092888,
0.006765189580619335,
-0.02804647758603096,
-0.07701434195041656,
0.030139947310090065,
-0.07075564563274384,
-0.020002225413918495,
0.03145680949091911,
0.007918231189250946,
-0.0608692392706871,
-0.053373854607343674,
0.06675998866558075,
-0.025525839999318123,
0.06968854367733002,
0.1523447036743164,
0.01895793341100216,
0.04968305677175522,
-0.10431693494319916,
-0.030492698773741722,
0.11275254189968109,
0.03632056713104248,
0.1047593355178833,
-0.1520490050315857,
0.07195230573415756,
0.07452020794153214,
0.023788079619407654,
0.05239491909742355,
0.10089344531297684,
-0.10454432666301727,
-0.030535778030753136,
-0.07697135955095291,
-0.07794869691133499,
-0.12534382939338684,
0.00027081003645434976,
0.12812013924121857,
0.051399555057287216,
0.10474088042974472,
-0.03693345934152603,
0.04472964257001877,
-0.10627356916666031,
0.007506153546273708,
-0.07016808539628983,
-0.07292597740888596,
-0.05620816349983215,
-0.038867320865392685,
0.07099882513284683,
-0.023025689646601677,
0.08705417811870575,
-0.04553500935435295,
0.08047368377447128,
0.001583046279847622,
0.08449381589889526,
0.030248936265707016,
-0.008584065362811089,
0.15052056312561035,
0.10593371093273163,
-0.03310870751738548,
0.046677183359861374,
0.10092330724000931,
0.08543100953102112,
-0.029156217351555824,
0.029862191528081894,
0.02236364781856537,
-0.004408904816955328,
0.14421485364437103,
0.007578871212899685,
-0.05473775789141655,
-0.06928502023220062,
-0.07698822021484375,
-0.15262973308563232,
0.040502727031707764,
0.023917509242892265,
0.05240509286522865,
0.12750007212162018,
-0.0691034272313118,
-0.02350224182009697,
-0.04407421872019768,
-0.0760381892323494,
-0.20294001698493958,
-0.09207522124052048,
-0.1478896290063858,
-0.06444411724805832,
0.004658655263483524,
-0.08248075097799301,
-0.03968317061662674,
0.0862758532166481,
0.03820068761706352,
-0.024925032630562782,
0.06907258182764053,
-0.0862920880317688,
-0.011774875223636627,
0.002594622317701578,
0.01731918193399906,
0.01768379658460617,
0.021671747788786888,
0.00796110462397337,
0.000595061865169555,
0.0014925284776836634,
0.05437539145350456,
-0.008864681236445904,
0.03678082674741745,
0.11081326752901077,
0.004100237973034382,
-0.08740513026714325,
-0.03786388784646988,
0.0575253888964653,
0.07612015306949615,
0.06927473098039627,
0.013178786262869835,
0.053896404802799225,
-0.013457088731229305,
0.2084084302186966,
-0.10728438943624496,
-0.006570087280124426,
-0.14675238728523254,
0.2961617112159729,
0.029311850666999817,
0.04817305877804756,
0.03231394663453102,
-0.03934164717793465,
0.02675667405128479,
0.2029884308576584,
0.09402648359537125,
-0.013050821609795094,
0.006767825223505497,
-0.0025980141945183277,
-0.013460141606628895,
0.0026986603625118732,
0.03298647329211235,
0.0858365148305893,
0.20152735710144043,
-0.0980454534292221,
-0.01640511490404606,
0.0011454548221081495,
-0.007080403156578541,
-0.02874080464243889,
0.03827506676316261,
-0.03836635872721672,
-0.03899525851011276,
-0.03873114660382271,
0.07231054455041885,
-0.07331065088510513,
0.0965997576713562,
0.06957326084375381,
-0.07465778291225433,
-0.10480599105358124,
0.03953859582543373,
-0.04670054093003273,
-0.030131982639431953,
0.09413474798202515,
-0.08029705286026001,
-0.053503796458244324,
0.07427596300840378,
0.007202385924756527,
-0.1691310703754425,
-0.03798035532236099,
0.02790449745953083,
0.16716592013835907,
0.17547520995140076,
0.0280692670494318,
0.1594904661178589,
0.15805841982364655,
0.05640554055571556,
-0.11910451948642731,
0.08030185103416443,
0.01370283029973507,
-0.0966215506196022,
0.14048509299755096,
0.023202285170555115,
-0.014494726434350014,
0.0415056049823761,
0.043154094368219376,
-0.16814780235290527,
0.00320456107147038,
-0.09082484990358353,
0.059570152312517166,
-0.08459680527448654,
-0.005431215278804302,
-0.07868906855583191,
0.10697680711746216,
0.11223134398460388,
-0.06432127207517624,
-0.027458811178803444,
-0.03650742024183273,
0.06050681695342064,
0.019881131127476692,
-0.08177990466356277,
-0.023009687662124634,
-0.11021562665700912,
0.07447172701358795,
-0.02803812362253666,
0.013657557778060436,
-0.20416028797626495,
-0.028891412541270256,
-0.013664279133081436,
-0.08079737424850464,
-0.042840003967285156,
0.06764035671949387,
0.0405300036072731,
0.02499525249004364,
-0.05060545727610588,
-0.059409547597169876,
-0.009315479546785355,
0.10298454761505127,
-0.08971668779850006,
-0.1482522040605545
] |
null | null |
transformers
|
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 432211280
- CO2 Emissions (in grams): 8.898145050355591
## Validation Metrics
- Loss: 0.12489336729049683
- Accuracy: 0.9520089285714286
- Precision: 0.9436443331246086
- Recall: 0.9747736093143596
- AUC: 0.9910066767410616
- F1: 0.958956411072224
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/anelnurkayeva/autonlp-covid-432211280
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("anelnurkayeva/autonlp-covid-432211280", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("anelnurkayeva/autonlp-covid-432211280", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
```
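Equivalently, a minimal sketch of calling the hosted Inference API from Python instead of cURL (the endpoint and header mirror the cURL command above; `YOUR_API_KEY` is a placeholder for your own token):
```
import requests

API_URL = "https://api-inference.huggingface.co/models/anelnurkayeva/autonlp-covid-432211280"
headers = {"Authorization": "Bearer YOUR_API_KEY"}

# Send the same JSON payload as the cURL example and print the returned labels/scores
response = requests.post(API_URL, headers=headers, json={"inputs": "I love AutoNLP"})
print(response.json())
```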
|
{"language": "en", "tags": "autonlp", "datasets": ["anelnurkayeva/autonlp-data-covid"], "widget": [{"text": "I love AutoNLP \ud83e\udd17"}], "co2_eq_emissions": 8.898145050355591}
|
text-classification
|
anelnurkayeva/autonlp-covid-432211280
|
[
"transformers",
"pytorch",
"roberta",
"text-classification",
"autonlp",
"en",
"dataset:anelnurkayeva/autonlp-data-covid",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #roberta #text-classification #autonlp #en #dataset-anelnurkayeva/autonlp-data-covid #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us
|
# Model Trained Using AutoNLP
- Problem type: Binary Classification
- Model ID: 432211280
- CO2 Emissions (in grams): 8.898145050355591
## Validation Metrics
- Loss: 0.12489336729049683
- Accuracy: 0.9520089285714286
- Precision: 0.9436443331246086
- Recall: 0.9747736093143596
- AUC: 0.9910066767410616
- F1: 0.958956411072224
## Usage
You can use cURL to access this model:
Or Python API:
|
[
"# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 432211280\n- CO2 Emissions (in grams): 8.898145050355591",
"## Validation Metrics\n\n- Loss: 0.12489336729049683\n- Accuracy: 0.9520089285714286\n- Precision: 0.9436443331246086\n- Recall: 0.9747736093143596\n- AUC: 0.9910066767410616\n- F1: 0.958956411072224",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
"TAGS\n#transformers #pytorch #roberta #text-classification #autonlp #en #dataset-anelnurkayeva/autonlp-data-covid #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n",
"# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 432211280\n- CO2 Emissions (in grams): 8.898145050355591",
"## Validation Metrics\n\n- Loss: 0.12489336729049683\n- Accuracy: 0.9520089285714286\n- Precision: 0.9436443331246086\n- Recall: 0.9747736093143596\n- AUC: 0.9910066767410616\n- F1: 0.958956411072224",
"## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
70,
42,
79,
17
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #text-classification #autonlp #en #dataset-anelnurkayeva/autonlp-data-covid #co2_eq_emissions #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoNLP\n\n- Problem type: Binary Classification\n- Model ID: 432211280\n- CO2 Emissions (in grams): 8.898145050355591## Validation Metrics\n\n- Loss: 0.12489336729049683\n- Accuracy: 0.9520089285714286\n- Precision: 0.9436443331246086\n- Recall: 0.9747736093143596\n- AUC: 0.9910066767410616\n- F1: 0.958956411072224## Usage\n\nYou can use cURL to access this model:\n\n\n\nOr Python API:"
] |
[
-0.16381917893886566,
0.1521633416414261,
-0.0009368985192850232,
0.08059846609830856,
0.12998417019844055,
0.026351192966103554,
0.05564902722835541,
0.09257683902978897,
0.018965618684887886,
0.06356372684240341,
0.16317637264728546,
0.19942522048950195,
0.026447521522641182,
0.1431794911623001,
-0.12009322643280029,
-0.16220317780971527,
0.061270713806152344,
0.07639306783676147,
0.1097090020775795,
0.12177339941263199,
0.09053833782672882,
-0.10720416903495789,
0.1266566514968872,
0.055147141218185425,
-0.16359955072402954,
-0.01915031112730503,
0.08367830514907837,
-0.12276535481214523,
0.1043589785695076,
0.08444498479366302,
0.15957540273666382,
0.025375839322805405,
0.10434890538454056,
-0.0793926939368248,
-0.024899525567889214,
-0.008273004554212093,
-0.021883144974708557,
0.10164616256952286,
0.05835408717393875,
-0.04558010771870613,
-0.0016354294493794441,
0.0046769967302680016,
0.08606185019016266,
0.03032154031097889,
-0.09168344736099243,
-0.05847412347793579,
-0.05983852967619896,
0.04624813422560692,
0.12094037979841232,
0.12014968693256378,
-0.0006233133608475327,
0.2776283323764801,
-0.09455205500125885,
0.08334751427173615,
0.03269415348768234,
-0.2687121629714966,
-0.021450944244861603,
0.12026333063840866,
-0.03816264867782593,
-0.11588678508996964,
-0.020943516865372658,
0.014632975682616234,
0.08958175778388977,
0.016590604558587074,
0.0564703531563282,
-0.05532506853342056,
-0.05899642035365105,
-0.004476901609450579,
-0.1120738685131073,
-0.07749694585800171,
0.1792588233947754,
0.022640012204647064,
-0.07710803300142288,
-0.0169970765709877,
-0.09636376053094864,
-0.12617573142051697,
-0.06835267692804337,
-0.05263981968164444,
-0.031411804258823395,
-0.050307098776102066,
-0.061006754636764526,
0.08711446076631546,
-0.1182173639535904,
-0.05398431047797203,
-0.17215068638324738,
0.11424862593412399,
0.015858223661780357,
0.04963534325361252,
-0.021805597469210625,
0.1139412522315979,
-0.09896548092365265,
-0.08359207957983017,
-0.002918739803135395,
-0.026993876323103905,
-0.05885166674852371,
-0.05007927492260933,
-0.027352040633559227,
0.03996783867478371,
-0.011433249339461327,
0.18819628655910492,
0.04909013584256172,
0.019391782581806183,
0.06786193698644638,
0.0025475947186350822,
-0.008991199545562267,
0.20240095257759094,
-0.1138211339712143,
-0.009946947917342186,
0.05788933113217354,
-0.053217608481645584,
0.022184982895851135,
-0.03419646993279457,
-0.09136723726987839,
-0.13200876116752625,
0.1547035127878189,
0.026756752282381058,
0.02229374088346958,
0.05834682285785675,
-0.07848731428384781,
-0.03667794540524483,
0.08508336544036865,
-0.044688768684864044,
0.02694428153336048,
-0.03245959058403969,
-0.08521973341703415,
0.06511770188808441,
0.09860237687826157,
0.0359121598303318,
-0.0788983479142189,
0.08234374225139618,
-0.121835857629776,
0.022506646811962128,
-0.046362899243831635,
-0.11315497010946274,
0.0415695458650589,
-0.09173093736171722,
0.04026942700147629,
-0.21494217216968536,
-0.1627638041973114,
-0.018834251910448074,
-0.00735465670004487,
-0.04847314581274986,
-0.035945646464824677,
-0.02974613383412361,
-0.026639804244041443,
0.04143407940864563,
-0.02241514064371586,
-0.027229899540543556,
-0.03860562667250633,
0.043784525245428085,
0.04050811380147934,
0.043226826936006546,
-0.10668237507343292,
0.037158481776714325,
-0.0989488884806633,
-0.005091649480164051,
-0.09799175709486008,
0.020468972623348236,
-0.014965576119720936,
0.026730971410870552,
-0.1350795328617096,
-0.07272537797689438,
0.10885223746299744,
-0.026368556544184685,
0.09373584389686584,
0.15017692744731903,
-0.061922285705804825,
-0.008744559250772,
0.055909156799316406,
-0.053387150168418884,
-0.10303931683301926,
0.10144301503896713,
-0.04310780391097069,
0.01763346791267395,
0.06348633021116257,
-0.01632346771657467,
0.13904398679733276,
-0.09792420268058777,
-0.04435142129659653,
0.006938806269317865,
-0.04160601645708084,
-0.13388697803020477,
0.058532536029815674,
0.015294915065169334,
-0.16224642097949982,
0.040553152561187744,
0.05491746589541435,
0.032568518072366714,
-0.07289064675569534,
-0.09022416174411774,
-0.056164734065532684,
-0.03617957606911659,
0.04577132314443588,
0.006837665569037199,
0.0796150341629982,
-0.025386393070220947,
-0.07461774349212646,
-0.021382778882980347,
0.13177813589572906,
-0.012404101900756359,
-0.015880990773439407,
-0.15065151453018188,
0.13046793639659882,
-0.18763549625873566,
-0.05478106811642647,
-0.19656123220920563,
-0.007978797890245914,
-0.017414333298802376,
0.02962139993906021,
-0.0266879852861166,
-0.031203728169202805,
0.051609452813863754,
0.03245061635971069,
0.021989837288856506,
-0.02131965383887291,
0.10393089801073074,
0.00632341206073761,
-0.13440516591072083,
-0.05841529741883278,
-0.018063699826598167,
-0.006139322649687529,
0.22975321114063263,
-0.11979249864816666,
-0.021977927535772324,
-0.031510695815086365,
0.0957607552409172,
-0.026080483570694923,
0.03027910180389881,
-0.02567581832408905,
0.059454161673784256,
-0.056247785687446594,
0.007040131837129593,
0.040374092757701874,
-0.02235136553645134,
-0.0945701003074646,
0.03523324057459831,
-0.1719469279050827,
0.2075246274471283,
0.158697247505188,
-0.06965596973896027,
-0.08971483260393143,
0.011902065016329288,
0.029724307358264923,
-0.013802731409668922,
-0.023222604766488075,
0.027024617418646812,
0.09887221455574036,
0.009549576789140701,
0.12951497733592987,
-0.0634850487112999,
-0.00628868630155921,
0.07638595998287201,
-0.08614734560251236,
-0.022175338119268417,
0.1562480330467224,
0.07254492491483688,
-0.18268349766731262,
0.09122245758771896,
0.03286204859614372,
-0.11708545684814453,
0.014310614205896854,
0.03992890566587448,
-0.05893026664853096,
-0.03914931043982506,
-0.06685812771320343,
0.02285517007112503,
0.06752407550811768,
-0.03289853036403656,
0.04104762151837349,
0.09678884595632553,
-0.019466793164610863,
0.010768860578536987,
-0.13011083006858826,
0.005062805023044348,
0.02364620938897133,
0.03010762482881546,
-0.06711463630199432,
0.026381295174360275,
0.04034578800201416,
0.131409153342247,
0.027330581098794937,
-0.13364453613758087,
0.04497033730149269,
0.039884503930807114,
-0.132025808095932,
0.24211610853672028,
-0.08778050541877747,
-0.212844118475914,
-0.1735229343175888,
-0.10877758264541626,
-0.04409058019518852,
0.001538658863864839,
0.025383293628692627,
-0.027964208275079727,
-0.11518596112728119,
-0.023432472720742226,
-0.07286018133163452,
-0.008251301944255829,
0.009309082292020321,
-0.035890404134988785,
-0.033480823040008545,
0.039506178349256516,
-0.07605357468128204,
-0.048821449279785156,
-0.03425760939717293,
-0.02681102231144905,
0.15657685697078705,
-0.05583847314119339,
0.12340274453163147,
0.17958326637744904,
-0.0504845455288887,
0.006559290457516909,
0.023912793025374413,
0.20259664952754974,
-0.02492699585855007,
-0.018479682505130768,
0.17943811416625977,
0.02143693156540394,
0.03190530464053154,
0.1298113465309143,
0.014999791979789734,
-0.0559576116502285,
-0.01546228863298893,
-0.03024527244269848,
-0.03876105323433876,
-0.18121811747550964,
-0.18420788645744324,
0.005947598721832037,
-0.00039661070331931114,
0.1398823857307434,
-0.00515766954049468,
0.11585789918899536,
0.1707974523305893,
0.011567514389753342,
0.0598122663795948,
-0.08986100554466248,
0.10678394138813019,
0.1837472766637802,
0.028846340253949165,
0.16573745012283325,
-0.05605265498161316,
-0.07681126147508621,
0.059862636029720306,
-0.037365637719631195,
0.08580858260393143,
0.03601511940360069,
-0.05156758427619934,
-0.02255316823720932,
0.1601090133190155,
0.07065080106258392,
0.14549686014652252,
0.07912825793027878,
-0.03741539642214775,
0.019908813759684563,
-0.03332861512899399,
-0.1209469735622406,
0.029746579006314278,
0.061885882169008255,
0.008447983302175999,
-0.13355255126953125,
-0.012281443923711777,
-0.007801097352057695,
0.06065748631954193,
0.17495138943195343,
-0.48808741569519043,
-0.10300018638372421,
-0.007190281990915537,
-0.023573553189635277,
-0.13288846611976624,
-0.01838047057390213,
-0.1062714159488678,
-0.17571160197257996,
0.014728689566254616,
-0.0343390554189682,
0.10708999633789062,
-0.051113665103912354,
-0.004951613023877144,
-0.1009378433227539,
0.0116537194699049,
-0.015906421467661858,
0.09260417520999908,
-0.2326086163520813,
0.25362586975097656,
0.04750877618789673,
0.036347731947898865,
-0.08957742899656296,
-0.011238272301852703,
0.008002247661352158,
0.09205464273691177,
0.12034282833337784,
-0.003777767764404416,
0.06115517392754555,
-0.2891516387462616,
-0.1566808670759201,
0.04145337641239166,
-0.036681726574897766,
0.01444634422659874,
0.09448433667421341,
0.007497353013604879,
-0.02321620099246502,
0.001687521580606699,
-0.046854402869939804,
-0.07641351968050003,
-0.04795321822166443,
0.03990296274423599,
0.09962566196918488,
-0.016104865819215775,
0.01199195347726345,
-0.09023252874612808,
-0.03451451659202576,
0.11967414617538452,
-0.026851560920476913,
-0.0737612321972847,
-0.12975731492042542,
0.02352888695895672,
0.1292334794998169,
-0.12766005098819733,
0.08446705341339111,
-0.055138614028692245,
0.04923304170370102,
-0.0037301077973097563,
-0.11517815291881561,
0.10023228824138641,
-0.0791131928563118,
-0.05124226212501526,
0.016774123534560204,
0.0714396983385086,
0.01432629395276308,
0.03828665614128113,
0.07653162628412247,
0.03989265114068985,
-0.09956599026918411,
-0.10822448879480362,
-0.002554743317887187,
0.05382736027240753,
0.14563000202178955,
0.06424173712730408,
0.0387641079723835,
-0.14838533103466034,
-0.06920957565307617,
0.06580909341573715,
0.1704954355955124,
0.19126662611961365,
-0.09146186709403992,
-0.029289022088050842,
0.1398629993200302,
0.00508460495620966,
-0.21480686962604523,
-0.026379326358437538,
-0.0040460433810949326,
0.06288012117147446,
-0.12477302551269531,
-0.03785547986626625,
0.08991498500108719,
0.08530999720096588,
-0.044928986579179764,
-0.0227804034948349,
-0.1934574544429779,
-0.1254984438419342,
0.29600557684898376,
0.05810174718499184,
0.19740250706672668,
-0.05151017755270004,
-0.01644882559776306,
-0.10609021782875061,
-0.28246402740478516,
0.13775324821472168,
0.022341012954711914,
0.08572688698768616,
-0.061347417533397675,
0.14048154652118683,
0.056752316653728485,
-0.07077977806329727,
0.14296087622642517,
0.0009732143371365964,
0.023966018110513687,
-0.043072327971458435,
-0.07216553390026093,
-0.05101097747683525,
-0.06748028099536896,
0.14751122891902924,
0.07416503876447678,
0.07347730547189713,
-0.18221332132816315,
-0.04534376412630081,
-0.03758818656206131,
0.1008274033665657,
-0.014446955174207687,
-0.06757055222988129,
-0.0193558968603611,
-0.009148950688540936,
-0.020210353657603264,
-0.06504961848258972,
0.02349076233804226,
0.002616006648167968,
0.03834538906812668,
0.14520315825939178,
0.1168651282787323,
-0.06987441331148148,
-0.020830243825912476,
0.026018651202321053,
-0.08813882619142532,
0.10070040822029114,
-0.1320122331380844,
0.06860120594501495,
0.13442650437355042,
-0.01277910452336073,
0.07715275883674622,
0.03841591626405716,
-0.058626431971788406,
-0.018881535157561302,
0.05517389252781868,
-0.160154789686203,
0.07981832325458527,
-0.000298161874525249,
0.009641467593610287,
-0.02524203062057495,
0.06328167766332626,
0.13824118673801422,
-0.07044083625078201,
-0.044539887458086014,
0.00738187599927187,
-0.015589823946356773,
-0.018022550269961357,
0.22390295565128326,
0.049833305180072784,
0.05837102606892586,
-0.12759752571582794,
0.04003255069255829,
0.03973967954516411,
-0.058273863047361374,
0.029119204729795456,
-0.039404258131980896,
-0.13577648997306824,
-0.09908557683229446,
-0.030914440751075745,
0.13003207743167877,
-0.29961344599723816,
-0.07446656376123428,
-0.03485828638076782,
-0.07049078494310379,
0.06506827473640442,
0.19808267056941986,
0.11904903501272202,
0.03568382188677788,
-0.029588529840111732,
-0.11023970693349838,
-0.1369076520204544,
0.006751323584467173,
0.13935476541519165,
0.04680730402469635,
-0.13434408605098724,
0.15106748044490814,
-0.03273846209049225,
0.08455602079629898,
-0.04336491972208023,
-0.01003278885036707,
-0.14466796815395355,
0.02000856027007103,
-0.15751948952674866,
0.025342000648379326,
-0.08438339084386826,
0.014086627401411533,
0.009117784909904003,
-0.029682133346796036,
-0.06238808482885361,
0.017027074471116066,
-0.06956175714731216,
-0.00868729967623949,
0.02569357119500637,
0.011040223762392998,
-0.06645192205905914,
-0.05443143844604492,
0.055141616612672806,
-0.013444457203149796,
0.059358954429626465,
0.15065868198871613,
0.04421994090080261,
0.06507266312837601,
-0.08567606657743454,
-0.00987659115344286,
0.10548463463783264,
0.026946792379021645,
0.1195228099822998,
-0.14580225944519043,
0.07182289659976959,
0.05830498784780502,
0.024585356935858727,
0.05290275812149048,
0.11801747232675552,
-0.10994037985801697,
-0.007086510770022869,
-0.05571635812520981,
-0.07907071709632874,
-0.12558303773403168,
0.019804026931524277,
0.09733504056930542,
0.05553179606795311,
0.10974705964326859,
-0.04749652370810509,
0.056582652032375336,
-0.11322911083698273,
0.0047056409530341625,
-0.08280424028635025,
-0.06691767275333405,
-0.04079873487353325,
-0.05115688592195511,
0.06336252391338348,
-0.01679244637489319,
0.07561216503381729,
-0.026876574382185936,
0.10437789559364319,
-0.009782949462532997,
0.07918042689561844,
0.03041304089128971,
-0.017608368769288063,
0.1331101655960083,
0.11419478058815002,
-0.015277067199349403,
0.029546866193413734,
0.1295439898967743,
0.08374843746423721,
-0.004839922767132521,
0.010240022093057632,
0.0022958884947001934,
-0.02240883931517601,
0.14341719448566437,
0.0028272138442844152,
-0.06927943974733353,
-0.04183010011911392,
-0.09159459173679352,
-0.13715553283691406,
0.032736822962760925,
0.017801938578486443,
0.051120638847351074,
0.11600854992866516,
-0.07046829909086227,
-0.024618735536932945,
-0.0481274276971817,
-0.07088258862495422,
-0.1791909784078598,
-0.09101080894470215,
-0.14180275797843933,
-0.06172693148255348,
-0.004161984194070101,
-0.07596102356910706,
-0.05297406017780304,
0.10190519690513611,
0.034281227737665176,
-0.04504281282424927,
0.06780993193387985,
-0.08432185649871826,
-0.012876027263700962,
-0.005293369293212891,
0.015451141633093357,
0.016206413507461548,
-0.023416096344590187,
-0.005619159899652004,
-0.015481541864573956,
0.018996190279722214,
0.0533599853515625,
-0.007318508345633745,
0.02438005805015564,
0.10394671559333801,
0.0035802789498120546,
-0.09717851132154465,
-0.03830404952168465,
0.04802671819925308,
0.07212549448013306,
0.07629916816949844,
0.021299364045262337,
0.051839474588632584,
-0.01862294226884842,
0.22441962361335754,
-0.08161066472530365,
-0.011388103477656841,
-0.13475348055362701,
0.32211413979530334,
0.01550955418497324,
0.0462164431810379,
0.041917361319065094,
-0.05172627791762352,
0.024338373914361,
0.20087480545043945,
0.11716970056295395,
-0.01588103175163269,
0.005534637253731489,
-0.006750909145921469,
-0.014986539259552956,
-0.016218727454543114,
0.0366809107363224,
0.06659594178199768,
0.17876240611076355,
-0.10685542225837708,
0.010787221603095531,
-0.024904770776629448,
-0.002850305289030075,
-0.030294107273221016,
0.026698565110564232,
-0.017735304310917854,
-0.0340469665825367,
-0.05384768545627594,
0.08169877529144287,
-0.08116816729307175,
0.0918121263384819,
0.06779258698225021,
-0.10018269717693329,
-0.13006368279457092,
0.028607141226530075,
-0.053486935794353485,
-0.011939866468310356,
0.11935137957334518,
-0.0952271968126297,
-0.021171867847442627,
0.0398196280002594,
0.019423363730311394,
-0.16089263558387756,
-0.06442026793956757,
0.03825115039944649,
0.1940261423587799,
0.18392565846443176,
0.024765245616436005,
0.17457807064056396,
0.1473865807056427,
0.056372273713350296,
-0.12298567593097687,
0.1001100242137909,
0.019287772476673126,
-0.079874686896801,
0.13599957525730133,
0.0036751271691173315,
0.0035608878824859858,
0.015600628219544888,
0.03831096366047859,
-0.17861002683639526,
0.013129239901900291,
-0.09447114169597626,
0.06889183819293976,
-0.07277428358793259,
0.016050420701503754,
-0.07055778056383133,
0.11227977275848389,
0.11020778119564056,
-0.06036730483174324,
-0.037159357219934464,
-0.04235037788748741,
0.07320336252450943,
0.025771742686629295,
-0.11237536370754242,
-0.02480119839310646,
-0.11220186948776245,
0.07513079792261124,
-0.05090945214033127,
0.01731164939701557,
-0.21180568635463715,
-0.01651906594634056,
-0.030897067859768867,
-0.08907699584960938,
-0.03854057192802429,
0.08081313222646713,
0.008166279643774033,
0.031500499695539474,
-0.04028630629181862,
-0.04585794359445572,
-0.008131120353937149,
0.10506606101989746,
-0.08555722236633301,
-0.16146671772003174
] |
null | null |
transformers
|
# BERT for Patents
BERT for Patents is a model trained by Google on 100M+ patents (not just US patents). It is based on BERT<sub>LARGE</sub>.
If you want to learn more about the model, check out the [blog post](https://cloud.google.com/blog/products/ai-machine-learning/how-ai-improves-patent-analysis), [white paper](https://services.google.com/fh/files/blogs/bert_for_patents_white_paper.pdf) and [GitHub page](https://github.com/google/patents-public-data/blob/master/models/BERT%20for%20Patents.md) containing the original TensorFlow checkpoint.
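As a quick illustration of masked-token prediction with this checkpoint (a minimal sketch using the standard `transformers` fill-mask pipeline; the example sentence is patent-style filler, not taken from any specific patent):
```
from transformers import pipeline

# Load the model and tokenizer from the Hugging Face Hub
fill_mask = pipeline("fill-mask", model="anferico/bert-for-patents")

# Predict the most likely tokens for the [MASK] position
predictions = fill_mask("The present [MASK] provides a torque sensor that is small and highly rigid.")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```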
---
### Projects using this model (or variants of it):
- [Patents4IPPC](https://github.com/ec-jrc/Patents4IPPC) (carried out by [Pi School](https://picampus-school.com/) and commissioned by the [Joint Research Centre (JRC)](https://ec.europa.eu/jrc/en) of the European Commission)
|
{"language": ["en"], "license": "apache-2.0", "tags": ["masked-lm", "pytorch"], "metrics": ["perplexity"], "pipeline-tag": "fill-mask", "mask-token": "[MASK]", "widget": [{"text": "The present [MASK] provides a torque sensor that is small and highly rigid and for which high production efficiency is possible."}, {"text": "The present invention relates to [MASK] accessories and pertains particularly to a brake light unit for bicycles."}, {"text": "The present invention discloses a space-bound-free [MASK] and its coordinate determining circuit for determining a coordinate of a stylus pen."}, {"text": "The illuminated [MASK] includes a substantially translucent canopy supported by a plurality of ribs pivotally swingable towards and away from a shaft."}]}
|
fill-mask
|
anferico/bert-for-patents
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"fill-mask",
"masked-lm",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #safetensors #fill-mask #masked-lm #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us
|
# BERT for Patents
BERT for Patents is a model trained by Google on 100M+ patents (not just US patents). It is based on BERT<sub>LARGE</sub>.
If you want to learn more about the model, check out the blog post, white paper and GitHub page containing the original TensorFlow checkpoint.
---
### Projects using this model (or variants of it):
- Patents4IPPC (carried out by Pi School and commissioned by the Joint Research Centre (JRC) of the European Commission)
|
[
"# BERT for Patents\n\nBERT for Patents is a model trained by Google on 100M+ patents (not just US patents). It is based on BERT<sub>LARGE</sub>.\n\nIf you want to learn more about the model, check out the blog post, white paper and GitHub page containing the original TensorFlow checkpoint.\n\n---",
"### Projects using this model (or variants of it):\n- Patents4IPPC (carried out by Pi School and commissioned by the Joint Research Centre (JRC) of the European Commission)"
] |
[
"TAGS\n#transformers #pytorch #tf #safetensors #fill-mask #masked-lm #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n",
"# BERT for Patents\n\nBERT for Patents is a model trained by Google on 100M+ patents (not just US patents). It is based on BERT<sub>LARGE</sub>.\n\nIf you want to learn more about the model, check out the blog post, white paper and GitHub page containing the original TensorFlow checkpoint.\n\n---",
"### Projects using this model (or variants of it):\n- Patents4IPPC (carried out by Pi School and commissioned by the Joint Research Centre (JRC) of the European Commission)"
] |
[
61,
81,
45
] |
[
"passage: TAGS\n#transformers #pytorch #tf #safetensors #fill-mask #masked-lm #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# BERT for Patents\n\nBERT for Patents is a model trained by Google on 100M+ patents (not just US patents). It is based on BERT<sub>LARGE</sub>.\n\nIf you want to learn more about the model, check out the blog post, white paper and GitHub page containing the original TensorFlow checkpoint.\n\n---### Projects using this model (or variants of it):\n- Patents4IPPC (carried out by Pi School and commissioned by the Joint Research Centre (JRC) of the European Commission)"
] |
[
0.00972627941519022,
0.07630278915166855,
-0.00027916941326111555,
0.08849065750837326,
0.02344997599720955,
-0.08142474293708801,
0.22104212641716003,
0.023340245708823204,
0.1373320370912552,
-0.05651729553937912,
0.14177000522613525,
0.058000653982162476,
-0.033669959753751755,
0.1642535775899887,
0.0046648504212498665,
-0.21260268986225128,
0.021847007796168327,
0.059619467705488205,
-0.0825466439127922,
0.06225740909576416,
0.08262655138969421,
-0.10403218865394592,
0.05601562559604645,
0.07653132826089859,
-0.04436325654387474,
0.01237813662737608,
0.02548452652990818,
-0.11643868684768677,
0.08966857194900513,
-0.04781899228692055,
0.15595409274101257,
0.08377831429243088,
0.003595057176426053,
0.01773134618997574,
0.032249484211206436,
-0.04028027877211571,
-0.004869228228926659,
0.09792740643024445,
0.12955322861671448,
0.023016778752207756,
0.041878122836351395,
0.1381434053182602,
-0.011963987722992897,
0.022960877045989037,
-0.0630110576748848,
-0.10749544203281403,
-0.08730141818523407,
0.09281589835882187,
-0.0455983467400074,
0.041904281824827194,
0.007981427945196629,
0.160924032330513,
0.03962855413556099,
0.09427894651889801,
0.291929692029953,
-0.45697659254074097,
-0.03433522582054138,
0.11326117068529129,
0.17910981178283691,
-0.13604775071144104,
0.033146027475595474,
0.09084420651197433,
0.015222679823637009,
0.03879503533244133,
0.18390309810638428,
-0.1019287183880806,
-0.08712513744831085,
-0.0389762818813324,
-0.11978533864021301,
0.006684047169983387,
0.22989358007907867,
-0.00876178964972496,
-0.08259738981723785,
-0.005624748300760984,
-0.019546668976545334,
-0.01261813286691904,
0.01676701009273529,
-0.03233833238482475,
0.08845086395740509,
0.003435663180425763,
-0.06739043444395065,
-0.12661363184452057,
-0.10692688077688217,
0.0007906855898909271,
-0.10494226217269897,
0.1987665444612503,
0.007529248483479023,
0.05478260666131973,
-0.10205260664224625,
0.03762717545032501,
0.0052354540675878525,
-0.1120712086558342,
0.03658493608236313,
-0.04729925096035004,
0.13217265903949738,
-0.001977526815608144,
-0.030256075784564018,
-0.08723736554384232,
0.06304430961608887,
0.06334513425827026,
0.02606436237692833,
-0.050469689071178436,
-0.011372916400432587,
0.11817652732133865,
0.0006119685131125152,
0.13611085712909698,
-0.14783939719200134,
0.01716855727136135,
0.10499332845211029,
-0.09437695890665054,
0.012361875735223293,
-0.012199455872178078,
-0.17470550537109375,
0.04077886417508125,
-0.020388757809996605,
-0.026167597621679306,
0.00834012869745493,
0.08011800050735474,
-0.08422967046499252,
0.02403804287314415,
0.009405133314430714,
-0.01634763926267624,
0.021048424765467644,
-0.0914623886346817,
0.015999644994735718,
0.01111450232565403,
0.09781942516565323,
0.016980553045868874,
-0.03010784462094307,
0.033242106437683105,
-0.07844285666942596,
-0.04921095818281174,
-0.0628458559513092,
-0.06763884425163269,
0.06292533874511719,
-0.09992418438196182,
0.06790252029895782,
-0.2020295262336731,
-0.13199517130851746,
0.04344921559095383,
0.12029977142810822,
-0.02603228949010372,
0.006965640466660261,
0.06133965402841568,
-0.06883910298347473,
-0.09383083134889603,
-0.018013333901762962,
-0.04954851418733597,
-0.0406915582716465,
0.059094127267599106,
-0.09011803567409515,
0.11574015021324158,
-0.13509365916252136,
0.03466591611504555,
-0.1583825796842575,
0.07240205258131027,
-0.1935891956090927,
-0.06306710839271545,
-0.04694955796003342,
0.009162230417132378,
-0.03956754878163338,
-0.09805317968130112,
0.013342723250389099,
0.03043392300605774,
0.05188120901584625,
0.08237481117248535,
0.015118932351469994,
-0.031016264110803604,
0.09384121000766754,
-0.058736659586429596,
-0.17189715802669525,
0.09558195620775223,
-0.024888228625059128,
0.17376482486724854,
0.041723571717739105,
0.20682761073112488,
-0.06400670856237411,
-0.19408756494522095,
-0.03352538123726845,
0.017968883737921715,
-0.06207198649644852,
-0.1471984088420868,
0.07229924947023392,
0.00903791282325983,
-0.20029982924461365,
0.01809491217136383,
-0.09273801743984222,
0.030609773471951485,
-0.049668483436107635,
-0.009374636225402355,
0.0005843183607794344,
-0.04370046779513359,
0.11090455204248428,
-0.020151279866695404,
0.11767896264791489,
-0.054771192371845245,
0.010369215160608292,
0.05520343780517578,
-0.0631905198097229,
-0.055276528000831604,
0.0014659517910331488,
-0.06058073416352272,
0.17531174421310425,
-0.1092788428068161,
0.0007688066107220948,
-0.12384888529777527,
-0.0668933242559433,
0.035026468336582184,
-0.1276770979166031,
0.05297807231545448,
0.17630119621753693,
0.03502119705080986,
0.05327148362994194,
-0.001581132528372109,
0.07311618328094482,
-0.06681603193283081,
0.03332040086388588,
-0.04048953950405121,
-0.07615723460912704,
-0.03990110754966736,
-0.05679189786314964,
-0.11842571943998337,
0.06872015446424484,
0.01194017194211483,
-0.06661050021648407,
0.0706801638007164,
0.07465006411075592,
-0.027331404387950897,
0.042399801313877106,
0.05268154665827751,
-0.017814844846725464,
0.010613701306283474,
0.061065804213285446,
0.03636687248945236,
-0.03748014196753502,
0.12726275622844696,
0.0762040913105011,
0.24173371493816376,
0.11553972959518433,
-0.1587526649236679,
-0.07548552006483078,
0.004758047871291637,
-0.036363959312438965,
0.058352574706077576,
-0.06851735711097717,
0.005875876173377037,
0.12516722083091736,
-0.016405995935201645,
0.12118105590343475,
-0.10687896609306335,
-0.026910414919257164,
0.057049110531806946,
-0.14613111317157745,
-0.02534620091319084,
0.07197742164134979,
0.20079897344112396,
-0.04420746862888336,
0.12611016631126404,
0.21580012142658234,
-0.029386868700385094,
0.14957860112190247,
-0.004097647964954376,
-0.016706589609384537,
-0.07031062990427017,
-0.07634237408638,
0.007016930263489485,
0.15914900600910187,
-0.16608357429504395,
0.009647149592638016,
0.08747821301221848,
-0.07413937151432037,
0.039819877594709396,
-0.13137292861938477,
-0.05928308144211769,
-0.01924801431596279,
-0.006519733462482691,
-0.13226686418056488,
0.050356604158878326,
-0.08232942968606949,
0.07831041514873505,
0.03995416313409805,
-0.11173304170370102,
0.07019974291324615,
0.0158288162201643,
-0.03915773332118988,
0.12292329967021942,
-0.042264532297849655,
-0.19852329790592194,
-0.18616452813148499,
-0.03781909868121147,
-0.03254217281937599,
0.10194317996501923,
0.035511408001184464,
0.08933907747268677,
-0.0949706956744194,
-0.044205375015735626,
0.00040658272337168455,
-0.008753838948905468,
0.030676962807774544,
-0.014189621433615685,
0.013995221816003323,
-0.021471437066793442,
-0.06946776807308197,
-0.04932093992829323,
-0.022230977192521095,
0.02569965086877346,
0.08561147749423981,
-0.007783445529639721,
0.1059521958231926,
0.054312847554683685,
-0.09376855939626694,
-0.021031299605965614,
-0.042057495564222336,
0.10493913292884827,
-0.03626763075590134,
0.011105584912002087,
0.16184251010417938,
0.028547942638397217,
0.036531779915094376,
0.1253415197134018,
0.07669802010059357,
-0.0834265798330307,
0.04274410381913185,
-0.12227284163236618,
-0.11241210252046585,
-0.17806756496429443,
-0.01398845948278904,
-0.043460119515657425,
0.09273923933506012,
0.009481513872742653,
-0.018009545281529427,
-0.012693261727690697,
0.11148664355278015,
0.08416400849819183,
0.08299241214990616,
-0.04873465374112129,
0.049625422805547714,
0.1027769073843956,
-0.02680615335702896,
0.10593321919441223,
-0.013404474593698978,
-0.10802146792411804,
0.043312836438417435,
0.06386969983577728,
0.1705617755651474,
0.05621090531349182,
0.033299315720796585,
0.08782543241977692,
0.13981808722019196,
0.0746307224035263,
0.18899236619472504,
0.04514097794890404,
-0.00440019927918911,
-0.03519798442721367,
-0.09657339006662369,
0.03968748450279236,
0.0010983471293002367,
-0.09421940892934799,
-0.10685869306325912,
0.0463283397257328,
-0.13401968777179718,
0.017900459468364716,
0.22816039621829987,
0.02741364948451519,
-0.16915781795978546,
-0.011974388733506203,
-0.0023512498009949923,
-0.027025293558835983,
-0.022039134055376053,
0.06524743139743805,
0.12780527770519257,
-0.1023586094379425,
0.03814377635717392,
-0.04383715242147446,
0.02819077856838703,
0.024607306346297264,
0.047224365174770355,
0.03596581518650055,
-0.04827209934592247,
0.02418006770312786,
0.04432941600680351,
-0.17758145928382874,
0.2717016041278839,
-0.01134288590401411,
-0.004195013083517551,
-0.05972437560558319,
-0.02274598740041256,
-0.018514085561037064,
0.044195808470249176,
0.18286588788032532,
-0.007756057661026716,
0.10240491479635239,
-0.1214628741145134,
-0.06850941479206085,
0.02227744832634926,
0.0034780509304255247,
-0.06270340830087662,
0.040757548063993454,
0.01466425135731697,
-0.08195048570632935,
-0.03139299154281616,
-0.03298329561948776,
0.0008805265533737838,
-0.014213920570909977,
0.055325839668512344,
-0.014950159937143326,
0.03427576646208763,
-0.0494166724383831,
-0.07612834870815277,
-0.12775814533233643,
0.06226015463471413,
-0.06631860882043839,
-0.0990329310297966,
-0.07837579399347305,
-0.048517487943172455,
0.05551556125283241,
-0.03361215069890022,
0.12366421520709991,
-0.038672175258398056,
-0.010118012316524982,
-0.024543285369873047,
-0.14381395280361176,
0.09321524202823639,
-0.11642727255821228,
-0.09873805940151215,
-0.025155125185847282,
0.030848728492856026,
-0.05876602232456207,
0.03410959243774414,
0.031098317354917526,
0.021955477073788643,
-0.11297791451215744,
-0.08764585107564926,
0.015255481004714966,
-0.1157594695687294,
0.11074088513851166,
-0.033467166125774384,
-0.04141904041171074,
0.06900740414857864,
0.04168664664030075,
0.011975479312241077,
0.03746770694851875,
0.11005832999944687,
-0.08870275318622589,
0.09976030886173248,
0.20612606406211853,
-0.0089775575324893,
-0.3188249170780182,
-0.041444290429353714,
-0.014665967784821987,
-0.014024070464074612,
0.06593629717826843,
-0.08985918760299683,
0.14118170738220215,
0.05922074615955353,
-0.0720958486199379,
0.05640228092670441,
-0.13448965549468994,
-0.12949903309345245,
0.09339519590139389,
0.08502782881259918,
0.26316291093826294,
-0.09994935244321823,
-0.02706286869943142,
0.008573844097554684,
-0.09722922742366791,
0.12115538865327835,
-0.09282267838716507,
0.07870348542928696,
0.0046542552299797535,
-0.016008026897907257,
-0.00635865842923522,
-0.0983307957649231,
0.03758162260055542,
-0.0689140185713768,
0.050782013684511185,
-0.0011825596448034048,
-0.029825562611222267,
0.04661468043923378,
0.05465176701545715,
0.15766581892967224,
0.10006503015756607,
0.03464777022600174,
0.09559860080480576,
-0.08133476227521896,
-0.07264772057533264,
0.180280864238739,
0.01837136410176754,
-0.12366575747728348,
-0.05394785478711128,
0.009700043126940727,
-0.09189030528068542,
-0.01745232194662094,
0.1777370721101761,
0.013849192298948765,
0.021499156951904297,
0.1810505986213684,
0.09757452458143234,
-0.1334386020898819,
-0.03348766267299652,
0.03556424751877785,
-0.12214948982000351,
0.08594151586294174,
-0.02883366122841835,
-0.09055671840906143,
0.06948669999837875,
-0.011926032602787018,
0.031451914459466934,
0.06795740127563477,
-0.04777195677161217,
0.03557533770799637,
0.1088547483086586,
-0.21465849876403809,
-0.177581325173378,
-0.06036039814352989,
-0.022632086649537086,
0.04286143183708191,
0.12892545759677887,
0.16530567407608032,
-0.1425265371799469,
-0.02578926831483841,
0.0010895838495343924,
-0.017708342522382736,
-0.0381397120654583,
0.033896248787641525,
0.06796596944332123,
0.013216452673077583,
-0.09090015292167664,
0.05202842503786087,
0.05765936151146889,
0.004808762110769749,
0.017547158524394035,
-0.07258982211351395,
-0.0709744542837143,
-0.08101353794336319,
-0.018795479089021683,
0.1382424235343933,
-0.05514528229832649,
-0.035837169736623764,
-0.0886433944106102,
-0.08772158622741699,
-0.0025869563687592745,
0.15374089777469635,
0.0531255304813385,
0.031107408925890923,
-0.02334209531545639,
-0.01348962914198637,
0.0065006990917027,
0.029044808819890022,
-0.03734259307384491,
0.06382497400045395,
-0.11674394458532333,
0.08426373451948166,
-0.038166504353284836,
0.08357162773609161,
-0.11703336983919144,
0.06332453340291977,
-0.15249408781528473,
-0.026387469843029976,
0.018037941306829453,
-0.020588798448443413,
-0.1333865076303482,
-0.07520470023155212,
-0.017732640728354454,
-0.04220176115632057,
-0.020259402692317963,
0.04482853040099144,
-0.1169961541891098,
0.029344161972403526,
0.035111576318740845,
0.01607651077210903,
-0.074787937104702,
-0.011945836246013641,
0.11828651279211044,
-0.027734221890568733,
0.0833582654595375,
-0.011841966770589352,
-0.02406008541584015,
0.03039715252816677,
-0.1475827842950821,
0.007432359270751476,
0.039250101894140244,
0.02338944561779499,
0.01834017224609852,
-0.03619049862027168,
0.004526323173195124,
-0.00416462030261755,
0.04542429745197296,
0.013403434306383133,
0.03621421009302139,
-0.15643420815467834,
-0.04751012101769447,
0.023657530546188354,
0.012601408176124096,
-0.04597080126404762,
-0.03121037222445011,
0.09095925837755203,
0.05822950601577759,
0.0880407840013504,
-0.04075150191783905,
0.03339046612381935,
-0.003844813909381628,
0.021289819851517677,
-0.003628546604886651,
-0.1530108004808426,
-0.08474350720643997,
-0.06884638220071793,
-0.011590463109314442,
-0.014064449816942215,
0.20372523367404938,
0.16528519988059998,
-0.07309519499540329,
-0.021426113322377205,
0.05745275318622589,
0.10162507742643356,
0.005854785908013582,
0.025400949642062187,
-0.029425520449876785,
-0.019715452566742897,
-0.14022867381572723,
0.0894145593047142,
0.03328343853354454,
-0.052218031138181686,
0.14545322954654694,
0.05947060510516167,
-0.09358865022659302,
0.02318219467997551,
0.07458018511533737,
0.09267701953649521,
-0.15834037959575653,
-0.1451951563358307,
0.047240324318408966,
0.17035438120365143,
-0.09216462075710297,
0.06090607866644859,
-0.003895198693498969,
-0.07431428134441376,
0.07813054323196411,
0.07175388932228088,
-0.005467575043439865,
-0.08316584676504135,
-0.14387618005275726,
-0.029140733182430267,
-0.04007917270064354,
0.015539612621068954,
-0.07948444038629532,
-0.023769672960042953,
0.11035193502902985,
0.0027440048288553953,
-0.05838790535926819,
0.12050474435091019,
-0.17298316955566406,
0.006114340852946043,
0.10204828530550003,
0.02795289270579815,
-0.028962455689907074,
-0.09314523637294769,
-0.0627879649400711,
-0.13575051724910736,
0.0701163113117218,
-0.03405815735459328,
0.040294814854860306,
-0.04766562953591347,
-0.09002252668142319,
-0.027360055595636368,
-0.04913059622049332,
-0.05981051176786423,
0.04002536088228226,
0.05530403554439545,
0.0341399610042572,
-0.024793829768896103,
-0.05504513904452324,
0.025116998702287674,
0.20006854832172394,
-0.04277648404240608,
0.004116739612072706,
-0.13058441877365112,
0.04378996044397354,
-0.04366664960980415,
-0.0028209059964865446,
0.01657641865313053,
-0.03143942356109619,
-0.048151835799217224,
0.3913959562778473,
0.20076197385787964,
-0.006693971808999777,
0.02629043720662594,
0.013143106363713741,
0.008610264398157597,
0.02373616024851799,
0.11705294251441956,
0.0353192538022995,
0.3020641803741455,
-0.04994184523820877,
-0.020827753469347954,
-0.05280124768614769,
0.006270000711083412,
-0.1104588583111763,
-0.017989356070756912,
0.036978378891944885,
-0.1353578120470047,
-0.08827413618564606,
0.04114561900496483,
-0.1068606749176979,
-0.21331465244293213,
0.018884968012571335,
-0.08899412304162979,
-0.020772581920027733,
-0.014202402904629707,
0.0009399037808179855,
-0.013559205457568169,
0.09812512993812561,
-0.05002634972333908,
0.04144845902919769,
0.061759330332279205,
0.04512813314795494,
-0.1289309710264206,
0.11262283474206924,
0.1079307347536087,
0.030073845759034157,
0.153629869222641,
0.009154900908470154,
0.050410982221364975,
0.05866609886288643,
0.029599543660879135,
-0.08242477476596832,
0.04453897848725319,
0.035543251782655716,
-0.06403534859418869,
0.003745968919247389,
-0.05232721194624901,
0.006120557431131601,
-0.11395708471536636,
-0.026115750893950462,
0.008734374307096004,
0.018480436876416206,
-0.012201395817101002,
0.02334519475698471,
-0.06372614949941635,
0.038182299584150314,
-0.09384845197200775,
0.08159389346837997,
0.08572635799646378,
-0.056026287376880646,
-0.04249238967895508,
-0.05471748113632202,
0.06548593193292618,
0.03403932601213455,
-0.19437003135681152,
-0.013191357254981995,
-0.012193849310278893,
-0.03886422514915466,
-0.061516083776950836,
-0.05919203907251358,
-0.21384435892105103,
0.03377857431769371,
-0.061419274657964706,
0.05176706239581108,
-0.11749071627855301,
-0.08588949590921402,
0.16304059326648712,
-0.005808728747069836,
-0.003074161009863019,
0.1590987592935562,
-0.02683972381055355,
-0.024642281234264374,
-0.08123871684074402,
-0.07169096916913986
] |
null | null |
transformers
|
# Monke Messenger DialoGPT Model
|
{"tags": ["conversational"]}
|
text-generation
|
ange/DialoGPT-medium-Monke
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# Monke Messenger DialoGPT Model
|
[] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
51
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
null | null |
transformers
|
# Wav2Vec2-Large-XLSR-53-Turkish
Fine-tuned [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on Turkish using the [Common Voice](https://huggingface.co/datasets/common_voice).
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
```python
import torch
import torchaudio
from datasets import load_dataset
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
from unicode_tr import unicode_tr
test_dataset = load_dataset("common_voice", "tr", split="test[:2%]")
processor = Wav2Vec2Processor.from_pretrained("aniltrkkn/wav2vec2-large-xlsr-53-turkish")
model = Wav2Vec2ForCTC.from_pretrained("aniltrkkn/wav2vec2-large-xlsr-53-turkish")
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
\tspeech_array, sampling_rate = torchaudio.load(batch["path"])
\tbatch["speech"] = resampler(speech_array).squeeze().numpy()
\treturn batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
inputs = processor(test_dataset["speech"][:2], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
\tlogits = model(inputs.input_values, attention_mask=inputs.attention_mask).logits
predicted_ids = torch.argmax(logits, dim=-1)
print("Prediction:", processor.batch_decode(predicted_ids))
print("Reference:", test_dataset["sentence"][:2])
```
## Evaluation
The model can be evaluated as follows on the Turkish test data of Common Voice.
```python
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
import re
from unicode_tr import unicode_tr
test_dataset = load_dataset("common_voice", "tr", split="test")
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("aniltrkkn/wav2vec2-large-xlsr-53-turkish")
model = Wav2Vec2ForCTC.from_pretrained("aniltrkkn/wav2vec2-large-xlsr-53-turkish")
model.to("cuda")
chars_to_ignore_regex = '[\\,\\?\\.\\!\\-\\;\\:\\"\\“]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
\tbatch["sentence"] = str(unicode_tr(re.sub(chars_to_ignore_regex, "", batch["sentence"])).lower())
\tspeech_array, sampling_rate = torchaudio.load(batch["path"])
\tbatch["speech"] = resampler(speech_array).squeeze().numpy()
\treturn batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Evaluating the model.
# We run batched inference and decode the predicted ids to strings
def evaluate(batch):
\tinputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
\twith torch.no_grad():
\t\tlogits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
\tpred_ids = torch.argmax(logits, dim=-1)
\tbatch["pred_strings"] = processor.batch_decode(pred_ids)
\treturn batch
result = test_dataset.map(evaluate, batched=True, batch_size=8)
print("WER: {:2f}".format(100 * wer.compute(predictions=result["pred_strings"], references=result["sentence"])))
```
**Test Result**: 17.46 %
## Training
unicode_tr package is used for converting sentences to lower case since regular lower() does not work well with Turkish.
Since training data is very limited for Turkish, all data is employed with a K-Fold (k=5) training approach. Best model out of the 5 trainings is uploaded. Training arguments:
--num_train_epochs="30" \\
--per_device_train_batch_size="32" \\
--evaluation_strategy="steps" \\
--activation_dropout="0.055" \\
--attention_dropout="0.094" \\
--feat_proj_dropout="0.04" \\
--hidden_dropout="0.047" \\
--layerdrop="0.041" \\
--learning_rate="2.34e-4" \\
--mask_time_prob="0.082" \\
--warmup_steps="250" \\
All trainings took ~20 hours with a GeForce RTX 3090 Graphics Card.
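The fold construction itself is not shown in this card; purely as an illustration, a k=5 split over the Turkish Common Voice training data could be built with scikit-learn's `KFold` (the split handling below is an assumption, not the original training script):
```python
from datasets import load_dataset
from sklearn.model_selection import KFold

# Illustrative sketch only: build 5 train/validation folds over the Turkish data.
full_train = load_dataset("common_voice", "tr", split="train+validation")

kfold = KFold(n_splits=5, shuffle=True, random_state=42)
indices = list(range(len(full_train)))

for fold, (train_idx, val_idx) in enumerate(kfold.split(indices)):
    train_split = full_train.select(train_idx)  # used for fine-tuning this fold
    val_split = full_train.select(val_idx)      # used for evaluating this fold
    print(f"fold {fold}: {len(train_split)} train / {len(val_split)} validation examples")
```
Each fold would then be fine-tuned with the arguments listed above, and the best of the five resulting models kept.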
|
{"language": "tr", "license": "apache-2.0", "tags": ["audio", "automatic-speech-recognition", "speech", "xlsr-fine-tuning-week"], "datasets": ["common_voice"], "metrics": ["wer"], "results": [{"task": {"name": "Speech Recognition", "type": "automatic-speech-recognition"}, "dataset": {"name": "Common Voice tr", "type": "common_voice", "args": "tr"}, "metrics": [{"name": "Test WER", "type": "wer", "value": 17.46}]}]}
|
automatic-speech-recognition
|
aniltrkkn/wav2vec2-large-xlsr-53-turkish
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"automatic-speech-recognition",
"audio",
"speech",
"xlsr-fine-tuning-week",
"tr",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"tr"
] |
TAGS
#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #tr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us
|
# Wav2Vec2-Large-XLSR-53-Turkish
Fine-tuned facebook/wav2vec2-large-xlsr-53 on Turkish using the Common Voice.
When using this model, make sure that your speech input is sampled at 16kHz.
## Usage
The model can be used directly (without a language model) as follows:
## Evaluation
The model can be evaluated as follows on the Turkish test data of Common Voice.
Test Result: 17.46 %
## Training
unicode_tr package is used for converting sentences to lower case since regular lower() does not work well with Turkish.
Since training data is very limited for Turkish, all data is employed with a K-Fold (k=5) training approach. Best model out of the 5 trainings is uploaded. Training arguments:
--num_train_epochs="30" \\
--per_device_train_batch_size="32" \\
--evaluation_strategy="steps" \\
--activation_dropout="0.055" \\
--attention_dropout="0.094" \\
--feat_proj_dropout="0.04" \\
--hidden_dropout="0.047" \\
--layerdrop="0.041" \\
--learning_rate="2.34e-4" \\
--mask_time_prob="0.082" \\
--warmup_steps="250" \\
All trainings took ~20 hours with a GeForce RTX 3090 Graphics Card.
|
[
"# Wav2Vec2-Large-XLSR-53-Turkish\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Turkish using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Turkish test data of Common Voice. \n\n\n\n\nTest Result: 17.46 %",
"## Training\nunicode_tr package is used for converting sentences to lower case since regular lower() does not work well with Turkish.\n\nSince training data is very limited for Turkish, all data is employed with a K-Fold (k=5) training approach. Best model out of the 5 trainings is uploaded. Training arguments:\n --num_train_epochs=\"30\" \\\\\n --per_device_train_batch_size=\"32\" \\\\\n --evaluation_strategy=\"steps\" \\\\\n --activation_dropout=\"0.055\" \\\\\n --attention_dropout=\"0.094\" \\\\\n --feat_proj_dropout=\"0.04\" \\\\\n --hidden_dropout=\"0.047\" \\\\\n --layerdrop=\"0.041\" \\\\\n --learning_rate=\"2.34e-4\" \\\\\n --mask_time_prob=\"0.082\" \\\\\n --warmup_steps=\"250\" \\\\\n\nAll trainings took ~20 hours with a GeForce RTX 3090 Graphics Card."
] |
[
"TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #tr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n",
"# Wav2Vec2-Large-XLSR-53-Turkish\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Turkish using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.",
"## Usage\n\nThe model can be used directly (without a language model) as follows:",
"## Evaluation\n\nThe model can be evaluated as follows on the Turkish test data of Common Voice. \n\n\n\n\nTest Result: 17.46 %",
"## Training\nunicode_tr package is used for converting sentences to lower case since regular lower() does not work well with Turkish.\n\nSince training data is very limited for Turkish, all data is employed with a K-Fold (k=5) training approach. Best model out of the 5 trainings is uploaded. Training arguments:\n --num_train_epochs=\"30\" \\\\\n --per_device_train_batch_size=\"32\" \\\\\n --evaluation_strategy=\"steps\" \\\\\n --activation_dropout=\"0.055\" \\\\\n --attention_dropout=\"0.094\" \\\\\n --feat_proj_dropout=\"0.04\" \\\\\n --hidden_dropout=\"0.047\" \\\\\n --layerdrop=\"0.041\" \\\\\n --learning_rate=\"2.34e-4\" \\\\\n --mask_time_prob=\"0.082\" \\\\\n --warmup_steps=\"250\" \\\\\n\nAll trainings took ~20 hours with a GeForce RTX 3090 Graphics Card."
] |
[
76,
64,
20,
28,
221
] |
[
"passage: TAGS\n#transformers #pytorch #jax #wav2vec2 #automatic-speech-recognition #audio #speech #xlsr-fine-tuning-week #tr #dataset-common_voice #license-apache-2.0 #endpoints_compatible #region-us \n# Wav2Vec2-Large-XLSR-53-Turkish\n\nFine-tuned facebook/wav2vec2-large-xlsr-53 on Turkish using the Common Voice.\nWhen using this model, make sure that your speech input is sampled at 16kHz.## Usage\n\nThe model can be used directly (without a language model) as follows:## Evaluation\n\nThe model can be evaluated as follows on the Turkish test data of Common Voice. \n\n\n\n\nTest Result: 17.46 %## Training\nunicode_tr package is used for converting sentences to lower case since regular lower() does not work well with Turkish.\n\nSince training data is very limited for Turkish, all data is employed with a K-Fold (k=5) training approach. Best model out of the 5 trainings is uploaded. Training arguments:\n --num_train_epochs=\"30\" \\\\\n --per_device_train_batch_size=\"32\" \\\\\n --evaluation_strategy=\"steps\" \\\\\n --activation_dropout=\"0.055\" \\\\\n --attention_dropout=\"0.094\" \\\\\n --feat_proj_dropout=\"0.04\" \\\\\n --hidden_dropout=\"0.047\" \\\\\n --layerdrop=\"0.041\" \\\\\n --learning_rate=\"2.34e-4\" \\\\\n --mask_time_prob=\"0.082\" \\\\\n --warmup_steps=\"250\" \\\\\n\nAll trainings took ~20 hours with a GeForce RTX 3090 Graphics Card."
] |
[
-0.08582102507352829,
-0.00908301118761301,
-0.0029396822210401297,
0.041449662297964096,
0.07567466050386429,
0.03322187438607216,
0.14107158780097961,
0.15376360714435577,
-0.08434385806322098,
0.08252322673797607,
0.045700009912252426,
0.009794929064810276,
0.10336989909410477,
0.1623224914073944,
0.013186769559979439,
-0.17585521936416626,
0.012872809544205666,
-0.051693595945835114,
0.055520638823509216,
0.11161395162343979,
0.10476598143577576,
-0.05994546413421631,
-0.0065272506326437,
0.038876309990882874,
-0.06430824100971222,
-0.0035188833717256784,
0.0016451687552034855,
-0.09556692093610764,
0.0646081268787384,
0.020157465711236,
0.07522091269493103,
-0.025142010301351547,
-0.015434077940881252,
-0.19685673713684082,
0.021314047276973724,
0.09948670119047165,
0.00632533710449934,
0.04089419171214104,
0.12854591012001038,
-0.050159912556409836,
0.1531934142112732,
-0.20647671818733215,
-0.00006771374319214374,
0.05053641274571419,
-0.06099863350391388,
-0.16737355291843414,
-0.10137377679347992,
0.08704224973917007,
0.11765645444393158,
0.0714733675122261,
-0.05669551342725754,
0.1027325913310051,
-0.06000914052128792,
0.07519520074129105,
0.13707132637500763,
-0.2637762427330017,
-0.03876050189137459,
-0.07153181731700897,
-0.0592472217977047,
0.00991656444966793,
-0.0755005031824112,
0.001466365996748209,
0.008616364561021328,
-0.01627017930150032,
-0.042974963784217834,
0.00039468990871682763,
-0.011733276769518852,
-0.07752689719200134,
-0.06160624697804451,
-0.06606601178646088,
0.14405478537082672,
0.021742356941103935,
-0.04005380719900131,
-0.1847728192806244,
-0.041036780923604965,
-0.10642854869365692,
-0.027812829241156578,
0.021027637645602226,
-0.010645110160112381,
0.00009155731822829694,
0.035758890211582184,
-0.004084254149347544,
-0.09439150989055634,
-0.03947041183710098,
-0.0672558844089508,
0.1528601348400116,
0.07975976914167404,
-0.007803248707205057,
-0.0016352144302800298,
0.10693687945604324,
-0.07170939445495605,
-0.08351270854473114,
-0.03779864311218262,
-0.004833439365029335,
-0.11397555470466614,
0.029720813035964966,
-0.032637640833854675,
-0.09637800604104996,
0.019638167694211006,
0.053935520350933075,
0.03303676098585129,
0.1139635294675827,
0.0364118255674839,
0.026230093091726303,
-0.053819600492715836,
0.06633396446704865,
-0.014599664136767387,
-0.11396434903144836,
0.02159160003066063,
0.03584537282586098,
0.010385607369244099,
-0.011259226128458977,
-0.011500513181090355,
-0.02965652570128441,
0.03909365087747574,
0.05245393142104149,
0.01407473161816597,
0.03578440845012665,
-0.013230513781309128,
-0.0016453518765047193,
0.05728866532444954,
-0.12762530148029327,
0.011111557483673096,
0.01688334159553051,
-0.10239071398973465,
0.09361393004655838,
0.011910571716725826,
-0.05388282984495163,
-0.11336211115121841,
0.08877153694629669,
0.02974397875368595,
0.04608689993619919,
-0.0675714835524559,
-0.08469942957162857,
-0.010405163280665874,
-0.06225450709462166,
-0.05307784304022789,
-0.09045582264661789,
-0.15056878328323364,
0.001634268555790186,
0.03371920809149742,
-0.05902332067489624,
0.08697089552879333,
-0.02992340736091137,
-0.08973400294780731,
0.0701955258846283,
-0.010504781268537045,
0.055326953530311584,
-0.06398016959428787,
0.07439038157463074,
0.030454667285084724,
0.056973524391651154,
0.08041179925203323,
0.02596697397530079,
-0.06276661902666092,
0.010888279415667057,
-0.07589864730834961,
0.17274382710456848,
-0.029336920008063316,
-0.09388092905282974,
-0.12811510264873505,
-0.045694414526224136,
-0.04006371274590492,
0.023579346016049385,
0.08065740764141083,
0.08083542436361313,
-0.1895703673362732,
-0.02031318098306656,
0.16930514574050903,
-0.07689155638217926,
0.007870921865105629,
0.161931112408638,
-0.026166997849941254,
-0.014069889672100544,
0.12402310967445374,
0.1949906200170517,
0.12216266989707947,
-0.10111740976572037,
-0.12764114141464233,
0.030815381556749344,
-0.06406018882989883,
0.09033861756324768,
0.045484211295843124,
-0.05973702296614647,
0.12491247057914734,
0.03148099035024643,
-0.03141309693455696,
0.06089727208018303,
0.01617453806102276,
-0.04291391372680664,
-0.03889210522174835,
-0.042468562722206116,
0.11677845567464828,
-0.010737661272287369,
0.017974497750401497,
-0.11204016208648682,
-0.15702404081821442,
0.06455110758543015,
0.13004305958747864,
-0.024961154907941818,
-0.005449033807963133,
-0.1236138790845871,
0.05667700991034508,
-0.11861809343099594,
0.011934406124055386,
-0.11566413938999176,
0.01799093559384346,
-0.029810283333063126,
-0.0359967015683651,
0.048517558723688126,
0.047794975340366364,
0.07947767525911331,
0.028156956657767296,
-0.050888337194919586,
-0.0199479628354311,
-0.02730356529355049,
0.0044297813437879086,
-0.047308776527643204,
-0.11841283738613129,
0.02477286383509636,
0.0022231636103242636,
0.10478261858224869,
-0.07221107184886932,
0.003150002798065543,
0.10089564323425293,
0.11540396511554718,
-0.036860957741737366,
-0.0401696115732193,
0.022090444341301918,
-0.006850098725408316,
0.020766910165548325,
-0.08310004323720932,
0.02032862789928913,
-0.007541716564446688,
-0.014577826485037804,
0.03530170023441315,
-0.24414855241775513,
-0.11384745687246323,
0.14859126508235931,
0.08911880850791931,
-0.1155426874756813,
0.07965762913227081,
-0.010266821831464767,
-0.04887207970023155,
-0.057417113333940506,
-0.04295852407813072,
0.2773037850856781,
0.05526380240917206,
0.1317971795797348,
-0.0713268369436264,
-0.06136670336127281,
0.012855694629251957,
-0.04709003493189812,
0.020481612533330917,
0.07291701436042786,
-0.08052285015583038,
-0.0967305451631546,
0.05301414057612419,
-0.0028228438459336758,
-0.02209145948290825,
0.23114721477031708,
-0.008461140096187592,
-0.10808101296424866,
-0.037395596504211426,
0.0922074243426323,
0.031132005155086517,
0.16985273361206055,
-0.05282864347100258,
0.01878383196890354,
0.04840675741434097,
0.07618288695812225,
0.04260833561420441,
-0.1537637710571289,
0.042659785598516464,
0.0467439703643322,
-0.08308123797178268,
-0.049192164093256,
0.028652118518948555,
0.026870040223002434,
0.0959310382604599,
-0.033088717609643936,
0.048890553414821625,
0.030313054099678993,
-0.03117002360522747,
-0.060889024287462234,
0.12034443765878677,
-0.12955719232559204,
-0.1962173879146576,
-0.08731602132320404,
0.04691874608397484,
-0.040099628269672394,
-0.032761815935373306,
0.03943140059709549,
-0.16605962812900543,
-0.02826240286231041,
-0.023531289771199226,
0.04249833524227142,
-0.07206030189990997,
-0.03652390465140343,
-0.10392649471759796,
-0.007329726591706276,
0.10284357517957687,
-0.11273770779371262,
-0.0055091953836381435,
0.0050958991050720215,
-0.03990515321493149,
0.005714563187211752,
0.07115829735994339,
-0.018833119422197342,
0.08958784490823746,
-0.07733850926160812,
0.04095719754695892,
-0.030182214453816414,
0.06027299165725708,
-0.11406941711902618,
0.06289097666740417,
0.11660449206829071,
0.03565075993537903,
0.05745819956064224,
0.06977143883705139,
-0.007217857521027327,
-0.038655299693346024,
-0.032811347395181656,
0.04691985994577408,
-0.06482894718647003,
-0.22121132910251617,
-0.05512678623199463,
-0.1069357767701149,
-0.03738101199269295,
0.0529981292784214,
0.017951183021068573,
-0.005783334374427795,
0.06715096533298492,
-0.10217262804508209,
0.03722011297941208,
0.0004785055061802268,
0.04224192351102829,
0.12177296727895737,
0.08180953562259674,
0.057553038001060486,
-0.11688970029354095,
0.008064812049269676,
0.06579473614692688,
0.011998818255960941,
0.17669153213500977,
-0.08091500401496887,
0.14575006067752838,
0.054722633212804794,
0.11002835631370544,
0.030406037345528603,
0.052084341645240784,
-0.031665313988924026,
-0.0039026266895234585,
0.0360388308763504,
-0.07727782428264618,
-0.06995020806789398,
0.01921159215271473,
0.054165322333574295,
-0.03295008838176727,
-0.02097480557858944,
0.09325512498617172,
0.11333814263343811,
0.09508003294467926,
0.03300027549266815,
-0.2131897509098053,
-0.07058991491794586,
-0.03570804372429848,
-0.03952281177043915,
-0.0498838797211647,
-0.03056851029396057,
0.14502301812171936,
-0.12800391018390656,
0.10191468149423599,
-0.07483198493719101,
0.07862929999828339,
-0.12283774465322495,
-0.027710195630788803,
-0.002128445077687502,
0.1565750390291214,
0.0163424015045166,
0.09753202646970749,
-0.15770076215267181,
0.1262725442647934,
0.014704485423862934,
0.1856199949979782,
-0.06106993928551674,
0.038226015865802765,
0.11656907200813293,
-0.11881846934556961,
0.08170045167207718,
0.006344552151858807,
-0.02834811620414257,
-0.08314572274684906,
-0.1657777726650238,
0.006203727796673775,
0.06807712465524673,
0.06076429411768913,
0.056035712361335754,
-0.0422515906393528,
0.007601067423820496,
-0.011371537111699581,
-0.06655518710613251,
-0.11836591362953186,
-0.1817157119512558,
0.036631159484386444,
-0.001068290090188384,
-0.038549020886421204,
-0.022336922585964203,
-0.034163959324359894,
-0.11111922562122345,
0.17196926474571228,
-0.09807489812374115,
-0.13198183476924896,
-0.0818704217672348,
-0.011522406712174416,
0.11425911635160446,
-0.06349757313728333,
0.0034421102609485388,
0.010305941104888916,
0.04108932241797447,
-0.020579269155859947,
-0.044007863849401474,
0.007187091279774904,
-0.07521753758192062,
-0.11204708367586136,
-0.01536561455577612,
0.13308563828468323,
-0.0013013877905905247,
0.07333497703075409,
0.05125434696674347,
-0.004427260719239712,
-0.00628041522577405,
-0.10288731753826141,
0.0032589274924248457,
0.026728449389338493,
-0.014697556383907795,
0.05143776163458824,
0.08169286698102951,
-0.041230082511901855,
-0.049125030636787415,
0.008975997567176819,
0.10573172569274902,
0.23050308227539062,
-0.09051875025033951,
0.1461258828639984,
-0.029237624257802963,
-0.04212617501616478,
-0.11071319133043289,
-0.005913500674068928,
0.0812329426407814,
0.051955003291368484,
-0.04018876701593399,
-0.11187834292650223,
0.027542980387806892,
0.07797438651323318,
0.012147323228418827,
0.031595829874277115,
-0.31737565994262695,
-0.11613757908344269,
0.070046067237854,
0.011394182220101357,
0.05284764990210533,
-0.09890883415937424,
-0.038629431277513504,
-0.06265413761138916,
-0.002255625557154417,
0.005226358771324158,
-0.16465219855308533,
0.12502726912498474,
-0.0062727476470172405,
0.02978678233921528,
0.03406750410795212,
-0.0661327987909317,
0.18731024861335754,
-0.012376170605421066,
0.04710063710808754,
-0.0917053371667862,
0.06073000654578209,
0.1232292503118515,
-0.05347727984189987,
0.1182718276977539,
-0.1738971769809723,
0.016498317942023277,
-0.1764102727174759,
-0.02239927463233471,
-0.029147189110517502,
0.05722053721547127,
-0.015680469572544098,
-0.00813234318047762,
-0.03605830669403076,
-0.008863233961164951,
0.11198090016841888,
-0.0011922656558454037,
-0.12712043523788452,
-0.0131909204646945,
-0.06963035464286804,
0.041146524250507355,
0.09244585782289505,
0.10811162739992142,
-0.24661189317703247,
-0.04205074533820152,
0.03747440129518509,
0.04665638878941536,
-0.1570681929588318,
-0.012986605986952782,
0.06990396231412888,
-0.00008725489897187799,
0.11496885865926743,
-0.041000209748744965,
-0.08229454606771469,
0.09795285761356354,
0.07623037695884705,
-0.03285471722483635,
-0.14273297786712646,
0.046358659863471985,
-0.017029626294970512,
-0.023533668369054794,
-0.07020044326782227,
0.08354559540748596,
-0.07746152579784393,
0.03461910039186478,
-0.0007703760056756437,
0.013914825394749641,
-0.061220426112413406,
0.20283761620521545,
0.007424214854836464,
0.07535485178232193,
-0.08675510436296463,
0.08604113757610321,
0.023300211876630783,
-0.01895259879529476,
0.10004023462533951,
0.02206374518573284,
-0.09345049411058426,
-0.0174888726323843,
0.06050225347280502,
0.06032094731926918,
0.10669420659542084,
-0.09221591055393219,
-0.08487184345722198,
-0.06527659296989441,
-0.01933504082262516,
-0.062282856553792953,
0.019319266080856323,
0.0024017684627324343,
-0.06502047181129456,
-0.02022320032119751,
-0.11940833181142807,
0.053966328501701355,
0.13735419511795044,
-0.00557991536334157,
-0.07801442593336105,
0.22435849905014038,
0.059310898184776306,
-0.019095178693532944,
0.006926613859832287,
-0.005439902190119028,
0.023281676694750786,
0.019955923780798912,
-0.18766066431999207,
0.061739251017570496,
-0.06150835379958153,
-0.008221358992159367,
-0.00044170071487315,
0.04946768283843994,
-0.02583472803235054,
0.038275666534900665,
-0.0795358419418335,
-0.05362784117460251,
-0.003905049292370677,
0.030212324112653732,
-0.1146058663725853,
0.03688793629407883,
0.011177361942827702,
-0.09062664955854416,
0.06384030729532242,
0.09320954978466034,
-0.03024144284427166,
0.01420497801154852,
-0.08362691849470139,
-0.0567161850631237,
-0.025975732132792473,
0.026975402608513832,
-0.0012502862373366952,
-0.07587364315986633,
0.07418365776538849,
0.022811297327280045,
0.01909864880144596,
0.028878914192318916,
0.06026249751448631,
-0.11804158985614777,
0.01349472813308239,
-0.08697864413261414,
0.05600983276963234,
-0.0656530112028122,
0.07725351303815842,
0.06905092298984528,
0.038444843143224716,
0.15107563138008118,
-0.10339666903018951,
0.047878820449113846,
-0.13209670782089233,
-0.022917181253433228,
-0.021979549899697304,
-0.07671468704938889,
0.015178976580500603,
-0.044684428721666336,
0.10248056054115295,
-0.048353418707847595,
0.11069415509700775,
0.0266517736017704,
-0.03262423723936081,
0.019715217873454094,
-0.06417135894298553,
-0.026406897231936455,
-0.014437584206461906,
0.1576457917690277,
-0.01548814121633768,
0.0008855548221617937,
-0.02457074448466301,
-0.03352575749158859,
0.058467164635658264,
-0.0503384992480278,
0.14100702106952667,
0.21628133952617645,
0.02596501260995865,
0.08314843475818634,
-0.0013649979373440146,
-0.12877808511257172,
-0.13327926397323608,
0.08424866944551468,
-0.13753125071525574,
0.07465551793575287,
-0.03255094587802887,
0.05802065506577492,
0.13488604128360748,
-0.1490052342414856,
0.0899539515376091,
-0.022109052166342735,
-0.09300165623426437,
-0.01744534634053707,
-0.10703890025615692,
-0.02920992113649845,
-0.11469140648841858,
0.0269676074385643,
-0.05470352992415428,
0.06284116208553314,
0.12865349650382996,
0.06616293638944626,
-0.03503989055752754,
0.2559673488140106,
-0.10959704965353012,
-0.07062579691410065,
0.07814972847700119,
-0.03024696372449398,
-0.0065666306763887405,
-0.012712115421891212,
-0.06555937975645065,
0.023794803768396378,
-0.05692475289106369,
0.12740491330623627,
0.04579487815499306,
0.028695065528154373,
-0.005663977935910225,
-0.023868778720498085,
-0.06114433705806732,
0.005686990451067686,
0.015781214460730553,
0.12174598127603531,
0.18377122282981873,
0.10870660096406937,
-0.044128116220235825,
-0.08489443361759186,
0.11165603995323181,
-0.018711159005761147,
-0.18460147082805634,
-0.10218314081430435,
-0.0018327627331018448,
0.05044804513454437,
-0.02836054563522339,
0.01886257342994213,
-0.1357761025428772,
0.02239026501774788,
0.15468335151672363,
0.16282497346401215,
0.02462448552250862,
-0.004118223674595356,
-0.05173728987574577,
-0.011104709468781948,
-0.01284969411790371,
0.08228535205125809,
-0.008506253361701965,
0.11901374906301498,
-0.02204444259405136,
0.04871642589569092,
-0.03296691179275513,
-0.06203126162290573,
-0.0396561436355114,
0.10684405267238617,
-0.033652111887931824,
-0.04701390489935875,
-0.021777106449007988,
0.08315397799015045,
0.01897190697491169,
-0.2008150964975357,
0.06168489158153534,
-0.0920679047703743,
-0.1476641148328781,
0.0679248720407486,
0.08996390551328659,
0.0777023583650589,
0.031207222491502762,
-0.011737548746168613,
0.022724973037838936,
0.05298569053411484,
-0.003258300479501486,
-0.08606134355068207,
-0.06777894496917725,
0.006725301966071129,
-0.0948537290096283,
0.15575388073921204,
0.04033795744180679,
0.11523762345314026,
0.1162940189242363,
0.05843573436141014,
-0.06185530871152878,
0.08507364243268967,
0.04856318607926369,
-0.08051812648773193,
0.06847921013832092,
0.15474210679531097,
-0.0206760261207819,
-0.004712916910648346,
0.07218346744775772,
0.006565378047525883,
0.016836602240800858,
0.0015402734279632568,
0.042337361723184586,
-0.09378138184547424,
0.044901277869939804,
-0.047080524265766144,
0.12179498374462128,
0.2501809895038605,
-0.03377646580338478,
-0.016539130359888077,
-0.06241855397820473,
0.025841690599918365,
0.03718726709485054,
0.03467504680156708,
-0.028355497866868973,
-0.165096253156662,
0.0003597994218580425,
0.03689833730459213,
0.08830077946186066,
-0.15233135223388672,
-0.10648094862699509,
0.03538987785577774,
-0.08309324830770493,
-0.04805926978588104,
0.10184351354837418,
0.0821298286318779,
0.0732317641377449,
-0.049952324479818344,
-0.006671153474599123,
-0.01458795927464962,
0.08094874024391174,
-0.09358125180006027,
-0.05448579043149948
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sagemaker-BioclinicalBERT-ADR
This model is a fine-tuned version of [emilyalsentzer/Bio_ClinicalBERT](https://huggingface.co/emilyalsentzer/Bio_ClinicalBERT) on the ade_corpus_v2 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
- mixed_precision_training: Native AMP
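
The training script itself is not included in this card; the values above correspond roughly to the following `TrainingArguments` sketch (the output directory is hypothetical):

```python
from transformers import TrainingArguments

# Sketch matching the hyperparameters listed above; not the original SageMaker script.
training_args = TrainingArguments(
    output_dir="./sagemaker-BioclinicalBERT-ADR",  # hypothetical output path
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    fp16=True,  # Native AMP mixed-precision training
)
```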
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 171 | 0.9441 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.1
- Datasets 1.15.1
- Tokenizers 0.10.3
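## How to use (illustrative)
A minimal, hedged usage sketch, assuming the checkpoint loads with the standard `transformers` question-answering pipeline; the question and context below are made up for illustration, not taken from ade_corpus_v2:
```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an extractive QA pipeline.
qa = pipeline("question-answering", model="anindabitm/sagemaker-BioclinicalBERT-ADR")

result = qa(
    question="What adverse reaction did the patient develop?",  # illustrative
    context="The patient developed a severe rash after starting naproxen.",  # illustrative
)
print(result["answer"], result["score"])  # extracted span and its confidence
```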
|
{"tags": ["generated_from_trainer"], "datasets": ["ade_corpus_v2"], "model-index": [{"name": "sagemaker-BioclinicalBERT-ADR", "results": []}]}
|
question-answering
|
anindabitm/sagemaker-BioclinicalBERT-ADR
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"generated_from_trainer",
"dataset:ade_corpus_v2",
"endpoints_compatible",
"has_space",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-ade_corpus_v2 #endpoints_compatible #has_space #region-us
|
sagemaker-BioclinicalBERT-ADR
=============================
This model is a fine-tuned version of emilyalsentzer/Bio\_ClinicalBERT on the ade\_corpus\_v2 dataset.
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.9.1
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.1\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-ade_corpus_v2 #endpoints_compatible #has_space #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.1\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
51,
131,
4,
31
] |
[
"passage: TAGS\n#transformers #pytorch #bert #question-answering #generated_from_trainer #dataset-ade_corpus_v2 #endpoints_compatible #has_space #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.1\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.09891840815544128,
0.08913470059633255,
-0.002293934114277363,
0.09105642139911652,
0.14140817523002625,
0.0344599224627018,
0.07669659703969955,
0.14229746162891388,
-0.05839009955525398,
0.060519009828567505,
0.13236528635025024,
0.1099979504942894,
0.012860510498285294,
0.12673784792423248,
-0.039570800960063934,
-0.27049171924591064,
-0.004412887152284384,
0.028895430266857147,
-0.10699943453073502,
0.1325443983078003,
0.08290282636880875,
-0.1540583074092865,
0.06378393620252609,
-0.013514054007828236,
-0.1368577629327774,
0.016986194998025894,
-0.016164282336831093,
-0.03236112743616104,
0.1328057050704956,
-0.008606957271695137,
0.12130231410264969,
0.020736437290906906,
0.09887205809354782,
-0.2223549485206604,
0.007306781131774187,
0.041717611253261566,
0.03154212608933449,
0.07563869655132294,
0.04905880242586136,
-0.015414644032716751,
0.08946996927261353,
-0.12312502413988113,
0.06894905865192413,
0.008933844976127148,
-0.13676764070987701,
-0.283743292093277,
-0.08474601805210114,
0.00499568460509181,
0.07675701379776001,
0.10710924118757248,
-0.02940111793577671,
0.12156382948160172,
-0.11625678092241287,
0.08918660879135132,
0.2782362103462219,
-0.2656592130661011,
-0.07473619282245636,
0.017639510333538055,
0.046932436525821686,
0.08852963149547577,
-0.12149924784898758,
-0.026582371443510056,
0.02991364151239395,
0.049233462661504745,
0.12509967386722565,
-0.02922772243618965,
-0.0590578131377697,
0.04400809109210968,
-0.14947998523712158,
-0.05083725228905678,
0.14723218977451324,
0.05820409953594208,
-0.013591461814939976,
-0.04813488572835922,
-0.040231503546237946,
-0.15869174897670746,
-0.029804306104779243,
-0.027117019519209862,
0.042812857776880264,
-0.07156247645616531,
-0.10022364556789398,
-0.005046335980296135,
-0.09525666385889053,
-0.08729372173547745,
-0.02564801648259163,
0.14718376100063324,
0.04567955061793327,
-0.0040544383227825165,
-0.015084500424563885,
0.11578673124313354,
0.0065317279659211636,
-0.14523325860500336,
0.0009067095234058797,
0.012455688789486885,
-0.0629720687866211,
-0.035475295037031174,
-0.073232002556324,
-0.0004314058169256896,
0.0058240401558578014,
0.13370546698570251,
-0.0819900780916214,
0.051351554691791534,
0.057468920946121216,
0.007830140180885792,
-0.07401267439126968,
0.17095211148262024,
-0.07047465443611145,
-0.033716462552547455,
-0.057694047689437866,
0.0779913142323494,
-0.03704578056931496,
-0.00033427891321480274,
-0.07560941576957703,
-0.00297397468239069,
0.11295576393604279,
0.030662616714835167,
-0.04810703918337822,
0.04076571390032768,
-0.04337484389543533,
-0.01881270296871662,
-0.04786534979939461,
-0.09105135500431061,
0.04264170676469803,
0.00973141472786665,
-0.09723498672246933,
-0.03966052830219269,
0.0067678107880055904,
0.01094844937324524,
-0.0010549627477303147,
0.1005135029554367,
-0.09294553101062775,
0.027281103655695915,
-0.09180135279893875,
-0.13338378071784973,
0.00836899969726801,
-0.06737955659627914,
0.02602146752178669,
-0.07748932391405106,
-0.13872398436069489,
-0.04240167886018753,
0.04892127588391304,
-0.06238550692796707,
-0.03383221849799156,
-0.04960964247584343,
-0.07106421887874603,
0.014296932145953178,
-0.021046198904514313,
0.16953879594802856,
-0.06753677874803543,
0.1255030632019043,
0.027232667431235313,
0.09445547312498093,
0.02830973081290722,
0.06670945882797241,
-0.08625463396310806,
0.02986190654337406,
-0.14715376496315002,
0.05805059149861336,
-0.07841286808252335,
0.037243738770484924,
-0.11470551788806915,
-0.13913428783416748,
-0.0013822722248733044,
-0.00718294084072113,
0.10398583114147186,
0.12349814176559448,
-0.14907121658325195,
-0.05890451371669769,
0.16099077463150024,
-0.03614932671189308,
-0.12377994507551193,
0.1022757887840271,
-0.06406374275684357,
0.03281467780470848,
0.038260165601968765,
0.210649773478508,
0.05555245280265808,
-0.11061049997806549,
0.02438647300004959,
-0.023745650425553322,
0.0911712795495987,
0.005724531132727861,
0.07596510648727417,
0.004059013910591602,
0.019147787243127823,
0.006607017945498228,
-0.06326906383037567,
0.05507560074329376,
-0.1326768547296524,
-0.08812405169010162,
-0.02087472379207611,
-0.10057298094034195,
0.07082249969244003,
0.07228435575962067,
0.06544031202793121,
-0.1039801836013794,
-0.08830881118774414,
0.06902668625116348,
0.07966212928295135,
-0.0745861679315567,
0.021430684253573418,
-0.03868221491575241,
0.051642730832099915,
-0.04875371977686882,
-0.032497208565473557,
-0.1862875074148178,
-0.054588865488767624,
0.001004490302875638,
0.023178504779934883,
0.0026911781169474125,
0.04862123727798462,
0.09538077563047409,
0.039075057953596115,
-0.06898809969425201,
-0.05111088976264,
-0.07452819496393204,
0.011085943318903446,
-0.12791389226913452,
-0.19601106643676758,
-0.05771031230688095,
-0.02660774067044258,
0.11038751155138016,
-0.2025536298751831,
0.01258515939116478,
-0.0204033050686121,
0.0887235626578331,
0.019572660326957703,
-0.02474948763847351,
-0.014236045069992542,
0.07618258893489838,
-0.012486215680837631,
-0.05036552995443344,
0.05329526588320732,
-0.02353646047413349,
-0.09353494644165039,
-0.03658845275640488,
-0.09212680160999298,
0.13164348900318146,
0.09910547733306885,
-0.07305087894201279,
-0.09660863876342773,
-0.017893757671117783,
-0.07806757092475891,
-0.03766949474811554,
-0.050497591495513916,
0.01977614127099514,
0.13587446510791779,
0.0040996018797159195,
0.11146564781665802,
-0.06783811748027802,
-0.04923368617892265,
0.0076325517147779465,
-0.029005754739046097,
0.016667647287249565,
0.16296911239624023,
0.08758848905563354,
-0.04943845421075821,
0.13294707238674164,
0.16136950254440308,
-0.07518456876277924,
0.1278662532567978,
-0.0444275364279747,
-0.11115449666976929,
-0.04479614272713661,
-0.008627152070403099,
-0.014289473183453083,
0.1358688920736313,
-0.12640362977981567,
0.003652476239949465,
0.030022144317626953,
0.044328685849905014,
0.02122241072356701,
-0.2088167518377304,
-0.06656874716281891,
0.044340137392282486,
-0.04658205062150955,
-0.03077075444161892,
0.004486394114792347,
0.03272412717342377,
0.11360303312540054,
0.01891023851931095,
-0.04646182805299759,
-0.008334684185683727,
-0.011211973614990711,
-0.06522632390260696,
0.20859041810035706,
-0.07577203959226608,
-0.1271410435438156,
-0.062084607779979706,
-0.032113201916217804,
-0.018076781183481216,
-0.0029620418790727854,
0.06488292664289474,
-0.10417991131544113,
0.0048684305511415005,
-0.059910450130701065,
0.034674301743507385,
-0.04165402799844742,
0.03281758725643158,
-0.018214784562587738,
-0.009771701879799366,
0.06619851291179657,
-0.09811381250619888,
0.0023564419243484735,
-0.05979609861969948,
-0.04807622730731964,
0.06872766464948654,
0.032440513372421265,
0.13158948719501495,
0.13481560349464417,
-0.0027678008191287518,
0.023998506367206573,
-0.03658514469861984,
0.2596266269683838,
-0.08523044735193253,
-0.030065547674894333,
0.10429392755031586,
0.02987355925142765,
0.04654858261346817,
0.1225380077958107,
0.056271109730005264,
-0.10134625434875488,
0.028571806848049164,
0.058720313012599945,
-0.029149161651730537,
-0.21004782617092133,
-0.038227446377277374,
-0.057558685541152954,
-0.03022860549390316,
0.10564278066158295,
0.004771249834448099,
0.0014528280589729548,
0.04369409382343292,
0.03907153010368347,
-0.009440802969038486,
-0.05191577225923538,
0.05090557783842087,
0.09666198492050171,
0.019815456122159958,
0.125431090593338,
-0.010408519767224789,
-0.06438557803630829,
0.02916829288005829,
0.005524007137864828,
0.255780428647995,
-0.017458613961935043,
0.07376740872859955,
0.06246952340006828,
0.18342547118663788,
-0.0296618714928627,
0.06870733946561813,
0.01344595942646265,
-0.046177368611097336,
-0.010509105399250984,
-0.038939882069826126,
-0.01621864177286625,
0.03837857022881508,
0.02989305555820465,
0.025668129324913025,
-0.1435709446668625,
-0.047332484275102615,
0.0467388890683651,
0.25162339210510254,
0.08050133287906647,
-0.27550458908081055,
-0.08844193816184998,
0.013138453476130962,
-0.047266636043787,
-0.018109800294041634,
0.010072881355881691,
0.13494767248630524,
-0.09623325616121292,
0.005493721459060907,
-0.05716380849480629,
0.08975185453891754,
-0.007875326089560986,
0.03427237644791603,
0.05498432740569115,
0.05861001834273338,
-0.005800166632980108,
0.07870250195264816,
-0.28614863753318787,
0.3133821487426758,
-0.0043787299655377865,
0.04285665974020958,
-0.062128081917762756,
-0.025762058794498444,
0.015779396519064903,
0.0046857730485498905,
0.09968315809965134,
0.0001382846530759707,
-0.05509532243013382,
-0.21167950332164764,
-0.045145053416490555,
0.03491666540503502,
0.11715306341648102,
-0.06512870639562607,
0.13051727414131165,
-0.008730670437216759,
0.011964374221861362,
0.0694974735379219,
0.013215161859989166,
-0.0856265127658844,
-0.06667723506689072,
0.003324841847643256,
-0.005936483386904001,
-0.005266366992145777,
-0.0723196417093277,
-0.11547048389911652,
-0.0928647369146347,
0.10761421173810959,
-0.0014665474882349372,
-0.01861618645489216,
-0.1264576017856598,
0.11786103248596191,
0.1461954563856125,
-0.0780433639883995,
0.02200258895754814,
0.03595240041613579,
0.04574764892458916,
0.042477190494537354,
-0.032682787626981735,
0.11431261897087097,
-0.05748739466071129,
-0.17808617651462555,
-0.0434875562787056,
0.1268319934606552,
0.063253253698349,
0.08810239285230637,
-0.0232111606746912,
0.03859959915280342,
-0.053548429161310196,
-0.09312118589878082,
0.05850250646471977,
-0.03859329596161842,
0.07160291075706482,
0.05590282008051872,
-0.06221751123666763,
0.0907648503780365,
-0.05013230815529823,
-0.02413301356136799,
0.18825314939022064,
0.2792138457298279,
-0.10682935267686844,
0.02432234399020672,
0.0021061571314930916,
-0.05257188156247139,
-0.18257227540016174,
0.05462266877293587,
0.0937439501285553,
0.015244944021105766,
0.07489319145679474,
-0.19405387341976166,
0.08073137700557709,
0.08928468078374863,
-0.011220818385481834,
0.06960324943065643,
-0.33267858624458313,
-0.12116304039955139,
0.09293335676193237,
0.14183588325977325,
0.085801862180233,
-0.16071158647537231,
-0.014209083281457424,
0.028167519718408585,
-0.13517919182777405,
0.09983141720294952,
-0.04686855524778366,
0.1353672295808792,
-0.029818367213010788,
0.10553931444883347,
0.024799199774861336,
-0.0642414540052414,
0.14983698725700378,
0.011114370077848434,
0.09551116824150085,
-0.031345877796411514,
-0.024792440235614777,
0.03790353238582611,
-0.04875093325972557,
0.020802654325962067,
-0.04495818540453911,
0.028181536123156548,
-0.14748427271842957,
-0.020752547308802605,
-0.12304461002349854,
0.023776689544320107,
-0.03923838958144188,
-0.06678928434848785,
-0.033420413732528687,
0.06187189370393753,
0.07586666941642761,
-0.011260113678872585,
0.11532004177570343,
-0.02927534095942974,
0.16709642112255096,
0.0650751069188118,
0.06628836691379547,
-0.02167370356619358,
-0.07100238651037216,
0.01654001511633396,
0.0017112018540501595,
0.04085950180888176,
-0.1468404084444046,
0.03387851268053055,
0.1690419465303421,
0.04407495632767677,
0.13893736898899078,
0.07632184773683548,
-0.024899922311306,
0.0009391314233653247,
0.04910672828555107,
-0.1490963250398636,
-0.13115215301513672,
-0.0035753147676587105,
-0.07141292840242386,
-0.14638593792915344,
0.016257058829069138,
0.09120632708072662,
-0.049411576241254807,
-0.008540255948901176,
-0.020529339089989662,
0.01310649886727333,
-0.04928029701113701,
0.23178181052207947,
0.07982572168111801,
0.06937547028064728,
-0.08429200947284698,
0.05729900300502777,
0.04461273178458214,
-0.11498329043388367,
0.02346048876643181,
0.08180516958236694,
-0.055829569697380066,
-0.04147058352828026,
0.0555427148938179,
0.1257186084985733,
-0.040546830743551254,
-0.009170831181108952,
-0.13575725257396698,
-0.11531814932823181,
0.08738669008016586,
0.13209529221057892,
0.08505553007125854,
0.0023505273275077343,
-0.02468879334628582,
0.023868802934885025,
-0.1088465005159378,
0.10559876263141632,
0.06554213166236877,
0.054262418299913406,
-0.11902432143688202,
0.1513933539390564,
-0.023088015615940094,
0.05746060609817505,
-0.025317037478089333,
0.017699019983410835,
-0.1273573338985443,
0.03288482502102852,
-0.12848296761512756,
-0.06140276789665222,
-0.031671397387981415,
-0.013423659838736057,
-0.010065889917314053,
-0.09412115812301636,
-0.06199584901332855,
0.011295255273580551,
-0.13403427600860596,
-0.028926916420459747,
0.010788965038955212,
0.05448746308684349,
-0.12896102666854858,
-0.058550480753183365,
0.0391557514667511,
-0.07462094724178314,
0.0775763988494873,
0.048997871577739716,
0.023206720128655434,
0.02623096853494644,
-0.10462839901447296,
-0.003017617389559746,
0.02639761194586754,
0.0002766836842056364,
0.05768844485282898,
-0.146115243434906,
-0.024059990420937538,
-0.029553424566984177,
0.07120110839605331,
0.027079252526164055,
0.03609622269868851,
-0.12561388313770294,
-0.0008427569991908967,
-0.039162687957286835,
-0.07623238861560822,
-0.06099145859479904,
0.03267546743154526,
0.0640828013420105,
0.04159829765558243,
0.14261192083358765,
-0.06606815755367279,
0.06519591808319092,
-0.23303323984146118,
-0.015499409288167953,
-0.020497575402259827,
-0.08574157953262329,
-0.05801091715693474,
-0.059235867112874985,
0.09645741432905197,
-0.0482100211083889,
0.12556332349777222,
-0.00553537905216217,
0.08331911265850067,
0.03252594172954559,
-0.04836445301771164,
0.04773147031664848,
0.030732519924640656,
0.20335111021995544,
0.01965387910604477,
-0.05094115436077118,
0.07833374291658401,
0.06480804830789566,
0.06961620599031448,
0.16664841771125793,
0.22311493754386902,
0.17941266298294067,
0.05195866897702217,
0.056367985904216766,
0.02095704711973667,
-0.09162582457065582,
-0.1464042365550995,
0.046624746173620224,
-0.006737209390848875,
0.09422427415847778,
-0.02822711132466793,
0.21648570895195007,
0.046175431460142136,
-0.19958098232746124,
0.0628008246421814,
-0.06562799215316772,
-0.09643453359603882,
-0.09345850348472595,
-0.0026899015065282583,
-0.0753229558467865,
-0.17234830558300018,
0.003239409066736698,
-0.13219726085662842,
0.040859539061784744,
0.10038530081510544,
0.027590597048401833,
0.008755362592637539,
0.17197681963443756,
0.053283367305994034,
0.02049250528216362,
0.06508557498455048,
0.010869471356272697,
0.002111267764121294,
-0.08213755488395691,
-0.05382730811834335,
0.011490621604025364,
-0.03246736526489258,
0.03101813979446888,
-0.061390720307826996,
-0.10838674008846283,
0.018828293308615685,
-0.020424634218215942,
-0.09933343529701233,
-0.0028232194017618895,
0.02740437164902687,
0.07426928728818893,
0.07192346453666687,
0.012453673407435417,
0.004773185588419437,
-0.03493882715702057,
0.2746492326259613,
-0.0943751409649849,
-0.05844502151012421,
-0.11309665441513062,
0.25760871171951294,
0.0427878275513649,
0.001795996562577784,
0.030357716605067253,
-0.07704769819974899,
-0.0007444641669280827,
0.2122371792793274,
0.1632263958454132,
-0.09588178992271423,
0.003479819279164076,
0.02581445313990116,
-0.004474040120840073,
-0.020210476592183113,
0.060308944433927536,
0.12940406799316406,
0.044177718460559845,
-0.11724924296140671,
-0.03918304294347763,
-0.06067817285656929,
-0.031566597521305084,
-0.02691766247153282,
0.06377612054347992,
0.07337692379951477,
-0.003176545724272728,
-0.05337082967162132,
0.07815342396497726,
-0.06690644472837448,
-0.12284320592880249,
0.0719708800315857,
-0.23431949317455292,
-0.16430804133415222,
-0.03176289424300194,
0.06726814806461334,
0.0018793706549331546,
0.060856979340314865,
-0.031107595190405846,
-0.009366275742650032,
0.08289211243391037,
-0.006908742245286703,
-0.046162866055965424,
-0.10904766619205475,
0.128231942653656,
-0.0873372033238411,
0.17672185599803925,
-0.04452025145292282,
0.04529194161295891,
0.12606029212474823,
0.04192188382148743,
-0.07623115181922913,
0.058069322258234024,
0.09400609135627747,
-0.10134004056453705,
0.014326874166727066,
0.13637585937976837,
-0.03265771642327309,
0.0946742594242096,
0.051519475877285004,
-0.16164319217205048,
0.012789168395102024,
-0.04781542718410492,
-0.042760204523801804,
-0.04839560389518738,
-0.03337644413113594,
-0.042586371302604675,
0.13479027152061462,
0.22379489243030548,
-0.04870914667844772,
0.020078139379620552,
-0.06333401799201965,
0.011222447268664837,
0.04712442308664322,
0.09028305113315582,
-0.0723412036895752,
-0.22928670048713684,
0.02419242635369301,
0.05738191306591034,
-0.014452625066041946,
-0.2162725329399109,
-0.11090017855167389,
0.04141039773821831,
-0.05908584222197533,
-0.04421235993504524,
0.10862042009830475,
0.06434731185436249,
0.04982880875468254,
-0.05165843293070793,
-0.11406668275594711,
-0.06041833758354187,
0.18188579380512238,
-0.16451063752174377,
-0.06483820825815201
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# sagemaker-distilbert-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2434
- Accuracy: 0.9165
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9423 | 1.0 | 500 | 0.2434 | 0.9165 |
### Framework versions
- Transformers 4.12.3
- Pytorch 1.9.1
- Datasets 1.15.1
- Tokenizers 0.10.3
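## How to use (illustrative)
A minimal, hedged usage sketch, assuming the checkpoint loads with the standard `transformers` text-classification pipeline; the input sentence is illustrative, and label names depend on how `id2label` was saved (they may show up as `LABEL_0` … `LABEL_5`):
```python
from transformers import pipeline

# Load the fine-tuned emotion classifier.
clf = pipeline("text-classification", model="anindabitm/sagemaker-distilbert-emotion")
print(clf("I got the job and I can't stop smiling!"))  # e.g. [{'label': ..., 'score': ...}]
```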
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["emotion"], "metrics": ["accuracy"], "model-index": [{"name": "sagemaker-distilbert-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.9165, "name": "Accuracy"}]}]}]}
|
text-classification
|
anindabitm/sagemaker-distilbert-emotion
|
[
"transformers",
"pytorch",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
sagemaker-distilbert-emotion
============================
This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset.
It achieves the following results on the evaluation set:
* Loss: 0.2434
* Accuracy: 0.9165
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 3e-05
* train\_batch\_size: 32
* eval\_batch\_size: 64
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 1
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.12.3
* Pytorch 1.9.1
* Datasets 1.15.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.1\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.1\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
63,
131,
4,
31
] |
[
"passage: TAGS\n#transformers #pytorch #distilbert #text-classification #generated_from_trainer #dataset-emotion #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 3e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.12.3\n* Pytorch 1.9.1\n* Datasets 1.15.1\n* Tokenizers 0.10.3"
] |
[
-0.11837613582611084,
0.1543351113796234,
-0.003603972727432847,
0.11183581501245499,
0.12739664316177368,
0.022878220304846764,
0.10721538960933685,
0.16699880361557007,
-0.10111086815595627,
0.06387995183467865,
0.11303813010454178,
0.1483011543750763,
0.043047476559877396,
0.1800822913646698,
-0.07202117890119553,
-0.28924816846847534,
0.023244589567184448,
0.032159626483917236,
-0.013603254221379757,
0.1261374056339264,
0.10769407451152802,
-0.11838165670633316,
0.09594909101724625,
0.0010638361563906074,
-0.14941614866256714,
-0.007261131890118122,
-0.01208103820681572,
-0.057330116629600525,
0.10750635713338852,
0.0014201451558619738,
0.07195501029491425,
0.03991401568055153,
0.07995164394378662,
-0.22014452517032623,
0.012164656072854996,
0.04840351268649101,
0.015382297337055206,
0.09114862233400345,
0.058430466800928116,
-0.058908965438604355,
0.14157436788082123,
-0.09782223403453827,
0.08080454915761948,
0.02933705970644951,
-0.12579043209552765,
-0.2903655767440796,
-0.1011161059141159,
0.04775403067469597,
0.08300988376140594,
0.08396870642900467,
-0.021364055573940277,
0.14067555963993073,
-0.07596412301063538,
0.11064133048057556,
0.2744048535823822,
-0.24716363847255707,
-0.054917991161346436,
0.005511750467121601,
0.017445571720600128,
0.04044044017791748,
-0.1113806664943695,
-0.03889385610818863,
0.03182241693139076,
0.04706019535660744,
0.14486002922058105,
-0.024018313735723495,
-0.0740620419383049,
-0.007533211261034012,
-0.12171243131160736,
-0.0633034035563469,
0.16323760151863098,
0.03745148330926895,
-0.02962128259241581,
-0.09584372490644455,
-0.04318343847990036,
-0.17879915237426758,
-0.04851372539997101,
0.00667280750349164,
0.056851621717214584,
-0.03543173149228096,
-0.08462603390216827,
0.03528746962547302,
-0.08264046162366867,
-0.031160978600382805,
-0.017575401812791824,
0.11850959062576294,
0.03588465601205826,
0.008479141630232334,
-0.01874055340886116,
0.0899289920926094,
-0.011930983513593674,
-0.158717080950737,
-0.020114438608288765,
0.0030990380328148603,
-0.0031545378733426332,
-0.04256492480635643,
-0.04705759882926941,
-0.038762595504522324,
0.006422210484743118,
0.15545488893985748,
-0.11857292056083679,
0.08842629194259644,
0.012996894307434559,
0.007682775612920523,
-0.03404007852077484,
0.1736244559288025,
-0.027913648635149002,
-0.018383368849754333,
-0.013358066789805889,
0.07133235037326813,
0.03232838958501816,
-0.008974552154541016,
-0.08438459038734436,
0.047629330307245255,
0.09521305561065674,
0.0383065827190876,
-0.06879504770040512,
0.06931496411561966,
-0.07647304981946945,
-0.016467349603772163,
0.01397458091378212,
-0.10510764271020889,
0.0430348701775074,
0.02499917522072792,
-0.08724648505449295,
-0.031164446845650673,
0.002788531593978405,
-0.009516514837741852,
-0.03220942243933678,
0.09179986268281937,
-0.07090102136135101,
0.024775264784693718,
-0.07785706967115402,
-0.12903408706188202,
0.04065423458814621,
-0.10105662792921066,
0.00963031966239214,
-0.06942633539438248,
-0.17784561216831207,
-0.038965292274951935,
0.06810345500707626,
-0.05729544162750244,
-0.038596637547016144,
-0.07722818106412888,
-0.07238946855068207,
0.042007025331258774,
-0.009164949879050255,
0.09392977505922318,
-0.08157063275575638,
0.07733868062496185,
0.023606736212968826,
0.09834159165620804,
0.012647953815758228,
0.05767158046364784,
-0.1123456358909607,
0.029518958181142807,
-0.1950959861278534,
0.08706353604793549,
-0.08051420003175735,
0.07977759838104248,
-0.10023565590381622,
-0.11927729845046997,
0.06214994937181473,
-0.018791886046528816,
0.0882602110505104,
0.13789469003677368,
-0.1989341825246811,
-0.08109022676944733,
0.1874118447303772,
-0.09164733439683914,
-0.1158265471458435,
0.1065397784113884,
-0.05862856283783913,
0.043826110661029816,
0.06908378005027771,
0.22259867191314697,
0.04803643748164177,
-0.0827505812048912,
-0.038985978811979294,
-0.02437431365251541,
0.06559756398200989,
-0.015065304934978485,
0.053651563823223114,
0.03099905140697956,
0.07602955400943756,
0.028325917199254036,
0.001142260618507862,
0.03702513873577118,
-0.10235582292079926,
-0.0862097293138504,
-0.039290837943553925,
-0.07971367985010147,
0.023167163133621216,
0.0724732056260109,
0.048356588929891586,
-0.13896535336971283,
-0.09049472212791443,
0.03247474506497383,
0.10335296392440796,
-0.07529491931200027,
0.04016033560037613,
-0.07861742377281189,
0.06997567415237427,
0.017843758687376976,
-0.005587626714259386,
-0.18841251730918884,
-0.007240258157253265,
0.030585063621401787,
0.0016626478172838688,
-0.004395666066557169,
-0.049733731895685196,
0.07025570422410965,
0.045390885323286057,
-0.05947435647249222,
-0.04165800288319588,
-0.023398151621222496,
0.0075073097832500935,
-0.09686305373907089,
-0.24679021537303925,
-0.04165744036436081,
-0.04643191397190094,
0.12916356325149536,
-0.1773872673511505,
0.04002687707543373,
0.058347634971141815,
0.10489656776189804,
0.04189150780439377,
-0.04356949031352997,
0.014493320137262344,
0.061084989458322525,
-0.04397936537861824,
-0.07461843639612198,
0.056644365191459656,
0.013763496652245522,
-0.07663474977016449,
0.0045542954467237,
-0.12263242900371552,
0.11598627269268036,
0.11511194705963135,
-0.001906289136968553,
-0.08724113553762436,
-0.02875882014632225,
-0.0692279115319252,
-0.013547703623771667,
-0.04575734585523605,
0.05855436623096466,
0.14891910552978516,
0.015346912667155266,
0.1367952674627304,
-0.07775311172008514,
-0.04299632087349892,
0.03867389261722565,
-0.034631580114364624,
-0.007141605485230684,
0.13975439965724945,
0.03596067801117897,
-0.08201639354228973,
0.14243470132350922,
0.10314483940601349,
-0.04581154137849808,
0.13645175099372864,
-0.06408238410949707,
-0.06139000877737999,
-0.03766321390867233,
-0.03011254407465458,
0.0006627661059610546,
0.11162186414003372,
-0.1258595734834671,
-0.021620837971568108,
0.034058600664138794,
0.019893422722816467,
-0.01158282533288002,
-0.1868378072977066,
-0.019171809777617455,
0.038299042731523514,
-0.05880694463849068,
-0.047663457691669464,
-0.011641432531177998,
0.018677890300750732,
0.10764066129922867,
0.003986363299190998,
-0.055185310542583466,
0.009195665828883648,
0.0028655354399234056,
-0.07642689347267151,
0.19908322393894196,
-0.12159758806228638,
-0.17786571383476257,
-0.07665104418992996,
-0.07649917155504227,
-0.05434083566069603,
-0.016904905438423157,
0.08014795184135437,
-0.12079811096191406,
-0.0411948561668396,
-0.10055164992809296,
-0.0008590462384745479,
0.008758091367781162,
0.019678879529237747,
0.02470877207815647,
0.004008642863482237,
0.0454113632440567,
-0.11157272011041641,
-0.027560485526919365,
-0.04230765253305435,
-0.002857549348846078,
0.0769641101360321,
0.014526997692883015,
0.09557095915079117,
0.1401277482509613,
-0.0013876197626814246,
0.04866979271173477,
-0.05239533260464668,
0.22267955541610718,
-0.07544218748807907,
-0.006031872238963842,
0.1071099266409874,
0.004186802078038454,
0.0721597820520401,
0.1333232969045639,
0.050544291734695435,
-0.11408083140850067,
0.008745088241994381,
0.03685984015464783,
-0.04053723067045212,
-0.22257384657859802,
-0.03704514354467392,
-0.03992819786071777,
0.020139485597610474,
0.1042783111333847,
0.027278758585453033,
0.005482812877744436,
0.052087362855672836,
0.023433586582541466,
-0.017469104379415512,
-0.015712963417172432,
0.09295354783535004,
0.10768511146306992,
0.030773909762501717,
0.11299581080675125,
-0.04381914809346199,
-0.015720488503575325,
0.06628048419952393,
-0.011439019814133644,
0.2233322262763977,
-0.03464122116565704,
0.1589147001504898,
0.04811767488718033,
0.1493292897939682,
-0.018851323053240776,
0.0694626197218895,
-0.012259836308658123,
-0.017061851918697357,
-0.015374460257589817,
-0.04035329446196556,
-0.049273695796728134,
0.03298238292336464,
-0.04660562798380852,
0.04422784596681595,
-0.1512918323278427,
0.014314874075353146,
0.07183079421520233,
0.3302708864212036,
0.0589555986225605,
-0.34346017241477966,
-0.11173069477081299,
0.013120951130986214,
-0.0428515262901783,
-0.02635222300887108,
0.009747270494699478,
0.08257672190666199,
-0.08503390848636627,
0.09766539931297302,
-0.05262461304664612,
0.09564774483442307,
-0.07680870592594147,
0.0361005999147892,
0.03936290368437767,
0.08292390406131744,
-0.01231981161981821,
0.06103919446468353,
-0.2909838855266571,
0.27077189087867737,
0.010940955020487309,
0.07318498194217682,
-0.08081170916557312,
0.008950899355113506,
0.047714028507471085,
0.05568809434771538,
0.07670014351606369,
0.000006953746378712822,
-0.11198442429304123,
-0.1783648431301117,
-0.08167610317468643,
0.015564302913844585,
0.08265890926122665,
-0.00588970584794879,
0.10358839482069016,
-0.003525681793689728,
-0.0032993813510984182,
0.048775266855955124,
-0.0531260184943676,
-0.06408720463514328,
-0.09419245272874832,
0.008202844299376011,
0.029899228364229202,
-0.01646074838936329,
-0.06377828866243362,
-0.11641669273376465,
-0.03497447073459625,
0.16022245585918427,
0.02565125934779644,
-0.06161787733435631,
-0.14062124490737915,
0.0689857080578804,
0.08391270786523819,
-0.08637922257184982,
0.022689804434776306,
0.002772259758785367,
0.1096462681889534,
0.013663090765476227,
-0.06999297440052032,
0.10933694988489151,
-0.06911008805036545,
-0.18869486451148987,
-0.04719846323132515,
0.11738148331642151,
0.06046763435006142,
0.07299277186393738,
-0.0038077894132584333,
0.0315268412232399,
-0.02446954883635044,
-0.08080863207578659,
0.05297119542956352,
0.03195960447192192,
0.08176760375499725,
0.011373446322977543,
-0.03340031951665878,
0.022487932816147804,
-0.06869642436504364,
-0.0307016521692276,
0.16960974037647247,
0.2629065215587616,
-0.09611620754003525,
0.0872335359454155,
0.03145379573106766,
-0.06567491590976715,
-0.17972074449062347,
0.04460444673895836,
0.07519912719726562,
-0.00024864799343049526,
0.015201414935290813,
-0.21626658737659454,
0.06182254105806351,
0.07966611534357071,
-0.009729016572237015,
0.0681985393166542,
-0.2905254364013672,
-0.11711744964122772,
0.11910469084978104,
0.11460644751787186,
0.07299523800611496,
-0.1515073925256729,
-0.01971275545656681,
-0.02064916305243969,
-0.10996430367231369,
0.13154596090316772,
-0.07878710329532623,
0.11792054772377014,
-0.028099937364459038,
0.10213853418827057,
0.020324520766735077,
-0.045450806617736816,
0.12259246408939362,
0.033844154328107834,
0.09771598130464554,
-0.05779586359858513,
0.018801843747496605,
0.06296809017658234,
-0.08221397548913956,
0.0674949586391449,
-0.09313590824604034,
0.03408272564411163,
-0.1389821320772171,
-0.015654917806386948,
-0.093145452439785,
0.02080852910876274,
-0.03764854744076729,
-0.052348822355270386,
-0.05797258019447327,
0.04278038442134857,
0.10190743207931519,
-0.025698086246848106,
0.1333283931016922,
0.005760887172073126,
0.12026791274547577,
0.13177889585494995,
0.09839772433042526,
-0.09058015793561935,
-0.05844483524560928,
0.007070167921483517,
-0.016856076195836067,
0.03183398023247719,
-0.18063884973526,
0.037690624594688416,
0.14328686892986298,
0.018046490848064423,
0.1486673653125763,
0.06972695142030716,
-0.049082983285188675,
0.014150657691061497,
0.06275259703397751,
-0.1383049637079239,
-0.08541978895664215,
0.004360504914075136,
-0.017115797847509384,
-0.13802103698253632,
0.026750091463327408,
0.09348145872354507,
-0.05954287573695183,
-0.016794737428426743,
-0.007997254841029644,
0.04708428308367729,
-0.012471656315028667,
0.19100423157215118,
0.024301907047629356,
0.051281802356243134,
-0.12916436791419983,
0.10173992812633514,
0.02499849908053875,
-0.12302640825510025,
0.062201015651226044,
0.11744813621044159,
-0.0925678238272667,
-0.03540899604558945,
0.07276899367570877,
0.18206922709941864,
-0.05268663167953491,
-0.05029132589697838,
-0.1615176945924759,
-0.13878220319747925,
0.10869903117418289,
0.15774159133434296,
0.0825851634144783,
0.022886332124471664,
-0.044025320559740067,
0.011712602339684963,
-0.12630650401115417,
0.09506743401288986,
0.08651246875524521,
0.05231746658682823,
-0.11164930462837219,
0.12111131846904755,
-0.003965741954743862,
0.03271762281656265,
-0.015358959324657917,
-0.004825884010642767,
-0.11675626784563065,
0.009485473856329918,
-0.12980639934539795,
-0.0069797751493752,
-0.07128564268350601,
0.013277019374072552,
-0.007787046954035759,
-0.03179322928190231,
-0.043717239052057266,
0.01376134529709816,
-0.12043898552656174,
-0.026978667825460434,
0.004508053418248892,
0.061205703765153885,
-0.13844263553619385,
-0.041000496596097946,
0.006765751633793116,
-0.09196198731660843,
0.10163606703281403,
0.05801330879330635,
-0.005373474210500717,
0.026491383090615273,
-0.08311029523611069,
-0.004849073011428118,
0.07716238498687744,
-0.01412529032677412,
0.06808032840490341,
-0.13256385922431946,
-0.014777755364775658,
-0.007917745970189571,
0.008419463410973549,
0.029371142387390137,
0.10639815032482147,
-0.11208819597959518,
0.02575114369392395,
-0.0019627632573246956,
-0.06521273404359818,
-0.062467120587825775,
0.07209564000368118,
0.09943100810050964,
0.017558665946125984,
0.17507526278495789,
-0.08254262059926987,
0.034964419901371,
-0.20692650973796844,
-0.012348880991339684,
-0.0011038414668291807,
-0.1236477941274643,
-0.10039031505584717,
-0.037640418857336044,
0.08573988080024719,
-0.06129469349980354,
0.12877348065376282,
0.03895843029022217,
0.006093417294323444,
0.03583093360066414,
-0.02027837559580803,
-0.017640287056565285,
0.02333320491015911,
0.14525777101516724,
0.01790521666407585,
-0.05678962916135788,
0.09238801151514053,
0.04573407396674156,
0.09581481665372849,
0.12464381754398346,
0.21681421995162964,
0.12954676151275635,
0.0830640196800232,
0.09286683797836304,
0.01819639839231968,
-0.05779906362295151,
-0.16253724694252014,
0.07260998338460922,
-0.04267285391688347,
0.14041000604629517,
-0.012748371809720993,
0.2006291151046753,
0.07843073457479477,
-0.1765861064195633,
0.08515732735395432,
-0.047791868448257446,
-0.08835535496473312,
-0.1149892583489418,
-0.0679900199174881,
-0.09158860892057419,
-0.15626230835914612,
-0.0040159691125154495,
-0.14893268048763275,
0.04185378924012184,
0.05038124695420265,
0.02336333878338337,
0.002437800634652376,
0.0959417074918747,
0.014424346387386322,
0.02176879718899727,
0.09789475798606873,
0.005082043819129467,
-0.060347188264131546,
-0.06804761290550232,
-0.05849951505661011,
0.004710895009338856,
-0.005629145074635744,
0.04984884709119797,
-0.014791627414524555,
-0.045716166496276855,
0.03775022178888321,
-0.03004000522196293,
-0.10855578631162643,
0.015347364358603954,
0.017828311771154404,
0.06680692732334137,
0.04981262981891632,
0.01979895494878292,
-0.004206622950732708,
-0.0026987676974385977,
0.21402043104171753,
-0.0733867883682251,
-0.02979242242872715,
-0.13238458335399628,
0.2621976435184479,
0.026569610461592674,
-0.03176701068878174,
0.048486821353435516,
-0.07207082211971283,
-0.030228544026613235,
0.175558939576149,
0.19087213277816772,
-0.027840644121170044,
-0.004068813286721706,
-0.03574039041996002,
-0.0014217686839401722,
-0.03389324992895126,
0.09978348016738892,
0.13023768365383148,
0.011857615783810616,
-0.09112060815095901,
-0.011114929802715778,
-0.05494571849703789,
-0.04664573073387146,
-0.0459960512816906,
0.0710899829864502,
0.045706186443567276,
0.001936300890520215,
-0.03808143362402916,
0.06844818592071533,
-0.07579191029071808,
-0.0714210569858551,
0.060162097215652466,
-0.21802693605422974,
-0.16615860164165497,
-0.02034897170960903,
0.04306776449084282,
0.03574448823928833,
0.06552550196647644,
0.013666756451129913,
-0.004965460859239101,
0.12732325494289398,
-0.01480206847190857,
-0.09256330877542496,
-0.11389236152172089,
0.11374516785144806,
-0.15393441915512085,
0.1912686675786972,
-0.05101769417524338,
0.01951257511973381,
0.13197007775306702,
0.03952339291572571,
-0.10324212163686752,
0.05166172608733177,
0.05417279154062271,
-0.05462360754609108,
0.002384183695539832,
0.14519453048706055,
-0.04710952937602997,
0.08871696144342422,
0.04123096913099289,
-0.13678009808063507,
-0.0051964325830340385,
-0.057282187044620514,
-0.05457058176398277,
-0.03973320499062538,
-0.017201481387019157,
-0.040711838752031326,
0.10954301804304123,
0.2188241332769394,
-0.04361145943403244,
0.010944243520498276,
-0.07149084657430649,
0.01361110433936119,
0.043276604264974594,
-0.008871621452271938,
-0.047616828233003616,
-0.24529977142810822,
0.01836584508419037,
0.10463471710681915,
0.0043829986825585365,
-0.23500698804855347,
-0.09399379044771194,
0.008361619897186756,
-0.050332095474004745,
-0.08896009624004364,
0.09696648269891739,
0.04677649214863777,
0.0587175115942955,
-0.057566385716199875,
-0.056300994008779526,
-0.04263915866613388,
0.18627896904945374,
-0.1378536820411682,
-0.059317294508218765
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-base-v2-finetuned-qnli
This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3194
- Accuracy: 0.9112
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.3116 | 1.0 | 6547 | 0.2818 | 0.8849 |
| 0.2467 | 2.0 | 13094 | 0.2532 | 0.9001 |
| 0.1858 | 3.0 | 19641 | 0.3194 | 0.9112 |
| 0.1449 | 4.0 | 26188 | 0.4338 | 0.9103 |
| 0.0584 | 5.0 | 32735 | 0.5752 | 0.9052 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
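## How to use (illustrative)
A minimal, hedged usage sketch for QNLI-style inputs (a question paired with a candidate answer sentence), using the generic `AutoModelForSequenceClassification` API; the example pair is made up and the label names depend on the uploaded config:
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/albert-base-v2-finetuned-qnli"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

question = "Where was the treaty signed?"             # illustrative
sentence = "The treaty was signed in Paris in 1898."  # illustrative
inputs = tok(question, sentence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, pred))  # e.g. entailment / not_entailment (names may vary)
```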
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-base-v2-finetuned-qnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "qnli"}, "metrics": [{"type": "accuracy", "value": 0.9112209408749771, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-base-v2-finetuned-qnli
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-base-v2-finetuned-qnli
=============================
This model is a fine-tuned version of albert-base-v2 on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3194
* Accuracy: 0.9112
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10851341485977173,
0.08287949860095978,
-0.0015716948546469212,
0.12378216534852982,
0.1644035279750824,
0.034203559160232544,
0.11756833642721176,
0.1303785890340805,
-0.08961959183216095,
0.01978359930217266,
0.13182155787944794,
0.1571536362171173,
0.019342191517353058,
0.10459926724433899,
-0.050261739641427994,
-0.2606520652770996,
-0.015629388391971588,
0.05233924835920334,
-0.047430187463760376,
0.13125889003276825,
0.08640951663255692,
-0.12352532893419266,
0.10119590908288956,
0.016996311023831367,
-0.19539082050323486,
0.006376367527991533,
0.009158787317574024,
-0.0564240999519825,
0.14400716125965118,
0.03155115246772766,
0.11656567454338074,
-0.004729350097477436,
0.08431360125541687,
-0.1941358745098114,
0.011082910932600498,
0.04820818454027176,
0.0049487631767988205,
0.09291473031044006,
0.04774554818868637,
0.0033729670103639364,
0.14480124413967133,
-0.09592609107494354,
0.0522734709084034,
0.027718715369701385,
-0.12452755123376846,
-0.21515901386737823,
-0.08181779086589813,
0.03230409324169159,
0.08416493982076645,
0.11512179672718048,
-0.005395589396357536,
0.12806573510169983,
-0.08085677772760391,
0.08780300617218018,
0.23827265202999115,
-0.30694082379341125,
-0.06739146262407303,
0.025221778079867363,
0.00799495168030262,
0.032211191952228546,
-0.10505171120166779,
-0.02871882915496826,
0.05733394995331764,
0.04659018665552139,
0.12641079723834991,
-0.033901676535606384,
-0.11446501314640045,
0.014867664314806461,
-0.13147439062595367,
-0.03409222885966301,
0.16096091270446777,
0.04539426788687706,
-0.033270757645368576,
-0.05664939805865288,
-0.051134154200553894,
-0.15639479458332062,
-0.035082269459962845,
-0.004277274943888187,
0.04805569350719452,
-0.02432965487241745,
-0.0460941419005394,
-0.0016077898908406496,
-0.11016025394201279,
-0.06246815249323845,
-0.08158069103956223,
0.11988197267055511,
0.03115209750831127,
0.016247021034359932,
-0.037016041576862335,
0.11204948276281357,
0.0013566170819103718,
-0.13201206922531128,
0.012988509610295296,
0.02687028981745243,
0.00729374261572957,
-0.044786155223846436,
-0.053867775946855545,
-0.053863268345594406,
0.00430115032941103,
0.12289395928382874,
-0.04026487097144127,
0.040145393460989,
0.04916608706116676,
0.04391423612833023,
-0.09642386436462402,
0.20065082609653473,
-0.03766079246997833,
-0.018071161583065987,
0.00929968897253275,
0.03564739599823952,
0.024339625611901283,
-0.009158243425190449,
-0.11860004812479019,
-0.000343729363521561,
0.07517223805189133,
0.005142418202012777,
-0.07092933356761932,
0.07137508690357208,
-0.05376800149679184,
-0.02296004444360733,
-0.0021376493386924267,
-0.08764661103487015,
0.03296663612127304,
-0.002671253401786089,
-0.07513636350631714,
-0.015495436266064644,
0.030281322076916695,
0.021347740665078163,
-0.014448919333517551,
0.11615916341543198,
-0.08655691146850586,
0.03218672797083855,
-0.09627551585435867,
-0.10221421718597412,
0.023033635690808296,
-0.10830260813236237,
0.03779095411300659,
-0.09124257415533066,
-0.1834726780653,
-0.009110176004469395,
0.06036468595266342,
-0.025423433631658554,
-0.061499450355768204,
-0.0546928234398365,
-0.06525935232639313,
0.015102635137736797,
-0.007776893209666014,
0.13168306648731232,
-0.06625673919916153,
0.08166395872831345,
0.02689969912171364,
0.06423691660165787,
-0.04112587869167328,
0.052426837384700775,
-0.10481588542461395,
0.014568629674613476,
-0.14707937836647034,
0.031188983470201492,
-0.037114519625902176,
0.07600796222686768,
-0.08274073898792267,
-0.09499000012874603,
0.015633273869752884,
-0.0025613398756831884,
0.0618094764649868,
0.09867966920137405,
-0.17744845151901245,
-0.07985639572143555,
0.15661577880382538,
-0.06437281519174576,
-0.1325954645872116,
0.12001652270555496,
-0.0612134151160717,
0.045997507870197296,
0.06028265133500099,
0.1512797325849533,
0.06379801779985428,
-0.08271874487400055,
-0.004605399910360575,
0.02407730370759964,
0.04843881353735924,
-0.07159419357776642,
0.07669540494680405,
0.008389415219426155,
0.002951052039861679,
0.0340876542031765,
-0.019045770168304443,
0.061201177537441254,
-0.08733832836151123,
-0.10032474249601364,
-0.04622561112046242,
-0.08203887939453125,
0.028672071173787117,
0.0776790976524353,
0.07325249910354614,
-0.09814091771841049,
-0.08594372123479843,
0.03338034823536873,
0.0765068531036377,
-0.04751443490386009,
0.028013426810503006,
-0.05600306764245033,
0.06213730573654175,
-0.043482739478349686,
-0.023733915761113167,
-0.17221353948116302,
-0.017427945509552956,
-0.000443327211542055,
-0.006937176920473576,
0.009066986851394176,
0.026353659108281136,
0.06821393221616745,
0.05675322189927101,
-0.05200904235243797,
-0.010675321333110332,
-0.02982199192047119,
-0.0041316417045891285,
-0.13505974411964417,
-0.2008715569972992,
-0.03134380653500557,
-0.02398592233657837,
0.15125882625579834,
-0.20457713305950165,
0.04213083162903786,
-0.021903015673160553,
0.06635771691799164,
0.012865194119513035,
-0.0053146895952522755,
-0.04286836460232735,
0.0698426365852356,
-0.04436494782567024,
-0.05092164874076843,
0.07527109980583191,
0.019110465422272682,
-0.09808443486690521,
-0.04705537110567093,
-0.08996978402137756,
0.15877945721149445,
0.13385072350502014,
-0.1099434345960617,
-0.07223377376794815,
-0.0058516887947916985,
-0.06560415774583817,
-0.03339000791311264,
-0.05574433505535126,
0.040243301540613174,
0.21176877617835999,
-0.007056929636746645,
0.15119796991348267,
-0.0662810429930687,
-0.04895230755209923,
0.026796160265803337,
-0.03642735630273819,
0.021458491683006287,
0.12952668964862823,
0.13312320411205292,
-0.05990966781973839,
0.14652734994888306,
0.15359628200531006,
-0.09059371799230576,
0.13318632543087006,
-0.03999984264373779,
-0.07472026348114014,
-0.017710551619529724,
-0.03931165114045143,
-0.004119568970054388,
0.10875216871500015,
-0.1615610122680664,
-0.004651137627661228,
0.030832653865218163,
0.014737037010490894,
0.020087437704205513,
-0.22307056188583374,
-0.04519743472337723,
0.04278237000107765,
-0.03178243339061737,
-0.021580059081315994,
-0.007723218761384487,
0.0037444550544023514,
0.10487890243530273,
0.0055309683084487915,
-0.08151063323020935,
0.03770234063267708,
0.005515687167644501,
-0.08669986575841904,
0.21752367913722992,
-0.06947796791791916,
-0.15518920123577118,
-0.12592779099941254,
-0.07883378863334656,
-0.04955562576651573,
0.0021320621017366648,
0.07115191966295242,
-0.09382370859384537,
-0.032456617802381516,
-0.07577887177467346,
0.019918303936719894,
0.004875612910836935,
0.031803544610738754,
0.01505905669182539,
0.0028410842642188072,
0.06437543779611588,
-0.10163018107414246,
-0.016886616125702858,
-0.05557944253087044,
-0.050702955573797226,
0.036809612065553665,
0.035607509315013885,
0.11485431343317032,
0.14555047452449799,
-0.016081763431429863,
0.013742087408900261,
-0.030881134793162346,
0.22703666985034943,
-0.06183459237217903,
-0.033547043800354004,
0.13601787388324738,
-0.008285166695713997,
0.04035327211022377,
0.1131085455417633,
0.07463839650154114,
-0.07826251536607742,
-0.00111157086212188,
0.03700360655784607,
-0.03763021528720856,
-0.2307037115097046,
-0.046411383897066116,
-0.06072646379470825,
0.007775360718369484,
0.09654852747917175,
0.02273011952638626,
0.02955014817416668,
0.07200337946414948,
0.04007653146982193,
0.08878234028816223,
-0.05143848434090614,
0.059663355350494385,
0.10464063286781311,
0.04082774370908737,
0.1216021254658699,
-0.05594692751765251,
-0.06648729741573334,
0.04218735173344612,
-0.01800714246928692,
0.2236419916152954,
0.016036270186305046,
0.13117147982120514,
0.05609254539012909,
0.15269875526428223,
-0.004133033100515604,
0.0846896767616272,
-0.0062759071588516235,
-0.05142221599817276,
-0.01511458307504654,
-0.03756638243794441,
-0.03472224622964859,
0.032171182334423065,
-0.08392827957868576,
0.079414002597332,
-0.1315862387418747,
0.016221044585108757,
0.05463474988937378,
0.26152393221855164,
0.04587202146649361,
-0.321972519159317,
-0.09329250454902649,
0.009788069874048233,
-0.029937151819467545,
-0.027202531695365906,
0.03143763169646263,
0.08129655569791794,
-0.09414859861135483,
0.035591233521699905,
-0.0738305002450943,
0.10116495192050934,
-0.04594714939594269,
0.0491180457174778,
0.08238666504621506,
0.07942087948322296,
0.0074699679389595985,
0.09445410221815109,
-0.30067723989486694,
0.2845527231693268,
0.004858710337430239,
0.06776915490627289,
-0.08622097969055176,
0.008372905664145947,
0.04453708976507187,
0.06563539057970047,
0.09501554816961288,
-0.0140914935618639,
-0.04899032041430473,
-0.18717963993549347,
-0.06885536760091782,
0.03427093103528023,
0.05553438887000084,
-0.03511710464954376,
0.08751165121793747,
-0.028084883466362953,
0.0065111275762319565,
0.07207874953746796,
0.018553245812654495,
-0.04902458190917969,
-0.11115951836109161,
-0.01427517831325531,
0.026247471570968628,
-0.07107964158058167,
-0.05908683314919472,
-0.11763347685337067,
-0.1296166181564331,
0.15435954928398132,
-0.02876298874616623,
-0.02926171012222767,
-0.11168454587459564,
0.08655610680580139,
0.049155063927173615,
-0.09150857478380203,
0.0343179889023304,
0.005369671154767275,
0.08151274919509888,
0.02817639894783497,
-0.0781378522515297,
0.10608948767185211,
-0.07388927042484283,
-0.15226562321186066,
-0.06808533519506454,
0.09889303892850876,
0.030818484723567963,
0.06943147629499435,
-0.01059445645660162,
0.015278245322406292,
-0.05115121603012085,
-0.0893559604883194,
0.025140443816781044,
0.008072792552411556,
0.08026784658432007,
0.005880521144717932,
-0.06082103028893471,
0.021673541516065598,
-0.05721529945731163,
-0.032977454364299774,
0.20603495836257935,
0.21837805211544037,
-0.10593485087156296,
0.01963035576045513,
0.00011520516272867098,
-0.07776135206222534,
-0.19612984359264374,
0.04137551784515381,
0.04815450683236122,
0.018216095864772797,
0.03512553870677948,
-0.17524226009845734,
0.15107755362987518,
0.1101469025015831,
-0.014013183303177357,
0.10103233903646469,
-0.30580073595046997,
-0.12347456812858582,
0.13789689540863037,
0.12939028441905975,
0.13237634301185608,
-0.1306103616952896,
-0.01185387559235096,
-0.028198547661304474,
-0.1425543576478958,
0.09835414588451385,
-0.10393572598695755,
0.11367519944906235,
-0.044036321341991425,
0.07140891253948212,
0.0034511450212448835,
-0.06000930443406105,
0.12067489326000214,
0.025124182924628258,
0.09810183197259903,
-0.056498534977436066,
-0.034325432032346725,
0.03099016286432743,
-0.04756839945912361,
0.031243259087204933,
-0.10857679694890976,
0.023675616830587387,
-0.12081367522478104,
-0.025438150390982628,
-0.06328153610229492,
0.049994807690382004,
-0.04249459132552147,
-0.060809362679719925,
-0.03294748067855835,
0.01564195565879345,
0.05251622945070267,
-0.009419661946594715,
0.14960907399654388,
0.023017099127173424,
0.14949700236320496,
0.08569129556417465,
0.08571472764015198,
-0.07848557829856873,
-0.0662907212972641,
-0.018670717254281044,
-0.01171959936618805,
0.052121590822935104,
-0.1567266285419464,
0.0222612377256155,
0.14933155477046967,
0.023854093626141548,
0.14069582521915436,
0.08444757014513016,
-0.012314979918301105,
0.006973995827138424,
0.05782342329621315,
-0.16315408051013947,
-0.08276087045669556,
-0.01947229914367199,
-0.05304446816444397,
-0.12343880534172058,
0.044341687113046646,
0.08120200037956238,
-0.07319604605436325,
-0.01001099031418562,
-0.008966249413788319,
0.00801245216280222,
-0.060733210295438766,
0.17431361973285675,
0.04631480574607849,
0.04427378252148628,
-0.103228360414505,
0.06995489448308945,
0.04022670537233353,
-0.08173894137144089,
0.006496574729681015,
0.0679832473397255,
-0.07813875377178192,
-0.05333561822772026,
0.08378839492797852,
0.21410124003887177,
-0.04703439027070999,
-0.04646718502044678,
-0.1423557847738266,
-0.13277803361415863,
0.08483558148145676,
0.14282391965389252,
0.11973793059587479,
0.011175474151968956,
-0.0649867057800293,
-0.0028957000467926264,
-0.11970946192741394,
0.0956927016377449,
0.04556654393672943,
0.06415167450904846,
-0.1412634551525116,
0.13030587136745453,
0.014540751464664936,
0.04957909509539604,
-0.018316565081477165,
0.02747558057308197,
-0.09735246002674103,
0.01003090851008892,
-0.11197996139526367,
-0.014422730542719364,
-0.03730938956141472,
0.010056210681796074,
-0.005486046429723501,
-0.04593383148312569,
-0.06191306561231613,
0.010265232995152473,
-0.10761867463588715,
-0.020014287903904915,
0.03203214704990387,
0.06906575709581375,
-0.09926522523164749,
-0.03584988787770271,
0.025884181261062622,
-0.06411145627498627,
0.06664050370454788,
0.04775208234786987,
0.024460744112730026,
0.05203469842672348,
-0.13428330421447754,
0.020065493881702423,
0.07125937938690186,
0.023615064099431038,
0.06857342272996902,
-0.10210324823856354,
-0.0048413085751235485,
-0.0018470374634489417,
0.03993143141269684,
0.01879689283668995,
0.060570936650037766,
-0.13668784499168396,
-0.0018133589765056968,
-0.0059414212591946125,
-0.08354639261960983,
-0.06759142130613327,
0.025267822667956352,
0.09829063713550568,
0.010769153945147991,
0.20157437026500702,
-0.07453074306249619,
0.05111348256468773,
-0.21752247214317322,
0.008580947294831276,
-0.011457758024334908,
-0.10675018280744553,
-0.11847562342882156,
-0.07331645488739014,
0.060511115938425064,
-0.05910194292664528,
0.15643341839313507,
0.04336113855242729,
0.03594420477747917,
0.029215503484010696,
-0.016089623793959618,
0.025112714618444443,
0.013912186026573181,
0.20906245708465576,
0.031809382140636444,
-0.03966294229030609,
0.06834982335567474,
0.04653722792863846,
0.10755541920661926,
0.1338290572166443,
0.20528732240200043,
0.14109309017658234,
-0.0016062030335888267,
0.104256771504879,
0.03466791287064552,
-0.05762480944395065,
-0.1563846468925476,
0.03676179423928261,
-0.04096238315105438,
0.11012841761112213,
-0.017167802900075912,
0.20791314542293549,
0.07202707976102829,
-0.17344166338443756,
0.04336996749043465,
-0.05798328295350075,
-0.0792505294084549,
-0.12193125486373901,
-0.048324186354875565,
-0.08294760435819626,
-0.12893500924110413,
0.004624804016202688,
-0.11551131308078766,
0.0029855608008801937,
0.11830372363328934,
0.0003146572853438556,
-0.025349488481879234,
0.15853318572044373,
0.012229030951857567,
0.03587009757757187,
0.0623665414750576,
0.009986549615859985,
-0.033745404332876205,
-0.12574751675128937,
-0.049979232251644135,
-0.01642908900976181,
-0.03286297246813774,
0.02869069203734398,
-0.06801576167345047,
-0.043811243027448654,
0.0380096510052681,
-0.018702475354075432,
-0.0994282215833664,
0.008727424778044224,
0.010006695054471493,
0.06349524110555649,
0.04147202521562576,
0.009688003920018673,
0.02728598564863205,
-0.008197645656764507,
0.20150049030780792,
-0.08199817687273026,
-0.05192165821790695,
-0.10646841675043106,
0.24835076928138733,
0.04308316856622696,
-0.024649769067764282,
0.02875283546745777,
-0.062391892075538635,
0.00807760376483202,
0.25195541977882385,
0.20674537122249603,
-0.06871533393859863,
-0.007414078805595636,
0.005721509922295809,
-0.0078910943120718,
-0.0229549128562212,
0.09814517945051193,
0.13991482555866241,
0.039135899394750595,
-0.10182734578847885,
-0.05337280035018921,
-0.05527487024664879,
-0.020565170794725418,
-0.03373938426375389,
0.08031013607978821,
0.05193829908967018,
0.0009627835243009031,
-0.02825058251619339,
0.049038421362638474,
-0.06365194916725159,
-0.07327164709568024,
0.06637400388717651,
-0.21353663504123688,
-0.1605113446712494,
-0.009253967553377151,
0.09987538307905197,
0.011628348380327225,
0.06883668899536133,
-0.02442094497382641,
-0.005353689659386873,
0.09306224435567856,
-0.01920473948121071,
-0.106890469789505,
-0.07222694903612137,
0.08502697199583054,
-0.1229671910405159,
0.2248678207397461,
-0.042769718915224075,
0.04968447983264923,
0.12772446870803833,
0.07353010773658752,
-0.08143990486860275,
0.05894053354859352,
0.03563410043716431,
-0.05077586695551872,
0.029180480167269707,
0.07570891827344894,
-0.03623399883508682,
0.05392841622233391,
0.0446491502225399,
-0.1353415846824646,
0.023813316598534584,
-0.06738412380218506,
-0.061082519590854645,
-0.04242687672376633,
-0.020358486101031303,
-0.053828101605176926,
0.13348907232284546,
0.22147727012634277,
-0.02643861249089241,
-0.01265759114176035,
-0.06895022839307785,
0.011453851126134396,
0.0556543804705143,
0.027651680633425713,
-0.06081826612353325,
-0.20048682391643524,
0.02100181393325329,
0.04676947742700577,
-0.02104194276034832,
-0.2525334358215332,
-0.09901151061058044,
0.0026946943253278732,
-0.08497780561447144,
-0.08980201184749603,
0.06273863464593887,
0.0974934995174408,
0.05462854355573654,
-0.0601477175951004,
-0.05967121943831444,
-0.06285107880830765,
0.14960379898548126,
-0.1336909383535385,
-0.09875859320163727
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-base-v2-finetuned-rte
This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2496
- Accuracy: 0.7581
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 249 | 0.5914 | 0.6751 |
| No log | 2.0 | 498 | 0.5843 | 0.7184 |
| 0.5873 | 3.0 | 747 | 0.6925 | 0.7220 |
| 0.5873 | 4.0 | 996 | 1.1613 | 0.7545 |
| 0.2149 | 5.0 | 1245 | 1.2496 | 0.7581 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
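## Example usage
The snippet below is a minimal inference sketch, not an official example from the model author: it loads the checkpoint named in this card and classifies one premise/hypothesis pair (RTE is a sentence-pair task). The example sentences are illustrative, and the label names may still be the generic `LABEL_0`/`LABEL_1` assigned by the Trainer.
```python
# Minimal inference sketch; assumes transformers, torch, and sentencepiece are installed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/albert-base-v2-finetuned-rte"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# RTE pairs a premise with a hypothesis; these strings are illustrative only.
premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
# Auto-generated checkpoints often keep generic label names (LABEL_0 / LABEL_1).
print(pred, model.config.id2label[pred])
```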
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-base-v2-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.7581227436823105, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-base-v2-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-base-v2-finetuned-rte
============================
This model is a fine-tuned version of albert-base-v2 on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 1.2496
* Accuracy: 0.7581
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 10
* eval\_batch\_size: 10
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 10\n* eval\\_batch\\_size: 10\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 10\n* eval\\_batch\\_size: 10\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 10\n* eval\\_batch\\_size: 10\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10812227427959442,
0.08284442871809006,
-0.0015580077888444066,
0.12244495004415512,
0.16290700435638428,
0.03442436084151268,
0.11997165530920029,
0.12939207255840302,
-0.08743293583393097,
0.01921333745121956,
0.13137651979923248,
0.15945979952812195,
0.019810274243354797,
0.10694754123687744,
-0.05106024071574211,
-0.2621508538722992,
-0.014195792376995087,
0.05278029665350914,
-0.04353747144341469,
0.13184162974357605,
0.08598319441080093,
-0.12341554462909698,
0.10074693709611893,
0.015679562464356422,
-0.1942601501941681,
0.006165030878037214,
0.008540485054254532,
-0.058614201843738556,
0.14441728591918945,
0.028705954551696777,
0.11618482321500778,
-0.004100530408322811,
0.08499672263860703,
-0.19648362696170807,
0.011473403312265873,
0.047650501132011414,
0.00452944403514266,
0.09177328646183014,
0.04591868445277214,
0.0025606248527765274,
0.14506928622722626,
-0.09642227739095688,
0.05479328706860542,
0.02678256668150425,
-0.12495309114456177,
-0.21436306834220886,
-0.08185486495494843,
0.03439687564969063,
0.08545161038637161,
0.11497870832681656,
-0.00704431626945734,
0.12902942299842834,
-0.08128020167350769,
0.08706013113260269,
0.24161988496780396,
-0.30761340260505676,
-0.06689779460430145,
0.025999119505286217,
0.010811397805809975,
0.033626001328229904,
-0.10598018020391464,
-0.029820840805768967,
0.057460296899080276,
0.0453692227602005,
0.12635180354118347,
-0.034481607377529144,
-0.11089450865983963,
0.014401528984308243,
-0.13264597952365875,
-0.03350748494267464,
0.1629151701927185,
0.047308363020420074,
-0.03316374495625496,
-0.05942146107554436,
-0.04950757697224617,
-0.15870119631290436,
-0.0357450395822525,
-0.0067693935707211494,
0.04712338373064995,
-0.024480249732732773,
-0.046772316098213196,
-0.0004205122822895646,
-0.11031024903059006,
-0.06184318661689758,
-0.08021574467420578,
0.11916148662567139,
0.03234322369098663,
0.014026869088411331,
-0.03719112649559975,
0.11222857981920242,
0.0006609389674849808,
-0.13266374170780182,
0.013174224644899368,
0.02464505471289158,
0.006400905083864927,
-0.046067800372838974,
-0.053933948278427124,
-0.051930271089076996,
0.004880490712821484,
0.1251206398010254,
-0.03353549540042877,
0.041673459112644196,
0.05016564950346947,
0.0428360253572464,
-0.09581884741783142,
0.20034925639629364,
-0.03589331731200218,
-0.022892795503139496,
0.008429545909166336,
0.03780374675989151,
0.024145780131220818,
-0.009151838719844818,
-0.11865413933992386,
0.0003033840039279312,
0.07693354785442352,
0.004838700871914625,
-0.07210666686296463,
0.07228454202413559,
-0.05257178097963333,
-0.022996045649051666,
-0.0056682457216084,
-0.08841325342655182,
0.03352183476090431,
-0.0014572732616215944,
-0.07455167919397354,
-0.014107013121247292,
0.027440382167696953,
0.02270200289785862,
-0.014683040790259838,
0.11508310586214066,
-0.08568273484706879,
0.031704094260931015,
-0.09580230712890625,
-0.1030389666557312,
0.02285224385559559,
-0.11259704828262329,
0.03726596757769585,
-0.09101028740406036,
-0.18348291516304016,
-0.0100685004144907,
0.060151342302560806,
-0.022707665339112282,
-0.06010086089372635,
-0.05793559551239014,
-0.066010981798172,
0.01335061900317669,
-0.007147449534386396,
0.13123619556427002,
-0.06641251593828201,
0.08052718639373779,
0.02494223043322563,
0.06419709324836731,
-0.04216453805565834,
0.05222154036164284,
-0.10420840978622437,
0.013122577220201492,
-0.14495594799518585,
0.030937576666474342,
-0.03678807616233826,
0.07637365162372589,
-0.08259858191013336,
-0.09401261806488037,
0.0149227324873209,
-0.0021897535771131516,
0.05989959090948105,
0.10013262927532196,
-0.18082264065742493,
-0.08013774454593658,
0.1591133326292038,
-0.06347007304430008,
-0.13310538232326508,
0.11787702888250351,
-0.06260894984006882,
0.048464272171258926,
0.0631970688700676,
0.15259835124015808,
0.06378328055143356,
-0.08348127454519272,
-0.005408478435128927,
0.021814236417412758,
0.048930659890174866,
-0.06933651119470596,
0.07577134668827057,
0.007370618637651205,
0.005409549456089735,
0.03380436822772026,
-0.019116850569844246,
0.058931175619363785,
-0.08602556586265564,
-0.10096555203199387,
-0.045310650020837784,
-0.08115530014038086,
0.02798488922417164,
0.07748840749263763,
0.07319899648427963,
-0.09830629080533981,
-0.08539480715990067,
0.03303101658821106,
0.07598836719989777,
-0.04915986955165863,
0.028289390727877617,
-0.05608874931931496,
0.061022695153951645,
-0.04128282517194748,
-0.023319358006119728,
-0.17212943732738495,
-0.016190392896533012,
0.0005925387376919389,
-0.011388246901333332,
0.008831306360661983,
0.0268185306340456,
0.06821294873952866,
0.05501342937350273,
-0.05090034380555153,
-0.011488190852105618,
-0.030438117682933807,
-0.0028543774969875813,
-0.13574649393558502,
-0.20113439857959747,
-0.0303611159324646,
-0.024897273629903793,
0.15370577573776245,
-0.20515237748622894,
0.04379026219248772,
-0.0218726247549057,
0.06603100150823593,
0.013438552618026733,
-0.006492766551673412,
-0.043173227459192276,
0.07049788534641266,
-0.043958332389593124,
-0.05093691870570183,
0.07494747638702393,
0.01759704016149044,
-0.0997835174202919,
-0.050611332058906555,
-0.09440138190984726,
0.15558204054832458,
0.13407504558563232,
-0.10751816630363464,
-0.0719401016831398,
-0.0066538527607917786,
-0.06547510623931885,
-0.032848749309778214,
-0.0556197315454483,
0.03889739513397217,
0.2107492834329605,
-0.005632870364934206,
0.14939838647842407,
-0.06562180072069168,
-0.0491597056388855,
0.026276521384716034,
-0.03514552861452103,
0.021682914346456528,
0.12900611758232117,
0.1326981484889984,
-0.0608346089720726,
0.1453821212053299,
0.157245472073555,
-0.09012020379304886,
0.13328681886196136,
-0.039709266275167465,
-0.07490214705467224,
-0.01796800084412098,
-0.04088638722896576,
-0.005572074092924595,
0.10911056399345398,
-0.16404250264167786,
-0.006710957735776901,
0.030228465795516968,
0.014972531236708164,
0.018954835832118988,
-0.2238462120294571,
-0.045524124056100845,
0.042652279138565063,
-0.031195417046546936,
-0.024214979261159897,
-0.0070041390135884285,
0.002806958043947816,
0.10440855473279953,
0.005079586990177631,
-0.08259882777929306,
0.03672057017683983,
0.0037537789903581142,
-0.08651493489742279,
0.21775184571743011,
-0.06903107464313507,
-0.1535114049911499,
-0.12222585082054138,
-0.08019690960645676,
-0.048169177025556564,
0.0020812905859202147,
0.07224124670028687,
-0.09284645318984985,
-0.031278096139431,
-0.07695784419775009,
0.01751301996409893,
0.006551098078489304,
0.03173869475722313,
0.014846734702587128,
0.0031716942321509123,
0.06113307178020477,
-0.10139652341604233,
-0.017856748774647713,
-0.057308267802000046,
-0.050315167754888535,
0.035979967564344406,
0.03342410549521446,
0.11453315615653992,
0.14718614518642426,
-0.013780667446553707,
0.015139991417527199,
-0.032366957515478134,
0.22485680878162384,
-0.06382457911968231,
-0.033108726143836975,
0.13422982394695282,
-0.009643759578466415,
0.04004998132586479,
0.11612338572740555,
0.07465962320566177,
-0.07813198864459991,
-0.0009871404618024826,
0.036782220005989075,
-0.036834467202425,
-0.22948744893074036,
-0.04466257244348526,
-0.05993299558758736,
0.007193927187472582,
0.09785855561494827,
0.02257000282406807,
0.03244899958372116,
0.07305794954299927,
0.041135963052511215,
0.08613163977861404,
-0.0483919233083725,
0.06095427647233009,
0.10213974118232727,
0.0407949835062027,
0.12108917534351349,
-0.055966634303331375,
-0.06715616583824158,
0.0403464213013649,
-0.018208958208560944,
0.22627267241477966,
0.019024720415472984,
0.12926751375198364,
0.05745925009250641,
0.15136666595935822,
-0.004715285263955593,
0.08385144919157028,
-0.003752421122044325,
-0.05124466493725777,
-0.015357966534793377,
-0.03749721869826317,
-0.03234773501753807,
0.032910168170928955,
-0.08123943209648132,
0.07988361269235611,
-0.1318807154893875,
0.012943650595843792,
0.0551995150744915,
0.26127544045448303,
0.047058720141649246,
-0.3217346966266632,
-0.09054535627365112,
0.010688036680221558,
-0.03129878267645836,
-0.025067448616027832,
0.03157437592744827,
0.07830601185560226,
-0.09578756242990494,
0.03814588859677315,
-0.0738612711429596,
0.10083618015050888,
-0.0458424836397171,
0.05078555643558502,
0.0804804340004921,
0.07735026627779007,
0.006706494837999344,
0.09341100603342056,
-0.3036722242832184,
0.2834796905517578,
0.0035002632066607475,
0.06949956715106964,
-0.08718965202569962,
0.007772665470838547,
0.045814331620931625,
0.06404118984937668,
0.09618905931711197,
-0.01632317155599594,
-0.04745786264538765,
-0.18542560935020447,
-0.06897355616092682,
0.036088909953832626,
0.05739724636077881,
-0.03728330507874489,
0.08835294842720032,
-0.028076564893126488,
0.006294912192970514,
0.07221323996782303,
0.01991957239806652,
-0.05177241191267967,
-0.11080443114042282,
-0.01316156703978777,
0.025379929691553116,
-0.06901412457227707,
-0.05881553143262863,
-0.11750908195972443,
-0.13316702842712402,
0.15201975405216217,
-0.0310304407030344,
-0.028276722878217697,
-0.10973682999610901,
0.08481571823358536,
0.04866652935743332,
-0.09133616089820862,
0.03455067798495293,
0.006649347487837076,
0.08086611330509186,
0.02914884127676487,
-0.07710839062929153,
0.10547436028718948,
-0.0720558762550354,
-0.1527612954378128,
-0.0684044286608696,
0.10014782100915909,
0.032016582787036896,
0.06865500658750534,
-0.007832658477127552,
0.015194501727819443,
-0.05106080695986748,
-0.0897548645734787,
0.023692140355706215,
0.010981851257383823,
0.07974552363157272,
0.005882829427719116,
-0.06248464435338974,
0.017326340079307556,
-0.05921978875994682,
-0.03291132673621178,
0.20764625072479248,
0.2192559540271759,
-0.10535869002342224,
0.020500577986240387,
-0.00006998603203101084,
-0.0781852975487709,
-0.19457176327705383,
0.04168194532394409,
0.04751267284154892,
0.01738695427775383,
0.034920211881399155,
-0.17412154376506805,
0.15018044412136078,
0.111032634973526,
-0.012775520794093609,
0.09992801398038864,
-0.30901405215263367,
-0.12349440157413483,
0.14063270390033722,
0.12993454933166504,
0.12902319431304932,
-0.13090476393699646,
-0.011571654118597507,
-0.0294426828622818,
-0.14248374104499817,
0.10050363093614578,
-0.106709785759449,
0.11364344507455826,
-0.042604390531778336,
0.06998642534017563,
0.003539761994034052,
-0.0601409412920475,
0.11970093846321106,
0.027830716222524643,
0.09865162521600723,
-0.056436408311128616,
-0.037706006318330765,
0.032426685094833374,
-0.04790187627077103,
0.030735043808817863,
-0.10820221900939941,
0.02033689245581627,
-0.11749106645584106,
-0.025151778012514114,
-0.06331976503133774,
0.04986036941409111,
-0.04251127317547798,
-0.06181933358311653,
-0.03260885179042816,
0.015196210704743862,
0.05443786084651947,
-0.009022600017488003,
0.15055452287197113,
0.02123957872390747,
0.15013273060321808,
0.07863333821296692,
0.08657515048980713,
-0.07703308761119843,
-0.06609087437391281,
-0.019071929156780243,
-0.012569399550557137,
0.05087347328662872,
-0.1562030017375946,
0.022460510954260826,
0.14846327900886536,
0.02353331819176674,
0.14283978939056396,
0.08526536077260971,
-0.010798539966344833,
0.007355075795203447,
0.05790757015347481,
-0.16167131066322327,
-0.0826474279165268,
-0.017122425138950348,
-0.055330365896224976,
-0.1237967386841774,
0.04717588052153587,
0.08043020963668823,
-0.07182483375072479,
-0.010483046062290668,
-0.009834694676101208,
0.007744421251118183,
-0.06063826382160187,
0.17511682212352753,
0.046105481684207916,
0.04325881972908974,
-0.10343967378139496,
0.07066210359334946,
0.039044808596372604,
-0.0840577557682991,
0.00690052704885602,
0.06680739670991898,
-0.08035073429346085,
-0.053326625376939774,
0.08293135464191437,
0.21501411497592926,
-0.044330909848213196,
-0.04578620567917824,
-0.14233599603176117,
-0.13313572108745575,
0.08464813232421875,
0.14407436549663544,
0.11949189007282257,
0.011414795182645321,
-0.06579258292913437,
-0.0011755614541471004,
-0.1191522628068924,
0.09452519565820694,
0.047057755291461945,
0.06393074989318848,
-0.1401626169681549,
0.13182871043682098,
0.014576438814401627,
0.04705206677317619,
-0.017599385231733322,
0.02661941386759281,
-0.0982167050242424,
0.010065296664834023,
-0.11071572452783585,
-0.01638023555278778,
-0.039149753749370575,
0.00935458019375801,
-0.00443012872710824,
-0.04479186236858368,
-0.06155195087194443,
0.00842215958982706,
-0.10835433006286621,
-0.02062421478331089,
0.031824398785829544,
0.06987245380878448,
-0.09949462860822678,
-0.03449708968400955,
0.024442551657557487,
-0.06465699523687363,
0.06727047264575958,
0.04811401665210724,
0.023914756253361702,
0.052063822746276855,
-0.13652265071868896,
0.020782122388482094,
0.07181064039468765,
0.02361925318837166,
0.06701847165822983,
-0.10220653563737869,
-0.004876402206718922,
-0.0017990900669246912,
0.04090458154678345,
0.019799603149294853,
0.0603041872382164,
-0.13507509231567383,
-0.0022187200374901295,
-0.006090464070439339,
-0.08331664651632309,
-0.0679805725812912,
0.0249689519405365,
0.09742291271686554,
0.009958837181329727,
0.20324599742889404,
-0.07385322451591492,
0.051081717014312744,
-0.21548718214035034,
0.00877484492957592,
-0.01059644017368555,
-0.10749498754739761,
-0.11681066453456879,
-0.07492906600236893,
0.06093829497694969,
-0.059279702603816986,
0.15723103284835815,
0.04010610282421112,
0.03651111572980881,
0.029036229476332664,
-0.014900012873113155,
0.024790026247501373,
0.015106980688869953,
0.21222227811813354,
0.032654862850904465,
-0.040924038738012314,
0.06674925237894058,
0.04646531492471695,
0.10732819885015488,
0.13213618099689484,
0.20533587038516998,
0.142837256193161,
-0.00258012511767447,
0.10387079417705536,
0.03378888592123985,
-0.05719137564301491,
-0.1541213095188141,
0.03995395824313164,
-0.040848247706890106,
0.11034556478261948,
-0.01728212460875511,
0.21199411153793335,
0.07046006619930267,
-0.17349955439567566,
0.04224659875035286,
-0.057234689593315125,
-0.07918669283390045,
-0.1217208206653595,
-0.05066544935107231,
-0.0836598351597786,
-0.12687627971172333,
0.005300647579133511,
-0.11485069990158081,
0.0027721673250198364,
0.1179838627576828,
0.001846663886681199,
-0.02463175170123577,
0.15798072516918182,
0.009304087609052658,
0.03368942067027092,
0.0642816573381424,
0.009936983697116375,
-0.03598416596651077,
-0.1271963268518448,
-0.04774310439825058,
-0.016977321356534958,
-0.03416181355714798,
0.028776120394468307,
-0.06920942664146423,
-0.044600386172533035,
0.03728986531496048,
-0.020476868376135826,
-0.09851723909378052,
0.009807947091758251,
0.009499584324657917,
0.06362617015838623,
0.041688475757837296,
0.011955582536756992,
0.025226576253771782,
-0.006849984638392925,
0.20173397660255432,
-0.08332333713769913,
-0.05610296502709389,
-0.10735572129487991,
0.2468525618314743,
0.044899292290210724,
-0.024989979341626167,
0.02815493382513523,
-0.06256871670484543,
0.010907826945185661,
0.2521902918815613,
0.2085081934928894,
-0.06703315675258636,
-0.006053169723600149,
0.006465097889304161,
-0.008075987920165062,
-0.02444472908973694,
0.09652169048786163,
0.13940924406051636,
0.03708821162581444,
-0.10022728890180588,
-0.049726471304893494,
-0.052722834050655365,
-0.02125512808561325,
-0.034651994705200195,
0.08233179152011871,
0.05292993411421776,
-0.00044139812234789133,
-0.027730140835046768,
0.051345095038414,
-0.06320849806070328,
-0.07256685942411423,
0.06447811424732208,
-0.21257071197032928,
-0.15995584428310394,
-0.009265612810850143,
0.10035287588834763,
0.012283086776733398,
0.06838137656450272,
-0.023861100897192955,
-0.006440651137381792,
0.09381371736526489,
-0.01769033633172512,
-0.10623181611299515,
-0.0727761834859848,
0.0859944075345993,
-0.12497203052043915,
0.2244545966386795,
-0.04299304261803627,
0.0498611256480217,
0.12851198017597198,
0.072982557117939,
-0.0786944329738617,
0.060381922870874405,
0.03657121583819389,
-0.048771921545267105,
0.029440419748425484,
0.07410670071840286,
-0.03660116717219353,
0.05301995947957039,
0.045426223427057266,
-0.13402394950389862,
0.026019947603344917,
-0.06739998608827591,
-0.06193745881319046,
-0.043238814920186996,
-0.0188886895775795,
-0.052731383591890335,
0.13421235978603363,
0.2197740375995636,
-0.026442497968673706,
-0.011201264336705208,
-0.06913057714700699,
0.011395959183573723,
0.05622691288590431,
0.026137161999940872,
-0.06217151880264282,
-0.19980597496032715,
0.02065468765795231,
0.04513493925333023,
-0.020284051075577736,
-0.24929621815681458,
-0.09989801049232483,
0.002649355214089155,
-0.0853772908449173,
-0.089887835085392,
0.061325278133153915,
0.09884094446897507,
0.05472015216946602,
-0.060149237513542175,
-0.06073954328894615,
-0.06216742470860481,
0.15102964639663696,
-0.1342906653881073,
-0.0989464744925499
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-base-v2-finetuned-wnli
This model is a fine-tuned version of [albert-base-v2](https://huggingface.co/albert-base-v2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6878
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6878 | 0.5634 |
| No log | 2.0 | 80 | 0.6919 | 0.5634 |
| No log | 3.0 | 120 | 0.6877 | 0.5634 |
| No log | 4.0 | 160 | 0.6984 | 0.4085 |
| No log | 5.0 | 200 | 0.6957 | 0.5211 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
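## Example usage
A hedged evaluation sketch follows; it is not the original evaluation script, but it shows one way the reported validation accuracy could be re-checked on the GLUE WNLI validation split. The per-example loop and the absence of GPU handling are simplifications chosen for clarity.
```python
# Evaluation sketch on GLUE WNLI validation; assumes transformers, torch, and datasets are installed.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/albert-base-v2-finetuned-wnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

dataset = load_dataset("glue", "wnli", split="validation")  # columns: sentence1, sentence2, label

correct = 0
for example in dataset:
    inputs = tokenizer(example["sentence1"], example["sentence2"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == example["label"])

print(f"accuracy: {correct / len(dataset):.4f}")
```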
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-base-v2-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5633802816901409, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-base-v2-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-base-v2-finetuned-wnli
=============================
This model is a fine-tuned version of albert-base-v2 on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6878
* Accuracy: 0.5634
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10851341485977173,
0.08287949860095978,
-0.0015716948546469212,
0.12378216534852982,
0.1644035279750824,
0.034203559160232544,
0.11756833642721176,
0.1303785890340805,
-0.08961959183216095,
0.01978359930217266,
0.13182155787944794,
0.1571536362171173,
0.019342191517353058,
0.10459926724433899,
-0.050261739641427994,
-0.2606520652770996,
-0.015629388391971588,
0.05233924835920334,
-0.047430187463760376,
0.13125889003276825,
0.08640951663255692,
-0.12352532893419266,
0.10119590908288956,
0.016996311023831367,
-0.19539082050323486,
0.006376367527991533,
0.009158787317574024,
-0.0564240999519825,
0.14400716125965118,
0.03155115246772766,
0.11656567454338074,
-0.004729350097477436,
0.08431360125541687,
-0.1941358745098114,
0.011082910932600498,
0.04820818454027176,
0.0049487631767988205,
0.09291473031044006,
0.04774554818868637,
0.0033729670103639364,
0.14480124413967133,
-0.09592609107494354,
0.0522734709084034,
0.027718715369701385,
-0.12452755123376846,
-0.21515901386737823,
-0.08181779086589813,
0.03230409324169159,
0.08416493982076645,
0.11512179672718048,
-0.005395589396357536,
0.12806573510169983,
-0.08085677772760391,
0.08780300617218018,
0.23827265202999115,
-0.30694082379341125,
-0.06739146262407303,
0.025221778079867363,
0.00799495168030262,
0.032211191952228546,
-0.10505171120166779,
-0.02871882915496826,
0.05733394995331764,
0.04659018665552139,
0.12641079723834991,
-0.033901676535606384,
-0.11446501314640045,
0.014867664314806461,
-0.13147439062595367,
-0.03409222885966301,
0.16096091270446777,
0.04539426788687706,
-0.033270757645368576,
-0.05664939805865288,
-0.051134154200553894,
-0.15639479458332062,
-0.035082269459962845,
-0.004277274943888187,
0.04805569350719452,
-0.02432965487241745,
-0.0460941419005394,
-0.0016077898908406496,
-0.11016025394201279,
-0.06246815249323845,
-0.08158069103956223,
0.11988197267055511,
0.03115209750831127,
0.016247021034359932,
-0.037016041576862335,
0.11204948276281357,
0.0013566170819103718,
-0.13201206922531128,
0.012988509610295296,
0.02687028981745243,
0.00729374261572957,
-0.044786155223846436,
-0.053867775946855545,
-0.053863268345594406,
0.00430115032941103,
0.12289395928382874,
-0.04026487097144127,
0.040145393460989,
0.04916608706116676,
0.04391423612833023,
-0.09642386436462402,
0.20065082609653473,
-0.03766079246997833,
-0.018071161583065987,
0.00929968897253275,
0.03564739599823952,
0.024339625611901283,
-0.009158243425190449,
-0.11860004812479019,
-0.000343729363521561,
0.07517223805189133,
0.005142418202012777,
-0.07092933356761932,
0.07137508690357208,
-0.05376800149679184,
-0.02296004444360733,
-0.0021376493386924267,
-0.08764661103487015,
0.03296663612127304,
-0.002671253401786089,
-0.07513636350631714,
-0.015495436266064644,
0.030281322076916695,
0.021347740665078163,
-0.014448919333517551,
0.11615916341543198,
-0.08655691146850586,
0.03218672797083855,
-0.09627551585435867,
-0.10221421718597412,
0.023033635690808296,
-0.10830260813236237,
0.03779095411300659,
-0.09124257415533066,
-0.1834726780653,
-0.009110176004469395,
0.06036468595266342,
-0.025423433631658554,
-0.061499450355768204,
-0.0546928234398365,
-0.06525935232639313,
0.015102635137736797,
-0.007776893209666014,
0.13168306648731232,
-0.06625673919916153,
0.08166395872831345,
0.02689969912171364,
0.06423691660165787,
-0.04112587869167328,
0.052426837384700775,
-0.10481588542461395,
0.014568629674613476,
-0.14707937836647034,
0.031188983470201492,
-0.037114519625902176,
0.07600796222686768,
-0.08274073898792267,
-0.09499000012874603,
0.015633273869752884,
-0.0025613398756831884,
0.0618094764649868,
0.09867966920137405,
-0.17744845151901245,
-0.07985639572143555,
0.15661577880382538,
-0.06437281519174576,
-0.1325954645872116,
0.12001652270555496,
-0.0612134151160717,
0.045997507870197296,
0.06028265133500099,
0.1512797325849533,
0.06379801779985428,
-0.08271874487400055,
-0.004605399910360575,
0.02407730370759964,
0.04843881353735924,
-0.07159419357776642,
0.07669540494680405,
0.008389415219426155,
0.002951052039861679,
0.0340876542031765,
-0.019045770168304443,
0.061201177537441254,
-0.08733832836151123,
-0.10032474249601364,
-0.04622561112046242,
-0.08203887939453125,
0.028672071173787117,
0.0776790976524353,
0.07325249910354614,
-0.09814091771841049,
-0.08594372123479843,
0.03338034823536873,
0.0765068531036377,
-0.04751443490386009,
0.028013426810503006,
-0.05600306764245033,
0.06213730573654175,
-0.043482739478349686,
-0.023733915761113167,
-0.17221353948116302,
-0.017427945509552956,
-0.000443327211542055,
-0.006937176920473576,
0.009066986851394176,
0.026353659108281136,
0.06821393221616745,
0.05675322189927101,
-0.05200904235243797,
-0.010675321333110332,
-0.02982199192047119,
-0.0041316417045891285,
-0.13505974411964417,
-0.2008715569972992,
-0.03134380653500557,
-0.02398592233657837,
0.15125882625579834,
-0.20457713305950165,
0.04213083162903786,
-0.021903015673160553,
0.06635771691799164,
0.012865194119513035,
-0.0053146895952522755,
-0.04286836460232735,
0.0698426365852356,
-0.04436494782567024,
-0.05092164874076843,
0.07527109980583191,
0.019110465422272682,
-0.09808443486690521,
-0.04705537110567093,
-0.08996978402137756,
0.15877945721149445,
0.13385072350502014,
-0.1099434345960617,
-0.07223377376794815,
-0.0058516887947916985,
-0.06560415774583817,
-0.03339000791311264,
-0.05574433505535126,
0.040243301540613174,
0.21176877617835999,
-0.007056929636746645,
0.15119796991348267,
-0.0662810429930687,
-0.04895230755209923,
0.026796160265803337,
-0.03642735630273819,
0.021458491683006287,
0.12952668964862823,
0.13312320411205292,
-0.05990966781973839,
0.14652734994888306,
0.15359628200531006,
-0.09059371799230576,
0.13318632543087006,
-0.03999984264373779,
-0.07472026348114014,
-0.017710551619529724,
-0.03931165114045143,
-0.004119568970054388,
0.10875216871500015,
-0.1615610122680664,
-0.004651137627661228,
0.030832653865218163,
0.014737037010490894,
0.020087437704205513,
-0.22307056188583374,
-0.04519743472337723,
0.04278237000107765,
-0.03178243339061737,
-0.021580059081315994,
-0.007723218761384487,
0.0037444550544023514,
0.10487890243530273,
0.0055309683084487915,
-0.08151063323020935,
0.03770234063267708,
0.005515687167644501,
-0.08669986575841904,
0.21752367913722992,
-0.06947796791791916,
-0.15518920123577118,
-0.12592779099941254,
-0.07883378863334656,
-0.04955562576651573,
0.0021320621017366648,
0.07115191966295242,
-0.09382370859384537,
-0.032456617802381516,
-0.07577887177467346,
0.019918303936719894,
0.004875612910836935,
0.031803544610738754,
0.01505905669182539,
0.0028410842642188072,
0.06437543779611588,
-0.10163018107414246,
-0.016886616125702858,
-0.05557944253087044,
-0.050702955573797226,
0.036809612065553665,
0.035607509315013885,
0.11485431343317032,
0.14555047452449799,
-0.016081763431429863,
0.013742087408900261,
-0.030881134793162346,
0.22703666985034943,
-0.06183459237217903,
-0.033547043800354004,
0.13601787388324738,
-0.008285166695713997,
0.04035327211022377,
0.1131085455417633,
0.07463839650154114,
-0.07826251536607742,
-0.00111157086212188,
0.03700360655784607,
-0.03763021528720856,
-0.2307037115097046,
-0.046411383897066116,
-0.06072646379470825,
0.007775360718369484,
0.09654852747917175,
0.02273011952638626,
0.02955014817416668,
0.07200337946414948,
0.04007653146982193,
0.08878234028816223,
-0.05143848434090614,
0.059663355350494385,
0.10464063286781311,
0.04082774370908737,
0.1216021254658699,
-0.05594692751765251,
-0.06648729741573334,
0.04218735173344612,
-0.01800714246928692,
0.2236419916152954,
0.016036270186305046,
0.13117147982120514,
0.05609254539012909,
0.15269875526428223,
-0.004133033100515604,
0.0846896767616272,
-0.0062759071588516235,
-0.05142221599817276,
-0.01511458307504654,
-0.03756638243794441,
-0.03472224622964859,
0.032171182334423065,
-0.08392827957868576,
0.079414002597332,
-0.1315862387418747,
0.016221044585108757,
0.05463474988937378,
0.26152393221855164,
0.04587202146649361,
-0.321972519159317,
-0.09329250454902649,
0.009788069874048233,
-0.029937151819467545,
-0.027202531695365906,
0.03143763169646263,
0.08129655569791794,
-0.09414859861135483,
0.035591233521699905,
-0.0738305002450943,
0.10116495192050934,
-0.04594714939594269,
0.0491180457174778,
0.08238666504621506,
0.07942087948322296,
0.0074699679389595985,
0.09445410221815109,
-0.30067723989486694,
0.2845527231693268,
0.004858710337430239,
0.06776915490627289,
-0.08622097969055176,
0.008372905664145947,
0.04453708976507187,
0.06563539057970047,
0.09501554816961288,
-0.0140914935618639,
-0.04899032041430473,
-0.18717963993549347,
-0.06885536760091782,
0.03427093103528023,
0.05553438887000084,
-0.03511710464954376,
0.08751165121793747,
-0.028084883466362953,
0.0065111275762319565,
0.07207874953746796,
0.018553245812654495,
-0.04902458190917969,
-0.11115951836109161,
-0.01427517831325531,
0.026247471570968628,
-0.07107964158058167,
-0.05908683314919472,
-0.11763347685337067,
-0.1296166181564331,
0.15435954928398132,
-0.02876298874616623,
-0.02926171012222767,
-0.11168454587459564,
0.08655610680580139,
0.049155063927173615,
-0.09150857478380203,
0.0343179889023304,
0.005369671154767275,
0.08151274919509888,
0.02817639894783497,
-0.0781378522515297,
0.10608948767185211,
-0.07388927042484283,
-0.15226562321186066,
-0.06808533519506454,
0.09889303892850876,
0.030818484723567963,
0.06943147629499435,
-0.01059445645660162,
0.015278245322406292,
-0.05115121603012085,
-0.0893559604883194,
0.025140443816781044,
0.008072792552411556,
0.08026784658432007,
0.005880521144717932,
-0.06082103028893471,
0.021673541516065598,
-0.05721529945731163,
-0.032977454364299774,
0.20603495836257935,
0.21837805211544037,
-0.10593485087156296,
0.01963035576045513,
0.00011520516272867098,
-0.07776135206222534,
-0.19612984359264374,
0.04137551784515381,
0.04815450683236122,
0.018216095864772797,
0.03512553870677948,
-0.17524226009845734,
0.15107755362987518,
0.1101469025015831,
-0.014013183303177357,
0.10103233903646469,
-0.30580073595046997,
-0.12347456812858582,
0.13789689540863037,
0.12939028441905975,
0.13237634301185608,
-0.1306103616952896,
-0.01185387559235096,
-0.028198547661304474,
-0.1425543576478958,
0.09835414588451385,
-0.10393572598695755,
0.11367519944906235,
-0.044036321341991425,
0.07140891253948212,
0.0034511450212448835,
-0.06000930443406105,
0.12067489326000214,
0.025124182924628258,
0.09810183197259903,
-0.056498534977436066,
-0.034325432032346725,
0.03099016286432743,
-0.04756839945912361,
0.031243259087204933,
-0.10857679694890976,
0.023675616830587387,
-0.12081367522478104,
-0.025438150390982628,
-0.06328153610229492,
0.049994807690382004,
-0.04249459132552147,
-0.060809362679719925,
-0.03294748067855835,
0.01564195565879345,
0.05251622945070267,
-0.009419661946594715,
0.14960907399654388,
0.023017099127173424,
0.14949700236320496,
0.08569129556417465,
0.08571472764015198,
-0.07848557829856873,
-0.0662907212972641,
-0.018670717254281044,
-0.01171959936618805,
0.052121590822935104,
-0.1567266285419464,
0.0222612377256155,
0.14933155477046967,
0.023854093626141548,
0.14069582521915436,
0.08444757014513016,
-0.012314979918301105,
0.006973995827138424,
0.05782342329621315,
-0.16315408051013947,
-0.08276087045669556,
-0.01947229914367199,
-0.05304446816444397,
-0.12343880534172058,
0.044341687113046646,
0.08120200037956238,
-0.07319604605436325,
-0.01001099031418562,
-0.008966249413788319,
0.00801245216280222,
-0.060733210295438766,
0.17431361973285675,
0.04631480574607849,
0.04427378252148628,
-0.103228360414505,
0.06995489448308945,
0.04022670537233353,
-0.08173894137144089,
0.006496574729681015,
0.0679832473397255,
-0.07813875377178192,
-0.05333561822772026,
0.08378839492797852,
0.21410124003887177,
-0.04703439027070999,
-0.04646718502044678,
-0.1423557847738266,
-0.13277803361415863,
0.08483558148145676,
0.14282391965389252,
0.11973793059587479,
0.011175474151968956,
-0.0649867057800293,
-0.0028957000467926264,
-0.11970946192741394,
0.0956927016377449,
0.04556654393672943,
0.06415167450904846,
-0.1412634551525116,
0.13030587136745453,
0.014540751464664936,
0.04957909509539604,
-0.018316565081477165,
0.02747558057308197,
-0.09735246002674103,
0.01003090851008892,
-0.11197996139526367,
-0.014422730542719364,
-0.03730938956141472,
0.010056210681796074,
-0.005486046429723501,
-0.04593383148312569,
-0.06191306561231613,
0.010265232995152473,
-0.10761867463588715,
-0.020014287903904915,
0.03203214704990387,
0.06906575709581375,
-0.09926522523164749,
-0.03584988787770271,
0.025884181261062622,
-0.06411145627498627,
0.06664050370454788,
0.04775208234786987,
0.024460744112730026,
0.05203469842672348,
-0.13428330421447754,
0.020065493881702423,
0.07125937938690186,
0.023615064099431038,
0.06857342272996902,
-0.10210324823856354,
-0.0048413085751235485,
-0.0018470374634489417,
0.03993143141269684,
0.01879689283668995,
0.060570936650037766,
-0.13668784499168396,
-0.0018133589765056968,
-0.0059414212591946125,
-0.08354639261960983,
-0.06759142130613327,
0.025267822667956352,
0.09829063713550568,
0.010769153945147991,
0.20157437026500702,
-0.07453074306249619,
0.05111348256468773,
-0.21752247214317322,
0.008580947294831276,
-0.011457758024334908,
-0.10675018280744553,
-0.11847562342882156,
-0.07331645488739014,
0.060511115938425064,
-0.05910194292664528,
0.15643341839313507,
0.04336113855242729,
0.03594420477747917,
0.029215503484010696,
-0.016089623793959618,
0.025112714618444443,
0.013912186026573181,
0.20906245708465576,
0.031809382140636444,
-0.03966294229030609,
0.06834982335567474,
0.04653722792863846,
0.10755541920661926,
0.1338290572166443,
0.20528732240200043,
0.14109309017658234,
-0.0016062030335888267,
0.104256771504879,
0.03466791287064552,
-0.05762480944395065,
-0.1563846468925476,
0.03676179423928261,
-0.04096238315105438,
0.11012841761112213,
-0.017167802900075912,
0.20791314542293549,
0.07202707976102829,
-0.17344166338443756,
0.04336996749043465,
-0.05798328295350075,
-0.0792505294084549,
-0.12193125486373901,
-0.048324186354875565,
-0.08294760435819626,
-0.12893500924110413,
0.004624804016202688,
-0.11551131308078766,
0.0029855608008801937,
0.11830372363328934,
0.0003146572853438556,
-0.025349488481879234,
0.15853318572044373,
0.012229030951857567,
0.03587009757757187,
0.0623665414750576,
0.009986549615859985,
-0.033745404332876205,
-0.12574751675128937,
-0.049979232251644135,
-0.01642908900976181,
-0.03286297246813774,
0.02869069203734398,
-0.06801576167345047,
-0.043811243027448654,
0.0380096510052681,
-0.018702475354075432,
-0.0994282215833664,
0.008727424778044224,
0.010006695054471493,
0.06349524110555649,
0.04147202521562576,
0.009688003920018673,
0.02728598564863205,
-0.008197645656764507,
0.20150049030780792,
-0.08199817687273026,
-0.05192165821790695,
-0.10646841675043106,
0.24835076928138733,
0.04308316856622696,
-0.024649769067764282,
0.02875283546745777,
-0.062391892075538635,
0.00807760376483202,
0.25195541977882385,
0.20674537122249603,
-0.06871533393859863,
-0.007414078805595636,
0.005721509922295809,
-0.0078910943120718,
-0.0229549128562212,
0.09814517945051193,
0.13991482555866241,
0.039135899394750595,
-0.10182734578847885,
-0.05337280035018921,
-0.05527487024664879,
-0.020565170794725418,
-0.03373938426375389,
0.08031013607978821,
0.05193829908967018,
0.0009627835243009031,
-0.02825058251619339,
0.049038421362638474,
-0.06365194916725159,
-0.07327164709568024,
0.06637400388717651,
-0.21353663504123688,
-0.1605113446712494,
-0.009253967553377151,
0.09987538307905197,
0.011628348380327225,
0.06883668899536133,
-0.02442094497382641,
-0.005353689659386873,
0.09306224435567856,
-0.01920473948121071,
-0.106890469789505,
-0.07222694903612137,
0.08502697199583054,
-0.1229671910405159,
0.2248678207397461,
-0.042769718915224075,
0.04968447983264923,
0.12772446870803833,
0.07353010773658752,
-0.08143990486860275,
0.05894053354859352,
0.03563410043716431,
-0.05077586695551872,
0.029180480167269707,
0.07570891827344894,
-0.03623399883508682,
0.05392841622233391,
0.0446491502225399,
-0.1353415846824646,
0.023813316598534584,
-0.06738412380218506,
-0.061082519590854645,
-0.04242687672376633,
-0.020358486101031303,
-0.053828101605176926,
0.13348907232284546,
0.22147727012634277,
-0.02643861249089241,
-0.01265759114176035,
-0.06895022839307785,
0.011453851126134396,
0.0556543804705143,
0.027651680633425713,
-0.06081826612353325,
-0.20048682391643524,
0.02100181393325329,
0.04676947742700577,
-0.02104194276034832,
-0.2525334358215332,
-0.09901151061058044,
0.0026946943253278732,
-0.08497780561447144,
-0.08980201184749603,
0.06273863464593887,
0.0974934995174408,
0.05462854355573654,
-0.0601477175951004,
-0.05967121943831444,
-0.06285107880830765,
0.14960379898548126,
-0.1336909383535385,
-0.09875859320163727
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-large-v2-finetuned-rte
This model is a fine-tuned version of [albert-large-v2](https://huggingface.co/albert-large-v2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6827
- Accuracy: 0.5487
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 18 | 0.6954 | 0.5271 |
| No log | 2.0 | 36 | 0.6860 | 0.5379 |
| No log | 3.0 | 54 | 0.6827 | 0.5487 |
| No log | 4.0 | 72 | 0.7179 | 0.5235 |
| No log | 5.0 | 90 | 0.7504 | 0.5379 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
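## Reproducing the fine-tuning
The sketch below sets up a comparable run with the `Trainer` API using the hyperparameters listed above. The output directory, evaluation strategy, and metric function are assumptions for illustration; they are not taken from this card.
```python
# Fine-tuning sketch: albert-large-v2 on GLUE RTE with the hyperparameters listed above.
import numpy as np
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

base_model = "albert-large-v2"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=2)

raw = load_dataset("glue", "rte")  # columns: sentence1, sentence2, label

def tokenize(batch):
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

encoded = raw.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": float((np.argmax(logits, axis=-1) == labels).mean())}

args = TrainingArguments(
    output_dir="albert-large-v2-finetuned-rte",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumed; matches the per-epoch results table above
    seed=42,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,          # enables dynamic padding via the default data collator
    compute_metrics=compute_metrics,
)
trainer.train()
```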
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-large-v2-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.5487364620938628, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-large-v2-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-large-v2-finetuned-rte
=============================
This model is a fine-tuned version of albert-large-v2 on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6827
* Accuracy: 0.5487
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.1043897345662117,
0.09041710197925568,
-0.0018204497173428535,
0.12200300395488739,
0.16448982059955597,
0.03956865519285202,
0.13497143983840942,
0.1270168423652649,
-0.08481080830097198,
0.015455087646842003,
0.12540465593338013,
0.153660848736763,
0.023047437891364098,
0.09091728925704956,
-0.04355344548821449,
-0.26016730070114136,
-0.019337916746735573,
0.05018600821495056,
-0.060656387358903885,
0.13257810473442078,
0.08433875441551208,
-0.12124460190534592,
0.0994998961687088,
0.013166174292564392,
-0.19330553710460663,
0.006458010524511337,
0.0084362318739295,
-0.05300622433423996,
0.14907553791999817,
0.032914966344833374,
0.12039439380168915,
0.0010248207254335284,
0.08637624979019165,
-0.20333199203014374,
0.011971702799201012,
0.0485745444893837,
0.0050692204385995865,
0.09407994151115417,
0.0499265193939209,
0.008502666838467121,
0.13834148645401,
-0.0866832360625267,
0.05521291494369507,
0.032211288809776306,
-0.12968367338180542,
-0.21511566638946533,
-0.07950890064239502,
0.029673296958208084,
0.07841763645410538,
0.11043371260166168,
-0.00716405501589179,
0.12414336204528809,
-0.08651991188526154,
0.08560904860496521,
0.2336435168981552,
-0.2942045331001282,
-0.06738997250795364,
0.0342283695936203,
0.008772971108555794,
0.04296833649277687,
-0.10561344772577286,
-0.02603508159518242,
0.05543508380651474,
0.04415561258792877,
0.12440583109855652,
-0.030688507482409477,
-0.1179208979010582,
0.018727442249655724,
-0.13574853539466858,
-0.0319734551012516,
0.1635868400335312,
0.04732310399413109,
-0.03021303191781044,
-0.05170556530356407,
-0.04995705932378769,
-0.15496456623077393,
-0.03114769048988819,
-0.008811096660792828,
0.050622597336769104,
-0.024373497813940048,
-0.045188650488853455,
-0.005700754001736641,
-0.10870394110679626,
-0.06958650797605515,
-0.07608795166015625,
0.11917579174041748,
0.032009463757276535,
0.013160968199372292,
-0.03322426229715347,
0.11538581550121307,
0.002047595800831914,
-0.1271321177482605,
0.023262539878487587,
0.028598491102457047,
0.0013884284999221563,
-0.04260369762778282,
-0.05158384516835213,
-0.05187821388244629,
0.00866610649973154,
0.12119698524475098,
-0.0374576710164547,
0.03818407654762268,
0.054203156381845474,
0.04675496742129326,
-0.09873638302087784,
0.1985936313867569,
-0.0405128039419651,
-0.019729875028133392,
-0.003358252579346299,
0.046564240008592606,
0.020224222913384438,
-0.011167053133249283,
-0.11705933511257172,
0.0015681820223107934,
0.07488088309764862,
0.007234811782836914,
-0.06780674308538437,
0.06805625557899475,
-0.06067543849349022,
-0.026134150102734566,
-0.00582540687173605,
-0.08523271977901459,
0.02753078006207943,
0.0015990851679816842,
-0.07716228067874908,
-0.016031717881560326,
0.03096054121851921,
0.02303786389529705,
-0.008554721251130104,
0.11098548024892807,
-0.0875086784362793,
0.030586492270231247,
-0.09531625360250473,
-0.10716207325458527,
0.01914925128221512,
-0.10970630496740341,
0.033739835023880005,
-0.08660628646612167,
-0.17853286862373352,
-0.010206986218690872,
0.05820174887776375,
-0.02350551448762417,
-0.05964024364948273,
-0.05787455663084984,
-0.06257002800703049,
0.011450978927314281,
-0.0047058057971298695,
0.12778115272521973,
-0.06686297804117203,
0.08575877547264099,
0.026083016768097878,
0.06403693556785583,
-0.040051139891147614,
0.05534994974732399,
-0.10165484994649887,
0.011382686905562878,
-0.13711349666118622,
0.03074643574655056,
-0.04538505896925926,
0.06834545731544495,
-0.08063977211713791,
-0.09354162961244583,
0.01778559572994709,
0.0006385404267348349,
0.05734335258603096,
0.10040190070867538,
-0.18436020612716675,
-0.08850996196269989,
0.1568250209093094,
-0.06554961204528809,
-0.12835486233234406,
0.12294255197048187,
-0.058451417833566666,
0.049851201474666595,
0.06012613698840141,
0.15347108244895935,
0.0778881162405014,
-0.08005436509847641,
0.0018642847426235676,
0.025876367464661598,
0.05513768270611763,
-0.0625874325633049,
0.07856831699609756,
0.000510420766659081,
0.0018666420364752412,
0.03265516459941864,
-0.024883462116122246,
0.06353659182786942,
-0.0928104966878891,
-0.10366548597812653,
-0.04003290459513664,
-0.08430887013673782,
0.04179241508245468,
0.08037082105875015,
0.06832953542470932,
-0.09617464244365692,
-0.0834355279803276,
0.037935853004455566,
0.0807463750243187,
-0.04798172414302826,
0.022671908140182495,
-0.050730880349874496,
0.05897333472967148,
-0.037085819989442825,
-0.023646162822842598,
-0.1692752242088318,
-0.02350112423300743,
0.0016649349126964808,
-0.010917467065155506,
0.016340263187885284,
0.04113783687353134,
0.06870339065790176,
0.06370645016431808,
-0.052015453577041626,
-0.014911185018718243,
-0.04408225044608116,
-0.003138963133096695,
-0.12948457896709442,
-0.20873630046844482,
-0.03088158369064331,
-0.021439122036099434,
0.16991887986660004,
-0.20735763013362885,
0.04764413461089134,
-0.02439938299357891,
0.062108322978019714,
0.016043543815612793,
-0.007533009629696608,
-0.04192341864109039,
0.07536851614713669,
-0.041417963802814484,
-0.05027645081281662,
0.07738333940505981,
0.011750129982829094,
-0.10127293318510056,
-0.05371388792991638,
-0.09444814920425415,
0.16020287573337555,
0.13115833699703217,
-0.11465359479188919,
-0.0724000409245491,
-0.01126941293478012,
-0.06456045061349869,
-0.034696999937295914,
-0.05413107946515083,
0.03604563698172569,
0.20521071553230286,
-0.0061521041207015514,
0.148649662733078,
-0.06420578807592392,
-0.042227379977703094,
0.022919071838259697,
-0.03824517875909805,
0.023668700829148293,
0.13661155104637146,
0.13755904138088226,
-0.050012361258268356,
0.1495388299226761,
0.1569514274597168,
-0.088559590280056,
0.1428588181734085,
-0.041790347546339035,
-0.07380831241607666,
-0.01835949718952179,
-0.03915363550186157,
-0.005380677524954081,
0.11051664501428604,
-0.1627344787120819,
-0.004747296683490276,
0.027950014919042587,
0.01196958962827921,
0.01957515813410282,
-0.22983019053936005,
-0.049459058791399,
0.044974543154239655,
-0.03769618272781372,
-0.018881667405366898,
-0.009966891258955002,
0.002720503369346261,
0.10604012757539749,
0.0008937334059737623,
-0.0848451629281044,
0.03172663599252701,
0.0022889338433742523,
-0.0838480219244957,
0.2177211195230484,
-0.07052799314260483,
-0.15200303494930267,
-0.1336527019739151,
-0.07158368080854416,
-0.05252264440059662,
0.0019286591559648514,
0.06796949356794357,
-0.10265792161226273,
-0.02505035512149334,
-0.07416200637817383,
0.032283633947372437,
0.007663427852094173,
0.026632966473698616,
0.007754262536764145,
0.005139422602951527,
0.06503865122795105,
-0.1074826717376709,
-0.01233445480465889,
-0.057872917503118515,
-0.0591345876455307,
0.03848171979188919,
0.034625981003046036,
0.115079365670681,
0.15166349709033966,
-0.012620719149708748,
0.009598924778401852,
-0.029637129977345467,
0.22535473108291626,
-0.06218699738383293,
-0.03518056496977806,
0.1383657604455948,
-0.00844966433942318,
0.04050154611468315,
0.10554705560207367,
0.08095891773700714,
-0.07628611475229263,
-0.0016992816235870123,
0.04296675696969032,
-0.03480858728289604,
-0.23365439474582672,
-0.04573305323719978,
-0.05494158715009689,
0.014232792891561985,
0.09255701303482056,
0.019993748515844345,
0.029775872826576233,
0.06984301656484604,
0.03963799774646759,
0.08110816776752472,
-0.04758360981941223,
0.050262778997421265,
0.10383641719818115,
0.03471602872014046,
0.12061983346939087,
-0.053521495312452316,
-0.0673314705491066,
0.04152953252196312,
-0.01598353125154972,
0.22365815937519073,
0.02313443087041378,
0.13652832806110382,
0.06385821849107742,
0.15331554412841797,
-0.00816011056303978,
0.0783262625336647,
-0.0021609202958643436,
-0.04902302101254463,
-0.016767462715506554,
-0.040544673800468445,
-0.03478052467107773,
0.028274979442358017,
-0.07335725426673889,
0.0820470005273819,
-0.1323360651731491,
0.011274303309619427,
0.052326325327157974,
0.2542121410369873,
0.044810209423303604,
-0.31556814908981323,
-0.09052421897649765,
0.00866544246673584,
-0.02356516383588314,
-0.01863424852490425,
0.028864992782473564,
0.08615357428789139,
-0.09284189343452454,
0.02936282567679882,
-0.07169239223003387,
0.09910397231578827,
-0.054707691073417664,
0.0510663241147995,
0.08254608511924744,
0.08267651498317719,
0.005808492656797171,
0.09461677074432373,
-0.2964196503162384,
0.2862945795059204,
0.0039495741948485374,
0.05975591763854027,
-0.0789700374007225,
0.006556149106472731,
0.046888552606105804,
0.06812030076980591,
0.08470743149518967,
-0.013251593336462975,
-0.020075054839253426,
-0.2003544569015503,
-0.0688004121184349,
0.03323981165885925,
0.062028754502534866,
-0.046769145876169205,
0.08501488715410233,
-0.028460077941417694,
0.009700755588710308,
0.07592134177684784,
0.016847729682922363,
-0.05689072608947754,
-0.10987614095211029,
-0.010992318391799927,
0.0211940910667181,
-0.06817856431007385,
-0.06148962303996086,
-0.12117602676153183,
-0.13506893813610077,
0.14350582659244537,
-0.035363178700208664,
-0.025651419535279274,
-0.10787783563137054,
0.08389080315828323,
0.05256499722599983,
-0.09024346619844437,
0.03252946212887764,
0.006361968349665403,
0.07474474608898163,
0.02820555865764618,
-0.0720175951719284,
0.10513556748628616,
-0.07201699167490005,
-0.1567019373178482,
-0.06933965533971786,
0.09822496026754379,
0.036299943923950195,
0.07195328176021576,
-0.016210954636335373,
0.009944227524101734,
-0.04810528829693794,
-0.08777511119842529,
0.02948813885450363,
0.01058610063046217,
0.06848503649234772,
0.0174538716673851,
-0.06458134204149246,
0.023709574714303017,
-0.06219806522130966,
-0.03656124323606491,
0.20266254246234894,
0.23392795026302338,
-0.10307903587818146,
0.016509070992469788,
0.008817926980555058,
-0.07866012305021286,
-0.19457846879959106,
0.04412205144762993,
0.04552039876580238,
0.016301503404974937,
0.0460701584815979,
-0.18476153910160065,
0.14131374657154083,
0.11442817002534866,
-0.013105911202728748,
0.10344818979501724,
-0.3194306790828705,
-0.12042511254549026,
0.14050836861133575,
0.13497504591941833,
0.11862707883119583,
-0.13775183260440826,
-0.015473544597625732,
-0.024572154507040977,
-0.14168329536914825,
0.104047492146492,
-0.10836904495954514,
0.11888211220502853,
-0.04508076608181,
0.06266004592180252,
0.0037345942109823227,
-0.05790884792804718,
0.12771525979042053,
0.020587172359228134,
0.10045302659273148,
-0.05350383371114731,
-0.033641278743743896,
0.03071650117635727,
-0.04200958088040352,
0.022741947323083878,
-0.11080576479434967,
0.02306830883026123,
-0.11522656679153442,
-0.02256503328680992,
-0.06575943529605865,
0.04976709559559822,
-0.04724540561437607,
-0.0656195804476738,
-0.03238914906978607,
0.01894870586693287,
0.04376131296157837,
-0.009512390941381454,
0.13846907019615173,
0.019795814529061317,
0.15673676133155823,
0.08587489277124405,
0.08065654337406158,
-0.06639920175075531,
-0.07217054069042206,
-0.023511311039328575,
-0.01077340543270111,
0.05354699864983559,
-0.15230706334114075,
0.017872119322419167,
0.14700889587402344,
0.02564365789294243,
0.1493699550628662,
0.08398130536079407,
-0.016955314204096794,
0.005387521348893642,
0.05736297369003296,
-0.16045571863651276,
-0.08898655325174332,
-0.022810498252511024,
-0.057850807905197144,
-0.12351630628108978,
0.04439418017864227,
0.08429817855358124,
-0.07308221608400345,
-0.007165681105107069,
-0.007022528909146786,
0.006990774534642696,
-0.059922955930233,
0.17914733290672302,
0.05237842723727226,
0.04748550057411194,
-0.0986846312880516,
0.0733439028263092,
0.040773339569568634,
-0.07562568783760071,
-0.0016482991632074118,
0.06309086829423904,
-0.07577116042375565,
-0.05331952124834061,
0.07815593481063843,
0.21861189603805542,
-0.046605970710515976,
-0.044787194579839706,
-0.1488434225320816,
-0.13134630024433136,
0.07773518562316895,
0.1400838941335678,
0.12035413086414337,
0.011797239072620869,
-0.06410318613052368,
0.0008707083179615438,
-0.11228730529546738,
0.09598390012979507,
0.04239305108785629,
0.06328403204679489,
-0.1388140320777893,
0.1358414888381958,
0.020042940974235535,
0.04542526602745056,
-0.01682385802268982,
0.023282723501324654,
-0.10045109689235687,
0.009182396344840527,
-0.10750310868024826,
-0.024285614490509033,
-0.026428405195474625,
0.01211906410753727,
-0.005861642770469189,
-0.04758259654045105,
-0.0566759891808033,
0.005985803436487913,
-0.10838011652231216,
-0.021770738065242767,
0.03533035144209862,
0.07539134472608566,
-0.10229399800300598,
-0.03510992228984833,
0.031127886846661568,
-0.062376491725444794,
0.06643445789813995,
0.03968359902501106,
0.026856200769543648,
0.05519283562898636,
-0.14207905530929565,
0.023693231865763664,
0.06791891902685165,
0.025541288778185844,
0.06395706534385681,
-0.09913884848356247,
-0.008636538870632648,
-0.013803153298795223,
0.04345450550317764,
0.01916884072124958,
0.061185795813798904,
-0.13449762761592865,
-0.0013455058215186,
-0.010348033159971237,
-0.08741991221904755,
-0.06637092679738998,
0.027137411758303642,
0.0962412878870964,
0.012502683326601982,
0.2001606822013855,
-0.07415919005870819,
0.05315055325627327,
-0.22284531593322754,
0.007784545887261629,
-0.009375255554914474,
-0.10596857219934464,
-0.117481529712677,
-0.0776430070400238,
0.058280833065509796,
-0.061822276562452316,
0.15235701203346252,
0.0425226092338562,
0.032762084156274796,
0.025983376428484917,
-0.013488364405930042,
0.02397947758436203,
0.012583584524691105,
0.20739060640335083,
0.03827144205570221,
-0.03573043271899223,
0.06570188701152802,
0.0477876141667366,
0.10386498272418976,
0.12428651750087738,
0.20066112279891968,
0.14294204115867615,
-0.015342270024120808,
0.0958690345287323,
0.043385203927755356,
-0.06158531829714775,
-0.14614607393741608,
0.044377125799655914,
-0.034792233258485794,
0.1079014241695404,
-0.018612291663885117,
0.21717964112758636,
0.06784909218549728,
-0.17057251930236816,
0.04580388590693474,
-0.053313955664634705,
-0.08039191365242004,
-0.12306919693946838,
-0.03409576043486595,
-0.07853835076093674,
-0.1308421641588211,
0.0009298619697801769,
-0.1138087809085846,
0.0009946830105036497,
0.12857311964035034,
0.0004376996657811105,
-0.02373930811882019,
0.15620113909244537,
0.01808966137468815,
0.03327104076743126,
0.06001606956124306,
0.008834846317768097,
-0.037406764924526215,
-0.14248812198638916,
-0.055841442197561264,
-0.010784979909658432,
-0.024631042033433914,
0.024992385879158974,
-0.0701456367969513,
-0.052444834262132645,
0.0370294526219368,
-0.019407952204346657,
-0.10484905540943146,
0.010788097977638245,
0.003588929073885083,
0.05906722694635391,
0.03694666177034378,
0.008093061856925488,
0.0268037561327219,
-0.0061460998840630054,
0.20350411534309387,
-0.07954155653715134,
-0.05374414101243019,
-0.09890949726104736,
0.24820192158222198,
0.03600170463323593,
-0.019097449257969856,
0.030537579208612442,
-0.06343317031860352,
0.008436888456344604,
0.24792559444904327,
0.21281282603740692,
-0.07731661945581436,
-0.006568665150552988,
0.01074000634253025,
-0.007876424118876457,
-0.02606234699487686,
0.09869705140590668,
0.13348488509655,
0.02694064937531948,
-0.10093294084072113,
-0.04489089176058769,
-0.05451182648539543,
-0.020151840522885323,
-0.02722935564815998,
0.07710536569356918,
0.0591934435069561,
0.005201260559260845,
-0.032312966883182526,
0.052586235105991364,
-0.06001158803701401,
-0.07486385107040405,
0.0692172423005104,
-0.21624009311199188,
-0.16352620720863342,
-0.016660498455166817,
0.10600580275058746,
0.010035257786512375,
0.0684928447008133,
-0.02732844464480877,
-0.004101307597011328,
0.08941727131605148,
-0.018484266474843025,
-0.1078256294131279,
-0.08087000250816345,
0.08553116768598557,
-0.1175895631313324,
0.22102504968643188,
-0.045581746846437454,
0.05132993310689926,
0.12591637670993805,
0.06818245351314545,
-0.07157515734434128,
0.06208227574825287,
0.03936963155865669,
-0.05185438692569733,
0.022042429074645042,
0.06987792253494263,
-0.034005582332611084,
0.06092541292309761,
0.04595522955060005,
-0.13468651473522186,
0.027433451265096664,
-0.06213991343975067,
-0.06929901987314224,
-0.03811642527580261,
-0.020439451560378075,
-0.053217582404613495,
0.13125689327716827,
0.22217302024364471,
-0.02466833032667637,
-0.011345439590513706,
-0.0688401386141777,
0.009084232151508331,
0.06049910560250282,
0.030142048373818398,
-0.06082216650247574,
-0.19785021245479584,
0.018391301855444908,
0.04134780913591385,
-0.01892002485692501,
-0.26198142766952515,
-0.09846517443656921,
0.0023176679387688637,
-0.08312686532735825,
-0.08758088946342468,
0.06375978142023087,
0.09902011603116989,
0.057703327387571335,
-0.05802591145038605,
-0.06846413016319275,
-0.06341823935508728,
0.15233372151851654,
-0.13802897930145264,
-0.09735766798257828
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-large-v2-finetuned-wnli
This model is a fine-tuned version of [albert-large-v2](https://huggingface.co/albert-large-v2) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6919
- Accuracy: 0.5352
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 17 | 0.7292 | 0.4366 |
| No log | 2.0 | 34 | 0.6919 | 0.5352 |
| No log | 3.0 | 51 | 0.7084 | 0.4648 |
| No log | 4.0 | 68 | 0.7152 | 0.5352 |
| No log | 5.0 | 85 | 0.7343 | 0.5211 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
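## How to use
The snippet below is a minimal, untested sketch of how this checkpoint could be loaded for inference with the Transformers auto classes; the model id is taken from this repository, while the example sentence pair is made up for illustration and not drawn from WNLI.
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/albert-large-v2-finetuned-wnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# WNLI is a sentence-pair (entailment) task, so both sentences are encoded together.
premise = "The trophy didn't fit in the suitcase because it was too big."
hypothesis = "The trophy was too big."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```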
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-large-v2-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5352112676056338, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-large-v2-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-large-v2-finetuned-wnli
==============================
This model is a fine-tuned version of albert-large-v2 on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6919
* Accuracy: 0.5352
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.1043897345662117,
0.09041710197925568,
-0.0018204497173428535,
0.12200300395488739,
0.16448982059955597,
0.03956865519285202,
0.13497143983840942,
0.1270168423652649,
-0.08481080830097198,
0.015455087646842003,
0.12540465593338013,
0.153660848736763,
0.023047437891364098,
0.09091728925704956,
-0.04355344548821449,
-0.26016730070114136,
-0.019337916746735573,
0.05018600821495056,
-0.060656387358903885,
0.13257810473442078,
0.08433875441551208,
-0.12124460190534592,
0.0994998961687088,
0.013166174292564392,
-0.19330553710460663,
0.006458010524511337,
0.0084362318739295,
-0.05300622433423996,
0.14907553791999817,
0.032914966344833374,
0.12039439380168915,
0.0010248207254335284,
0.08637624979019165,
-0.20333199203014374,
0.011971702799201012,
0.0485745444893837,
0.0050692204385995865,
0.09407994151115417,
0.0499265193939209,
0.008502666838467121,
0.13834148645401,
-0.0866832360625267,
0.05521291494369507,
0.032211288809776306,
-0.12968367338180542,
-0.21511566638946533,
-0.07950890064239502,
0.029673296958208084,
0.07841763645410538,
0.11043371260166168,
-0.00716405501589179,
0.12414336204528809,
-0.08651991188526154,
0.08560904860496521,
0.2336435168981552,
-0.2942045331001282,
-0.06738997250795364,
0.0342283695936203,
0.008772971108555794,
0.04296833649277687,
-0.10561344772577286,
-0.02603508159518242,
0.05543508380651474,
0.04415561258792877,
0.12440583109855652,
-0.030688507482409477,
-0.1179208979010582,
0.018727442249655724,
-0.13574853539466858,
-0.0319734551012516,
0.1635868400335312,
0.04732310399413109,
-0.03021303191781044,
-0.05170556530356407,
-0.04995705932378769,
-0.15496456623077393,
-0.03114769048988819,
-0.008811096660792828,
0.050622597336769104,
-0.024373497813940048,
-0.045188650488853455,
-0.005700754001736641,
-0.10870394110679626,
-0.06958650797605515,
-0.07608795166015625,
0.11917579174041748,
0.032009463757276535,
0.013160968199372292,
-0.03322426229715347,
0.11538581550121307,
0.002047595800831914,
-0.1271321177482605,
0.023262539878487587,
0.028598491102457047,
0.0013884284999221563,
-0.04260369762778282,
-0.05158384516835213,
-0.05187821388244629,
0.00866610649973154,
0.12119698524475098,
-0.0374576710164547,
0.03818407654762268,
0.054203156381845474,
0.04675496742129326,
-0.09873638302087784,
0.1985936313867569,
-0.0405128039419651,
-0.019729875028133392,
-0.003358252579346299,
0.046564240008592606,
0.020224222913384438,
-0.011167053133249283,
-0.11705933511257172,
0.0015681820223107934,
0.07488088309764862,
0.007234811782836914,
-0.06780674308538437,
0.06805625557899475,
-0.06067543849349022,
-0.026134150102734566,
-0.00582540687173605,
-0.08523271977901459,
0.02753078006207943,
0.0015990851679816842,
-0.07716228067874908,
-0.016031717881560326,
0.03096054121851921,
0.02303786389529705,
-0.008554721251130104,
0.11098548024892807,
-0.0875086784362793,
0.030586492270231247,
-0.09531625360250473,
-0.10716207325458527,
0.01914925128221512,
-0.10970630496740341,
0.033739835023880005,
-0.08660628646612167,
-0.17853286862373352,
-0.010206986218690872,
0.05820174887776375,
-0.02350551448762417,
-0.05964024364948273,
-0.05787455663084984,
-0.06257002800703049,
0.011450978927314281,
-0.0047058057971298695,
0.12778115272521973,
-0.06686297804117203,
0.08575877547264099,
0.026083016768097878,
0.06403693556785583,
-0.040051139891147614,
0.05534994974732399,
-0.10165484994649887,
0.011382686905562878,
-0.13711349666118622,
0.03074643574655056,
-0.04538505896925926,
0.06834545731544495,
-0.08063977211713791,
-0.09354162961244583,
0.01778559572994709,
0.0006385404267348349,
0.05734335258603096,
0.10040190070867538,
-0.18436020612716675,
-0.08850996196269989,
0.1568250209093094,
-0.06554961204528809,
-0.12835486233234406,
0.12294255197048187,
-0.058451417833566666,
0.049851201474666595,
0.06012613698840141,
0.15347108244895935,
0.0778881162405014,
-0.08005436509847641,
0.0018642847426235676,
0.025876367464661598,
0.05513768270611763,
-0.0625874325633049,
0.07856831699609756,
0.000510420766659081,
0.0018666420364752412,
0.03265516459941864,
-0.024883462116122246,
0.06353659182786942,
-0.0928104966878891,
-0.10366548597812653,
-0.04003290459513664,
-0.08430887013673782,
0.04179241508245468,
0.08037082105875015,
0.06832953542470932,
-0.09617464244365692,
-0.0834355279803276,
0.037935853004455566,
0.0807463750243187,
-0.04798172414302826,
0.022671908140182495,
-0.050730880349874496,
0.05897333472967148,
-0.037085819989442825,
-0.023646162822842598,
-0.1692752242088318,
-0.02350112423300743,
0.0016649349126964808,
-0.010917467065155506,
0.016340263187885284,
0.04113783687353134,
0.06870339065790176,
0.06370645016431808,
-0.052015453577041626,
-0.014911185018718243,
-0.04408225044608116,
-0.003138963133096695,
-0.12948457896709442,
-0.20873630046844482,
-0.03088158369064331,
-0.021439122036099434,
0.16991887986660004,
-0.20735763013362885,
0.04764413461089134,
-0.02439938299357891,
0.062108322978019714,
0.016043543815612793,
-0.007533009629696608,
-0.04192341864109039,
0.07536851614713669,
-0.041417963802814484,
-0.05027645081281662,
0.07738333940505981,
0.011750129982829094,
-0.10127293318510056,
-0.05371388792991638,
-0.09444814920425415,
0.16020287573337555,
0.13115833699703217,
-0.11465359479188919,
-0.0724000409245491,
-0.01126941293478012,
-0.06456045061349869,
-0.034696999937295914,
-0.05413107946515083,
0.03604563698172569,
0.20521071553230286,
-0.0061521041207015514,
0.148649662733078,
-0.06420578807592392,
-0.042227379977703094,
0.022919071838259697,
-0.03824517875909805,
0.023668700829148293,
0.13661155104637146,
0.13755904138088226,
-0.050012361258268356,
0.1495388299226761,
0.1569514274597168,
-0.088559590280056,
0.1428588181734085,
-0.041790347546339035,
-0.07380831241607666,
-0.01835949718952179,
-0.03915363550186157,
-0.005380677524954081,
0.11051664501428604,
-0.1627344787120819,
-0.004747296683490276,
0.027950014919042587,
0.01196958962827921,
0.01957515813410282,
-0.22983019053936005,
-0.049459058791399,
0.044974543154239655,
-0.03769618272781372,
-0.018881667405366898,
-0.009966891258955002,
0.002720503369346261,
0.10604012757539749,
0.0008937334059737623,
-0.0848451629281044,
0.03172663599252701,
0.0022889338433742523,
-0.0838480219244957,
0.2177211195230484,
-0.07052799314260483,
-0.15200303494930267,
-0.1336527019739151,
-0.07158368080854416,
-0.05252264440059662,
0.0019286591559648514,
0.06796949356794357,
-0.10265792161226273,
-0.02505035512149334,
-0.07416200637817383,
0.032283633947372437,
0.007663427852094173,
0.026632966473698616,
0.007754262536764145,
0.005139422602951527,
0.06503865122795105,
-0.1074826717376709,
-0.01233445480465889,
-0.057872917503118515,
-0.0591345876455307,
0.03848171979188919,
0.034625981003046036,
0.115079365670681,
0.15166349709033966,
-0.012620719149708748,
0.009598924778401852,
-0.029637129977345467,
0.22535473108291626,
-0.06218699738383293,
-0.03518056496977806,
0.1383657604455948,
-0.00844966433942318,
0.04050154611468315,
0.10554705560207367,
0.08095891773700714,
-0.07628611475229263,
-0.0016992816235870123,
0.04296675696969032,
-0.03480858728289604,
-0.23365439474582672,
-0.04573305323719978,
-0.05494158715009689,
0.014232792891561985,
0.09255701303482056,
0.019993748515844345,
0.029775872826576233,
0.06984301656484604,
0.03963799774646759,
0.08110816776752472,
-0.04758360981941223,
0.050262778997421265,
0.10383641719818115,
0.03471602872014046,
0.12061983346939087,
-0.053521495312452316,
-0.0673314705491066,
0.04152953252196312,
-0.01598353125154972,
0.22365815937519073,
0.02313443087041378,
0.13652832806110382,
0.06385821849107742,
0.15331554412841797,
-0.00816011056303978,
0.0783262625336647,
-0.0021609202958643436,
-0.04902302101254463,
-0.016767462715506554,
-0.040544673800468445,
-0.03478052467107773,
0.028274979442358017,
-0.07335725426673889,
0.0820470005273819,
-0.1323360651731491,
0.011274303309619427,
0.052326325327157974,
0.2542121410369873,
0.044810209423303604,
-0.31556814908981323,
-0.09052421897649765,
0.00866544246673584,
-0.02356516383588314,
-0.01863424852490425,
0.028864992782473564,
0.08615357428789139,
-0.09284189343452454,
0.02936282567679882,
-0.07169239223003387,
0.09910397231578827,
-0.054707691073417664,
0.0510663241147995,
0.08254608511924744,
0.08267651498317719,
0.005808492656797171,
0.09461677074432373,
-0.2964196503162384,
0.2862945795059204,
0.0039495741948485374,
0.05975591763854027,
-0.0789700374007225,
0.006556149106472731,
0.046888552606105804,
0.06812030076980591,
0.08470743149518967,
-0.013251593336462975,
-0.020075054839253426,
-0.2003544569015503,
-0.0688004121184349,
0.03323981165885925,
0.062028754502534866,
-0.046769145876169205,
0.08501488715410233,
-0.028460077941417694,
0.009700755588710308,
0.07592134177684784,
0.016847729682922363,
-0.05689072608947754,
-0.10987614095211029,
-0.010992318391799927,
0.0211940910667181,
-0.06817856431007385,
-0.06148962303996086,
-0.12117602676153183,
-0.13506893813610077,
0.14350582659244537,
-0.035363178700208664,
-0.025651419535279274,
-0.10787783563137054,
0.08389080315828323,
0.05256499722599983,
-0.09024346619844437,
0.03252946212887764,
0.006361968349665403,
0.07474474608898163,
0.02820555865764618,
-0.0720175951719284,
0.10513556748628616,
-0.07201699167490005,
-0.1567019373178482,
-0.06933965533971786,
0.09822496026754379,
0.036299943923950195,
0.07195328176021576,
-0.016210954636335373,
0.009944227524101734,
-0.04810528829693794,
-0.08777511119842529,
0.02948813885450363,
0.01058610063046217,
0.06848503649234772,
0.0174538716673851,
-0.06458134204149246,
0.023709574714303017,
-0.06219806522130966,
-0.03656124323606491,
0.20266254246234894,
0.23392795026302338,
-0.10307903587818146,
0.016509070992469788,
0.008817926980555058,
-0.07866012305021286,
-0.19457846879959106,
0.04412205144762993,
0.04552039876580238,
0.016301503404974937,
0.0460701584815979,
-0.18476153910160065,
0.14131374657154083,
0.11442817002534866,
-0.013105911202728748,
0.10344818979501724,
-0.3194306790828705,
-0.12042511254549026,
0.14050836861133575,
0.13497504591941833,
0.11862707883119583,
-0.13775183260440826,
-0.015473544597625732,
-0.024572154507040977,
-0.14168329536914825,
0.104047492146492,
-0.10836904495954514,
0.11888211220502853,
-0.04508076608181,
0.06266004592180252,
0.0037345942109823227,
-0.05790884792804718,
0.12771525979042053,
0.020587172359228134,
0.10045302659273148,
-0.05350383371114731,
-0.033641278743743896,
0.03071650117635727,
-0.04200958088040352,
0.022741947323083878,
-0.11080576479434967,
0.02306830883026123,
-0.11522656679153442,
-0.02256503328680992,
-0.06575943529605865,
0.04976709559559822,
-0.04724540561437607,
-0.0656195804476738,
-0.03238914906978607,
0.01894870586693287,
0.04376131296157837,
-0.009512390941381454,
0.13846907019615173,
0.019795814529061317,
0.15673676133155823,
0.08587489277124405,
0.08065654337406158,
-0.06639920175075531,
-0.07217054069042206,
-0.023511311039328575,
-0.01077340543270111,
0.05354699864983559,
-0.15230706334114075,
0.017872119322419167,
0.14700889587402344,
0.02564365789294243,
0.1493699550628662,
0.08398130536079407,
-0.016955314204096794,
0.005387521348893642,
0.05736297369003296,
-0.16045571863651276,
-0.08898655325174332,
-0.022810498252511024,
-0.057850807905197144,
-0.12351630628108978,
0.04439418017864227,
0.08429817855358124,
-0.07308221608400345,
-0.007165681105107069,
-0.007022528909146786,
0.006990774534642696,
-0.059922955930233,
0.17914733290672302,
0.05237842723727226,
0.04748550057411194,
-0.0986846312880516,
0.0733439028263092,
0.040773339569568634,
-0.07562568783760071,
-0.0016482991632074118,
0.06309086829423904,
-0.07577116042375565,
-0.05331952124834061,
0.07815593481063843,
0.21861189603805542,
-0.046605970710515976,
-0.044787194579839706,
-0.1488434225320816,
-0.13134630024433136,
0.07773518562316895,
0.1400838941335678,
0.12035413086414337,
0.011797239072620869,
-0.06410318613052368,
0.0008707083179615438,
-0.11228730529546738,
0.09598390012979507,
0.04239305108785629,
0.06328403204679489,
-0.1388140320777893,
0.1358414888381958,
0.020042940974235535,
0.04542526602745056,
-0.01682385802268982,
0.023282723501324654,
-0.10045109689235687,
0.009182396344840527,
-0.10750310868024826,
-0.024285614490509033,
-0.026428405195474625,
0.01211906410753727,
-0.005861642770469189,
-0.04758259654045105,
-0.0566759891808033,
0.005985803436487913,
-0.10838011652231216,
-0.021770738065242767,
0.03533035144209862,
0.07539134472608566,
-0.10229399800300598,
-0.03510992228984833,
0.031127886846661568,
-0.062376491725444794,
0.06643445789813995,
0.03968359902501106,
0.026856200769543648,
0.05519283562898636,
-0.14207905530929565,
0.023693231865763664,
0.06791891902685165,
0.025541288778185844,
0.06395706534385681,
-0.09913884848356247,
-0.008636538870632648,
-0.013803153298795223,
0.04345450550317764,
0.01916884072124958,
0.061185795813798904,
-0.13449762761592865,
-0.0013455058215186,
-0.010348033159971237,
-0.08741991221904755,
-0.06637092679738998,
0.027137411758303642,
0.0962412878870964,
0.012502683326601982,
0.2001606822013855,
-0.07415919005870819,
0.05315055325627327,
-0.22284531593322754,
0.007784545887261629,
-0.009375255554914474,
-0.10596857219934464,
-0.117481529712677,
-0.0776430070400238,
0.058280833065509796,
-0.061822276562452316,
0.15235701203346252,
0.0425226092338562,
0.032762084156274796,
0.025983376428484917,
-0.013488364405930042,
0.02397947758436203,
0.012583584524691105,
0.20739060640335083,
0.03827144205570221,
-0.03573043271899223,
0.06570188701152802,
0.0477876141667366,
0.10386498272418976,
0.12428651750087738,
0.20066112279891968,
0.14294204115867615,
-0.015342270024120808,
0.0958690345287323,
0.043385203927755356,
-0.06158531829714775,
-0.14614607393741608,
0.044377125799655914,
-0.034792233258485794,
0.1079014241695404,
-0.018612291663885117,
0.21717964112758636,
0.06784909218549728,
-0.17057251930236816,
0.04580388590693474,
-0.053313955664634705,
-0.08039191365242004,
-0.12306919693946838,
-0.03409576043486595,
-0.07853835076093674,
-0.1308421641588211,
0.0009298619697801769,
-0.1138087809085846,
0.0009946830105036497,
0.12857311964035034,
0.0004376996657811105,
-0.02373930811882019,
0.15620113909244537,
0.01808966137468815,
0.03327104076743126,
0.06001606956124306,
0.008834846317768097,
-0.037406764924526215,
-0.14248812198638916,
-0.055841442197561264,
-0.010784979909658432,
-0.024631042033433914,
0.024992385879158974,
-0.0701456367969513,
-0.052444834262132645,
0.0370294526219368,
-0.019407952204346657,
-0.10484905540943146,
0.010788097977638245,
0.003588929073885083,
0.05906722694635391,
0.03694666177034378,
0.008093061856925488,
0.0268037561327219,
-0.0061460998840630054,
0.20350411534309387,
-0.07954155653715134,
-0.05374414101243019,
-0.09890949726104736,
0.24820192158222198,
0.03600170463323593,
-0.019097449257969856,
0.030537579208612442,
-0.06343317031860352,
0.008436888456344604,
0.24792559444904327,
0.21281282603740692,
-0.07731661945581436,
-0.006568665150552988,
0.01074000634253025,
-0.007876424118876457,
-0.02606234699487686,
0.09869705140590668,
0.13348488509655,
0.02694064937531948,
-0.10093294084072113,
-0.04489089176058769,
-0.05451182648539543,
-0.020151840522885323,
-0.02722935564815998,
0.07710536569356918,
0.0591934435069561,
0.005201260559260845,
-0.032312966883182526,
0.052586235105991364,
-0.06001158803701401,
-0.07486385107040405,
0.0692172423005104,
-0.21624009311199188,
-0.16352620720863342,
-0.016660498455166817,
0.10600580275058746,
0.010035257786512375,
0.0684928447008133,
-0.02732844464480877,
-0.004101307597011328,
0.08941727131605148,
-0.018484266474843025,
-0.1078256294131279,
-0.08087000250816345,
0.08553116768598557,
-0.1175895631313324,
0.22102504968643188,
-0.045581746846437454,
0.05132993310689926,
0.12591637670993805,
0.06818245351314545,
-0.07157515734434128,
0.06208227574825287,
0.03936963155865669,
-0.05185438692569733,
0.022042429074645042,
0.06987792253494263,
-0.034005582332611084,
0.06092541292309761,
0.04595522955060005,
-0.13468651473522186,
0.027433451265096664,
-0.06213991343975067,
-0.06929901987314224,
-0.03811642527580261,
-0.020439451560378075,
-0.053217582404613495,
0.13125689327716827,
0.22217302024364471,
-0.02466833032667637,
-0.011345439590513706,
-0.0688401386141777,
0.009084232151508331,
0.06049910560250282,
0.030142048373818398,
-0.06082216650247574,
-0.19785021245479584,
0.018391301855444908,
0.04134780913591385,
-0.01892002485692501,
-0.26198142766952515,
-0.09846517443656921,
0.0023176679387688637,
-0.08312686532735825,
-0.08758088946342468,
0.06375978142023087,
0.09902011603116989,
0.057703327387571335,
-0.05802591145038605,
-0.06846413016319275,
-0.06341823935508728,
0.15233372151851654,
-0.13802897930145264,
-0.09735766798257828
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-xlarge-v2-finetuned-mrpc
This model is a fine-tuned version of [albert-xlarge-v2](https://huggingface.co/albert-xlarge-v2) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5563
- Accuracy: 0.7132
- F1: 0.8146
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 63 | 0.6898 | 0.5221 | 0.6123 |
| No log | 2.0 | 126 | 0.6298 | 0.6838 | 0.8122 |
| No log | 3.0 | 189 | 0.6043 | 0.7010 | 0.8185 |
| No log | 4.0 | 252 | 0.5834 | 0.7010 | 0.8146 |
| No log | 5.0 | 315 | 0.5563 | 0.7132 | 0.8146 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
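## How to use
As a rough sketch (not an officially documented usage for this checkpoint), the model can be tried through the text-classification pipeline; depending on the installed Transformers version, the sentence pair can be passed as a dict with `text` and `text_pair` keys. The sentences below are made up for illustration.
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="anirudh21/albert-xlarge-v2-finetuned-mrpc",
)

# MRPC is a sentence-pair (paraphrase) task, so the two sentences are passed together.
result = classifier(
    {
        "text": "The company posted record profits this quarter.",
        "text_pair": "Quarterly earnings for the firm reached an all-time high.",
    }
)
print(result)  # a label/score prediction for the pair
```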
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy", "f1"], "model-index": [{"name": "albert-xlarge-v2-finetuned-mrpc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.7132352941176471, "name": "Accuracy"}, {"type": "f1", "value": 0.8145800316957211, "name": "F1"}]}]}]}
|
text-classification
|
anirudh21/albert-xlarge-v2-finetuned-mrpc
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-xlarge-v2-finetuned-mrpc
===============================
This model is a fine-tuned version of albert-xlarge-v2 on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
* Loss: 0.5563
* Accuracy: 0.7132
* F1: 0.8146
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10851341485977173,
0.08287949860095978,
-0.0015716948546469212,
0.12378216534852982,
0.1644035279750824,
0.034203559160232544,
0.11756833642721176,
0.1303785890340805,
-0.08961959183216095,
0.01978359930217266,
0.13182155787944794,
0.1571536362171173,
0.019342191517353058,
0.10459926724433899,
-0.050261739641427994,
-0.2606520652770996,
-0.015629388391971588,
0.05233924835920334,
-0.047430187463760376,
0.13125889003276825,
0.08640951663255692,
-0.12352532893419266,
0.10119590908288956,
0.016996311023831367,
-0.19539082050323486,
0.006376367527991533,
0.009158787317574024,
-0.0564240999519825,
0.14400716125965118,
0.03155115246772766,
0.11656567454338074,
-0.004729350097477436,
0.08431360125541687,
-0.1941358745098114,
0.011082910932600498,
0.04820818454027176,
0.0049487631767988205,
0.09291473031044006,
0.04774554818868637,
0.0033729670103639364,
0.14480124413967133,
-0.09592609107494354,
0.0522734709084034,
0.027718715369701385,
-0.12452755123376846,
-0.21515901386737823,
-0.08181779086589813,
0.03230409324169159,
0.08416493982076645,
0.11512179672718048,
-0.005395589396357536,
0.12806573510169983,
-0.08085677772760391,
0.08780300617218018,
0.23827265202999115,
-0.30694082379341125,
-0.06739146262407303,
0.025221778079867363,
0.00799495168030262,
0.032211191952228546,
-0.10505171120166779,
-0.02871882915496826,
0.05733394995331764,
0.04659018665552139,
0.12641079723834991,
-0.033901676535606384,
-0.11446501314640045,
0.014867664314806461,
-0.13147439062595367,
-0.03409222885966301,
0.16096091270446777,
0.04539426788687706,
-0.033270757645368576,
-0.05664939805865288,
-0.051134154200553894,
-0.15639479458332062,
-0.035082269459962845,
-0.004277274943888187,
0.04805569350719452,
-0.02432965487241745,
-0.0460941419005394,
-0.0016077898908406496,
-0.11016025394201279,
-0.06246815249323845,
-0.08158069103956223,
0.11988197267055511,
0.03115209750831127,
0.016247021034359932,
-0.037016041576862335,
0.11204948276281357,
0.0013566170819103718,
-0.13201206922531128,
0.012988509610295296,
0.02687028981745243,
0.00729374261572957,
-0.044786155223846436,
-0.053867775946855545,
-0.053863268345594406,
0.00430115032941103,
0.12289395928382874,
-0.04026487097144127,
0.040145393460989,
0.04916608706116676,
0.04391423612833023,
-0.09642386436462402,
0.20065082609653473,
-0.03766079246997833,
-0.018071161583065987,
0.00929968897253275,
0.03564739599823952,
0.024339625611901283,
-0.009158243425190449,
-0.11860004812479019,
-0.000343729363521561,
0.07517223805189133,
0.005142418202012777,
-0.07092933356761932,
0.07137508690357208,
-0.05376800149679184,
-0.02296004444360733,
-0.0021376493386924267,
-0.08764661103487015,
0.03296663612127304,
-0.002671253401786089,
-0.07513636350631714,
-0.015495436266064644,
0.030281322076916695,
0.021347740665078163,
-0.014448919333517551,
0.11615916341543198,
-0.08655691146850586,
0.03218672797083855,
-0.09627551585435867,
-0.10221421718597412,
0.023033635690808296,
-0.10830260813236237,
0.03779095411300659,
-0.09124257415533066,
-0.1834726780653,
-0.009110176004469395,
0.06036468595266342,
-0.025423433631658554,
-0.061499450355768204,
-0.0546928234398365,
-0.06525935232639313,
0.015102635137736797,
-0.007776893209666014,
0.13168306648731232,
-0.06625673919916153,
0.08166395872831345,
0.02689969912171364,
0.06423691660165787,
-0.04112587869167328,
0.052426837384700775,
-0.10481588542461395,
0.014568629674613476,
-0.14707937836647034,
0.031188983470201492,
-0.037114519625902176,
0.07600796222686768,
-0.08274073898792267,
-0.09499000012874603,
0.015633273869752884,
-0.0025613398756831884,
0.0618094764649868,
0.09867966920137405,
-0.17744845151901245,
-0.07985639572143555,
0.15661577880382538,
-0.06437281519174576,
-0.1325954645872116,
0.12001652270555496,
-0.0612134151160717,
0.045997507870197296,
0.06028265133500099,
0.1512797325849533,
0.06379801779985428,
-0.08271874487400055,
-0.004605399910360575,
0.02407730370759964,
0.04843881353735924,
-0.07159419357776642,
0.07669540494680405,
0.008389415219426155,
0.002951052039861679,
0.0340876542031765,
-0.019045770168304443,
0.061201177537441254,
-0.08733832836151123,
-0.10032474249601364,
-0.04622561112046242,
-0.08203887939453125,
0.028672071173787117,
0.0776790976524353,
0.07325249910354614,
-0.09814091771841049,
-0.08594372123479843,
0.03338034823536873,
0.0765068531036377,
-0.04751443490386009,
0.028013426810503006,
-0.05600306764245033,
0.06213730573654175,
-0.043482739478349686,
-0.023733915761113167,
-0.17221353948116302,
-0.017427945509552956,
-0.000443327211542055,
-0.006937176920473576,
0.009066986851394176,
0.026353659108281136,
0.06821393221616745,
0.05675322189927101,
-0.05200904235243797,
-0.010675321333110332,
-0.02982199192047119,
-0.0041316417045891285,
-0.13505974411964417,
-0.2008715569972992,
-0.03134380653500557,
-0.02398592233657837,
0.15125882625579834,
-0.20457713305950165,
0.04213083162903786,
-0.021903015673160553,
0.06635771691799164,
0.012865194119513035,
-0.0053146895952522755,
-0.04286836460232735,
0.0698426365852356,
-0.04436494782567024,
-0.05092164874076843,
0.07527109980583191,
0.019110465422272682,
-0.09808443486690521,
-0.04705537110567093,
-0.08996978402137756,
0.15877945721149445,
0.13385072350502014,
-0.1099434345960617,
-0.07223377376794815,
-0.0058516887947916985,
-0.06560415774583817,
-0.03339000791311264,
-0.05574433505535126,
0.040243301540613174,
0.21176877617835999,
-0.007056929636746645,
0.15119796991348267,
-0.0662810429930687,
-0.04895230755209923,
0.026796160265803337,
-0.03642735630273819,
0.021458491683006287,
0.12952668964862823,
0.13312320411205292,
-0.05990966781973839,
0.14652734994888306,
0.15359628200531006,
-0.09059371799230576,
0.13318632543087006,
-0.03999984264373779,
-0.07472026348114014,
-0.017710551619529724,
-0.03931165114045143,
-0.004119568970054388,
0.10875216871500015,
-0.1615610122680664,
-0.004651137627661228,
0.030832653865218163,
0.014737037010490894,
0.020087437704205513,
-0.22307056188583374,
-0.04519743472337723,
0.04278237000107765,
-0.03178243339061737,
-0.021580059081315994,
-0.007723218761384487,
0.0037444550544023514,
0.10487890243530273,
0.0055309683084487915,
-0.08151063323020935,
0.03770234063267708,
0.005515687167644501,
-0.08669986575841904,
0.21752367913722992,
-0.06947796791791916,
-0.15518920123577118,
-0.12592779099941254,
-0.07883378863334656,
-0.04955562576651573,
0.0021320621017366648,
0.07115191966295242,
-0.09382370859384537,
-0.032456617802381516,
-0.07577887177467346,
0.019918303936719894,
0.004875612910836935,
0.031803544610738754,
0.01505905669182539,
0.0028410842642188072,
0.06437543779611588,
-0.10163018107414246,
-0.016886616125702858,
-0.05557944253087044,
-0.050702955573797226,
0.036809612065553665,
0.035607509315013885,
0.11485431343317032,
0.14555047452449799,
-0.016081763431429863,
0.013742087408900261,
-0.030881134793162346,
0.22703666985034943,
-0.06183459237217903,
-0.033547043800354004,
0.13601787388324738,
-0.008285166695713997,
0.04035327211022377,
0.1131085455417633,
0.07463839650154114,
-0.07826251536607742,
-0.00111157086212188,
0.03700360655784607,
-0.03763021528720856,
-0.2307037115097046,
-0.046411383897066116,
-0.06072646379470825,
0.007775360718369484,
0.09654852747917175,
0.02273011952638626,
0.02955014817416668,
0.07200337946414948,
0.04007653146982193,
0.08878234028816223,
-0.05143848434090614,
0.059663355350494385,
0.10464063286781311,
0.04082774370908737,
0.1216021254658699,
-0.05594692751765251,
-0.06648729741573334,
0.04218735173344612,
-0.01800714246928692,
0.2236419916152954,
0.016036270186305046,
0.13117147982120514,
0.05609254539012909,
0.15269875526428223,
-0.004133033100515604,
0.0846896767616272,
-0.0062759071588516235,
-0.05142221599817276,
-0.01511458307504654,
-0.03756638243794441,
-0.03472224622964859,
0.032171182334423065,
-0.08392827957868576,
0.079414002597332,
-0.1315862387418747,
0.016221044585108757,
0.05463474988937378,
0.26152393221855164,
0.04587202146649361,
-0.321972519159317,
-0.09329250454902649,
0.009788069874048233,
-0.029937151819467545,
-0.027202531695365906,
0.03143763169646263,
0.08129655569791794,
-0.09414859861135483,
0.035591233521699905,
-0.0738305002450943,
0.10116495192050934,
-0.04594714939594269,
0.0491180457174778,
0.08238666504621506,
0.07942087948322296,
0.0074699679389595985,
0.09445410221815109,
-0.30067723989486694,
0.2845527231693268,
0.004858710337430239,
0.06776915490627289,
-0.08622097969055176,
0.008372905664145947,
0.04453708976507187,
0.06563539057970047,
0.09501554816961288,
-0.0140914935618639,
-0.04899032041430473,
-0.18717963993549347,
-0.06885536760091782,
0.03427093103528023,
0.05553438887000084,
-0.03511710464954376,
0.08751165121793747,
-0.028084883466362953,
0.0065111275762319565,
0.07207874953746796,
0.018553245812654495,
-0.04902458190917969,
-0.11115951836109161,
-0.01427517831325531,
0.026247471570968628,
-0.07107964158058167,
-0.05908683314919472,
-0.11763347685337067,
-0.1296166181564331,
0.15435954928398132,
-0.02876298874616623,
-0.02926171012222767,
-0.11168454587459564,
0.08655610680580139,
0.049155063927173615,
-0.09150857478380203,
0.0343179889023304,
0.005369671154767275,
0.08151274919509888,
0.02817639894783497,
-0.0781378522515297,
0.10608948767185211,
-0.07388927042484283,
-0.15226562321186066,
-0.06808533519506454,
0.09889303892850876,
0.030818484723567963,
0.06943147629499435,
-0.01059445645660162,
0.015278245322406292,
-0.05115121603012085,
-0.0893559604883194,
0.025140443816781044,
0.008072792552411556,
0.08026784658432007,
0.005880521144717932,
-0.06082103028893471,
0.021673541516065598,
-0.05721529945731163,
-0.032977454364299774,
0.20603495836257935,
0.21837805211544037,
-0.10593485087156296,
0.01963035576045513,
0.00011520516272867098,
-0.07776135206222534,
-0.19612984359264374,
0.04137551784515381,
0.04815450683236122,
0.018216095864772797,
0.03512553870677948,
-0.17524226009845734,
0.15107755362987518,
0.1101469025015831,
-0.014013183303177357,
0.10103233903646469,
-0.30580073595046997,
-0.12347456812858582,
0.13789689540863037,
0.12939028441905975,
0.13237634301185608,
-0.1306103616952896,
-0.01185387559235096,
-0.028198547661304474,
-0.1425543576478958,
0.09835414588451385,
-0.10393572598695755,
0.11367519944906235,
-0.044036321341991425,
0.07140891253948212,
0.0034511450212448835,
-0.06000930443406105,
0.12067489326000214,
0.025124182924628258,
0.09810183197259903,
-0.056498534977436066,
-0.034325432032346725,
0.03099016286432743,
-0.04756839945912361,
0.031243259087204933,
-0.10857679694890976,
0.023675616830587387,
-0.12081367522478104,
-0.025438150390982628,
-0.06328153610229492,
0.049994807690382004,
-0.04249459132552147,
-0.060809362679719925,
-0.03294748067855835,
0.01564195565879345,
0.05251622945070267,
-0.009419661946594715,
0.14960907399654388,
0.023017099127173424,
0.14949700236320496,
0.08569129556417465,
0.08571472764015198,
-0.07848557829856873,
-0.0662907212972641,
-0.018670717254281044,
-0.01171959936618805,
0.052121590822935104,
-0.1567266285419464,
0.0222612377256155,
0.14933155477046967,
0.023854093626141548,
0.14069582521915436,
0.08444757014513016,
-0.012314979918301105,
0.006973995827138424,
0.05782342329621315,
-0.16315408051013947,
-0.08276087045669556,
-0.01947229914367199,
-0.05304446816444397,
-0.12343880534172058,
0.044341687113046646,
0.08120200037956238,
-0.07319604605436325,
-0.01001099031418562,
-0.008966249413788319,
0.00801245216280222,
-0.060733210295438766,
0.17431361973285675,
0.04631480574607849,
0.04427378252148628,
-0.103228360414505,
0.06995489448308945,
0.04022670537233353,
-0.08173894137144089,
0.006496574729681015,
0.0679832473397255,
-0.07813875377178192,
-0.05333561822772026,
0.08378839492797852,
0.21410124003887177,
-0.04703439027070999,
-0.04646718502044678,
-0.1423557847738266,
-0.13277803361415863,
0.08483558148145676,
0.14282391965389252,
0.11973793059587479,
0.011175474151968956,
-0.0649867057800293,
-0.0028957000467926264,
-0.11970946192741394,
0.0956927016377449,
0.04556654393672943,
0.06415167450904846,
-0.1412634551525116,
0.13030587136745453,
0.014540751464664936,
0.04957909509539604,
-0.018316565081477165,
0.02747558057308197,
-0.09735246002674103,
0.01003090851008892,
-0.11197996139526367,
-0.014422730542719364,
-0.03730938956141472,
0.010056210681796074,
-0.005486046429723501,
-0.04593383148312569,
-0.06191306561231613,
0.010265232995152473,
-0.10761867463588715,
-0.020014287903904915,
0.03203214704990387,
0.06906575709581375,
-0.09926522523164749,
-0.03584988787770271,
0.025884181261062622,
-0.06411145627498627,
0.06664050370454788,
0.04775208234786987,
0.024460744112730026,
0.05203469842672348,
-0.13428330421447754,
0.020065493881702423,
0.07125937938690186,
0.023615064099431038,
0.06857342272996902,
-0.10210324823856354,
-0.0048413085751235485,
-0.0018470374634489417,
0.03993143141269684,
0.01879689283668995,
0.060570936650037766,
-0.13668784499168396,
-0.0018133589765056968,
-0.0059414212591946125,
-0.08354639261960983,
-0.06759142130613327,
0.025267822667956352,
0.09829063713550568,
0.010769153945147991,
0.20157437026500702,
-0.07453074306249619,
0.05111348256468773,
-0.21752247214317322,
0.008580947294831276,
-0.011457758024334908,
-0.10675018280744553,
-0.11847562342882156,
-0.07331645488739014,
0.060511115938425064,
-0.05910194292664528,
0.15643341839313507,
0.04336113855242729,
0.03594420477747917,
0.029215503484010696,
-0.016089623793959618,
0.025112714618444443,
0.013912186026573181,
0.20906245708465576,
0.031809382140636444,
-0.03966294229030609,
0.06834982335567474,
0.04653722792863846,
0.10755541920661926,
0.1338290572166443,
0.20528732240200043,
0.14109309017658234,
-0.0016062030335888267,
0.104256771504879,
0.03466791287064552,
-0.05762480944395065,
-0.1563846468925476,
0.03676179423928261,
-0.04096238315105438,
0.11012841761112213,
-0.017167802900075912,
0.20791314542293549,
0.07202707976102829,
-0.17344166338443756,
0.04336996749043465,
-0.05798328295350075,
-0.0792505294084549,
-0.12193125486373901,
-0.048324186354875565,
-0.08294760435819626,
-0.12893500924110413,
0.004624804016202688,
-0.11551131308078766,
0.0029855608008801937,
0.11830372363328934,
0.0003146572853438556,
-0.025349488481879234,
0.15853318572044373,
0.012229030951857567,
0.03587009757757187,
0.0623665414750576,
0.009986549615859985,
-0.033745404332876205,
-0.12574751675128937,
-0.049979232251644135,
-0.01642908900976181,
-0.03286297246813774,
0.02869069203734398,
-0.06801576167345047,
-0.043811243027448654,
0.0380096510052681,
-0.018702475354075432,
-0.0994282215833664,
0.008727424778044224,
0.010006695054471493,
0.06349524110555649,
0.04147202521562576,
0.009688003920018673,
0.02728598564863205,
-0.008197645656764507,
0.20150049030780792,
-0.08199817687273026,
-0.05192165821790695,
-0.10646841675043106,
0.24835076928138733,
0.04308316856622696,
-0.024649769067764282,
0.02875283546745777,
-0.062391892075538635,
0.00807760376483202,
0.25195541977882385,
0.20674537122249603,
-0.06871533393859863,
-0.007414078805595636,
0.005721509922295809,
-0.0078910943120718,
-0.0229549128562212,
0.09814517945051193,
0.13991482555866241,
0.039135899394750595,
-0.10182734578847885,
-0.05337280035018921,
-0.05527487024664879,
-0.020565170794725418,
-0.03373938426375389,
0.08031013607978821,
0.05193829908967018,
0.0009627835243009031,
-0.02825058251619339,
0.049038421362638474,
-0.06365194916725159,
-0.07327164709568024,
0.06637400388717651,
-0.21353663504123688,
-0.1605113446712494,
-0.009253967553377151,
0.09987538307905197,
0.011628348380327225,
0.06883668899536133,
-0.02442094497382641,
-0.005353689659386873,
0.09306224435567856,
-0.01920473948121071,
-0.106890469789505,
-0.07222694903612137,
0.08502697199583054,
-0.1229671910405159,
0.2248678207397461,
-0.042769718915224075,
0.04968447983264923,
0.12772446870803833,
0.07353010773658752,
-0.08143990486860275,
0.05894053354859352,
0.03563410043716431,
-0.05077586695551872,
0.029180480167269707,
0.07570891827344894,
-0.03623399883508682,
0.05392841622233391,
0.0446491502225399,
-0.1353415846824646,
0.023813316598534584,
-0.06738412380218506,
-0.061082519590854645,
-0.04242687672376633,
-0.020358486101031303,
-0.053828101605176926,
0.13348907232284546,
0.22147727012634277,
-0.02643861249089241,
-0.01265759114176035,
-0.06895022839307785,
0.011453851126134396,
0.0556543804705143,
0.027651680633425713,
-0.06081826612353325,
-0.20048682391643524,
0.02100181393325329,
0.04676947742700577,
-0.02104194276034832,
-0.2525334358215332,
-0.09901151061058044,
0.0026946943253278732,
-0.08497780561447144,
-0.08980201184749603,
0.06273863464593887,
0.0974934995174408,
0.05462854355573654,
-0.0601477175951004,
-0.05967121943831444,
-0.06285107880830765,
0.14960379898548126,
-0.1336909383535385,
-0.09875859320163727
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-xlarge-v2-finetuned-wnli
This model is a fine-tuned version of [albert-xlarge-v2](https://huggingface.co/albert-xlarge-v2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6869
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
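As a rough illustration only (this is not an official usage recipe from the model author), the published checkpoint can be loaded like any other `transformers` sequence-classification model. WNLI is a sentence-pair task, so both sentences are encoded together; the example sentences and the reading of the 0/1 labels follow the GLUE WNLI convention and are assumptions here.
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "anirudh21/albert-xlarge-v2-finetuned-wnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Encode the premise/hypothesis pair together, as for any GLUE WNLI example.
inputs = tokenizer(
    "The cat sat on the mat.",
    "The mat was sat on by the cat.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

print(logits.argmax(dim=-1).item())  # predicted class index (0 or 1)
```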
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
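These values map directly onto the `transformers` Trainer API. A minimal sketch of an equivalent setup is shown below; it is a reconstruction for illustration only (the actual training script is not part of this card), and the output directory and preprocessing details are assumptions.
```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("albert-xlarge-v2")
model = AutoModelForSequenceClassification.from_pretrained("albert-xlarge-v2", num_labels=2)

# GLUE WNLI provides sentence1/sentence2 pairs with a binary label.
wnli = load_dataset("glue", "wnli")
encoded = wnli.map(
    lambda ex: tokenizer(ex["sentence1"], ex["sentence2"], truncation=True),
    batched=True,
)

args = TrainingArguments(
    output_dir="albert-xlarge-v2-finetuned-wnli",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",  # the Adam betas/epsilon listed above are the defaults
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```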
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6906 | 0.5070 |
| No log | 2.0 | 80 | 0.6869 | 0.5634 |
| No log | 3.0 | 120 | 0.6905 | 0.5352 |
| No log | 4.0 | 160 | 0.6960 | 0.4225 |
| No log | 5.0 | 200 | 0.7011 | 0.3803 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-xlarge-v2-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5633802816901409, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-xlarge-v2-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-xlarge-v2-finetuned-wnli
===============================
This model is a fine-tuned version of albert-xlarge-v2 on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6869
* Accuracy: 0.5634
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10851341485977173,
0.08287949860095978,
-0.0015716948546469212,
0.12378216534852982,
0.1644035279750824,
0.034203559160232544,
0.11756833642721176,
0.1303785890340805,
-0.08961959183216095,
0.01978359930217266,
0.13182155787944794,
0.1571536362171173,
0.019342191517353058,
0.10459926724433899,
-0.050261739641427994,
-0.2606520652770996,
-0.015629388391971588,
0.05233924835920334,
-0.047430187463760376,
0.13125889003276825,
0.08640951663255692,
-0.12352532893419266,
0.10119590908288956,
0.016996311023831367,
-0.19539082050323486,
0.006376367527991533,
0.009158787317574024,
-0.0564240999519825,
0.14400716125965118,
0.03155115246772766,
0.11656567454338074,
-0.004729350097477436,
0.08431360125541687,
-0.1941358745098114,
0.011082910932600498,
0.04820818454027176,
0.0049487631767988205,
0.09291473031044006,
0.04774554818868637,
0.0033729670103639364,
0.14480124413967133,
-0.09592609107494354,
0.0522734709084034,
0.027718715369701385,
-0.12452755123376846,
-0.21515901386737823,
-0.08181779086589813,
0.03230409324169159,
0.08416493982076645,
0.11512179672718048,
-0.005395589396357536,
0.12806573510169983,
-0.08085677772760391,
0.08780300617218018,
0.23827265202999115,
-0.30694082379341125,
-0.06739146262407303,
0.025221778079867363,
0.00799495168030262,
0.032211191952228546,
-0.10505171120166779,
-0.02871882915496826,
0.05733394995331764,
0.04659018665552139,
0.12641079723834991,
-0.033901676535606384,
-0.11446501314640045,
0.014867664314806461,
-0.13147439062595367,
-0.03409222885966301,
0.16096091270446777,
0.04539426788687706,
-0.033270757645368576,
-0.05664939805865288,
-0.051134154200553894,
-0.15639479458332062,
-0.035082269459962845,
-0.004277274943888187,
0.04805569350719452,
-0.02432965487241745,
-0.0460941419005394,
-0.0016077898908406496,
-0.11016025394201279,
-0.06246815249323845,
-0.08158069103956223,
0.11988197267055511,
0.03115209750831127,
0.016247021034359932,
-0.037016041576862335,
0.11204948276281357,
0.0013566170819103718,
-0.13201206922531128,
0.012988509610295296,
0.02687028981745243,
0.00729374261572957,
-0.044786155223846436,
-0.053867775946855545,
-0.053863268345594406,
0.00430115032941103,
0.12289395928382874,
-0.04026487097144127,
0.040145393460989,
0.04916608706116676,
0.04391423612833023,
-0.09642386436462402,
0.20065082609653473,
-0.03766079246997833,
-0.018071161583065987,
0.00929968897253275,
0.03564739599823952,
0.024339625611901283,
-0.009158243425190449,
-0.11860004812479019,
-0.000343729363521561,
0.07517223805189133,
0.005142418202012777,
-0.07092933356761932,
0.07137508690357208,
-0.05376800149679184,
-0.02296004444360733,
-0.0021376493386924267,
-0.08764661103487015,
0.03296663612127304,
-0.002671253401786089,
-0.07513636350631714,
-0.015495436266064644,
0.030281322076916695,
0.021347740665078163,
-0.014448919333517551,
0.11615916341543198,
-0.08655691146850586,
0.03218672797083855,
-0.09627551585435867,
-0.10221421718597412,
0.023033635690808296,
-0.10830260813236237,
0.03779095411300659,
-0.09124257415533066,
-0.1834726780653,
-0.009110176004469395,
0.06036468595266342,
-0.025423433631658554,
-0.061499450355768204,
-0.0546928234398365,
-0.06525935232639313,
0.015102635137736797,
-0.007776893209666014,
0.13168306648731232,
-0.06625673919916153,
0.08166395872831345,
0.02689969912171364,
0.06423691660165787,
-0.04112587869167328,
0.052426837384700775,
-0.10481588542461395,
0.014568629674613476,
-0.14707937836647034,
0.031188983470201492,
-0.037114519625902176,
0.07600796222686768,
-0.08274073898792267,
-0.09499000012874603,
0.015633273869752884,
-0.0025613398756831884,
0.0618094764649868,
0.09867966920137405,
-0.17744845151901245,
-0.07985639572143555,
0.15661577880382538,
-0.06437281519174576,
-0.1325954645872116,
0.12001652270555496,
-0.0612134151160717,
0.045997507870197296,
0.06028265133500099,
0.1512797325849533,
0.06379801779985428,
-0.08271874487400055,
-0.004605399910360575,
0.02407730370759964,
0.04843881353735924,
-0.07159419357776642,
0.07669540494680405,
0.008389415219426155,
0.002951052039861679,
0.0340876542031765,
-0.019045770168304443,
0.061201177537441254,
-0.08733832836151123,
-0.10032474249601364,
-0.04622561112046242,
-0.08203887939453125,
0.028672071173787117,
0.0776790976524353,
0.07325249910354614,
-0.09814091771841049,
-0.08594372123479843,
0.03338034823536873,
0.0765068531036377,
-0.04751443490386009,
0.028013426810503006,
-0.05600306764245033,
0.06213730573654175,
-0.043482739478349686,
-0.023733915761113167,
-0.17221353948116302,
-0.017427945509552956,
-0.000443327211542055,
-0.006937176920473576,
0.009066986851394176,
0.026353659108281136,
0.06821393221616745,
0.05675322189927101,
-0.05200904235243797,
-0.010675321333110332,
-0.02982199192047119,
-0.0041316417045891285,
-0.13505974411964417,
-0.2008715569972992,
-0.03134380653500557,
-0.02398592233657837,
0.15125882625579834,
-0.20457713305950165,
0.04213083162903786,
-0.021903015673160553,
0.06635771691799164,
0.012865194119513035,
-0.0053146895952522755,
-0.04286836460232735,
0.0698426365852356,
-0.04436494782567024,
-0.05092164874076843,
0.07527109980583191,
0.019110465422272682,
-0.09808443486690521,
-0.04705537110567093,
-0.08996978402137756,
0.15877945721149445,
0.13385072350502014,
-0.1099434345960617,
-0.07223377376794815,
-0.0058516887947916985,
-0.06560415774583817,
-0.03339000791311264,
-0.05574433505535126,
0.040243301540613174,
0.21176877617835999,
-0.007056929636746645,
0.15119796991348267,
-0.0662810429930687,
-0.04895230755209923,
0.026796160265803337,
-0.03642735630273819,
0.021458491683006287,
0.12952668964862823,
0.13312320411205292,
-0.05990966781973839,
0.14652734994888306,
0.15359628200531006,
-0.09059371799230576,
0.13318632543087006,
-0.03999984264373779,
-0.07472026348114014,
-0.017710551619529724,
-0.03931165114045143,
-0.004119568970054388,
0.10875216871500015,
-0.1615610122680664,
-0.004651137627661228,
0.030832653865218163,
0.014737037010490894,
0.020087437704205513,
-0.22307056188583374,
-0.04519743472337723,
0.04278237000107765,
-0.03178243339061737,
-0.021580059081315994,
-0.007723218761384487,
0.0037444550544023514,
0.10487890243530273,
0.0055309683084487915,
-0.08151063323020935,
0.03770234063267708,
0.005515687167644501,
-0.08669986575841904,
0.21752367913722992,
-0.06947796791791916,
-0.15518920123577118,
-0.12592779099941254,
-0.07883378863334656,
-0.04955562576651573,
0.0021320621017366648,
0.07115191966295242,
-0.09382370859384537,
-0.032456617802381516,
-0.07577887177467346,
0.019918303936719894,
0.004875612910836935,
0.031803544610738754,
0.01505905669182539,
0.0028410842642188072,
0.06437543779611588,
-0.10163018107414246,
-0.016886616125702858,
-0.05557944253087044,
-0.050702955573797226,
0.036809612065553665,
0.035607509315013885,
0.11485431343317032,
0.14555047452449799,
-0.016081763431429863,
0.013742087408900261,
-0.030881134793162346,
0.22703666985034943,
-0.06183459237217903,
-0.033547043800354004,
0.13601787388324738,
-0.008285166695713997,
0.04035327211022377,
0.1131085455417633,
0.07463839650154114,
-0.07826251536607742,
-0.00111157086212188,
0.03700360655784607,
-0.03763021528720856,
-0.2307037115097046,
-0.046411383897066116,
-0.06072646379470825,
0.007775360718369484,
0.09654852747917175,
0.02273011952638626,
0.02955014817416668,
0.07200337946414948,
0.04007653146982193,
0.08878234028816223,
-0.05143848434090614,
0.059663355350494385,
0.10464063286781311,
0.04082774370908737,
0.1216021254658699,
-0.05594692751765251,
-0.06648729741573334,
0.04218735173344612,
-0.01800714246928692,
0.2236419916152954,
0.016036270186305046,
0.13117147982120514,
0.05609254539012909,
0.15269875526428223,
-0.004133033100515604,
0.0846896767616272,
-0.0062759071588516235,
-0.05142221599817276,
-0.01511458307504654,
-0.03756638243794441,
-0.03472224622964859,
0.032171182334423065,
-0.08392827957868576,
0.079414002597332,
-0.1315862387418747,
0.016221044585108757,
0.05463474988937378,
0.26152393221855164,
0.04587202146649361,
-0.321972519159317,
-0.09329250454902649,
0.009788069874048233,
-0.029937151819467545,
-0.027202531695365906,
0.03143763169646263,
0.08129655569791794,
-0.09414859861135483,
0.035591233521699905,
-0.0738305002450943,
0.10116495192050934,
-0.04594714939594269,
0.0491180457174778,
0.08238666504621506,
0.07942087948322296,
0.0074699679389595985,
0.09445410221815109,
-0.30067723989486694,
0.2845527231693268,
0.004858710337430239,
0.06776915490627289,
-0.08622097969055176,
0.008372905664145947,
0.04453708976507187,
0.06563539057970047,
0.09501554816961288,
-0.0140914935618639,
-0.04899032041430473,
-0.18717963993549347,
-0.06885536760091782,
0.03427093103528023,
0.05553438887000084,
-0.03511710464954376,
0.08751165121793747,
-0.028084883466362953,
0.0065111275762319565,
0.07207874953746796,
0.018553245812654495,
-0.04902458190917969,
-0.11115951836109161,
-0.01427517831325531,
0.026247471570968628,
-0.07107964158058167,
-0.05908683314919472,
-0.11763347685337067,
-0.1296166181564331,
0.15435954928398132,
-0.02876298874616623,
-0.02926171012222767,
-0.11168454587459564,
0.08655610680580139,
0.049155063927173615,
-0.09150857478380203,
0.0343179889023304,
0.005369671154767275,
0.08151274919509888,
0.02817639894783497,
-0.0781378522515297,
0.10608948767185211,
-0.07388927042484283,
-0.15226562321186066,
-0.06808533519506454,
0.09889303892850876,
0.030818484723567963,
0.06943147629499435,
-0.01059445645660162,
0.015278245322406292,
-0.05115121603012085,
-0.0893559604883194,
0.025140443816781044,
0.008072792552411556,
0.08026784658432007,
0.005880521144717932,
-0.06082103028893471,
0.021673541516065598,
-0.05721529945731163,
-0.032977454364299774,
0.20603495836257935,
0.21837805211544037,
-0.10593485087156296,
0.01963035576045513,
0.00011520516272867098,
-0.07776135206222534,
-0.19612984359264374,
0.04137551784515381,
0.04815450683236122,
0.018216095864772797,
0.03512553870677948,
-0.17524226009845734,
0.15107755362987518,
0.1101469025015831,
-0.014013183303177357,
0.10103233903646469,
-0.30580073595046997,
-0.12347456812858582,
0.13789689540863037,
0.12939028441905975,
0.13237634301185608,
-0.1306103616952896,
-0.01185387559235096,
-0.028198547661304474,
-0.1425543576478958,
0.09835414588451385,
-0.10393572598695755,
0.11367519944906235,
-0.044036321341991425,
0.07140891253948212,
0.0034511450212448835,
-0.06000930443406105,
0.12067489326000214,
0.025124182924628258,
0.09810183197259903,
-0.056498534977436066,
-0.034325432032346725,
0.03099016286432743,
-0.04756839945912361,
0.031243259087204933,
-0.10857679694890976,
0.023675616830587387,
-0.12081367522478104,
-0.025438150390982628,
-0.06328153610229492,
0.049994807690382004,
-0.04249459132552147,
-0.060809362679719925,
-0.03294748067855835,
0.01564195565879345,
0.05251622945070267,
-0.009419661946594715,
0.14960907399654388,
0.023017099127173424,
0.14949700236320496,
0.08569129556417465,
0.08571472764015198,
-0.07848557829856873,
-0.0662907212972641,
-0.018670717254281044,
-0.01171959936618805,
0.052121590822935104,
-0.1567266285419464,
0.0222612377256155,
0.14933155477046967,
0.023854093626141548,
0.14069582521915436,
0.08444757014513016,
-0.012314979918301105,
0.006973995827138424,
0.05782342329621315,
-0.16315408051013947,
-0.08276087045669556,
-0.01947229914367199,
-0.05304446816444397,
-0.12343880534172058,
0.044341687113046646,
0.08120200037956238,
-0.07319604605436325,
-0.01001099031418562,
-0.008966249413788319,
0.00801245216280222,
-0.060733210295438766,
0.17431361973285675,
0.04631480574607849,
0.04427378252148628,
-0.103228360414505,
0.06995489448308945,
0.04022670537233353,
-0.08173894137144089,
0.006496574729681015,
0.0679832473397255,
-0.07813875377178192,
-0.05333561822772026,
0.08378839492797852,
0.21410124003887177,
-0.04703439027070999,
-0.04646718502044678,
-0.1423557847738266,
-0.13277803361415863,
0.08483558148145676,
0.14282391965389252,
0.11973793059587479,
0.011175474151968956,
-0.0649867057800293,
-0.0028957000467926264,
-0.11970946192741394,
0.0956927016377449,
0.04556654393672943,
0.06415167450904846,
-0.1412634551525116,
0.13030587136745453,
0.014540751464664936,
0.04957909509539604,
-0.018316565081477165,
0.02747558057308197,
-0.09735246002674103,
0.01003090851008892,
-0.11197996139526367,
-0.014422730542719364,
-0.03730938956141472,
0.010056210681796074,
-0.005486046429723501,
-0.04593383148312569,
-0.06191306561231613,
0.010265232995152473,
-0.10761867463588715,
-0.020014287903904915,
0.03203214704990387,
0.06906575709581375,
-0.09926522523164749,
-0.03584988787770271,
0.025884181261062622,
-0.06411145627498627,
0.06664050370454788,
0.04775208234786987,
0.024460744112730026,
0.05203469842672348,
-0.13428330421447754,
0.020065493881702423,
0.07125937938690186,
0.023615064099431038,
0.06857342272996902,
-0.10210324823856354,
-0.0048413085751235485,
-0.0018470374634489417,
0.03993143141269684,
0.01879689283668995,
0.060570936650037766,
-0.13668784499168396,
-0.0018133589765056968,
-0.0059414212591946125,
-0.08354639261960983,
-0.06759142130613327,
0.025267822667956352,
0.09829063713550568,
0.010769153945147991,
0.20157437026500702,
-0.07453074306249619,
0.05111348256468773,
-0.21752247214317322,
0.008580947294831276,
-0.011457758024334908,
-0.10675018280744553,
-0.11847562342882156,
-0.07331645488739014,
0.060511115938425064,
-0.05910194292664528,
0.15643341839313507,
0.04336113855242729,
0.03594420477747917,
0.029215503484010696,
-0.016089623793959618,
0.025112714618444443,
0.013912186026573181,
0.20906245708465576,
0.031809382140636444,
-0.03966294229030609,
0.06834982335567474,
0.04653722792863846,
0.10755541920661926,
0.1338290572166443,
0.20528732240200043,
0.14109309017658234,
-0.0016062030335888267,
0.104256771504879,
0.03466791287064552,
-0.05762480944395065,
-0.1563846468925476,
0.03676179423928261,
-0.04096238315105438,
0.11012841761112213,
-0.017167802900075912,
0.20791314542293549,
0.07202707976102829,
-0.17344166338443756,
0.04336996749043465,
-0.05798328295350075,
-0.0792505294084549,
-0.12193125486373901,
-0.048324186354875565,
-0.08294760435819626,
-0.12893500924110413,
0.004624804016202688,
-0.11551131308078766,
0.0029855608008801937,
0.11830372363328934,
0.0003146572853438556,
-0.025349488481879234,
0.15853318572044373,
0.012229030951857567,
0.03587009757757187,
0.0623665414750576,
0.009986549615859985,
-0.033745404332876205,
-0.12574751675128937,
-0.049979232251644135,
-0.01642908900976181,
-0.03286297246813774,
0.02869069203734398,
-0.06801576167345047,
-0.043811243027448654,
0.0380096510052681,
-0.018702475354075432,
-0.0994282215833664,
0.008727424778044224,
0.010006695054471493,
0.06349524110555649,
0.04147202521562576,
0.009688003920018673,
0.02728598564863205,
-0.008197645656764507,
0.20150049030780792,
-0.08199817687273026,
-0.05192165821790695,
-0.10646841675043106,
0.24835076928138733,
0.04308316856622696,
-0.024649769067764282,
0.02875283546745777,
-0.062391892075538635,
0.00807760376483202,
0.25195541977882385,
0.20674537122249603,
-0.06871533393859863,
-0.007414078805595636,
0.005721509922295809,
-0.0078910943120718,
-0.0229549128562212,
0.09814517945051193,
0.13991482555866241,
0.039135899394750595,
-0.10182734578847885,
-0.05337280035018921,
-0.05527487024664879,
-0.020565170794725418,
-0.03373938426375389,
0.08031013607978821,
0.05193829908967018,
0.0009627835243009031,
-0.02825058251619339,
0.049038421362638474,
-0.06365194916725159,
-0.07327164709568024,
0.06637400388717651,
-0.21353663504123688,
-0.1605113446712494,
-0.009253967553377151,
0.09987538307905197,
0.011628348380327225,
0.06883668899536133,
-0.02442094497382641,
-0.005353689659386873,
0.09306224435567856,
-0.01920473948121071,
-0.106890469789505,
-0.07222694903612137,
0.08502697199583054,
-0.1229671910405159,
0.2248678207397461,
-0.042769718915224075,
0.04968447983264923,
0.12772446870803833,
0.07353010773658752,
-0.08143990486860275,
0.05894053354859352,
0.03563410043716431,
-0.05077586695551872,
0.029180480167269707,
0.07570891827344894,
-0.03623399883508682,
0.05392841622233391,
0.0446491502225399,
-0.1353415846824646,
0.023813316598534584,
-0.06738412380218506,
-0.061082519590854645,
-0.04242687672376633,
-0.020358486101031303,
-0.053828101605176926,
0.13348907232284546,
0.22147727012634277,
-0.02643861249089241,
-0.01265759114176035,
-0.06895022839307785,
0.011453851126134396,
0.0556543804705143,
0.027651680633425713,
-0.06081826612353325,
-0.20048682391643524,
0.02100181393325329,
0.04676947742700577,
-0.02104194276034832,
-0.2525334358215332,
-0.09901151061058044,
0.0026946943253278732,
-0.08497780561447144,
-0.08980201184749603,
0.06273863464593887,
0.0974934995174408,
0.05462854355573654,
-0.0601477175951004,
-0.05967121943831444,
-0.06285107880830765,
0.14960379898548126,
-0.1336909383535385,
-0.09875859320163727
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# albert-xxlarge-v2-finetuned-wnli
This model is a fine-tuned version of [albert-xxlarge-v2](https://huggingface.co/albert-xxlarge-v2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6970
- Accuracy: 0.5070
## Model description
More information needed
## Intended uses & limitations
More information needed
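As an illustrative sketch only (not an official recipe from the model author), the checkpoint can be queried for a WNLI-style sentence pair and the logits turned into probabilities; the label names come from the model config if they were set there, otherwise the generic LABEL_0/LABEL_1 placeholders are printed.
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "anirudh21/albert-xxlarge-v2-finetuned-wnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# A Winograd-style premise/hypothesis pair, encoded together.
inputs = tokenizer(
    "The trophy doesn't fit in the suitcase because it is too big.",
    "The trophy is too big.",
    return_tensors="pt",
)
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()

for idx, p in enumerate(probs.tolist()):
    print(model.config.id2label[idx], round(p, 3))
```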
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 13 | 0.8066 | 0.4366 |
| No log | 2.0 | 26 | 0.6970 | 0.5070 |
| No log | 3.0 | 39 | 0.7977 | 0.4507 |
| No log | 4.0 | 52 | 0.7906 | 0.4930 |
| No log | 5.0 | 65 | 0.8459 | 0.4366 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "albert-xxlarge-v2-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5070422535211268, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/albert-xxlarge-v2-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"albert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
albert-xxlarge-v2-finetuned-wnli
================================
This model is a fine-tuned version of albert-xxlarge-v2 on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6970
* Accuracy: 0.5070
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #albert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.1043897345662117,
0.09041710197925568,
-0.0018204497173428535,
0.12200300395488739,
0.16448982059955597,
0.03956865519285202,
0.13497143983840942,
0.1270168423652649,
-0.08481080830097198,
0.015455087646842003,
0.12540465593338013,
0.153660848736763,
0.023047437891364098,
0.09091728925704956,
-0.04355344548821449,
-0.26016730070114136,
-0.019337916746735573,
0.05018600821495056,
-0.060656387358903885,
0.13257810473442078,
0.08433875441551208,
-0.12124460190534592,
0.0994998961687088,
0.013166174292564392,
-0.19330553710460663,
0.006458010524511337,
0.0084362318739295,
-0.05300622433423996,
0.14907553791999817,
0.032914966344833374,
0.12039439380168915,
0.0010248207254335284,
0.08637624979019165,
-0.20333199203014374,
0.011971702799201012,
0.0485745444893837,
0.0050692204385995865,
0.09407994151115417,
0.0499265193939209,
0.008502666838467121,
0.13834148645401,
-0.0866832360625267,
0.05521291494369507,
0.032211288809776306,
-0.12968367338180542,
-0.21511566638946533,
-0.07950890064239502,
0.029673296958208084,
0.07841763645410538,
0.11043371260166168,
-0.00716405501589179,
0.12414336204528809,
-0.08651991188526154,
0.08560904860496521,
0.2336435168981552,
-0.2942045331001282,
-0.06738997250795364,
0.0342283695936203,
0.008772971108555794,
0.04296833649277687,
-0.10561344772577286,
-0.02603508159518242,
0.05543508380651474,
0.04415561258792877,
0.12440583109855652,
-0.030688507482409477,
-0.1179208979010582,
0.018727442249655724,
-0.13574853539466858,
-0.0319734551012516,
0.1635868400335312,
0.04732310399413109,
-0.03021303191781044,
-0.05170556530356407,
-0.04995705932378769,
-0.15496456623077393,
-0.03114769048988819,
-0.008811096660792828,
0.050622597336769104,
-0.024373497813940048,
-0.045188650488853455,
-0.005700754001736641,
-0.10870394110679626,
-0.06958650797605515,
-0.07608795166015625,
0.11917579174041748,
0.032009463757276535,
0.013160968199372292,
-0.03322426229715347,
0.11538581550121307,
0.002047595800831914,
-0.1271321177482605,
0.023262539878487587,
0.028598491102457047,
0.0013884284999221563,
-0.04260369762778282,
-0.05158384516835213,
-0.05187821388244629,
0.00866610649973154,
0.12119698524475098,
-0.0374576710164547,
0.03818407654762268,
0.054203156381845474,
0.04675496742129326,
-0.09873638302087784,
0.1985936313867569,
-0.0405128039419651,
-0.019729875028133392,
-0.003358252579346299,
0.046564240008592606,
0.020224222913384438,
-0.011167053133249283,
-0.11705933511257172,
0.0015681820223107934,
0.07488088309764862,
0.007234811782836914,
-0.06780674308538437,
0.06805625557899475,
-0.06067543849349022,
-0.026134150102734566,
-0.00582540687173605,
-0.08523271977901459,
0.02753078006207943,
0.0015990851679816842,
-0.07716228067874908,
-0.016031717881560326,
0.03096054121851921,
0.02303786389529705,
-0.008554721251130104,
0.11098548024892807,
-0.0875086784362793,
0.030586492270231247,
-0.09531625360250473,
-0.10716207325458527,
0.01914925128221512,
-0.10970630496740341,
0.033739835023880005,
-0.08660628646612167,
-0.17853286862373352,
-0.010206986218690872,
0.05820174887776375,
-0.02350551448762417,
-0.05964024364948273,
-0.05787455663084984,
-0.06257002800703049,
0.011450978927314281,
-0.0047058057971298695,
0.12778115272521973,
-0.06686297804117203,
0.08575877547264099,
0.026083016768097878,
0.06403693556785583,
-0.040051139891147614,
0.05534994974732399,
-0.10165484994649887,
0.011382686905562878,
-0.13711349666118622,
0.03074643574655056,
-0.04538505896925926,
0.06834545731544495,
-0.08063977211713791,
-0.09354162961244583,
0.01778559572994709,
0.0006385404267348349,
0.05734335258603096,
0.10040190070867538,
-0.18436020612716675,
-0.08850996196269989,
0.1568250209093094,
-0.06554961204528809,
-0.12835486233234406,
0.12294255197048187,
-0.058451417833566666,
0.049851201474666595,
0.06012613698840141,
0.15347108244895935,
0.0778881162405014,
-0.08005436509847641,
0.0018642847426235676,
0.025876367464661598,
0.05513768270611763,
-0.0625874325633049,
0.07856831699609756,
0.000510420766659081,
0.0018666420364752412,
0.03265516459941864,
-0.024883462116122246,
0.06353659182786942,
-0.0928104966878891,
-0.10366548597812653,
-0.04003290459513664,
-0.08430887013673782,
0.04179241508245468,
0.08037082105875015,
0.06832953542470932,
-0.09617464244365692,
-0.0834355279803276,
0.037935853004455566,
0.0807463750243187,
-0.04798172414302826,
0.022671908140182495,
-0.050730880349874496,
0.05897333472967148,
-0.037085819989442825,
-0.023646162822842598,
-0.1692752242088318,
-0.02350112423300743,
0.0016649349126964808,
-0.010917467065155506,
0.016340263187885284,
0.04113783687353134,
0.06870339065790176,
0.06370645016431808,
-0.052015453577041626,
-0.014911185018718243,
-0.04408225044608116,
-0.003138963133096695,
-0.12948457896709442,
-0.20873630046844482,
-0.03088158369064331,
-0.021439122036099434,
0.16991887986660004,
-0.20735763013362885,
0.04764413461089134,
-0.02439938299357891,
0.062108322978019714,
0.016043543815612793,
-0.007533009629696608,
-0.04192341864109039,
0.07536851614713669,
-0.041417963802814484,
-0.05027645081281662,
0.07738333940505981,
0.011750129982829094,
-0.10127293318510056,
-0.05371388792991638,
-0.09444814920425415,
0.16020287573337555,
0.13115833699703217,
-0.11465359479188919,
-0.0724000409245491,
-0.01126941293478012,
-0.06456045061349869,
-0.034696999937295914,
-0.05413107946515083,
0.03604563698172569,
0.20521071553230286,
-0.0061521041207015514,
0.148649662733078,
-0.06420578807592392,
-0.042227379977703094,
0.022919071838259697,
-0.03824517875909805,
0.023668700829148293,
0.13661155104637146,
0.13755904138088226,
-0.050012361258268356,
0.1495388299226761,
0.1569514274597168,
-0.088559590280056,
0.1428588181734085,
-0.041790347546339035,
-0.07380831241607666,
-0.01835949718952179,
-0.03915363550186157,
-0.005380677524954081,
0.11051664501428604,
-0.1627344787120819,
-0.004747296683490276,
0.027950014919042587,
0.01196958962827921,
0.01957515813410282,
-0.22983019053936005,
-0.049459058791399,
0.044974543154239655,
-0.03769618272781372,
-0.018881667405366898,
-0.009966891258955002,
0.002720503369346261,
0.10604012757539749,
0.0008937334059737623,
-0.0848451629281044,
0.03172663599252701,
0.0022889338433742523,
-0.0838480219244957,
0.2177211195230484,
-0.07052799314260483,
-0.15200303494930267,
-0.1336527019739151,
-0.07158368080854416,
-0.05252264440059662,
0.0019286591559648514,
0.06796949356794357,
-0.10265792161226273,
-0.02505035512149334,
-0.07416200637817383,
0.032283633947372437,
0.007663427852094173,
0.026632966473698616,
0.007754262536764145,
0.005139422602951527,
0.06503865122795105,
-0.1074826717376709,
-0.01233445480465889,
-0.057872917503118515,
-0.0591345876455307,
0.03848171979188919,
0.034625981003046036,
0.115079365670681,
0.15166349709033966,
-0.012620719149708748,
0.009598924778401852,
-0.029637129977345467,
0.22535473108291626,
-0.06218699738383293,
-0.03518056496977806,
0.1383657604455948,
-0.00844966433942318,
0.04050154611468315,
0.10554705560207367,
0.08095891773700714,
-0.07628611475229263,
-0.0016992816235870123,
0.04296675696969032,
-0.03480858728289604,
-0.23365439474582672,
-0.04573305323719978,
-0.05494158715009689,
0.014232792891561985,
0.09255701303482056,
0.019993748515844345,
0.029775872826576233,
0.06984301656484604,
0.03963799774646759,
0.08110816776752472,
-0.04758360981941223,
0.050262778997421265,
0.10383641719818115,
0.03471602872014046,
0.12061983346939087,
-0.053521495312452316,
-0.0673314705491066,
0.04152953252196312,
-0.01598353125154972,
0.22365815937519073,
0.02313443087041378,
0.13652832806110382,
0.06385821849107742,
0.15331554412841797,
-0.00816011056303978,
0.0783262625336647,
-0.0021609202958643436,
-0.04902302101254463,
-0.016767462715506554,
-0.040544673800468445,
-0.03478052467107773,
0.028274979442358017,
-0.07335725426673889,
0.0820470005273819,
-0.1323360651731491,
0.011274303309619427,
0.052326325327157974,
0.2542121410369873,
0.044810209423303604,
-0.31556814908981323,
-0.09052421897649765,
0.00866544246673584,
-0.02356516383588314,
-0.01863424852490425,
0.028864992782473564,
0.08615357428789139,
-0.09284189343452454,
0.02936282567679882,
-0.07169239223003387,
0.09910397231578827,
-0.054707691073417664,
0.0510663241147995,
0.08254608511924744,
0.08267651498317719,
0.005808492656797171,
0.09461677074432373,
-0.2964196503162384,
0.2862945795059204,
0.0039495741948485374,
0.05975591763854027,
-0.0789700374007225,
0.006556149106472731,
0.046888552606105804,
0.06812030076980591,
0.08470743149518967,
-0.013251593336462975,
-0.020075054839253426,
-0.2003544569015503,
-0.0688004121184349,
0.03323981165885925,
0.062028754502534866,
-0.046769145876169205,
0.08501488715410233,
-0.028460077941417694,
0.009700755588710308,
0.07592134177684784,
0.016847729682922363,
-0.05689072608947754,
-0.10987614095211029,
-0.010992318391799927,
0.0211940910667181,
-0.06817856431007385,
-0.06148962303996086,
-0.12117602676153183,
-0.13506893813610077,
0.14350582659244537,
-0.035363178700208664,
-0.025651419535279274,
-0.10787783563137054,
0.08389080315828323,
0.05256499722599983,
-0.09024346619844437,
0.03252946212887764,
0.006361968349665403,
0.07474474608898163,
0.02820555865764618,
-0.0720175951719284,
0.10513556748628616,
-0.07201699167490005,
-0.1567019373178482,
-0.06933965533971786,
0.09822496026754379,
0.036299943923950195,
0.07195328176021576,
-0.016210954636335373,
0.009944227524101734,
-0.04810528829693794,
-0.08777511119842529,
0.02948813885450363,
0.01058610063046217,
0.06848503649234772,
0.0174538716673851,
-0.06458134204149246,
0.023709574714303017,
-0.06219806522130966,
-0.03656124323606491,
0.20266254246234894,
0.23392795026302338,
-0.10307903587818146,
0.016509070992469788,
0.008817926980555058,
-0.07866012305021286,
-0.19457846879959106,
0.04412205144762993,
0.04552039876580238,
0.016301503404974937,
0.0460701584815979,
-0.18476153910160065,
0.14131374657154083,
0.11442817002534866,
-0.013105911202728748,
0.10344818979501724,
-0.3194306790828705,
-0.12042511254549026,
0.14050836861133575,
0.13497504591941833,
0.11862707883119583,
-0.13775183260440826,
-0.015473544597625732,
-0.024572154507040977,
-0.14168329536914825,
0.104047492146492,
-0.10836904495954514,
0.11888211220502853,
-0.04508076608181,
0.06266004592180252,
0.0037345942109823227,
-0.05790884792804718,
0.12771525979042053,
0.020587172359228134,
0.10045302659273148,
-0.05350383371114731,
-0.033641278743743896,
0.03071650117635727,
-0.04200958088040352,
0.022741947323083878,
-0.11080576479434967,
0.02306830883026123,
-0.11522656679153442,
-0.02256503328680992,
-0.06575943529605865,
0.04976709559559822,
-0.04724540561437607,
-0.0656195804476738,
-0.03238914906978607,
0.01894870586693287,
0.04376131296157837,
-0.009512390941381454,
0.13846907019615173,
0.019795814529061317,
0.15673676133155823,
0.08587489277124405,
0.08065654337406158,
-0.06639920175075531,
-0.07217054069042206,
-0.023511311039328575,
-0.01077340543270111,
0.05354699864983559,
-0.15230706334114075,
0.017872119322419167,
0.14700889587402344,
0.02564365789294243,
0.1493699550628662,
0.08398130536079407,
-0.016955314204096794,
0.005387521348893642,
0.05736297369003296,
-0.16045571863651276,
-0.08898655325174332,
-0.022810498252511024,
-0.057850807905197144,
-0.12351630628108978,
0.04439418017864227,
0.08429817855358124,
-0.07308221608400345,
-0.007165681105107069,
-0.007022528909146786,
0.006990774534642696,
-0.059922955930233,
0.17914733290672302,
0.05237842723727226,
0.04748550057411194,
-0.0986846312880516,
0.0733439028263092,
0.040773339569568634,
-0.07562568783760071,
-0.0016482991632074118,
0.06309086829423904,
-0.07577116042375565,
-0.05331952124834061,
0.07815593481063843,
0.21861189603805542,
-0.046605970710515976,
-0.044787194579839706,
-0.1488434225320816,
-0.13134630024433136,
0.07773518562316895,
0.1400838941335678,
0.12035413086414337,
0.011797239072620869,
-0.06410318613052368,
0.0008707083179615438,
-0.11228730529546738,
0.09598390012979507,
0.04239305108785629,
0.06328403204679489,
-0.1388140320777893,
0.1358414888381958,
0.020042940974235535,
0.04542526602745056,
-0.01682385802268982,
0.023282723501324654,
-0.10045109689235687,
0.009182396344840527,
-0.10750310868024826,
-0.024285614490509033,
-0.026428405195474625,
0.01211906410753727,
-0.005861642770469189,
-0.04758259654045105,
-0.0566759891808033,
0.005985803436487913,
-0.10838011652231216,
-0.021770738065242767,
0.03533035144209862,
0.07539134472608566,
-0.10229399800300598,
-0.03510992228984833,
0.031127886846661568,
-0.062376491725444794,
0.06643445789813995,
0.03968359902501106,
0.026856200769543648,
0.05519283562898636,
-0.14207905530929565,
0.023693231865763664,
0.06791891902685165,
0.025541288778185844,
0.06395706534385681,
-0.09913884848356247,
-0.008636538870632648,
-0.013803153298795223,
0.04345450550317764,
0.01916884072124958,
0.061185795813798904,
-0.13449762761592865,
-0.0013455058215186,
-0.010348033159971237,
-0.08741991221904755,
-0.06637092679738998,
0.027137411758303642,
0.0962412878870964,
0.012502683326601982,
0.2001606822013855,
-0.07415919005870819,
0.05315055325627327,
-0.22284531593322754,
0.007784545887261629,
-0.009375255554914474,
-0.10596857219934464,
-0.117481529712677,
-0.0776430070400238,
0.058280833065509796,
-0.061822276562452316,
0.15235701203346252,
0.0425226092338562,
0.032762084156274796,
0.025983376428484917,
-0.013488364405930042,
0.02397947758436203,
0.012583584524691105,
0.20739060640335083,
0.03827144205570221,
-0.03573043271899223,
0.06570188701152802,
0.0477876141667366,
0.10386498272418976,
0.12428651750087738,
0.20066112279891968,
0.14294204115867615,
-0.015342270024120808,
0.0958690345287323,
0.043385203927755356,
-0.06158531829714775,
-0.14614607393741608,
0.044377125799655914,
-0.034792233258485794,
0.1079014241695404,
-0.018612291663885117,
0.21717964112758636,
0.06784909218549728,
-0.17057251930236816,
0.04580388590693474,
-0.053313955664634705,
-0.08039191365242004,
-0.12306919693946838,
-0.03409576043486595,
-0.07853835076093674,
-0.1308421641588211,
0.0009298619697801769,
-0.1138087809085846,
0.0009946830105036497,
0.12857311964035034,
0.0004376996657811105,
-0.02373930811882019,
0.15620113909244537,
0.01808966137468815,
0.03327104076743126,
0.06001606956124306,
0.008834846317768097,
-0.037406764924526215,
-0.14248812198638916,
-0.055841442197561264,
-0.010784979909658432,
-0.024631042033433914,
0.024992385879158974,
-0.0701456367969513,
-0.052444834262132645,
0.0370294526219368,
-0.019407952204346657,
-0.10484905540943146,
0.010788097977638245,
0.003588929073885083,
0.05906722694635391,
0.03694666177034378,
0.008093061856925488,
0.0268037561327219,
-0.0061460998840630054,
0.20350411534309387,
-0.07954155653715134,
-0.05374414101243019,
-0.09890949726104736,
0.24820192158222198,
0.03600170463323593,
-0.019097449257969856,
0.030537579208612442,
-0.06343317031860352,
0.008436888456344604,
0.24792559444904327,
0.21281282603740692,
-0.07731661945581436,
-0.006568665150552988,
0.01074000634253025,
-0.007876424118876457,
-0.02606234699487686,
0.09869705140590668,
0.13348488509655,
0.02694064937531948,
-0.10093294084072113,
-0.04489089176058769,
-0.05451182648539543,
-0.020151840522885323,
-0.02722935564815998,
0.07710536569356918,
0.0591934435069561,
0.005201260559260845,
-0.032312966883182526,
0.052586235105991364,
-0.06001158803701401,
-0.07486385107040405,
0.0692172423005104,
-0.21624009311199188,
-0.16352620720863342,
-0.016660498455166817,
0.10600580275058746,
0.010035257786512375,
0.0684928447008133,
-0.02732844464480877,
-0.004101307597011328,
0.08941727131605148,
-0.018484266474843025,
-0.1078256294131279,
-0.08087000250816345,
0.08553116768598557,
-0.1175895631313324,
0.22102504968643188,
-0.045581746846437454,
0.05132993310689926,
0.12591637670993805,
0.06818245351314545,
-0.07157515734434128,
0.06208227574825287,
0.03936963155865669,
-0.05185438692569733,
0.022042429074645042,
0.06987792253494263,
-0.034005582332611084,
0.06092541292309761,
0.04595522955060005,
-0.13468651473522186,
0.027433451265096664,
-0.06213991343975067,
-0.06929901987314224,
-0.03811642527580261,
-0.020439451560378075,
-0.053217582404613495,
0.13125689327716827,
0.22217302024364471,
-0.02466833032667637,
-0.011345439590513706,
-0.0688401386141777,
0.009084232151508331,
0.06049910560250282,
0.030142048373818398,
-0.06082216650247574,
-0.19785021245479584,
0.018391301855444908,
0.04134780913591385,
-0.01892002485692501,
-0.26198142766952515,
-0.09846517443656921,
0.0023176679387688637,
-0.08312686532735825,
-0.08758088946342468,
0.06375978142023087,
0.09902011603116989,
0.057703327387571335,
-0.05802591145038605,
-0.06846413016319275,
-0.06341823935508728,
0.15233372151851654,
-0.13802897930145264,
-0.09735766798257828
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-cola
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9664
- Matthews Correlation: 0.5797
## Model description
More information needed
## Intended uses & limitations
More information needed
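As an illustrative sketch only (not an official recipe from the model author), the checkpoint can be used through the standard `pipeline` API; CoLA is a single-sentence acceptability task, so a plain string is enough. Reading LABEL_0/LABEL_1 as unacceptable/acceptable follows the GLUE CoLA convention and is an assumption about how the head was trained.
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="anirudh21/bert-base-uncased-finetuned-cola",
)

# CoLA judges the grammatical acceptability of a single sentence.
print(classifier("The book was written by the author."))
print(classifier("The book was wrote by the author himself it."))
```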
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5017 | 1.0 | 535 | 0.5252 | 0.4841 |
| 0.2903 | 2.0 | 1070 | 0.5550 | 0.4967 |
| 0.1839 | 3.0 | 1605 | 0.7295 | 0.5634 |
| 0.1132 | 4.0 | 2140 | 0.7762 | 0.5702 |
| 0.08 | 5.0 | 2675 | 0.9664 | 0.5797 |
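The Matthews correlation reported above can be recomputed from a set of predictions with standard tooling, for example scikit-learn; the snippet below is a generic illustration with placeholder labels, not the exact evaluation code used for this run.
```python
from sklearn.metrics import matthews_corrcoef

# y_true / y_pred would come from the GLUE CoLA validation split and the model's predictions.
y_true = [1, 1, 0, 1, 0]  # placeholder gold labels
y_pred = [1, 0, 0, 1, 0]  # placeholder predicted labels
print(matthews_corrcoef(y_true, y_pred))
```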
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "bert-base-uncased-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.5796941781913538, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
anirudh21/bert-base-uncased-finetuned-cola
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-uncased-finetuned-cola
================================
This model is a fine-tuned version of bert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.9664
* Matthews Correlation: 0.5797
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
65,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.11325017362833023,
0.0812758356332779,
-0.0017998769180849195,
0.12009552121162415,
0.16387249529361725,
0.03597983717918396,
0.11156629025936127,
0.12876233458518982,
-0.08335161954164505,
0.02751629427075386,
0.1264062076807022,
0.15765658020973206,
0.019992440938949585,
0.12018559128046036,
-0.04816516116261482,
-0.26051899790763855,
-0.010511714965105057,
0.05137874186038971,
-0.05034004524350166,
0.13128826022148132,
0.08885212242603302,
-0.12818656861782074,
0.0962095707654953,
0.01392721850425005,
-0.19697393476963043,
0.002868467941880226,
0.009638463146984577,
-0.05619091913104057,
0.14565445482730865,
0.028437048196792603,
0.12006663531064987,
-0.0032987322192639112,
0.08477217704057693,
-0.19016307592391968,
0.011393117718398571,
0.04880033805966377,
0.0022858590818941593,
0.09458186477422714,
0.050123583525419235,
0.003134331200271845,
0.1275525540113449,
-0.08717275410890579,
0.05313878133893013,
0.02806706912815571,
-0.11805792152881622,
-0.22305718064308167,
-0.07618046551942825,
0.04159955307841301,
0.0740705281496048,
0.11006677150726318,
-0.005639785435050726,
0.1308141052722931,
-0.08605938404798508,
0.08880318701267242,
0.23550336062908173,
-0.30583614110946655,
-0.06443625688552856,
0.03265710920095444,
0.012285517528653145,
0.03883311152458191,
-0.1104314923286438,
-0.03320043161511421,
0.05513565614819527,
0.04949193447828293,
0.12685690820217133,
-0.03112470917403698,
-0.11342370510101318,
0.012215370312333107,
-0.13411471247673035,
-0.026288814842700958,
0.1539786458015442,
0.04028100147843361,
-0.03150342404842377,
-0.05181613191962242,
-0.05343287065625191,
-0.15177714824676514,
-0.03826058283448219,
-0.0056069716811180115,
0.04760108143091202,
-0.027068566530942917,
-0.05595731362700462,
0.0036838229279965162,
-0.1128939539194107,
-0.06788966804742813,
-0.08033433556556702,
0.12181299179792404,
0.033589623868465424,
0.015907712280750275,
-0.0370270200073719,
0.109775111079216,
-0.012047775089740753,
-0.13138563930988312,
0.014832996763288975,
0.025630218908190727,
0.009522379375994205,
-0.044848375022411346,
-0.052508819848299026,
-0.05302315577864647,
0.01229069847613573,
0.13055101037025452,
-0.05346065014600754,
0.04409528151154518,
0.05234171450138092,
0.04307950660586357,
-0.09279292076826096,
0.19667470455169678,
-0.038260214030742645,
-0.02218792960047722,
0.006767041981220245,
0.03915246203541756,
0.01817500777542591,
-0.01055796816945076,
-0.11901400238275528,
0.0019544779788702726,
0.08567898720502853,
0.006048509385436773,
-0.06584642827510834,
0.07457306236028671,
-0.05257037654519081,
-0.017383383587002754,
0.0006280068191699684,
-0.0888577327132225,
0.030389923602342606,
0.002498867455869913,
-0.07382941991090775,
-0.021265869960188866,
0.0300530344247818,
0.015772582963109016,
-0.019487250596284866,
0.12534859776496887,
-0.09381720423698425,
0.029932312667369843,
-0.09329327195882797,
-0.10815554112195969,
0.022634536027908325,
-0.09775605797767639,
0.028863679617643356,
-0.09573209285736084,
-0.1677841991186142,
-0.013606186956167221,
0.0570659302175045,
-0.028021475300192833,
-0.060382694005966187,
-0.04458434879779816,
-0.06752058118581772,
0.01605340465903282,
-0.011641287244856358,
0.13161610066890717,
-0.06672880053520203,
0.09067602455615997,
0.03318045288324356,
0.0643010064959526,
-0.04610082507133484,
0.0545768141746521,
-0.104961097240448,
0.01095003355294466,
-0.16204912960529327,
0.025268353521823883,
-0.046960558742284775,
0.08063752204179764,
-0.08346693217754364,
-0.09894329309463501,
0.008691877126693726,
-0.0016926821554079652,
0.06715283542871475,
0.09742321074008942,
-0.1710415780544281,
-0.08145228773355484,
0.1614946871995926,
-0.07140012830495834,
-0.1350884735584259,
0.11643988639116287,
-0.0545065701007843,
0.049516938626766205,
0.06354523450136185,
0.16835786402225494,
0.06510867923498154,
-0.09108350425958633,
-0.008360018953680992,
0.02844308875501156,
0.053611110895872116,
-0.08353448659181595,
0.07455205917358398,
0.00392266595736146,
0.012171769514679909,
0.035311244428157806,
-0.022403843700885773,
0.06179405376315117,
-0.09068257361650467,
-0.09588084369897842,
-0.043618425726890564,
-0.08639056235551834,
0.032657064497470856,
0.07789120078086853,
0.07061269134283066,
-0.09810224920511246,
-0.08729390799999237,
0.046419985592365265,
0.07929067313671112,
-0.04685145616531372,
0.02679145336151123,
-0.05538740009069443,
0.07372362166643143,
-0.037138938903808594,
-0.024889251217246056,
-0.17947198450565338,
-0.031591761857271194,
0.0025699553079903126,
0.00027104539913125336,
0.013970430940389633,
0.019852332770824432,
0.06701859831809998,
0.05625145882368088,
-0.052319593727588654,
-0.014577965252101421,
-0.022202685475349426,
-0.0010289466008543968,
-0.13953229784965515,
-0.2043076455593109,
-0.03315692022442818,
-0.022695127874612808,
0.141119122505188,
-0.2048681080341339,
0.040859151631593704,
-0.008060089312493801,
0.07516436278820038,
0.00908584799617529,
-0.003364209784194827,
-0.04505103826522827,
0.07078433781862259,
-0.03831224516034126,
-0.04826338216662407,
0.07603529095649719,
0.01888037845492363,
-0.09175366163253784,
-0.042933713644742966,
-0.0914168655872345,
0.1744447946548462,
0.1380210965871811,
-0.11001050472259521,
-0.07561561465263367,
-0.013212217949330807,
-0.06705914437770844,
-0.03366536274552345,
-0.05237497016787529,
0.0310813020914793,
0.1884153038263321,
-0.004342822823673487,
0.1502731442451477,
-0.06727545708417892,
-0.050321921706199646,
0.024544494226574898,
-0.03312907740473747,
0.02288474328815937,
0.12549304962158203,
0.13732585310935974,
-0.0625513568520546,
0.15156804025173187,
0.14943420886993408,
-0.09030985087156296,
0.13565336167812347,
-0.04143119975924492,
-0.07433126121759415,
-0.015361492522060871,
-0.037960514426231384,
-0.007015106733888388,
0.1112690195441246,
-0.1571711301803589,
-0.00569313345476985,
0.03188199922442436,
0.015982702374458313,
0.024935567751526833,
-0.22176682949066162,
-0.038873136043548584,
0.03447072207927704,
-0.034860335290431976,
-0.020748944953083992,
-0.012351339682936668,
0.0049042352475225925,
0.1071660965681076,
0.009230108000338078,
-0.07983247935771942,
0.035771407186985016,
0.007041286677122116,
-0.08509243279695511,
0.22050073742866516,
-0.0732455626130104,
-0.15531153976917267,
-0.12630519270896912,
-0.07531169056892395,
-0.04058314859867096,
-0.002001648535951972,
0.06999749690294266,
-0.09772171080112457,
-0.03454066812992096,
-0.06494801491498947,
0.026123855262994766,
0.002242924179881811,
0.038012176752090454,
0.0022714107763022184,
0.0032895600888878107,
0.06618290394544601,
-0.10766226053237915,
-0.017943644896149635,
-0.060056332498788834,
-0.04557625949382782,
0.03699665516614914,
0.03504917398095131,
0.11433953791856766,
0.1497831493616104,
-0.014221231453120708,
0.013781113550066948,
-0.03170427307486534,
0.23820388317108154,
-0.06084933876991272,
-0.026581011712551117,
0.13501763343811035,
-0.0076615759171545506,
0.04777873679995537,
0.12092947214841843,
0.07617264240980148,
-0.07880193740129471,
0.002600674517452717,
0.03899788483977318,
-0.03430943936109543,
-0.23151859641075134,
-0.04990599676966667,
-0.05700865015387535,
0.003815891221165657,
0.09155996143817902,
0.028392238542437553,
0.031110122799873352,
0.07152868807315826,
0.0391230545938015,
0.07864564657211304,
-0.05178530141711235,
0.05781975015997887,
0.1154990866780281,
0.03813260793685913,
0.127434641122818,
-0.05280769616365433,
-0.062106065452098846,
0.044395819306373596,
-0.018649987876415253,
0.21916259825229645,
0.003968053963035345,
0.13120047748088837,
0.055209796875715256,
0.1672869324684143,
-0.0032082071993499994,
0.08467075973749161,
-0.010750634595751762,
-0.04991510882973671,
-0.010758607648313046,
-0.03968540206551552,
-0.03318402171134949,
0.02596471644937992,
-0.07350622117519379,
0.07018791139125824,
-0.12936514616012573,
0.008289091289043427,
0.059413567185401917,
0.24862781167030334,
0.04488257318735123,
-0.32264694571495056,
-0.0988081842660904,
0.0020014536567032337,
-0.02992011047899723,
-0.02737218514084816,
0.025928953662514687,
0.08604683727025986,
-0.09423001855611801,
0.031115587800741196,
-0.06771036982536316,
0.10112819820642471,
-0.04301420971751213,
0.05122147873044014,
0.08861610293388367,
0.09055721759796143,
0.0049886442720890045,
0.09118586778640747,
-0.2906533181667328,
0.2798357307910919,
0.006987671833485365,
0.06918463855981827,
-0.08365077525377274,
0.006707466207444668,
0.03907150775194168,
0.06553469598293304,
0.08288156241178513,
-0.014953190460801125,
-0.039788488298654556,
-0.19358643889427185,
-0.06484533101320267,
0.0339672788977623,
0.06623050570487976,
-0.03288320079445839,
0.08651845902204514,
-0.031395073980093,
0.007457742467522621,
0.07362592965364456,
0.008733662776648998,
-0.04875722900032997,
-0.10116855055093765,
-0.010509002022445202,
0.02958148531615734,
-0.06401415169239044,
-0.06195148453116417,
-0.1207931861281395,
-0.12192783504724503,
0.1639934778213501,
-0.03318662568926811,
-0.03521491587162018,
-0.11417360603809357,
0.08966812491416931,
0.06180043891072273,
-0.0941837728023529,
0.038484714925289154,
-0.0009707536664791405,
0.07954935729503632,
0.02598000504076481,
-0.07708893716335297,
0.11154722422361374,
-0.07479967921972275,
-0.15117181837558746,
-0.06584189832210541,
0.10492885112762451,
0.027630958706140518,
0.06822939962148666,
-0.013515608385205269,
0.012359414249658585,
-0.05078385770320892,
-0.09151500463485718,
0.017749764025211334,
-0.009329475462436676,
0.07841002941131592,
0.0031738318502902985,
-0.06693345308303833,
0.012862754054367542,
-0.054870735853910446,
-0.03405337408185005,
0.20014168322086334,
0.21855491399765015,
-0.10459254682064056,
0.019309353083372116,
0.025887245312333107,
-0.07127265632152557,
-0.20469018816947937,
0.03517827019095421,
0.04953690990805626,
0.012814527377486229,
0.032210033386945724,
-0.17098402976989746,
0.1580735146999359,
0.10622065514326096,
-0.015065732412040234,
0.10073093324899673,
-0.299551397562027,
-0.12669068574905396,
0.14076679944992065,
0.13006934523582458,
0.12501086294651031,
-0.13742731511592865,
-0.021279647946357727,
-0.025343095883727074,
-0.14574852585792542,
0.10519156605005264,
-0.10746797174215317,
0.11452843248844147,
-0.03742414340376854,
0.07620010524988174,
0.002816242864355445,
-0.06205955520272255,
0.11955329775810242,
0.026662414893507957,
0.09018383920192719,
-0.06074850261211395,
-0.042297229170799255,
0.0341629758477211,
-0.04408671706914902,
0.03661029040813446,
-0.10185978561639786,
0.026655469089746475,
-0.10676504671573639,
-0.026511017233133316,
-0.0700063407421112,
0.04556037113070488,
-0.04346230998635292,
-0.06344757974147797,
-0.03487858176231384,
0.022959129884839058,
0.04026412591338158,
-0.0137229198589921,
0.13840116560459137,
0.021211916580796242,
0.15411525964736938,
0.09435506165027618,
0.07961063086986542,
-0.08547985553741455,
-0.0797184407711029,
-0.015108020976185799,
-0.01672511175274849,
0.05510300397872925,
-0.14676183462142944,
0.02361373044550419,
0.1522795557975769,
0.023286614567041397,
0.13679563999176025,
0.0861886516213417,
-0.021892758086323738,
-0.0019361572340130806,
0.0652974396944046,
-0.1634119302034378,
-0.08178110420703888,
-0.014039850793778896,
-0.061166077852249146,
-0.13189998269081116,
0.048213761299848557,
0.0864153578877449,
-0.06775419414043427,
-0.008551523089408875,
-0.00740694859996438,
0.006305362097918987,
-0.05940462648868561,
0.18928800523281097,
0.06243189051747322,
0.04615224152803421,
-0.10399850457906723,
0.06477366387844086,
0.04499476030468941,
-0.07792001962661743,
0.003070139791816473,
0.07963893562555313,
-0.08338174968957901,
-0.052400678396224976,
0.08749459683895111,
0.19778034090995789,
-0.05147331953048706,
-0.05141191557049751,
-0.1413978934288025,
-0.13067592680454254,
0.08519208431243896,
0.1525023728609085,
0.12004642188549042,
0.013027970679104328,
-0.059264328330755234,
0.003653865307569504,
-0.11338938027620316,
0.09490028768777847,
0.04414375126361847,
0.06387224793434143,
-0.14424721896648407,
0.1478741616010666,
0.014303641393780708,
0.05093104764819145,
-0.020217612385749817,
0.03035593591630459,
-0.11080899834632874,
0.00768400589004159,
-0.10925600677728653,
-0.014188872650265694,
-0.03684592247009277,
0.0077499947510659695,
-0.0033408727031201124,
-0.054110124707221985,
-0.06255713850259781,
0.011927232146263123,
-0.1080976277589798,
-0.021136194467544556,
0.030454004183411598,
0.06746035814285278,
-0.1149725690484047,
-0.03279729187488556,
0.027025137096643448,
-0.06026068702340126,
0.06934566795825958,
0.04836318641901016,
0.024385672062635422,
0.058933038264513016,
-0.13959120213985443,
0.01653558574616909,
0.0693189948797226,
0.020435437560081482,
0.0718948245048523,
-0.09006860852241516,
-0.006668671499937773,
-0.002739422954618931,
0.04672892019152641,
0.02170906402170658,
0.07274547964334488,
-0.14131338894367218,
-0.0019496014574542642,
-0.01645248383283615,
-0.08669720590114594,
-0.06424178183078766,
0.0252529326826334,
0.09883679449558258,
0.010830315761268139,
0.19647841155529022,
-0.07557108253240585,
0.04333958774805069,
-0.22063951194286346,
0.011873209848999977,
-0.01501546148210764,
-0.10546151548624039,
-0.10812777280807495,
-0.0694873109459877,
0.06165764853358269,
-0.05573686584830284,
0.15247857570648193,
0.043157633394002914,
0.03789886459708214,
0.03379714488983154,
-0.005651878193020821,
0.018328923732042313,
0.015883320942521095,
0.20190127193927765,
0.03162865713238716,
-0.0364995002746582,
0.06019076332449913,
0.04753594845533371,
0.1018684133887291,
0.125466987490654,
0.2096361517906189,
0.14029313623905182,
0.009925403632223606,
0.09996341168880463,
0.04067710041999817,
-0.0595560185611248,
-0.15866446495056152,
0.0363604910671711,
-0.04750402644276619,
0.10207531601190567,
-0.02192346751689911,
0.21357475221157074,
0.07121794670820236,
-0.1699521839618683,
0.04399692267179489,
-0.06269006431102753,
-0.08588816970586777,
-0.12070256471633911,
-0.04586905241012573,
-0.08106168359518051,
-0.1309642642736435,
0.0001321805320912972,
-0.11273068934679031,
-0.0012162922648712993,
0.12198466807603836,
0.003683202899992466,
-0.023600250482559204,
0.16350850462913513,
0.011624647304415703,
0.029433852061629295,
0.0566394217312336,
0.011841543018817902,
-0.033431462943553925,
-0.12000541388988495,
-0.050130732357501984,
-0.01859196648001671,
-0.01961885765194893,
0.028178444132208824,
-0.06570537388324738,
-0.05181530490517616,
0.039616916328668594,
-0.01562797836959362,
-0.09667304158210754,
0.007670079357922077,
0.011419818736612797,
0.06166885048151016,
0.047199420630931854,
0.005478383507579565,
0.024718012660741806,
-0.00981750525534153,
0.2050236016511917,
-0.08058343827724457,
-0.06479468941688538,
-0.10787871479988098,
0.24570158123970032,
0.036584436893463135,
-0.021753570064902306,
0.029966039583086967,
-0.06766568124294281,
0.001812550937756896,
0.2563922703266144,
0.21230579912662506,
-0.07576889544725418,
-0.005493332166224718,
0.014079378917813301,
-0.007522681262344122,
-0.02169375866651535,
0.09983312338590622,
0.1424679160118103,
0.06358201801776886,
-0.10010802745819092,
-0.043535325676202774,
-0.05389847978949547,
-0.01864388771355152,
-0.034852199256420135,
0.07309868186712265,
0.049223218113183975,
0.006786705926060677,
-0.03663778677582741,
0.05149189755320549,
-0.06335853785276413,
-0.08562761545181274,
0.06320024281740189,
-0.2137106955051422,
-0.16493570804595947,
-0.013268045149743557,
0.10432645678520203,
0.009276872500777245,
0.06678687036037445,
-0.02361975610256195,
-0.0030751030426472425,
0.08530234545469284,
-0.019469894468784332,
-0.10770940780639648,
-0.08061136305332184,
0.09170249849557877,
-0.10910338163375854,
0.22337853908538818,
-0.0447578951716423,
0.05735598877072334,
0.12990860641002655,
0.0712352991104126,
-0.07847760617733002,
0.05848822370171547,
0.03836846724152565,
-0.06419610232114792,
0.026543453335762024,
0.0695822536945343,
-0.038894206285476685,
0.05666741356253624,
0.04231509193778038,
-0.14395594596862793,
0.022577930241823196,
-0.06142842397093773,
-0.06316056102514267,
-0.04586682468652725,
-0.023666471242904663,
-0.059554725885391235,
0.13229267299175262,
0.21798373758792877,
-0.02747350186109543,
-0.012410398572683334,
-0.07058558613061905,
0.01229069009423256,
0.05498321354389191,
0.021564923226833344,
-0.06255114823579788,
-0.2111641764640808,
0.022540688514709473,
0.046501778066158295,
-0.020975574851036072,
-0.2504262924194336,
-0.097313292324543,
0.002028163056820631,
-0.07102660834789276,
-0.0982884019613266,
0.06894835084676743,
0.09115249663591385,
0.05199338123202324,
-0.05813044682145119,
-0.05868222564458847,
-0.06910606473684311,
0.1469157636165619,
-0.1437578797340393,
-0.0993814542889595
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-mrpc
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6645
- Accuracy: 0.7917
- F1: 0.8590
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 63 | 0.5387 | 0.7402 | 0.8349 |
| No log | 2.0 | 126 | 0.5770 | 0.7696 | 0.8513 |
| No log | 3.0 | 189 | 0.5357 | 0.7574 | 0.8223 |
| No log | 4.0 | 252 | 0.6645 | 0.7917 | 0.8590 |
| No log | 5.0 | 315 | 0.6977 | 0.7721 | 0.8426 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
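As a rough guide, the configuration listed above maps onto a standard `transformers` `Trainer` run. The sketch below only mirrors the listed hyperparameters; the exact training script, data subset, and preprocessing behind this card are not published here, so those parts (tokenizing both MRPC sentences, using the official `glue`/`mrpc` splits) are assumptions:

```python
# Minimal sketch of an MRPC fine-tuning run with the hyperparameters listed above.
# Assumes the standard datasets/transformers APIs; not the exact script used for this card.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

raw = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # MRPC is a sentence-pair task, so both sentences go to the tokenizer.
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

encoded = raw.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-mrpc",
    learning_rate=2e-5,              # as listed above
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",     # the results table reports one evaluation per epoch
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,             # enables the default padding collator
)
trainer.train()
```

Note that the step counts in the results table (63 steps per epoch at batch size 16, roughly 1,000 examples) suggest the card was trained on a subset rather than the full 3,668-example MRPC train split, so running this sketch on the full split will not reproduce the table exactly.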
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy", "f1"], "model-index": [{"name": "bert-base-uncased-finetuned-mrpc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.7916666666666666, "name": "Accuracy"}, {"type": "f1", "value": 0.8590381426202321, "name": "F1"}]}]}]}
|
text-classification
|
anirudh21/bert-base-uncased-finetuned-mrpc
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-uncased-finetuned-mrpc
================================
This model is a fine-tuned version of bert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6645
* Accuracy: 0.7917
* F1: 0.8590
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
65,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.11133186519145966,
0.08905862271785736,
-0.0019529515411704779,
0.11926262825727463,
0.16365353763103485,
0.04162043333053589,
0.1258639693260193,
0.12645882368087769,
-0.07455145567655563,
0.022976869717240334,
0.12096518278121948,
0.15348933637142181,
0.023043740540742874,
0.10704247653484344,
-0.040501244366168976,
-0.26026225090026855,
-0.013247538357973099,
0.04851532354950905,
-0.06437662988901138,
0.13277100026607513,
0.0866004005074501,
-0.12581570446491241,
0.09601885080337524,
0.009649556130170822,
-0.19422127306461334,
0.0017206711927428842,
0.007977565750479698,
-0.05212786793708801,
0.15086549520492554,
0.02926788479089737,
0.12350449711084366,
0.0014905447605997324,
0.08821775764226913,
-0.19845278561115265,
0.011854307726025581,
0.04817909002304077,
0.0027517774142324924,
0.0955868512392044,
0.051991056650877,
0.007646949961781502,
0.12070808559656143,
-0.07913677394390106,
0.05419687181711197,
0.0324590690433979,
-0.12292025983333588,
-0.22264188528060913,
-0.07268179953098297,
0.040208227932453156,
0.06921274960041046,
0.10662046819925308,
-0.006979895289987326,
0.12474232167005539,
-0.09115926176309586,
0.08684293180704117,
0.2320433109998703,
-0.29412421584129333,
-0.06459763646125793,
0.04311281442642212,
0.01435175258666277,
0.04788975790143013,
-0.11191442608833313,
-0.03154014050960541,
0.05195839703083038,
0.0461265929043293,
0.12408581376075745,
-0.02880127541720867,
-0.11727513372898102,
0.017000213265419006,
-0.13744957745075226,
-0.023232145234942436,
0.15813182294368744,
0.04339705780148506,
-0.028095675632357597,
-0.04658012092113495,
-0.05194289609789848,
-0.15150019526481628,
-0.0341365784406662,
-0.009446341544389725,
0.050389364361763,
-0.026576237753033638,
-0.05501849949359894,
-0.0005341381765902042,
-0.11177557706832886,
-0.07694505900144577,
-0.07482677698135376,
0.12085600942373276,
0.033654965460300446,
0.013262685388326645,
-0.03246248513460159,
0.11479304730892181,
-0.011589611880481243,
-0.12685932219028473,
0.022527087479829788,
0.027043979614973068,
0.0019975421018898487,
-0.04214833676815033,
-0.05068589746952057,
-0.048043690621852875,
0.016031749546527863,
0.13082049787044525,
-0.04954429715871811,
0.042875126004219055,
0.059499725699424744,
0.04633919149637222,
-0.09586818516254425,
0.1958521455526352,
-0.043208133429288864,
-0.027714340016245842,
-0.006344474386423826,
0.048830073326826096,
0.013338039629161358,
-0.010817881673574448,
-0.1172495111823082,
0.0006783725111745298,
0.08582327514886856,
0.007432086858898401,
-0.06191590428352356,
0.0717836543917656,
-0.06002074480056763,
-0.021904723718762398,
-0.002673383802175522,
-0.08606187254190445,
0.024990789592266083,
0.006296610459685326,
-0.0766998827457428,
-0.021041015163064003,
0.03123394027352333,
0.016464808955788612,
-0.013112714514136314,
0.11979532986879349,
-0.09522491693496704,
0.02930484339594841,
-0.09236255288124084,
-0.11073702573776245,
0.019086187705397606,
-0.0985613614320755,
0.02576073631644249,
-0.09113888442516327,
-0.16786926984786987,
-0.013992294669151306,
0.05608450621366501,
-0.027498900890350342,
-0.05760985240340233,
-0.0475313775241375,
-0.06458979099988937,
0.012755324132740498,
-0.009550596587359905,
0.12700213491916656,
-0.06766325980424881,
0.09375490248203278,
0.03293578326702118,
0.06472074240446091,
-0.04454372823238373,
0.05696758255362511,
-0.10222452878952026,
0.00863958615809679,
-0.1531236618757248,
0.025275954976677895,
-0.054325055330991745,
0.07460157573223114,
-0.08293235301971436,
-0.0979761928319931,
0.006310020573437214,
0.001816469244658947,
0.0621962696313858,
0.10018028318881989,
-0.17708736658096313,
-0.08879605680704117,
0.16124749183654785,
-0.0710144191980362,
-0.13228903710842133,
0.11820337921380997,
-0.05107412859797478,
0.054325610399246216,
0.06299453973770142,
0.16892650723457336,
0.07679277658462524,
-0.08739560097455978,
-0.003738977015018463,
0.03158546984195709,
0.05849802494049072,
-0.07640291005373001,
0.07802613079547882,
-0.0030734259635210037,
0.009865626692771912,
0.033974140882492065,
-0.03170346841216087,
0.06573118269443512,
-0.09504654258489609,
-0.09940528124570847,
-0.037974968552589417,
-0.08879686892032623,
0.044805895537137985,
0.08003824949264526,
0.06611926853656769,
-0.09458979964256287,
-0.08521248400211334,
0.04980127513408661,
0.08448254317045212,
-0.04761314392089844,
0.020896904170513153,
-0.05010635033249855,
0.06983544677495956,
-0.032594289630651474,
-0.02474372647702694,
-0.17688089609146118,
-0.037930309772491455,
0.003483257722109556,
-0.0039383661933243275,
0.02035306766629219,
0.03337440267205238,
0.06896461546421051,
0.06123299151659012,
-0.053501445800065994,
-0.02038055844604969,
-0.03648308664560318,
-0.00002287392817379441,
-0.1335783153772354,
-0.21121446788311005,
-0.03343959152698517,
-0.019917041063308716,
0.15845483541488647,
-0.20825287699699402,
0.04502106085419655,
-0.011167707853019238,
0.07090239226818085,
0.012584364973008633,
-0.005436633713543415,
-0.044851191341876984,
0.0748448297381401,
-0.03540888428688049,
-0.04858015850186348,
0.07784289866685867,
0.011372145265340805,
-0.09612031280994415,
-0.049932003021240234,
-0.09360146522521973,
0.17633934319019318,
0.13601675629615784,
-0.11704548448324203,
-0.07527592033147812,
-0.016421815380454063,
-0.06632382422685623,
-0.036007825285196304,
-0.04987955093383789,
0.025685802102088928,
0.1804990917444229,
-0.003062593284994364,
0.1465587466955185,
-0.06498240679502487,
-0.043712347745895386,
0.020373014733195305,
-0.033853475004434586,
0.025405069813132286,
0.13192012906074524,
0.1413542479276657,
-0.05207671597599983,
0.1545637845993042,
0.15501214563846588,
-0.08902566134929657,
0.14543376863002777,
-0.04214373603463173,
-0.0748271718621254,
-0.016834532842040062,
-0.03663995862007141,
-0.008665396831929684,
0.11529027670621872,
-0.16037599742412567,
-0.005457292776554823,
0.02912897802889347,
0.014270984567701817,
0.02504456229507923,
-0.2274867594242096,
-0.04320798069238663,
0.03586262837052345,
-0.04130497947335243,
-0.01643715798854828,
-0.014064619317650795,
0.0027289174031466246,
0.10714861750602722,
0.005249135196208954,
-0.08377781510353088,
0.030804669484496117,
0.003663246054202318,
-0.08201051503419876,
0.22146055102348328,
-0.07282085716724396,
-0.15294183790683746,
-0.13444940745830536,
-0.06660070270299911,
-0.04344457760453224,
-0.0014246109640225768,
0.06601156294345856,
-0.10414940118789673,
-0.028113164007663727,
-0.06262822449207306,
0.03830212354660034,
0.004589039832353592,
0.03437737375497818,
-0.004018603358417749,
0.00657397136092186,
0.06800608336925507,
-0.11233603954315186,
-0.013323396444320679,
-0.06123647093772888,
-0.054307855665683746,
0.03794674575328827,
0.03348831459879875,
0.1152501180768013,
0.15420082211494446,
-0.008943500928580761,
0.009582378901541233,
-0.02896696701645851,
0.23741787672042847,
-0.06176336109638214,
-0.02700284868478775,
0.1373521089553833,
-0.007044375408440828,
0.046653326600790024,
0.11434922367334366,
0.08235130459070206,
-0.07713320851325989,
0.001954220002517104,
0.04405386000871658,
-0.030877048149704933,
-0.23404191434383392,
-0.04935338348150253,
-0.05210452526807785,
0.010211923159658909,
0.08878380060195923,
0.0263607706874609,
0.033074650913476944,
0.0701175332069397,
0.03870290890336037,
0.07352612167596817,
-0.04850434139370918,
0.04964708164334297,
0.11228486150503159,
0.03282718360424042,
0.1266881227493286,
-0.04986396059393883,
-0.06297986954450607,
0.043346017599105835,
-0.015013319440186024,
0.2204647958278656,
0.010978217236697674,
0.13841557502746582,
0.06098794937133789,
0.1687219887971878,
-0.007599520031362772,
0.07964138686656952,
-0.006470580119639635,
-0.0480966717004776,
-0.012450417503714561,
-0.042461179196834564,
-0.03277895227074623,
0.02386617846786976,
-0.0633908063173294,
0.07316172122955322,
-0.12784387171268463,
0.002153912326321006,
0.05816841125488281,
0.24054603278636932,
0.043740857392549515,
-0.31777966022491455,
-0.09679872542619705,
0.0009995258878916502,
-0.02303040400147438,
-0.019651269540190697,
0.023238854482769966,
0.09026823192834854,
-0.09268199652433395,
0.023662075400352478,
-0.06703127175569534,
0.09906945377588272,
-0.051414329558610916,
0.052619848400354385,
0.09066727012395859,
0.09268876165151596,
0.004245291464030743,
0.09179902076721191,
-0.2841222882270813,
0.2825339734554291,
0.006451143883168697,
0.06203732267022133,
-0.07681001722812653,
0.004484171513468027,
0.042655907571315765,
0.0672745406627655,
0.07669815421104431,
-0.013844494707882404,
-0.013681527227163315,
-0.20734888315200806,
-0.06543828547000885,
0.03263978287577629,
0.07243955135345459,
-0.045596618205308914,
0.08456134051084518,
-0.03200501576066017,
0.009431060403585434,
0.07785575091838837,
0.007943558506667614,
-0.05700425058603287,
-0.0989052876830101,
-0.0066357944160699844,
0.026470888406038284,
-0.058259185403585434,
-0.06398952752351761,
-0.1230858787894249,
-0.12608890235424042,
0.15262210369110107,
-0.044219762086868286,
-0.03065975196659565,
-0.11101675778627396,
0.08675365895032883,
0.06787785142660141,
-0.09378422051668167,
0.03687017410993576,
0.0008370286086574197,
0.07200143486261368,
0.029249131679534912,
-0.07188139110803604,
0.11157342791557312,
-0.07399313151836395,
-0.1556416153907776,
-0.06647086888551712,
0.104044109582901,
0.0331459641456604,
0.06995214521884918,
-0.019532447680830956,
0.00719364732503891,
-0.04930555820465088,
-0.09108687192201614,
0.020983189344406128,
-0.006218554452061653,
0.0660010576248169,
0.01643621176481247,
-0.07195252925157547,
0.016427665948867798,
-0.0584491491317749,
-0.03676079213619232,
0.195094496011734,
0.231685072183609,
-0.10245425999164581,
0.014935157261788845,
0.0347125418484211,
-0.07203398644924164,
-0.20408256351947784,
0.03791389986872673,
0.04614792764186859,
0.011792877689003944,
0.04392878711223602,
-0.18103481829166412,
0.15197651088237762,
0.1111534908413887,
-0.014523960649967194,
0.10267108678817749,
-0.3126608729362488,
-0.1235581636428833,
0.14298833906650543,
0.1342335194349289,
0.11511382460594177,
-0.14407789707183838,
-0.024876737967133522,
-0.02181370183825493,
-0.1444968283176422,
0.11025463044643402,
-0.1117299497127533,
0.12037906050682068,
-0.0398285835981369,
0.06697919219732285,
0.0030573406256735325,
-0.060316167771816254,
0.1276148408651352,
0.024638628587126732,
0.09279965609312057,
-0.05677800625562668,
-0.04094500467181206,
0.03420594707131386,
-0.039200082421302795,
0.03079371154308319,
-0.10382209718227386,
0.025593414902687073,
-0.10117554664611816,
-0.02422892302274704,
-0.07245633751153946,
0.045847080647945404,
-0.046680010855197906,
-0.06783711165189743,
-0.033105358481407166,
0.028009602800011635,
0.031175268813967705,
-0.013056238181889057,
0.12842966616153717,
0.018968183547258377,
0.15660439431667328,
0.09254854172468185,
0.07436569780111313,
-0.07338538765907288,
-0.08729100972414017,
-0.020110439509153366,
-0.015169162303209305,
0.05668302997946739,
-0.14192290604114532,
0.019950851798057556,
0.1506595015525818,
0.024422260001301765,
0.14566189050674438,
0.0867476612329483,
-0.02538667619228363,
-0.004407581873238087,
0.06520739942789078,
-0.16033421456813812,
-0.08821505308151245,
-0.018662557005882263,
-0.06888008862733841,
-0.13312126696109772,
0.04920397698879242,
0.08933067321777344,
-0.06783970445394516,
-0.005022241733968258,
-0.005429842043668032,
0.005148249678313732,
-0.05956215411424637,
0.19488242268562317,
0.06826712936162949,
0.04803852364420891,
-0.09914720803499222,
0.06858506798744202,
0.04464946314692497,
-0.07150598615407944,
-0.005870631895959377,
0.07404763996601105,
-0.08173765242099762,
-0.053046490997076035,
0.07989339530467987,
0.20063315331935883,
-0.052188098430633545,
-0.05016574636101723,
-0.1473959982395172,
-0.12722855806350708,
0.07782372087240219,
0.14665807783603668,
0.12170588970184326,
0.014889874495565891,
-0.05797991901636124,
0.006879615131765604,
-0.10471697151660919,
0.09580202400684357,
0.04195791110396385,
0.06361298263072968,
-0.1446261703968048,
0.14945848286151886,
0.018656911328434944,
0.04625488072633743,
-0.01877550035715103,
0.026508020237088203,
-0.11395233124494553,
0.006303508300334215,
-0.1044108048081398,
-0.023725476115942,
-0.02777123637497425,
0.009469111450016499,
-0.003293682122603059,
-0.0564289465546608,
-0.05815250054001808,
0.007981112226843834,
-0.10841017216444016,
-0.023606132715940475,
0.03359111398458481,
0.07412499189376831,
-0.11748779565095901,
-0.03310609608888626,
0.03258558735251427,
-0.05866628140211105,
0.06956537067890167,
0.041607704013586044,
0.027655839920043945,
0.062097810208797455,
-0.1473986804485321,
0.019762378185987473,
0.06487227231264114,
0.022647852078080177,
0.06713290512561798,
-0.08713969588279724,
-0.010961699299514294,
-0.014117802493274212,
0.05160350725054741,
0.021655473858118057,
0.07142218202352524,
-0.13985759019851685,
-0.0021922437008470297,
-0.022398816421628,
-0.09011746197938919,
-0.06335126608610153,
0.026643486693501472,
0.09845574200153351,
0.013312122784554958,
0.1954699158668518,
-0.07467038184404373,
0.04432955011725426,
-0.22673293948173523,
0.012374944984912872,
-0.012795829214155674,
-0.10381277650594711,
-0.10733368992805481,
-0.07530756294727325,
0.05926626920700073,
-0.058445077389478683,
0.14853167533874512,
0.03978360444307327,
0.03640449047088623,
0.030186934396624565,
-0.004084479529410601,
0.017567522823810577,
0.014031066559255123,
0.20188163220882416,
0.03866882249712944,
-0.03264259919524193,
0.057398777455091476,
0.049991462379693985,
0.09843739122152328,
0.11804857105016708,
0.20447298884391785,
0.1436595320701599,
-0.00314920162782073,
0.09408754110336304,
0.048237256705760956,
-0.06430841237306595,
-0.14881686866283417,
0.04514099285006523,
-0.04295225813984871,
0.09882565587759018,
-0.02223559468984604,
0.22379912436008453,
0.06590496003627777,
-0.16823959350585938,
0.045339588075876236,
-0.0586080364882946,
-0.08733558654785156,
-0.12066669017076492,
-0.03335215151309967,
-0.07706573605537415,
-0.1345870941877365,
-0.0034736711531877518,
-0.11086759716272354,
-0.004712322261184454,
0.13174545764923096,
0.00435912050306797,
-0.022046202793717384,
0.16329890489578247,
0.01685476116836071,
0.027369504794478416,
0.05427151545882225,
0.011038078926503658,
-0.03601040318608284,
-0.13445979356765747,
-0.0557694211602211,
-0.014484543353319168,
-0.012583866715431213,
0.024787230417132378,
-0.06907813996076584,
-0.06132526695728302,
0.038356371223926544,
-0.014558801427483559,
-0.10189516097307205,
0.008760918863117695,
0.00524946441873908,
0.05576273426413536,
0.0421140231192112,
0.0041798437014222145,
0.0240002628415823,
-0.007524000480771065,
0.20796890556812286,
-0.07884730398654938,
-0.06447882950305939,
-0.10116071254014969,
0.24150483310222626,
0.03024158999323845,
-0.01677129603922367,
0.03052443638443947,
-0.068937286734581,
0.00332126347348094,
0.2530229985713959,
0.2168397456407547,
-0.08563920855522156,
-0.005170372314751148,
0.019505873322486877,
-0.0077157290652394295,
-0.02484976127743721,
0.09807652980089188,
0.13559643924236298,
0.0541095957159996,
-0.09923514723777771,
-0.03597794473171234,
-0.05173882842063904,
-0.019268035888671875,
-0.028210937976837158,
0.06725333631038666,
0.05516672506928444,
0.010451171547174454,
-0.041946202516555786,
0.0554967075586319,
-0.0605655238032341,
-0.08724214881658554,
0.06884598731994629,
-0.21638870239257812,
-0.16818119585514069,
-0.021216057240962982,
0.1115516647696495,
0.007965395227074623,
0.06611339747905731,
-0.02789974957704544,
-0.0024506242480129004,
0.08178551495075226,
-0.018245579674839973,
-0.11011886596679688,
-0.0889228880405426,
0.09163569658994675,
-0.0986751914024353,
0.22029219567775726,
-0.04808669164776802,
0.058025676757097244,
0.12799890339374542,
0.06588960438966751,
-0.07130536437034607,
0.06044834852218628,
0.04130556061863899,
-0.06485708802938461,
0.020893828943371773,
0.06521593034267426,
-0.03691593185067177,
0.06423820555210114,
0.04287641867995262,
-0.1430591344833374,
0.0246424600481987,
-0.05352221056818962,
-0.07084938883781433,
-0.0436973012983799,
-0.025169692933559418,
-0.05865161120891571,
0.13127975165843964,
0.21824824810028076,
-0.026447517797350883,
-0.012017136439681053,
-0.06991426646709442,
0.011169633828103542,
0.060161199420690536,
0.025424111634492874,
-0.06303494423627853,
-0.20723320543766022,
0.020028190687298775,
0.04028359428048134,
-0.0187641941010952,
-0.2560427784919739,
-0.09827766567468643,
0.0031050678808242083,
-0.06987909972667694,
-0.0955338403582573,
0.07019004225730896,
0.09278862178325653,
0.055528778582811356,
-0.05535483732819557,
-0.06456311047077179,
-0.0702747106552124,
0.14895184338092804,
-0.14866593480110168,
-0.10021559149026871
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-qnli
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6268
- Accuracy: 0.7917
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 63 | 0.5339 | 0.7620 |
| No log | 2.0 | 126 | 0.4728 | 0.7866 |
| No log | 3.0 | 189 | 0.5386 | 0.7847 |
| No log | 4.0 | 252 | 0.6096 | 0.7904 |
| No log | 5.0 | 315 | 0.6268 | 0.7917 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
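For completeness, a hedged sketch of loading the published checkpoint for inference is shown below; the question/sentence pair is illustrative only, and the default `LABEL_0`/`LABEL_1` names apply unless `id2label` was customised during training:

```python
# Sketch of sequence-pair inference with the fine-tuned QNLI checkpoint named in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

ckpt = "anirudh21/bert-base-uncased-finetuned-qnli"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSequenceClassification.from_pretrained(ckpt).eval()

# QNLI pairs a question with a candidate answer sentence (this example pair is made up).
question = "What percentage of farmland grows wheat?"
sentence = "More than 50% of this area is sown for wheat."

inputs = tokenizer(question, sentence, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])  # e.g. "LABEL_0" / "LABEL_1" by default
```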
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "bert-base-uncased-finetuned-qnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "qnli"}, "metrics": [{"type": "accuracy", "value": 0.791689547867472, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/bert-base-uncased-finetuned-qnli
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-uncased-finetuned-qnli
================================
This model is a fine-tuned version of bert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6268
* Accuracy: 0.7917
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
65,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.11133186519145966,
0.08905862271785736,
-0.0019529515411704779,
0.11926262825727463,
0.16365353763103485,
0.04162043333053589,
0.1258639693260193,
0.12645882368087769,
-0.07455145567655563,
0.022976869717240334,
0.12096518278121948,
0.15348933637142181,
0.023043740540742874,
0.10704247653484344,
-0.040501244366168976,
-0.26026225090026855,
-0.013247538357973099,
0.04851532354950905,
-0.06437662988901138,
0.13277100026607513,
0.0866004005074501,
-0.12581570446491241,
0.09601885080337524,
0.009649556130170822,
-0.19422127306461334,
0.0017206711927428842,
0.007977565750479698,
-0.05212786793708801,
0.15086549520492554,
0.02926788479089737,
0.12350449711084366,
0.0014905447605997324,
0.08821775764226913,
-0.19845278561115265,
0.011854307726025581,
0.04817909002304077,
0.0027517774142324924,
0.0955868512392044,
0.051991056650877,
0.007646949961781502,
0.12070808559656143,
-0.07913677394390106,
0.05419687181711197,
0.0324590690433979,
-0.12292025983333588,
-0.22264188528060913,
-0.07268179953098297,
0.040208227932453156,
0.06921274960041046,
0.10662046819925308,
-0.006979895289987326,
0.12474232167005539,
-0.09115926176309586,
0.08684293180704117,
0.2320433109998703,
-0.29412421584129333,
-0.06459763646125793,
0.04311281442642212,
0.01435175258666277,
0.04788975790143013,
-0.11191442608833313,
-0.03154014050960541,
0.05195839703083038,
0.0461265929043293,
0.12408581376075745,
-0.02880127541720867,
-0.11727513372898102,
0.017000213265419006,
-0.13744957745075226,
-0.023232145234942436,
0.15813182294368744,
0.04339705780148506,
-0.028095675632357597,
-0.04658012092113495,
-0.05194289609789848,
-0.15150019526481628,
-0.0341365784406662,
-0.009446341544389725,
0.050389364361763,
-0.026576237753033638,
-0.05501849949359894,
-0.0005341381765902042,
-0.11177557706832886,
-0.07694505900144577,
-0.07482677698135376,
0.12085600942373276,
0.033654965460300446,
0.013262685388326645,
-0.03246248513460159,
0.11479304730892181,
-0.011589611880481243,
-0.12685932219028473,
0.022527087479829788,
0.027043979614973068,
0.0019975421018898487,
-0.04214833676815033,
-0.05068589746952057,
-0.048043690621852875,
0.016031749546527863,
0.13082049787044525,
-0.04954429715871811,
0.042875126004219055,
0.059499725699424744,
0.04633919149637222,
-0.09586818516254425,
0.1958521455526352,
-0.043208133429288864,
-0.027714340016245842,
-0.006344474386423826,
0.048830073326826096,
0.013338039629161358,
-0.010817881673574448,
-0.1172495111823082,
0.0006783725111745298,
0.08582327514886856,
0.007432086858898401,
-0.06191590428352356,
0.0717836543917656,
-0.06002074480056763,
-0.021904723718762398,
-0.002673383802175522,
-0.08606187254190445,
0.024990789592266083,
0.006296610459685326,
-0.0766998827457428,
-0.021041015163064003,
0.03123394027352333,
0.016464808955788612,
-0.013112714514136314,
0.11979532986879349,
-0.09522491693496704,
0.02930484339594841,
-0.09236255288124084,
-0.11073702573776245,
0.019086187705397606,
-0.0985613614320755,
0.02576073631644249,
-0.09113888442516327,
-0.16786926984786987,
-0.013992294669151306,
0.05608450621366501,
-0.027498900890350342,
-0.05760985240340233,
-0.0475313775241375,
-0.06458979099988937,
0.012755324132740498,
-0.009550596587359905,
0.12700213491916656,
-0.06766325980424881,
0.09375490248203278,
0.03293578326702118,
0.06472074240446091,
-0.04454372823238373,
0.05696758255362511,
-0.10222452878952026,
0.00863958615809679,
-0.1531236618757248,
0.025275954976677895,
-0.054325055330991745,
0.07460157573223114,
-0.08293235301971436,
-0.0979761928319931,
0.006310020573437214,
0.001816469244658947,
0.0621962696313858,
0.10018028318881989,
-0.17708736658096313,
-0.08879605680704117,
0.16124749183654785,
-0.0710144191980362,
-0.13228903710842133,
0.11820337921380997,
-0.05107412859797478,
0.054325610399246216,
0.06299453973770142,
0.16892650723457336,
0.07679277658462524,
-0.08739560097455978,
-0.003738977015018463,
0.03158546984195709,
0.05849802494049072,
-0.07640291005373001,
0.07802613079547882,
-0.0030734259635210037,
0.009865626692771912,
0.033974140882492065,
-0.03170346841216087,
0.06573118269443512,
-0.09504654258489609,
-0.09940528124570847,
-0.037974968552589417,
-0.08879686892032623,
0.044805895537137985,
0.08003824949264526,
0.06611926853656769,
-0.09458979964256287,
-0.08521248400211334,
0.04980127513408661,
0.08448254317045212,
-0.04761314392089844,
0.020896904170513153,
-0.05010635033249855,
0.06983544677495956,
-0.032594289630651474,
-0.02474372647702694,
-0.17688089609146118,
-0.037930309772491455,
0.003483257722109556,
-0.0039383661933243275,
0.02035306766629219,
0.03337440267205238,
0.06896461546421051,
0.06123299151659012,
-0.053501445800065994,
-0.02038055844604969,
-0.03648308664560318,
-0.00002287392817379441,
-0.1335783153772354,
-0.21121446788311005,
-0.03343959152698517,
-0.019917041063308716,
0.15845483541488647,
-0.20825287699699402,
0.04502106085419655,
-0.011167707853019238,
0.07090239226818085,
0.012584364973008633,
-0.005436633713543415,
-0.044851191341876984,
0.0748448297381401,
-0.03540888428688049,
-0.04858015850186348,
0.07784289866685867,
0.011372145265340805,
-0.09612031280994415,
-0.049932003021240234,
-0.09360146522521973,
0.17633934319019318,
0.13601675629615784,
-0.11704548448324203,
-0.07527592033147812,
-0.016421815380454063,
-0.06632382422685623,
-0.036007825285196304,
-0.04987955093383789,
0.025685802102088928,
0.1804990917444229,
-0.003062593284994364,
0.1465587466955185,
-0.06498240679502487,
-0.043712347745895386,
0.020373014733195305,
-0.033853475004434586,
0.025405069813132286,
0.13192012906074524,
0.1413542479276657,
-0.05207671597599983,
0.1545637845993042,
0.15501214563846588,
-0.08902566134929657,
0.14543376863002777,
-0.04214373603463173,
-0.0748271718621254,
-0.016834532842040062,
-0.03663995862007141,
-0.008665396831929684,
0.11529027670621872,
-0.16037599742412567,
-0.005457292776554823,
0.02912897802889347,
0.014270984567701817,
0.02504456229507923,
-0.2274867594242096,
-0.04320798069238663,
0.03586262837052345,
-0.04130497947335243,
-0.01643715798854828,
-0.014064619317650795,
0.0027289174031466246,
0.10714861750602722,
0.005249135196208954,
-0.08377781510353088,
0.030804669484496117,
0.003663246054202318,
-0.08201051503419876,
0.22146055102348328,
-0.07282085716724396,
-0.15294183790683746,
-0.13444940745830536,
-0.06660070270299911,
-0.04344457760453224,
-0.0014246109640225768,
0.06601156294345856,
-0.10414940118789673,
-0.028113164007663727,
-0.06262822449207306,
0.03830212354660034,
0.004589039832353592,
0.03437737375497818,
-0.004018603358417749,
0.00657397136092186,
0.06800608336925507,
-0.11233603954315186,
-0.013323396444320679,
-0.06123647093772888,
-0.054307855665683746,
0.03794674575328827,
0.03348831459879875,
0.1152501180768013,
0.15420082211494446,
-0.008943500928580761,
0.009582378901541233,
-0.02896696701645851,
0.23741787672042847,
-0.06176336109638214,
-0.02700284868478775,
0.1373521089553833,
-0.007044375408440828,
0.046653326600790024,
0.11434922367334366,
0.08235130459070206,
-0.07713320851325989,
0.001954220002517104,
0.04405386000871658,
-0.030877048149704933,
-0.23404191434383392,
-0.04935338348150253,
-0.05210452526807785,
0.010211923159658909,
0.08878380060195923,
0.0263607706874609,
0.033074650913476944,
0.0701175332069397,
0.03870290890336037,
0.07352612167596817,
-0.04850434139370918,
0.04964708164334297,
0.11228486150503159,
0.03282718360424042,
0.1266881227493286,
-0.04986396059393883,
-0.06297986954450607,
0.043346017599105835,
-0.015013319440186024,
0.2204647958278656,
0.010978217236697674,
0.13841557502746582,
0.06098794937133789,
0.1687219887971878,
-0.007599520031362772,
0.07964138686656952,
-0.006470580119639635,
-0.0480966717004776,
-0.012450417503714561,
-0.042461179196834564,
-0.03277895227074623,
0.02386617846786976,
-0.0633908063173294,
0.07316172122955322,
-0.12784387171268463,
0.002153912326321006,
0.05816841125488281,
0.24054603278636932,
0.043740857392549515,
-0.31777966022491455,
-0.09679872542619705,
0.0009995258878916502,
-0.02303040400147438,
-0.019651269540190697,
0.023238854482769966,
0.09026823192834854,
-0.09268199652433395,
0.023662075400352478,
-0.06703127175569534,
0.09906945377588272,
-0.051414329558610916,
0.052619848400354385,
0.09066727012395859,
0.09268876165151596,
0.004245291464030743,
0.09179902076721191,
-0.2841222882270813,
0.2825339734554291,
0.006451143883168697,
0.06203732267022133,
-0.07681001722812653,
0.004484171513468027,
0.042655907571315765,
0.0672745406627655,
0.07669815421104431,
-0.013844494707882404,
-0.013681527227163315,
-0.20734888315200806,
-0.06543828547000885,
0.03263978287577629,
0.07243955135345459,
-0.045596618205308914,
0.08456134051084518,
-0.03200501576066017,
0.009431060403585434,
0.07785575091838837,
0.007943558506667614,
-0.05700425058603287,
-0.0989052876830101,
-0.0066357944160699844,
0.026470888406038284,
-0.058259185403585434,
-0.06398952752351761,
-0.1230858787894249,
-0.12608890235424042,
0.15262210369110107,
-0.044219762086868286,
-0.03065975196659565,
-0.11101675778627396,
0.08675365895032883,
0.06787785142660141,
-0.09378422051668167,
0.03687017410993576,
0.0008370286086574197,
0.07200143486261368,
0.029249131679534912,
-0.07188139110803604,
0.11157342791557312,
-0.07399313151836395,
-0.1556416153907776,
-0.06647086888551712,
0.104044109582901,
0.0331459641456604,
0.06995214521884918,
-0.019532447680830956,
0.00719364732503891,
-0.04930555820465088,
-0.09108687192201614,
0.020983189344406128,
-0.006218554452061653,
0.0660010576248169,
0.01643621176481247,
-0.07195252925157547,
0.016427665948867798,
-0.0584491491317749,
-0.03676079213619232,
0.195094496011734,
0.231685072183609,
-0.10245425999164581,
0.014935157261788845,
0.0347125418484211,
-0.07203398644924164,
-0.20408256351947784,
0.03791389986872673,
0.04614792764186859,
0.011792877689003944,
0.04392878711223602,
-0.18103481829166412,
0.15197651088237762,
0.1111534908413887,
-0.014523960649967194,
0.10267108678817749,
-0.3126608729362488,
-0.1235581636428833,
0.14298833906650543,
0.1342335194349289,
0.11511382460594177,
-0.14407789707183838,
-0.024876737967133522,
-0.02181370183825493,
-0.1444968283176422,
0.11025463044643402,
-0.1117299497127533,
0.12037906050682068,
-0.0398285835981369,
0.06697919219732285,
0.0030573406256735325,
-0.060316167771816254,
0.1276148408651352,
0.024638628587126732,
0.09279965609312057,
-0.05677800625562668,
-0.04094500467181206,
0.03420594707131386,
-0.039200082421302795,
0.03079371154308319,
-0.10382209718227386,
0.025593414902687073,
-0.10117554664611816,
-0.02422892302274704,
-0.07245633751153946,
0.045847080647945404,
-0.046680010855197906,
-0.06783711165189743,
-0.033105358481407166,
0.028009602800011635,
0.031175268813967705,
-0.013056238181889057,
0.12842966616153717,
0.018968183547258377,
0.15660439431667328,
0.09254854172468185,
0.07436569780111313,
-0.07338538765907288,
-0.08729100972414017,
-0.020110439509153366,
-0.015169162303209305,
0.05668302997946739,
-0.14192290604114532,
0.019950851798057556,
0.1506595015525818,
0.024422260001301765,
0.14566189050674438,
0.0867476612329483,
-0.02538667619228363,
-0.004407581873238087,
0.06520739942789078,
-0.16033421456813812,
-0.08821505308151245,
-0.018662557005882263,
-0.06888008862733841,
-0.13312126696109772,
0.04920397698879242,
0.08933067321777344,
-0.06783970445394516,
-0.005022241733968258,
-0.005429842043668032,
0.005148249678313732,
-0.05956215411424637,
0.19488242268562317,
0.06826712936162949,
0.04803852364420891,
-0.09914720803499222,
0.06858506798744202,
0.04464946314692497,
-0.07150598615407944,
-0.005870631895959377,
0.07404763996601105,
-0.08173765242099762,
-0.053046490997076035,
0.07989339530467987,
0.20063315331935883,
-0.052188098430633545,
-0.05016574636101723,
-0.1473959982395172,
-0.12722855806350708,
0.07782372087240219,
0.14665807783603668,
0.12170588970184326,
0.014889874495565891,
-0.05797991901636124,
0.006879615131765604,
-0.10471697151660919,
0.09580202400684357,
0.04195791110396385,
0.06361298263072968,
-0.1446261703968048,
0.14945848286151886,
0.018656911328434944,
0.04625488072633743,
-0.01877550035715103,
0.026508020237088203,
-0.11395233124494553,
0.006303508300334215,
-0.1044108048081398,
-0.023725476115942,
-0.02777123637497425,
0.009469111450016499,
-0.003293682122603059,
-0.0564289465546608,
-0.05815250054001808,
0.007981112226843834,
-0.10841017216444016,
-0.023606132715940475,
0.03359111398458481,
0.07412499189376831,
-0.11748779565095901,
-0.03310609608888626,
0.03258558735251427,
-0.05866628140211105,
0.06956537067890167,
0.041607704013586044,
0.027655839920043945,
0.062097810208797455,
-0.1473986804485321,
0.019762378185987473,
0.06487227231264114,
0.022647852078080177,
0.06713290512561798,
-0.08713969588279724,
-0.010961699299514294,
-0.014117802493274212,
0.05160350725054741,
0.021655473858118057,
0.07142218202352524,
-0.13985759019851685,
-0.0021922437008470297,
-0.022398816421628,
-0.09011746197938919,
-0.06335126608610153,
0.026643486693501472,
0.09845574200153351,
0.013312122784554958,
0.1954699158668518,
-0.07467038184404373,
0.04432955011725426,
-0.22673293948173523,
0.012374944984912872,
-0.012795829214155674,
-0.10381277650594711,
-0.10733368992805481,
-0.07530756294727325,
0.05926626920700073,
-0.058445077389478683,
0.14853167533874512,
0.03978360444307327,
0.03640449047088623,
0.030186934396624565,
-0.004084479529410601,
0.017567522823810577,
0.014031066559255123,
0.20188163220882416,
0.03866882249712944,
-0.03264259919524193,
0.057398777455091476,
0.049991462379693985,
0.09843739122152328,
0.11804857105016708,
0.20447298884391785,
0.1436595320701599,
-0.00314920162782073,
0.09408754110336304,
0.048237256705760956,
-0.06430841237306595,
-0.14881686866283417,
0.04514099285006523,
-0.04295225813984871,
0.09882565587759018,
-0.02223559468984604,
0.22379912436008453,
0.06590496003627777,
-0.16823959350585938,
0.045339588075876236,
-0.0586080364882946,
-0.08733558654785156,
-0.12066669017076492,
-0.03335215151309967,
-0.07706573605537415,
-0.1345870941877365,
-0.0034736711531877518,
-0.11086759716272354,
-0.004712322261184454,
0.13174545764923096,
0.00435912050306797,
-0.022046202793717384,
0.16329890489578247,
0.01685476116836071,
0.027369504794478416,
0.05427151545882225,
0.011038078926503658,
-0.03601040318608284,
-0.13445979356765747,
-0.0557694211602211,
-0.014484543353319168,
-0.012583866715431213,
0.024787230417132378,
-0.06907813996076584,
-0.06132526695728302,
0.038356371223926544,
-0.014558801427483559,
-0.10189516097307205,
0.008760918863117695,
0.00524946441873908,
0.05576273426413536,
0.0421140231192112,
0.0041798437014222145,
0.0240002628415823,
-0.007524000480771065,
0.20796890556812286,
-0.07884730398654938,
-0.06447882950305939,
-0.10116071254014969,
0.24150483310222626,
0.03024158999323845,
-0.01677129603922367,
0.03052443638443947,
-0.068937286734581,
0.00332126347348094,
0.2530229985713959,
0.2168397456407547,
-0.08563920855522156,
-0.005170372314751148,
0.019505873322486877,
-0.0077157290652394295,
-0.02484976127743721,
0.09807652980089188,
0.13559643924236298,
0.0541095957159996,
-0.09923514723777771,
-0.03597794473171234,
-0.05173882842063904,
-0.019268035888671875,
-0.028210937976837158,
0.06725333631038666,
0.05516672506928444,
0.010451171547174454,
-0.041946202516555786,
0.0554967075586319,
-0.0605655238032341,
-0.08724214881658554,
0.06884598731994629,
-0.21638870239257812,
-0.16818119585514069,
-0.021216057240962982,
0.1115516647696495,
0.007965395227074623,
0.06611339747905731,
-0.02789974957704544,
-0.0024506242480129004,
0.08178551495075226,
-0.018245579674839973,
-0.11011886596679688,
-0.0889228880405426,
0.09163569658994675,
-0.0986751914024353,
0.22029219567775726,
-0.04808669164776802,
0.058025676757097244,
0.12799890339374542,
0.06588960438966751,
-0.07130536437034607,
0.06044834852218628,
0.04130556061863899,
-0.06485708802938461,
0.020893828943371773,
0.06521593034267426,
-0.03691593185067177,
0.06423820555210114,
0.04287641867995262,
-0.1430591344833374,
0.0246424600481987,
-0.05352221056818962,
-0.07084938883781433,
-0.0436973012983799,
-0.025169692933559418,
-0.05865161120891571,
0.13127975165843964,
0.21824824810028076,
-0.026447517797350883,
-0.012017136439681053,
-0.06991426646709442,
0.011169633828103542,
0.060161199420690536,
0.025424111634492874,
-0.06303494423627853,
-0.20723320543766022,
0.020028190687298775,
0.04028359428048134,
-0.0187641941010952,
-0.2560427784919739,
-0.09827766567468643,
0.0031050678808242083,
-0.06987909972667694,
-0.0955338403582573,
0.07019004225730896,
0.09278862178325653,
0.055528778582811356,
-0.05535483732819557,
-0.06456311047077179,
-0.0702747106552124,
0.14895184338092804,
-0.14866593480110168,
-0.10021559149026871
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-rte
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8075
- Accuracy: 0.6643
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 63 | 0.6777 | 0.5668 |
| No log | 2.0 | 126 | 0.6723 | 0.6282 |
| No log | 3.0 | 189 | 0.7238 | 0.6318 |
| No log | 4.0 | 252 | 0.7993 | 0.6354 |
| No log | 5.0 | 315 | 0.8075 | 0.6643 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.1
- Tokenizers 0.10.3
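The reported accuracy can in principle be re-checked against the GLUE RTE validation split. The sketch below assumes the `datasets` 1.x metric API listed under framework versions, and it may not match the card exactly if a different evaluation subset was used:

```python
# Hedged sketch: scoring the published RTE checkpoint on the GLUE validation split.
import torch
from datasets import load_dataset, load_metric
from transformers import AutoTokenizer, AutoModelForSequenceClassification

ckpt = "anirudh21/bert-base-uncased-finetuned-rte"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForSequenceClassification.from_pretrained(ckpt).eval()

validation = load_dataset("glue", "rte", split="validation")
metric = load_metric("glue", "rte")  # accuracy for RTE

for example in validation:
    inputs = tokenizer(example["sentence1"], example["sentence2"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        prediction = model(**inputs).logits.argmax(dim=-1).item()
    metric.add(prediction=prediction, reference=example["label"])

print(metric.compute())  # {'accuracy': ...}, expected near the 0.6643 reported above
```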
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "bert-base-uncased-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.6642599277978339, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/bert-base-uncased-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-uncased-finetuned-rte
===============================
This model is a fine-tuned version of bert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8075
* Accuracy: 0.6643
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.1
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
65,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.1\n* Tokenizers 0.10.3"
] |
[
-0.11133186519145966,
0.08905862271785736,
-0.0019529515411704779,
0.11926262825727463,
0.16365353763103485,
0.04162043333053589,
0.1258639693260193,
0.12645882368087769,
-0.07455145567655563,
0.022976869717240334,
0.12096518278121948,
0.15348933637142181,
0.023043740540742874,
0.10704247653484344,
-0.040501244366168976,
-0.26026225090026855,
-0.013247538357973099,
0.04851532354950905,
-0.06437662988901138,
0.13277100026607513,
0.0866004005074501,
-0.12581570446491241,
0.09601885080337524,
0.009649556130170822,
-0.19422127306461334,
0.0017206711927428842,
0.007977565750479698,
-0.05212786793708801,
0.15086549520492554,
0.02926788479089737,
0.12350449711084366,
0.0014905447605997324,
0.08821775764226913,
-0.19845278561115265,
0.011854307726025581,
0.04817909002304077,
0.0027517774142324924,
0.0955868512392044,
0.051991056650877,
0.007646949961781502,
0.12070808559656143,
-0.07913677394390106,
0.05419687181711197,
0.0324590690433979,
-0.12292025983333588,
-0.22264188528060913,
-0.07268179953098297,
0.040208227932453156,
0.06921274960041046,
0.10662046819925308,
-0.006979895289987326,
0.12474232167005539,
-0.09115926176309586,
0.08684293180704117,
0.2320433109998703,
-0.29412421584129333,
-0.06459763646125793,
0.04311281442642212,
0.01435175258666277,
0.04788975790143013,
-0.11191442608833313,
-0.03154014050960541,
0.05195839703083038,
0.0461265929043293,
0.12408581376075745,
-0.02880127541720867,
-0.11727513372898102,
0.017000213265419006,
-0.13744957745075226,
-0.023232145234942436,
0.15813182294368744,
0.04339705780148506,
-0.028095675632357597,
-0.04658012092113495,
-0.05194289609789848,
-0.15150019526481628,
-0.0341365784406662,
-0.009446341544389725,
0.050389364361763,
-0.026576237753033638,
-0.05501849949359894,
-0.0005341381765902042,
-0.11177557706832886,
-0.07694505900144577,
-0.07482677698135376,
0.12085600942373276,
0.033654965460300446,
0.013262685388326645,
-0.03246248513460159,
0.11479304730892181,
-0.011589611880481243,
-0.12685932219028473,
0.022527087479829788,
0.027043979614973068,
0.0019975421018898487,
-0.04214833676815033,
-0.05068589746952057,
-0.048043690621852875,
0.016031749546527863,
0.13082049787044525,
-0.04954429715871811,
0.042875126004219055,
0.059499725699424744,
0.04633919149637222,
-0.09586818516254425,
0.1958521455526352,
-0.043208133429288864,
-0.027714340016245842,
-0.006344474386423826,
0.048830073326826096,
0.013338039629161358,
-0.010817881673574448,
-0.1172495111823082,
0.0006783725111745298,
0.08582327514886856,
0.007432086858898401,
-0.06191590428352356,
0.0717836543917656,
-0.06002074480056763,
-0.021904723718762398,
-0.002673383802175522,
-0.08606187254190445,
0.024990789592266083,
0.006296610459685326,
-0.0766998827457428,
-0.021041015163064003,
0.03123394027352333,
0.016464808955788612,
-0.013112714514136314,
0.11979532986879349,
-0.09522491693496704,
0.02930484339594841,
-0.09236255288124084,
-0.11073702573776245,
0.019086187705397606,
-0.0985613614320755,
0.02576073631644249,
-0.09113888442516327,
-0.16786926984786987,
-0.013992294669151306,
0.05608450621366501,
-0.027498900890350342,
-0.05760985240340233,
-0.0475313775241375,
-0.06458979099988937,
0.012755324132740498,
-0.009550596587359905,
0.12700213491916656,
-0.06766325980424881,
0.09375490248203278,
0.03293578326702118,
0.06472074240446091,
-0.04454372823238373,
0.05696758255362511,
-0.10222452878952026,
0.00863958615809679,
-0.1531236618757248,
0.025275954976677895,
-0.054325055330991745,
0.07460157573223114,
-0.08293235301971436,
-0.0979761928319931,
0.006310020573437214,
0.001816469244658947,
0.0621962696313858,
0.10018028318881989,
-0.17708736658096313,
-0.08879605680704117,
0.16124749183654785,
-0.0710144191980362,
-0.13228903710842133,
0.11820337921380997,
-0.05107412859797478,
0.054325610399246216,
0.06299453973770142,
0.16892650723457336,
0.07679277658462524,
-0.08739560097455978,
-0.003738977015018463,
0.03158546984195709,
0.05849802494049072,
-0.07640291005373001,
0.07802613079547882,
-0.0030734259635210037,
0.009865626692771912,
0.033974140882492065,
-0.03170346841216087,
0.06573118269443512,
-0.09504654258489609,
-0.09940528124570847,
-0.037974968552589417,
-0.08879686892032623,
0.044805895537137985,
0.08003824949264526,
0.06611926853656769,
-0.09458979964256287,
-0.08521248400211334,
0.04980127513408661,
0.08448254317045212,
-0.04761314392089844,
0.020896904170513153,
-0.05010635033249855,
0.06983544677495956,
-0.032594289630651474,
-0.02474372647702694,
-0.17688089609146118,
-0.037930309772491455,
0.003483257722109556,
-0.0039383661933243275,
0.02035306766629219,
0.03337440267205238,
0.06896461546421051,
0.06123299151659012,
-0.053501445800065994,
-0.02038055844604969,
-0.03648308664560318,
-0.00002287392817379441,
-0.1335783153772354,
-0.21121446788311005,
-0.03343959152698517,
-0.019917041063308716,
0.15845483541488647,
-0.20825287699699402,
0.04502106085419655,
-0.011167707853019238,
0.07090239226818085,
0.012584364973008633,
-0.005436633713543415,
-0.044851191341876984,
0.0748448297381401,
-0.03540888428688049,
-0.04858015850186348,
0.07784289866685867,
0.011372145265340805,
-0.09612031280994415,
-0.049932003021240234,
-0.09360146522521973,
0.17633934319019318,
0.13601675629615784,
-0.11704548448324203,
-0.07527592033147812,
-0.016421815380454063,
-0.06632382422685623,
-0.036007825285196304,
-0.04987955093383789,
0.025685802102088928,
0.1804990917444229,
-0.003062593284994364,
0.1465587466955185,
-0.06498240679502487,
-0.043712347745895386,
0.020373014733195305,
-0.033853475004434586,
0.025405069813132286,
0.13192012906074524,
0.1413542479276657,
-0.05207671597599983,
0.1545637845993042,
0.15501214563846588,
-0.08902566134929657,
0.14543376863002777,
-0.04214373603463173,
-0.0748271718621254,
-0.016834532842040062,
-0.03663995862007141,
-0.008665396831929684,
0.11529027670621872,
-0.16037599742412567,
-0.005457292776554823,
0.02912897802889347,
0.014270984567701817,
0.02504456229507923,
-0.2274867594242096,
-0.04320798069238663,
0.03586262837052345,
-0.04130497947335243,
-0.01643715798854828,
-0.014064619317650795,
0.0027289174031466246,
0.10714861750602722,
0.005249135196208954,
-0.08377781510353088,
0.030804669484496117,
0.003663246054202318,
-0.08201051503419876,
0.22146055102348328,
-0.07282085716724396,
-0.15294183790683746,
-0.13444940745830536,
-0.06660070270299911,
-0.04344457760453224,
-0.0014246109640225768,
0.06601156294345856,
-0.10414940118789673,
-0.028113164007663727,
-0.06262822449207306,
0.03830212354660034,
0.004589039832353592,
0.03437737375497818,
-0.004018603358417749,
0.00657397136092186,
0.06800608336925507,
-0.11233603954315186,
-0.013323396444320679,
-0.06123647093772888,
-0.054307855665683746,
0.03794674575328827,
0.03348831459879875,
0.1152501180768013,
0.15420082211494446,
-0.008943500928580761,
0.009582378901541233,
-0.02896696701645851,
0.23741787672042847,
-0.06176336109638214,
-0.02700284868478775,
0.1373521089553833,
-0.007044375408440828,
0.046653326600790024,
0.11434922367334366,
0.08235130459070206,
-0.07713320851325989,
0.001954220002517104,
0.04405386000871658,
-0.030877048149704933,
-0.23404191434383392,
-0.04935338348150253,
-0.05210452526807785,
0.010211923159658909,
0.08878380060195923,
0.0263607706874609,
0.033074650913476944,
0.0701175332069397,
0.03870290890336037,
0.07352612167596817,
-0.04850434139370918,
0.04964708164334297,
0.11228486150503159,
0.03282718360424042,
0.1266881227493286,
-0.04986396059393883,
-0.06297986954450607,
0.043346017599105835,
-0.015013319440186024,
0.2204647958278656,
0.010978217236697674,
0.13841557502746582,
0.06098794937133789,
0.1687219887971878,
-0.007599520031362772,
0.07964138686656952,
-0.006470580119639635,
-0.0480966717004776,
-0.012450417503714561,
-0.042461179196834564,
-0.03277895227074623,
0.02386617846786976,
-0.0633908063173294,
0.07316172122955322,
-0.12784387171268463,
0.002153912326321006,
0.05816841125488281,
0.24054603278636932,
0.043740857392549515,
-0.31777966022491455,
-0.09679872542619705,
0.0009995258878916502,
-0.02303040400147438,
-0.019651269540190697,
0.023238854482769966,
0.09026823192834854,
-0.09268199652433395,
0.023662075400352478,
-0.06703127175569534,
0.09906945377588272,
-0.051414329558610916,
0.052619848400354385,
0.09066727012395859,
0.09268876165151596,
0.004245291464030743,
0.09179902076721191,
-0.2841222882270813,
0.2825339734554291,
0.006451143883168697,
0.06203732267022133,
-0.07681001722812653,
0.004484171513468027,
0.042655907571315765,
0.0672745406627655,
0.07669815421104431,
-0.013844494707882404,
-0.013681527227163315,
-0.20734888315200806,
-0.06543828547000885,
0.03263978287577629,
0.07243955135345459,
-0.045596618205308914,
0.08456134051084518,
-0.03200501576066017,
0.009431060403585434,
0.07785575091838837,
0.007943558506667614,
-0.05700425058603287,
-0.0989052876830101,
-0.0066357944160699844,
0.026470888406038284,
-0.058259185403585434,
-0.06398952752351761,
-0.1230858787894249,
-0.12608890235424042,
0.15262210369110107,
-0.044219762086868286,
-0.03065975196659565,
-0.11101675778627396,
0.08675365895032883,
0.06787785142660141,
-0.09378422051668167,
0.03687017410993576,
0.0008370286086574197,
0.07200143486261368,
0.029249131679534912,
-0.07188139110803604,
0.11157342791557312,
-0.07399313151836395,
-0.1556416153907776,
-0.06647086888551712,
0.104044109582901,
0.0331459641456604,
0.06995214521884918,
-0.019532447680830956,
0.00719364732503891,
-0.04930555820465088,
-0.09108687192201614,
0.020983189344406128,
-0.006218554452061653,
0.0660010576248169,
0.01643621176481247,
-0.07195252925157547,
0.016427665948867798,
-0.0584491491317749,
-0.03676079213619232,
0.195094496011734,
0.231685072183609,
-0.10245425999164581,
0.014935157261788845,
0.0347125418484211,
-0.07203398644924164,
-0.20408256351947784,
0.03791389986872673,
0.04614792764186859,
0.011792877689003944,
0.04392878711223602,
-0.18103481829166412,
0.15197651088237762,
0.1111534908413887,
-0.014523960649967194,
0.10267108678817749,
-0.3126608729362488,
-0.1235581636428833,
0.14298833906650543,
0.1342335194349289,
0.11511382460594177,
-0.14407789707183838,
-0.024876737967133522,
-0.02181370183825493,
-0.1444968283176422,
0.11025463044643402,
-0.1117299497127533,
0.12037906050682068,
-0.0398285835981369,
0.06697919219732285,
0.0030573406256735325,
-0.060316167771816254,
0.1276148408651352,
0.024638628587126732,
0.09279965609312057,
-0.05677800625562668,
-0.04094500467181206,
0.03420594707131386,
-0.039200082421302795,
0.03079371154308319,
-0.10382209718227386,
0.025593414902687073,
-0.10117554664611816,
-0.02422892302274704,
-0.07245633751153946,
0.045847080647945404,
-0.046680010855197906,
-0.06783711165189743,
-0.033105358481407166,
0.028009602800011635,
0.031175268813967705,
-0.013056238181889057,
0.12842966616153717,
0.018968183547258377,
0.15660439431667328,
0.09254854172468185,
0.07436569780111313,
-0.07338538765907288,
-0.08729100972414017,
-0.020110439509153366,
-0.015169162303209305,
0.05668302997946739,
-0.14192290604114532,
0.019950851798057556,
0.1506595015525818,
0.024422260001301765,
0.14566189050674438,
0.0867476612329483,
-0.02538667619228363,
-0.004407581873238087,
0.06520739942789078,
-0.16033421456813812,
-0.08821505308151245,
-0.018662557005882263,
-0.06888008862733841,
-0.13312126696109772,
0.04920397698879242,
0.08933067321777344,
-0.06783970445394516,
-0.005022241733968258,
-0.005429842043668032,
0.005148249678313732,
-0.05956215411424637,
0.19488242268562317,
0.06826712936162949,
0.04803852364420891,
-0.09914720803499222,
0.06858506798744202,
0.04464946314692497,
-0.07150598615407944,
-0.005870631895959377,
0.07404763996601105,
-0.08173765242099762,
-0.053046490997076035,
0.07989339530467987,
0.20063315331935883,
-0.052188098430633545,
-0.05016574636101723,
-0.1473959982395172,
-0.12722855806350708,
0.07782372087240219,
0.14665807783603668,
0.12170588970184326,
0.014889874495565891,
-0.05797991901636124,
0.006879615131765604,
-0.10471697151660919,
0.09580202400684357,
0.04195791110396385,
0.06361298263072968,
-0.1446261703968048,
0.14945848286151886,
0.018656911328434944,
0.04625488072633743,
-0.01877550035715103,
0.026508020237088203,
-0.11395233124494553,
0.006303508300334215,
-0.1044108048081398,
-0.023725476115942,
-0.02777123637497425,
0.009469111450016499,
-0.003293682122603059,
-0.0564289465546608,
-0.05815250054001808,
0.007981112226843834,
-0.10841017216444016,
-0.023606132715940475,
0.03359111398458481,
0.07412499189376831,
-0.11748779565095901,
-0.03310609608888626,
0.03258558735251427,
-0.05866628140211105,
0.06956537067890167,
0.041607704013586044,
0.027655839920043945,
0.062097810208797455,
-0.1473986804485321,
0.019762378185987473,
0.06487227231264114,
0.022647852078080177,
0.06713290512561798,
-0.08713969588279724,
-0.010961699299514294,
-0.014117802493274212,
0.05160350725054741,
0.021655473858118057,
0.07142218202352524,
-0.13985759019851685,
-0.0021922437008470297,
-0.022398816421628,
-0.09011746197938919,
-0.06335126608610153,
0.026643486693501472,
0.09845574200153351,
0.013312122784554958,
0.1954699158668518,
-0.07467038184404373,
0.04432955011725426,
-0.22673293948173523,
0.012374944984912872,
-0.012795829214155674,
-0.10381277650594711,
-0.10733368992805481,
-0.07530756294727325,
0.05926626920700073,
-0.058445077389478683,
0.14853167533874512,
0.03978360444307327,
0.03640449047088623,
0.030186934396624565,
-0.004084479529410601,
0.017567522823810577,
0.014031066559255123,
0.20188163220882416,
0.03866882249712944,
-0.03264259919524193,
0.057398777455091476,
0.049991462379693985,
0.09843739122152328,
0.11804857105016708,
0.20447298884391785,
0.1436595320701599,
-0.00314920162782073,
0.09408754110336304,
0.048237256705760956,
-0.06430841237306595,
-0.14881686866283417,
0.04514099285006523,
-0.04295225813984871,
0.09882565587759018,
-0.02223559468984604,
0.22379912436008453,
0.06590496003627777,
-0.16823959350585938,
0.045339588075876236,
-0.0586080364882946,
-0.08733558654785156,
-0.12066669017076492,
-0.03335215151309967,
-0.07706573605537415,
-0.1345870941877365,
-0.0034736711531877518,
-0.11086759716272354,
-0.004712322261184454,
0.13174545764923096,
0.00435912050306797,
-0.022046202793717384,
0.16329890489578247,
0.01685476116836071,
0.027369504794478416,
0.05427151545882225,
0.011038078926503658,
-0.03601040318608284,
-0.13445979356765747,
-0.0557694211602211,
-0.014484543353319168,
-0.012583866715431213,
0.024787230417132378,
-0.06907813996076584,
-0.06132526695728302,
0.038356371223926544,
-0.014558801427483559,
-0.10189516097307205,
0.008760918863117695,
0.00524946441873908,
0.05576273426413536,
0.0421140231192112,
0.0041798437014222145,
0.0240002628415823,
-0.007524000480771065,
0.20796890556812286,
-0.07884730398654938,
-0.06447882950305939,
-0.10116071254014969,
0.24150483310222626,
0.03024158999323845,
-0.01677129603922367,
0.03052443638443947,
-0.068937286734581,
0.00332126347348094,
0.2530229985713959,
0.2168397456407547,
-0.08563920855522156,
-0.005170372314751148,
0.019505873322486877,
-0.0077157290652394295,
-0.02484976127743721,
0.09807652980089188,
0.13559643924236298,
0.0541095957159996,
-0.09923514723777771,
-0.03597794473171234,
-0.05173882842063904,
-0.019268035888671875,
-0.028210937976837158,
0.06725333631038666,
0.05516672506928444,
0.010451171547174454,
-0.041946202516555786,
0.0554967075586319,
-0.0605655238032341,
-0.08724214881658554,
0.06884598731994629,
-0.21638870239257812,
-0.16818119585514069,
-0.021216057240962982,
0.1115516647696495,
0.007965395227074623,
0.06611339747905731,
-0.02789974957704544,
-0.0024506242480129004,
0.08178551495075226,
-0.018245579674839973,
-0.11011886596679688,
-0.0889228880405426,
0.09163569658994675,
-0.0986751914024353,
0.22029219567775726,
-0.04808669164776802,
0.058025676757097244,
0.12799890339374542,
0.06588960438966751,
-0.07130536437034607,
0.06044834852218628,
0.04130556061863899,
-0.06485708802938461,
0.020893828943371773,
0.06521593034267426,
-0.03691593185067177,
0.06423820555210114,
0.04287641867995262,
-0.1430591344833374,
0.0246424600481987,
-0.05352221056818962,
-0.07084938883781433,
-0.0436973012983799,
-0.025169692933559418,
-0.05865161120891571,
0.13127975165843964,
0.21824824810028076,
-0.026447517797350883,
-0.012017136439681053,
-0.06991426646709442,
0.011169633828103542,
0.060161199420690536,
0.025424111634492874,
-0.06303494423627853,
-0.20723320543766022,
0.020028190687298775,
0.04028359428048134,
-0.0187641941010952,
-0.2560427784919739,
-0.09827766567468643,
0.0031050678808242083,
-0.06987909972667694,
-0.0955338403582573,
0.07019004225730896,
0.09278862178325653,
0.055528778582811356,
-0.05535483732819557,
-0.06456311047077179,
-0.0702747106552124,
0.14895184338092804,
-0.14866593480110168,
-0.10021559149026871
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-wnli
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6854
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6854 | 0.5634 |
| No log | 2.0 | 80 | 0.6983 | 0.3239 |
| No log | 3.0 | 120 | 0.6995 | 0.5352 |
| No log | 4.0 | 160 | 0.6986 | 0.5634 |
| No log | 5.0 | 200 | 0.6996 | 0.5634 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
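The training script itself is not included in the card; the sketch below is an approximate reconstruction of the Trainer setup implied by the hyperparameters above. Only the numeric values (learning rate 2e-05, batch size 16, 5 epochs, seed 42, linear schedule) come from this card; the dataset handling, column names, and output directory are assumptions based on the standard GLUE/WNLI split.

```python
# Approximate reconstruction of the fine-tuning setup; not the original script.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    set_seed,
)

set_seed(42)
raw = load_dataset("glue", "wnli")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # WNLI is a sentence-pair task with columns sentence1/sentence2.
    return tokenizer(batch["sentence1"], batch["sentence2"], truncation=True)

encoded = raw.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-wnli",  # assumed output path
    learning_rate=2e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    lr_scheduler_type="linear",  # Adam betas/epsilon listed above are the optimizer defaults
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```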
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "bert-base-uncased-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5633802816901409, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/bert-base-uncased-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
bert-base-uncased-finetuned-wnli
================================
This model is a fine-tuned version of bert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6854
* Accuracy: 0.5634
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
65,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.11325017362833023,
0.0812758356332779,
-0.0017998769180849195,
0.12009552121162415,
0.16387249529361725,
0.03597983717918396,
0.11156629025936127,
0.12876233458518982,
-0.08335161954164505,
0.02751629427075386,
0.1264062076807022,
0.15765658020973206,
0.019992440938949585,
0.12018559128046036,
-0.04816516116261482,
-0.26051899790763855,
-0.010511714965105057,
0.05137874186038971,
-0.05034004524350166,
0.13128826022148132,
0.08885212242603302,
-0.12818656861782074,
0.0962095707654953,
0.01392721850425005,
-0.19697393476963043,
0.002868467941880226,
0.009638463146984577,
-0.05619091913104057,
0.14565445482730865,
0.028437048196792603,
0.12006663531064987,
-0.0032987322192639112,
0.08477217704057693,
-0.19016307592391968,
0.011393117718398571,
0.04880033805966377,
0.0022858590818941593,
0.09458186477422714,
0.050123583525419235,
0.003134331200271845,
0.1275525540113449,
-0.08717275410890579,
0.05313878133893013,
0.02806706912815571,
-0.11805792152881622,
-0.22305718064308167,
-0.07618046551942825,
0.04159955307841301,
0.0740705281496048,
0.11006677150726318,
-0.005639785435050726,
0.1308141052722931,
-0.08605938404798508,
0.08880318701267242,
0.23550336062908173,
-0.30583614110946655,
-0.06443625688552856,
0.03265710920095444,
0.012285517528653145,
0.03883311152458191,
-0.1104314923286438,
-0.03320043161511421,
0.05513565614819527,
0.04949193447828293,
0.12685690820217133,
-0.03112470917403698,
-0.11342370510101318,
0.012215370312333107,
-0.13411471247673035,
-0.026288814842700958,
0.1539786458015442,
0.04028100147843361,
-0.03150342404842377,
-0.05181613191962242,
-0.05343287065625191,
-0.15177714824676514,
-0.03826058283448219,
-0.0056069716811180115,
0.04760108143091202,
-0.027068566530942917,
-0.05595731362700462,
0.0036838229279965162,
-0.1128939539194107,
-0.06788966804742813,
-0.08033433556556702,
0.12181299179792404,
0.033589623868465424,
0.015907712280750275,
-0.0370270200073719,
0.109775111079216,
-0.012047775089740753,
-0.13138563930988312,
0.014832996763288975,
0.025630218908190727,
0.009522379375994205,
-0.044848375022411346,
-0.052508819848299026,
-0.05302315577864647,
0.01229069847613573,
0.13055101037025452,
-0.05346065014600754,
0.04409528151154518,
0.05234171450138092,
0.04307950660586357,
-0.09279292076826096,
0.19667470455169678,
-0.038260214030742645,
-0.02218792960047722,
0.006767041981220245,
0.03915246203541756,
0.01817500777542591,
-0.01055796816945076,
-0.11901400238275528,
0.0019544779788702726,
0.08567898720502853,
0.006048509385436773,
-0.06584642827510834,
0.07457306236028671,
-0.05257037654519081,
-0.017383383587002754,
0.0006280068191699684,
-0.0888577327132225,
0.030389923602342606,
0.002498867455869913,
-0.07382941991090775,
-0.021265869960188866,
0.0300530344247818,
0.015772582963109016,
-0.019487250596284866,
0.12534859776496887,
-0.09381720423698425,
0.029932312667369843,
-0.09329327195882797,
-0.10815554112195969,
0.022634536027908325,
-0.09775605797767639,
0.028863679617643356,
-0.09573209285736084,
-0.1677841991186142,
-0.013606186956167221,
0.0570659302175045,
-0.028021475300192833,
-0.060382694005966187,
-0.04458434879779816,
-0.06752058118581772,
0.01605340465903282,
-0.011641287244856358,
0.13161610066890717,
-0.06672880053520203,
0.09067602455615997,
0.03318045288324356,
0.0643010064959526,
-0.04610082507133484,
0.0545768141746521,
-0.104961097240448,
0.01095003355294466,
-0.16204912960529327,
0.025268353521823883,
-0.046960558742284775,
0.08063752204179764,
-0.08346693217754364,
-0.09894329309463501,
0.008691877126693726,
-0.0016926821554079652,
0.06715283542871475,
0.09742321074008942,
-0.1710415780544281,
-0.08145228773355484,
0.1614946871995926,
-0.07140012830495834,
-0.1350884735584259,
0.11643988639116287,
-0.0545065701007843,
0.049516938626766205,
0.06354523450136185,
0.16835786402225494,
0.06510867923498154,
-0.09108350425958633,
-0.008360018953680992,
0.02844308875501156,
0.053611110895872116,
-0.08353448659181595,
0.07455205917358398,
0.00392266595736146,
0.012171769514679909,
0.035311244428157806,
-0.022403843700885773,
0.06179405376315117,
-0.09068257361650467,
-0.09588084369897842,
-0.043618425726890564,
-0.08639056235551834,
0.032657064497470856,
0.07789120078086853,
0.07061269134283066,
-0.09810224920511246,
-0.08729390799999237,
0.046419985592365265,
0.07929067313671112,
-0.04685145616531372,
0.02679145336151123,
-0.05538740009069443,
0.07372362166643143,
-0.037138938903808594,
-0.024889251217246056,
-0.17947198450565338,
-0.031591761857271194,
0.0025699553079903126,
0.00027104539913125336,
0.013970430940389633,
0.019852332770824432,
0.06701859831809998,
0.05625145882368088,
-0.052319593727588654,
-0.014577965252101421,
-0.022202685475349426,
-0.0010289466008543968,
-0.13953229784965515,
-0.2043076455593109,
-0.03315692022442818,
-0.022695127874612808,
0.141119122505188,
-0.2048681080341339,
0.040859151631593704,
-0.008060089312493801,
0.07516436278820038,
0.00908584799617529,
-0.003364209784194827,
-0.04505103826522827,
0.07078433781862259,
-0.03831224516034126,
-0.04826338216662407,
0.07603529095649719,
0.01888037845492363,
-0.09175366163253784,
-0.042933713644742966,
-0.0914168655872345,
0.1744447946548462,
0.1380210965871811,
-0.11001050472259521,
-0.07561561465263367,
-0.013212217949330807,
-0.06705914437770844,
-0.03366536274552345,
-0.05237497016787529,
0.0310813020914793,
0.1884153038263321,
-0.004342822823673487,
0.1502731442451477,
-0.06727545708417892,
-0.050321921706199646,
0.024544494226574898,
-0.03312907740473747,
0.02288474328815937,
0.12549304962158203,
0.13732585310935974,
-0.0625513568520546,
0.15156804025173187,
0.14943420886993408,
-0.09030985087156296,
0.13565336167812347,
-0.04143119975924492,
-0.07433126121759415,
-0.015361492522060871,
-0.037960514426231384,
-0.007015106733888388,
0.1112690195441246,
-0.1571711301803589,
-0.00569313345476985,
0.03188199922442436,
0.015982702374458313,
0.024935567751526833,
-0.22176682949066162,
-0.038873136043548584,
0.03447072207927704,
-0.034860335290431976,
-0.020748944953083992,
-0.012351339682936668,
0.0049042352475225925,
0.1071660965681076,
0.009230108000338078,
-0.07983247935771942,
0.035771407186985016,
0.007041286677122116,
-0.08509243279695511,
0.22050073742866516,
-0.0732455626130104,
-0.15531153976917267,
-0.12630519270896912,
-0.07531169056892395,
-0.04058314859867096,
-0.002001648535951972,
0.06999749690294266,
-0.09772171080112457,
-0.03454066812992096,
-0.06494801491498947,
0.026123855262994766,
0.002242924179881811,
0.038012176752090454,
0.0022714107763022184,
0.0032895600888878107,
0.06618290394544601,
-0.10766226053237915,
-0.017943644896149635,
-0.060056332498788834,
-0.04557625949382782,
0.03699665516614914,
0.03504917398095131,
0.11433953791856766,
0.1497831493616104,
-0.014221231453120708,
0.013781113550066948,
-0.03170427307486534,
0.23820388317108154,
-0.06084933876991272,
-0.026581011712551117,
0.13501763343811035,
-0.0076615759171545506,
0.04777873679995537,
0.12092947214841843,
0.07617264240980148,
-0.07880193740129471,
0.002600674517452717,
0.03899788483977318,
-0.03430943936109543,
-0.23151859641075134,
-0.04990599676966667,
-0.05700865015387535,
0.003815891221165657,
0.09155996143817902,
0.028392238542437553,
0.031110122799873352,
0.07152868807315826,
0.0391230545938015,
0.07864564657211304,
-0.05178530141711235,
0.05781975015997887,
0.1154990866780281,
0.03813260793685913,
0.127434641122818,
-0.05280769616365433,
-0.062106065452098846,
0.044395819306373596,
-0.018649987876415253,
0.21916259825229645,
0.003968053963035345,
0.13120047748088837,
0.055209796875715256,
0.1672869324684143,
-0.0032082071993499994,
0.08467075973749161,
-0.010750634595751762,
-0.04991510882973671,
-0.010758607648313046,
-0.03968540206551552,
-0.03318402171134949,
0.02596471644937992,
-0.07350622117519379,
0.07018791139125824,
-0.12936514616012573,
0.008289091289043427,
0.059413567185401917,
0.24862781167030334,
0.04488257318735123,
-0.32264694571495056,
-0.0988081842660904,
0.0020014536567032337,
-0.02992011047899723,
-0.02737218514084816,
0.025928953662514687,
0.08604683727025986,
-0.09423001855611801,
0.031115587800741196,
-0.06771036982536316,
0.10112819820642471,
-0.04301420971751213,
0.05122147873044014,
0.08861610293388367,
0.09055721759796143,
0.0049886442720890045,
0.09118586778640747,
-0.2906533181667328,
0.2798357307910919,
0.006987671833485365,
0.06918463855981827,
-0.08365077525377274,
0.006707466207444668,
0.03907150775194168,
0.06553469598293304,
0.08288156241178513,
-0.014953190460801125,
-0.039788488298654556,
-0.19358643889427185,
-0.06484533101320267,
0.0339672788977623,
0.06623050570487976,
-0.03288320079445839,
0.08651845902204514,
-0.031395073980093,
0.007457742467522621,
0.07362592965364456,
0.008733662776648998,
-0.04875722900032997,
-0.10116855055093765,
-0.010509002022445202,
0.02958148531615734,
-0.06401415169239044,
-0.06195148453116417,
-0.1207931861281395,
-0.12192783504724503,
0.1639934778213501,
-0.03318662568926811,
-0.03521491587162018,
-0.11417360603809357,
0.08966812491416931,
0.06180043891072273,
-0.0941837728023529,
0.038484714925289154,
-0.0009707536664791405,
0.07954935729503632,
0.02598000504076481,
-0.07708893716335297,
0.11154722422361374,
-0.07479967921972275,
-0.15117181837558746,
-0.06584189832210541,
0.10492885112762451,
0.027630958706140518,
0.06822939962148666,
-0.013515608385205269,
0.012359414249658585,
-0.05078385770320892,
-0.09151500463485718,
0.017749764025211334,
-0.009329475462436676,
0.07841002941131592,
0.0031738318502902985,
-0.06693345308303833,
0.012862754054367542,
-0.054870735853910446,
-0.03405337408185005,
0.20014168322086334,
0.21855491399765015,
-0.10459254682064056,
0.019309353083372116,
0.025887245312333107,
-0.07127265632152557,
-0.20469018816947937,
0.03517827019095421,
0.04953690990805626,
0.012814527377486229,
0.032210033386945724,
-0.17098402976989746,
0.1580735146999359,
0.10622065514326096,
-0.015065732412040234,
0.10073093324899673,
-0.299551397562027,
-0.12669068574905396,
0.14076679944992065,
0.13006934523582458,
0.12501086294651031,
-0.13742731511592865,
-0.021279647946357727,
-0.025343095883727074,
-0.14574852585792542,
0.10519156605005264,
-0.10746797174215317,
0.11452843248844147,
-0.03742414340376854,
0.07620010524988174,
0.002816242864355445,
-0.06205955520272255,
0.11955329775810242,
0.026662414893507957,
0.09018383920192719,
-0.06074850261211395,
-0.042297229170799255,
0.0341629758477211,
-0.04408671706914902,
0.03661029040813446,
-0.10185978561639786,
0.026655469089746475,
-0.10676504671573639,
-0.026511017233133316,
-0.0700063407421112,
0.04556037113070488,
-0.04346230998635292,
-0.06344757974147797,
-0.03487858176231384,
0.022959129884839058,
0.04026412591338158,
-0.0137229198589921,
0.13840116560459137,
0.021211916580796242,
0.15411525964736938,
0.09435506165027618,
0.07961063086986542,
-0.08547985553741455,
-0.0797184407711029,
-0.015108020976185799,
-0.01672511175274849,
0.05510300397872925,
-0.14676183462142944,
0.02361373044550419,
0.1522795557975769,
0.023286614567041397,
0.13679563999176025,
0.0861886516213417,
-0.021892758086323738,
-0.0019361572340130806,
0.0652974396944046,
-0.1634119302034378,
-0.08178110420703888,
-0.014039850793778896,
-0.061166077852249146,
-0.13189998269081116,
0.048213761299848557,
0.0864153578877449,
-0.06775419414043427,
-0.008551523089408875,
-0.00740694859996438,
0.006305362097918987,
-0.05940462648868561,
0.18928800523281097,
0.06243189051747322,
0.04615224152803421,
-0.10399850457906723,
0.06477366387844086,
0.04499476030468941,
-0.07792001962661743,
0.003070139791816473,
0.07963893562555313,
-0.08338174968957901,
-0.052400678396224976,
0.08749459683895111,
0.19778034090995789,
-0.05147331953048706,
-0.05141191557049751,
-0.1413978934288025,
-0.13067592680454254,
0.08519208431243896,
0.1525023728609085,
0.12004642188549042,
0.013027970679104328,
-0.059264328330755234,
0.003653865307569504,
-0.11338938027620316,
0.09490028768777847,
0.04414375126361847,
0.06387224793434143,
-0.14424721896648407,
0.1478741616010666,
0.014303641393780708,
0.05093104764819145,
-0.020217612385749817,
0.03035593591630459,
-0.11080899834632874,
0.00768400589004159,
-0.10925600677728653,
-0.014188872650265694,
-0.03684592247009277,
0.0077499947510659695,
-0.0033408727031201124,
-0.054110124707221985,
-0.06255713850259781,
0.011927232146263123,
-0.1080976277589798,
-0.021136194467544556,
0.030454004183411598,
0.06746035814285278,
-0.1149725690484047,
-0.03279729187488556,
0.027025137096643448,
-0.06026068702340126,
0.06934566795825958,
0.04836318641901016,
0.024385672062635422,
0.058933038264513016,
-0.13959120213985443,
0.01653558574616909,
0.0693189948797226,
0.020435437560081482,
0.0718948245048523,
-0.09006860852241516,
-0.006668671499937773,
-0.002739422954618931,
0.04672892019152641,
0.02170906402170658,
0.07274547964334488,
-0.14131338894367218,
-0.0019496014574542642,
-0.01645248383283615,
-0.08669720590114594,
-0.06424178183078766,
0.0252529326826334,
0.09883679449558258,
0.010830315761268139,
0.19647841155529022,
-0.07557108253240585,
0.04333958774805069,
-0.22063951194286346,
0.011873209848999977,
-0.01501546148210764,
-0.10546151548624039,
-0.10812777280807495,
-0.0694873109459877,
0.06165764853358269,
-0.05573686584830284,
0.15247857570648193,
0.043157633394002914,
0.03789886459708214,
0.03379714488983154,
-0.005651878193020821,
0.018328923732042313,
0.015883320942521095,
0.20190127193927765,
0.03162865713238716,
-0.0364995002746582,
0.06019076332449913,
0.04753594845533371,
0.1018684133887291,
0.125466987490654,
0.2096361517906189,
0.14029313623905182,
0.009925403632223606,
0.09996341168880463,
0.04067710041999817,
-0.0595560185611248,
-0.15866446495056152,
0.0363604910671711,
-0.04750402644276619,
0.10207531601190567,
-0.02192346751689911,
0.21357475221157074,
0.07121794670820236,
-0.1699521839618683,
0.04399692267179489,
-0.06269006431102753,
-0.08588816970586777,
-0.12070256471633911,
-0.04586905241012573,
-0.08106168359518051,
-0.1309642642736435,
0.0001321805320912972,
-0.11273068934679031,
-0.0012162922648712993,
0.12198466807603836,
0.003683202899992466,
-0.023600250482559204,
0.16350850462913513,
0.011624647304415703,
0.029433852061629295,
0.0566394217312336,
0.011841543018817902,
-0.033431462943553925,
-0.12000541388988495,
-0.050130732357501984,
-0.01859196648001671,
-0.01961885765194893,
0.028178444132208824,
-0.06570537388324738,
-0.05181530490517616,
0.039616916328668594,
-0.01562797836959362,
-0.09667304158210754,
0.007670079357922077,
0.011419818736612797,
0.06166885048151016,
0.047199420630931854,
0.005478383507579565,
0.024718012660741806,
-0.00981750525534153,
0.2050236016511917,
-0.08058343827724457,
-0.06479468941688538,
-0.10787871479988098,
0.24570158123970032,
0.036584436893463135,
-0.021753570064902306,
0.029966039583086967,
-0.06766568124294281,
0.001812550937756896,
0.2563922703266144,
0.21230579912662506,
-0.07576889544725418,
-0.005493332166224718,
0.014079378917813301,
-0.007522681262344122,
-0.02169375866651535,
0.09983312338590622,
0.1424679160118103,
0.06358201801776886,
-0.10010802745819092,
-0.043535325676202774,
-0.05389847978949547,
-0.01864388771355152,
-0.034852199256420135,
0.07309868186712265,
0.049223218113183975,
0.006786705926060677,
-0.03663778677582741,
0.05149189755320549,
-0.06335853785276413,
-0.08562761545181274,
0.06320024281740189,
-0.2137106955051422,
-0.16493570804595947,
-0.013268045149743557,
0.10432645678520203,
0.009276872500777245,
0.06678687036037445,
-0.02361975610256195,
-0.0030751030426472425,
0.08530234545469284,
-0.019469894468784332,
-0.10770940780639648,
-0.08061136305332184,
0.09170249849557877,
-0.10910338163375854,
0.22337853908538818,
-0.0447578951716423,
0.05735598877072334,
0.12990860641002655,
0.0712352991104126,
-0.07847760617733002,
0.05848822370171547,
0.03836846724152565,
-0.06419610232114792,
0.026543453335762024,
0.0695822536945343,
-0.038894206285476685,
0.05666741356253624,
0.04231509193778038,
-0.14395594596862793,
0.022577930241823196,
-0.06142842397093773,
-0.06316056102514267,
-0.04586682468652725,
-0.023666471242904663,
-0.059554725885391235,
0.13229267299175262,
0.21798373758792877,
-0.02747350186109543,
-0.012410398572683334,
-0.07058558613061905,
0.01229069009423256,
0.05498321354389191,
0.021564923226833344,
-0.06255114823579788,
-0.2111641764640808,
0.022540688514709473,
0.046501778066158295,
-0.020975574851036072,
-0.2504262924194336,
-0.097313292324543,
0.002028163056820631,
-0.07102660834789276,
-0.0982884019613266,
0.06894835084676743,
0.09115249663591385,
0.05199338123202324,
-0.05813044682145119,
-0.05868222564458847,
-0.06910606473684311,
0.1469157636165619,
-0.1437578797340393,
-0.0993814542889595
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8623
- Matthews Correlation: 0.5224
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5278 | 1.0 | 535 | 0.5223 | 0.4007 |
| 0.3515 | 2.0 | 1070 | 0.5150 | 0.4993 |
| 0.2391 | 3.0 | 1605 | 0.6471 | 0.5103 |
| 0.1841 | 4.0 | 2140 | 0.7640 | 0.5153 |
| 0.1312 | 5.0 | 2675 | 0.8623 | 0.5224 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
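The Matthews correlation reported above could, under some assumptions, be recomputed with a short evaluation loop. The model id is taken from this card; the unbatched loop and the scikit-learn metric implementation are illustrative choices rather than the original evaluation code.

```python
# Hedged sketch: re-evaluate the checkpoint on the GLUE/CoLA validation split.
import torch
from datasets import load_dataset
from sklearn.metrics import matthews_corrcoef
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/distilbert-base-uncased-finetuned-cola"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

validation = load_dataset("glue", "cola", split="validation")

predictions = []
for example in validation:  # unbatched for clarity; a batched loop would be faster
    inputs = tokenizer(example["sentence"], return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    predictions.append(int(logits.argmax(dim=-1)))

print("Matthews correlation:", matthews_corrcoef(validation["label"], predictions))
```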
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["matthews_correlation"], "model-index": [{"name": "distilbert-base-uncased-finetuned-cola", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "cola"}, "metrics": [{"type": "matthews_correlation", "value": 0.5224154837835395, "name": "Matthews Correlation"}]}]}]}
|
text-classification
|
anirudh21/distilbert-base-uncased-finetuned-cola
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-cola
======================================
This model is a fine-tuned version of distilbert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8623
* Matthews Correlation: 0.5224
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10177629441022873,
0.09868992865085602,
-0.002423677360638976,
0.12112317979335785,
0.1650812178850174,
0.03426579013466835,
0.1299346536397934,
0.12770213186740875,
-0.08564593642950058,
0.021304525434970856,
0.1207699179649353,
0.16118186712265015,
0.023844653740525246,
0.10977903753519058,
-0.04856811463832855,
-0.26443108916282654,
-0.014250527136027813,
0.05101621896028519,
-0.05502324551343918,
0.13528308272361755,
0.09085173904895782,
-0.12123988568782806,
0.09099695831537247,
0.010560519993305206,
-0.19217939674854279,
0.0012736907228827477,
-0.00007869464025134221,
-0.05158104747533798,
0.1480625867843628,
0.026134053245186806,
0.1222379058599472,
0.004350635223090649,
0.08203282207250595,
-0.20211967825889587,
0.0109197236597538,
0.047976233065128326,
0.0034287353046238422,
0.09363500773906708,
0.04515067860484123,
0.002628615591675043,
0.12063173204660416,
-0.0803351029753685,
0.05412057787179947,
0.025187760591506958,
-0.11920642107725143,
-0.2130812108516693,
-0.07939155399799347,
0.036630984395742416,
0.07490450143814087,
0.10550613701343536,
-0.007987946271896362,
0.12026778608560562,
-0.08140391111373901,
0.09315772354602814,
0.22560088336467743,
-0.2846624255180359,
-0.06620592623949051,
0.04440826550126076,
0.014406891539692879,
0.04722030460834503,
-0.10258147865533829,
-0.03414628654718399,
0.04851415753364563,
0.0509343147277832,
0.1279968023300171,
-0.027470313012599945,
-0.11697384715080261,
0.006325297988951206,
-0.14067994058132172,
-0.031778451055288315,
0.16862210631370544,
0.04232211038470268,
-0.027326999232172966,
-0.05612191930413246,
-0.05705082044005394,
-0.1505020707845688,
-0.035502392798662186,
-0.01559263002127409,
0.049344535917043686,
-0.023357374593615532,
-0.04166480526328087,
-0.009466851130127907,
-0.10793615877628326,
-0.06508545577526093,
-0.07387420535087585,
0.11076346784830093,
0.03758743405342102,
0.006860504392534494,
-0.029825663194060326,
0.11246810108423233,
-0.007811444811522961,
-0.12158264964818954,
0.025255942717194557,
0.022893251851201057,
0.01350346952676773,
-0.03933148831129074,
-0.05274882912635803,
-0.06423640251159668,
0.01213061437010765,
0.12725643813610077,
-0.05570182576775551,
0.04239019751548767,
0.05044251307845116,
0.04930534213781357,
-0.09428779035806656,
0.19049058854579926,
-0.034276507794857025,
-0.025200827047228813,
0.0005440381937660277,
0.05053231865167618,
0.017217280343174934,
-0.011495225131511688,
-0.12165343761444092,
0.004494988825172186,
0.08820939809083939,
0.007856707088649273,
-0.06190384179353714,
0.07444976270198822,
-0.059118129312992096,
-0.024367066100239754,
0.0022545091342180967,
-0.09038243442773819,
0.021844087168574333,
0.0009826826862990856,
-0.07183235138654709,
-0.020876631140708923,
0.036078646779060364,
0.015619035810232162,
-0.01955125480890274,
0.1076565608382225,
-0.08774691820144653,
0.02745640277862549,
-0.09359776973724365,
-0.10990453511476517,
0.016444405540823936,
-0.10769655555486679,
0.022347012534737587,
-0.09183300286531448,
-0.179530531167984,
-0.017432404682040215,
0.05996386706829071,
-0.024156158789992332,
-0.057796917855739594,
-0.058523666113615036,
-0.06686114519834518,
0.011313402093946934,
-0.00775144575163722,
0.11726417392492294,
-0.06450776755809784,
0.09395397454500198,
0.025765664875507355,
0.06269765645265579,
-0.042549166828393936,
0.05981731042265892,
-0.10193105041980743,
0.013144214637577534,
-0.15104839205741882,
0.0400441437959671,
-0.05145822837948799,
0.06935621798038483,
-0.08194040507078171,
-0.10615845024585724,
0.002663051476702094,
-0.0028380511794239283,
0.06268789619207382,
0.09636878967285156,
-0.18458586931228638,
-0.08252619951963425,
0.1638277769088745,
-0.07281412184238434,
-0.12196049094200134,
0.12136763334274292,
-0.0580064132809639,
0.05861146003007889,
0.05951378867030144,
0.17990033328533173,
0.08656419813632965,
-0.07854954153299332,
0.0018196254968643188,
0.023756947368383408,
0.05026758089661598,
-0.06338667124509811,
0.06808595359325409,
0.0018259093631058931,
0.01989728771150112,
0.03576963394880295,
-0.027762150391936302,
0.0641099065542221,
-0.08784174174070358,
-0.09886786341667175,
-0.0402458980679512,
-0.08253508061170578,
0.0465523786842823,
0.07975026965141296,
0.06739030033349991,
-0.09414571523666382,
-0.07797343283891678,
0.05024334043264389,
0.08206507563591003,
-0.058022212237119675,
0.024602338671684265,
-0.049497149884700775,
0.07355623692274094,
-0.023844702169299126,
-0.021642537787556648,
-0.17968407273292542,
-0.032223403453826904,
0.007671972271054983,
0.0017039760714396834,
0.018056152388453484,
0.03337876871228218,
0.06358686834573746,
0.06016344204545021,
-0.05022745952010155,
-0.019244832918047905,
-0.0352572537958622,
-0.0006887496565468609,
-0.12579292058944702,
-0.19718441367149353,
-0.028450479730963707,
-0.02205497771501541,
0.16075244545936584,
-0.2082725465297699,
0.05154874175786972,
-0.014233555644750595,
0.06959009915590286,
0.012252251617610455,
-0.0065546054393053055,
-0.037181925028562546,
0.07644595205783844,
-0.04241294413805008,
-0.05049571767449379,
0.08215155452489853,
0.01450799684971571,
-0.09128501266241074,
-0.050215501338243484,
-0.09774215519428253,
0.1582167148590088,
0.12925271689891815,
-0.11003289371728897,
-0.07731720805168152,
-0.023380616679787636,
-0.0669984295964241,
-0.034903690218925476,
-0.04613172262907028,
0.026549331843852997,
0.1879379004240036,
-0.0049881828017532825,
0.14970116317272186,
-0.06918737292289734,
-0.043393924832344055,
0.018244462087750435,
-0.03694281727075577,
0.01636327989399433,
0.13443101942539215,
0.13418081402778625,
-0.06011265516281128,
0.15530750155448914,
0.14804664254188538,
-0.08511312305927277,
0.1510668396949768,
-0.04195278137922287,
-0.06577235460281372,
-0.01610550656914711,
-0.029684927314519882,
-0.011206655763089657,
0.10058020800352097,
-0.15690045058727264,
-0.002004367997869849,
0.030701281502842903,
0.015433641150593758,
0.02562039904296398,
-0.22722068428993225,
-0.04094555974006653,
0.03781639412045479,
-0.044886574149131775,
-0.006275756284594536,
-0.00595852779224515,
0.005161743611097336,
0.10115070641040802,
-0.0004888595431111753,
-0.08683908730745316,
0.03661135584115982,
0.0027518956921994686,
-0.08374463021755219,
0.2156449258327484,
-0.081173375248909,
-0.17226016521453857,
-0.13096117973327637,
-0.07049524784088135,
-0.047677770256996155,
-0.0015197350876405835,
0.06859971582889557,
-0.09709104150533676,
-0.02635144256055355,
-0.07209304720163345,
0.025860309600830078,
0.007600440643727779,
0.022702498361468315,
0.0029937936924397945,
0.007451718207448721,
0.06447025388479233,
-0.11222215741872787,
-0.015035846270620823,
-0.058496929705142975,
-0.04542740061879158,
0.044325705617666245,
0.02816983126103878,
0.11031338572502136,
0.15270480513572693,
-0.013083916157484055,
0.012781093828380108,
-0.03162777051329613,
0.23638051748275757,
-0.06001908704638481,
-0.0208893995732069,
0.1460668295621872,
-0.007206457667052746,
0.051826294511556625,
0.11371733248233795,
0.07455668598413467,
-0.07760798186063766,
0.003885192796587944,
0.037943821400403976,
-0.034547463059425354,
-0.23284892737865448,
-0.053616248071193695,
-0.05520634725689888,
0.011296672746539116,
0.089586041867733,
0.02373400144279003,
0.030841536819934845,
0.07044725120067596,
0.04116436094045639,
0.07381468266248703,
-0.037414710968732834,
0.05087532848119736,
0.1295747458934784,
0.030546288937330246,
0.12477082759141922,
-0.04690399020910263,
-0.06424792855978012,
0.040794432163238525,
-0.010863419622182846,
0.22334444522857666,
0.009917938150465488,
0.13131633400917053,
0.06607729941606522,
0.1649666577577591,
-0.009735438972711563,
0.07535355538129807,
-0.010470135137438774,
-0.03799179568886757,
-0.0162015613168478,
-0.03991588577628136,
-0.04029099643230438,
0.023612579330801964,
-0.06360199302434921,
0.06474608182907104,
-0.12359699606895447,
0.014717328362166882,
0.05894068628549576,
0.24849559366703033,
0.03377196192741394,
-0.3200716972351074,
-0.09709673374891281,
0.0007384793716482818,
-0.029719963669776917,
-0.019953938201069832,
0.026196908205747604,
0.09424502402544022,
-0.09753169119358063,
0.029457727447152138,
-0.07466296851634979,
0.0962631106376648,
-0.055720455944538116,
0.05116957798600197,
0.08169417083263397,
0.09054762125015259,
0.011883794330060482,
0.09314082562923431,
-0.2884000837802887,
0.27650073170661926,
0.0002981654542963952,
0.056524623185396194,
-0.07594560086727142,
0.008374286815524101,
0.041159119457006454,
0.06563553959131241,
0.07949315011501312,
-0.01224528532475233,
-0.01759219914674759,
-0.18658626079559326,
-0.06754712015390396,
0.027284175157546997,
0.06876907497644424,
-0.04146112501621246,
0.08209217339754105,
-0.031855158507823944,
0.008920346386730671,
0.07337794452905655,
0.0023068361915647984,
-0.053541041910648346,
-0.10786380618810654,
-0.005141935311257839,
0.022722337394952774,
-0.06008746474981308,
-0.06107710674405098,
-0.12087935954332352,
-0.12990501523017883,
0.155729740858078,
-0.035367779433727264,
-0.038886189460754395,
-0.106304831802845,
0.08325839787721634,
0.05899275466799736,
-0.08922765403985977,
0.0430733785033226,
0.0019881264306604862,
0.07539381086826324,
0.02154691517353058,
-0.07059939950704575,
0.10247620195150375,
-0.07363677769899368,
-0.1557348668575287,
-0.0653294250369072,
0.10667534172534943,
0.033229321241378784,
0.06670800596475601,
-0.014250626787543297,
0.00431025680154562,
-0.046546995639801025,
-0.08820211887359619,
0.020896978676319122,
0.004755881614983082,
0.07743469625711441,
0.018997633829712868,
-0.07543632388114929,
0.01201551128178835,
-0.06481624394655228,
-0.034086693078279495,
0.20510898530483246,
0.2221778780221939,
-0.09982394427061081,
0.024086005985736847,
0.025836989283561707,
-0.0738971009850502,
-0.19793148338794708,
0.03522804379463196,
0.05483577400445938,
0.008683490566909313,
0.04294142872095108,
-0.18465115129947662,
0.13146370649337769,
0.10747389495372772,
-0.011773521080613136,
0.10583452880382538,
-0.324044793844223,
-0.12059681862592697,
0.13578835129737854,
0.13629050552845,
0.09854068607091904,
-0.1321391612291336,
-0.02271113730967045,
-0.018754245713353157,
-0.1379910707473755,
0.11446153372526169,
-0.09142790734767914,
0.120912104845047,
-0.037892017513513565,
0.07596635818481445,
0.0028597572818398476,
-0.0584384985268116,
0.12073198705911636,
0.023140931501984596,
0.0945015400648117,
-0.058765899389982224,
-0.033337973058223724,
0.03047853522002697,
-0.042802438139915466,
0.03446131944656372,
-0.09962396323680878,
0.029662422835826874,
-0.10281215608119965,
-0.0251001063734293,
-0.06927776336669922,
0.04619433730840683,
-0.04536258056759834,
-0.06819522380828857,
-0.037817176431417465,
0.025476092472672462,
0.04637615382671356,
-0.007411271333694458,
0.12242395430803299,
0.02384062111377716,
0.1488822102546692,
0.09686450660228729,
0.07455138862133026,
-0.06877368688583374,
-0.08208677172660828,
-0.026989970356225967,
-0.01078053005039692,
0.050466980785131454,
-0.1369129866361618,
0.019369233399629593,
0.15191002190113068,
0.020388750359416008,
0.15307851135730743,
0.08298779278993607,
-0.021846970543265343,
-0.00145810900721699,
0.059030331671237946,
-0.16558411717414856,
-0.09374777227640152,
-0.017512774094939232,
-0.06781873852014542,
-0.12064754962921143,
0.04518076777458191,
0.09283262491226196,
-0.06830855458974838,
-0.006686370354145765,
-0.004924751818180084,
0.013755558989942074,
-0.05032142624258995,
0.18429741263389587,
0.06282955408096313,
0.047867823392152786,
-0.096428282558918,
0.07266043871641159,
0.0449543371796608,
-0.07330744713544846,
0.0033405697904527187,
0.07132815569639206,
-0.08534601330757141,
-0.05450327321887016,
0.06432835012674332,
0.1912047564983368,
-0.043927162885665894,
-0.04855562746524811,
-0.1453658491373062,
-0.12287921458482742,
0.07764768600463867,
0.1408335417509079,
0.11843205243349075,
0.01058033388108015,
-0.06616745889186859,
0.0029106447473168373,
-0.10731884837150574,
0.10160065442323685,
0.045956265181303024,
0.06211207062005997,
-0.14301945269107819,
0.14211498200893402,
0.020740197971463203,
0.04820193350315094,
-0.01806846633553505,
0.023208048194646835,
-0.10020429641008377,
0.007697694003582001,
-0.09298595041036606,
-0.019537312909960747,
-0.029065001755952835,
0.011588165536522865,
-0.005960927344858646,
-0.04729988053441048,
-0.0542338490486145,
0.010628738440573215,
-0.10766087472438812,
-0.023693302646279335,
0.030114110559225082,
0.07296796143054962,
-0.10916557163000107,
-0.035426318645477295,
0.030875829979777336,
-0.06108306720852852,
0.07371184974908829,
0.04329424351453781,
0.015606247819960117,
0.050780802965164185,
-0.13902859389781952,
0.02026754431426525,
0.07313340902328491,
0.029131997376680374,
0.06079116463661194,
-0.09932733327150345,
-0.007924248464405537,
-0.00831547100096941,
0.039642333984375,
0.02172110602259636,
0.07442112267017365,
-0.1410936713218689,
0.0035281842574477196,
-0.02308511547744274,
-0.08286073803901672,
-0.06700614094734192,
0.028149420395493507,
0.08893175423145294,
0.018416542559862137,
0.19928090274333954,
-0.07619873434305191,
0.049508191645145416,
-0.21921874582767487,
0.007203821558505297,
-0.006482485681772232,
-0.11031077802181244,
-0.10131534934043884,
-0.07205703109502792,
0.05513612926006317,
-0.060717787593603134,
0.1499030590057373,
0.04670583829283714,
0.0190992783755064,
0.02442006766796112,
-0.011104658246040344,
0.0123064573854208,
0.009836219251155853,
0.18994872272014618,
0.030648769810795784,
-0.03457606956362724,
0.05985496938228607,
0.044767893850803375,
0.10333859920501709,
0.11458932608366013,
0.2000289261341095,
0.14501407742500305,
-0.009634922258555889,
0.09320678561925888,
0.043747033923864365,
-0.055918898433446884,
-0.15551309287548065,
0.05201669782400131,
-0.0348605252802372,
0.10937416553497314,
-0.02125314436852932,
0.22091102600097656,
0.06478510051965714,
-0.1696740686893463,
0.051610227674245834,
-0.05148177221417427,
-0.08721215277910233,
-0.11527423560619354,
-0.049710892140865326,
-0.07701697945594788,
-0.13180910050868988,
-0.003880183445289731,
-0.11571096628904343,
-0.0028426540084183216,
0.12518630921840668,
0.003630818100646138,
-0.027395818382501602,
0.15849998593330383,
0.014401617459952831,
0.022212907671928406,
0.06015627831220627,
0.008367235772311687,
-0.03863980621099472,
-0.14036662876605988,
-0.059315942227840424,
-0.012330079451203346,
-0.008793477900326252,
0.03116128407418728,
-0.06153199076652527,
-0.04473326727747917,
0.03117159940302372,
-0.02042582258582115,
-0.09601709246635437,
0.006006556563079357,
0.011131567880511284,
0.0533316507935524,
0.044897403568029404,
0.009516828693449497,
0.018703876063227654,
-0.0037758296821266413,
0.20052506029605865,
-0.0717770978808403,
-0.06498154252767563,
-0.10246375948190689,
0.23358504474163055,
0.0361536405980587,
-0.018478771671652794,
0.03453867882490158,
-0.06657195836305618,
0.004409546032547951,
0.24914872646331787,
0.21641024947166443,
-0.07975707203149796,
-0.0060849878937006,
0.016893045976758003,
-0.007865218445658684,
-0.021979933604598045,
0.09790797531604767,
0.14306801557540894,
0.04558771848678589,
-0.09230106323957443,
-0.04356169328093529,
-0.058740004897117615,
-0.0174243226647377,
-0.03374728187918663,
0.0693250447511673,
0.051049165427684784,
0.009482786059379578,
-0.03563119098544121,
0.0567263662815094,
-0.06666052341461182,
-0.09042596817016602,
0.05730389058589935,
-0.21876642107963562,
-0.16741296648979187,
-0.01652800291776657,
0.1028960645198822,
0.0017809192650020123,
0.061948828399181366,
-0.029001614078879356,
-0.003035155590623617,
0.09021490812301636,
-0.019437670707702637,
-0.09757450222969055,
-0.07405272871255875,
0.0848613753914833,
-0.11156534403562546,
0.21836979687213898,
-0.047714509069919586,
0.054364051669836044,
0.1254379153251648,
0.06760457158088684,
-0.0644994005560875,
0.06536801904439926,
0.04187343269586563,
-0.04156811535358429,
0.023007867857813835,
0.06869390606880188,
-0.03251289203763008,
0.06446241587400436,
0.047979000955820084,
-0.13706746697425842,
0.02370663918554783,
-0.04812363535165787,
-0.06919568032026291,
-0.043872371315956116,
-0.020693952217698097,
-0.06029629707336426,
0.12902742624282837,
0.2190595120191574,
-0.024821428582072258,
-0.00955967791378498,
-0.07230399549007416,
0.00883461069315672,
0.05615578591823578,
0.021983714774250984,
-0.057219840586185455,
-0.21056394279003143,
0.016822319477796555,
0.04565083608031273,
-0.01846429333090782,
-0.25154390931129456,
-0.10084811598062515,
0.004124476574361324,
-0.07295944541692734,
-0.0947342962026596,
0.07150569558143616,
0.08810579776763916,
0.054769519716501236,
-0.05578319728374481,
-0.04721960425376892,
-0.07467550039291382,
0.14914295077323914,
-0.1454761028289795,
-0.09100616723299026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-mrpc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3830
- Accuracy: 0.8456
- F1: 0.8959
## Model description
More information needed
## Intended uses & limitations
More information needed
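Usage details are not documented here, but since this checkpoint is a sentence-pair classifier fine-tuned on GLUE MRPC, a minimal inference sketch could look like the following. The example sentences are invented, and the label interpretation is an assumption: the card does not define an `id2label` mapping, so outputs surface as `LABEL_0`/`LABEL_1`.
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/distilbert-base-uncased-finetuned-mrpc"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MRPC is a sentence-pair task, so both sentences are encoded together.
sentence1 = "The company reported strong quarterly earnings."
sentence2 = "Quarterly earnings at the company were strong."
inputs = tokenizer(sentence1, sentence2, return_tensors="pt", truncation=True)

with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)

# For GLUE MRPC, index 1 conventionally corresponds to "equivalent" (paraphrase).
print(probs)
```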
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
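The configuration above maps onto the `Trainer` API roughly as sketched below; `output_dir` and the per-epoch evaluation strategy are assumptions inferred from the results table, not values stated on the card.
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-mrpc",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # inferred from the per-epoch validation rows below
)
```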
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 230 | 0.3826 | 0.8186 | 0.8683 |
| No log | 2.0 | 460 | 0.3830 | 0.8456 | 0.8959 |
| 0.4408 | 3.0 | 690 | 0.3835 | 0.8382 | 0.8866 |
| 0.4408 | 4.0 | 920 | 0.5036 | 0.8431 | 0.8919 |
| 0.1941 | 5.0 | 1150 | 0.5783 | 0.8431 | 0.8930 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy", "f1"], "model-index": [{"name": "distilbert-base-uncased-finetuned-mrpc", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "mrpc"}, "metrics": [{"type": "accuracy", "value": 0.8455882352941176, "name": "Accuracy"}, {"type": "f1", "value": 0.8958677685950412, "name": "F1"}]}]}]}
|
text-classification
|
anirudh21/distilbert-base-uncased-finetuned-mrpc
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-mrpc
======================================
This model is a fine-tuned version of distilbert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.3830
* Accuracy: 0.8456
* F1: 0.8959
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10177629441022873,
0.09868992865085602,
-0.002423677360638976,
0.12112317979335785,
0.1650812178850174,
0.03426579013466835,
0.1299346536397934,
0.12770213186740875,
-0.08564593642950058,
0.021304525434970856,
0.1207699179649353,
0.16118186712265015,
0.023844653740525246,
0.10977903753519058,
-0.04856811463832855,
-0.26443108916282654,
-0.014250527136027813,
0.05101621896028519,
-0.05502324551343918,
0.13528308272361755,
0.09085173904895782,
-0.12123988568782806,
0.09099695831537247,
0.010560519993305206,
-0.19217939674854279,
0.0012736907228827477,
-0.00007869464025134221,
-0.05158104747533798,
0.1480625867843628,
0.026134053245186806,
0.1222379058599472,
0.004350635223090649,
0.08203282207250595,
-0.20211967825889587,
0.0109197236597538,
0.047976233065128326,
0.0034287353046238422,
0.09363500773906708,
0.04515067860484123,
0.002628615591675043,
0.12063173204660416,
-0.0803351029753685,
0.05412057787179947,
0.025187760591506958,
-0.11920642107725143,
-0.2130812108516693,
-0.07939155399799347,
0.036630984395742416,
0.07490450143814087,
0.10550613701343536,
-0.007987946271896362,
0.12026778608560562,
-0.08140391111373901,
0.09315772354602814,
0.22560088336467743,
-0.2846624255180359,
-0.06620592623949051,
0.04440826550126076,
0.014406891539692879,
0.04722030460834503,
-0.10258147865533829,
-0.03414628654718399,
0.04851415753364563,
0.0509343147277832,
0.1279968023300171,
-0.027470313012599945,
-0.11697384715080261,
0.006325297988951206,
-0.14067994058132172,
-0.031778451055288315,
0.16862210631370544,
0.04232211038470268,
-0.027326999232172966,
-0.05612191930413246,
-0.05705082044005394,
-0.1505020707845688,
-0.035502392798662186,
-0.01559263002127409,
0.049344535917043686,
-0.023357374593615532,
-0.04166480526328087,
-0.009466851130127907,
-0.10793615877628326,
-0.06508545577526093,
-0.07387420535087585,
0.11076346784830093,
0.03758743405342102,
0.006860504392534494,
-0.029825663194060326,
0.11246810108423233,
-0.007811444811522961,
-0.12158264964818954,
0.025255942717194557,
0.022893251851201057,
0.01350346952676773,
-0.03933148831129074,
-0.05274882912635803,
-0.06423640251159668,
0.01213061437010765,
0.12725643813610077,
-0.05570182576775551,
0.04239019751548767,
0.05044251307845116,
0.04930534213781357,
-0.09428779035806656,
0.19049058854579926,
-0.034276507794857025,
-0.025200827047228813,
0.0005440381937660277,
0.05053231865167618,
0.017217280343174934,
-0.011495225131511688,
-0.12165343761444092,
0.004494988825172186,
0.08820939809083939,
0.007856707088649273,
-0.06190384179353714,
0.07444976270198822,
-0.059118129312992096,
-0.024367066100239754,
0.0022545091342180967,
-0.09038243442773819,
0.021844087168574333,
0.0009826826862990856,
-0.07183235138654709,
-0.020876631140708923,
0.036078646779060364,
0.015619035810232162,
-0.01955125480890274,
0.1076565608382225,
-0.08774691820144653,
0.02745640277862549,
-0.09359776973724365,
-0.10990453511476517,
0.016444405540823936,
-0.10769655555486679,
0.022347012534737587,
-0.09183300286531448,
-0.179530531167984,
-0.017432404682040215,
0.05996386706829071,
-0.024156158789992332,
-0.057796917855739594,
-0.058523666113615036,
-0.06686114519834518,
0.011313402093946934,
-0.00775144575163722,
0.11726417392492294,
-0.06450776755809784,
0.09395397454500198,
0.025765664875507355,
0.06269765645265579,
-0.042549166828393936,
0.05981731042265892,
-0.10193105041980743,
0.013144214637577534,
-0.15104839205741882,
0.0400441437959671,
-0.05145822837948799,
0.06935621798038483,
-0.08194040507078171,
-0.10615845024585724,
0.002663051476702094,
-0.0028380511794239283,
0.06268789619207382,
0.09636878967285156,
-0.18458586931228638,
-0.08252619951963425,
0.1638277769088745,
-0.07281412184238434,
-0.12196049094200134,
0.12136763334274292,
-0.0580064132809639,
0.05861146003007889,
0.05951378867030144,
0.17990033328533173,
0.08656419813632965,
-0.07854954153299332,
0.0018196254968643188,
0.023756947368383408,
0.05026758089661598,
-0.06338667124509811,
0.06808595359325409,
0.0018259093631058931,
0.01989728771150112,
0.03576963394880295,
-0.027762150391936302,
0.0641099065542221,
-0.08784174174070358,
-0.09886786341667175,
-0.0402458980679512,
-0.08253508061170578,
0.0465523786842823,
0.07975026965141296,
0.06739030033349991,
-0.09414571523666382,
-0.07797343283891678,
0.05024334043264389,
0.08206507563591003,
-0.058022212237119675,
0.024602338671684265,
-0.049497149884700775,
0.07355623692274094,
-0.023844702169299126,
-0.021642537787556648,
-0.17968407273292542,
-0.032223403453826904,
0.007671972271054983,
0.0017039760714396834,
0.018056152388453484,
0.03337876871228218,
0.06358686834573746,
0.06016344204545021,
-0.05022745952010155,
-0.019244832918047905,
-0.0352572537958622,
-0.0006887496565468609,
-0.12579292058944702,
-0.19718441367149353,
-0.028450479730963707,
-0.02205497771501541,
0.16075244545936584,
-0.2082725465297699,
0.05154874175786972,
-0.014233555644750595,
0.06959009915590286,
0.012252251617610455,
-0.0065546054393053055,
-0.037181925028562546,
0.07644595205783844,
-0.04241294413805008,
-0.05049571767449379,
0.08215155452489853,
0.01450799684971571,
-0.09128501266241074,
-0.050215501338243484,
-0.09774215519428253,
0.1582167148590088,
0.12925271689891815,
-0.11003289371728897,
-0.07731720805168152,
-0.023380616679787636,
-0.0669984295964241,
-0.034903690218925476,
-0.04613172262907028,
0.026549331843852997,
0.1879379004240036,
-0.0049881828017532825,
0.14970116317272186,
-0.06918737292289734,
-0.043393924832344055,
0.018244462087750435,
-0.03694281727075577,
0.01636327989399433,
0.13443101942539215,
0.13418081402778625,
-0.06011265516281128,
0.15530750155448914,
0.14804664254188538,
-0.08511312305927277,
0.1510668396949768,
-0.04195278137922287,
-0.06577235460281372,
-0.01610550656914711,
-0.029684927314519882,
-0.011206655763089657,
0.10058020800352097,
-0.15690045058727264,
-0.002004367997869849,
0.030701281502842903,
0.015433641150593758,
0.02562039904296398,
-0.22722068428993225,
-0.04094555974006653,
0.03781639412045479,
-0.044886574149131775,
-0.006275756284594536,
-0.00595852779224515,
0.005161743611097336,
0.10115070641040802,
-0.0004888595431111753,
-0.08683908730745316,
0.03661135584115982,
0.0027518956921994686,
-0.08374463021755219,
0.2156449258327484,
-0.081173375248909,
-0.17226016521453857,
-0.13096117973327637,
-0.07049524784088135,
-0.047677770256996155,
-0.0015197350876405835,
0.06859971582889557,
-0.09709104150533676,
-0.02635144256055355,
-0.07209304720163345,
0.025860309600830078,
0.007600440643727779,
0.022702498361468315,
0.0029937936924397945,
0.007451718207448721,
0.06447025388479233,
-0.11222215741872787,
-0.015035846270620823,
-0.058496929705142975,
-0.04542740061879158,
0.044325705617666245,
0.02816983126103878,
0.11031338572502136,
0.15270480513572693,
-0.013083916157484055,
0.012781093828380108,
-0.03162777051329613,
0.23638051748275757,
-0.06001908704638481,
-0.0208893995732069,
0.1460668295621872,
-0.007206457667052746,
0.051826294511556625,
0.11371733248233795,
0.07455668598413467,
-0.07760798186063766,
0.003885192796587944,
0.037943821400403976,
-0.034547463059425354,
-0.23284892737865448,
-0.053616248071193695,
-0.05520634725689888,
0.011296672746539116,
0.089586041867733,
0.02373400144279003,
0.030841536819934845,
0.07044725120067596,
0.04116436094045639,
0.07381468266248703,
-0.037414710968732834,
0.05087532848119736,
0.1295747458934784,
0.030546288937330246,
0.12477082759141922,
-0.04690399020910263,
-0.06424792855978012,
0.040794432163238525,
-0.010863419622182846,
0.22334444522857666,
0.009917938150465488,
0.13131633400917053,
0.06607729941606522,
0.1649666577577591,
-0.009735438972711563,
0.07535355538129807,
-0.010470135137438774,
-0.03799179568886757,
-0.0162015613168478,
-0.03991588577628136,
-0.04029099643230438,
0.023612579330801964,
-0.06360199302434921,
0.06474608182907104,
-0.12359699606895447,
0.014717328362166882,
0.05894068628549576,
0.24849559366703033,
0.03377196192741394,
-0.3200716972351074,
-0.09709673374891281,
0.0007384793716482818,
-0.029719963669776917,
-0.019953938201069832,
0.026196908205747604,
0.09424502402544022,
-0.09753169119358063,
0.029457727447152138,
-0.07466296851634979,
0.0962631106376648,
-0.055720455944538116,
0.05116957798600197,
0.08169417083263397,
0.09054762125015259,
0.011883794330060482,
0.09314082562923431,
-0.2884000837802887,
0.27650073170661926,
0.0002981654542963952,
0.056524623185396194,
-0.07594560086727142,
0.008374286815524101,
0.041159119457006454,
0.06563553959131241,
0.07949315011501312,
-0.01224528532475233,
-0.01759219914674759,
-0.18658626079559326,
-0.06754712015390396,
0.027284175157546997,
0.06876907497644424,
-0.04146112501621246,
0.08209217339754105,
-0.031855158507823944,
0.008920346386730671,
0.07337794452905655,
0.0023068361915647984,
-0.053541041910648346,
-0.10786380618810654,
-0.005141935311257839,
0.022722337394952774,
-0.06008746474981308,
-0.06107710674405098,
-0.12087935954332352,
-0.12990501523017883,
0.155729740858078,
-0.035367779433727264,
-0.038886189460754395,
-0.106304831802845,
0.08325839787721634,
0.05899275466799736,
-0.08922765403985977,
0.0430733785033226,
0.0019881264306604862,
0.07539381086826324,
0.02154691517353058,
-0.07059939950704575,
0.10247620195150375,
-0.07363677769899368,
-0.1557348668575287,
-0.0653294250369072,
0.10667534172534943,
0.033229321241378784,
0.06670800596475601,
-0.014250626787543297,
0.00431025680154562,
-0.046546995639801025,
-0.08820211887359619,
0.020896978676319122,
0.004755881614983082,
0.07743469625711441,
0.018997633829712868,
-0.07543632388114929,
0.01201551128178835,
-0.06481624394655228,
-0.034086693078279495,
0.20510898530483246,
0.2221778780221939,
-0.09982394427061081,
0.024086005985736847,
0.025836989283561707,
-0.0738971009850502,
-0.19793148338794708,
0.03522804379463196,
0.05483577400445938,
0.008683490566909313,
0.04294142872095108,
-0.18465115129947662,
0.13146370649337769,
0.10747389495372772,
-0.011773521080613136,
0.10583452880382538,
-0.324044793844223,
-0.12059681862592697,
0.13578835129737854,
0.13629050552845,
0.09854068607091904,
-0.1321391612291336,
-0.02271113730967045,
-0.018754245713353157,
-0.1379910707473755,
0.11446153372526169,
-0.09142790734767914,
0.120912104845047,
-0.037892017513513565,
0.07596635818481445,
0.0028597572818398476,
-0.0584384985268116,
0.12073198705911636,
0.023140931501984596,
0.0945015400648117,
-0.058765899389982224,
-0.033337973058223724,
0.03047853522002697,
-0.042802438139915466,
0.03446131944656372,
-0.09962396323680878,
0.029662422835826874,
-0.10281215608119965,
-0.0251001063734293,
-0.06927776336669922,
0.04619433730840683,
-0.04536258056759834,
-0.06819522380828857,
-0.037817176431417465,
0.025476092472672462,
0.04637615382671356,
-0.007411271333694458,
0.12242395430803299,
0.02384062111377716,
0.1488822102546692,
0.09686450660228729,
0.07455138862133026,
-0.06877368688583374,
-0.08208677172660828,
-0.026989970356225967,
-0.01078053005039692,
0.050466980785131454,
-0.1369129866361618,
0.019369233399629593,
0.15191002190113068,
0.020388750359416008,
0.15307851135730743,
0.08298779278993607,
-0.021846970543265343,
-0.00145810900721699,
0.059030331671237946,
-0.16558411717414856,
-0.09374777227640152,
-0.017512774094939232,
-0.06781873852014542,
-0.12064754962921143,
0.04518076777458191,
0.09283262491226196,
-0.06830855458974838,
-0.006686370354145765,
-0.004924751818180084,
0.013755558989942074,
-0.05032142624258995,
0.18429741263389587,
0.06282955408096313,
0.047867823392152786,
-0.096428282558918,
0.07266043871641159,
0.0449543371796608,
-0.07330744713544846,
0.0033405697904527187,
0.07132815569639206,
-0.08534601330757141,
-0.05450327321887016,
0.06432835012674332,
0.1912047564983368,
-0.043927162885665894,
-0.04855562746524811,
-0.1453658491373062,
-0.12287921458482742,
0.07764768600463867,
0.1408335417509079,
0.11843205243349075,
0.01058033388108015,
-0.06616745889186859,
0.0029106447473168373,
-0.10731884837150574,
0.10160065442323685,
0.045956265181303024,
0.06211207062005997,
-0.14301945269107819,
0.14211498200893402,
0.020740197971463203,
0.04820193350315094,
-0.01806846633553505,
0.023208048194646835,
-0.10020429641008377,
0.007697694003582001,
-0.09298595041036606,
-0.019537312909960747,
-0.029065001755952835,
0.011588165536522865,
-0.005960927344858646,
-0.04729988053441048,
-0.0542338490486145,
0.010628738440573215,
-0.10766087472438812,
-0.023693302646279335,
0.030114110559225082,
0.07296796143054962,
-0.10916557163000107,
-0.035426318645477295,
0.030875829979777336,
-0.06108306720852852,
0.07371184974908829,
0.04329424351453781,
0.015606247819960117,
0.050780802965164185,
-0.13902859389781952,
0.02026754431426525,
0.07313340902328491,
0.029131997376680374,
0.06079116463661194,
-0.09932733327150345,
-0.007924248464405537,
-0.00831547100096941,
0.039642333984375,
0.02172110602259636,
0.07442112267017365,
-0.1410936713218689,
0.0035281842574477196,
-0.02308511547744274,
-0.08286073803901672,
-0.06700614094734192,
0.028149420395493507,
0.08893175423145294,
0.018416542559862137,
0.19928090274333954,
-0.07619873434305191,
0.049508191645145416,
-0.21921874582767487,
0.007203821558505297,
-0.006482485681772232,
-0.11031077802181244,
-0.10131534934043884,
-0.07205703109502792,
0.05513612926006317,
-0.060717787593603134,
0.1499030590057373,
0.04670583829283714,
0.0190992783755064,
0.02442006766796112,
-0.011104658246040344,
0.0123064573854208,
0.009836219251155853,
0.18994872272014618,
0.030648769810795784,
-0.03457606956362724,
0.05985496938228607,
0.044767893850803375,
0.10333859920501709,
0.11458932608366013,
0.2000289261341095,
0.14501407742500305,
-0.009634922258555889,
0.09320678561925888,
0.043747033923864365,
-0.055918898433446884,
-0.15551309287548065,
0.05201669782400131,
-0.0348605252802372,
0.10937416553497314,
-0.02125314436852932,
0.22091102600097656,
0.06478510051965714,
-0.1696740686893463,
0.051610227674245834,
-0.05148177221417427,
-0.08721215277910233,
-0.11527423560619354,
-0.049710892140865326,
-0.07701697945594788,
-0.13180910050868988,
-0.003880183445289731,
-0.11571096628904343,
-0.0028426540084183216,
0.12518630921840668,
0.003630818100646138,
-0.027395818382501602,
0.15849998593330383,
0.014401617459952831,
0.022212907671928406,
0.06015627831220627,
0.008367235772311687,
-0.03863980621099472,
-0.14036662876605988,
-0.059315942227840424,
-0.012330079451203346,
-0.008793477900326252,
0.03116128407418728,
-0.06153199076652527,
-0.04473326727747917,
0.03117159940302372,
-0.02042582258582115,
-0.09601709246635437,
0.006006556563079357,
0.011131567880511284,
0.0533316507935524,
0.044897403568029404,
0.009516828693449497,
0.018703876063227654,
-0.0037758296821266413,
0.20052506029605865,
-0.0717770978808403,
-0.06498154252767563,
-0.10246375948190689,
0.23358504474163055,
0.0361536405980587,
-0.018478771671652794,
0.03453867882490158,
-0.06657195836305618,
0.004409546032547951,
0.24914872646331787,
0.21641024947166443,
-0.07975707203149796,
-0.0060849878937006,
0.016893045976758003,
-0.007865218445658684,
-0.021979933604598045,
0.09790797531604767,
0.14306801557540894,
0.04558771848678589,
-0.09230106323957443,
-0.04356169328093529,
-0.058740004897117615,
-0.0174243226647377,
-0.03374728187918663,
0.0693250447511673,
0.051049165427684784,
0.009482786059379578,
-0.03563119098544121,
0.0567263662815094,
-0.06666052341461182,
-0.09042596817016602,
0.05730389058589935,
-0.21876642107963562,
-0.16741296648979187,
-0.01652800291776657,
0.1028960645198822,
0.0017809192650020123,
0.061948828399181366,
-0.029001614078879356,
-0.003035155590623617,
0.09021490812301636,
-0.019437670707702637,
-0.09757450222969055,
-0.07405272871255875,
0.0848613753914833,
-0.11156534403562546,
0.21836979687213898,
-0.047714509069919586,
0.054364051669836044,
0.1254379153251648,
0.06760457158088684,
-0.0644994005560875,
0.06536801904439926,
0.04187343269586563,
-0.04156811535358429,
0.023007867857813835,
0.06869390606880188,
-0.03251289203763008,
0.06446241587400436,
0.047979000955820084,
-0.13706746697425842,
0.02370663918554783,
-0.04812363535165787,
-0.06919568032026291,
-0.043872371315956116,
-0.020693952217698097,
-0.06029629707336426,
0.12902742624282837,
0.2190595120191574,
-0.024821428582072258,
-0.00955967791378498,
-0.07230399549007416,
0.00883461069315672,
0.05615578591823578,
0.021983714774250984,
-0.057219840586185455,
-0.21056394279003143,
0.016822319477796555,
0.04565083608031273,
-0.01846429333090782,
-0.25154390931129456,
-0.10084811598062515,
0.004124476574361324,
-0.07295944541692734,
-0.0947342962026596,
0.07150569558143616,
0.08810579776763916,
0.054769519716501236,
-0.05578319728374481,
-0.04721960425376892,
-0.07467550039291382,
0.14914295077323914,
-0.1454761028289795,
-0.09100616723299026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-qnli
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8121
- Accuracy: 0.6065
## Model description
More information needed
## Intended uses & limitations
More information needed
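Usage is not documented; both GLUE QNLI and RTE are sentence-pair tasks, so a minimal, hypothetical inference sketch would encode two texts together. The example sentences are invented, and labels surface as `LABEL_0`/`LABEL_1` because no `id2label` mapping is provided.
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/distilbert-base-uncased-finetuned-qnli"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Sentence-pair classification: both texts go to the tokenizer together.
first = "A new treatment was approved by regulators last year."
second = "Regulators approved a new treatment."
inputs = tokenizer(first, second, return_tensors="pt", truncation=True)

with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)

print(probs)  # [p(LABEL_0), p(LABEL_1)]
```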
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 156 | 0.6949 | 0.4874 |
| No log | 2.0 | 312 | 0.6596 | 0.5957 |
| No log | 3.0 | 468 | 0.7186 | 0.5812 |
| 0.6026 | 4.0 | 624 | 0.7727 | 0.6029 |
| 0.6026 | 5.0 | 780 | 0.8121 | 0.6065 |
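Outside the `Trainer`, the optimizer and scheduler listed above correspond roughly to the sketch below. `torch.optim.AdamW` stands in here for the listed Adam configuration, zero warmup steps is an assumption, and the step count comes from the table above (156 steps per epoch × 5 epochs).
```python
import torch
from transformers import AutoModelForSequenceClassification, get_linear_schedule_with_warmup

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

num_training_steps = 156 * 5  # matches the final step (780) in the results table

optimizer = torch.optim.AdamW(
    model.parameters(), lr=2e-5, betas=(0.9, 0.999), eps=1e-8
)
# "lr_scheduler_type: linear" with no warmup reported -> linear decay from step 0.
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=num_training_steps
)
```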
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "distilbert-base-uncased-finetuned-qnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.6064981949458483, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/distilbert-base-uncased-finetuned-qnli
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-qnli
======================================
This model is a fine-tuned version of distilbert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.8121
* Accuracy: 0.6065
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10177629441022873,
0.09868992865085602,
-0.002423677360638976,
0.12112317979335785,
0.1650812178850174,
0.03426579013466835,
0.1299346536397934,
0.12770213186740875,
-0.08564593642950058,
0.021304525434970856,
0.1207699179649353,
0.16118186712265015,
0.023844653740525246,
0.10977903753519058,
-0.04856811463832855,
-0.26443108916282654,
-0.014250527136027813,
0.05101621896028519,
-0.05502324551343918,
0.13528308272361755,
0.09085173904895782,
-0.12123988568782806,
0.09099695831537247,
0.010560519993305206,
-0.19217939674854279,
0.0012736907228827477,
-0.00007869464025134221,
-0.05158104747533798,
0.1480625867843628,
0.026134053245186806,
0.1222379058599472,
0.004350635223090649,
0.08203282207250595,
-0.20211967825889587,
0.0109197236597538,
0.047976233065128326,
0.0034287353046238422,
0.09363500773906708,
0.04515067860484123,
0.002628615591675043,
0.12063173204660416,
-0.0803351029753685,
0.05412057787179947,
0.025187760591506958,
-0.11920642107725143,
-0.2130812108516693,
-0.07939155399799347,
0.036630984395742416,
0.07490450143814087,
0.10550613701343536,
-0.007987946271896362,
0.12026778608560562,
-0.08140391111373901,
0.09315772354602814,
0.22560088336467743,
-0.2846624255180359,
-0.06620592623949051,
0.04440826550126076,
0.014406891539692879,
0.04722030460834503,
-0.10258147865533829,
-0.03414628654718399,
0.04851415753364563,
0.0509343147277832,
0.1279968023300171,
-0.027470313012599945,
-0.11697384715080261,
0.006325297988951206,
-0.14067994058132172,
-0.031778451055288315,
0.16862210631370544,
0.04232211038470268,
-0.027326999232172966,
-0.05612191930413246,
-0.05705082044005394,
-0.1505020707845688,
-0.035502392798662186,
-0.01559263002127409,
0.049344535917043686,
-0.023357374593615532,
-0.04166480526328087,
-0.009466851130127907,
-0.10793615877628326,
-0.06508545577526093,
-0.07387420535087585,
0.11076346784830093,
0.03758743405342102,
0.006860504392534494,
-0.029825663194060326,
0.11246810108423233,
-0.007811444811522961,
-0.12158264964818954,
0.025255942717194557,
0.022893251851201057,
0.01350346952676773,
-0.03933148831129074,
-0.05274882912635803,
-0.06423640251159668,
0.01213061437010765,
0.12725643813610077,
-0.05570182576775551,
0.04239019751548767,
0.05044251307845116,
0.04930534213781357,
-0.09428779035806656,
0.19049058854579926,
-0.034276507794857025,
-0.025200827047228813,
0.0005440381937660277,
0.05053231865167618,
0.017217280343174934,
-0.011495225131511688,
-0.12165343761444092,
0.004494988825172186,
0.08820939809083939,
0.007856707088649273,
-0.06190384179353714,
0.07444976270198822,
-0.059118129312992096,
-0.024367066100239754,
0.0022545091342180967,
-0.09038243442773819,
0.021844087168574333,
0.0009826826862990856,
-0.07183235138654709,
-0.020876631140708923,
0.036078646779060364,
0.015619035810232162,
-0.01955125480890274,
0.1076565608382225,
-0.08774691820144653,
0.02745640277862549,
-0.09359776973724365,
-0.10990453511476517,
0.016444405540823936,
-0.10769655555486679,
0.022347012534737587,
-0.09183300286531448,
-0.179530531167984,
-0.017432404682040215,
0.05996386706829071,
-0.024156158789992332,
-0.057796917855739594,
-0.058523666113615036,
-0.06686114519834518,
0.011313402093946934,
-0.00775144575163722,
0.11726417392492294,
-0.06450776755809784,
0.09395397454500198,
0.025765664875507355,
0.06269765645265579,
-0.042549166828393936,
0.05981731042265892,
-0.10193105041980743,
0.013144214637577534,
-0.15104839205741882,
0.0400441437959671,
-0.05145822837948799,
0.06935621798038483,
-0.08194040507078171,
-0.10615845024585724,
0.002663051476702094,
-0.0028380511794239283,
0.06268789619207382,
0.09636878967285156,
-0.18458586931228638,
-0.08252619951963425,
0.1638277769088745,
-0.07281412184238434,
-0.12196049094200134,
0.12136763334274292,
-0.0580064132809639,
0.05861146003007889,
0.05951378867030144,
0.17990033328533173,
0.08656419813632965,
-0.07854954153299332,
0.0018196254968643188,
0.023756947368383408,
0.05026758089661598,
-0.06338667124509811,
0.06808595359325409,
0.0018259093631058931,
0.01989728771150112,
0.03576963394880295,
-0.027762150391936302,
0.0641099065542221,
-0.08784174174070358,
-0.09886786341667175,
-0.0402458980679512,
-0.08253508061170578,
0.0465523786842823,
0.07975026965141296,
0.06739030033349991,
-0.09414571523666382,
-0.07797343283891678,
0.05024334043264389,
0.08206507563591003,
-0.058022212237119675,
0.024602338671684265,
-0.049497149884700775,
0.07355623692274094,
-0.023844702169299126,
-0.021642537787556648,
-0.17968407273292542,
-0.032223403453826904,
0.007671972271054983,
0.0017039760714396834,
0.018056152388453484,
0.03337876871228218,
0.06358686834573746,
0.06016344204545021,
-0.05022745952010155,
-0.019244832918047905,
-0.0352572537958622,
-0.0006887496565468609,
-0.12579292058944702,
-0.19718441367149353,
-0.028450479730963707,
-0.02205497771501541,
0.16075244545936584,
-0.2082725465297699,
0.05154874175786972,
-0.014233555644750595,
0.06959009915590286,
0.012252251617610455,
-0.0065546054393053055,
-0.037181925028562546,
0.07644595205783844,
-0.04241294413805008,
-0.05049571767449379,
0.08215155452489853,
0.01450799684971571,
-0.09128501266241074,
-0.050215501338243484,
-0.09774215519428253,
0.1582167148590088,
0.12925271689891815,
-0.11003289371728897,
-0.07731720805168152,
-0.023380616679787636,
-0.0669984295964241,
-0.034903690218925476,
-0.04613172262907028,
0.026549331843852997,
0.1879379004240036,
-0.0049881828017532825,
0.14970116317272186,
-0.06918737292289734,
-0.043393924832344055,
0.018244462087750435,
-0.03694281727075577,
0.01636327989399433,
0.13443101942539215,
0.13418081402778625,
-0.06011265516281128,
0.15530750155448914,
0.14804664254188538,
-0.08511312305927277,
0.1510668396949768,
-0.04195278137922287,
-0.06577235460281372,
-0.01610550656914711,
-0.029684927314519882,
-0.011206655763089657,
0.10058020800352097,
-0.15690045058727264,
-0.002004367997869849,
0.030701281502842903,
0.015433641150593758,
0.02562039904296398,
-0.22722068428993225,
-0.04094555974006653,
0.03781639412045479,
-0.044886574149131775,
-0.006275756284594536,
-0.00595852779224515,
0.005161743611097336,
0.10115070641040802,
-0.0004888595431111753,
-0.08683908730745316,
0.03661135584115982,
0.0027518956921994686,
-0.08374463021755219,
0.2156449258327484,
-0.081173375248909,
-0.17226016521453857,
-0.13096117973327637,
-0.07049524784088135,
-0.047677770256996155,
-0.0015197350876405835,
0.06859971582889557,
-0.09709104150533676,
-0.02635144256055355,
-0.07209304720163345,
0.025860309600830078,
0.007600440643727779,
0.022702498361468315,
0.0029937936924397945,
0.007451718207448721,
0.06447025388479233,
-0.11222215741872787,
-0.015035846270620823,
-0.058496929705142975,
-0.04542740061879158,
0.044325705617666245,
0.02816983126103878,
0.11031338572502136,
0.15270480513572693,
-0.013083916157484055,
0.012781093828380108,
-0.03162777051329613,
0.23638051748275757,
-0.06001908704638481,
-0.0208893995732069,
0.1460668295621872,
-0.007206457667052746,
0.051826294511556625,
0.11371733248233795,
0.07455668598413467,
-0.07760798186063766,
0.003885192796587944,
0.037943821400403976,
-0.034547463059425354,
-0.23284892737865448,
-0.053616248071193695,
-0.05520634725689888,
0.011296672746539116,
0.089586041867733,
0.02373400144279003,
0.030841536819934845,
0.07044725120067596,
0.04116436094045639,
0.07381468266248703,
-0.037414710968732834,
0.05087532848119736,
0.1295747458934784,
0.030546288937330246,
0.12477082759141922,
-0.04690399020910263,
-0.06424792855978012,
0.040794432163238525,
-0.010863419622182846,
0.22334444522857666,
0.009917938150465488,
0.13131633400917053,
0.06607729941606522,
0.1649666577577591,
-0.009735438972711563,
0.07535355538129807,
-0.010470135137438774,
-0.03799179568886757,
-0.0162015613168478,
-0.03991588577628136,
-0.04029099643230438,
0.023612579330801964,
-0.06360199302434921,
0.06474608182907104,
-0.12359699606895447,
0.014717328362166882,
0.05894068628549576,
0.24849559366703033,
0.03377196192741394,
-0.3200716972351074,
-0.09709673374891281,
0.0007384793716482818,
-0.029719963669776917,
-0.019953938201069832,
0.026196908205747604,
0.09424502402544022,
-0.09753169119358063,
0.029457727447152138,
-0.07466296851634979,
0.0962631106376648,
-0.055720455944538116,
0.05116957798600197,
0.08169417083263397,
0.09054762125015259,
0.011883794330060482,
0.09314082562923431,
-0.2884000837802887,
0.27650073170661926,
0.0002981654542963952,
0.056524623185396194,
-0.07594560086727142,
0.008374286815524101,
0.041159119457006454,
0.06563553959131241,
0.07949315011501312,
-0.01224528532475233,
-0.01759219914674759,
-0.18658626079559326,
-0.06754712015390396,
0.027284175157546997,
0.06876907497644424,
-0.04146112501621246,
0.08209217339754105,
-0.031855158507823944,
0.008920346386730671,
0.07337794452905655,
0.0023068361915647984,
-0.053541041910648346,
-0.10786380618810654,
-0.005141935311257839,
0.022722337394952774,
-0.06008746474981308,
-0.06107710674405098,
-0.12087935954332352,
-0.12990501523017883,
0.155729740858078,
-0.035367779433727264,
-0.038886189460754395,
-0.106304831802845,
0.08325839787721634,
0.05899275466799736,
-0.08922765403985977,
0.0430733785033226,
0.0019881264306604862,
0.07539381086826324,
0.02154691517353058,
-0.07059939950704575,
0.10247620195150375,
-0.07363677769899368,
-0.1557348668575287,
-0.0653294250369072,
0.10667534172534943,
0.033229321241378784,
0.06670800596475601,
-0.014250626787543297,
0.00431025680154562,
-0.046546995639801025,
-0.08820211887359619,
0.020896978676319122,
0.004755881614983082,
0.07743469625711441,
0.018997633829712868,
-0.07543632388114929,
0.01201551128178835,
-0.06481624394655228,
-0.034086693078279495,
0.20510898530483246,
0.2221778780221939,
-0.09982394427061081,
0.024086005985736847,
0.025836989283561707,
-0.0738971009850502,
-0.19793148338794708,
0.03522804379463196,
0.05483577400445938,
0.008683490566909313,
0.04294142872095108,
-0.18465115129947662,
0.13146370649337769,
0.10747389495372772,
-0.011773521080613136,
0.10583452880382538,
-0.324044793844223,
-0.12059681862592697,
0.13578835129737854,
0.13629050552845,
0.09854068607091904,
-0.1321391612291336,
-0.02271113730967045,
-0.018754245713353157,
-0.1379910707473755,
0.11446153372526169,
-0.09142790734767914,
0.120912104845047,
-0.037892017513513565,
0.07596635818481445,
0.0028597572818398476,
-0.0584384985268116,
0.12073198705911636,
0.023140931501984596,
0.0945015400648117,
-0.058765899389982224,
-0.033337973058223724,
0.03047853522002697,
-0.042802438139915466,
0.03446131944656372,
-0.09962396323680878,
0.029662422835826874,
-0.10281215608119965,
-0.0251001063734293,
-0.06927776336669922,
0.04619433730840683,
-0.04536258056759834,
-0.06819522380828857,
-0.037817176431417465,
0.025476092472672462,
0.04637615382671356,
-0.007411271333694458,
0.12242395430803299,
0.02384062111377716,
0.1488822102546692,
0.09686450660228729,
0.07455138862133026,
-0.06877368688583374,
-0.08208677172660828,
-0.026989970356225967,
-0.01078053005039692,
0.050466980785131454,
-0.1369129866361618,
0.019369233399629593,
0.15191002190113068,
0.020388750359416008,
0.15307851135730743,
0.08298779278993607,
-0.021846970543265343,
-0.00145810900721699,
0.059030331671237946,
-0.16558411717414856,
-0.09374777227640152,
-0.017512774094939232,
-0.06781873852014542,
-0.12064754962921143,
0.04518076777458191,
0.09283262491226196,
-0.06830855458974838,
-0.006686370354145765,
-0.004924751818180084,
0.013755558989942074,
-0.05032142624258995,
0.18429741263389587,
0.06282955408096313,
0.047867823392152786,
-0.096428282558918,
0.07266043871641159,
0.0449543371796608,
-0.07330744713544846,
0.0033405697904527187,
0.07132815569639206,
-0.08534601330757141,
-0.05450327321887016,
0.06432835012674332,
0.1912047564983368,
-0.043927162885665894,
-0.04855562746524811,
-0.1453658491373062,
-0.12287921458482742,
0.07764768600463867,
0.1408335417509079,
0.11843205243349075,
0.01058033388108015,
-0.06616745889186859,
0.0029106447473168373,
-0.10731884837150574,
0.10160065442323685,
0.045956265181303024,
0.06211207062005997,
-0.14301945269107819,
0.14211498200893402,
0.020740197971463203,
0.04820193350315094,
-0.01806846633553505,
0.023208048194646835,
-0.10020429641008377,
0.007697694003582001,
-0.09298595041036606,
-0.019537312909960747,
-0.029065001755952835,
0.011588165536522865,
-0.005960927344858646,
-0.04729988053441048,
-0.0542338490486145,
0.010628738440573215,
-0.10766087472438812,
-0.023693302646279335,
0.030114110559225082,
0.07296796143054962,
-0.10916557163000107,
-0.035426318645477295,
0.030875829979777336,
-0.06108306720852852,
0.07371184974908829,
0.04329424351453781,
0.015606247819960117,
0.050780802965164185,
-0.13902859389781952,
0.02026754431426525,
0.07313340902328491,
0.029131997376680374,
0.06079116463661194,
-0.09932733327150345,
-0.007924248464405537,
-0.00831547100096941,
0.039642333984375,
0.02172110602259636,
0.07442112267017365,
-0.1410936713218689,
0.0035281842574477196,
-0.02308511547744274,
-0.08286073803901672,
-0.06700614094734192,
0.028149420395493507,
0.08893175423145294,
0.018416542559862137,
0.19928090274333954,
-0.07619873434305191,
0.049508191645145416,
-0.21921874582767487,
0.007203821558505297,
-0.006482485681772232,
-0.11031077802181244,
-0.10131534934043884,
-0.07205703109502792,
0.05513612926006317,
-0.060717787593603134,
0.1499030590057373,
0.04670583829283714,
0.0190992783755064,
0.02442006766796112,
-0.011104658246040344,
0.0123064573854208,
0.009836219251155853,
0.18994872272014618,
0.030648769810795784,
-0.03457606956362724,
0.05985496938228607,
0.044767893850803375,
0.10333859920501709,
0.11458932608366013,
0.2000289261341095,
0.14501407742500305,
-0.009634922258555889,
0.09320678561925888,
0.043747033923864365,
-0.055918898433446884,
-0.15551309287548065,
0.05201669782400131,
-0.0348605252802372,
0.10937416553497314,
-0.02125314436852932,
0.22091102600097656,
0.06478510051965714,
-0.1696740686893463,
0.051610227674245834,
-0.05148177221417427,
-0.08721215277910233,
-0.11527423560619354,
-0.049710892140865326,
-0.07701697945594788,
-0.13180910050868988,
-0.003880183445289731,
-0.11571096628904343,
-0.0028426540084183216,
0.12518630921840668,
0.003630818100646138,
-0.027395818382501602,
0.15849998593330383,
0.014401617459952831,
0.022212907671928406,
0.06015627831220627,
0.008367235772311687,
-0.03863980621099472,
-0.14036662876605988,
-0.059315942227840424,
-0.012330079451203346,
-0.008793477900326252,
0.03116128407418728,
-0.06153199076652527,
-0.04473326727747917,
0.03117159940302372,
-0.02042582258582115,
-0.09601709246635437,
0.006006556563079357,
0.011131567880511284,
0.0533316507935524,
0.044897403568029404,
0.009516828693449497,
0.018703876063227654,
-0.0037758296821266413,
0.20052506029605865,
-0.0717770978808403,
-0.06498154252767563,
-0.10246375948190689,
0.23358504474163055,
0.0361536405980587,
-0.018478771671652794,
0.03453867882490158,
-0.06657195836305618,
0.004409546032547951,
0.24914872646331787,
0.21641024947166443,
-0.07975707203149796,
-0.0060849878937006,
0.016893045976758003,
-0.007865218445658684,
-0.021979933604598045,
0.09790797531604767,
0.14306801557540894,
0.04558771848678589,
-0.09230106323957443,
-0.04356169328093529,
-0.058740004897117615,
-0.0174243226647377,
-0.03374728187918663,
0.0693250447511673,
0.051049165427684784,
0.009482786059379578,
-0.03563119098544121,
0.0567263662815094,
-0.06666052341461182,
-0.09042596817016602,
0.05730389058589935,
-0.21876642107963562,
-0.16741296648979187,
-0.01652800291776657,
0.1028960645198822,
0.0017809192650020123,
0.061948828399181366,
-0.029001614078879356,
-0.003035155590623617,
0.09021490812301636,
-0.019437670707702637,
-0.09757450222969055,
-0.07405272871255875,
0.0848613753914833,
-0.11156534403562546,
0.21836979687213898,
-0.047714509069919586,
0.054364051669836044,
0.1254379153251648,
0.06760457158088684,
-0.0644994005560875,
0.06536801904439926,
0.04187343269586563,
-0.04156811535358429,
0.023007867857813835,
0.06869390606880188,
-0.03251289203763008,
0.06446241587400436,
0.047979000955820084,
-0.13706746697425842,
0.02370663918554783,
-0.04812363535165787,
-0.06919568032026291,
-0.043872371315956116,
-0.020693952217698097,
-0.06029629707336426,
0.12902742624282837,
0.2190595120191574,
-0.024821428582072258,
-0.00955967791378498,
-0.07230399549007416,
0.00883461069315672,
0.05615578591823578,
0.021983714774250984,
-0.057219840586185455,
-0.21056394279003143,
0.016822319477796555,
0.04565083608031273,
-0.01846429333090782,
-0.25154390931129456,
-0.10084811598062515,
0.004124476574361324,
-0.07295944541692734,
-0.0947342962026596,
0.07150569558143616,
0.08810579776763916,
0.054769519716501236,
-0.05578319728374481,
-0.04721960425376892,
-0.07467550039291382,
0.14914295077323914,
-0.1454761028289795,
-0.09100616723299026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-rte
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6661
- Accuracy: 0.6173
## Model description
More information needed
## Intended uses & limitations
More information needed
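No usage example is provided. With a recent `transformers` release, the text-classification pipeline accepts a premise/hypothesis pair as a `text`/`text_pair` dict, so a quick, hypothetical check could look like this (example sentences are invented, and labels appear as `LABEL_0`/`LABEL_1` unless an `id2label` mapping is added):
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="anirudh21/distilbert-base-uncased-finetuned-rte",
)

# RTE is premise/hypothesis entailment; the pair is passed as text / text_pair.
result = classifier(
    {"text": "The cat sat on the mat.", "text_pair": "A cat is on a mat."}
)
print(result)  # e.g. [{'label': 'LABEL_0', 'score': ...}]
```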
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 156 | 0.6921 | 0.5162 |
| No log | 2.0 | 312 | 0.6661 | 0.6173 |
| No log | 3.0 | 468 | 0.7794 | 0.5632 |
| 0.5903 | 4.0 | 624 | 0.8832 | 0.5921 |
| 0.5903 | 5.0 | 780 | 0.9376 | 0.5921 |
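The reported validation accuracy could, in principle, be re-checked against the GLUE RTE validation split (277 premise/hypothesis pairs) with a rough sketch like the one below; per-example batching and default truncation are simplifications, not the exact evaluation setup used by the Trainer.
```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/distilbert-base-uncased-finetuned-rte"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

dataset = load_dataset("glue", "rte", split="validation")

correct = 0
for example in dataset:
    inputs = tokenizer(
        example["sentence1"], example["sentence2"],
        return_tensors="pt", truncation=True,
    )
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == example["label"])

print(f"accuracy: {correct / len(dataset):.4f}")
```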
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "distilbert-base-uncased-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.6173285198555957, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/distilbert-base-uncased-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-rte
=====================================
This model is a fine-tuned version of distilbert-base-uncased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6661
* Accuracy: 0.6173
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10177629441022873,
0.09868992865085602,
-0.002423677360638976,
0.12112317979335785,
0.1650812178850174,
0.03426579013466835,
0.1299346536397934,
0.12770213186740875,
-0.08564593642950058,
0.021304525434970856,
0.1207699179649353,
0.16118186712265015,
0.023844653740525246,
0.10977903753519058,
-0.04856811463832855,
-0.26443108916282654,
-0.014250527136027813,
0.05101621896028519,
-0.05502324551343918,
0.13528308272361755,
0.09085173904895782,
-0.12123988568782806,
0.09099695831537247,
0.010560519993305206,
-0.19217939674854279,
0.0012736907228827477,
-0.00007869464025134221,
-0.05158104747533798,
0.1480625867843628,
0.026134053245186806,
0.1222379058599472,
0.004350635223090649,
0.08203282207250595,
-0.20211967825889587,
0.0109197236597538,
0.047976233065128326,
0.0034287353046238422,
0.09363500773906708,
0.04515067860484123,
0.002628615591675043,
0.12063173204660416,
-0.0803351029753685,
0.05412057787179947,
0.025187760591506958,
-0.11920642107725143,
-0.2130812108516693,
-0.07939155399799347,
0.036630984395742416,
0.07490450143814087,
0.10550613701343536,
-0.007987946271896362,
0.12026778608560562,
-0.08140391111373901,
0.09315772354602814,
0.22560088336467743,
-0.2846624255180359,
-0.06620592623949051,
0.04440826550126076,
0.014406891539692879,
0.04722030460834503,
-0.10258147865533829,
-0.03414628654718399,
0.04851415753364563,
0.0509343147277832,
0.1279968023300171,
-0.027470313012599945,
-0.11697384715080261,
0.006325297988951206,
-0.14067994058132172,
-0.031778451055288315,
0.16862210631370544,
0.04232211038470268,
-0.027326999232172966,
-0.05612191930413246,
-0.05705082044005394,
-0.1505020707845688,
-0.035502392798662186,
-0.01559263002127409,
0.049344535917043686,
-0.023357374593615532,
-0.04166480526328087,
-0.009466851130127907,
-0.10793615877628326,
-0.06508545577526093,
-0.07387420535087585,
0.11076346784830093,
0.03758743405342102,
0.006860504392534494,
-0.029825663194060326,
0.11246810108423233,
-0.007811444811522961,
-0.12158264964818954,
0.025255942717194557,
0.022893251851201057,
0.01350346952676773,
-0.03933148831129074,
-0.05274882912635803,
-0.06423640251159668,
0.01213061437010765,
0.12725643813610077,
-0.05570182576775551,
0.04239019751548767,
0.05044251307845116,
0.04930534213781357,
-0.09428779035806656,
0.19049058854579926,
-0.034276507794857025,
-0.025200827047228813,
0.0005440381937660277,
0.05053231865167618,
0.017217280343174934,
-0.011495225131511688,
-0.12165343761444092,
0.004494988825172186,
0.08820939809083939,
0.007856707088649273,
-0.06190384179353714,
0.07444976270198822,
-0.059118129312992096,
-0.024367066100239754,
0.0022545091342180967,
-0.09038243442773819,
0.021844087168574333,
0.0009826826862990856,
-0.07183235138654709,
-0.020876631140708923,
0.036078646779060364,
0.015619035810232162,
-0.01955125480890274,
0.1076565608382225,
-0.08774691820144653,
0.02745640277862549,
-0.09359776973724365,
-0.10990453511476517,
0.016444405540823936,
-0.10769655555486679,
0.022347012534737587,
-0.09183300286531448,
-0.179530531167984,
-0.017432404682040215,
0.05996386706829071,
-0.024156158789992332,
-0.057796917855739594,
-0.058523666113615036,
-0.06686114519834518,
0.011313402093946934,
-0.00775144575163722,
0.11726417392492294,
-0.06450776755809784,
0.09395397454500198,
0.025765664875507355,
0.06269765645265579,
-0.042549166828393936,
0.05981731042265892,
-0.10193105041980743,
0.013144214637577534,
-0.15104839205741882,
0.0400441437959671,
-0.05145822837948799,
0.06935621798038483,
-0.08194040507078171,
-0.10615845024585724,
0.002663051476702094,
-0.0028380511794239283,
0.06268789619207382,
0.09636878967285156,
-0.18458586931228638,
-0.08252619951963425,
0.1638277769088745,
-0.07281412184238434,
-0.12196049094200134,
0.12136763334274292,
-0.0580064132809639,
0.05861146003007889,
0.05951378867030144,
0.17990033328533173,
0.08656419813632965,
-0.07854954153299332,
0.0018196254968643188,
0.023756947368383408,
0.05026758089661598,
-0.06338667124509811,
0.06808595359325409,
0.0018259093631058931,
0.01989728771150112,
0.03576963394880295,
-0.027762150391936302,
0.0641099065542221,
-0.08784174174070358,
-0.09886786341667175,
-0.0402458980679512,
-0.08253508061170578,
0.0465523786842823,
0.07975026965141296,
0.06739030033349991,
-0.09414571523666382,
-0.07797343283891678,
0.05024334043264389,
0.08206507563591003,
-0.058022212237119675,
0.024602338671684265,
-0.049497149884700775,
0.07355623692274094,
-0.023844702169299126,
-0.021642537787556648,
-0.17968407273292542,
-0.032223403453826904,
0.007671972271054983,
0.0017039760714396834,
0.018056152388453484,
0.03337876871228218,
0.06358686834573746,
0.06016344204545021,
-0.05022745952010155,
-0.019244832918047905,
-0.0352572537958622,
-0.0006887496565468609,
-0.12579292058944702,
-0.19718441367149353,
-0.028450479730963707,
-0.02205497771501541,
0.16075244545936584,
-0.2082725465297699,
0.05154874175786972,
-0.014233555644750595,
0.06959009915590286,
0.012252251617610455,
-0.0065546054393053055,
-0.037181925028562546,
0.07644595205783844,
-0.04241294413805008,
-0.05049571767449379,
0.08215155452489853,
0.01450799684971571,
-0.09128501266241074,
-0.050215501338243484,
-0.09774215519428253,
0.1582167148590088,
0.12925271689891815,
-0.11003289371728897,
-0.07731720805168152,
-0.023380616679787636,
-0.0669984295964241,
-0.034903690218925476,
-0.04613172262907028,
0.026549331843852997,
0.1879379004240036,
-0.0049881828017532825,
0.14970116317272186,
-0.06918737292289734,
-0.043393924832344055,
0.018244462087750435,
-0.03694281727075577,
0.01636327989399433,
0.13443101942539215,
0.13418081402778625,
-0.06011265516281128,
0.15530750155448914,
0.14804664254188538,
-0.08511312305927277,
0.1510668396949768,
-0.04195278137922287,
-0.06577235460281372,
-0.01610550656914711,
-0.029684927314519882,
-0.011206655763089657,
0.10058020800352097,
-0.15690045058727264,
-0.002004367997869849,
0.030701281502842903,
0.015433641150593758,
0.02562039904296398,
-0.22722068428993225,
-0.04094555974006653,
0.03781639412045479,
-0.044886574149131775,
-0.006275756284594536,
-0.00595852779224515,
0.005161743611097336,
0.10115070641040802,
-0.0004888595431111753,
-0.08683908730745316,
0.03661135584115982,
0.0027518956921994686,
-0.08374463021755219,
0.2156449258327484,
-0.081173375248909,
-0.17226016521453857,
-0.13096117973327637,
-0.07049524784088135,
-0.047677770256996155,
-0.0015197350876405835,
0.06859971582889557,
-0.09709104150533676,
-0.02635144256055355,
-0.07209304720163345,
0.025860309600830078,
0.007600440643727779,
0.022702498361468315,
0.0029937936924397945,
0.007451718207448721,
0.06447025388479233,
-0.11222215741872787,
-0.015035846270620823,
-0.058496929705142975,
-0.04542740061879158,
0.044325705617666245,
0.02816983126103878,
0.11031338572502136,
0.15270480513572693,
-0.013083916157484055,
0.012781093828380108,
-0.03162777051329613,
0.23638051748275757,
-0.06001908704638481,
-0.0208893995732069,
0.1460668295621872,
-0.007206457667052746,
0.051826294511556625,
0.11371733248233795,
0.07455668598413467,
-0.07760798186063766,
0.003885192796587944,
0.037943821400403976,
-0.034547463059425354,
-0.23284892737865448,
-0.053616248071193695,
-0.05520634725689888,
0.011296672746539116,
0.089586041867733,
0.02373400144279003,
0.030841536819934845,
0.07044725120067596,
0.04116436094045639,
0.07381468266248703,
-0.037414710968732834,
0.05087532848119736,
0.1295747458934784,
0.030546288937330246,
0.12477082759141922,
-0.04690399020910263,
-0.06424792855978012,
0.040794432163238525,
-0.010863419622182846,
0.22334444522857666,
0.009917938150465488,
0.13131633400917053,
0.06607729941606522,
0.1649666577577591,
-0.009735438972711563,
0.07535355538129807,
-0.010470135137438774,
-0.03799179568886757,
-0.0162015613168478,
-0.03991588577628136,
-0.04029099643230438,
0.023612579330801964,
-0.06360199302434921,
0.06474608182907104,
-0.12359699606895447,
0.014717328362166882,
0.05894068628549576,
0.24849559366703033,
0.03377196192741394,
-0.3200716972351074,
-0.09709673374891281,
0.0007384793716482818,
-0.029719963669776917,
-0.019953938201069832,
0.026196908205747604,
0.09424502402544022,
-0.09753169119358063,
0.029457727447152138,
-0.07466296851634979,
0.0962631106376648,
-0.055720455944538116,
0.05116957798600197,
0.08169417083263397,
0.09054762125015259,
0.011883794330060482,
0.09314082562923431,
-0.2884000837802887,
0.27650073170661926,
0.0002981654542963952,
0.056524623185396194,
-0.07594560086727142,
0.008374286815524101,
0.041159119457006454,
0.06563553959131241,
0.07949315011501312,
-0.01224528532475233,
-0.01759219914674759,
-0.18658626079559326,
-0.06754712015390396,
0.027284175157546997,
0.06876907497644424,
-0.04146112501621246,
0.08209217339754105,
-0.031855158507823944,
0.008920346386730671,
0.07337794452905655,
0.0023068361915647984,
-0.053541041910648346,
-0.10786380618810654,
-0.005141935311257839,
0.022722337394952774,
-0.06008746474981308,
-0.06107710674405098,
-0.12087935954332352,
-0.12990501523017883,
0.155729740858078,
-0.035367779433727264,
-0.038886189460754395,
-0.106304831802845,
0.08325839787721634,
0.05899275466799736,
-0.08922765403985977,
0.0430733785033226,
0.0019881264306604862,
0.07539381086826324,
0.02154691517353058,
-0.07059939950704575,
0.10247620195150375,
-0.07363677769899368,
-0.1557348668575287,
-0.0653294250369072,
0.10667534172534943,
0.033229321241378784,
0.06670800596475601,
-0.014250626787543297,
0.00431025680154562,
-0.046546995639801025,
-0.08820211887359619,
0.020896978676319122,
0.004755881614983082,
0.07743469625711441,
0.018997633829712868,
-0.07543632388114929,
0.01201551128178835,
-0.06481624394655228,
-0.034086693078279495,
0.20510898530483246,
0.2221778780221939,
-0.09982394427061081,
0.024086005985736847,
0.025836989283561707,
-0.0738971009850502,
-0.19793148338794708,
0.03522804379463196,
0.05483577400445938,
0.008683490566909313,
0.04294142872095108,
-0.18465115129947662,
0.13146370649337769,
0.10747389495372772,
-0.011773521080613136,
0.10583452880382538,
-0.324044793844223,
-0.12059681862592697,
0.13578835129737854,
0.13629050552845,
0.09854068607091904,
-0.1321391612291336,
-0.02271113730967045,
-0.018754245713353157,
-0.1379910707473755,
0.11446153372526169,
-0.09142790734767914,
0.120912104845047,
-0.037892017513513565,
0.07596635818481445,
0.0028597572818398476,
-0.0584384985268116,
0.12073198705911636,
0.023140931501984596,
0.0945015400648117,
-0.058765899389982224,
-0.033337973058223724,
0.03047853522002697,
-0.042802438139915466,
0.03446131944656372,
-0.09962396323680878,
0.029662422835826874,
-0.10281215608119965,
-0.0251001063734293,
-0.06927776336669922,
0.04619433730840683,
-0.04536258056759834,
-0.06819522380828857,
-0.037817176431417465,
0.025476092472672462,
0.04637615382671356,
-0.007411271333694458,
0.12242395430803299,
0.02384062111377716,
0.1488822102546692,
0.09686450660228729,
0.07455138862133026,
-0.06877368688583374,
-0.08208677172660828,
-0.026989970356225967,
-0.01078053005039692,
0.050466980785131454,
-0.1369129866361618,
0.019369233399629593,
0.15191002190113068,
0.020388750359416008,
0.15307851135730743,
0.08298779278993607,
-0.021846970543265343,
-0.00145810900721699,
0.059030331671237946,
-0.16558411717414856,
-0.09374777227640152,
-0.017512774094939232,
-0.06781873852014542,
-0.12064754962921143,
0.04518076777458191,
0.09283262491226196,
-0.06830855458974838,
-0.006686370354145765,
-0.004924751818180084,
0.013755558989942074,
-0.05032142624258995,
0.18429741263389587,
0.06282955408096313,
0.047867823392152786,
-0.096428282558918,
0.07266043871641159,
0.0449543371796608,
-0.07330744713544846,
0.0033405697904527187,
0.07132815569639206,
-0.08534601330757141,
-0.05450327321887016,
0.06432835012674332,
0.1912047564983368,
-0.043927162885665894,
-0.04855562746524811,
-0.1453658491373062,
-0.12287921458482742,
0.07764768600463867,
0.1408335417509079,
0.11843205243349075,
0.01058033388108015,
-0.06616745889186859,
0.0029106447473168373,
-0.10731884837150574,
0.10160065442323685,
0.045956265181303024,
0.06211207062005997,
-0.14301945269107819,
0.14211498200893402,
0.020740197971463203,
0.04820193350315094,
-0.01806846633553505,
0.023208048194646835,
-0.10020429641008377,
0.007697694003582001,
-0.09298595041036606,
-0.019537312909960747,
-0.029065001755952835,
0.011588165536522865,
-0.005960927344858646,
-0.04729988053441048,
-0.0542338490486145,
0.010628738440573215,
-0.10766087472438812,
-0.023693302646279335,
0.030114110559225082,
0.07296796143054962,
-0.10916557163000107,
-0.035426318645477295,
0.030875829979777336,
-0.06108306720852852,
0.07371184974908829,
0.04329424351453781,
0.015606247819960117,
0.050780802965164185,
-0.13902859389781952,
0.02026754431426525,
0.07313340902328491,
0.029131997376680374,
0.06079116463661194,
-0.09932733327150345,
-0.007924248464405537,
-0.00831547100096941,
0.039642333984375,
0.02172110602259636,
0.07442112267017365,
-0.1410936713218689,
0.0035281842574477196,
-0.02308511547744274,
-0.08286073803901672,
-0.06700614094734192,
0.028149420395493507,
0.08893175423145294,
0.018416542559862137,
0.19928090274333954,
-0.07619873434305191,
0.049508191645145416,
-0.21921874582767487,
0.007203821558505297,
-0.006482485681772232,
-0.11031077802181244,
-0.10131534934043884,
-0.07205703109502792,
0.05513612926006317,
-0.060717787593603134,
0.1499030590057373,
0.04670583829283714,
0.0190992783755064,
0.02442006766796112,
-0.011104658246040344,
0.0123064573854208,
0.009836219251155853,
0.18994872272014618,
0.030648769810795784,
-0.03457606956362724,
0.05985496938228607,
0.044767893850803375,
0.10333859920501709,
0.11458932608366013,
0.2000289261341095,
0.14501407742500305,
-0.009634922258555889,
0.09320678561925888,
0.043747033923864365,
-0.055918898433446884,
-0.15551309287548065,
0.05201669782400131,
-0.0348605252802372,
0.10937416553497314,
-0.02125314436852932,
0.22091102600097656,
0.06478510051965714,
-0.1696740686893463,
0.051610227674245834,
-0.05148177221417427,
-0.08721215277910233,
-0.11527423560619354,
-0.049710892140865326,
-0.07701697945594788,
-0.13180910050868988,
-0.003880183445289731,
-0.11571096628904343,
-0.0028426540084183216,
0.12518630921840668,
0.003630818100646138,
-0.027395818382501602,
0.15849998593330383,
0.014401617459952831,
0.022212907671928406,
0.06015627831220627,
0.008367235772311687,
-0.03863980621099472,
-0.14036662876605988,
-0.059315942227840424,
-0.012330079451203346,
-0.008793477900326252,
0.03116128407418728,
-0.06153199076652527,
-0.04473326727747917,
0.03117159940302372,
-0.02042582258582115,
-0.09601709246635437,
0.006006556563079357,
0.011131567880511284,
0.0533316507935524,
0.044897403568029404,
0.009516828693449497,
0.018703876063227654,
-0.0037758296821266413,
0.20052506029605865,
-0.0717770978808403,
-0.06498154252767563,
-0.10246375948190689,
0.23358504474163055,
0.0361536405980587,
-0.018478771671652794,
0.03453867882490158,
-0.06657195836305618,
0.004409546032547951,
0.24914872646331787,
0.21641024947166443,
-0.07975707203149796,
-0.0060849878937006,
0.016893045976758003,
-0.007865218445658684,
-0.021979933604598045,
0.09790797531604767,
0.14306801557540894,
0.04558771848678589,
-0.09230106323957443,
-0.04356169328093529,
-0.058740004897117615,
-0.0174243226647377,
-0.03374728187918663,
0.0693250447511673,
0.051049165427684784,
0.009482786059379578,
-0.03563119098544121,
0.0567263662815094,
-0.06666052341461182,
-0.09042596817016602,
0.05730389058589935,
-0.21876642107963562,
-0.16741296648979187,
-0.01652800291776657,
0.1028960645198822,
0.0017809192650020123,
0.061948828399181366,
-0.029001614078879356,
-0.003035155590623617,
0.09021490812301636,
-0.019437670707702637,
-0.09757450222969055,
-0.07405272871255875,
0.0848613753914833,
-0.11156534403562546,
0.21836979687213898,
-0.047714509069919586,
0.054364051669836044,
0.1254379153251648,
0.06760457158088684,
-0.0644994005560875,
0.06536801904439926,
0.04187343269586563,
-0.04156811535358429,
0.023007867857813835,
0.06869390606880188,
-0.03251289203763008,
0.06446241587400436,
0.047979000955820084,
-0.13706746697425842,
0.02370663918554783,
-0.04812363535165787,
-0.06919568032026291,
-0.043872371315956116,
-0.020693952217698097,
-0.06029629707336426,
0.12902742624282837,
0.2190595120191574,
-0.024821428582072258,
-0.00955967791378498,
-0.07230399549007416,
0.00883461069315672,
0.05615578591823578,
0.021983714774250984,
-0.057219840586185455,
-0.21056394279003143,
0.016822319477796555,
0.04565083608031273,
-0.01846429333090782,
-0.25154390931129456,
-0.10084811598062515,
0.004124476574361324,
-0.07295944541692734,
-0.0947342962026596,
0.07150569558143616,
0.08810579776763916,
0.054769519716501236,
-0.05578319728374481,
-0.04721960425376892,
-0.07467550039291382,
0.14914295077323914,
-0.1454761028289795,
-0.09100616723299026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-sst2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the GLUE SST-2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4028
- Accuracy: 0.9083
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.188 | 1.0 | 4210 | 0.3127 | 0.9037 |
| 0.1299 | 2.0 | 8420 | 0.3887 | 0.9048 |
| 0.0845 | 3.0 | 12630 | 0.4028 | 0.9083 |
| 0.0691 | 4.0 | 16840 | 0.3924 | 0.9071 |
| 0.052 | 5.0 | 21050 | 0.5047 | 0.9002 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
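## How to use (illustrative sketch)
The card reports evaluation numbers but no usage snippet. The sketch below is a minimal, hypothetical inference example, not part of the original card: it assumes the checkpoint is published on the Hugging Face Hub under the repository id recorded for this row, `anirudh21/distilbert-base-uncased-finetuned-sst2`, and that a recent `transformers` release is installed.
```python
# Hypothetical inference sketch for the fine-tuned SST-2 checkpoint.
# The model id below is taken from this record's repository id and is an
# assumption about where the weights are hosted.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="anirudh21/distilbert-base-uncased-finetuned-sst2",
)

print(classifier("A touching and beautifully shot film."))
# Depending on how the config was saved, the labels may appear as
# LABEL_0/LABEL_1 rather than NEGATIVE/POSITIVE.
```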
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "distilbert-base-uncased-finetuned-sst2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "sst2"}, "metrics": [{"type": "accuracy", "value": 0.908256880733945, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/distilbert-base-uncased-finetuned-sst2
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-sst2
======================================
This model is a fine-tuned version of distilbert-base-uncased on the GLUE SST-2 dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4028
* Accuracy: 0.9083
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10177629441022873,
0.09868992865085602,
-0.002423677360638976,
0.12112317979335785,
0.1650812178850174,
0.03426579013466835,
0.1299346536397934,
0.12770213186740875,
-0.08564593642950058,
0.021304525434970856,
0.1207699179649353,
0.16118186712265015,
0.023844653740525246,
0.10977903753519058,
-0.04856811463832855,
-0.26443108916282654,
-0.014250527136027813,
0.05101621896028519,
-0.05502324551343918,
0.13528308272361755,
0.09085173904895782,
-0.12123988568782806,
0.09099695831537247,
0.010560519993305206,
-0.19217939674854279,
0.0012736907228827477,
-0.00007869464025134221,
-0.05158104747533798,
0.1480625867843628,
0.026134053245186806,
0.1222379058599472,
0.004350635223090649,
0.08203282207250595,
-0.20211967825889587,
0.0109197236597538,
0.047976233065128326,
0.0034287353046238422,
0.09363500773906708,
0.04515067860484123,
0.002628615591675043,
0.12063173204660416,
-0.0803351029753685,
0.05412057787179947,
0.025187760591506958,
-0.11920642107725143,
-0.2130812108516693,
-0.07939155399799347,
0.036630984395742416,
0.07490450143814087,
0.10550613701343536,
-0.007987946271896362,
0.12026778608560562,
-0.08140391111373901,
0.09315772354602814,
0.22560088336467743,
-0.2846624255180359,
-0.06620592623949051,
0.04440826550126076,
0.014406891539692879,
0.04722030460834503,
-0.10258147865533829,
-0.03414628654718399,
0.04851415753364563,
0.0509343147277832,
0.1279968023300171,
-0.027470313012599945,
-0.11697384715080261,
0.006325297988951206,
-0.14067994058132172,
-0.031778451055288315,
0.16862210631370544,
0.04232211038470268,
-0.027326999232172966,
-0.05612191930413246,
-0.05705082044005394,
-0.1505020707845688,
-0.035502392798662186,
-0.01559263002127409,
0.049344535917043686,
-0.023357374593615532,
-0.04166480526328087,
-0.009466851130127907,
-0.10793615877628326,
-0.06508545577526093,
-0.07387420535087585,
0.11076346784830093,
0.03758743405342102,
0.006860504392534494,
-0.029825663194060326,
0.11246810108423233,
-0.007811444811522961,
-0.12158264964818954,
0.025255942717194557,
0.022893251851201057,
0.01350346952676773,
-0.03933148831129074,
-0.05274882912635803,
-0.06423640251159668,
0.01213061437010765,
0.12725643813610077,
-0.05570182576775551,
0.04239019751548767,
0.05044251307845116,
0.04930534213781357,
-0.09428779035806656,
0.19049058854579926,
-0.034276507794857025,
-0.025200827047228813,
0.0005440381937660277,
0.05053231865167618,
0.017217280343174934,
-0.011495225131511688,
-0.12165343761444092,
0.004494988825172186,
0.08820939809083939,
0.007856707088649273,
-0.06190384179353714,
0.07444976270198822,
-0.059118129312992096,
-0.024367066100239754,
0.0022545091342180967,
-0.09038243442773819,
0.021844087168574333,
0.0009826826862990856,
-0.07183235138654709,
-0.020876631140708923,
0.036078646779060364,
0.015619035810232162,
-0.01955125480890274,
0.1076565608382225,
-0.08774691820144653,
0.02745640277862549,
-0.09359776973724365,
-0.10990453511476517,
0.016444405540823936,
-0.10769655555486679,
0.022347012534737587,
-0.09183300286531448,
-0.179530531167984,
-0.017432404682040215,
0.05996386706829071,
-0.024156158789992332,
-0.057796917855739594,
-0.058523666113615036,
-0.06686114519834518,
0.011313402093946934,
-0.00775144575163722,
0.11726417392492294,
-0.06450776755809784,
0.09395397454500198,
0.025765664875507355,
0.06269765645265579,
-0.042549166828393936,
0.05981731042265892,
-0.10193105041980743,
0.013144214637577534,
-0.15104839205741882,
0.0400441437959671,
-0.05145822837948799,
0.06935621798038483,
-0.08194040507078171,
-0.10615845024585724,
0.002663051476702094,
-0.0028380511794239283,
0.06268789619207382,
0.09636878967285156,
-0.18458586931228638,
-0.08252619951963425,
0.1638277769088745,
-0.07281412184238434,
-0.12196049094200134,
0.12136763334274292,
-0.0580064132809639,
0.05861146003007889,
0.05951378867030144,
0.17990033328533173,
0.08656419813632965,
-0.07854954153299332,
0.0018196254968643188,
0.023756947368383408,
0.05026758089661598,
-0.06338667124509811,
0.06808595359325409,
0.0018259093631058931,
0.01989728771150112,
0.03576963394880295,
-0.027762150391936302,
0.0641099065542221,
-0.08784174174070358,
-0.09886786341667175,
-0.0402458980679512,
-0.08253508061170578,
0.0465523786842823,
0.07975026965141296,
0.06739030033349991,
-0.09414571523666382,
-0.07797343283891678,
0.05024334043264389,
0.08206507563591003,
-0.058022212237119675,
0.024602338671684265,
-0.049497149884700775,
0.07355623692274094,
-0.023844702169299126,
-0.021642537787556648,
-0.17968407273292542,
-0.032223403453826904,
0.007671972271054983,
0.0017039760714396834,
0.018056152388453484,
0.03337876871228218,
0.06358686834573746,
0.06016344204545021,
-0.05022745952010155,
-0.019244832918047905,
-0.0352572537958622,
-0.0006887496565468609,
-0.12579292058944702,
-0.19718441367149353,
-0.028450479730963707,
-0.02205497771501541,
0.16075244545936584,
-0.2082725465297699,
0.05154874175786972,
-0.014233555644750595,
0.06959009915590286,
0.012252251617610455,
-0.0065546054393053055,
-0.037181925028562546,
0.07644595205783844,
-0.04241294413805008,
-0.05049571767449379,
0.08215155452489853,
0.01450799684971571,
-0.09128501266241074,
-0.050215501338243484,
-0.09774215519428253,
0.1582167148590088,
0.12925271689891815,
-0.11003289371728897,
-0.07731720805168152,
-0.023380616679787636,
-0.0669984295964241,
-0.034903690218925476,
-0.04613172262907028,
0.026549331843852997,
0.1879379004240036,
-0.0049881828017532825,
0.14970116317272186,
-0.06918737292289734,
-0.043393924832344055,
0.018244462087750435,
-0.03694281727075577,
0.01636327989399433,
0.13443101942539215,
0.13418081402778625,
-0.06011265516281128,
0.15530750155448914,
0.14804664254188538,
-0.08511312305927277,
0.1510668396949768,
-0.04195278137922287,
-0.06577235460281372,
-0.01610550656914711,
-0.029684927314519882,
-0.011206655763089657,
0.10058020800352097,
-0.15690045058727264,
-0.002004367997869849,
0.030701281502842903,
0.015433641150593758,
0.02562039904296398,
-0.22722068428993225,
-0.04094555974006653,
0.03781639412045479,
-0.044886574149131775,
-0.006275756284594536,
-0.00595852779224515,
0.005161743611097336,
0.10115070641040802,
-0.0004888595431111753,
-0.08683908730745316,
0.03661135584115982,
0.0027518956921994686,
-0.08374463021755219,
0.2156449258327484,
-0.081173375248909,
-0.17226016521453857,
-0.13096117973327637,
-0.07049524784088135,
-0.047677770256996155,
-0.0015197350876405835,
0.06859971582889557,
-0.09709104150533676,
-0.02635144256055355,
-0.07209304720163345,
0.025860309600830078,
0.007600440643727779,
0.022702498361468315,
0.0029937936924397945,
0.007451718207448721,
0.06447025388479233,
-0.11222215741872787,
-0.015035846270620823,
-0.058496929705142975,
-0.04542740061879158,
0.044325705617666245,
0.02816983126103878,
0.11031338572502136,
0.15270480513572693,
-0.013083916157484055,
0.012781093828380108,
-0.03162777051329613,
0.23638051748275757,
-0.06001908704638481,
-0.0208893995732069,
0.1460668295621872,
-0.007206457667052746,
0.051826294511556625,
0.11371733248233795,
0.07455668598413467,
-0.07760798186063766,
0.003885192796587944,
0.037943821400403976,
-0.034547463059425354,
-0.23284892737865448,
-0.053616248071193695,
-0.05520634725689888,
0.011296672746539116,
0.089586041867733,
0.02373400144279003,
0.030841536819934845,
0.07044725120067596,
0.04116436094045639,
0.07381468266248703,
-0.037414710968732834,
0.05087532848119736,
0.1295747458934784,
0.030546288937330246,
0.12477082759141922,
-0.04690399020910263,
-0.06424792855978012,
0.040794432163238525,
-0.010863419622182846,
0.22334444522857666,
0.009917938150465488,
0.13131633400917053,
0.06607729941606522,
0.1649666577577591,
-0.009735438972711563,
0.07535355538129807,
-0.010470135137438774,
-0.03799179568886757,
-0.0162015613168478,
-0.03991588577628136,
-0.04029099643230438,
0.023612579330801964,
-0.06360199302434921,
0.06474608182907104,
-0.12359699606895447,
0.014717328362166882,
0.05894068628549576,
0.24849559366703033,
0.03377196192741394,
-0.3200716972351074,
-0.09709673374891281,
0.0007384793716482818,
-0.029719963669776917,
-0.019953938201069832,
0.026196908205747604,
0.09424502402544022,
-0.09753169119358063,
0.029457727447152138,
-0.07466296851634979,
0.0962631106376648,
-0.055720455944538116,
0.05116957798600197,
0.08169417083263397,
0.09054762125015259,
0.011883794330060482,
0.09314082562923431,
-0.2884000837802887,
0.27650073170661926,
0.0002981654542963952,
0.056524623185396194,
-0.07594560086727142,
0.008374286815524101,
0.041159119457006454,
0.06563553959131241,
0.07949315011501312,
-0.01224528532475233,
-0.01759219914674759,
-0.18658626079559326,
-0.06754712015390396,
0.027284175157546997,
0.06876907497644424,
-0.04146112501621246,
0.08209217339754105,
-0.031855158507823944,
0.008920346386730671,
0.07337794452905655,
0.0023068361915647984,
-0.053541041910648346,
-0.10786380618810654,
-0.005141935311257839,
0.022722337394952774,
-0.06008746474981308,
-0.06107710674405098,
-0.12087935954332352,
-0.12990501523017883,
0.155729740858078,
-0.035367779433727264,
-0.038886189460754395,
-0.106304831802845,
0.08325839787721634,
0.05899275466799736,
-0.08922765403985977,
0.0430733785033226,
0.0019881264306604862,
0.07539381086826324,
0.02154691517353058,
-0.07059939950704575,
0.10247620195150375,
-0.07363677769899368,
-0.1557348668575287,
-0.0653294250369072,
0.10667534172534943,
0.033229321241378784,
0.06670800596475601,
-0.014250626787543297,
0.00431025680154562,
-0.046546995639801025,
-0.08820211887359619,
0.020896978676319122,
0.004755881614983082,
0.07743469625711441,
0.018997633829712868,
-0.07543632388114929,
0.01201551128178835,
-0.06481624394655228,
-0.034086693078279495,
0.20510898530483246,
0.2221778780221939,
-0.09982394427061081,
0.024086005985736847,
0.025836989283561707,
-0.0738971009850502,
-0.19793148338794708,
0.03522804379463196,
0.05483577400445938,
0.008683490566909313,
0.04294142872095108,
-0.18465115129947662,
0.13146370649337769,
0.10747389495372772,
-0.011773521080613136,
0.10583452880382538,
-0.324044793844223,
-0.12059681862592697,
0.13578835129737854,
0.13629050552845,
0.09854068607091904,
-0.1321391612291336,
-0.02271113730967045,
-0.018754245713353157,
-0.1379910707473755,
0.11446153372526169,
-0.09142790734767914,
0.120912104845047,
-0.037892017513513565,
0.07596635818481445,
0.0028597572818398476,
-0.0584384985268116,
0.12073198705911636,
0.023140931501984596,
0.0945015400648117,
-0.058765899389982224,
-0.033337973058223724,
0.03047853522002697,
-0.042802438139915466,
0.03446131944656372,
-0.09962396323680878,
0.029662422835826874,
-0.10281215608119965,
-0.0251001063734293,
-0.06927776336669922,
0.04619433730840683,
-0.04536258056759834,
-0.06819522380828857,
-0.037817176431417465,
0.025476092472672462,
0.04637615382671356,
-0.007411271333694458,
0.12242395430803299,
0.02384062111377716,
0.1488822102546692,
0.09686450660228729,
0.07455138862133026,
-0.06877368688583374,
-0.08208677172660828,
-0.026989970356225967,
-0.01078053005039692,
0.050466980785131454,
-0.1369129866361618,
0.019369233399629593,
0.15191002190113068,
0.020388750359416008,
0.15307851135730743,
0.08298779278993607,
-0.021846970543265343,
-0.00145810900721699,
0.059030331671237946,
-0.16558411717414856,
-0.09374777227640152,
-0.017512774094939232,
-0.06781873852014542,
-0.12064754962921143,
0.04518076777458191,
0.09283262491226196,
-0.06830855458974838,
-0.006686370354145765,
-0.004924751818180084,
0.013755558989942074,
-0.05032142624258995,
0.18429741263389587,
0.06282955408096313,
0.047867823392152786,
-0.096428282558918,
0.07266043871641159,
0.0449543371796608,
-0.07330744713544846,
0.0033405697904527187,
0.07132815569639206,
-0.08534601330757141,
-0.05450327321887016,
0.06432835012674332,
0.1912047564983368,
-0.043927162885665894,
-0.04855562746524811,
-0.1453658491373062,
-0.12287921458482742,
0.07764768600463867,
0.1408335417509079,
0.11843205243349075,
0.01058033388108015,
-0.06616745889186859,
0.0029106447473168373,
-0.10731884837150574,
0.10160065442323685,
0.045956265181303024,
0.06211207062005997,
-0.14301945269107819,
0.14211498200893402,
0.020740197971463203,
0.04820193350315094,
-0.01806846633553505,
0.023208048194646835,
-0.10020429641008377,
0.007697694003582001,
-0.09298595041036606,
-0.019537312909960747,
-0.029065001755952835,
0.011588165536522865,
-0.005960927344858646,
-0.04729988053441048,
-0.0542338490486145,
0.010628738440573215,
-0.10766087472438812,
-0.023693302646279335,
0.030114110559225082,
0.07296796143054962,
-0.10916557163000107,
-0.035426318645477295,
0.030875829979777336,
-0.06108306720852852,
0.07371184974908829,
0.04329424351453781,
0.015606247819960117,
0.050780802965164185,
-0.13902859389781952,
0.02026754431426525,
0.07313340902328491,
0.029131997376680374,
0.06079116463661194,
-0.09932733327150345,
-0.007924248464405537,
-0.00831547100096941,
0.039642333984375,
0.02172110602259636,
0.07442112267017365,
-0.1410936713218689,
0.0035281842574477196,
-0.02308511547744274,
-0.08286073803901672,
-0.06700614094734192,
0.028149420395493507,
0.08893175423145294,
0.018416542559862137,
0.19928090274333954,
-0.07619873434305191,
0.049508191645145416,
-0.21921874582767487,
0.007203821558505297,
-0.006482485681772232,
-0.11031077802181244,
-0.10131534934043884,
-0.07205703109502792,
0.05513612926006317,
-0.060717787593603134,
0.1499030590057373,
0.04670583829283714,
0.0190992783755064,
0.02442006766796112,
-0.011104658246040344,
0.0123064573854208,
0.009836219251155853,
0.18994872272014618,
0.030648769810795784,
-0.03457606956362724,
0.05985496938228607,
0.044767893850803375,
0.10333859920501709,
0.11458932608366013,
0.2000289261341095,
0.14501407742500305,
-0.009634922258555889,
0.09320678561925888,
0.043747033923864365,
-0.055918898433446884,
-0.15551309287548065,
0.05201669782400131,
-0.0348605252802372,
0.10937416553497314,
-0.02125314436852932,
0.22091102600097656,
0.06478510051965714,
-0.1696740686893463,
0.051610227674245834,
-0.05148177221417427,
-0.08721215277910233,
-0.11527423560619354,
-0.049710892140865326,
-0.07701697945594788,
-0.13180910050868988,
-0.003880183445289731,
-0.11571096628904343,
-0.0028426540084183216,
0.12518630921840668,
0.003630818100646138,
-0.027395818382501602,
0.15849998593330383,
0.014401617459952831,
0.022212907671928406,
0.06015627831220627,
0.008367235772311687,
-0.03863980621099472,
-0.14036662876605988,
-0.059315942227840424,
-0.012330079451203346,
-0.008793477900326252,
0.03116128407418728,
-0.06153199076652527,
-0.04473326727747917,
0.03117159940302372,
-0.02042582258582115,
-0.09601709246635437,
0.006006556563079357,
0.011131567880511284,
0.0533316507935524,
0.044897403568029404,
0.009516828693449497,
0.018703876063227654,
-0.0037758296821266413,
0.20052506029605865,
-0.0717770978808403,
-0.06498154252767563,
-0.10246375948190689,
0.23358504474163055,
0.0361536405980587,
-0.018478771671652794,
0.03453867882490158,
-0.06657195836305618,
0.004409546032547951,
0.24914872646331787,
0.21641024947166443,
-0.07975707203149796,
-0.0060849878937006,
0.016893045976758003,
-0.007865218445658684,
-0.021979933604598045,
0.09790797531604767,
0.14306801557540894,
0.04558771848678589,
-0.09230106323957443,
-0.04356169328093529,
-0.058740004897117615,
-0.0174243226647377,
-0.03374728187918663,
0.0693250447511673,
0.051049165427684784,
0.009482786059379578,
-0.03563119098544121,
0.0567263662815094,
-0.06666052341461182,
-0.09042596817016602,
0.05730389058589935,
-0.21876642107963562,
-0.16741296648979187,
-0.01652800291776657,
0.1028960645198822,
0.0017809192650020123,
0.061948828399181366,
-0.029001614078879356,
-0.003035155590623617,
0.09021490812301636,
-0.019437670707702637,
-0.09757450222969055,
-0.07405272871255875,
0.0848613753914833,
-0.11156534403562546,
0.21836979687213898,
-0.047714509069919586,
0.054364051669836044,
0.1254379153251648,
0.06760457158088684,
-0.0644994005560875,
0.06536801904439926,
0.04187343269586563,
-0.04156811535358429,
0.023007867857813835,
0.06869390606880188,
-0.03251289203763008,
0.06446241587400436,
0.047979000955820084,
-0.13706746697425842,
0.02370663918554783,
-0.04812363535165787,
-0.06919568032026291,
-0.043872371315956116,
-0.020693952217698097,
-0.06029629707336426,
0.12902742624282837,
0.2190595120191574,
-0.024821428582072258,
-0.00955967791378498,
-0.07230399549007416,
0.00883461069315672,
0.05615578591823578,
0.021983714774250984,
-0.057219840586185455,
-0.21056394279003143,
0.016822319477796555,
0.04565083608031273,
-0.01846429333090782,
-0.25154390931129456,
-0.10084811598062515,
0.004124476574361324,
-0.07295944541692734,
-0.0947342962026596,
0.07150569558143616,
0.08810579776763916,
0.054769519716501236,
-0.05578319728374481,
-0.04721960425376892,
-0.07467550039291382,
0.14914295077323914,
-0.1454761028289795,
-0.09100616723299026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-wnli
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6883
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6883 | 0.5634 |
| No log | 2.0 | 80 | 0.6934 | 0.5634 |
| No log | 3.0 | 120 | 0.6960 | 0.5211 |
| No log | 4.0 | 160 | 0.6958 | 0.5634 |
| No log | 5.0 | 200 | 0.6964 | 0.5634 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
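## Reproducing the configuration (sketch)
The hyperparameters listed above map directly onto the `Trainer` API. The sketch below restates them in code for illustration only; the output directory, the per-epoch evaluation strategy (implied by the validation table), and anything not listed in the card are assumptions rather than details taken from the original training run.
```python
# Reconstruction of the reported hyperparameters as TrainingArguments.
# Model/dataset loading and preprocessing are omitted; only the values
# stated in the card are reproduced, the rest are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-wnli",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # implied by the per-epoch validation table
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
    # so no optimizer override is needed.
)
```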
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "distilbert-base-uncased-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5633802816901409, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/distilbert-base-uncased-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
distilbert-base-uncased-finetuned-wnli
======================================
This model is a fine-tuned version of distilbert-base-uncased on the GLUE WNLI dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6883
* Accuracy: 0.5634
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
67,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #distilbert #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10177629441022873,
0.09868992865085602,
-0.002423677360638976,
0.12112317979335785,
0.1650812178850174,
0.03426579013466835,
0.1299346536397934,
0.12770213186740875,
-0.08564593642950058,
0.021304525434970856,
0.1207699179649353,
0.16118186712265015,
0.023844653740525246,
0.10977903753519058,
-0.04856811463832855,
-0.26443108916282654,
-0.014250527136027813,
0.05101621896028519,
-0.05502324551343918,
0.13528308272361755,
0.09085173904895782,
-0.12123988568782806,
0.09099695831537247,
0.010560519993305206,
-0.19217939674854279,
0.0012736907228827477,
-0.00007869464025134221,
-0.05158104747533798,
0.1480625867843628,
0.026134053245186806,
0.1222379058599472,
0.004350635223090649,
0.08203282207250595,
-0.20211967825889587,
0.0109197236597538,
0.047976233065128326,
0.0034287353046238422,
0.09363500773906708,
0.04515067860484123,
0.002628615591675043,
0.12063173204660416,
-0.0803351029753685,
0.05412057787179947,
0.025187760591506958,
-0.11920642107725143,
-0.2130812108516693,
-0.07939155399799347,
0.036630984395742416,
0.07490450143814087,
0.10550613701343536,
-0.007987946271896362,
0.12026778608560562,
-0.08140391111373901,
0.09315772354602814,
0.22560088336467743,
-0.2846624255180359,
-0.06620592623949051,
0.04440826550126076,
0.014406891539692879,
0.04722030460834503,
-0.10258147865533829,
-0.03414628654718399,
0.04851415753364563,
0.0509343147277832,
0.1279968023300171,
-0.027470313012599945,
-0.11697384715080261,
0.006325297988951206,
-0.14067994058132172,
-0.031778451055288315,
0.16862210631370544,
0.04232211038470268,
-0.027326999232172966,
-0.05612191930413246,
-0.05705082044005394,
-0.1505020707845688,
-0.035502392798662186,
-0.01559263002127409,
0.049344535917043686,
-0.023357374593615532,
-0.04166480526328087,
-0.009466851130127907,
-0.10793615877628326,
-0.06508545577526093,
-0.07387420535087585,
0.11076346784830093,
0.03758743405342102,
0.006860504392534494,
-0.029825663194060326,
0.11246810108423233,
-0.007811444811522961,
-0.12158264964818954,
0.025255942717194557,
0.022893251851201057,
0.01350346952676773,
-0.03933148831129074,
-0.05274882912635803,
-0.06423640251159668,
0.01213061437010765,
0.12725643813610077,
-0.05570182576775551,
0.04239019751548767,
0.05044251307845116,
0.04930534213781357,
-0.09428779035806656,
0.19049058854579926,
-0.034276507794857025,
-0.025200827047228813,
0.0005440381937660277,
0.05053231865167618,
0.017217280343174934,
-0.011495225131511688,
-0.12165343761444092,
0.004494988825172186,
0.08820939809083939,
0.007856707088649273,
-0.06190384179353714,
0.07444976270198822,
-0.059118129312992096,
-0.024367066100239754,
0.0022545091342180967,
-0.09038243442773819,
0.021844087168574333,
0.0009826826862990856,
-0.07183235138654709,
-0.020876631140708923,
0.036078646779060364,
0.015619035810232162,
-0.01955125480890274,
0.1076565608382225,
-0.08774691820144653,
0.02745640277862549,
-0.09359776973724365,
-0.10990453511476517,
0.016444405540823936,
-0.10769655555486679,
0.022347012534737587,
-0.09183300286531448,
-0.179530531167984,
-0.017432404682040215,
0.05996386706829071,
-0.024156158789992332,
-0.057796917855739594,
-0.058523666113615036,
-0.06686114519834518,
0.011313402093946934,
-0.00775144575163722,
0.11726417392492294,
-0.06450776755809784,
0.09395397454500198,
0.025765664875507355,
0.06269765645265579,
-0.042549166828393936,
0.05981731042265892,
-0.10193105041980743,
0.013144214637577534,
-0.15104839205741882,
0.0400441437959671,
-0.05145822837948799,
0.06935621798038483,
-0.08194040507078171,
-0.10615845024585724,
0.002663051476702094,
-0.0028380511794239283,
0.06268789619207382,
0.09636878967285156,
-0.18458586931228638,
-0.08252619951963425,
0.1638277769088745,
-0.07281412184238434,
-0.12196049094200134,
0.12136763334274292,
-0.0580064132809639,
0.05861146003007889,
0.05951378867030144,
0.17990033328533173,
0.08656419813632965,
-0.07854954153299332,
0.0018196254968643188,
0.023756947368383408,
0.05026758089661598,
-0.06338667124509811,
0.06808595359325409,
0.0018259093631058931,
0.01989728771150112,
0.03576963394880295,
-0.027762150391936302,
0.0641099065542221,
-0.08784174174070358,
-0.09886786341667175,
-0.0402458980679512,
-0.08253508061170578,
0.0465523786842823,
0.07975026965141296,
0.06739030033349991,
-0.09414571523666382,
-0.07797343283891678,
0.05024334043264389,
0.08206507563591003,
-0.058022212237119675,
0.024602338671684265,
-0.049497149884700775,
0.07355623692274094,
-0.023844702169299126,
-0.021642537787556648,
-0.17968407273292542,
-0.032223403453826904,
0.007671972271054983,
0.0017039760714396834,
0.018056152388453484,
0.03337876871228218,
0.06358686834573746,
0.06016344204545021,
-0.05022745952010155,
-0.019244832918047905,
-0.0352572537958622,
-0.0006887496565468609,
-0.12579292058944702,
-0.19718441367149353,
-0.028450479730963707,
-0.02205497771501541,
0.16075244545936584,
-0.2082725465297699,
0.05154874175786972,
-0.014233555644750595,
0.06959009915590286,
0.012252251617610455,
-0.0065546054393053055,
-0.037181925028562546,
0.07644595205783844,
-0.04241294413805008,
-0.05049571767449379,
0.08215155452489853,
0.01450799684971571,
-0.09128501266241074,
-0.050215501338243484,
-0.09774215519428253,
0.1582167148590088,
0.12925271689891815,
-0.11003289371728897,
-0.07731720805168152,
-0.023380616679787636,
-0.0669984295964241,
-0.034903690218925476,
-0.04613172262907028,
0.026549331843852997,
0.1879379004240036,
-0.0049881828017532825,
0.14970116317272186,
-0.06918737292289734,
-0.043393924832344055,
0.018244462087750435,
-0.03694281727075577,
0.01636327989399433,
0.13443101942539215,
0.13418081402778625,
-0.06011265516281128,
0.15530750155448914,
0.14804664254188538,
-0.08511312305927277,
0.1510668396949768,
-0.04195278137922287,
-0.06577235460281372,
-0.01610550656914711,
-0.029684927314519882,
-0.011206655763089657,
0.10058020800352097,
-0.15690045058727264,
-0.002004367997869849,
0.030701281502842903,
0.015433641150593758,
0.02562039904296398,
-0.22722068428993225,
-0.04094555974006653,
0.03781639412045479,
-0.044886574149131775,
-0.006275756284594536,
-0.00595852779224515,
0.005161743611097336,
0.10115070641040802,
-0.0004888595431111753,
-0.08683908730745316,
0.03661135584115982,
0.0027518956921994686,
-0.08374463021755219,
0.2156449258327484,
-0.081173375248909,
-0.17226016521453857,
-0.13096117973327637,
-0.07049524784088135,
-0.047677770256996155,
-0.0015197350876405835,
0.06859971582889557,
-0.09709104150533676,
-0.02635144256055355,
-0.07209304720163345,
0.025860309600830078,
0.007600440643727779,
0.022702498361468315,
0.0029937936924397945,
0.007451718207448721,
0.06447025388479233,
-0.11222215741872787,
-0.015035846270620823,
-0.058496929705142975,
-0.04542740061879158,
0.044325705617666245,
0.02816983126103878,
0.11031338572502136,
0.15270480513572693,
-0.013083916157484055,
0.012781093828380108,
-0.03162777051329613,
0.23638051748275757,
-0.06001908704638481,
-0.0208893995732069,
0.1460668295621872,
-0.007206457667052746,
0.051826294511556625,
0.11371733248233795,
0.07455668598413467,
-0.07760798186063766,
0.003885192796587944,
0.037943821400403976,
-0.034547463059425354,
-0.23284892737865448,
-0.053616248071193695,
-0.05520634725689888,
0.011296672746539116,
0.089586041867733,
0.02373400144279003,
0.030841536819934845,
0.07044725120067596,
0.04116436094045639,
0.07381468266248703,
-0.037414710968732834,
0.05087532848119736,
0.1295747458934784,
0.030546288937330246,
0.12477082759141922,
-0.04690399020910263,
-0.06424792855978012,
0.040794432163238525,
-0.010863419622182846,
0.22334444522857666,
0.009917938150465488,
0.13131633400917053,
0.06607729941606522,
0.1649666577577591,
-0.009735438972711563,
0.07535355538129807,
-0.010470135137438774,
-0.03799179568886757,
-0.0162015613168478,
-0.03991588577628136,
-0.04029099643230438,
0.023612579330801964,
-0.06360199302434921,
0.06474608182907104,
-0.12359699606895447,
0.014717328362166882,
0.05894068628549576,
0.24849559366703033,
0.03377196192741394,
-0.3200716972351074,
-0.09709673374891281,
0.0007384793716482818,
-0.029719963669776917,
-0.019953938201069832,
0.026196908205747604,
0.09424502402544022,
-0.09753169119358063,
0.029457727447152138,
-0.07466296851634979,
0.0962631106376648,
-0.055720455944538116,
0.05116957798600197,
0.08169417083263397,
0.09054762125015259,
0.011883794330060482,
0.09314082562923431,
-0.2884000837802887,
0.27650073170661926,
0.0002981654542963952,
0.056524623185396194,
-0.07594560086727142,
0.008374286815524101,
0.041159119457006454,
0.06563553959131241,
0.07949315011501312,
-0.01224528532475233,
-0.01759219914674759,
-0.18658626079559326,
-0.06754712015390396,
0.027284175157546997,
0.06876907497644424,
-0.04146112501621246,
0.08209217339754105,
-0.031855158507823944,
0.008920346386730671,
0.07337794452905655,
0.0023068361915647984,
-0.053541041910648346,
-0.10786380618810654,
-0.005141935311257839,
0.022722337394952774,
-0.06008746474981308,
-0.06107710674405098,
-0.12087935954332352,
-0.12990501523017883,
0.155729740858078,
-0.035367779433727264,
-0.038886189460754395,
-0.106304831802845,
0.08325839787721634,
0.05899275466799736,
-0.08922765403985977,
0.0430733785033226,
0.0019881264306604862,
0.07539381086826324,
0.02154691517353058,
-0.07059939950704575,
0.10247620195150375,
-0.07363677769899368,
-0.1557348668575287,
-0.0653294250369072,
0.10667534172534943,
0.033229321241378784,
0.06670800596475601,
-0.014250626787543297,
0.00431025680154562,
-0.046546995639801025,
-0.08820211887359619,
0.020896978676319122,
0.004755881614983082,
0.07743469625711441,
0.018997633829712868,
-0.07543632388114929,
0.01201551128178835,
-0.06481624394655228,
-0.034086693078279495,
0.20510898530483246,
0.2221778780221939,
-0.09982394427061081,
0.024086005985736847,
0.025836989283561707,
-0.0738971009850502,
-0.19793148338794708,
0.03522804379463196,
0.05483577400445938,
0.008683490566909313,
0.04294142872095108,
-0.18465115129947662,
0.13146370649337769,
0.10747389495372772,
-0.011773521080613136,
0.10583452880382538,
-0.324044793844223,
-0.12059681862592697,
0.13578835129737854,
0.13629050552845,
0.09854068607091904,
-0.1321391612291336,
-0.02271113730967045,
-0.018754245713353157,
-0.1379910707473755,
0.11446153372526169,
-0.09142790734767914,
0.120912104845047,
-0.037892017513513565,
0.07596635818481445,
0.0028597572818398476,
-0.0584384985268116,
0.12073198705911636,
0.023140931501984596,
0.0945015400648117,
-0.058765899389982224,
-0.033337973058223724,
0.03047853522002697,
-0.042802438139915466,
0.03446131944656372,
-0.09962396323680878,
0.029662422835826874,
-0.10281215608119965,
-0.0251001063734293,
-0.06927776336669922,
0.04619433730840683,
-0.04536258056759834,
-0.06819522380828857,
-0.037817176431417465,
0.025476092472672462,
0.04637615382671356,
-0.007411271333694458,
0.12242395430803299,
0.02384062111377716,
0.1488822102546692,
0.09686450660228729,
0.07455138862133026,
-0.06877368688583374,
-0.08208677172660828,
-0.026989970356225967,
-0.01078053005039692,
0.050466980785131454,
-0.1369129866361618,
0.019369233399629593,
0.15191002190113068,
0.020388750359416008,
0.15307851135730743,
0.08298779278993607,
-0.021846970543265343,
-0.00145810900721699,
0.059030331671237946,
-0.16558411717414856,
-0.09374777227640152,
-0.017512774094939232,
-0.06781873852014542,
-0.12064754962921143,
0.04518076777458191,
0.09283262491226196,
-0.06830855458974838,
-0.006686370354145765,
-0.004924751818180084,
0.013755558989942074,
-0.05032142624258995,
0.18429741263389587,
0.06282955408096313,
0.047867823392152786,
-0.096428282558918,
0.07266043871641159,
0.0449543371796608,
-0.07330744713544846,
0.0033405697904527187,
0.07132815569639206,
-0.08534601330757141,
-0.05450327321887016,
0.06432835012674332,
0.1912047564983368,
-0.043927162885665894,
-0.04855562746524811,
-0.1453658491373062,
-0.12287921458482742,
0.07764768600463867,
0.1408335417509079,
0.11843205243349075,
0.01058033388108015,
-0.06616745889186859,
0.0029106447473168373,
-0.10731884837150574,
0.10160065442323685,
0.045956265181303024,
0.06211207062005997,
-0.14301945269107819,
0.14211498200893402,
0.020740197971463203,
0.04820193350315094,
-0.01806846633553505,
0.023208048194646835,
-0.10020429641008377,
0.007697694003582001,
-0.09298595041036606,
-0.019537312909960747,
-0.029065001755952835,
0.011588165536522865,
-0.005960927344858646,
-0.04729988053441048,
-0.0542338490486145,
0.010628738440573215,
-0.10766087472438812,
-0.023693302646279335,
0.030114110559225082,
0.07296796143054962,
-0.10916557163000107,
-0.035426318645477295,
0.030875829979777336,
-0.06108306720852852,
0.07371184974908829,
0.04329424351453781,
0.015606247819960117,
0.050780802965164185,
-0.13902859389781952,
0.02026754431426525,
0.07313340902328491,
0.029131997376680374,
0.06079116463661194,
-0.09932733327150345,
-0.007924248464405537,
-0.00831547100096941,
0.039642333984375,
0.02172110602259636,
0.07442112267017365,
-0.1410936713218689,
0.0035281842574477196,
-0.02308511547744274,
-0.08286073803901672,
-0.06700614094734192,
0.028149420395493507,
0.08893175423145294,
0.018416542559862137,
0.19928090274333954,
-0.07619873434305191,
0.049508191645145416,
-0.21921874582767487,
0.007203821558505297,
-0.006482485681772232,
-0.11031077802181244,
-0.10131534934043884,
-0.07205703109502792,
0.05513612926006317,
-0.060717787593603134,
0.1499030590057373,
0.04670583829283714,
0.0190992783755064,
0.02442006766796112,
-0.011104658246040344,
0.0123064573854208,
0.009836219251155853,
0.18994872272014618,
0.030648769810795784,
-0.03457606956362724,
0.05985496938228607,
0.044767893850803375,
0.10333859920501709,
0.11458932608366013,
0.2000289261341095,
0.14501407742500305,
-0.009634922258555889,
0.09320678561925888,
0.043747033923864365,
-0.055918898433446884,
-0.15551309287548065,
0.05201669782400131,
-0.0348605252802372,
0.10937416553497314,
-0.02125314436852932,
0.22091102600097656,
0.06478510051965714,
-0.1696740686893463,
0.051610227674245834,
-0.05148177221417427,
-0.08721215277910233,
-0.11527423560619354,
-0.049710892140865326,
-0.07701697945594788,
-0.13180910050868988,
-0.003880183445289731,
-0.11571096628904343,
-0.0028426540084183216,
0.12518630921840668,
0.003630818100646138,
-0.027395818382501602,
0.15849998593330383,
0.014401617459952831,
0.022212907671928406,
0.06015627831220627,
0.008367235772311687,
-0.03863980621099472,
-0.14036662876605988,
-0.059315942227840424,
-0.012330079451203346,
-0.008793477900326252,
0.03116128407418728,
-0.06153199076652527,
-0.04473326727747917,
0.03117159940302372,
-0.02042582258582115,
-0.09601709246635437,
0.006006556563079357,
0.011131567880511284,
0.0533316507935524,
0.044897403568029404,
0.009516828693449497,
0.018703876063227654,
-0.0037758296821266413,
0.20052506029605865,
-0.0717770978808403,
-0.06498154252767563,
-0.10246375948190689,
0.23358504474163055,
0.0361536405980587,
-0.018478771671652794,
0.03453867882490158,
-0.06657195836305618,
0.004409546032547951,
0.24914872646331787,
0.21641024947166443,
-0.07975707203149796,
-0.0060849878937006,
0.016893045976758003,
-0.007865218445658684,
-0.021979933604598045,
0.09790797531604767,
0.14306801557540894,
0.04558771848678589,
-0.09230106323957443,
-0.04356169328093529,
-0.058740004897117615,
-0.0174243226647377,
-0.03374728187918663,
0.0693250447511673,
0.051049165427684784,
0.009482786059379578,
-0.03563119098544121,
0.0567263662815094,
-0.06666052341461182,
-0.09042596817016602,
0.05730389058589935,
-0.21876642107963562,
-0.16741296648979187,
-0.01652800291776657,
0.1028960645198822,
0.0017809192650020123,
0.061948828399181366,
-0.029001614078879356,
-0.003035155590623617,
0.09021490812301636,
-0.019437670707702637,
-0.09757450222969055,
-0.07405272871255875,
0.0848613753914833,
-0.11156534403562546,
0.21836979687213898,
-0.047714509069919586,
0.054364051669836044,
0.1254379153251648,
0.06760457158088684,
-0.0644994005560875,
0.06536801904439926,
0.04187343269586563,
-0.04156811535358429,
0.023007867857813835,
0.06869390606880188,
-0.03251289203763008,
0.06446241587400436,
0.047979000955820084,
-0.13706746697425842,
0.02370663918554783,
-0.04812363535165787,
-0.06919568032026291,
-0.043872371315956116,
-0.020693952217698097,
-0.06029629707336426,
0.12902742624282837,
0.2190595120191574,
-0.024821428582072258,
-0.00955967791378498,
-0.07230399549007416,
0.00883461069315672,
0.05615578591823578,
0.021983714774250984,
-0.057219840586185455,
-0.21056394279003143,
0.016822319477796555,
0.04565083608031273,
-0.01846429333090782,
-0.25154390931129456,
-0.10084811598062515,
0.004124476574361324,
-0.07295944541692734,
-0.0947342962026596,
0.07150569558143616,
0.08810579776763916,
0.054769519716501236,
-0.05578319728374481,
-0.04721960425376892,
-0.07467550039291382,
0.14914295077323914,
-0.1454761028289795,
-0.09100616723299026
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# electra-base-discriminator-finetuned-rte
This model is a fine-tuned version of [google/electra-base-discriminator](https://huggingface.co/google/electra-base-discriminator) on the GLUE RTE dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4793
- Accuracy: 0.8231
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 156 | 0.6076 | 0.6570 |
| No log | 2.0 | 312 | 0.4824 | 0.7762 |
| No log | 3.0 | 468 | 0.4793 | 0.8231 |
| 0.4411 | 4.0 | 624 | 0.7056 | 0.7906 |
| 0.4411 | 5.0 | 780 | 0.6849 | 0.8159 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
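## Evaluation sketch (illustrative)
A short, hypothetical script for checking the reported accuracy on the GLUE RTE validation split is given below. The model id comes from this record's repository id, and the use of `datasets.load_metric` matches the `Datasets 1.18.0` version listed above; both are assumptions about how the evaluation could be rerun, not details from the original card.
```python
# Hypothetical re-evaluation of the fine-tuned ELECTRA checkpoint on GLUE RTE.
import torch
from datasets import load_dataset, load_metric
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "anirudh21/electra-base-discriminator-finetuned-rte"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

rte = load_dataset("glue", "rte", split="validation")
metric = load_metric("glue", "rte")

for example in rte:
    inputs = tokenizer(
        example["sentence1"],
        example["sentence2"],
        truncation=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        prediction = model(**inputs).logits.argmax(dim=-1).item()
    metric.add(prediction=prediction, reference=example["label"])

print(metric.compute())  # expected to land near the reported accuracy of 0.8231
```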
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "electra-base-discriminator-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.8231046931407943, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/electra-base-discriminator-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"electra",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #electra #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
electra-base-discriminator-finetuned-rte
========================================
This model is a fine-tuned version of google/electra-base-discriminator on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.4793
* Accuracy: 0.8231
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #electra #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #electra #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10993218421936035,
0.09303925931453705,
-0.0018562558107078075,
0.11975739896297455,
0.17762814462184906,
0.03187306597828865,
0.09995214641094208,
0.12710557878017426,
-0.08821161836385727,
0.02938605286180973,
0.13181909918785095,
0.17248812317848206,
0.02147456631064415,
0.11136481910943985,
-0.05232442170381546,
-0.26265934109687805,
-0.0085792550817132,
0.04750598967075348,
-0.06312747299671173,
0.13089106976985931,
0.09350745379924774,
-0.12371840327978134,
0.0995192676782608,
0.019917162135243416,
-0.18416346609592438,
0.008807512931525707,
-0.000048867084842640907,
-0.06115612015128136,
0.14592516422271729,
0.034306108951568604,
0.11762060970067978,
-0.00911291129887104,
0.08563094586133957,
-0.1951460987329483,
0.013859066180884838,
0.0488617904484272,
-0.002918922109529376,
0.09141622483730316,
0.035062674432992935,
0.003967548720538616,
0.1587071418762207,
-0.0852406695485115,
0.05617314204573631,
0.02328403666615486,
-0.11948850005865097,
-0.20560768246650696,
-0.0779259130358696,
0.030848832800984383,
0.08741845935583115,
0.11399535834789276,
-0.005196638870984316,
0.1362311691045761,
-0.08127991110086441,
0.08750331401824951,
0.2212054431438446,
-0.2868227958679199,
-0.06537944823503494,
0.0169778224080801,
0.018395235762000084,
0.035376835614442825,
-0.09749932587146759,
-0.03057807683944702,
0.057718969881534576,
0.04944853112101555,
0.11963897198438644,
-0.023592226207256317,
-0.10478898137807846,
0.01874537207186222,
-0.13447271287441254,
-0.04013517126441002,
0.173468217253685,
0.04529625549912453,
-0.03147750347852707,
-0.05682028830051422,
-0.053107790648937225,
-0.1579091101884842,
-0.030751265585422516,
-0.005766710266470909,
0.042122676968574524,
-0.024105016142129898,
-0.043130431324243546,
0.0011519145919010043,
-0.1125771775841713,
-0.06111526861786842,
-0.0783105120062828,
0.12700434029102325,
0.0365166962146759,
0.018269825726747513,
-0.03523501753807068,
0.10577011853456497,
0.006595340557396412,
-0.12484759092330933,
0.0063986945897340775,
0.026714814826846123,
0.011504312977194786,
-0.04626300185918808,
-0.053255971521139145,
-0.06518927961587906,
0.016250334680080414,
0.1351046860218048,
-0.034201495349407196,
0.042291734367609024,
0.0633799359202385,
0.04074253514409065,
-0.08117340505123138,
0.18557307124137878,
-0.03318364545702934,
-0.011626181192696095,
0.012305648066103458,
0.03446268290281296,
0.026384979486465454,
-0.01277640275657177,
-0.12349376082420349,
0.003581979777663946,
0.07299818843603134,
0.013957533985376358,
-0.06314987689256668,
0.06885430961847305,
-0.05203275755047798,
-0.021058836951851845,
0.010806522332131863,
-0.08922680467367172,
0.029732314869761467,
-0.004704609513282776,
-0.07978168874979019,
-0.01965859718620777,
0.03036697395145893,
0.025049975141882896,
-0.021466488018631935,
0.10741426050662994,
-0.09047967940568924,
0.024893928319215775,
-0.09548268467187881,
-0.10385119169950485,
0.02405233308672905,
-0.12066178768873215,
0.04083560034632683,
-0.10284851491451263,
-0.19468048214912415,
-0.012204325757920742,
0.0591849684715271,
-0.024119358509778976,
-0.0751781314611435,
-0.057571858167648315,
-0.06249626725912094,
0.02043125592172146,
-0.011472281999886036,
0.11457894742488861,
-0.06786517798900604,
0.08566759526729584,
0.032077889889478683,
0.05974006652832031,
-0.03833478316664696,
0.05214707553386688,
-0.10209894180297852,
0.008537751622498035,
-0.14924949407577515,
0.030569937080144882,
-0.03233163803815842,
0.08155159652233124,
-0.0876268818974495,
-0.09882117807865143,
0.011876729317009449,
0.0030013099312782288,
0.06287110596895218,
0.09573369473218918,
-0.1693664789199829,
-0.07770423591136932,
0.14866259694099426,
-0.07934831082820892,
-0.13363078236579895,
0.12035750597715378,
-0.05827704817056656,
0.042403385043144226,
0.06371321529150009,
0.1437835842370987,
0.059065740555524826,
-0.08460871875286102,
-0.014025665819644928,
0.007826954126358032,
0.034052010625600815,
-0.0782342404127121,
0.07619888335466385,
0.007573139853775501,
0.015471214428544044,
0.028043072670698166,
-0.01616419479250908,
0.06630200892686844,
-0.08420117944478989,
-0.09790369123220444,
-0.047176435589790344,
-0.07874805480241776,
0.03403299301862717,
0.07676862180233002,
0.06206413730978966,
-0.10093844681978226,
-0.0900677740573883,
0.046655330806970596,
0.08729763329029083,
-0.05616789683699608,
0.02331736870110035,
-0.06781508773565292,
0.07085300236940384,
-0.050842080265283585,
-0.02671901136636734,
-0.17297154664993286,
-0.028807468712329865,
-0.004454363137483597,
-0.004673823714256287,
0.014174513518810272,
0.020979048684239388,
0.06160533428192139,
0.05672350153326988,
-0.04055402800440788,
-0.014758279547095299,
-0.03196488693356514,
-0.005129905417561531,
-0.12734556198120117,
-0.1980491280555725,
-0.03147100657224655,
-0.02565714530646801,
0.1387709379196167,
-0.22493639588356018,
0.0432548001408577,
-0.022894784808158875,
0.07332275807857513,
0.008166888728737831,
-0.003848462598398328,
-0.04335092753171921,
0.0682745948433876,
-0.052231524139642715,
-0.04929289594292641,
0.07083682715892792,
0.020703019574284554,
-0.09721647202968597,
-0.0519232302904129,
-0.09582822769880295,
0.15381859242916107,
0.12923119962215424,
-0.10309077799320221,
-0.07164105772972107,
-0.0083346888422966,
-0.06123215705156326,
-0.033842071890830994,
-0.06052513048052788,
0.03598521649837494,
0.20838044583797455,
-0.008705025538802147,
0.1514522135257721,
-0.07073947787284851,
-0.04729161784052849,
0.028893057256937027,
-0.03693579137325287,
0.018599754199385643,
0.1305069476366043,
0.1555924415588379,
-0.07595258206129074,
0.14878009259700775,
0.12888412177562714,
-0.09408707916736603,
0.13803602755069733,
-0.037724193185567856,
-0.07006612420082092,
-0.0151677830144763,
-0.03873472660779953,
-0.006152993068099022,
0.09859757125377655,
-0.16054300963878632,
0.00044495463953353465,
0.030342187732458115,
0.015624715015292168,
0.022183913737535477,
-0.21764636039733887,
-0.04037681221961975,
0.04375113546848297,
-0.03616701811552048,
-0.02169610746204853,
-0.0051982467994093895,
0.0011029565939679742,
0.09880496561527252,
0.002264350187033415,
-0.08198396861553192,
0.04158433899283409,
0.012333126738667488,
-0.08977969735860825,
0.21849697828292847,
-0.07294631749391556,
-0.1600169986486435,
-0.12679468095302582,
-0.06509174406528473,
-0.037019047886133194,
0.004053042735904455,
0.07418730854988098,
-0.09482315182685852,
-0.04170699790120125,
-0.07617878168821335,
0.014309358783066273,
0.0034513489808887243,
0.030703971162438393,
0.014478925615549088,
0.007094135507941246,
0.05695921182632446,
-0.10524827986955643,
-0.012378708459436893,
-0.0519661046564579,
-0.04015309363603592,
0.0367339663207531,
0.039968814700841904,
0.11802729964256287,
0.14835304021835327,
-0.01666850969195366,
0.013988040387630463,
-0.02906576730310917,
0.22939100861549377,
-0.06861412525177002,
-0.028029391542077065,
0.15066766738891602,
-0.005935404449701309,
0.044070128351449966,
0.11753897368907928,
0.06605994701385498,
-0.07593461871147156,
0.00543209770694375,
0.03088444657623768,
-0.04107694700360298,
-0.235112264752388,
-0.052737366408109665,
-0.05770053341984749,
-0.002185037126764655,
0.09727739542722702,
0.032169878482818604,
0.03930181637406349,
0.07662338018417358,
0.03596808388829231,
0.07360756397247314,
-0.054025694727897644,
0.06227331981062889,
0.11977670341730118,
0.050643883645534515,
0.12243027240037918,
-0.05779528245329857,
-0.06631535291671753,
0.04130024462938309,
-0.01903594098985195,
0.23022998869419098,
0.015007009729743004,
0.14573970437049866,
0.05565778538584709,
0.1535622775554657,
-0.0005968016921542585,
0.08172149956226349,
-0.006551842670887709,
-0.049307454377412796,
-0.013366537168622017,
-0.03192223235964775,
-0.03543268144130707,
0.03208816424012184,
-0.07185499370098114,
0.06844620406627655,
-0.12878894805908203,
0.027398373931646347,
0.05281217768788338,
0.2502261698246002,
0.03661729395389557,
-0.33694279193878174,
-0.09630628675222397,
0.006317086983472109,
-0.037142813205718994,
-0.030822260305285454,
0.02474534511566162,
0.08048032224178314,
-0.09706829488277435,
0.037767499685287476,
-0.07365977764129639,
0.10132131725549698,
-0.047447267919778824,
0.04141630604863167,
0.07737306505441666,
0.09100155532360077,
0.004611567594110966,
0.09646192193031311,
-0.2767905592918396,
0.2734326124191284,
-0.001023047138005495,
0.07582991570234299,
-0.08475460112094879,
0.009737282060086727,
0.038403574377298355,
0.06259626895189285,
0.08297977596521378,
-0.018109217286109924,
-0.08623840659856796,
-0.1842016875743866,
-0.06493131816387177,
0.03816809132695198,
0.04816474765539169,
-0.03193073347210884,
0.09269404411315918,
-0.03255445882678032,
0.013009076938033104,
0.0689353346824646,
-0.0008464672137051821,
-0.043154459446668625,
-0.11219543218612671,
-0.018749535083770752,
0.032622016966342926,
-0.053593773394823074,
-0.05234724283218384,
-0.11583185195922852,
-0.1189291775226593,
0.14456146955490112,
-0.04827075079083443,
-0.04067594185471535,
-0.11432097107172012,
0.08901102840900421,
0.05766443535685539,
-0.09570139646530151,
0.051527850329875946,
0.009181435219943523,
0.08645030111074448,
0.03281962126493454,
-0.0706310123205185,
0.10370880365371704,
-0.07966743409633636,
-0.1503232717514038,
-0.0629192367196083,
0.08772337436676025,
0.03594321757555008,
0.06015893444418907,
-0.007348454091697931,
0.017944328486919403,
-0.056459251791238785,
-0.0866134986281395,
0.02172291837632656,
-0.0022606197744607925,
0.09106890857219696,
0.017863746732473373,
-0.050611209124326706,
0.015585520304739475,
-0.054170649498701096,
-0.04604087769985199,
0.20429325103759766,
0.21279276907444,
-0.09902685135602951,
0.01883353292942047,
0.013565711677074432,
-0.07652195543050766,
-0.18549098074436188,
0.023287760093808174,
0.052796270698308945,
0.020619532093405724,
0.04055831581354141,
-0.18354250490665436,
0.14787302911281586,
0.11467036604881287,
-0.01167855504900217,
0.11124106496572495,
-0.3092828691005707,
-0.1202506572008133,
0.13276176154613495,
0.12265028804540634,
0.1171359196305275,
-0.12558132410049438,
-0.01402408815920353,
-0.03074740432202816,
-0.1345207244157791,
0.09962276369333267,
-0.08962751179933548,
0.11052592098712921,
-0.03924911469221115,
0.0827980563044548,
0.0015659835189580917,
-0.057843804359436035,
0.12301284819841385,
0.019426845014095306,
0.09309196472167969,
-0.06273852288722992,
-0.03151955455541611,
0.01008270587772131,
-0.04370889067649841,
0.04556463286280632,
-0.11929388344287872,
0.02470386028289795,
-0.1207934096455574,
-0.030746610835194588,
-0.060320496559143066,
0.05135529115796089,
-0.03341133892536163,
-0.05600405111908913,
-0.03839591145515442,
0.008260402828454971,
0.05163063108921051,
-0.0055680968798696995,
0.15884771943092346,
0.0260087251663208,
0.1466272622346878,
0.07920314371585846,
0.08837610483169556,
-0.07449349761009216,
-0.08372492343187332,
-0.022218074649572372,
-0.008739529177546501,
0.05441742017865181,
-0.15279950201511383,
0.026358669623732567,
0.14814555644989014,
0.01793789677321911,
0.14715786278247833,
0.08141490817070007,
-0.014006084762513638,
0.017506493255496025,
0.061306171119213104,
-0.15042272210121155,
-0.07854358106851578,
-0.01290026493370533,
-0.04538888856768608,
-0.124518483877182,
0.04827684909105301,
0.08916214853525162,
-0.07279811799526215,
-0.012631990015506744,
-0.013764576055109501,
0.011335966177284718,
-0.049227289855480194,
0.17471757531166077,
0.05657902732491493,
0.04618668928742409,
-0.10172462463378906,
0.06917989999055862,
0.045334137976169586,
-0.08747346699237823,
0.020230140537023544,
0.07296179234981537,
-0.07737967371940613,
-0.056503742933273315,
0.08207309991121292,
0.20515675842761993,
-0.07268963754177094,
-0.059580422937870026,
-0.14837035536766052,
-0.1235557571053505,
0.0908619835972786,
0.15014784038066864,
0.11396310478448868,
0.002191154519096017,
-0.06248815730214119,
0.001188953290693462,
-0.12312684208154678,
0.09436533600091934,
0.05079396069049835,
0.05825848504900932,
-0.15327033400535583,
0.12664328515529633,
0.016738098114728928,
0.04621300473809242,
-0.016041206195950508,
0.02562527544796467,
-0.09077546000480652,
0.012571540661156178,
-0.09827841073274612,
-0.004030080046504736,
-0.03934266045689583,
0.008737364783883095,
0.001799125224351883,
-0.046036310493946075,
-0.06403874605894089,
0.013945028185844421,
-0.10473553091287613,
-0.01699390448629856,
0.03233787789940834,
0.06791332364082336,
-0.10198774933815002,
-0.03585846349596977,
0.026827169582247734,
-0.06131670996546745,
0.06869570910930634,
0.048540595918893814,
0.0222605150192976,
0.04965469613671303,
-0.13199225068092346,
0.03003573790192604,
0.07863347232341766,
0.022135267034173012,
0.06137925758957863,
-0.10601583123207092,
-0.0007533631869591773,
0.000573481316678226,
0.03743851184844971,
0.013618634082376957,
0.046978339552879333,
-0.14038409292697906,
0.00015889028145466,
-0.007508769165724516,
-0.08565139770507812,
-0.07067415863275528,
0.02340167760848999,
0.09075982868671417,
0.007828038185834885,
0.2145799845457077,
-0.07112079858779907,
0.041902463883161545,
-0.2117060422897339,
0.0127671854570508,
-0.012437400408089161,
-0.09761452674865723,
-0.11836007237434387,
-0.0634927824139595,
0.055777717381715775,
-0.06098109111189842,
0.1452653706073761,
0.041852839291095734,
0.027462540194392204,
0.029232097789645195,
-0.0026169908232986927,
0.019868170842528343,
0.014651289209723473,
0.20861364901065826,
0.032792799174785614,
-0.03748399764299393,
0.059693578630685806,
0.03904883563518524,
0.10668560862541199,
0.11475371569395065,
0.19278621673583984,
0.12879374623298645,
0.01076571550220251,
0.10934294015169144,
0.03128606826066971,
-0.05863118916749954,
-0.156927689909935,
0.03646245226264,
-0.039669279009103775,
0.10560476034879684,
-0.0005126527394168079,
0.21131570637226105,
0.0905061587691307,
-0.16577453911304474,
0.0380982905626297,
-0.0589977391064167,
-0.08098681271076202,
-0.12119633704423904,
-0.07406049221754074,
-0.08644623309373856,
-0.13388611376285553,
0.007452758494764566,
-0.11749137192964554,
0.006951653864234686,
0.12492231279611588,
-0.0023579136468470097,
-0.027557147666811943,
0.14081402122974396,
0.02850053459405899,
0.02905062772333622,
0.044530052691698074,
0.007642372045665979,
-0.029235411435365677,
-0.10651388764381409,
-0.050203531980514526,
-0.013492216356098652,
-0.030439767986536026,
0.03399078920483589,
-0.06657307595014572,
-0.04580254480242729,
0.049214381724596024,
-0.019320320338010788,
-0.0966344028711319,
0.009096598252654076,
0.014410317875444889,
0.05934319272637367,
0.04073020815849304,
0.011165669187903404,
0.024138860404491425,
-0.0059270127676427364,
0.2053600400686264,
-0.07232540100812912,
-0.0562208816409111,
-0.11182606220245361,
0.21762651205062866,
0.053876686841249466,
-0.02000606618821621,
0.031390197575092316,
-0.06914068013429642,
0.007513601798564196,
0.23325438797473907,
0.19339855015277863,
-0.05827881768345833,
-0.012427110224962234,
-0.004847933538258076,
-0.00956749264150858,
-0.02099345065653324,
0.0910603478550911,
0.1416623294353485,
0.0469084233045578,
-0.09867993742227554,
-0.04819963127374649,
-0.061097849160432816,
-0.009727383963763714,
-0.0412248857319355,
0.06169692426919937,
0.04181385412812233,
0.001692313002422452,
-0.020432209596037865,
0.057825811207294464,
-0.06730454415082932,
-0.07403378933668137,
0.06150418892502785,
-0.19884897768497467,
-0.16156165301799774,
-0.010300726629793644,
0.1102413535118103,
0.01749129593372345,
0.06399773061275482,
-0.02622102200984955,
-0.015628423541784286,
0.08604495227336884,
-0.016923977062106133,
-0.11325999349355698,
-0.07480964064598083,
0.08663355559110641,
-0.10703404247760773,
0.21945820748806,
-0.04706016927957535,
0.05619185045361519,
0.12956269085407257,
0.06781532615423203,
-0.07991959899663925,
0.06199520826339722,
0.03496561571955681,
-0.03855300322175026,
0.04563915729522705,
0.07112543284893036,
-0.03436211869120598,
0.05684373900294304,
0.04618493467569351,
-0.1332332044839859,
0.018335504457354546,
-0.06255962699651718,
-0.053373973816633224,
-0.03750753775238991,
-0.011835905723273754,
-0.05690811946988106,
0.134381964802742,
0.20424987375736237,
-0.03040335513651371,
-0.01295243389904499,
-0.07077576220035553,
0.017100604251027107,
0.0582498274743557,
0.028955571353435516,
-0.057430993765592575,
-0.20965342223644257,
0.0193637665361166,
0.04905003309249878,
-0.019286978989839554,
-0.24684861302375793,
-0.09661472588777542,
-0.006176378112286329,
-0.08670622855424881,
-0.09091965109109879,
0.06341086328029633,
0.1018756553530693,
0.05184410884976387,
-0.06113895773887634,
-0.028823498636484146,
-0.06907874345779419,
0.1478630006313324,
-0.13758938014507294,
-0.10041738301515579
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# electra-base-discriminator-finetuned-wnli
This model is a fine-tuned version of [google/electra-base-discriminator](https://huggingface.co/google/electra-base-discriminator) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6893
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
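A minimal inference sketch for this sentence-pair task, assuming a `transformers` version whose text-classification pipeline accepts `{"text", "text_pair"}` dictionaries:
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="anirudh21/electra-base-discriminator-finetuned-wnli",
)

# WNLI asks whether the second sentence follows from the first.
print(classifier({
    "text": "The trophy didn't fit into the suitcase because it was too big.",
    "text_pair": "The trophy was too big.",
}))
```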
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6893 | 0.5634 |
| No log | 2.0 | 80 | 0.7042 | 0.4225 |
| No log | 3.0 | 120 | 0.7008 | 0.3803 |
| No log | 4.0 | 160 | 0.6998 | 0.5634 |
| No log | 5.0 | 200 | 0.7016 | 0.5352 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.0
- Tokenizers 0.10.3
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "electra-base-discriminator-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5633802816901409, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/electra-base-discriminator-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"electra",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #electra #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
|
electra-base-discriminator-finetuned-wnli
=========================================
This model is a fine-tuned version of google/electra-base-discriminator on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6893
* Accuracy: 0.5634
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #electra #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
66,
98,
4,
35
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #electra #text-classification #generated_from_trainer #dataset-glue #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.10993218421936035,
0.09303925931453705,
-0.0018562558107078075,
0.11975739896297455,
0.17762814462184906,
0.03187306597828865,
0.09995214641094208,
0.12710557878017426,
-0.08821161836385727,
0.02938605286180973,
0.13181909918785095,
0.17248812317848206,
0.02147456631064415,
0.11136481910943985,
-0.05232442170381546,
-0.26265934109687805,
-0.0085792550817132,
0.04750598967075348,
-0.06312747299671173,
0.13089106976985931,
0.09350745379924774,
-0.12371840327978134,
0.0995192676782608,
0.019917162135243416,
-0.18416346609592438,
0.008807512931525707,
-0.000048867084842640907,
-0.06115612015128136,
0.14592516422271729,
0.034306108951568604,
0.11762060970067978,
-0.00911291129887104,
0.08563094586133957,
-0.1951460987329483,
0.013859066180884838,
0.0488617904484272,
-0.002918922109529376,
0.09141622483730316,
0.035062674432992935,
0.003967548720538616,
0.1587071418762207,
-0.0852406695485115,
0.05617314204573631,
0.02328403666615486,
-0.11948850005865097,
-0.20560768246650696,
-0.0779259130358696,
0.030848832800984383,
0.08741845935583115,
0.11399535834789276,
-0.005196638870984316,
0.1362311691045761,
-0.08127991110086441,
0.08750331401824951,
0.2212054431438446,
-0.2868227958679199,
-0.06537944823503494,
0.0169778224080801,
0.018395235762000084,
0.035376835614442825,
-0.09749932587146759,
-0.03057807683944702,
0.057718969881534576,
0.04944853112101555,
0.11963897198438644,
-0.023592226207256317,
-0.10478898137807846,
0.01874537207186222,
-0.13447271287441254,
-0.04013517126441002,
0.173468217253685,
0.04529625549912453,
-0.03147750347852707,
-0.05682028830051422,
-0.053107790648937225,
-0.1579091101884842,
-0.030751265585422516,
-0.005766710266470909,
0.042122676968574524,
-0.024105016142129898,
-0.043130431324243546,
0.0011519145919010043,
-0.1125771775841713,
-0.06111526861786842,
-0.0783105120062828,
0.12700434029102325,
0.0365166962146759,
0.018269825726747513,
-0.03523501753807068,
0.10577011853456497,
0.006595340557396412,
-0.12484759092330933,
0.0063986945897340775,
0.026714814826846123,
0.011504312977194786,
-0.04626300185918808,
-0.053255971521139145,
-0.06518927961587906,
0.016250334680080414,
0.1351046860218048,
-0.034201495349407196,
0.042291734367609024,
0.0633799359202385,
0.04074253514409065,
-0.08117340505123138,
0.18557307124137878,
-0.03318364545702934,
-0.011626181192696095,
0.012305648066103458,
0.03446268290281296,
0.026384979486465454,
-0.01277640275657177,
-0.12349376082420349,
0.003581979777663946,
0.07299818843603134,
0.013957533985376358,
-0.06314987689256668,
0.06885430961847305,
-0.05203275755047798,
-0.021058836951851845,
0.010806522332131863,
-0.08922680467367172,
0.029732314869761467,
-0.004704609513282776,
-0.07978168874979019,
-0.01965859718620777,
0.03036697395145893,
0.025049975141882896,
-0.021466488018631935,
0.10741426050662994,
-0.09047967940568924,
0.024893928319215775,
-0.09548268467187881,
-0.10385119169950485,
0.02405233308672905,
-0.12066178768873215,
0.04083560034632683,
-0.10284851491451263,
-0.19468048214912415,
-0.012204325757920742,
0.0591849684715271,
-0.024119358509778976,
-0.0751781314611435,
-0.057571858167648315,
-0.06249626725912094,
0.02043125592172146,
-0.011472281999886036,
0.11457894742488861,
-0.06786517798900604,
0.08566759526729584,
0.032077889889478683,
0.05974006652832031,
-0.03833478316664696,
0.05214707553386688,
-0.10209894180297852,
0.008537751622498035,
-0.14924949407577515,
0.030569937080144882,
-0.03233163803815842,
0.08155159652233124,
-0.0876268818974495,
-0.09882117807865143,
0.011876729317009449,
0.0030013099312782288,
0.06287110596895218,
0.09573369473218918,
-0.1693664789199829,
-0.07770423591136932,
0.14866259694099426,
-0.07934831082820892,
-0.13363078236579895,
0.12035750597715378,
-0.05827704817056656,
0.042403385043144226,
0.06371321529150009,
0.1437835842370987,
0.059065740555524826,
-0.08460871875286102,
-0.014025665819644928,
0.007826954126358032,
0.034052010625600815,
-0.0782342404127121,
0.07619888335466385,
0.007573139853775501,
0.015471214428544044,
0.028043072670698166,
-0.01616419479250908,
0.06630200892686844,
-0.08420117944478989,
-0.09790369123220444,
-0.047176435589790344,
-0.07874805480241776,
0.03403299301862717,
0.07676862180233002,
0.06206413730978966,
-0.10093844681978226,
-0.0900677740573883,
0.046655330806970596,
0.08729763329029083,
-0.05616789683699608,
0.02331736870110035,
-0.06781508773565292,
0.07085300236940384,
-0.050842080265283585,
-0.02671901136636734,
-0.17297154664993286,
-0.028807468712329865,
-0.004454363137483597,
-0.004673823714256287,
0.014174513518810272,
0.020979048684239388,
0.06160533428192139,
0.05672350153326988,
-0.04055402800440788,
-0.014758279547095299,
-0.03196488693356514,
-0.005129905417561531,
-0.12734556198120117,
-0.1980491280555725,
-0.03147100657224655,
-0.02565714530646801,
0.1387709379196167,
-0.22493639588356018,
0.0432548001408577,
-0.022894784808158875,
0.07332275807857513,
0.008166888728737831,
-0.003848462598398328,
-0.04335092753171921,
0.0682745948433876,
-0.052231524139642715,
-0.04929289594292641,
0.07083682715892792,
0.020703019574284554,
-0.09721647202968597,
-0.0519232302904129,
-0.09582822769880295,
0.15381859242916107,
0.12923119962215424,
-0.10309077799320221,
-0.07164105772972107,
-0.0083346888422966,
-0.06123215705156326,
-0.033842071890830994,
-0.06052513048052788,
0.03598521649837494,
0.20838044583797455,
-0.008705025538802147,
0.1514522135257721,
-0.07073947787284851,
-0.04729161784052849,
0.028893057256937027,
-0.03693579137325287,
0.018599754199385643,
0.1305069476366043,
0.1555924415588379,
-0.07595258206129074,
0.14878009259700775,
0.12888412177562714,
-0.09408707916736603,
0.13803602755069733,
-0.037724193185567856,
-0.07006612420082092,
-0.0151677830144763,
-0.03873472660779953,
-0.006152993068099022,
0.09859757125377655,
-0.16054300963878632,
0.00044495463953353465,
0.030342187732458115,
0.015624715015292168,
0.022183913737535477,
-0.21764636039733887,
-0.04037681221961975,
0.04375113546848297,
-0.03616701811552048,
-0.02169610746204853,
-0.0051982467994093895,
0.0011029565939679742,
0.09880496561527252,
0.002264350187033415,
-0.08198396861553192,
0.04158433899283409,
0.012333126738667488,
-0.08977969735860825,
0.21849697828292847,
-0.07294631749391556,
-0.1600169986486435,
-0.12679468095302582,
-0.06509174406528473,
-0.037019047886133194,
0.004053042735904455,
0.07418730854988098,
-0.09482315182685852,
-0.04170699790120125,
-0.07617878168821335,
0.014309358783066273,
0.0034513489808887243,
0.030703971162438393,
0.014478925615549088,
0.007094135507941246,
0.05695921182632446,
-0.10524827986955643,
-0.012378708459436893,
-0.0519661046564579,
-0.04015309363603592,
0.0367339663207531,
0.039968814700841904,
0.11802729964256287,
0.14835304021835327,
-0.01666850969195366,
0.013988040387630463,
-0.02906576730310917,
0.22939100861549377,
-0.06861412525177002,
-0.028029391542077065,
0.15066766738891602,
-0.005935404449701309,
0.044070128351449966,
0.11753897368907928,
0.06605994701385498,
-0.07593461871147156,
0.00543209770694375,
0.03088444657623768,
-0.04107694700360298,
-0.235112264752388,
-0.052737366408109665,
-0.05770053341984749,
-0.002185037126764655,
0.09727739542722702,
0.032169878482818604,
0.03930181637406349,
0.07662338018417358,
0.03596808388829231,
0.07360756397247314,
-0.054025694727897644,
0.06227331981062889,
0.11977670341730118,
0.050643883645534515,
0.12243027240037918,
-0.05779528245329857,
-0.06631535291671753,
0.04130024462938309,
-0.01903594098985195,
0.23022998869419098,
0.015007009729743004,
0.14573970437049866,
0.05565778538584709,
0.1535622775554657,
-0.0005968016921542585,
0.08172149956226349,
-0.006551842670887709,
-0.049307454377412796,
-0.013366537168622017,
-0.03192223235964775,
-0.03543268144130707,
0.03208816424012184,
-0.07185499370098114,
0.06844620406627655,
-0.12878894805908203,
0.027398373931646347,
0.05281217768788338,
0.2502261698246002,
0.03661729395389557,
-0.33694279193878174,
-0.09630628675222397,
0.006317086983472109,
-0.037142813205718994,
-0.030822260305285454,
0.02474534511566162,
0.08048032224178314,
-0.09706829488277435,
0.037767499685287476,
-0.07365977764129639,
0.10132131725549698,
-0.047447267919778824,
0.04141630604863167,
0.07737306505441666,
0.09100155532360077,
0.004611567594110966,
0.09646192193031311,
-0.2767905592918396,
0.2734326124191284,
-0.001023047138005495,
0.07582991570234299,
-0.08475460112094879,
0.009737282060086727,
0.038403574377298355,
0.06259626895189285,
0.08297977596521378,
-0.018109217286109924,
-0.08623840659856796,
-0.1842016875743866,
-0.06493131816387177,
0.03816809132695198,
0.04816474765539169,
-0.03193073347210884,
0.09269404411315918,
-0.03255445882678032,
0.013009076938033104,
0.0689353346824646,
-0.0008464672137051821,
-0.043154459446668625,
-0.11219543218612671,
-0.018749535083770752,
0.032622016966342926,
-0.053593773394823074,
-0.05234724283218384,
-0.11583185195922852,
-0.1189291775226593,
0.14456146955490112,
-0.04827075079083443,
-0.04067594185471535,
-0.11432097107172012,
0.08901102840900421,
0.05766443535685539,
-0.09570139646530151,
0.051527850329875946,
0.009181435219943523,
0.08645030111074448,
0.03281962126493454,
-0.0706310123205185,
0.10370880365371704,
-0.07966743409633636,
-0.1503232717514038,
-0.0629192367196083,
0.08772337436676025,
0.03594321757555008,
0.06015893444418907,
-0.007348454091697931,
0.017944328486919403,
-0.056459251791238785,
-0.0866134986281395,
0.02172291837632656,
-0.0022606197744607925,
0.09106890857219696,
0.017863746732473373,
-0.050611209124326706,
0.015585520304739475,
-0.054170649498701096,
-0.04604087769985199,
0.20429325103759766,
0.21279276907444,
-0.09902685135602951,
0.01883353292942047,
0.013565711677074432,
-0.07652195543050766,
-0.18549098074436188,
0.023287760093808174,
0.052796270698308945,
0.020619532093405724,
0.04055831581354141,
-0.18354250490665436,
0.14787302911281586,
0.11467036604881287,
-0.01167855504900217,
0.11124106496572495,
-0.3092828691005707,
-0.1202506572008133,
0.13276176154613495,
0.12265028804540634,
0.1171359196305275,
-0.12558132410049438,
-0.01402408815920353,
-0.03074740432202816,
-0.1345207244157791,
0.09962276369333267,
-0.08962751179933548,
0.11052592098712921,
-0.03924911469221115,
0.0827980563044548,
0.0015659835189580917,
-0.057843804359436035,
0.12301284819841385,
0.019426845014095306,
0.09309196472167969,
-0.06273852288722992,
-0.03151955455541611,
0.01008270587772131,
-0.04370889067649841,
0.04556463286280632,
-0.11929388344287872,
0.02470386028289795,
-0.1207934096455574,
-0.030746610835194588,
-0.060320496559143066,
0.05135529115796089,
-0.03341133892536163,
-0.05600405111908913,
-0.03839591145515442,
0.008260402828454971,
0.05163063108921051,
-0.0055680968798696995,
0.15884771943092346,
0.0260087251663208,
0.1466272622346878,
0.07920314371585846,
0.08837610483169556,
-0.07449349761009216,
-0.08372492343187332,
-0.022218074649572372,
-0.008739529177546501,
0.05441742017865181,
-0.15279950201511383,
0.026358669623732567,
0.14814555644989014,
0.01793789677321911,
0.14715786278247833,
0.08141490817070007,
-0.014006084762513638,
0.017506493255496025,
0.061306171119213104,
-0.15042272210121155,
-0.07854358106851578,
-0.01290026493370533,
-0.04538888856768608,
-0.124518483877182,
0.04827684909105301,
0.08916214853525162,
-0.07279811799526215,
-0.012631990015506744,
-0.013764576055109501,
0.011335966177284718,
-0.049227289855480194,
0.17471757531166077,
0.05657902732491493,
0.04618668928742409,
-0.10172462463378906,
0.06917989999055862,
0.045334137976169586,
-0.08747346699237823,
0.020230140537023544,
0.07296179234981537,
-0.07737967371940613,
-0.056503742933273315,
0.08207309991121292,
0.20515675842761993,
-0.07268963754177094,
-0.059580422937870026,
-0.14837035536766052,
-0.1235557571053505,
0.0908619835972786,
0.15014784038066864,
0.11396310478448868,
0.002191154519096017,
-0.06248815730214119,
0.001188953290693462,
-0.12312684208154678,
0.09436533600091934,
0.05079396069049835,
0.05825848504900932,
-0.15327033400535583,
0.12664328515529633,
0.016738098114728928,
0.04621300473809242,
-0.016041206195950508,
0.02562527544796467,
-0.09077546000480652,
0.012571540661156178,
-0.09827841073274612,
-0.004030080046504736,
-0.03934266045689583,
0.008737364783883095,
0.001799125224351883,
-0.046036310493946075,
-0.06403874605894089,
0.013945028185844421,
-0.10473553091287613,
-0.01699390448629856,
0.03233787789940834,
0.06791332364082336,
-0.10198774933815002,
-0.03585846349596977,
0.026827169582247734,
-0.06131670996546745,
0.06869570910930634,
0.048540595918893814,
0.0222605150192976,
0.04965469613671303,
-0.13199225068092346,
0.03003573790192604,
0.07863347232341766,
0.022135267034173012,
0.06137925758957863,
-0.10601583123207092,
-0.0007533631869591773,
0.000573481316678226,
0.03743851184844971,
0.013618634082376957,
0.046978339552879333,
-0.14038409292697906,
0.00015889028145466,
-0.007508769165724516,
-0.08565139770507812,
-0.07067415863275528,
0.02340167760848999,
0.09075982868671417,
0.007828038185834885,
0.2145799845457077,
-0.07112079858779907,
0.041902463883161545,
-0.2117060422897339,
0.0127671854570508,
-0.012437400408089161,
-0.09761452674865723,
-0.11836007237434387,
-0.0634927824139595,
0.055777717381715775,
-0.06098109111189842,
0.1452653706073761,
0.041852839291095734,
0.027462540194392204,
0.029232097789645195,
-0.0026169908232986927,
0.019868170842528343,
0.014651289209723473,
0.20861364901065826,
0.032792799174785614,
-0.03748399764299393,
0.059693578630685806,
0.03904883563518524,
0.10668560862541199,
0.11475371569395065,
0.19278621673583984,
0.12879374623298645,
0.01076571550220251,
0.10934294015169144,
0.03128606826066971,
-0.05863118916749954,
-0.156927689909935,
0.03646245226264,
-0.039669279009103775,
0.10560476034879684,
-0.0005126527394168079,
0.21131570637226105,
0.0905061587691307,
-0.16577453911304474,
0.0380982905626297,
-0.0589977391064167,
-0.08098681271076202,
-0.12119633704423904,
-0.07406049221754074,
-0.08644623309373856,
-0.13388611376285553,
0.007452758494764566,
-0.11749137192964554,
0.006951653864234686,
0.12492231279611588,
-0.0023579136468470097,
-0.027557147666811943,
0.14081402122974396,
0.02850053459405899,
0.02905062772333622,
0.044530052691698074,
0.007642372045665979,
-0.029235411435365677,
-0.10651388764381409,
-0.050203531980514526,
-0.013492216356098652,
-0.030439767986536026,
0.03399078920483589,
-0.06657307595014572,
-0.04580254480242729,
0.049214381724596024,
-0.019320320338010788,
-0.0966344028711319,
0.009096598252654076,
0.014410317875444889,
0.05934319272637367,
0.04073020815849304,
0.011165669187903404,
0.024138860404491425,
-0.0059270127676427364,
0.2053600400686264,
-0.07232540100812912,
-0.0562208816409111,
-0.11182606220245361,
0.21762651205062866,
0.053876686841249466,
-0.02000606618821621,
0.031390197575092316,
-0.06914068013429642,
0.007513601798564196,
0.23325438797473907,
0.19339855015277863,
-0.05827881768345833,
-0.012427110224962234,
-0.004847933538258076,
-0.00956749264150858,
-0.02099345065653324,
0.0910603478550911,
0.1416623294353485,
0.0469084233045578,
-0.09867993742227554,
-0.04819963127374649,
-0.061097849160432816,
-0.009727383963763714,
-0.0412248857319355,
0.06169692426919937,
0.04181385412812233,
0.001692313002422452,
-0.020432209596037865,
0.057825811207294464,
-0.06730454415082932,
-0.07403378933668137,
0.06150418892502785,
-0.19884897768497467,
-0.16156165301799774,
-0.010300726629793644,
0.1102413535118103,
0.01749129593372345,
0.06399773061275482,
-0.02622102200984955,
-0.015628423541784286,
0.08604495227336884,
-0.016923977062106133,
-0.11325999349355698,
-0.07480964064598083,
0.08663355559110641,
-0.10703404247760773,
0.21945820748806,
-0.04706016927957535,
0.05619185045361519,
0.12956269085407257,
0.06781532615423203,
-0.07991959899663925,
0.06199520826339722,
0.03496561571955681,
-0.03855300322175026,
0.04563915729522705,
0.07112543284893036,
-0.03436211869120598,
0.05684373900294304,
0.04618493467569351,
-0.1332332044839859,
0.018335504457354546,
-0.06255962699651718,
-0.053373973816633224,
-0.03750753775238991,
-0.011835905723273754,
-0.05690811946988106,
0.134381964802742,
0.20424987375736237,
-0.03040335513651371,
-0.01295243389904499,
-0.07077576220035553,
0.017100604251027107,
0.0582498274743557,
0.028955571353435516,
-0.057430993765592575,
-0.20965342223644257,
0.0193637665361166,
0.04905003309249878,
-0.019286978989839554,
-0.24684861302375793,
-0.09661472588777542,
-0.006176378112286329,
-0.08670622855424881,
-0.09091965109109879,
0.06341086328029633,
0.1018756553530693,
0.05184410884976387,
-0.06113895773887634,
-0.028823498636484146,
-0.06907874345779419,
0.1478630006313324,
-0.13758938014507294,
-0.10041738301515579
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlnet-base-cased-finetuned-rte
This model is a fine-tuned version of [xlnet-base-cased](https://huggingface.co/xlnet-base-cased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0656
- Accuracy: 0.6895
## Model description
More information needed
## Intended uses & limitations
More information needed
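A sketch of how the reported validation accuracy could be checked, assuming the GLUE RTE schema (`sentence1`, `sentence2`, `label`) exposed by the `datasets` library:
```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/xlnet-base-cased-finetuned-rte"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

rte = load_dataset("glue", "rte", split="validation")

correct = 0
for example in rte:
    inputs = tokenizer(
        example["sentence1"],
        example["sentence2"],
        truncation=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    correct += int(pred == example["label"])

print(f"accuracy: {correct / len(rte):.4f}")  # expected to be close to 0.6895
```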
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 156 | 0.7007 | 0.4874 |
| No log | 2.0 | 312 | 0.6289 | 0.6751 |
| No log | 3.0 | 468 | 0.7020 | 0.6606 |
| 0.6146 | 4.0 | 624 | 1.0573 | 0.6570 |
| 0.6146 | 5.0 | 780 | 1.0656 | 0.6895 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "xlnet-base-cased-finetuned-rte", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "rte"}, "metrics": [{"type": "accuracy", "value": 0.6895306859205776, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/xlnet-base-cased-finetuned-rte
|
[
"transformers",
"pytorch",
"tensorboard",
"xlnet",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #xlnet #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us
|
xlnet-base-cased-finetuned-rte
==============================
This model is a fine-tuned version of xlnet-base-cased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 1.0656
* Accuracy: 0.6895
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #xlnet #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
63,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #xlnet #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10428813844919205,
0.08606728911399841,
-0.0016652968479320407,
0.11203289777040482,
0.178323894739151,
0.03587399050593376,
0.1422092765569687,
0.12105150520801544,
-0.07487485557794571,
0.01749599538743496,
0.11799986660480499,
0.16327886283397675,
0.01426203828305006,
0.11028683930635452,
-0.05076964199542999,
-0.25780999660491943,
-0.01838730089366436,
0.05112311989068985,
-0.09375208616256714,
0.13316062092781067,
0.09461808949708939,
-0.12941020727157593,
0.09752862900495529,
-0.0012786698061972857,
-0.22555625438690186,
0.009620069526135921,
0.024519119411706924,
-0.048298053443431854,
0.14869609475135803,
0.0437336266040802,
0.13435246050357819,
0.009876473806798458,
0.08514367043972015,
-0.2008315473794937,
0.015185787342488766,
0.05411100760102272,
-0.005987721029669046,
0.09637220203876495,
0.05563075467944145,
-0.000902180967386812,
0.12637796998023987,
-0.09877689182758331,
0.03961428627371788,
0.026241080835461617,
-0.12148396670818329,
-0.18894241750240326,
-0.06203315407037735,
0.026262447237968445,
0.06079692393541336,
0.10516218096017838,
-0.011427421122789383,
0.1503337323665619,
-0.08175568282604218,
0.10497426241636276,
0.21432170271873474,
-0.2972007095813751,
-0.07189786434173584,
0.07312695682048798,
0.032733794301748276,
0.07328260689973831,
-0.10627708584070206,
-0.007345761638134718,
0.06488141417503357,
0.03776244819164276,
0.12117583304643631,
-0.031421370804309845,
-0.10192602872848511,
0.024684742093086243,
-0.1451515555381775,
-0.004046661779284477,
0.14530040323734283,
0.03847802057862282,
-0.0186499934643507,
-0.04539604112505913,
-0.053627386689186096,
-0.14591990411281586,
-0.029724255204200745,
-0.011557523161172867,
0.04957367479801178,
-0.03277236595749855,
-0.08243786543607712,
-0.016274595633149147,
-0.1090828999876976,
-0.06553417444229126,
-0.07860966771841049,
0.145026296377182,
0.030148550868034363,
0.010920589789748192,
-0.038184601813554764,
0.09992051124572754,
-0.002928889123722911,
-0.1167655885219574,
0.02607799507677555,
0.029134228825569153,
-0.01679358258843422,
-0.0598924458026886,
-0.05425398051738739,
-0.08107319474220276,
0.016468899324536324,
0.08951442688703537,
-0.0565398633480072,
0.03710445389151573,
0.05194235220551491,
0.0524422787129879,
-0.08050522208213806,
0.19223135709762573,
-0.05798216164112091,
-0.017917320132255554,
-0.005939421709626913,
0.04240904375910759,
-0.0007607306470163167,
-0.016512064263224602,
-0.1277061104774475,
0.003440083470195532,
0.09100478887557983,
0.010132648050785065,
-0.05871643126010895,
0.08156256377696991,
-0.0504530668258667,
-0.028745079413056374,
-0.020040571689605713,
-0.08557922393083572,
0.037039514631032944,
0.0008006365387700498,
-0.08948250859975815,
-0.023264644667506218,
0.014190313406288624,
0.015073247253894806,
-0.011834132485091686,
0.10333071649074554,
-0.10848590731620789,
0.04278672859072685,
-0.09352458268404007,
-0.1328708529472351,
0.009103232994675636,
-0.08720165491104126,
0.01835586130619049,
-0.09100461006164551,
-0.16319847106933594,
-0.025429701432585716,
0.048447366803884506,
-0.022504184395074844,
-0.04433698207139969,
-0.06973545253276825,
-0.0808270052075386,
0.005245080217719078,
-0.004493770655244589,
0.11511240154504776,
-0.061242084950208664,
0.09593571722507477,
0.031782861799001694,
0.062484025955200195,
-0.05710206553339958,
0.04814092814922333,
-0.09673654288053513,
0.0025846746284514666,
-0.14792665839195251,
0.051310788840055466,
-0.054677799344062805,
0.07148873805999756,
-0.07182297855615616,
-0.10717858374118805,
0.019370103254914284,
0.011312977410852909,
0.05906069651246071,
0.09013873338699341,
-0.19556672871112823,
-0.09377948939800262,
0.16637885570526123,
-0.06235509738326073,
-0.11713363975286484,
0.1163918599486351,
-0.06718885153532028,
0.058450646698474884,
0.0750528872013092,
0.178889200091362,
0.07683201134204865,
-0.07832994312047958,
-0.0036994963884353638,
0.0246751569211483,
0.03712381422519684,
-0.07048708945512772,
0.05341239273548126,
0.02404945157468319,
0.0218147411942482,
0.02379576861858368,
-0.038289107382297516,
0.0652037113904953,
-0.10946324467658997,
-0.09683749824762344,
-0.03266213834285736,
-0.0925590917468071,
0.05532373487949371,
0.07832688838243484,
0.07349096983671188,
-0.0981440618634224,
-0.08118050545454025,
0.07853281497955322,
0.08280482888221741,
-0.06553175300359726,
0.020533261820673943,
-0.06335542351007462,
0.05687185376882553,
-0.03261362388730049,
-0.02954253926873207,
-0.17402143776416779,
-0.06212783232331276,
0.0028035438153892756,
0.015381124801933765,
0.036423224955797195,
0.05486239120364189,
0.06420231610536575,
0.058370839804410934,
-0.04917372763156891,
-0.009852666407823563,
-0.01811136305332184,
-0.003412325168028474,
-0.13720856606960297,
-0.21637782454490662,
-0.0275924950838089,
-0.01988442987203598,
0.1610165238380432,
-0.24399851262569427,
0.04963685944676399,
-0.015102245844900608,
0.06734810769557953,
0.014052843675017357,
-0.0064779724925756454,
-0.04390536621212959,
0.0841355249285698,
-0.04472869262099266,
-0.05011444911360741,
0.07337099313735962,
0.009592469781637192,
-0.11098098009824753,
-0.05216618999838829,
-0.1159026250243187,
0.17623698711395264,
0.13631559908390045,
-0.14021918177604675,
-0.08245372027158737,
0.0016534766182303429,
-0.054511819034814835,
-0.029891300946474075,
-0.04886205494403839,
0.03647077456116676,
0.17578579485416412,
-0.010163011960685253,
0.14994913339614868,
-0.06591308861970901,
-0.0447867214679718,
0.020595714449882507,
-0.04379599168896675,
0.026350202038884163,
0.12835721671581268,
0.12501417100429535,
-0.09030787646770477,
0.14809127151966095,
0.117082878947258,
-0.09137064963579178,
0.15674786269664764,
-0.027972502633929253,
-0.05675051733851433,
-0.026638804003596306,
-0.028015512973070145,
-0.00710601732134819,
0.10669802874326706,
-0.1339329481124878,
-0.012586062774062157,
0.01129439938813448,
0.006538182497024536,
0.03407806530594826,
-0.23141683638095856,
-0.05051086097955704,
0.03455999493598938,
-0.047444019466638565,
-0.01590023934841156,
-0.023215211927890778,
0.0030036706011742353,
0.10639581829309464,
0.0008437093929387629,
-0.09557357430458069,
0.03471217676997185,
-0.0037551524583250284,
-0.09060638397932053,
0.225599005818367,
-0.07743661105632782,
-0.16429094970226288,
-0.1258813440799713,
-0.05752002075314522,
-0.05760033428668976,
-0.0012936909915879369,
0.04069944843649864,
-0.09997933357954025,
-0.02542676404118538,
-0.06273780018091202,
0.019831322133541107,
-0.007582434918731451,
0.02831296995282173,
-0.007439093664288521,
0.012375585734844208,
0.06147059425711632,
-0.11688028275966644,
-0.004769910126924515,
-0.06928548216819763,
-0.07352591305971146,
0.045227084308862686,
0.0291226077824831,
0.11675691604614258,
0.17049051821231842,
-0.029428109526634216,
0.011371353641152382,
-0.035355180501937866,
0.24059024453163147,
-0.07139472663402557,
-0.030119135975837708,
0.10570178925991058,
-0.00069289596285671,
0.044496968388557434,
0.10907603055238724,
0.08492789417505264,
-0.08458196371793747,
0.002109389752149582,
0.041447658091783524,
-0.03312678635120392,
-0.23897050321102142,
-0.05825506150722504,
-0.05032741650938988,
0.0016728303162381053,
0.07271090894937515,
0.03140116110444069,
0.05221502482891083,
0.06331983953714371,
0.04798385500907898,
0.06487210094928741,
-0.0482594333589077,
0.04874210059642792,
0.10411611199378967,
0.0427270382642746,
0.13101470470428467,
-0.04498549550771713,
-0.0688958391547203,
0.03409098461270332,
-0.018930330872535706,
0.22341980040073395,
0.013603595085442066,
0.15078309178352356,
0.05548296868801117,
0.16499081254005432,
-0.0005183197790756822,
0.06507028639316559,
-0.0009120108443312347,
-0.05340998247265816,
0.000046952030970714986,
-0.039398808032274246,
-0.01643037609755993,
0.019372664391994476,
-0.05342580005526543,
0.052974067628383636,
-0.1212552934885025,
0.012116641737520695,
0.06376424431800842,
0.20285353064537048,
0.03746386617422104,
-0.3264169692993164,
-0.08442109823226929,
-0.004374874290078878,
-0.027873331680893898,
-0.013427142053842545,
0.011061340570449829,
0.12043124437332153,
-0.0897056981921196,
0.02995370142161846,
-0.07365009933710098,
0.09697625786066055,
-0.05822952836751938,
0.05427476391196251,
0.08772054314613342,
0.10604920238256454,
-0.005647209472954273,
0.08791260421276093,
-0.28875961899757385,
0.2853658199310303,
0.01082118134945631,
0.06648030132055283,
-0.06753084808588028,
-0.01380794495344162,
0.026510443538427353,
0.07847429811954498,
0.06303410232067108,
-0.009497535414993763,
-0.0150214908644557,
-0.21086567640304565,
-0.046908073127269745,
0.03283198922872543,
0.09624595940113068,
-0.02120836265385151,
0.1001904085278511,
-0.025227027013897896,
0.0070177107118070126,
0.0870286300778389,
-0.013844323344528675,
-0.053847536444664,
-0.09353027492761612,
-0.011314580217003822,
0.03737982362508774,
-0.058129534125328064,
-0.05567801371216774,
-0.12020870298147202,
-0.14593689143657684,
0.17864301800727844,
-0.04390694946050644,
-0.03127536177635193,
-0.10239233821630478,
0.09985890984535217,
0.06045155227184296,
-0.09050890058279037,
0.025585457682609558,
0.01704566180706024,
0.07839331775903702,
0.026851747184991837,
-0.06737703830003738,
0.11558223515748978,
-0.05665621906518936,
-0.14432455599308014,
-0.06044209375977516,
0.09293290972709656,
0.04543357715010643,
0.06662038713693619,
-0.011638748459517956,
0.0023645483888685703,
-0.04353378713130951,
-0.09017695486545563,
0.018018774688243866,
-0.013892876915633678,
0.035127948969602585,
0.0355663038790226,
-0.04938352480530739,
0.01542927697300911,
-0.07626110315322876,
-0.02373092621564865,
0.1995227187871933,
0.231485515832901,
-0.10526424646377563,
-0.005280467215925455,
0.026019224897027016,
-0.07955366373062134,
-0.20057065784931183,
0.07382825016975403,
0.047398459166288376,
0.013760693371295929,
0.03886720538139343,
-0.17620733380317688,
0.13552062213420868,
0.10219420492649078,
-0.0018867338076233864,
0.11118956655263901,
-0.32304176688194275,
-0.12585006654262543,
0.10949832946062088,
0.13401854038238525,
0.11137726157903671,
-0.13701069355010986,
-0.01774488389492035,
-0.016652880236506462,
-0.10845309495925903,
0.1375017762184143,
-0.11218367516994476,
0.1264999806880951,
-0.02207484468817711,
0.08391615003347397,
0.007154947612434626,
-0.05524202436208725,
0.11886292695999146,
0.03299669176340103,
0.10945820063352585,
-0.04844710975885391,
-0.05539398640394211,
0.04960281029343605,
-0.02878173440694809,
0.013832468539476395,
-0.09251520782709122,
0.01927126944065094,
-0.08728360384702682,
-0.022615475580096245,
-0.07731232792139053,
0.05082353204488754,
-0.03860398381948471,
-0.06797187775373459,
-0.04515999183058739,
0.031026950106024742,
0.03595404326915741,
-0.006345473695546389,
0.12954525649547577,
0.009522918611764908,
0.16245856881141663,
0.10650363564491272,
0.08639518916606903,
-0.07463172823190689,
-0.06675920635461807,
-0.005186889320611954,
-0.008627697825431824,
0.0594828836619854,
-0.135049507021904,
0.020472705364227295,
0.15682247281074524,
0.024287838488817215,
0.14766530692577362,
0.09198787808418274,
-0.025421183556318283,
-0.00040218979120254517,
0.06658532470464706,
-0.1524987816810608,
-0.11303617805242538,
-0.019208403304219246,
-0.10019081830978394,
-0.11879875510931015,
0.06146860122680664,
0.10162826627492905,
-0.07725005596876144,
-0.0038541185203939676,
-0.005616243928670883,
-0.0030056012328714132,
-0.0658537894487381,
0.19596898555755615,
0.08189929276704788,
0.047177769243717194,
-0.10251355916261673,
0.057310473173856735,
0.04781726002693176,
-0.04890630394220352,
-0.0007590145105496049,
0.08679723739624023,
-0.07584496587514877,
-0.04651613160967827,
0.0704900473356247,
0.2091292440891266,
-0.0893586054444313,
-0.043515849858522415,
-0.1532253921031952,
-0.1202746033668518,
0.06285014748573303,
0.1698121279478073,
0.1164172813296318,
0.01633572205901146,
-0.05523233488202095,
0.009002409875392914,
-0.1276037096977234,
0.08944763243198395,
0.0454486608505249,
0.06814064830541611,
-0.1552525758743286,
0.19672484695911407,
0.011780589818954468,
0.055862948298454285,
-0.028078770264983177,
0.023050114512443542,
-0.11338090896606445,
0.012657035142183304,
-0.10317298024892807,
-0.02269485965371132,
-0.027423297986388206,
0.006688912399113178,
-0.00011877462384290993,
-0.054342519491910934,
-0.06359311193227768,
0.006026039831340313,
-0.1119275763630867,
-0.02125752903521061,
0.04321863502264023,
0.05345725640654564,
-0.11293870210647583,
-0.035522714257240295,
0.011882747523486614,
-0.04839710891246796,
0.0674266666173935,
0.03036271594464779,
0.01697402074933052,
0.0654403567314148,
-0.17007938027381897,
0.0216713547706604,
0.06603197008371353,
0.020713888108730316,
0.0726521834731102,
-0.06662551313638687,
-0.004492159932851791,
-0.009896024130284786,
0.07287111133337021,
0.02615693397819996,
0.04661388695240021,
-0.12614667415618896,
0.005225708708167076,
-0.03942665457725525,
-0.06504091620445251,
-0.06597025692462921,
0.041805610060691833,
0.07780006527900696,
0.022560512647032738,
0.19609634578227997,
-0.0811176747083664,
0.04195390269160271,
-0.2219500094652176,
0.008380915969610214,
-0.007908070459961891,
-0.11123660951852798,
-0.0963941216468811,
-0.07295104116201401,
0.06990380585193634,
-0.06863756477832794,
0.15336152911186218,
0.05478281155228615,
0.02688126638531685,
0.026156766340136528,
-0.01036914624273777,
0.0010789562948048115,
0.021314244717359543,
0.20502901077270508,
0.04216735064983368,
-0.03351347893476486,
0.05987225100398064,
0.05677120387554169,
0.10363412648439407,
0.11534404009580612,
0.20879516005516052,
0.12810689210891724,
-0.01669846475124359,
0.0945393294095993,
0.057853225618600845,
-0.06248157098889351,
-0.13778401911258698,
0.05854183807969093,
-0.053721167147159576,
0.0936954990029335,
-0.033193040639162064,
0.1907767653465271,
0.07390537112951279,
-0.16480226814746857,
0.04034055396914482,
-0.049034133553504944,
-0.1006176769733429,
-0.12444210797548294,
-0.0444168858230114,
-0.08323399722576141,
-0.1394290179014206,
-0.0036284613888710737,
-0.12123534828424454,
-0.00023062652326188982,
0.10855002701282501,
0.007553000934422016,
-0.028276080265641212,
0.14754849672317505,
0.031906381249427795,
0.02262558788061142,
0.06185629963874817,
0.004160131793469191,
-0.025280732661485672,
-0.13284632563591003,
-0.048323988914489746,
-0.01635207049548626,
-0.009591094218194485,
0.030624650418758392,
-0.060525212436914444,
-0.049109090119600296,
0.04747637361288071,
-0.031822267919778824,
-0.10725878924131393,
0.009598834440112114,
0.021189076825976372,
0.053891558200120926,
0.026095254346728325,
0.010946900583803654,
0.00505652604624629,
-0.015545427799224854,
0.21612727642059326,
-0.07698795199394226,
-0.07292301952838898,
-0.09133568406105042,
0.25457391142845154,
0.026203498244285583,
-0.002381684957072139,
0.022264929488301277,
-0.0692046731710434,
0.006356778088957071,
0.24845321476459503,
0.24113351106643677,
-0.10209083557128906,
0.002063155174255371,
0.005386540666222572,
-0.008007694967091084,
-0.027793673798441887,
0.10733611136674881,
0.11228428781032562,
0.06658362597227097,
-0.09837423264980316,
-0.035231295973062515,
-0.054743289947509766,
-0.008530682884156704,
-0.026892706751823425,
0.03835967928171158,
0.06288773566484451,
0.018653782084584236,
-0.04654109477996826,
0.0643942728638649,
-0.08694345504045486,
-0.0811210423707962,
0.061419710516929626,
-0.20919528603553772,
-0.15798348188400269,
-0.021361786872148514,
0.10454515367746353,
0.003392850048840046,
0.07379980385303497,
-0.03373359516263008,
-0.00033476055250503123,
0.0568898469209671,
-0.021446174010634422,
-0.10673440992832184,
-0.07390060275793076,
0.08734328299760818,
-0.11699157953262329,
0.19343653321266174,
-0.052960291504859924,
0.06736491620540619,
0.12963174283504486,
0.05639595165848732,
-0.06109137460589409,
0.06447287648916245,
0.03861542418599129,
-0.05846170336008072,
0.031629566103219986,
0.07515596598386765,
-0.03316196799278259,
0.0722062811255455,
0.04893116652965546,
-0.13736364245414734,
0.029363475739955902,
-0.07103103399276733,
-0.06018224358558655,
-0.05051732063293457,
-0.04084402695298195,
-0.05114667862653732,
0.12673650681972504,
0.22431668639183044,
-0.026565589010715485,
0.006210958585143089,
-0.06552431732416153,
-0.008117343299090862,
0.06160584092140198,
0.05171484127640724,
-0.0638611912727356,
-0.23273693025112152,
0.011224966496229172,
0.05697038024663925,
-0.013584474101662636,
-0.2518930435180664,
-0.07719819247722626,
-0.007188118062913418,
-0.0740160271525383,
-0.08013597130775452,
0.08560343831777573,
0.11117090284824371,
0.05491456016898155,
-0.05599109083414078,
-0.05876316875219345,
-0.07226286083459854,
0.15498897433280945,
-0.14144767820835114,
-0.09858812391757965
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlnet-base-cased-finetuned-wnli
This model is a fine-tuned version of [xlnet-base-cased](https://huggingface.co/xlnet-base-cased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6874
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
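The exact training script is not reproduced in this card; as a rough illustration only, the values above would map onto a `TrainingArguments` configuration along these lines. The output directory and evaluation strategy are assumptions, and the Adam betas/epsilon listed above are simply the library defaults.

```python
# Illustrative sketch: TrainingArguments roughly matching the values listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlnet-base-cased-finetuned-wnli",  # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch validation losses below
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
)
```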
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.7209 | 0.5352 |
| No log | 2.0 | 80 | 0.6874 | 0.5634 |
| No log | 3.0 | 120 | 0.6908 | 0.5634 |
| No log | 4.0 | 160 | 0.6987 | 0.4930 |
| No log | 5.0 | 200 | 0.6952 | 0.5634 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
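For completeness, a minimal inference sketch (not part of the generated card): it assumes the checkpoint is published as `anirudh21/xlnet-base-cased-finetuned-wnli` and that the classification head follows the standard GLUE WNLI label order (0 = not entailment, 1 = entailment); the example sentence pair is purely illustrative.

```python
# Minimal sketch: sentence-pair classification with the fine-tuned checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "anirudh21/xlnet-base-cased-finetuned-wnli"  # repository this card accompanies
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "The trophy doesn't fit into the suitcase because it is too big."
hypothesis = "The trophy is too big."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # assumed label order: 0 = not entailment, 1 = entailment
```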
|
{"license": "mit", "tags": ["generated_from_trainer"], "datasets": ["glue"], "metrics": ["accuracy"], "model-index": [{"name": "xlnet-base-cased-finetuned-wnli", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "glue", "type": "glue", "args": "wnli"}, "metrics": [{"type": "accuracy", "value": 0.5633802816901409, "name": "Accuracy"}]}]}]}
|
text-classification
|
anirudh21/xlnet-base-cased-finetuned-wnli
|
[
"transformers",
"pytorch",
"tensorboard",
"xlnet",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #xlnet #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us
|
xlnet-base-cased-finetuned-wnli
===============================
This model is a fine-tuned version of xlnet-base-cased on the glue dataset.
It achieves the following results on the evaluation set:
* Loss: 0.6874
* Accuracy: 0.5634
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 2e-05
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* num\_epochs: 5
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.10.0+cu111
* Datasets 1.17.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #xlnet #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
63,
98,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #xlnet #text-classification #generated_from_trainer #dataset-glue #license-mit #model-index #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.10.0+cu111\n* Datasets 1.17.0\n* Tokenizers 0.10.3"
] |
[
-0.10428813844919205,
0.08606728911399841,
-0.0016652968479320407,
0.11203289777040482,
0.178323894739151,
0.03587399050593376,
0.1422092765569687,
0.12105150520801544,
-0.07487485557794571,
0.01749599538743496,
0.11799986660480499,
0.16327886283397675,
0.01426203828305006,
0.11028683930635452,
-0.05076964199542999,
-0.25780999660491943,
-0.01838730089366436,
0.05112311989068985,
-0.09375208616256714,
0.13316062092781067,
0.09461808949708939,
-0.12941020727157593,
0.09752862900495529,
-0.0012786698061972857,
-0.22555625438690186,
0.009620069526135921,
0.024519119411706924,
-0.048298053443431854,
0.14869609475135803,
0.0437336266040802,
0.13435246050357819,
0.009876473806798458,
0.08514367043972015,
-0.2008315473794937,
0.015185787342488766,
0.05411100760102272,
-0.005987721029669046,
0.09637220203876495,
0.05563075467944145,
-0.000902180967386812,
0.12637796998023987,
-0.09877689182758331,
0.03961428627371788,
0.026241080835461617,
-0.12148396670818329,
-0.18894241750240326,
-0.06203315407037735,
0.026262447237968445,
0.06079692393541336,
0.10516218096017838,
-0.011427421122789383,
0.1503337323665619,
-0.08175568282604218,
0.10497426241636276,
0.21432170271873474,
-0.2972007095813751,
-0.07189786434173584,
0.07312695682048798,
0.032733794301748276,
0.07328260689973831,
-0.10627708584070206,
-0.007345761638134718,
0.06488141417503357,
0.03776244819164276,
0.12117583304643631,
-0.031421370804309845,
-0.10192602872848511,
0.024684742093086243,
-0.1451515555381775,
-0.004046661779284477,
0.14530040323734283,
0.03847802057862282,
-0.0186499934643507,
-0.04539604112505913,
-0.053627386689186096,
-0.14591990411281586,
-0.029724255204200745,
-0.011557523161172867,
0.04957367479801178,
-0.03277236595749855,
-0.08243786543607712,
-0.016274595633149147,
-0.1090828999876976,
-0.06553417444229126,
-0.07860966771841049,
0.145026296377182,
0.030148550868034363,
0.010920589789748192,
-0.038184601813554764,
0.09992051124572754,
-0.002928889123722911,
-0.1167655885219574,
0.02607799507677555,
0.029134228825569153,
-0.01679358258843422,
-0.0598924458026886,
-0.05425398051738739,
-0.08107319474220276,
0.016468899324536324,
0.08951442688703537,
-0.0565398633480072,
0.03710445389151573,
0.05194235220551491,
0.0524422787129879,
-0.08050522208213806,
0.19223135709762573,
-0.05798216164112091,
-0.017917320132255554,
-0.005939421709626913,
0.04240904375910759,
-0.0007607306470163167,
-0.016512064263224602,
-0.1277061104774475,
0.003440083470195532,
0.09100478887557983,
0.010132648050785065,
-0.05871643126010895,
0.08156256377696991,
-0.0504530668258667,
-0.028745079413056374,
-0.020040571689605713,
-0.08557922393083572,
0.037039514631032944,
0.0008006365387700498,
-0.08948250859975815,
-0.023264644667506218,
0.014190313406288624,
0.015073247253894806,
-0.011834132485091686,
0.10333071649074554,
-0.10848590731620789,
0.04278672859072685,
-0.09352458268404007,
-0.1328708529472351,
0.009103232994675636,
-0.08720165491104126,
0.01835586130619049,
-0.09100461006164551,
-0.16319847106933594,
-0.025429701432585716,
0.048447366803884506,
-0.022504184395074844,
-0.04433698207139969,
-0.06973545253276825,
-0.0808270052075386,
0.005245080217719078,
-0.004493770655244589,
0.11511240154504776,
-0.061242084950208664,
0.09593571722507477,
0.031782861799001694,
0.062484025955200195,
-0.05710206553339958,
0.04814092814922333,
-0.09673654288053513,
0.0025846746284514666,
-0.14792665839195251,
0.051310788840055466,
-0.054677799344062805,
0.07148873805999756,
-0.07182297855615616,
-0.10717858374118805,
0.019370103254914284,
0.011312977410852909,
0.05906069651246071,
0.09013873338699341,
-0.19556672871112823,
-0.09377948939800262,
0.16637885570526123,
-0.06235509738326073,
-0.11713363975286484,
0.1163918599486351,
-0.06718885153532028,
0.058450646698474884,
0.0750528872013092,
0.178889200091362,
0.07683201134204865,
-0.07832994312047958,
-0.0036994963884353638,
0.0246751569211483,
0.03712381422519684,
-0.07048708945512772,
0.05341239273548126,
0.02404945157468319,
0.0218147411942482,
0.02379576861858368,
-0.038289107382297516,
0.0652037113904953,
-0.10946324467658997,
-0.09683749824762344,
-0.03266213834285736,
-0.0925590917468071,
0.05532373487949371,
0.07832688838243484,
0.07349096983671188,
-0.0981440618634224,
-0.08118050545454025,
0.07853281497955322,
0.08280482888221741,
-0.06553175300359726,
0.020533261820673943,
-0.06335542351007462,
0.05687185376882553,
-0.03261362388730049,
-0.02954253926873207,
-0.17402143776416779,
-0.06212783232331276,
0.0028035438153892756,
0.015381124801933765,
0.036423224955797195,
0.05486239120364189,
0.06420231610536575,
0.058370839804410934,
-0.04917372763156891,
-0.009852666407823563,
-0.01811136305332184,
-0.003412325168028474,
-0.13720856606960297,
-0.21637782454490662,
-0.0275924950838089,
-0.01988442987203598,
0.1610165238380432,
-0.24399851262569427,
0.04963685944676399,
-0.015102245844900608,
0.06734810769557953,
0.014052843675017357,
-0.0064779724925756454,
-0.04390536621212959,
0.0841355249285698,
-0.04472869262099266,
-0.05011444911360741,
0.07337099313735962,
0.009592469781637192,
-0.11098098009824753,
-0.05216618999838829,
-0.1159026250243187,
0.17623698711395264,
0.13631559908390045,
-0.14021918177604675,
-0.08245372027158737,
0.0016534766182303429,
-0.054511819034814835,
-0.029891300946474075,
-0.04886205494403839,
0.03647077456116676,
0.17578579485416412,
-0.010163011960685253,
0.14994913339614868,
-0.06591308861970901,
-0.0447867214679718,
0.020595714449882507,
-0.04379599168896675,
0.026350202038884163,
0.12835721671581268,
0.12501417100429535,
-0.09030787646770477,
0.14809127151966095,
0.117082878947258,
-0.09137064963579178,
0.15674786269664764,
-0.027972502633929253,
-0.05675051733851433,
-0.026638804003596306,
-0.028015512973070145,
-0.00710601732134819,
0.10669802874326706,
-0.1339329481124878,
-0.012586062774062157,
0.01129439938813448,
0.006538182497024536,
0.03407806530594826,
-0.23141683638095856,
-0.05051086097955704,
0.03455999493598938,
-0.047444019466638565,
-0.01590023934841156,
-0.023215211927890778,
0.0030036706011742353,
0.10639581829309464,
0.0008437093929387629,
-0.09557357430458069,
0.03471217676997185,
-0.0037551524583250284,
-0.09060638397932053,
0.225599005818367,
-0.07743661105632782,
-0.16429094970226288,
-0.1258813440799713,
-0.05752002075314522,
-0.05760033428668976,
-0.0012936909915879369,
0.04069944843649864,
-0.09997933357954025,
-0.02542676404118538,
-0.06273780018091202,
0.019831322133541107,
-0.007582434918731451,
0.02831296995282173,
-0.007439093664288521,
0.012375585734844208,
0.06147059425711632,
-0.11688028275966644,
-0.004769910126924515,
-0.06928548216819763,
-0.07352591305971146,
0.045227084308862686,
0.0291226077824831,
0.11675691604614258,
0.17049051821231842,
-0.029428109526634216,
0.011371353641152382,
-0.035355180501937866,
0.24059024453163147,
-0.07139472663402557,
-0.030119135975837708,
0.10570178925991058,
-0.00069289596285671,
0.044496968388557434,
0.10907603055238724,
0.08492789417505264,
-0.08458196371793747,
0.002109389752149582,
0.041447658091783524,
-0.03312678635120392,
-0.23897050321102142,
-0.05825506150722504,
-0.05032741650938988,
0.0016728303162381053,
0.07271090894937515,
0.03140116110444069,
0.05221502482891083,
0.06331983953714371,
0.04798385500907898,
0.06487210094928741,
-0.0482594333589077,
0.04874210059642792,
0.10411611199378967,
0.0427270382642746,
0.13101470470428467,
-0.04498549550771713,
-0.0688958391547203,
0.03409098461270332,
-0.018930330872535706,
0.22341980040073395,
0.013603595085442066,
0.15078309178352356,
0.05548296868801117,
0.16499081254005432,
-0.0005183197790756822,
0.06507028639316559,
-0.0009120108443312347,
-0.05340998247265816,
0.000046952030970714986,
-0.039398808032274246,
-0.01643037609755993,
0.019372664391994476,
-0.05342580005526543,
0.052974067628383636,
-0.1212552934885025,
0.012116641737520695,
0.06376424431800842,
0.20285353064537048,
0.03746386617422104,
-0.3264169692993164,
-0.08442109823226929,
-0.004374874290078878,
-0.027873331680893898,
-0.013427142053842545,
0.011061340570449829,
0.12043124437332153,
-0.0897056981921196,
0.02995370142161846,
-0.07365009933710098,
0.09697625786066055,
-0.05822952836751938,
0.05427476391196251,
0.08772054314613342,
0.10604920238256454,
-0.005647209472954273,
0.08791260421276093,
-0.28875961899757385,
0.2853658199310303,
0.01082118134945631,
0.06648030132055283,
-0.06753084808588028,
-0.01380794495344162,
0.026510443538427353,
0.07847429811954498,
0.06303410232067108,
-0.009497535414993763,
-0.0150214908644557,
-0.21086567640304565,
-0.046908073127269745,
0.03283198922872543,
0.09624595940113068,
-0.02120836265385151,
0.1001904085278511,
-0.025227027013897896,
0.0070177107118070126,
0.0870286300778389,
-0.013844323344528675,
-0.053847536444664,
-0.09353027492761612,
-0.011314580217003822,
0.03737982362508774,
-0.058129534125328064,
-0.05567801371216774,
-0.12020870298147202,
-0.14593689143657684,
0.17864301800727844,
-0.04390694946050644,
-0.03127536177635193,
-0.10239233821630478,
0.09985890984535217,
0.06045155227184296,
-0.09050890058279037,
0.025585457682609558,
0.01704566180706024,
0.07839331775903702,
0.026851747184991837,
-0.06737703830003738,
0.11558223515748978,
-0.05665621906518936,
-0.14432455599308014,
-0.06044209375977516,
0.09293290972709656,
0.04543357715010643,
0.06662038713693619,
-0.011638748459517956,
0.0023645483888685703,
-0.04353378713130951,
-0.09017695486545563,
0.018018774688243866,
-0.013892876915633678,
0.035127948969602585,
0.0355663038790226,
-0.04938352480530739,
0.01542927697300911,
-0.07626110315322876,
-0.02373092621564865,
0.1995227187871933,
0.231485515832901,
-0.10526424646377563,
-0.005280467215925455,
0.026019224897027016,
-0.07955366373062134,
-0.20057065784931183,
0.07382825016975403,
0.047398459166288376,
0.013760693371295929,
0.03886720538139343,
-0.17620733380317688,
0.13552062213420868,
0.10219420492649078,
-0.0018867338076233864,
0.11118956655263901,
-0.32304176688194275,
-0.12585006654262543,
0.10949832946062088,
0.13401854038238525,
0.11137726157903671,
-0.13701069355010986,
-0.01774488389492035,
-0.016652880236506462,
-0.10845309495925903,
0.1375017762184143,
-0.11218367516994476,
0.1264999806880951,
-0.02207484468817711,
0.08391615003347397,
0.007154947612434626,
-0.05524202436208725,
0.11886292695999146,
0.03299669176340103,
0.10945820063352585,
-0.04844710975885391,
-0.05539398640394211,
0.04960281029343605,
-0.02878173440694809,
0.013832468539476395,
-0.09251520782709122,
0.01927126944065094,
-0.08728360384702682,
-0.022615475580096245,
-0.07731232792139053,
0.05082353204488754,
-0.03860398381948471,
-0.06797187775373459,
-0.04515999183058739,
0.031026950106024742,
0.03595404326915741,
-0.006345473695546389,
0.12954525649547577,
0.009522918611764908,
0.16245856881141663,
0.10650363564491272,
0.08639518916606903,
-0.07463172823190689,
-0.06675920635461807,
-0.005186889320611954,
-0.008627697825431824,
0.0594828836619854,
-0.135049507021904,
0.020472705364227295,
0.15682247281074524,
0.024287838488817215,
0.14766530692577362,
0.09198787808418274,
-0.025421183556318283,
-0.00040218979120254517,
0.06658532470464706,
-0.1524987816810608,
-0.11303617805242538,
-0.019208403304219246,
-0.10019081830978394,
-0.11879875510931015,
0.06146860122680664,
0.10162826627492905,
-0.07725005596876144,
-0.0038541185203939676,
-0.005616243928670883,
-0.0030056012328714132,
-0.0658537894487381,
0.19596898555755615,
0.08189929276704788,
0.047177769243717194,
-0.10251355916261673,
0.057310473173856735,
0.04781726002693176,
-0.04890630394220352,
-0.0007590145105496049,
0.08679723739624023,
-0.07584496587514877,
-0.04651613160967827,
0.0704900473356247,
0.2091292440891266,
-0.0893586054444313,
-0.043515849858522415,
-0.1532253921031952,
-0.1202746033668518,
0.06285014748573303,
0.1698121279478073,
0.1164172813296318,
0.01633572205901146,
-0.05523233488202095,
0.009002409875392914,
-0.1276037096977234,
0.08944763243198395,
0.0454486608505249,
0.06814064830541611,
-0.1552525758743286,
0.19672484695911407,
0.011780589818954468,
0.055862948298454285,
-0.028078770264983177,
0.023050114512443542,
-0.11338090896606445,
0.012657035142183304,
-0.10317298024892807,
-0.02269485965371132,
-0.027423297986388206,
0.006688912399113178,
-0.00011877462384290993,
-0.054342519491910934,
-0.06359311193227768,
0.006026039831340313,
-0.1119275763630867,
-0.02125752903521061,
0.04321863502264023,
0.05345725640654564,
-0.11293870210647583,
-0.035522714257240295,
0.011882747523486614,
-0.04839710891246796,
0.0674266666173935,
0.03036271594464779,
0.01697402074933052,
0.0654403567314148,
-0.17007938027381897,
0.0216713547706604,
0.06603197008371353,
0.020713888108730316,
0.0726521834731102,
-0.06662551313638687,
-0.004492159932851791,
-0.009896024130284786,
0.07287111133337021,
0.02615693397819996,
0.04661388695240021,
-0.12614667415618896,
0.005225708708167076,
-0.03942665457725525,
-0.06504091620445251,
-0.06597025692462921,
0.041805610060691833,
0.07780006527900696,
0.022560512647032738,
0.19609634578227997,
-0.0811176747083664,
0.04195390269160271,
-0.2219500094652176,
0.008380915969610214,
-0.007908070459961891,
-0.11123660951852798,
-0.0963941216468811,
-0.07295104116201401,
0.06990380585193634,
-0.06863756477832794,
0.15336152911186218,
0.05478281155228615,
0.02688126638531685,
0.026156766340136528,
-0.01036914624273777,
0.0010789562948048115,
0.021314244717359543,
0.20502901077270508,
0.04216735064983368,
-0.03351347893476486,
0.05987225100398064,
0.05677120387554169,
0.10363412648439407,
0.11534404009580612,
0.20879516005516052,
0.12810689210891724,
-0.01669846475124359,
0.0945393294095993,
0.057853225618600845,
-0.06248157098889351,
-0.13778401911258698,
0.05854183807969093,
-0.053721167147159576,
0.0936954990029335,
-0.033193040639162064,
0.1907767653465271,
0.07390537112951279,
-0.16480226814746857,
0.04034055396914482,
-0.049034133553504944,
-0.1006176769733429,
-0.12444210797548294,
-0.0444168858230114,
-0.08323399722576141,
-0.1394290179014206,
-0.0036284613888710737,
-0.12123534828424454,
-0.00023062652326188982,
0.10855002701282501,
0.007553000934422016,
-0.028276080265641212,
0.14754849672317505,
0.031906381249427795,
0.02262558788061142,
0.06185629963874817,
0.004160131793469191,
-0.025280732661485672,
-0.13284632563591003,
-0.048323988914489746,
-0.01635207049548626,
-0.009591094218194485,
0.030624650418758392,
-0.060525212436914444,
-0.049109090119600296,
0.04747637361288071,
-0.031822267919778824,
-0.10725878924131393,
0.009598834440112114,
0.021189076825976372,
0.053891558200120926,
0.026095254346728325,
0.010946900583803654,
0.00505652604624629,
-0.015545427799224854,
0.21612727642059326,
-0.07698795199394226,
-0.07292301952838898,
-0.09133568406105042,
0.25457391142845154,
0.026203498244285583,
-0.002381684957072139,
0.022264929488301277,
-0.0692046731710434,
0.006356778088957071,
0.24845321476459503,
0.24113351106643677,
-0.10209083557128906,
0.002063155174255371,
0.005386540666222572,
-0.008007694967091084,
-0.027793673798441887,
0.10733611136674881,
0.11228428781032562,
0.06658362597227097,
-0.09837423264980316,
-0.035231295973062515,
-0.054743289947509766,
-0.008530682884156704,
-0.026892706751823425,
0.03835967928171158,
0.06288773566484451,
0.018653782084584236,
-0.04654109477996826,
0.0643942728638649,
-0.08694345504045486,
-0.0811210423707962,
0.061419710516929626,
-0.20919528603553772,
-0.15798348188400269,
-0.021361786872148514,
0.10454515367746353,
0.003392850048840046,
0.07379980385303497,
-0.03373359516263008,
-0.00033476055250503123,
0.0568898469209671,
-0.021446174010634422,
-0.10673440992832184,
-0.07390060275793076,
0.08734328299760818,
-0.11699157953262329,
0.19343653321266174,
-0.052960291504859924,
0.06736491620540619,
0.12963174283504486,
0.05639595165848732,
-0.06109137460589409,
0.06447287648916245,
0.03861542418599129,
-0.05846170336008072,
0.031629566103219986,
0.07515596598386765,
-0.03316196799278259,
0.0722062811255455,
0.04893116652965546,
-0.13736364245414734,
0.029363475739955902,
-0.07103103399276733,
-0.06018224358558655,
-0.05051732063293457,
-0.04084402695298195,
-0.05114667862653732,
0.12673650681972504,
0.22431668639183044,
-0.026565589010715485,
0.006210958585143089,
-0.06552431732416153,
-0.008117343299090862,
0.06160584092140198,
0.05171484127640724,
-0.0638611912727356,
-0.23273693025112152,
0.011224966496229172,
0.05697038024663925,
-0.013584474101662636,
-0.2518930435180664,
-0.07719819247722626,
-0.007188118062913418,
-0.0740160271525383,
-0.08013597130775452,
0.08560343831777573,
0.11117090284824371,
0.05491456016898155,
-0.05599109083414078,
-0.05876316875219345,
-0.07226286083459854,
0.15498897433280945,
-0.14144767820835114,
-0.09858812391757965
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wavlm-base-english
This model is a fine-tuned version of [microsoft/wavlm-base](https://huggingface.co/microsoft/wavlm-base) on the english_ASR - CLEAN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0955
- Wer: 0.0773
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1.0
- mixed_precision_training: Native AMP
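The training script itself is not included here; as a rough guide, these values would correspond to a `TrainingArguments` configuration roughly as follows. The output directory and the evaluation cadence are assumptions, and `fp16=True` stands in for the "Native AMP" mixed-precision setting.

```python
# Illustrative sketch: TrainingArguments roughly matching the values listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wavlm-base-english",  # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1.0,
    fp16=True,                        # "Native AMP" mixed-precision training
    evaluation_strategy="steps",      # assumption, consistent with the per-step results below
    eval_steps=300,                   # assumption
)
```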
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.8664 | 0.17 | 300 | 2.8439 | 1.0 |
| 0.5009 | 0.34 | 600 | 0.2709 | 0.2162 |
| 0.2056 | 0.5 | 900 | 0.1934 | 0.1602 |
| 0.1648 | 0.67 | 1200 | 0.1576 | 0.1306 |
| 0.1922 | 0.84 | 1500 | 0.1358 | 0.1114 |
| 0.093 | 1.01 | 1800 | 0.1277 | 0.1035 |
| 0.0652 | 1.18 | 2100 | 0.1251 | 0.1005 |
| 0.0848 | 1.35 | 2400 | 0.1188 | 0.0964 |
| 0.0706 | 1.51 | 2700 | 0.1091 | 0.0905 |
| 0.0846 | 1.68 | 3000 | 0.1018 | 0.0840 |
| 0.0684 | 1.85 | 3300 | 0.0978 | 0.0809 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.1
- Datasets 1.18.0
- Tokenizers 0.10.3
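A minimal inference sketch (not generated by the trainer): it assumes the repository `anjulRajendraSharma/WavLm-base-en` ships the matching CTC tokenizer and feature-extractor files, and it pulls a tiny public LibriSpeech test clip purely for illustration; any 16 kHz mono waveform would do.

```python
# Minimal sketch: greedy CTC decoding with the fine-tuned WavLM checkpoint.
import torch
from datasets import load_dataset
from transformers import Wav2Vec2Processor, WavLMForCTC

model_id = "anjulRajendraSharma/WavLm-base-en"  # repository this card accompanies
processor = Wav2Vec2Processor.from_pretrained(model_id)  # assumes processor files are present
model = WavLMForCTC.from_pretrained(model_id)

# Any 16 kHz mono waveform works; a small public test split is used here for illustration.
sample = load_dataset("hf-internal-testing/librispeech_asr_dummy", "clean", split="validation")[0]
inputs = processor(sample["audio"]["array"], sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```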
|
{"tags": ["automatic-speech-recognition", "english_asr", "generated_from_trainer"], "model-index": [{"name": "wavlm-base-english", "results": []}]}
|
automatic-speech-recognition
|
anjulRajendraSharma/WavLm-base-en
|
[
"transformers",
"pytorch",
"tensorboard",
"wavlm",
"automatic-speech-recognition",
"english_asr",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wavlm #automatic-speech-recognition #english_asr #generated_from_trainer #endpoints_compatible #region-us
|
wavlm-base-english
==================
This model is a fine-tuned version of microsoft/wavlm-base on the english\_ASR - CLEAN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0955
* Wer: 0.0773
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 1.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.9.1
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wavlm #automatic-speech-recognition #english_asr #generated_from_trainer #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
52,
130,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wavlm #automatic-speech-recognition #english_asr #generated_from_trainer #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.11998158693313599,
0.04163835570216179,
-0.002532340120524168,
0.07453712075948715,
0.1470426470041275,
-0.0006227655103430152,
0.09204435348510742,
0.12833712995052338,
-0.059435419738292694,
0.05654207244515419,
0.114042729139328,
0.14007169008255005,
0.02805384062230587,
0.09174080938100815,
-0.04367753118276596,
-0.2907477915287018,
-0.0038359840400516987,
0.016593236476182938,
-0.05149126425385475,
0.11086759716272354,
0.09236364811658859,
-0.12970536947250366,
0.0540427565574646,
0.015484705567359924,
-0.13736559450626373,
0.020547842606902122,
0.012963888235390186,
-0.08559231460094452,
0.1300203949213028,
0.020728329196572304,
0.0963023230433464,
0.012599548324942589,
0.07799196988344193,
-0.23346303403377533,
0.010769392363727093,
0.03227852284908295,
0.04652915149927139,
0.06125182658433914,
0.07571414858102798,
-0.036594364792108536,
0.11318966001272202,
-0.1174011081457138,
0.06626462936401367,
0.036252815276384354,
-0.11542708426713943,
-0.2642792761325836,
-0.049505479633808136,
0.01394646242260933,
0.0533268079161644,
0.10278477519750595,
-0.02330102026462555,
0.09885250777006149,
-0.08609762042760849,
0.11662919074296951,
0.24172165989875793,
-0.2664618194103241,
-0.04250776767730713,
-0.03741735592484474,
0.024110740050673485,
0.08941745012998581,
-0.11998509615659714,
-0.016020435839891434,
0.026989897713065147,
0.05289561673998833,
0.10579681396484375,
-0.025397993624210358,
-0.07384190708398819,
0.01064884103834629,
-0.15105384588241577,
-0.029967186972498894,
0.10252679139375687,
0.026961516588926315,
-0.03246214613318443,
-0.08950458467006683,
-0.03718537837266922,
-0.15612639486789703,
-0.06565035134553909,
-0.005900940857827663,
0.027607807889580727,
-0.03960608318448067,
-0.08182232081890106,
-0.020669156685471535,
-0.07996556907892227,
-0.06634156405925751,
-0.03461482748389244,
0.18282592296600342,
0.039254605770111084,
-0.006011317949742079,
-0.030829407274723053,
0.06690853089094162,
0.011014251969754696,
-0.13401751220226288,
0.01887364685535431,
0.03979991376399994,
-0.03272539749741554,
-0.009543301537632942,
-0.06589335948228836,
-0.0532478503882885,
0.016599154099822044,
0.06514106690883636,
-0.09422759711742401,
0.0788780078291893,
0.007012942340224981,
0.030039895325899124,
-0.10821480304002762,
0.21789363026618958,
-0.05108920484781265,
-0.009958796203136444,
-0.02599724940955639,
0.06981737166643143,
-0.008696301840245724,
-0.01883615180850029,
-0.09168125689029694,
0.006074585020542145,
0.11805877834558487,
0.021466361358761787,
-0.05911409854888916,
0.051620811223983765,
-0.035990502685308456,
-0.015977539122104645,
-0.037088967859745026,
-0.14346227049827576,
0.0482996329665184,
0.033610474318265915,
-0.06523275375366211,
0.043218474835157394,
0.015388667583465576,
0.0013053083093836904,
-0.0615902878344059,
0.09164489805698395,
-0.06688928604125977,
0.04857346788048744,
-0.06282931566238403,
-0.1283837854862213,
0.005803560838103294,
-0.08322475850582123,
0.011825286783277988,
-0.09618104994297028,
-0.11752592772245407,
-0.022086238488554955,
0.02765611559152603,
-0.03472535312175751,
-0.004172281827777624,
-0.08417142182588577,
-0.08164291083812714,
0.035271383821964264,
-0.03061952255666256,
0.13092872500419617,
-0.057592663913965225,
0.10848447680473328,
0.04468162730336189,
0.0778765007853508,
-0.01132360752671957,
0.07890886813402176,
-0.06859254837036133,
0.006404153537005186,
-0.13627415895462036,
0.11274529248476028,
-0.0869605615735054,
0.03226960822939873,
-0.10979652404785156,
-0.11898405104875565,
-0.01854320801794529,
0.02110060676932335,
0.088826484978199,
0.09237752109766006,
-0.16933484375476837,
-0.10607761889696121,
0.1910850554704666,
-0.06451275199651718,
-0.04481879621744156,
0.1351730227470398,
-0.04070369899272919,
0.0025219167582690716,
0.07185325771570206,
0.24636167287826538,
0.058359164744615555,
-0.11369559168815613,
0.020800955593585968,
-0.01588575728237629,
0.06658358871936798,
0.0005163674359209836,
0.04034335911273956,
-0.02900022454559803,
0.031186504289507866,
0.02797418273985386,
-0.008827661164104939,
0.06734974682331085,
-0.09273618459701538,
-0.08984125405550003,
-0.04383203387260437,
-0.10576362162828445,
0.042053770273923874,
0.061097774654626846,
0.06401940435171127,
-0.08924935758113861,
-0.0900711789727211,
0.04404124245047569,
0.06301183998584747,
-0.09685917943716049,
0.05197065323591232,
-0.08191177248954773,
0.05276383459568024,
-0.029231403023004532,
-0.016221314668655396,
-0.19744537770748138,
0.03146635368466377,
0.01539506483823061,
0.009136722423136234,
0.040053606033325195,
-0.0035726868081837893,
0.09403330832719803,
0.04036545380949974,
-0.05406863987445831,
-0.032854508608579636,
-0.013037359341979027,
0.006673816125839949,
-0.10282808542251587,
-0.19798223674297333,
-0.02296856790781021,
-0.03509178385138512,
0.09431236237287521,
-0.18861518800258636,
0.011683163233101368,
0.027913058176636696,
0.06833603978157043,
0.02231740392744541,
-0.034388236701488495,
-0.006192964501678944,
0.10150706768035889,
-0.008702677674591541,
-0.04608242213726044,
0.06207280978560448,
-0.016800345852971077,
-0.10260186344385147,
0.011694320477545261,
-0.14812177419662476,
0.10064385831356049,
0.14088574051856995,
-0.08238600194454193,
-0.07766525447368622,
0.009627871215343475,
-0.04875809699296951,
-0.028758829459547997,
-0.036274801939725876,
0.018506517633795738,
0.228802889585495,
-0.002744724042713642,
0.14716829359531403,
-0.06532549113035202,
-0.0189987625926733,
0.03206583112478256,
-0.02432536520063877,
0.01118962001055479,
0.15463177859783173,
0.02554449252784252,
-0.025028595700860023,
0.09826070070266724,
0.10352136939764023,
-0.10440899431705475,
0.15020868182182312,
-0.04987797886133194,
-0.10051663219928741,
0.004095674026757479,
-0.006813181564211845,
0.0035113433841615915,
0.08674632757902145,
-0.15794843435287476,
-0.0292816124856472,
0.022882871329784393,
0.04049661383032799,
0.021121442317962646,
-0.22175127267837524,
-0.007079440634697676,
0.029795648530125618,
-0.0804755762219429,
-0.03240428864955902,
0.0033691979479044676,
0.021314237266778946,
0.10259048640727997,
-0.006798440124839544,
-0.08207064121961594,
-0.00005717980093322694,
-0.01889782026410103,
-0.0871141254901886,
0.1817876547574997,
-0.09758219867944717,
-0.16977347433567047,
-0.10301809757947922,
-0.06717758625745773,
-0.03170153498649597,
0.0031479711178690195,
0.05907163396477699,
-0.12112843245267868,
-0.029240909963846207,
-0.06719610095024109,
0.039518244564533234,
-0.03778156265616417,
0.03189520910382271,
0.010884154587984085,
-0.011318795382976532,
0.06497877836227417,
-0.11105329543352127,
-0.009141365997493267,
-0.05846146121621132,
-0.025704529136419296,
0.04240855947136879,
0.046266861259937286,
0.09713945537805557,
0.1606743037700653,
-0.013058501295745373,
0.031109528616070747,
-0.03301915526390076,
0.21497902274131775,
-0.07736773788928986,
-0.04076753184199333,
0.09922386705875397,
-0.01847119629383087,
0.039068784564733505,
0.09625834971666336,
0.0567852258682251,
-0.09169716387987137,
-0.01084692869335413,
0.02175862528383732,
-0.040363412350416183,
-0.22057311236858368,
-0.0552191324532032,
-0.05940880626440048,
-0.02914823219180107,
0.1054290309548378,
0.027613459154963493,
-0.0020494135096669197,
0.02148035168647766,
0.04963405802845955,
0.01370504405349493,
-0.013521118089556694,
0.04228420928120613,
0.13736826181411743,
0.024551721289753914,
0.12268680334091187,
-0.02718352898955345,
-0.05978541076183319,
0.023652123287320137,
-0.01051757950335741,
0.21311764419078827,
0.022052135318517685,
0.11878958344459534,
0.03267250955104828,
0.16844865679740906,
0.00970893632620573,
0.07563219219446182,
0.0074583422392606735,
-0.033222854137420654,
0.016752392053604126,
-0.051408614963293076,
-0.05202333256602287,
0.031179439276456833,
0.053513914346694946,
0.021354416385293007,
-0.12226515263319016,
-0.02862081490457058,
0.04134034365415573,
0.2858757972717285,
0.04500506818294525,
-0.2933906018733978,
-0.08972669392824173,
-0.008488542400300503,
-0.08939678966999054,
-0.018439549952745438,
0.04200068488717079,
0.09884311258792877,
-0.07907745242118835,
0.054470788687467575,
-0.05103391036391258,
0.0857473611831665,
-0.0522688627243042,
0.03464842960238457,
0.017931587994098663,
0.08414259552955627,
0.013450159691274166,
0.047020331025123596,
-0.31936711072921753,
0.2992134988307953,
0.017131373286247253,
0.08345726132392883,
-0.05633071810007095,
-0.015300001949071884,
0.028523141518235207,
-0.0078080217353999615,
0.08289085328578949,
-0.009722167626023293,
-0.08117790520191193,
-0.18760503828525543,
-0.06436292827129364,
0.026978902518749237,
0.15414710342884064,
0.02102554589509964,
0.10736607760190964,
-0.023086586967110634,
-0.003892855951562524,
0.07085947692394257,
-0.08818413317203522,
-0.09726419299840927,
-0.095626100897789,
-0.005406948272138834,
0.08958418667316437,
0.01408030092716217,
-0.0484631210565567,
-0.10953710973262787,
-0.13494904339313507,
0.11584111303091049,
-0.0718645453453064,
-0.018827831372618675,
-0.123893141746521,
0.07466824352741241,
0.12396661937236786,
-0.07752233743667603,
0.028704838827252388,
0.030470028519630432,
0.06234534829854965,
0.018467258661985397,
-0.053892962634563446,
0.10387034714221954,
-0.06474635004997253,
-0.1727917343378067,
-0.017885494977235794,
0.14876557886600494,
0.05091026797890663,
0.07379410415887833,
-0.016954675316810608,
0.04073470085859299,
-0.032474201172590256,
-0.07844270020723343,
0.07470544427633286,
0.04548806697130203,
-0.008814245462417603,
0.05936620756983757,
-0.062117744237184525,
-0.019068708643317223,
-0.09017277508974075,
-0.02660028450191021,
0.20749308168888092,
0.22409434616565704,
-0.08598675578832626,
0.06530939787626266,
0.060406360775232315,
-0.0647466778755188,
-0.1946263164281845,
0.025197679176926613,
0.08700767904520035,
0.023418234661221504,
0.023361820727586746,
-0.17956651747226715,
0.071558877825737,
0.05069415643811226,
-0.0019688275642693043,
0.08719142526388168,
-0.30321377515792847,
-0.14710135757923126,
0.14350654184818268,
0.11855833232402802,
0.08339009433984756,
-0.1551773101091385,
-0.036490049213171005,
-0.014680902473628521,
-0.08189891278743744,
0.07286092638969421,
-0.030522093176841736,
0.14625117182731628,
-0.020952614024281502,
0.12628675997257233,
0.021253937855362892,
-0.056846942752599716,
0.11226662993431091,
0.02428676187992096,
0.04831443727016449,
-0.02807535044848919,
0.011524666100740433,
0.007492984179407358,
-0.022820325568318367,
0.04822926968336105,
-0.05790156498551369,
0.02194518968462944,
-0.07946830242872238,
-0.03841802850365639,
-0.10954103618860245,
0.033888377249240875,
-0.0007265887106768787,
-0.03898947685956955,
-0.010162408463656902,
0.008793751709163189,
0.06504692137241364,
-0.005666190292686224,
0.10130519419908524,
-0.06756848096847534,
0.12279968708753586,
0.11314909160137177,
0.12121441960334778,
-0.061313703656196594,
-0.07111066579818726,
0.0007582858670502901,
-0.016363373026251793,
0.05652259662747383,
-0.10483234375715256,
0.043371398001909256,
0.1420988142490387,
0.049600474536418915,
0.1417151391506195,
0.0764908492565155,
-0.0484144501388073,
0.016751805320382118,
0.0461842305958271,
-0.12463750690221786,
-0.14933857321739197,
-0.002481906209141016,
-0.029612433165311813,
-0.0926567018032074,
0.027822941541671753,
0.11235175281763077,
-0.06388284265995026,
-0.003983058966696262,
-0.0231733750551939,
0.01879633590579033,
-0.07234236598014832,
0.22779366374015808,
0.08160041272640228,
0.04681781679391861,
-0.10767946392297745,
0.072519451379776,
0.0297352634370327,
-0.156979501247406,
0.06290553510189056,
0.09338133782148361,
-0.0705217719078064,
-0.0395837239921093,
0.021711163222789764,
0.1273939609527588,
-0.037506237626075745,
-0.05587111413478851,
-0.11409370601177216,
-0.13519929349422455,
0.09684748202562332,
0.15274940431118011,
0.06654511392116547,
0.01122418325394392,
-0.076797716319561,
0.013512063771486282,
-0.1209079697728157,
0.08543479442596436,
0.05498049780726433,
0.0453963428735733,
-0.11799059063196182,
0.1591956466436386,
0.0070688314735889435,
0.04013863578438759,
-0.024800123646855354,
-0.0024473881348967552,
-0.09983040392398834,
0.04301941394805908,
-0.11980690062046051,
-0.016298146918416023,
-0.03950633108615875,
0.007251098286360502,
0.007718921639025211,
-0.08356009423732758,
-0.05149896442890167,
0.0239020474255085,
-0.12348625808954239,
-0.02694845385849476,
0.0018106942297890782,
0.03590220957994461,
-0.11388847976922989,
-0.0322745144367218,
0.021934974938631058,
-0.06530527770519257,
0.06884729117155075,
0.0794583261013031,
-0.052944861352443695,
0.08160001039505005,
-0.13762715458869934,
-0.015433699823915958,
0.06730056554079056,
0.0012909055221825838,
0.04996560513973236,
-0.11632642894983292,
-0.01219093892723322,
0.013849628157913685,
0.07739196717739105,
0.03507186472415924,
0.1124134510755539,
-0.12504898011684418,
0.017734255641698837,
-0.05012917146086693,
-0.06496607512235641,
-0.07814478874206543,
0.053501639515161514,
0.07388566434383392,
0.040734920650720596,
0.1665755957365036,
-0.10835888236761093,
0.051257044076919556,
-0.18463672697544098,
0.002423262922093272,
-0.03330413997173309,
-0.09900689125061035,
-0.05583468824625015,
-0.039862558245658875,
0.0976385623216629,
-0.05793595686554909,
0.13453109562397003,
-0.0006930747185833752,
0.05002014338970184,
0.029245974496006966,
-0.09760068356990814,
-0.026460107415914536,
0.03335731849074364,
0.2411070317029953,
0.03813379257917404,
-0.041235532611608505,
0.05609863996505737,
0.04093090072274208,
0.09008461236953735,
0.20178799331188202,
0.18372276425361633,
0.18152481317520142,
-0.0023905341513454914,
0.11359725892543793,
0.03810369223356247,
-0.0759560763835907,
-0.13318447768688202,
0.0774720162153244,
-0.05246661603450775,
0.11562809348106384,
-0.03727570176124573,
0.22242453694343567,
0.09699341654777527,
-0.1603240966796875,
0.08715705573558807,
-0.03414086624979973,
-0.1027391329407692,
-0.13086307048797607,
-0.03175564110279083,
-0.082534059882164,
-0.16630160808563232,
0.014201851561665535,
-0.11034226417541504,
0.06427384912967682,
0.057299498468637466,
0.03594272583723068,
0.012908383272588253,
0.16890905797481537,
0.014663507230579853,
0.0019459545146673918,
0.10636401176452637,
-0.010090297088027,
-0.025514712557196617,
-0.07684583216905594,
-0.06598733365535736,
0.029902644455432892,
-0.031601641327142715,
0.04240717366337776,
-0.030615709722042084,
-0.11799801886081696,
0.035881757736206055,
-0.05057896673679352,
-0.0954534113407135,
0.014200318604707718,
0.027705533429980278,
0.08085718005895615,
0.07233326882123947,
0.03487979248166084,
-0.043873947113752365,
-0.021555524319410324,
0.21555186808109283,
-0.09311560541391373,
-0.10295198857784271,
-0.09809660911560059,
0.2718040347099304,
0.05268971249461174,
0.004324762616306543,
-0.0026161191053688526,
-0.06375130265951157,
0.00787526648491621,
0.25201207399368286,
0.18092527985572815,
-0.05499201640486717,
0.00652813958004117,
-0.011869695968925953,
0.002491292078047991,
-0.016007250174880028,
0.08390436321496964,
0.14495240151882172,
0.08899461477994919,
-0.07718940824270248,
-0.05774896219372749,
-0.05235978215932846,
-0.03928554058074951,
-0.061132531613111496,
0.08419179171323776,
0.029202783480286598,
-0.011579962447285652,
-0.06132054328918457,
0.07504217326641083,
-0.08996940404176712,
-0.12341266870498657,
0.01802864857017994,
-0.19796575605869293,
-0.1371220052242279,
-0.021161988377571106,
0.07047933340072632,
0.034721147269010544,
0.04301748797297478,
-0.01586991548538208,
-0.009172268211841583,
0.06118795648217201,
-0.005612165667116642,
-0.0785558819770813,
-0.03872337192296982,
0.07737231254577637,
-0.12655359506607056,
0.11419066041707993,
-0.0329565703868866,
0.06874293088912964,
0.10999137908220291,
0.10147783160209656,
-0.046400610357522964,
0.0836714655160904,
0.03979054093360901,
-0.09519223868846893,
0.04440470039844513,
0.18167716264724731,
-0.032016854733228683,
0.07600022107362747,
0.04080798849463463,
-0.12378472834825516,
0.033783506602048874,
-0.08344094455242157,
-0.07281014323234558,
-0.034677278250455856,
-0.042584266513586044,
-0.04529016837477684,
0.12536205351352692,
0.1976609230041504,
-0.03294936195015907,
0.007462077774107456,
-0.07151228189468384,
0.0030124036129564047,
0.028927847743034363,
0.04767664894461632,
-0.08201234042644501,
-0.25852611660957336,
-0.00010463649232406169,
0.04014723747968674,
-0.017571622505784035,
-0.24040311574935913,
-0.09385418146848679,
0.025227000936865807,
-0.062015350908041,
-0.06645318120718002,
0.11024084687232971,
0.07651247829198837,
0.04431873932480812,
-0.04354936257004738,
-0.0969240590929985,
-0.012340723536908627,
0.19149592518806458,
-0.1866261512041092,
-0.06065918132662773
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wavlm-libri-clean-100h-base
This model is a fine-tuned version of [microsoft/wavlm-base](https://huggingface.co/microsoft/wavlm-base) on the LIBRISPEECH_ASR - CLEAN dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0955
- Wer: 0.0773
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.8664 | 0.17 | 300 | 2.8439 | 1.0 |
| 0.5009 | 0.34 | 600 | 0.2709 | 0.2162 |
| 0.2056 | 0.5 | 900 | 0.1934 | 0.1602 |
| 0.1648 | 0.67 | 1200 | 0.1576 | 0.1306 |
| 0.1922 | 0.84 | 1500 | 0.1358 | 0.1114 |
| 0.093 | 1.01 | 1800 | 0.1277 | 0.1035 |
| 0.0652 | 1.18 | 2100 | 0.1251 | 0.1005 |
| 0.0848 | 1.35 | 2400 | 0.1188 | 0.0964 |
| 0.0706 | 1.51 | 2700 | 0.1091 | 0.0905 |
| 0.0846 | 1.68 | 3000 | 0.1018 | 0.0840 |
| 0.0684 | 1.85 | 3300 | 0.0978 | 0.0809 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.1
- Datasets 1.18.0
- Tokenizers 0.10.3
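A minimal inference sketch (not generated by the trainer), this time via the high-level pipeline API: it assumes the repository `anjulRajendraSharma/wavlm-base-libri-clean-100` contains the tokenizer and feature-extractor files the pipeline needs, and `sample.flac` stands in for any 16 kHz audio file.

```python
# Minimal sketch: transcription through the automatic-speech-recognition pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="anjulRajendraSharma/wavlm-base-libri-clean-100",  # repository this card accompanies
)

# Path to any audio file readable by ffmpeg; the pipeline decodes it to the expected sampling rate.
result = asr("sample.flac")
print(result["text"])
```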
|
{"tags": ["automatic-speech-recognition", "librispeech_asr", "generated_from_trainer"], "model-index": [{"name": "wavlm-libri-clean-100h-base", "results": []}]}
|
automatic-speech-recognition
|
anjulRajendraSharma/wavlm-base-libri-clean-100
|
[
"transformers",
"pytorch",
"tensorboard",
"wavlm",
"automatic-speech-recognition",
"librispeech_asr",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #wavlm #automatic-speech-recognition #librispeech_asr #generated_from_trainer #endpoints_compatible #region-us
|
wavlm-libri-clean-100h-base
===========================
This model is a fine-tuned version of microsoft/wavlm-base on the LIBRISPEECH\_ASR - CLEAN dataset.
It achieves the following results on the evaluation set:
* Loss: 0.0955
* Wer: 0.0773
Model description
-----------------
More information needed
Intended uses & limitations
---------------------------
More information needed
Training and evaluation data
----------------------------
More information needed
Training procedure
------------------
### Training hyperparameters
The following hyperparameters were used during training:
* learning\_rate: 0.0003
* train\_batch\_size: 16
* eval\_batch\_size: 16
* seed: 42
* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
* lr\_scheduler\_type: linear
* lr\_scheduler\_warmup\_steps: 500
* num\_epochs: 1.0
* mixed\_precision\_training: Native AMP
### Training results
### Framework versions
* Transformers 4.15.0
* Pytorch 1.9.1
* Datasets 1.18.0
* Tokenizers 0.10.3
|
[
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #wavlm #automatic-speech-recognition #librispeech_asr #generated_from_trainer #endpoints_compatible #region-us \n",
"### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP",
"### Training results",
"### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
53,
130,
4,
33
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #wavlm #automatic-speech-recognition #librispeech_asr #generated_from_trainer #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* num\\_epochs: 1.0\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.15.0\n* Pytorch 1.9.1\n* Datasets 1.18.0\n* Tokenizers 0.10.3"
] |
[
-0.1113317608833313,
0.05163652077317238,
-0.003161116736009717,
0.07574845105409622,
0.13238266110420227,
-0.00946098193526268,
0.09934161603450775,
0.13532514870166779,
-0.07627210021018982,
0.06872375309467316,
0.12121498584747314,
0.14556971192359924,
0.022185176610946655,
0.11784261465072632,
-0.052558835595846176,
-0.2855158746242523,
0.006716356612741947,
0.029094595462083817,
-0.025811845436692238,
0.11374841630458832,
0.08712143450975418,
-0.12930037081241608,
0.05435663089156151,
0.01631789654493332,
-0.14340437948703766,
0.019118711352348328,
0.014291436411440372,
-0.08883298933506012,
0.11959568411111832,
0.015060145407915115,
0.08315783739089966,
0.022718163207173347,
0.07097861915826797,
-0.22207435965538025,
0.010232403874397278,
0.035953812301158905,
0.03883221000432968,
0.06335027515888214,
0.06256655603647232,
-0.049659185111522675,
0.13054166734218597,
-0.10844804346561432,
0.07213439047336578,
0.03407984972000122,
-0.11789747327566147,
-0.27217739820480347,
-0.06176833435893059,
0.022847220301628113,
0.05364687368273735,
0.0951579362154007,
-0.02077983133494854,
0.12109668552875519,
-0.08146371692419052,
0.11965113878250122,
0.26404696702957153,
-0.2715003788471222,
-0.04285112023353577,
-0.035993363708257675,
0.025310223922133446,
0.07512804865837097,
-0.11463954299688339,
-0.01289327722042799,
0.03218818083405495,
0.047955673187971115,
0.12225513905286789,
-0.02615693397819996,
-0.05951200798153877,
0.0033278546761721373,
-0.14888738095760345,
-0.039558615535497665,
0.0995466485619545,
0.017725074663758278,
-0.0368439182639122,
-0.0918428972363472,
-0.03974641487002373,
-0.16097277402877808,
-0.06269177049398422,
-0.0017880817176774144,
0.0322030708193779,
-0.047788795083761215,
-0.09756151586771011,
-0.010766642168164253,
-0.07026049494743347,
-0.06910639256238937,
-0.02709275111556053,
0.18747609853744507,
0.040355533361434937,
-0.003692193189635873,
-0.03818590193986893,
0.06916003674268723,
-0.0023696545977145433,
-0.1387982964515686,
-0.0028600068762898445,
0.04234529659152031,
-0.023084411397576332,
-0.01736976020038128,
-0.0628172904253006,
-0.04475419223308563,
0.017489325255155563,
0.11255679279565811,
-0.0944342091679573,
0.084541454911232,
-0.004056765232235193,
0.02289719693362713,
-0.10413864254951477,
0.20570024847984314,
-0.03474551439285278,
0.0066858818754553795,
-0.011115510948002338,
0.06962998956441879,
0.011938817799091339,
-0.026016192510724068,
-0.08662262558937073,
0.018575990572571754,
0.11420704424381256,
0.029154745861887932,
-0.05855061858892441,
0.05076778307557106,
-0.03696277365088463,
-0.010554107837378979,
-0.03675362467765808,
-0.14061188697814941,
0.048959098756313324,
0.02875867486000061,
-0.06708145886659622,
0.02223053015768528,
0.018125196918845177,
-0.00619215052574873,
-0.05887928232550621,
0.10290293395519257,
-0.06834173947572708,
0.04650231823325157,
-0.06295086443424225,
-0.12881402671337128,
0.01093428023159504,
-0.09502141177654266,
0.006754789035767317,
-0.0934346467256546,
-0.10671953856945038,
-0.023529941216111183,
0.026935230940580368,
-0.03595489636063576,
-0.008962621912360191,
-0.08161810040473938,
-0.08368080854415894,
0.03641142696142197,
-0.028068508952856064,
0.11375845968723297,
-0.06317763775587082,
0.10487544536590576,
0.054407428950071335,
0.08519937098026276,
-0.003110807156190276,
0.07098329067230225,
-0.07192663103342056,
0.015363933518528938,
-0.16445353627204895,
0.10409232974052429,
-0.08766636252403259,
0.041040483862161636,
-0.10212335735559464,
-0.12053170055150986,
-0.004064782988280058,
0.011339926160871983,
0.0960780680179596,
0.09802553057670593,
-0.1601274460554123,
-0.10648305714130402,
0.1951880007982254,
-0.0803743377327919,
-0.05342980846762657,
0.12519356608390808,
-0.04276081547141075,
-0.004263666924089193,
0.07386016845703125,
0.26377156376838684,
0.05857149884104729,
-0.11405106633901596,
0.012809636071324348,
-0.031173091381788254,
0.06535962969064713,
-0.012478647753596306,
0.04144275560975075,
-0.019196655601263046,
0.040286168456077576,
0.025506118312478065,
0.00023662399325985461,
0.049647118896245956,
-0.09429094195365906,
-0.08316663652658463,
-0.049628205597400665,
-0.09648940712213516,
0.031228583306074142,
0.05498069152235985,
0.05787576735019684,
-0.10568740963935852,
-0.09090667217969894,
0.04891885817050934,
0.06865889579057693,
-0.09979831427335739,
0.0605480819940567,
-0.09034640341997147,
0.06492286175489426,
-0.01815994456410408,
-0.010731087997555733,
-0.20039819180965424,
0.04078787565231323,
0.019075334072113037,
-0.007315457798540592,
0.04455588385462761,
-0.034108687192201614,
0.08771773427724838,
0.04525135084986687,
-0.04516637697815895,
-0.0314689576625824,
-0.01267380453646183,
0.003148340852931142,
-0.10202339291572571,
-0.20360811054706573,
-0.02902616187930107,
-0.038446325808763504,
0.08943213522434235,
-0.17837998270988464,
0.02056678757071495,
0.051577646285295486,
0.07702798396348953,
0.028983736410737038,
-0.0377034954726696,
-0.0006434252136386931,
0.10127804428339005,
-0.01265986543148756,
-0.0532362200319767,
0.06033733859658241,
-0.007156952749937773,
-0.09214932471513748,
0.004622810520231724,
-0.1533423513174057,
0.1094784289598465,
0.14189234375953674,
-0.04726051539182663,
-0.08050058037042618,
0.0012562632327899337,
-0.04995869845151901,
-0.023834655061364174,
-0.03568723425269127,
0.03419147804379463,
0.2150026559829712,
-0.002816117135807872,
0.1479821801185608,
-0.07515877485275269,
-0.03463711962103844,
0.036949895322322845,
-0.02629878744482994,
0.011352133005857468,
0.14857207238674164,
0.028348367661237717,
-0.02490920014679432,
0.1021636426448822,
0.0872596949338913,
-0.09883295744657516,
0.14247363805770874,
-0.054592885076999664,
-0.09227647632360458,
-0.007245865184813738,
-0.004392691422253847,
0.01872623711824417,
0.0928841233253479,
-0.15465882420539856,
-0.03722258657217026,
0.01991516351699829,
0.03031671606004238,
0.017681177705526352,
-0.2178957164287567,
-0.0008988460758700967,
0.03674694523215294,
-0.0759037435054779,
-0.04536103084683418,
0.00006308064621407539,
0.01781461015343666,
0.1037377268075943,
-0.004557745065540075,
-0.07958042621612549,
-0.0023875767365098,
-0.011094961315393448,
-0.08327910304069519,
0.183299258351326,
-0.0958600714802742,
-0.16834262013435364,
-0.09821464121341705,
-0.07828568667173386,
-0.04286946356296539,
0.003283722558990121,
0.07204441726207733,
-0.1133328378200531,
-0.03154373914003372,
-0.07673230767250061,
0.017739931121468544,
-0.029121877625584602,
0.04262087494134903,
0.018607383593916893,
-0.009504263289272785,
0.06487862765789032,
-0.11680767685174942,
-0.019023757427930832,
-0.06561015546321869,
-0.016585158184170723,
0.04774445295333862,
0.052170298993587494,
0.10199679434299469,
0.1565740704536438,
-0.01529640518128872,
0.043300777673721313,
-0.039099499583244324,
0.20994071662425995,
-0.07132921367883682,
-0.042972054332494736,
0.10690539330244064,
-0.01595720835030079,
0.04875625669956207,
0.09061220288276672,
0.052712179720401764,
-0.10298807919025421,
-0.009405550546944141,
0.02148417755961418,
-0.0507967509329319,
-0.21108636260032654,
-0.049739982932806015,
-0.05724102258682251,
-0.01860436610877514,
0.11070157587528229,
0.027766967192292213,
0.007899043150246143,
0.02213316224515438,
0.04477156326174736,
0.01263426337391138,
-0.011227316223084927,
0.06478338688611984,
0.13921159505844116,
0.029768772423267365,
0.130473792552948,
-0.036398280411958694,
-0.05491683632135391,
0.024207452312111855,
-0.014677624218165874,
0.22403432428836823,
0.01339814718812704,
0.12893681228160858,
0.04252561926841736,
0.16244195401668549,
0.017359040677547455,
0.07523608207702637,
-0.006470642052590847,
-0.032418541610240936,
0.017287980765104294,
-0.056354496628046036,
-0.048078492283821106,
0.025388946756720543,
0.02928055450320244,
0.02768661454319954,
-0.128167524933815,
-0.022634444758296013,
0.04239867627620697,
0.2964293956756592,
0.04508870840072632,
-0.3040156960487366,
-0.09466709941625595,
-0.00798326637595892,
-0.07916539907455444,
-0.011935669928789139,
0.04129181429743767,
0.09775243699550629,
-0.07065741717815399,
0.0644608736038208,
-0.050035275518894196,
0.09036726504564285,
-0.0556558258831501,
0.03821232169866562,
0.010643146932125092,
0.0887405276298523,
0.004079063888639212,
0.04236188158392906,
-0.31920844316482544,
0.28702297806739807,
0.01609787717461586,
0.08674462884664536,
-0.05775201693177223,
-0.007856632582843304,
0.02533838339149952,
-0.00011264794738963246,
0.08019346743822098,
-0.013339872471988201,
-0.10005201399326324,
-0.1904822736978531,
-0.06657741963863373,
0.023514196276664734,
0.15078620612621307,
0.020766502246260643,
0.11420189589262009,
-0.016603823751211166,
-0.005675575230270624,
0.0643385797739029,
-0.098627470433712,
-0.08757264912128448,
-0.09945240616798401,
-0.0020922734402120113,
0.08492617309093475,
0.0024354388006031513,
-0.05421360209584236,
-0.10718604177236557,
-0.09621249884366989,
0.1336240917444229,
-0.05724387615919113,
-0.025762103497982025,
-0.12309523671865463,
0.060951877385377884,
0.11144479364156723,
-0.07968232780694962,
0.035693880170583725,
0.024579357355833054,
0.07515580207109451,
0.008672136813402176,
-0.06188398599624634,
0.1016794741153717,
-0.062156833708286285,
-0.1671169251203537,
-0.015643011778593063,
0.15370674431324005,
0.04166291654109955,
0.07177387177944183,
-0.01118231937289238,
0.045670513063669205,
-0.02538614720106125,
-0.08405565470457077,
0.06685008853673935,
0.03712601959705353,
0.009695258922874928,
0.033905331045389175,
-0.0508209690451622,
-0.006084883119910955,
-0.08905985206365585,
-0.01979869417846203,
0.19902046024799347,
0.24063941836357117,
-0.08782996237277985,
0.08143994957208633,
0.05592905357480049,
-0.059142500162124634,
-0.18849115073680878,
0.02393614873290062,
0.07568515837192535,
0.01334795355796814,
0.006172007415443659,
-0.1896609365940094,
0.06192873790860176,
0.05266301706433296,
-0.0019535450264811516,
0.09255209565162659,
-0.3096010386943817,
-0.14692112803459167,
0.13538150489330292,
0.11936844140291214,
0.08808663487434387,
-0.1540398746728897,
-0.042587146162986755,
-0.01579287089407444,
-0.08091090619564056,
0.08645989000797272,
-0.05225299671292305,
0.14789102971553802,
-0.024331260472536087,
0.11115212738513947,
0.022486714646220207,
-0.05485660210251808,
0.10820329189300537,
0.023874765262007713,
0.060327500104904175,
-0.03486957773566246,
0.013675462454557419,
0.02350359596312046,
-0.04107533395290375,
0.05026795715093613,
-0.06852661818265915,
0.025901617482304573,
-0.08648708462715149,
-0.038816940039396286,
-0.10092874616384506,
0.03564293310046196,
-0.002399266231805086,
-0.03645074740052223,
-0.023438835516572,
0.0022985029499977827,
0.06490951031446457,
-0.010282840579748154,
0.11905116587877274,
-0.05929241701960564,
0.13665509223937988,
0.12722072005271912,
0.10956835746765137,
-0.0854843258857727,
-0.07048140466213226,
0.007292140740901232,
-0.019521325826644897,
0.05694209039211273,
-0.11681699007749557,
0.038316238671541214,
0.1389397829771042,
0.05756181851029396,
0.13117806613445282,
0.07919564843177795,
-0.04723064601421356,
0.01806841418147087,
0.04907810688018799,
-0.1372091919183731,
-0.13456778228282928,
0.0028575314208865166,
-0.01789185404777527,
-0.08837787806987762,
0.047175854444503784,
0.11319490522146225,
-0.061706919223070145,
-0.006623178254812956,
-0.020605353638529778,
0.012411581352353096,
-0.06265860050916672,
0.22070959210395813,
0.06977777928113937,
0.05326329544186592,
-0.11997532099485397,
0.0680033415555954,
0.028348412364721298,
-0.14960534870624542,
0.06463149189949036,
0.10119213163852692,
-0.07109244912862778,
-0.032020214945077896,
0.020493656396865845,
0.135512575507164,
-0.03530973196029663,
-0.04977785050868988,
-0.13256579637527466,
-0.13799384236335754,
0.10857351869344711,
0.1841304749250412,
0.07004819065332413,
0.01332462951540947,
-0.07462328672409058,
0.017554106190800667,
-0.13528601825237274,
0.08110768347978592,
0.05538859963417053,
0.05311650037765503,
-0.11915375292301178,
0.1700955331325531,
0.013484620489180088,
0.0369705967605114,
-0.02217935584485531,
-0.010501711629331112,
-0.11562540382146835,
0.04203863814473152,
-0.11184146255254745,
-0.009252948686480522,
-0.0465397872030735,
0.008476722985506058,
0.006945252884179354,
-0.06742691993713379,
-0.05558391660451889,
0.03544331341981888,
-0.12347590923309326,
-0.025303760543465614,
-0.0027261145878583193,
0.03462601453065872,
-0.1307975947856903,
-0.025702159851789474,
0.01445313822478056,
-0.08096346259117126,
0.07112986594438553,
0.08353161066770554,
-0.0469617024064064,
0.06652441620826721,
-0.11726967245340347,
-0.02472607232630253,
0.07321794331073761,
-0.007908511906862259,
0.04985443875193596,
-0.12303290516138077,
-0.008219793438911438,
0.011179790832102299,
0.062129516154527664,
0.035393860191106796,
0.1170632392168045,
-0.1263328343629837,
0.015301820822060108,
-0.042931780219078064,
-0.07042136788368225,
-0.06940273195505142,
0.05465198680758476,
0.08199115842580795,
0.03790983930230141,
0.16623257100582123,
-0.10633440315723419,
0.04447364807128906,
-0.18491360545158386,
-0.003964684903621674,
-0.026852088049054146,
-0.11112771928310394,
-0.06415031105279922,
-0.033455170691013336,
0.09486211091279984,
-0.05223032459616661,
0.12737929821014404,
0.0005647166981361806,
0.04567629098892212,
0.03152827173471451,
-0.08106624335050583,
-0.03296225517988205,
0.03473353758454323,
0.23298174142837524,
0.03680821508169174,
-0.04535820707678795,
0.0692354068160057,
0.037785131484270096,
0.0937834158539772,
0.1871572583913803,
0.18901927769184113,
0.16866496205329895,
0.01733933575451374,
0.10836748033761978,
0.03908238932490349,
-0.060015417635440826,
-0.14962352812290192,
0.08018597960472107,
-0.04674888402223587,
0.12211570888757706,
-0.028417354449629784,
0.2152394950389862,
0.101061150431633,
-0.15515708923339844,
0.08636238425970078,
-0.026443863287568092,
-0.09837283939123154,
-0.1327931135892868,
-0.04630192741751671,
-0.08576054871082306,
-0.1622251570224762,
0.014849571511149406,
-0.1126793920993805,
0.06732624024152756,
0.05328354984521866,
0.03340446203947067,
0.021202722564339638,
0.16011245548725128,
0.009900867007672787,
0.011366378515958786,
0.09989655762910843,
-0.007636502850800753,
-0.043655093759298325,
-0.06420166045427322,
-0.06936538219451904,
0.036472074687480927,
-0.037515223026275635,
0.042170554399490356,
-0.020697863772511482,
-0.0950121060013771,
0.03621389716863632,
-0.047707974910736084,
-0.09513572603464127,
0.019388195127248764,
0.025822069495916367,
0.08824752271175385,
0.07300026714801788,
0.032070938497781754,
-0.048696164041757584,
-0.01746630296111107,
0.2187063843011856,
-0.09089379757642746,
-0.10040810704231262,
-0.10016993433237076,
0.29647743701934814,
0.05295487120747566,
-0.00025530735729262233,
0.0020133014768362045,
-0.06314679980278015,
-0.001717928797006607,
0.23547708988189697,
0.1683039665222168,
-0.033273033797740936,
0.012650715187191963,
-0.022074967622756958,
0.0013746165204793215,
-0.01053546741604805,
0.08773057907819748,
0.14112775027751923,
0.08781053870916367,
-0.06753474473953247,
-0.054361492395401,
-0.052672337740659714,
-0.03888959810137749,
-0.06808920204639435,
0.08192698657512665,
0.030339185148477554,
-0.013667445629835129,
-0.056253887712955475,
0.0645681619644165,
-0.08506302535533905,
-0.09998096525669098,
0.029609765857458115,
-0.20909033715724945,
-0.13760259747505188,
-0.015154222026467323,
0.06253795325756073,
0.025102658197283745,
0.04849417507648468,
-0.008875468745827675,
-0.009090682491660118,
0.06341114640235901,
-0.0035881593357771635,
-0.08599485456943512,
-0.0561843141913414,
0.08122049272060394,
-0.13164019584655762,
0.12408551573753357,
-0.03945187106728554,
0.060630664229393005,
0.11237750947475433,
0.09358163923025131,
-0.05921739339828491,
0.08383864909410477,
0.03499815613031387,
-0.10643671452999115,
0.03181812912225723,
0.18016885221004486,
-0.037535328418016434,
0.07712863385677338,
0.033476945012807846,
-0.1267378330230713,
0.023615805432200432,
-0.08290984481573105,
-0.05831145495176315,
-0.03569447994232178,
-0.048308830708265305,
-0.04590396210551262,
0.11796541512012482,
0.19370447099208832,
-0.03787892311811447,
0.0062349336221814156,
-0.06730811297893524,
0.010516203008592129,
0.041215479373931885,
0.02570345625281334,
-0.07392922788858414,
-0.26404592394828796,
0.004147571511566639,
0.04258657619357109,
-0.015183608047664165,
-0.2597670555114746,
-0.09692978858947754,
0.015737971290946007,
-0.0570504330098629,
-0.07527502626180649,
0.10609984397888184,
0.07413669675588608,
0.04814750328660011,
-0.05022377148270607,
-0.10336172580718994,
-0.014079662039875984,
0.18967045843601227,
-0.17552343010902405,
-0.05924071744084358
] |
null | null |
transformers
|
A model to summarize meeting transcripts.
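A minimal usage sketch follows (added as an illustration, not taken from the original card): it assumes the checkpoint is compatible with the standard Transformers summarization pipeline, which the BART / text2text-generation tags suggest but do not guarantee; the sample transcript is made up.

```python
# Hedged sketch: load the checkpoint with the generic summarization pipeline
# and condense a short, made-up meeting transcript.
from transformers import pipeline

summarizer = pipeline("summarization", model="ankitkhowal/minutes-of-meeting")

transcript = (
    "Alice: Let's review the Q3 roadmap. "
    "Bob: The API migration slipped by two weeks. "
    "Alice: Then we move the launch to November and add a second design review."
)

# Generation limits are illustrative; tune them to the transcript length.
summary = summarizer(transcript, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```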
|
{}
|
text2text-generation
|
ankitkhowal/minutes-of-meeting
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us
|
A model to summarize meeting transcripts.
|
[] |
[
"TAGS\n#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
38
] |
[
"passage: TAGS\n#transformers #pytorch #bart #text2text-generation #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.0318886935710907,
0.0096189696341753,
-0.007606509141623974,
0.001847309060394764,
0.14772942662239075,
0.02160252071917057,
0.1334671527147293,
0.12192132323980331,
0.01626305654644966,
-0.028771573677659035,
0.14323928952217102,
0.1992919147014618,
-0.024837283417582512,
0.16441908478736877,
-0.08063437789678574,
-0.2598302364349365,
0.05283436179161072,
0.07620055228471756,
0.042803842574357986,
0.12526075541973114,
0.08328705281019211,
-0.0723603218793869,
0.0735759362578392,
-0.032824575901031494,
-0.1756834238767624,
0.0490216426551342,
0.03857412934303284,
-0.11679597198963165,
0.10073336213827133,
0.044240228831768036,
0.14631077647209167,
0.03982788696885109,
-0.06636208295822144,
-0.1283167600631714,
0.034403931349515915,
-0.012848195619881153,
-0.06484833359718323,
0.0399547815322876,
0.0953272208571434,
-0.10738255828619003,
0.0691714733839035,
0.07824554294347763,
-0.0009729358134791255,
0.051042620092630386,
-0.13444896042346954,
-0.04049830511212349,
-0.027359316125512123,
0.032211143523454666,
0.06613613665103912,
0.08172227442264557,
-0.002032114891335368,
0.12712892889976501,
-0.09982932358980179,
0.129965141415596,
0.14824458956718445,
-0.30388808250427246,
-0.012859301641583443,
0.046737879514694214,
0.08478476107120514,
0.05252804234623909,
-0.024114781990647316,
0.03457428514957428,
0.02039201185107231,
0.030345473438501358,
-0.011317359283566475,
-0.08197411149740219,
-0.12215276807546616,
0.02166357822716236,
-0.0591701939702034,
-0.05171586200594902,
0.2085941880941391,
-0.08335429430007935,
0.052767593413591385,
-0.035482101142406464,
-0.09834597259759903,
-0.04572887346148491,
-0.030292777344584465,
0.012228906154632568,
-0.07027875632047653,
0.06717630475759506,
-0.024450715631246567,
-0.051333602517843246,
-0.139779195189476,
0.009453840553760529,
-0.19736739993095398,
0.1813373863697052,
0.003591995220631361,
0.05767586827278137,
-0.22959688305854797,
0.07853192090988159,
0.032813530415296555,
-0.11851274222135544,
0.05080040171742439,
-0.09867561608552933,
0.05972042679786682,
0.0018865568563342094,
-0.07992928475141525,
-0.09396862983703613,
0.07693824172019958,
0.15202535688877106,
0.0570392943918705,
0.03992437571287155,
-0.050025515258312225,
0.08682024478912354,
0.005244566593319178,
0.08907615393400192,
0.0664292722940445,
-0.08503730595111847,
0.05343325808644295,
-0.12424403429031372,
0.024643665179610252,
-0.07267257571220398,
-0.1592642068862915,
-0.049884188920259476,
0.04776405170559883,
0.07991299778223038,
0.05279557406902313,
0.05907239764928818,
-0.05020331218838692,
-0.017758851870894432,
0.07987207174301147,
-0.07385040819644928,
0.012242065742611885,
0.0059492760337889194,
0.024120714515447617,
0.1218082532286644,
0.002459706272929907,
0.013701106421649456,
-0.09565173089504242,
0.10911376029253006,
-0.04307014122605324,
0.0018043185118585825,
-0.05350656807422638,
-0.05434543266892433,
0.03374440222978592,
-0.09654112905263901,
0.023122891783714294,
-0.16352982819080353,
-0.16724829375743866,
0.007726968731731176,
0.012249041348695755,
-0.003505572210997343,
-0.033027712255716324,
-0.03621775656938553,
-0.005919867195188999,
0.05795317515730858,
-0.08099211752414703,
-0.01666862703859806,
-0.042124539613723755,
0.10648948699235916,
-0.0036009550094604492,
0.08183425664901733,
-0.16650360822677612,
0.06636875867843628,
-0.11494749039411545,
-0.03601225093007088,
-0.09505820274353027,
0.035676173865795135,
0.0012118915328755975,
0.15507112443447113,
0.018306683748960495,
-0.009874354116618633,
-0.10775242745876312,
0.05634631961584091,
-0.018046220764517784,
0.19111143052577972,
-0.10143165290355682,
-0.11551348119974136,
0.2726460099220276,
-0.08581715822219849,
-0.15655820071697235,
0.0745999738574028,
0.0038844537921249866,
0.03687083348631859,
0.10288292169570923,
0.17752613127231598,
0.06705204397439957,
-0.009362038224935532,
0.09385570883750916,
0.10423172265291214,
-0.08946622908115387,
-0.12407493591308594,
-0.005046447739005089,
-0.010976677760481834,
-0.10818316042423248,
0.06363586336374283,
0.10392338782548904,
0.08065568655729294,
-0.05380159243941307,
-0.03353259712457657,
-0.03641233593225479,
-0.013897283934056759,
0.10549046844244003,
0.026039913296699524,
0.1269293576478958,
-0.0887337177991867,
-0.012849004939198494,
-0.014335026033222675,
-0.01335445512086153,
0.003660373855382204,
0.047355301678180695,
-0.0394231453537941,
0.10426464676856995,
-0.0038209620397537947,
0.03976859524846077,
-0.20150139927864075,
-0.06227380782365799,
-0.014103719033300877,
0.13805876672267914,
0.0026925376150757074,
0.11038415879011154,
0.07020298391580582,
-0.037029724568128586,
0.0012128711678087711,
-0.02271571010351181,
0.1580611914396286,
-0.01027847919613123,
-0.07842051237821579,
-0.048077233135700226,
0.057122793048620224,
-0.07903748750686646,
-0.004407473839819431,
-0.04450428858399391,
0.023696700111031532,
0.03759979456663132,
0.13724929094314575,
0.016397079452872276,
0.04251628741621971,
-0.035238903015851974,
0.039600640535354614,
-0.08605601638555527,
0.029575467109680176,
0.10157769173383713,
0.014959105290472507,
-0.07233147323131561,
0.2013004571199417,
-0.1762475073337555,
0.2476811707019806,
0.212754026055336,
-0.27035385370254517,
0.024685995653271675,
-0.04099436104297638,
-0.01907818578183651,
0.011665256693959236,
0.04634355753660202,
-0.016739649698138237,
0.04964808002114296,
0.017727717757225037,
0.19336023926734924,
-0.03373531624674797,
-0.04479131102561951,
-0.01514318399131298,
-0.0669543519616127,
-0.010299740359187126,
0.053888946771621704,
0.030631281435489655,
-0.13001108169555664,
0.18424159288406372,
0.2280101478099823,
0.025706658139824867,
0.18376773595809937,
0.022436637431383133,
-0.0006371058989316225,
0.07107949256896973,
-0.019524546340107918,
-0.0311858169734478,
-0.06731158494949341,
-0.18211530148983002,
-0.03696368262171745,
0.0802861750125885,
0.012168895453214645,
0.09167364984750748,
-0.11742155998945236,
-0.02911762334406376,
-0.006805825047194958,
0.01033635064959526,
-0.0067903995513916016,
0.09016279131174088,
0.07038231194019318,
0.10586395859718323,
-0.02393309772014618,
-0.022413793951272964,
0.11036060005426407,
0.012995772063732147,
-0.08900920301675797,
0.15894490480422974,
-0.1406673640012741,
-0.35423746705055237,
-0.18273937702178955,
-0.16229088604450226,
-0.02076569013297558,
0.0609489381313324,
0.13703471422195435,
-0.07584594935178757,
-0.0263918898999691,
0.026742594316601753,
0.02490844950079918,
-0.04625730961561203,
0.02991250529885292,
-0.05131559073925018,
0.052110932767391205,
-0.05489020794630051,
-0.07748328149318695,
-0.0574861541390419,
-0.010487782768905163,
-0.02249021828174591,
0.1595280021429062,
-0.12328781187534332,
0.08628269284963608,
0.14141078293323517,
0.007060518022626638,
0.06670382618904114,
-0.019857754930853844,
0.16306842863559723,
-0.07192041724920273,
-0.01979277841746807,
0.20871739089488983,
-0.06324824690818787,
0.08185254782438278,
0.13290759921073914,
0.0014225946506485343,
-0.0659416913986206,
0.036012422293424606,
-0.06399042159318924,
-0.09558235108852386,
-0.2123410850763321,
-0.11684432625770569,
-0.1203472763299942,
0.08290492743253708,
0.05139313265681267,
0.05275741592049599,
0.13240981101989746,
0.08159936964511871,
-0.01733388565480709,
0.026433371007442474,
0.004888464231044054,
0.09375131875276566,
0.19021064043045044,
-0.019578177481889725,
0.1585603803396225,
-0.07485145330429077,
-0.12854962050914764,
0.09895215928554535,
0.041452351957559586,
0.10840454697608948,
0.08608803153038025,
0.03231623023748398,
0.014189885929226875,
0.09416954219341278,
0.154267817735672,
0.14471131563186646,
0.04096266254782677,
-0.01844504289329052,
-0.014848168008029461,
-0.019450439140200615,
-0.07458235323429108,
0.03915075212717056,
0.035822778940200806,
-0.1387520581483841,
-0.05972644314169884,
-0.12952616810798645,
0.07954489439725876,
0.08016408234834671,
0.05154338851571083,
-0.21658846735954285,
0.010716202668845654,
0.09096989780664444,
-0.032574914395809174,
-0.10637634992599487,
0.05862889811396599,
-0.020082173869013786,
-0.14357052743434906,
0.08368008583784103,
-0.041197724640369415,
0.13136643171310425,
-0.016771353781223297,
0.09348223358392715,
-0.07598451524972916,
-0.11635787039995193,
0.04214917868375778,
0.1051286980509758,
-0.3333568274974823,
0.19576585292816162,
-0.005010900087654591,
-0.049015872180461884,
-0.09523550420999527,
-0.010616874322295189,
0.01414541807025671,
0.1286686807870865,
0.06345471739768982,
-0.005398744251579046,
-0.07144735753536224,
-0.12392700463533401,
-0.01792708784341812,
0.022568373009562492,
0.13874293863773346,
-0.025139471516013145,
0.005983670707792044,
-0.03845648840069771,
-0.028553906828165054,
-0.035857707262039185,
-0.016320165246725082,
-0.00006297017534961924,
-0.17832933366298676,
0.07639492303133011,
0.05771424248814583,
0.06800124049186707,
0.0010011186823248863,
-0.021595092490315437,
-0.03323802351951599,
0.2092742919921875,
-0.058046482503414154,
-0.07962445914745331,
-0.11384638398885727,
-0.0835738554596901,
0.04440158233046532,
-0.09057649224996567,
0.050487905740737915,
-0.08124570548534393,
0.033525500446558,
-0.08009763807058334,
-0.21084435284137726,
0.11020998656749725,
-0.10891846567392349,
-0.03099050186574459,
-0.06553018093109131,
0.1588260680437088,
-0.08069963753223419,
0.012498589232563972,
0.03211408853530884,
0.008073501288890839,
-0.12820586562156677,
-0.0717388316988945,
-0.03555352985858917,
-0.005153812002390623,
0.040916040539741516,
0.009281206876039505,
-0.06748723983764648,
-0.048664286732673645,
-0.019427035003900528,
-0.015754178166389465,
0.28569287061691284,
0.1475352942943573,
-0.0609462708234787,
0.18389250338077545,
0.13568007946014404,
-0.0724790170788765,
-0.31663528084754944,
-0.12009056657552719,
-0.09762241691350937,
-0.018755966797471046,
-0.02749086543917656,
-0.1484328955411911,
0.09650276601314545,
-0.027682725340127945,
-0.03106766566634178,
0.11427687108516693,
-0.2001187950372696,
-0.09288447350263596,
0.16899827122688293,
-0.01052873209118843,
0.3756139874458313,
-0.12983417510986328,
-0.11189896613359451,
-0.08148515969514847,
-0.1828407496213913,
0.13390463590621948,
-0.048098571598529816,
0.08482028543949127,
-0.03219360485672951,
0.1280236542224884,
0.04634714499115944,
-0.05130390822887421,
0.07594503462314606,
-0.0059575652703642845,
0.0011326826643198729,
-0.12049131095409393,
-0.037878088653087616,
0.045219022780656815,
-0.019460856914520264,
0.02688322775065899,
-0.06330608576536179,
0.017958227545022964,
-0.15472637116909027,
-0.035788290202617645,
-0.09085410833358765,
0.06023867800831795,
0.02375026047229767,
-0.034585386514663696,
0.030826816335320473,
-0.07676052302122116,
-0.01754079759120941,
0.01558644324541092,
0.21057024598121643,
-0.045633990317583084,
0.17369864881038666,
0.1272992342710495,
0.10232851654291153,
-0.15472714602947235,
0.053716737776994705,
-0.06897317618131638,
-0.07662619650363922,
0.05559123307466507,
-0.06246177479624748,
0.06266094744205475,
0.1154441237449646,
-0.051855891942977905,
0.04971490427851677,
0.09952472150325775,
0.02619350329041481,
-0.0035122428089380264,
0.16450707614421844,
-0.2513583302497864,
0.056844647973775864,
-0.06753088533878326,
0.03402099013328552,
0.061399027705192566,
0.05922277644276619,
0.1611248403787613,
0.05132662132382393,
-0.05297247692942619,
-0.02526140585541725,
-0.00987333245575428,
-0.041071340441703796,
0.06454507261514664,
0.03129083290696144,
0.028051014989614487,
-0.13582447171211243,
0.03623811900615692,
0.014572393149137497,
-0.16184279322624207,
-0.008318443782627583,
0.18180175125598907,
-0.13040250539779663,
-0.11827728897333145,
-0.0003556807932909578,
0.13719442486763,
-0.1603710651397705,
-0.04452987387776375,
-0.07251504808664322,
-0.10844884812831879,
0.07334499061107635,
0.18042348325252533,
0.08402121067047119,
0.08780218660831451,
-0.03176552429795265,
-0.019938340410590172,
-0.007511130999773741,
-0.012712632305920124,
0.04139583930373192,
0.05297689884901047,
-0.062334027141332626,
0.07110132277011871,
-0.02785586751997471,
0.13658872246742249,
-0.09310124069452286,
-0.05491911247372627,
-0.14652810990810394,
0.036501798778772354,
-0.14568524062633514,
-0.05431540310382843,
-0.09395479410886765,
-0.05915261059999466,
-0.0031735983211547136,
-0.04335375502705574,
-0.036842718720436096,
-0.05164634436368942,
-0.11949747055768967,
0.01925506442785263,
-0.05278397724032402,
0.007503626402467489,
-0.09032180160284042,
-0.013594781048595905,
0.09462499618530273,
-0.039998870342969894,
0.08794531226158142,
0.15195296704769135,
-0.0913335531949997,
0.09685277938842773,
-0.14502272009849548,
-0.11198843270540237,
0.09822298586368561,
0.024706006050109863,
0.05699537321925163,
0.10741657018661499,
0.013904707506299019,
0.10026929527521133,
0.05376213416457176,
0.05014893040060997,
0.0676761269569397,
-0.12273658812046051,
0.03846529871225357,
-0.027500825002789497,
-0.17917148768901825,
-0.05761365592479706,
-0.03858879581093788,
0.07174218446016312,
0.021226610988378525,
0.13927356898784637,
-0.04824436455965042,
0.11497675627470016,
-0.046819183975458145,
0.021025141701102257,
-0.005755160469561815,
-0.1860162913799286,
-0.06618314236402512,
-0.09151892364025116,
0.013906091451644897,
0.01864214614033699,
0.2196902483701706,
0.011389349587261677,
0.05455735698342323,
0.03266080841422081,
0.06021501496434212,
0.01324552483856678,
0.002735121175646782,
0.19415953755378723,
0.09189482778310776,
-0.05106740444898605,
-0.10222998261451721,
0.07806974649429321,
0.015882208943367004,
-0.01169833354651928,
0.13383661210536957,
0.056038159877061844,
-0.0000010217938779533142,
0.11240266263484955,
-0.0205028485506773,
0.06780295819044113,
-0.13827838003635406,
-0.22069905698299408,
-0.03221917152404785,
0.04884885624051094,
-0.011768092401325703,
0.11001694947481155,
0.1406608521938324,
-0.029341092333197594,
0.027563219889998436,
-0.030912354588508606,
-0.044107139110565186,
-0.17532260715961456,
-0.11427092552185059,
-0.09254711121320724,
-0.1073429211974144,
0.004966165870428085,
-0.07934264838695526,
0.053909074515104294,
0.05231036990880966,
0.03997195512056351,
-0.06067992001771927,
0.09453555941581726,
0.05611402541399002,
-0.08242598921060562,
0.05563623085618019,
-0.039030808955430984,
0.057145651429891586,
0.0187069084495306,
-0.01721620187163353,
-0.13299258053302765,
0.006979918107390404,
-0.02085346356034279,
0.06046636775135994,
-0.059781454503536224,
-0.0046447101049125195,
-0.1300465315580368,
-0.11646021157503128,
-0.040615372359752655,
0.05290810763835907,
-0.010315419174730778,
0.1543513685464859,
0.010002214461565018,
0.006278361659497023,
0.02952582575380802,
0.21361204981803894,
-0.08304668962955475,
-0.0997757539153099,
-0.027381369844079018,
0.1809464693069458,
0.06438671052455902,
0.09963551163673401,
-0.03316444158554077,
0.008969796821475029,
-0.09206332266330719,
0.34561264514923096,
0.2768198251724243,
-0.05752769485116005,
0.03757641464471817,
0.027008935809135437,
0.04348360002040863,
0.12249194085597992,
0.13782788813114166,
0.08802539855241776,
0.2891320586204529,
-0.06964948028326035,
-0.02117864601314068,
-0.014709130860865116,
-0.022582873702049255,
-0.12718559801578522,
0.0898301973938942,
0.0016211067559197545,
-0.06520198285579681,
-0.04728702828288078,
0.09117545187473297,
-0.19861075282096863,
0.15790048241615295,
-0.055438071489334106,
-0.19179017841815948,
-0.04403429105877876,
0.006371563300490379,
0.1934935450553894,
-0.006737882271409035,
0.08999232947826385,
-0.0039511388167738914,
-0.07897204160690308,
0.06476317346096039,
0.0007307027699425817,
-0.21401961147785187,
0.006611082702875137,
0.05997241288423538,
-0.15226763486862183,
-0.009286170825362206,
-0.02284272201359272,
0.0413786917924881,
0.0732390508055687,
0.07314230501651764,
-0.038510482758283615,
0.04344063252210617,
-0.002147699473425746,
-0.03637602925300598,
0.019572056829929352,
0.05892649292945862,
0.0019930254202336073,
-0.09937852621078491,
0.06121731922030449,
-0.1571817398071289,
0.051574014127254486,
-0.013654936105012894,
-0.015761863440275192,
-0.003717645537108183,
0.01800469495356083,
-0.046465542167425156,
0.06301630288362503,
0.06507781893014908,
-0.007565658539533615,
-0.017934875562787056,
-0.041166454553604126,
-0.026228422299027443,
0.003031977917999029,
-0.08298347890377045,
-0.11668974161148071,
-0.1263510286808014,
-0.11484553664922714,
0.14168350398540497,
0.008285743184387684,
-0.22244790196418762,
0.00014119630213826895,
-0.11391746252775192,
0.04563592001795769,
-0.1792900711297989,
0.10082010924816132,
0.0644376203417778,
-0.0009037127019837499,
0.005460427142679691,
-0.07608924806118011,
0.044657569378614426,
0.08494101464748383,
-0.11684994399547577,
-0.10534628480672836
] |
null | null |
transformers
|
# Open Domain Question Answering
A core goal in artificial intelligence is to build systems that can read the web, and then answer complex questions about any topic. These question-answering (QA) systems could have a big impact on the way that we access information. Furthermore, open-domain question answering is a benchmark task in the development of Artificial Intelligence, since understanding text and being able to answer questions about it is something that we generally associate with intelligence.
# The Natural Questions Dataset
To help spur development in open-domain question answering, we have created the Natural Questions (NQ) corpus, along with a challenge website based on this data. The NQ corpus contains questions from real users, and it requires QA systems to read and comprehend an entire Wikipedia article that may or may not contain the answer to the question. The inclusion of real user questions, and the requirement that solutions should read an entire page to find the answer, cause NQ to be a more realistic and challenging task than prior QA datasets.
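A minimal query sketch follows (an illustration added here, not part of the original card): it assumes the checkpoint exposes a standard extractive question-answering head that the Transformers pipeline can load, with the TensorFlow backend implied by the `tf` tag; the example question and context are made up from the description above.

```python
# Hedged sketch: extractive QA over a short context using the TF backend.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="ankur310794/bert-large-uncased-nq-small-answer",
    framework="tf",  # the checkpoint is tagged as a TensorFlow model
)

context = (
    "The Natural Questions corpus contains questions from real users, and it "
    "requires QA systems to read and comprehend an entire Wikipedia article "
    "that may or may not contain the answer to the question."
)

result = qa(
    question="Where do the questions in the Natural Questions corpus come from?",
    context=context,
)
print(result["answer"], result["score"])
```

In practice the context would be a full Wikipedia page rather than a single paragraph, which is exactly what makes the small-answer setting harder than earlier QA benchmarks.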
|
{"tags": ["small answer"], "datasets": ["natural_questions"]}
|
question-answering
|
ankur310794/bert-large-uncased-nq-small-answer
|
[
"transformers",
"tf",
"bert",
"question-answering",
"small answer",
"dataset:natural_questions",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #tf #bert #question-answering #small answer #dataset-natural_questions #endpoints_compatible #region-us
|
# Open Domain Question Answering
A core goal in artificial intelligence is to build systems that can read the web, and then answer complex questions about any topic. These question-answering (QA) systems could have a big impact on the way that we access information. Furthermore, open-domain question answering is a benchmark task in the development of Artificial Intelligence, since understanding text and being able to answer questions about it is something that we generally associate with intelligence.
# The Natural Questions Dataset
To help spur development in open-domain question answering, we have created the Natural Questions (NQ) corpus, along with a challenge website based on this data. The NQ corpus contains questions from real users, and it requires QA systems to read and comprehend an entire Wikipedia article that may or may not contain the answer to the question. The inclusion of real user questions, and the requirement that solutions should read an entire page to find the answer, cause NQ to be a more realistic and challenging task than prior QA datasets.
|
[
"# Open Domain Question Answering\nA core goal in artificial intelligence is to build systems that can read the web, and then answer complex questions about any topic. These question-answering (QA) systems could have a big impact on the way that we access information. Furthermore, open-domain question answering is a benchmark task in the development of Artificial Intelligence, since understanding text and being able to answer questions about it is something that we generally associate with intelligence.",
"# The Natural Questions Dataset\nTo help spur development in open-domain question answering, we have created the Natural Questions (NQ) corpus, along with a challenge website based on this data. The NQ corpus contains questions from real users, and it requires QA systems to read and comprehend an entire Wikipedia article that may or may not contain the answer to the question. The inclusion of real user questions, and the requirement that solutions should read an entire page to find the answer, cause NQ to be a more realistic and challenging task than prior QA datasets."
] |
[
"TAGS\n#transformers #tf #bert #question-answering #small answer #dataset-natural_questions #endpoints_compatible #region-us \n",
"# Open Domain Question Answering\nA core goal in artificial intelligence is to build systems that can read the web, and then answer complex questions about any topic. These question-answering (QA) systems could have a big impact on the way that we access information. Furthermore, open-domain question answering is a benchmark task in the development of Artificial Intelligence, since understanding text and being able to answer questions about it is something that we generally associate with intelligence.",
"# The Natural Questions Dataset\nTo help spur development in open-domain question answering, we have created the Natural Questions (NQ) corpus, along with a challenge website based on this data. The NQ corpus contains questions from real users, and it requires QA systems to read and comprehend an entire Wikipedia article that may or may not contain the answer to the question. The inclusion of real user questions, and the requirement that solutions should read an entire page to find the answer, cause NQ to be a more realistic and challenging task than prior QA datasets."
] |
[
40,
98,
125
] |
[
"passage: TAGS\n#transformers #tf #bert #question-answering #small answer #dataset-natural_questions #endpoints_compatible #region-us \n# Open Domain Question Answering\nA core goal in artificial intelligence is to build systems that can read the web, and then answer complex questions about any topic. These question-answering (QA) systems could have a big impact on the way that we access information. Furthermore, open-domain question answering is a benchmark task in the development of Artificial Intelligence, since understanding text and being able to answer questions about it is something that we generally associate with intelligence.# The Natural Questions Dataset\nTo help spur development in open-domain question answering, we have created the Natural Questions (NQ) corpus, along with a challenge website based on this data. The NQ corpus contains questions from real users, and it requires QA systems to read and comprehend an entire Wikipedia article that may or may not contain the answer to the question. The inclusion of real user questions, and the requirement that solutions should read an entire page to find the answer, cause NQ to be a more realistic and challenging task than prior QA datasets."
] |
[
0.06341800838708878,
0.09114639461040497,
-0.0007522614905610681,
0.01632395014166832,
0.07480042427778244,
0.018070057034492493,
0.025344377383589745,
0.10081589221954346,
0.10633332282304764,
0.054983291774988174,
0.08856486529111862,
-0.0036973801907151937,
-0.010275817476212978,
-0.013579466380178928,
0.07652714848518372,
-0.1060975044965744,
0.01911846362054348,
0.013699431903660297,
0.02535111829638481,
0.09403588622808456,
0.06512893736362457,
-0.06691844761371613,
0.07031898200511932,
0.017912786453962326,
-0.024243095889687538,
0.021231506019830704,
-0.01916070654988289,
-0.041343171149492264,
0.12438856065273285,
0.11021144688129425,
-0.0032038132194429636,
0.006590098142623901,
-0.015557900071144104,
-0.17054933309555054,
0.017731625586748123,
0.02705175057053566,
0.014974547550082207,
0.05010223388671875,
-0.06848205626010895,
0.11011967808008194,
0.024329153820872307,
-0.03745395317673683,
0.04838259145617485,
0.12323469668626785,
-0.1244530975818634,
-0.1471565067768097,
0.015086771920323372,
-0.0023155936505645514,
-0.031213460490107536,
0.12514205276966095,
-0.06308646500110626,
0.04606901481747627,
-0.0617893822491169,
0.08343689143657684,
0.07403253763914108,
-0.12416643649339676,
-0.03809625282883644,
0.10503699630498886,
-0.0033005340956151485,
0.12775829434394836,
0.04441005364060402,
0.0398310124874115,
0.0543731190264225,
-0.018814507871866226,
0.025982197374105453,
-0.03423565998673439,
-0.14029046893119812,
-0.011992930434644222,
-0.06916721165180206,
-0.05904456600546837,
0.22742655873298645,
0.0667695701122284,
-0.04554327204823494,
-0.07123135030269623,
-0.021005691960453987,
0.14612236618995667,
0.03392234444618225,
-0.13567283749580383,
-0.056734926998615265,
0.01922348327934742,
-0.05130444094538689,
-0.1781083047389984,
-0.02921244688332081,
-0.003047587350010872,
-0.03854406252503395,
0.07594551146030426,
0.04867127165198326,
0.07741034775972366,
-0.13324323296546936,
0.03927529975771904,
-0.08014259487390518,
-0.03501681238412857,
-0.023016855120658875,
-0.09910877048969269,
-0.09514009952545166,
0.06463392078876495,
-0.07528499513864517,
-0.001621621078811586,
0.022969763725996017,
0.05180264264345169,
0.034774526953697205,
0.013754025101661682,
-0.05626910179853439,
-0.007178288884460926,
0.08234327286481857,
0.12909537553787231,
-0.0970936045050621,
-0.05001576617360115,
0.048959486186504364,
-0.1081080436706543,
-0.012625766918063164,
-0.08137036859989166,
-0.04627138748764992,
0.0182510893791914,
-0.02017815411090851,
0.06265603750944138,
0.1425093114376068,
0.08494026213884354,
0.042435258626937866,
-0.05208246782422066,
-0.008352993056178093,
0.018975766375660896,
-0.10667909681797028,
0.02411315217614174,
-0.07495764642953873,
-0.04779813438653946,
0.027929043397307396,
0.010678072459995747,
-0.09573758393526077,
-0.09808764606714249,
-0.01472460012882948,
-0.15930059552192688,
-0.030121153220534325,
-0.04750847443938255,
0.06502292305231094,
-0.10366877168416977,
0.049072571098804474,
-0.09282001107931137,
-0.06874793022871017,
-0.007155594415962696,
0.053193915635347366,
-0.01540741790086031,
-0.10790412127971649,
0.05995108187198639,
0.05443796142935753,
-0.09203633666038513,
-0.027660423889756203,
-0.06229327619075775,
-0.01407638005912304,
0.0662299171090126,
0.023605285212397575,
0.05853985995054245,
-0.122299924492836,
0.030676456168293953,
-0.06433887034654617,
-0.02013147994875908,
0.08348803222179413,
0.0629749521613121,
-0.0032069331500679255,
-0.0418027900159359,
-0.06320315599441528,
-0.06468603014945984,
0.004579247906804085,
0.00724303163588047,
-0.005282019264996052,
0.07876370847225189,
0.06880128383636475,
0.0620611310005188,
0.0654483437538147,
0.03403080254793167,
-0.16955867409706116,
0.15339700877666473,
-0.02946937084197998,
0.17175926268100739,
0.1081870049238205,
0.06906116753816605,
0.1081785187125206,
0.05543968081474304,
0.008564097806811333,
0.002603171393275261,
-0.017612166702747345,
0.09864044934511185,
-0.015504583716392517,
0.06129305437207222,
-0.1265827715396881,
0.07570414245128632,
0.012196908704936504,
-0.030246930196881294,
-0.045481834560632706,
-0.1138252392411232,
-0.06531807780265808,
-0.06370873749256134,
0.026367856189608574,
0.022144777700304985,
-0.025633513927459717,
-0.016668351367115974,
-0.032654959708452225,
-0.20412473380565643,
-0.03051302768290043,
-0.007904086261987686,
0.014814844354987144,
-0.16198872029781342,
-0.0007824655622243881,
-0.10512281209230423,
0.09047164022922516,
-0.15292182564735413,
-0.20706920325756073,
0.04002811014652252,
-0.052210692316293716,
0.15301749110221863,
0.22718356549739838,
-0.0280240960419178,
-0.09377897530794144,
-0.06642141938209534,
0.03399002552032471,
-0.03188173845410347,
-0.029109926894307137,
0.0032736421562731266,
-0.10173917561769485,
0.051990263164043427,
-0.06986315548419952,
0.01309414766728878,
-0.012513563968241215,
-0.015315690077841282,
-0.07975344359874725,
-0.05610555410385132,
-0.022800004109740257,
-0.004600695800036192,
0.018266750499606133,
0.0008821596275083721,
0.021138334646821022,
0.053124889731407166,
0.04689878970384598,
-0.04323554039001465,
-0.14732563495635986,
-0.01131210383027792,
-0.07358647882938385,
0.05980569124221802,
0.024261360988020897,
-0.11615593731403351,
-0.021093692630529404,
0.11707626283168793,
-0.08110910654067993,
-0.027034781873226166,
-0.018083743751049042,
-0.07376252859830856,
0.22168107330799103,
-0.026855379343032837,
-0.06629232317209244,
-0.092086561024189,
-0.013461564667522907,
0.0491056852042675,
-0.01028013601899147,
0.06993585079908371,
-0.0025805856566876173,
0.038901153951883316,
-0.19565574824810028,
-0.03165964037179947,
0.042413562536239624,
0.0880812481045723,
0.09102751314640045,
0.01882331259548664,
-0.03693630173802376,
-0.007126233074814081,
0.03322022035717964,
-0.05283091217279434,
0.10691510140895844,
-0.1669435203075409,
0.0339505635201931,
0.06530095636844635,
0.02903677336871624,
0.015827113762497902,
-0.04298245534300804,
0.010680913925170898,
0.050510141998529434,
-0.0728418305516243,
-0.1050390899181366,
-0.011644304729998112,
-0.015225717797875404,
0.13381971418857574,
0.06578544527292252,
0.07693221420049667,
0.029295602813363075,
-0.04091813042759895,
-0.14108707010746002,
0.11171150207519531,
0.006879473105072975,
-0.18330518901348114,
0.010531306266784668,
-0.05912778899073601,
-0.09441162645816803,
-0.019215544685721397,
0.08160655200481415,
-0.12081390619277954,
-0.02468608319759369,
-0.048335567116737366,
0.09710239619016647,
-0.016748499125242233,
-0.08788282424211502,
-0.016212547197937965,
-0.043626293540000916,
-0.006633071228861809,
-0.10033688694238663,
0.014907119795680046,
-0.03757837414741516,
-0.12840141355991364,
0.04928493872284889,
-0.06656685471534729,
0.1894904524087906,
0.09057586640119553,
-0.0165106114000082,
-0.0735883042216301,
-0.05381431803107262,
0.34457966685295105,
-0.12150441110134125,
0.11413716524839401,
0.16289235651493073,
-0.06447617709636688,
0.016531052067875862,
0.10825614631175995,
0.00890120305120945,
-0.07671446353197098,
0.10484249889850616,
0.0901840478181839,
-0.14180125296115875,
-0.265645831823349,
-0.07226945459842682,
-0.04339880496263504,
-0.012923527508974075,
-0.005238175857812166,
0.02532937191426754,
0.153061181306839,
0.09806520491838455,
-0.0807076096534729,
-0.02905431017279625,
0.011489814147353172,
0.043947841972112656,
0.2146802693605423,
-0.04915313422679901,
0.10770954936742783,
-0.026710081845521927,
-0.05257560685276985,
0.046443257480859756,
0.08093252032995224,
0.19397902488708496,
0.003298594383522868,
0.15473808348178864,
0.0386495366692543,
0.1403045952320099,
-0.03264951333403587,
0.060470324009656906,
-0.03930763155221939,
0.02799270860850811,
-0.1151861846446991,
-0.01054293755441904,
-0.06227875500917435,
0.07095570862293243,
0.09421764314174652,
-0.06618908792734146,
0.001743691973388195,
-0.026793835684657097,
0.008680110797286034,
0.20130647718906403,
-0.0038707826752215624,
-0.0007942216470837593,
-0.08411359041929245,
0.03522749990224838,
0.04674718528985977,
-0.11037465184926987,
0.07029713690280914,
0.08747559785842896,
-0.10619989782571793,
-0.07421048730611801,
0.009800801984965801,
0.14528203010559082,
-0.023784426972270012,
0.07166124880313873,
-0.07427439838647842,
-0.12537196278572083,
0.04569115489721298,
0.14511528611183167,
-0.15308941900730133,
0.07191680371761322,
0.04091108962893486,
-0.020888805389404297,
-0.12715332210063934,
-0.04896939918398857,
-0.06125530228018761,
-0.03877978399395943,
0.1872231364250183,
0.011207557283341885,
0.02696255035698414,
-0.05939608812332153,
-0.008102060295641422,
0.16922253370285034,
0.08345561474561691,
0.003742719069123268,
0.04160279035568237,
0.020578911527991295,
0.06887411326169968,
0.04156362637877464,
0.16820472478866577,
-0.06357969343662262,
-0.1608741581439972,
0.01460010651499033,
-0.010747984051704407,
0.07624954730272293,
0.0032958851661533117,
0.07296846807003021,
0.060630083084106445,
0.11716719716787338,
-0.013514567166566849,
-0.08694121986627579,
-0.10494106262922287,
-0.03989005461335182,
0.0558294877409935,
-0.06278328597545624,
-0.04841293767094612,
-0.022014275193214417,
0.024096602573990822,
0.015833856537938118,
-0.09731081128120422,
0.039794012904167175,
-0.08319387584924698,
-0.1381213217973709,
-0.09799253940582275,
-0.017728131264448166,
0.047131869941949844,
0.0764843225479126,
0.012158320285379887,
-0.10263658314943314,
-0.002202175557613373,
-0.15845756232738495,
0.022059010341763496,
-0.00015970401000231504,
-0.18808862566947937,
0.03818787261843681,
-0.06530499458312988,
0.1872493028640747,
-0.12465465068817139,
0.02192855440080166,
0.14120694994926453,
0.06361109763383865,
-0.04923593997955322,
0.038223352283239365,
0.2783659100532532,
-0.05841562896966934,
-0.20121388137340546,
0.009053132496774197,
-0.0771200880408287,
-0.07655353844165802,
0.03313247486948967,
-0.13160136342048645,
0.1482831984758377,
-0.07463324815034866,
0.0016989297000691295,
-0.014375489205121994,
-0.08906283974647522,
-0.035041384398937225,
0.0803607702255249,
0.0534089133143425,
0.34806138277053833,
-0.09054436534643173,
0.0390973761677742,
0.054595451802015305,
-0.11469396948814392,
0.1105283796787262,
-0.10317523777484894,
0.03882642462849617,
0.012029657140374184,
0.07538513839244843,
0.033663682639598846,
-0.04065895453095436,
0.040019690990448,
-0.08435636758804321,
0.0013545961119234562,
-0.05064032971858978,
0.044895097613334656,
0.07510519027709961,
-0.013153339736163616,
0.09659240394830704,
0.13275843858718872,
0.12167598307132721,
-0.026492489501833916,
-0.06046714633703232,
-0.13861677050590515,
-0.01166844554245472,
-0.07062015682458878,
-0.17160305380821228,
-0.17519737780094147,
0.04237771034240723,
0.2025996893644333,
0.006346399895846844,
0.04114349186420441,
-0.08945837616920471,
0.05809316784143448,
0.10419297218322754,
0.15560220181941986,
-0.05495015159249306,
-0.052010808140039444,
0.022457318380475044,
-0.04076933488249779,
0.09215471148490906,
-0.11193783581256866,
0.06313907355070114,
0.15573297441005707,
-0.016429828479886055,
-0.0030401027761399746,
0.03058544173836708,
-0.09746511280536652,
0.03660806268453598,
-0.06592576950788498,
-0.06950279325246811,
-0.22657284140586853,
-0.009387733414769173,
0.05363507941365242,
-0.11915740370750427,
-0.046508051455020905,
0.09365028142929077,
-0.035455357283353806,
0.0004345503984950483,
0.009855222888290882,
-0.017030205577611923,
0.03320906311273575,
0.002116380026564002,
0.046440210193395615,
0.03920746222138405,
-0.06373618543148041,
0.003162533976137638,
0.056805334985256195,
-0.0751352533698082,
0.036754779517650604,
-0.10515515506267548,
-0.09612099826335907,
-0.01006371434777975,
-0.01564701646566391,
-0.029405711218714714,
-0.135872945189476,
-0.08806749433279037,
0.003954979125410318,
-0.13488507270812988,
0.01007175911217928,
0.07668562978506088,
-0.039617449045181274,
-0.03351031243801117,
-0.021168123930692673,
0.02254013903439045,
-0.07047118991613388,
0.04343738034367561,
-0.12030644714832306,
-0.011649935506284237,
-0.004243323113769293,
-0.18008513748645782,
0.01418233197182417,
0.17857632040977478,
-0.06526251882314682,
-0.0877906009554863,
-0.1205621212720871,
0.03770354390144348,
-0.2764579653739929,
-0.0534663200378418,
-0.022503314539790154,
-0.006918135564774275,
-0.03994033858180046,
-0.06568484008312225,
-0.03254348039627075,
0.009491942822933197,
-0.031665489077568054,
-0.02136358991265297,
0.05703214555978775,
0.01410429272800684,
-0.10537083446979523,
-0.011832524091005325,
0.06349728256464005,
0.010707725770771503,
0.0950876995921135,
-0.06124414876103401,
-0.17582011222839355,
0.01492875162512064,
-0.03541276231408119,
0.03010820969939232,
-0.05388450622558594,
0.09905453771352768,
-0.006532001309096813,
-0.07989192754030228,
-0.010234110057353973,
-0.03073679842054844,
-0.05106211453676224,
0.011696703732013702,
-0.05588389188051224,
-0.06218980252742767,
0.07230368256568909,
-0.023428820073604584,
-0.040521617978811264,
-0.027796460315585136,
-0.039278652518987656,
-0.05352478846907616,
0.012582032941281796,
0.03892992436885834,
0.0029983404092490673,
0.08806533366441727,
-0.13163168728351593,
-0.004259379114955664,
0.06418471038341522,
0.027963552623987198,
-0.035337965935468674,
-0.08101272583007812,
-0.029685089364647865,
-0.060663316398859024,
0.23902709782123566,
-0.13157276809215546,
0.08669064939022064,
0.0339483805000782,
-0.055621709674596786,
-0.003667280310764909,
-0.11182300001382828,
0.033410266041755676,
-0.0013669291511178017,
-0.04446934908628464,
0.04203417897224426,
0.010329047217965126,
-0.11178695410490036,
0.06848812103271484,
0.1188947930932045,
0.013067344203591347,
0.25764429569244385,
0.016472643241286278,
-0.0017653554677963257,
0.042488180100917816,
0.08573586493730545,
-0.01358471903949976,
0.08526555448770523,
0.020241372287273407,
0.04266684502363205,
-0.13385999202728271,
0.11642397195100784,
-0.023998811841011047,
0.1266849786043167,
-0.047362033277750015,
-0.011786030605435371,
-0.06516183912754059,
-0.0025098188780248165,
-0.024758441373705864,
-0.02637254074215889,
0.00776837021112442,
-0.16796743869781494,
-0.07079233229160309,
-0.055144935846328735,
0.043869346380233765,
-0.1317405253648758,
0.14700865745544434,
-0.011330286972224712,
-0.05523307994008064,
0.01591949723660946,
-0.00100768415722996,
0.07251182943582535,
0.07868395745754242,
0.07663434743881226,
-0.018815627321600914,
0.1175907552242279,
0.039719358086586,
0.12113599479198456,
-0.06642913073301315,
-0.014865167438983917,
-0.061134323477745056,
-0.03075144626200199,
-0.060897983610630035,
0.04101261496543884,
0.03718758746981621,
0.23315753042697906,
-0.03434361517429352,
-0.05203806981444359,
0.010808056220412254,
0.23174569010734558,
-0.05393800511956215,
-0.004134895745664835,
-0.13380464911460876,
0.23247478902339935,
-0.031949155032634735,
-0.009136458858847618,
0.01007299404591322,
-0.010928818956017494,
-0.056479256600141525,
0.2764843702316284,
0.153377503156662,
-0.06523147225379944,
-0.03835689276456833,
-0.018352443352341652,
0.02048225700855255,
-0.041380904614925385,
0.0910421684384346,
0.06680457293987274,
0.3500213921070099,
-0.06226099655032158,
0.11455145478248596,
0.02250072918832302,
0.06319164484739304,
0.034896571189165115,
0.08543100953102112,
0.11150292307138443,
-0.01334538497030735,
-0.06223262846469879,
0.08032441139221191,
-0.23919379711151123,
-0.13750824332237244,
0.012025461532175541,
-0.016813166439533234,
-0.06263618916273117,
0.04407311603426933,
-0.09476299583911896,
-0.04751411825418472,
0.09338431805372238,
-0.023981928825378418,
0.04809824749827385,
0.025586649775505066,
-0.008043884299695492,
-0.005085566081106663,
0.02984347753226757,
0.0986853763461113,
0.10025573521852493,
0.2548554241657257,
0.0492115244269371,
0.068291574716568,
0.06639135628938675,
-0.09155770391225815,
-0.08898326754570007,
0.06817011535167694,
0.08833066374063492,
-0.05545906722545624,
-0.07200963795185089,
0.07336094230413437,
0.0269674863666296,
0.17071396112442017,
0.17741240561008453,
-0.05022340640425682,
0.030524807050824165,
0.07225190848112106,
0.11259390413761139,
-0.13554169237613678,
0.011744259856641293,
-0.10988824814558029,
0.13410358130931854,
0.09098632633686066,
-0.0413263700902462,
-0.026559116318821907,
-0.019812431186437607,
0.05975755676627159,
-0.002027788432314992,
0.08684495836496353,
-0.03412995859980583,
-0.040086012333631516,
-0.0016308611957356334,
0.021448791027069092,
-0.016307752579450607,
-0.20161373913288116,
-0.060742802917957306,
0.13173046708106995,
0.014047426171600819,
0.1046173945069313,
0.07206390798091888,
0.06744664907455444,
-0.019823279231786728,
-0.010028485208749771,
-0.12660269439220428,
-0.04595692455768585,
0.09470364451408386,
-0.12710772454738617,
-0.08055856823921204
] |
null | null |
transformers
|
# My Awesome Model
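A minimal chat sketch follows (an illustration, not from the original card): it assumes the checkpoint follows the usual DialoGPT-style convention of separating turns with the end-of-sequence token, which the GPT-2 and conversational tags suggest but do not confirm.

```python
# Hedged sketch: one conversational turn with a DialoGPT-style GPT-2 checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ann101020/le2sbot-hp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode the user turn followed by the end-of-sequence separator.
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token,
                             return_tensors="pt")

# Generate a reply and decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_length=100,
                            pad_token_id=tokenizer.eos_token_id)
reply = tokenizer.decode(output_ids[0, input_ids.shape[-1]:],
                         skip_special_tokens=True)
print(reply)
```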
|
{"tags": ["conversational"]}
|
text-generation
|
ann101020/le2sbot-hp
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
# My Awesome Model
|
[
"# My Awesome Model"
] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n",
"# My Awesome Model"
] |
[
51,
4
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# My Awesome Model"
] |
[
-0.05259015038609505,
0.05521034821867943,
-0.005910294596105814,
0.017722278833389282,
0.15250112116336823,
0.02286236733198166,
0.07657632976770401,
0.09513414651155472,
-0.025391526520252228,
-0.047348517924547195,
0.15119488537311554,
0.19781284034252167,
-0.020334534347057343,
0.101333387196064,
-0.04688440263271332,
-0.3143521845340729,
0.06439975649118423,
0.05463787540793419,
-0.015605635941028595,
0.12023304402828217,
0.09468326717615128,
-0.0530015267431736,
0.08742043375968933,
-0.012155864387750626,
-0.1293085366487503,
-0.0027921805158257484,
-0.002384399762377143,
-0.10180269181728363,
0.11194873601198196,
0.033712033182382584,
0.05166437849402428,
0.0182647667825222,
-0.05843055993318558,
-0.139859139919281,
0.03845210000872612,
-0.015005595050752163,
-0.05602653697133064,
0.05648263916373253,
0.059830192476511,
-0.07164353132247925,
0.1669619083404541,
0.13275989890098572,
-0.04237370565533638,
0.056127581745386124,
-0.17620700597763062,
0.017941240221261978,
0.01800798624753952,
0.019184142351150513,
0.05306641012430191,
0.10830496996641159,
-0.03932326287031174,
0.09217294305562973,
-0.11410652846097946,
0.08313368260860443,
0.07800983637571335,
-0.29151955246925354,
-0.025312699377536774,
0.10440942645072937,
0.06437138468027115,
0.048375632613897324,
-0.013386772945523262,
0.0621674507856369,
0.02149512618780136,
0.008602659218013287,
0.02225899137556553,
-0.06727100163698196,
-0.05789240449666977,
0.032748885452747345,
-0.0967593789100647,
-0.03634428232908249,
0.19753605127334595,
-0.024647634476423264,
0.053590498864650726,
-0.06265407055616379,
-0.11300963163375854,
-0.039751436561346054,
-0.050429005175828934,
-0.029761891812086105,
-0.05090925097465515,
0.09489558637142181,
0.004352911841124296,
-0.09534718841314316,
-0.13405443727970123,
-0.01370926946401596,
-0.1618979275226593,
0.15892250835895538,
0.012579603120684624,
0.046201955527067184,
-0.19210097193717957,
0.11465331166982651,
-0.03857925534248352,
-0.08259090781211853,
0.030513519421219826,
-0.12010065466165543,
0.03160654753446579,
-0.008132083341479301,
-0.019599268212914467,
-0.049325279891490936,
0.061037879437208176,
0.08101806789636612,
0.018783701583743095,
0.005755073390901089,
0.018167443573474884,
0.05343452841043472,
0.05891622602939606,
0.10033947974443436,
-0.02891627699136734,
-0.0625043511390686,
0.0025436533614993095,
-0.12051084637641907,
-0.01122665498405695,
-0.05357983708381653,
-0.18095199763774872,
0.002246231772005558,
0.02455340512096882,
0.05192234739661217,
0.011778532527387142,
0.09955989569425583,
-0.028496338054537773,
-0.026898741722106934,
0.06898727267980576,
0.002862759632989764,
-0.015707949176430702,
-0.005368964280933142,
-0.010934269987046719,
0.11485416442155838,
-0.023099146783351898,
0.04774846136569977,
-0.12022071331739426,
0.020393015816807747,
-0.07851235568523407,
-0.0019349842332303524,
-0.06214260309934616,
-0.04864754155278206,
-0.0019346009939908981,
-0.06985589861869812,
0.021118074655532837,
-0.14833110570907593,
-0.17990200221538544,
-0.005064866971224546,
0.021302316337823868,
-0.052403319627046585,
-0.09162671118974686,
-0.0982397273182869,
-0.02586611732840538,
0.03574685752391815,
-0.05873546749353409,
0.013170980848371983,
-0.06884536147117615,
0.06542801111936569,
0.0029820678755640984,
0.05682007595896721,
-0.14085575938224792,
0.08719147741794586,
-0.12582023441791534,
-0.023288866505026817,
-0.061977192759513855,
0.1109607070684433,
0.024780582636594772,
0.1267160177230835,
0.004311583004891872,
-0.0033308975398540497,
-0.08729329705238342,
0.08271238207817078,
-0.04243258014321327,
0.22770646214485168,
-0.10479787737131119,
-0.08809807151556015,
0.2632525563240051,
-0.05423165112733841,
-0.16432519257068634,
0.10179096460342407,
-0.014350244775414467,
0.12198644131422043,
0.13850919902324677,
0.16080057621002197,
0.007628654129803181,
0.03313867375254631,
0.10115300863981247,
0.08631709218025208,
-0.08573295921087265,
-0.0611947737634182,
0.023627014830708504,
-0.011463395319879055,
-0.10670105367898941,
0.046802595257759094,
0.04794782027602196,
0.08188598603010178,
-0.04982871189713478,
-0.028600862249732018,
-0.01972118206322193,
-0.044152840971946716,
0.05264130234718323,
0.007675500120967627,
0.13217447698116302,
-0.03674980252981186,
-0.03692879155278206,
-0.023745311424136162,
0.01699630729854107,
-0.03115241602063179,
0.007061392068862915,
-0.05687357112765312,
0.11091547459363937,
-0.03406180441379547,
0.051789235323667526,
-0.16953988373279572,
-0.04873261600732803,
-0.02087729424238205,
0.1402055323123932,
0.04973345249891281,
0.1329866498708725,
0.06287940591573715,
-0.010758201591670513,
0.00859389640390873,
0.007998145185410976,
0.13181665539741516,
0.007865442894399166,
-0.07660657912492752,
-0.047718439251184464,
0.09176599979400635,
-0.05973208695650101,
0.06147782504558563,
-0.098741315305233,
-0.004747362341731787,
-0.01433002483099699,
0.08674649894237518,
0.006352655589580536,
0.029382232576608658,
-0.006192679051309824,
0.003654100699350238,
-0.06161240115761757,
0.017873648554086685,
0.12492607533931732,
-0.01421504095196724,
-0.07439801841974258,
0.22084392607212067,
-0.15798072516918182,
0.18006981909275055,
0.18165533244609833,
-0.3081994652748108,
0.024602634832262993,
-0.08860466629266739,
-0.036338552832603455,
0.03426366671919823,
0.0491504967212677,
-0.034147560596466064,
0.16587987542152405,
-0.016766328364610672,
0.201018825173378,
-0.03547777235507965,
-0.01287798210978508,
-0.010399105958640575,
-0.03656993433833122,
-0.010632630437612534,
0.09065473079681396,
0.15122920274734497,
-0.1677125245332718,
0.18270380795001984,
0.1660280078649521,
0.06873020529747009,
0.17776396870613098,
0.034313347190618515,
-0.006856906693428755,
0.07112615555524826,
-0.022670727223157883,
-0.07675548642873764,
-0.049287427216768265,
-0.26302891969680786,
-0.027947327122092247,
0.06471601128578186,
0.04510856419801712,
0.11924877762794495,
-0.10971947014331818,
-0.037208184599876404,
0.010892451740801334,
-0.013165894895792007,
0.02132410928606987,
0.09682225435972214,
0.01171150617301464,
0.11804302036762238,
-0.021027036011219025,
-0.05209195241332054,
0.0898953229188919,
0.02727191150188446,
-0.0787680521607399,
0.19168277084827423,
-0.10074768215417862,
-0.3233809769153595,
-0.11354339867830276,
-0.18166927993297577,
-0.017843691632151604,
0.05878754332661629,
0.08049646019935608,
-0.09228580445051193,
-0.02625267766416073,
-0.01639235019683838,
0.0758359357714653,
-0.09145816415548325,
-0.015880629420280457,
-0.09367848187685013,
0.034986745566129684,
-0.10827737301588058,
-0.07011983543634415,
-0.05141967162489891,
-0.03368452936410904,
-0.04457031562924385,
0.13157756626605988,
-0.12242637574672699,
0.06396433711051941,
0.2076517641544342,
0.06227295100688934,
0.05622440204024315,
-0.0229496993124485,
0.23288212716579437,
-0.10842552781105042,
0.02383521944284439,
0.1717897206544876,
-0.03566030040383339,
0.0727933868765831,
0.13435456156730652,
0.006721907295286655,
-0.08144525438547134,
0.03465581312775612,
-0.04592517390847206,
-0.08630958944559097,
-0.20441576838493347,
-0.14156180620193481,
-0.12814727425575256,
0.07913564145565033,
0.03285396471619606,
0.05478321388363838,
0.15024253726005554,
0.11386489123106003,
0.007987297140061855,
0.00976672861725092,
-0.006888182368129492,
0.05438044294714928,
0.17482298612594604,
-0.05838097631931305,
0.10041683167219162,
-0.037591226398944855,
-0.1924494504928589,
0.08022978901863098,
0.04309763014316559,
0.08280511945486069,
0.07474655658006668,
0.0856199786067009,
0.013537914492189884,
0.03723837807774544,
0.10897084325551987,
0.1165735274553299,
0.031679023057222366,
-0.038079675287008286,
-0.04882059991359711,
-0.026300756260752678,
-0.03285675123333931,
0.05745977535843849,
0.07790146768093109,
-0.1608346849679947,
-0.06348084658384323,
-0.06350091099739075,
0.07662643492221832,
0.09017108380794525,
0.11811108142137527,
-0.21219493448734283,
0.01579318381845951,
0.092556893825531,
-0.0494147390127182,
-0.1304239183664322,
0.07402537018060684,
-0.00466050673276186,
-0.1397053301334381,
0.037663187831640244,
-0.014095795340836048,
0.1359514445066452,
-0.0778401643037796,
0.10336452722549438,
-0.08307972550392151,
-0.06147889420390129,
0.03632286190986633,
0.1355396956205368,
-0.30774354934692383,
0.2137020230293274,
-0.022472934797406197,
-0.05296783149242401,
-0.10508129745721817,
-0.011727629229426384,
0.020913105458021164,
0.09079049527645111,
0.10090240091085434,
-0.0025442070327699184,
0.0061299679800868034,
-0.0345483273267746,
-0.053218815475702286,
0.024456629529595375,
0.07957815378904343,
-0.08542889356613159,
0.0017540202243253589,
-0.02361489273607731,
-0.004407065454870462,
-0.032844748347997665,
-0.01189463958144188,
-0.011617658659815788,
-0.16786961257457733,
0.06556065380573273,
-0.002625665394589305,
0.11129079759120941,
0.03491498529911041,
0.0024013579823076725,
-0.1009332686662674,
0.19977013766765594,
0.01796281896531582,
-0.08052749931812286,
-0.08830537647008896,
-0.03254766762256622,
0.03660419583320618,
-0.06121435388922691,
0.027481911703944206,
-0.06916457414627075,
0.033381566405296326,
-0.06441576033830643,
-0.18325145542621613,
0.1268530637025833,
-0.10945470631122589,
-0.03609596937894821,
-0.04321056231856346,
0.18323224782943726,
-0.00929707009345293,
-0.0011623724130913615,
0.05866571143269539,
0.0032208464108407497,
-0.1347510665655136,
-0.10740556567907333,
0.020214511081576347,
-0.015275230631232262,
0.009142245166003704,
0.05559912323951721,
-0.009665844030678272,
0.00045268211397342384,
-0.039558928459882736,
-0.023234419524669647,
0.32348164916038513,
0.10732097923755646,
-0.04944206401705742,
0.17007054388523102,
0.13087597489356995,
-0.0827672928571701,
-0.30699312686920166,
-0.10971353948116302,
-0.10529600828886032,
-0.026918673887848854,
-0.037983208894729614,
-0.19617970287799835,
0.09504909813404083,
-0.03528566658496857,
-0.022136637941002846,
0.11253651231527328,
-0.2759084105491638,
-0.0770430713891983,
0.1826775223016739,
0.003314757253974676,
0.3998824954032898,
-0.10265109688043594,
-0.08777514100074768,
-0.06741699576377869,
-0.1120782196521759,
0.2033512443304062,
-0.05560711398720741,
0.08663415163755417,
-0.00517998356372118,
0.15513743460178375,
0.055607251822948456,
-0.02176513522863388,
0.08932057023048401,
-0.005811662413179874,
-0.0546204075217247,
-0.1219351515173912,
-0.03444604203104973,
-0.009159418754279613,
0.007239421829581261,
0.03589896112680435,
-0.04242607578635216,
0.01279151439666748,
-0.1399589478969574,
-0.045490626245737076,
-0.0764620453119278,
0.024699507281184196,
0.021008269861340523,
-0.0652410089969635,
-0.01643640361726284,
-0.03945036977529526,
-0.012804778292775154,
0.03164318576455116,
0.15236099064350128,
-0.06478006392717361,
0.1476556956768036,
0.04904455319046974,
0.15412139892578125,
-0.14745712280273438,
-0.02258288487792015,
-0.06896031647920609,
-0.05498642474412918,
0.04900865629315376,
-0.10053684562444687,
0.050061121582984924,
0.1202658861875534,
-0.0742902010679245,
0.0987328365445137,
0.0922594666481018,
-0.01938629150390625,
0.0012483424507081509,
0.1226617842912674,
-0.2489612102508545,
-0.07742628455162048,
-0.10509459674358368,
0.013337249867618084,
0.10138551890850067,
0.06995654851198196,
0.17304721474647522,
-0.0037713919300585985,
-0.036284226924180984,
-0.0064643872901797295,
0.025414984673261642,
-0.03540204465389252,
0.05724727362394333,
-0.002706433180719614,
0.016663886606693268,
-0.15213344991207123,
0.060368724167346954,
-0.00024176653823815286,
-0.1438901126384735,
-0.013603870756924152,
0.16073721647262573,
-0.11208858340978622,
-0.15145981311798096,
-0.007263668347150087,
0.13685113191604614,
-0.13171035051345825,
-0.03302847594022751,
-0.03708777576684952,
-0.170182466506958,
0.07439173012971878,
0.1024777740240097,
0.08549231290817261,
0.08025266975164413,
-0.06620611250400543,
-0.00807863101363182,
-0.011656313203275204,
-0.026087598875164986,
0.031810320913791656,
-0.023377234116196632,
-0.09044221043586731,
0.03872343525290489,
-0.026654237881302834,
0.13591371476650238,
-0.09607382118701935,
-0.09331836551427841,
-0.135749951004982,
0.039314381778240204,
-0.12405620515346527,
-0.08138058334589005,
-0.12200927734375,
-0.0591500885784626,
0.00224387738853693,
-0.0001289021165575832,
-0.035674065351486206,
-0.06687422841787338,
-0.13582271337509155,
0.04366770386695862,
-0.04484611004590988,
0.0013091047294437885,
-0.040241483598947525,
0.04561002552509308,
0.06766383349895477,
-0.03493715822696686,
0.13722217082977295,
0.11722734570503235,
-0.07864081114530563,
0.08946478366851807,
-0.16657429933547974,
-0.0683990865945816,
0.08854512125253677,
0.008173754438757896,
0.06165994703769684,
0.06743349134922028,
0.033807408064603806,
0.06109451875090599,
0.04151686280965805,
0.03488299250602722,
0.01739438995718956,
-0.09271225333213806,
0.015541021712124348,
0.022296719253063202,
-0.1294609159231186,
-0.04801803454756737,
-0.029226921498775482,
0.00939185917377472,
0.008117396384477615,
0.11003357172012329,
-0.0426274873316288,
0.09439733624458313,
-0.05888751894235611,
0.036728594452142715,
0.016222506761550903,
-0.16461637616157532,
-0.020102784037590027,
-0.11915475130081177,
0.028684545308351517,
-0.0033096212428063154,
0.25625869631767273,
0.06346847862005234,
0.020517030730843544,
0.01250078622251749,
0.08567021042108536,
0.07241600006818771,
0.02562166005373001,
0.1956365555524826,
0.10854171961545944,
-0.05020022392272949,
-0.12334850430488586,
0.09686340391635895,
0.034720368683338165,
0.06432123482227325,
0.13385434448719025,
-0.026959087699651718,
0.002498799469321966,
0.11019360274076462,
0.011678861454129219,
0.04961980879306793,
-0.09859088063240051,
-0.16400282084941864,
-0.00994415208697319,
0.061864156275987625,
-0.04559077322483063,
0.12240655720233917,
0.11382720619440079,
-0.020697353407740593,
0.03180128335952759,
-0.010503606870770454,
-0.05694027617573738,
-0.16998925805091858,
-0.1630837321281433,
-0.08357038348913193,
-0.11794789135456085,
-0.0027763545513153076,
-0.11386270076036453,
0.013879159465432167,
0.06452289968729019,
0.0604364387691021,
-0.09019444137811661,
0.08891061693429947,
0.0687386617064476,
-0.11843101680278778,
0.08828350901603699,
-0.033263903111219406,
0.07249268144369125,
0.0015160300536081195,
0.003872724948450923,
-0.13800905644893646,
0.032393742352724075,
-0.008493867702782154,
0.04159298539161682,
-0.09244006127119064,
0.022458361461758614,
-0.11297028511762619,
-0.07659684121608734,
-0.07971972227096558,
0.05093973129987717,
-0.03541257977485657,
0.1390930563211441,
0.001295371213927865,
-0.035233911126852036,
0.024190181866288185,
0.22729112207889557,
-0.06350252777338028,
-0.030667411163449287,
-0.0618741400539875,
0.21414142847061157,
0.024466563016176224,
0.10703565180301666,
-0.016775688156485558,
0.019240234047174454,
-0.0764411985874176,
0.3689337372779846,
0.344390869140625,
-0.1225387305021286,
-0.0015968306688591838,
0.031062176451086998,
0.036916591227054596,
0.11621878296136856,
0.12602226436138153,
0.057955991476774216,
0.2995031177997589,
-0.08396036922931671,
-0.002026971662417054,
-0.02688612788915634,
-0.03624163940548897,
-0.04409930482506752,
0.10547586530447006,
0.06835740804672241,
-0.03330419585108757,
-0.027012333273887634,
0.1376710683107376,
-0.2966996431350708,
0.12323499470949173,
-0.15714547038078308,
-0.1487535685300827,
-0.06873904913663864,
-0.005042468197643757,
0.08589684963226318,
0.04748665541410446,
0.1069009080529213,
-0.019124338403344154,
-0.08203735202550888,
0.05766449123620987,
0.0320524163544178,
-0.22844897210597992,
0.011852608993649483,
0.08361081779003143,
-0.06153005734086037,
0.011767351068556309,
-0.017906347289681435,
0.038472190499305725,
0.07790610194206238,
0.025976579636335373,
-0.032770540565252304,
0.06325861811637878,
-0.005814229138195515,
-0.05033424496650696,
0.04302205145359039,
0.05059972032904625,
0.017107632011175156,
-0.1511564701795578,
0.07320158183574677,
-0.1762860119342804,
0.0566408596932888,
-0.005331212189048529,
-0.04948166385293007,
0.000018263708625454456,
0.01998119056224823,
-0.06808236241340637,
0.05880929157137871,
0.0952666699886322,
-0.012173139490187168,
-0.002317852806299925,
-0.056667573750019073,
0.007662574760615826,
-0.0679154172539711,
-0.0747012197971344,
-0.10497893393039703,
-0.1338900774717331,
-0.11392296850681305,
0.10846775025129318,
-0.011928223073482513,
-0.19833622872829437,
0.02906924858689308,
-0.11258108913898468,
0.04933213070034981,
-0.13360801339149475,
0.08599711954593658,
0.1282832771539688,
0.021543797105550766,
-0.01265349704772234,
0.04020093381404877,
0.01591683179140091,
0.08550478518009186,
-0.09200563281774521,
-0.10515180230140686
] |
null | null |
transformers
|
A POS-tagger for Old Church Slavonic trained on the Old Church Slavonic UD treebank (https://github.com/UniversalDependencies/UD_Old_Church_Slavonic-PROIEL). GitHub repository with API: https://github.com/annadmitrieva/chu-api
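A minimal usage sketch (not part of the original card; it assumes the checkpoint loads through the standard `transformers` token-classification pipeline, which the DistilBERT/token-classification tags indicate):

```python
# Minimal sketch: run the POS tagger through the generic token-classification
# pipeline. The example sentence comes from the card's widget metadata.
from transformers import pipeline

pos_tagger = pipeline("token-classification", model="annadmitrieva/old-church-slavonic-pos")

sentence = "Не осѫждаите да не осѫждени бѫдете"
for token in pos_tagger(sentence):
    print(token["word"], token["entity"], round(token["score"], 3))
```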
|
{"language": ["chu"], "license": "mit", "tags": ["Old Church Slavonic", "POS-tagging"], "widget": [{"text": "\u041d\u0435 \u043e\u0441\u046b\u0436\u0434\u0430\u0438\u0442\u0435 \u0434\u0430 \u043d\u0435 \u043e\u0441\u046b\u0436\u0434\u0435\u043d\u0438 \u0431\u046b\u0434\u0435\u0442\u0435"}]}
|
token-classification
|
annadmitrieva/old-church-slavonic-pos
|
[
"transformers",
"pytorch",
"safetensors",
"distilbert",
"token-classification",
"Old Church Slavonic",
"POS-tagging",
"chu",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"chu"
] |
TAGS
#transformers #pytorch #safetensors #distilbert #token-classification #Old Church Slavonic #POS-tagging #chu #license-mit #autotrain_compatible #endpoints_compatible #region-us
|
A POS-tagger for Old Church Slavonic trained on the Old Church Slavonic UD treebank (URL GitHub with api: URL
|
[] |
[
"TAGS\n#transformers #pytorch #safetensors #distilbert #token-classification #Old Church Slavonic #POS-tagging #chu #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
62
] |
[
"passage: TAGS\n#transformers #pytorch #safetensors #distilbert #token-classification #Old Church Slavonic #POS-tagging #chu #license-mit #autotrain_compatible #endpoints_compatible #region-us \n"
] |
[
-0.0474105067551136,
0.11634867638349533,
-0.008265712298452854,
0.058698635548353195,
0.11759233474731445,
0.01770544983446598,
0.06867824494838715,
0.017077209427952766,
0.10730880498886108,
0.05126281827688217,
0.1232663244009018,
0.1885610967874527,
-0.03957781940698624,
-0.03329659625887871,
-0.0684065893292427,
-0.25793853402137756,
0.04430854320526123,
0.04395622760057449,
-0.00957451481372118,
0.0898505300283432,
0.061951953917741776,
-0.09723620116710663,
0.04799428582191467,
-0.056412223726511,
-0.04070793837308884,
0.029816102236509323,
-0.05005974695086479,
-0.0744568407535553,
0.08750627189874649,
0.008253339678049088,
0.09151284396648407,
0.003276665462180972,
-0.03375831991434097,
-0.08733843266963959,
0.025557735934853554,
-0.062150418758392334,
-0.0837860256433487,
-0.02313963882625103,
0.0041264016181230545,
-0.06349455565214157,
0.07698779553174973,
0.03809438273310661,
-0.02829466015100479,
0.00815641600638628,
-0.13703663647174835,
-0.15534377098083496,
-0.08866699039936066,
0.08738839626312256,
0.03572046756744385,
0.07123053818941116,
0.03077213652431965,
0.13035671412944794,
-0.17231512069702148,
0.10412133485078812,
0.12779377400875092,
-0.33750948309898376,
-0.03362104296684265,
0.14270304143428802,
0.08323051035404205,
0.09995169937610626,
-0.06449639797210693,
0.06401795148849487,
0.039977964013814926,
0.021276023238897324,
-0.0629025548696518,
-0.057138122618198395,
0.015006042085587978,
0.041897401213645935,
-0.15426203608512878,
-0.09160186350345612,
0.2073662430047989,
-0.09801914542913437,
0.06666362285614014,
0.03496044874191284,
-0.06013134494423866,
-0.025297338142991066,
-0.006367773283272982,
-0.01732347346842289,
-0.04672420769929886,
0.060169752687215805,
0.06270171701908112,
0.024588532745838165,
-0.08031263947486877,
0.03752122074365616,
-0.20598402619361877,
0.2263157218694687,
0.05745304003357887,
-0.003357744077220559,
-0.06795895099639893,
0.06084558367729187,
-0.019984928891062737,
-0.09527337551116943,
-0.01834353245794773,
-0.024522490799427032,
0.12178734689950943,
-0.01959048956632614,
-0.042958978563547134,
0.12998555600643158,
0.06396379321813583,
0.295026570558548,
0.03128892183303833,
0.029887381941080093,
-0.031195763498544693,
0.08437161892652512,
0.026819197461009026,
0.07374532520771027,
0.07713264971971512,
-0.037245973944664,
0.021507224068045616,
-0.06740592420101166,
0.059258878231048584,
-0.007450584787875414,
-0.15908639132976532,
-0.09638546407222748,
0.06136205047369003,
0.1375163495540619,
0.029751049354672432,
0.05694204941391945,
-0.04275280237197876,
0.03265036642551422,
0.05403586104512215,
-0.04900665953755379,
0.07028448581695557,
-0.025349203497171402,
0.08821388334035873,
0.03984959051012993,
-0.04843538627028465,
-0.00957223679870367,
-0.05072654038667679,
0.18995225429534912,
-0.011666352860629559,
0.015934448689222336,
-0.0055434382520616055,
-0.08040493726730347,
0.08665529638528824,
-0.1999172419309616,
0.0028976313769817352,
-0.12713482975959778,
-0.13194307684898376,
0.009652414359152317,
-0.025164751335978508,
-0.0032066511921584606,
-0.0018704994581639767,
-0.002643517218530178,
-0.04663848504424095,
-0.02303055301308632,
-0.034319303929805756,
-0.1316903829574585,
-0.08333660662174225,
0.10877861082553864,
-0.10138093680143356,
0.0024266194086521864,
-0.08973231166601181,
0.04627131298184395,
-0.11324514448642731,
0.0015638802433386445,
-0.17071691155433655,
-0.05587819218635559,
-0.12300998717546463,
0.1479438841342926,
0.09228334575891495,
-0.0006734570488333702,
-0.09741275757551193,
0.06629016250371933,
-0.0052545079961419106,
0.1158532053232193,
-0.12077326327562332,
-0.07394228875637054,
0.17923447489738464,
-0.19305996596813202,
-0.12294309586286545,
0.12817978858947754,
0.05457429215312004,
-0.01921391487121582,
0.07531779259443283,
0.13547562062740326,
0.13326828181743622,
-0.07001069188117981,
-0.0008837837958708405,
0.07404530048370361,
-0.06448796391487122,
-0.10919466614723206,
-0.007244568783789873,
-0.0679580420255661,
0.01546523068100214,
0.03398599848151207,
0.011035487055778503,
0.03777652606368065,
-0.05524367094039917,
-0.0075918217189610004,
-0.023264292627573013,
-0.0010877355234697461,
0.06576617062091827,
0.04437108337879181,
0.03665223345160484,
-0.10166704654693604,
0.002559307962656021,
0.00012733228504657745,
-0.02780165523290634,
0.06494992226362228,
0.062073759734630585,
-0.07716239988803864,
0.19173774123191833,
0.0836549922823906,
0.010057457722723484,
-0.135535329580307,
0.006910534575581551,
0.014781326986849308,
0.11255879700183868,
0.00441023102030158,
0.053641464561223984,
0.0856257975101471,
0.0077789174392819405,
-0.039921753108501434,
-0.018530111759901047,
0.12557464838027954,
0.01277666911482811,
-0.095286525785923,
-0.08446000516414642,
0.03270133584737778,
-0.03169900178909302,
-0.01196819357573986,
-0.08053554594516754,
0.052405402064323425,
0.18353866040706635,
0.18368756771087646,
-0.05441748723387718,
0.06530039757490158,
-0.08480077981948853,
0.07379547506570816,
-0.03605526685714722,
0.0008221969474107027,
0.08341491222381592,
0.03453139215707779,
-0.033498890697956085,
0.1107851043343544,
-0.17195966839790344,
0.2967969477176666,
0.20714354515075684,
-0.23230543732643127,
0.035355690866708755,
-0.06760264933109283,
-0.025682181119918823,
0.010192886926233768,
0.04064522683620453,
-0.043101005256175995,
-0.045179665088653564,
-0.00788352731615305,
0.08787588030099869,
-0.023524712771177292,
-0.07070564478635788,
0.0014301091432571411,
-0.05106084421277046,
-0.09582579135894775,
0.10750926285982132,
0.10296908020973206,
-0.2156399041414261,
0.209473118185997,
0.3545410633087158,
-0.01953730918467045,
0.09293241053819656,
-0.07829494774341583,
0.021935882046818733,
0.025595184415578842,
-0.02308598905801773,
-0.018355175852775574,
0.04333227500319481,
-0.17175869643688202,
-0.007158085238188505,
0.048840172588825226,
0.02759169228374958,
0.0139846196398139,
-0.11779829114675522,
-0.03979530557990074,
0.02959231100976467,
0.029349271208047867,
-0.022389719262719154,
0.09544049203395844,
0.010874379426240921,
0.0972672775387764,
-0.08275540173053741,
-0.07607053220272064,
0.10693252831697464,
0.04398731142282486,
-0.03611709922552109,
0.12474344670772552,
-0.1829809844493866,
-0.23578640818595886,
-0.11470863968133926,
-0.1407761126756668,
0.04821084439754486,
0.03265396133065224,
0.13051441311836243,
-0.06788971275091171,
-0.05106007680296898,
0.021451741456985474,
-0.020211244001984596,
-0.0712568387389183,
0.016511350870132446,
-0.1493675857782364,
-0.015108310617506504,
-0.10346982628107071,
-0.0471099317073822,
-0.08227726817131042,
-0.0629660114645958,
-0.061520252376794815,
0.14951719343662262,
-0.086258664727211,
0.0704444944858551,
0.05865097790956497,
0.01982821337878704,
0.06927232444286346,
-0.04628189653158188,
0.1350127011537552,
-0.04116841033101082,
-0.003542720340192318,
0.18956582248210907,
-0.07267685234546661,
0.09594901651144028,
0.1315418928861618,
0.02044052444398403,
-0.0614522360265255,
-0.03475421294569969,
0.00794148724526167,
-0.11904728412628174,
-0.17541274428367615,
-0.1121600791811943,
-0.13155485689640045,
0.10041596740484238,
0.02766185998916626,
0.06398944556713104,
0.04206732288002968,
0.04641389474272728,
-0.057845793664455414,
-0.0749964788556099,
0.0021590811666101217,
0.06712999939918518,
0.32745885848999023,
0.0008435109630227089,
0.10175289213657379,
-0.05078975483775139,
-0.07290703058242798,
0.0969100072979927,
-0.050465960055589676,
0.09928235411643982,
0.1062106117606163,
-0.06365463137626648,
0.08004340529441833,
0.2162604033946991,
0.13914965093135834,
0.039100658148527145,
0.0265359990298748,
-0.02018672227859497,
-0.005515547469258308,
-0.04806795343756676,
-0.05353359133005142,
-0.025255447253584862,
0.044441357254981995,
-0.1092064306139946,
-0.03381022438406944,
-0.11882858723402023,
0.05718756094574928,
0.08697264641523361,
0.022650621831417084,
-0.15920789539813995,
-0.03603818267583847,
0.05352415889501572,
0.078511081635952,
-0.09672137349843979,
0.07811077684164047,
-0.01845255121588707,
-0.08586810529232025,
0.11826625466346741,
-0.009007804095745087,
0.10082422196865082,
0.025640180334448814,
0.1038886159658432,
-0.01680644042789936,
-0.11065337061882019,
0.0009065076010301709,
0.04039006307721138,
-0.2957231104373932,
0.20881421864032745,
0.0073619140312075615,
-0.03790118172764778,
-0.04847204312682152,
-0.02423705905675888,
0.0706871747970581,
0.2567981779575348,
0.02695460245013237,
0.0527188703417778,
-0.07339321076869965,
-0.09828947484493256,
0.038279399275779724,
-0.028524124994874,
0.07860517501831055,
-0.07727834582328796,
-0.0016928762197494507,
-0.03467097133398056,
0.014104855246841908,
-0.03830500692129135,
-0.07643139362335205,
-0.014179627411067486,
-0.12863892316818237,
0.05987114459276199,
0.06787695735692978,
0.05407346785068512,
-0.02526257000863552,
-0.09162383526563644,
-0.18762172758579254,
0.1364554613828659,
-0.10590958595275879,
-0.016487756744027138,
-0.08606453984975815,
-0.05262012034654617,
0.05977889895439148,
-0.06499585509300232,
0.06628607213497162,
-0.04690183699131012,
-0.03668814152479172,
-0.06900132447481155,
-0.05206760764122009,
0.13755057752132416,
-0.11100966483354568,
-0.07581239938735962,
-0.02308228425681591,
0.22793325781822205,
0.008456047624349594,
0.027326907962560654,
-0.016063297167420387,
0.05007924512028694,
-0.00674104830250144,
-0.06891147792339325,
0.03395961597561836,
0.09952034801244736,
0.0650191605091095,
0.08357848227024078,
-0.08612757921218872,
-0.06787315011024475,
0.022307848557829857,
-0.003134719328954816,
0.15028515458106995,
0.29914841055870056,
-0.035574741661548615,
0.06058768182992935,
0.10340540111064911,
-0.06793262809515,
-0.35660025477409363,
-0.11513934284448624,
-0.17889472842216492,
-0.0072630178183317184,
0.05492782965302467,
-0.042468491941690445,
0.12058591842651367,
0.0762956514954567,
-0.06250174343585968,
0.036143288016319275,
-0.20245930552482605,
-0.09361761808395386,
0.18548491597175598,
-0.04013096168637276,
0.3831343352794647,
-0.0838465616106987,
-0.09987321496009827,
0.05081998556852341,
-0.17811216413974762,
0.009937813505530357,
-0.04384274035692215,
0.038439709693193436,
-0.005376093089580536,
0.008584530092775822,
0.04569803550839424,
-0.048008449375629425,
0.09993443638086319,
0.033969540148973465,
0.02257426269352436,
-0.15448705852031708,
-0.0789606124162674,
0.090849369764328,
-0.035936493426561356,
-0.0972866490483284,
-0.06873943656682968,
0.00720955478027463,
-0.11166628450155258,
-0.016855388879776,
-0.021153179928660393,
0.12134959548711777,
-0.029831713065505028,
-0.03759189695119858,
-0.01899980567395687,
0.00968296267092228,
-0.05887357518076897,
0.006499661598354578,
0.20179609954357147,
-0.023194171488285065,
0.11732624471187592,
0.08649896830320358,
0.08098337799310684,
-0.14161372184753418,
0.07633925974369049,
-0.09075915813446045,
-0.03678693622350693,
0.03846551477909088,
0.10384568572044373,
0.04649445787072182,
0.15906774997711182,
-0.015815960243344307,
-0.0047510406002402306,
0.08338862657546997,
0.05199510604143143,
-0.06305187195539474,
0.08357159048318863,
-0.18478387594223022,
-0.09296302497386932,
0.0023321316111832857,
-0.06472688913345337,
0.04140454903244972,
0.13621419668197632,
0.0996135026216507,
0.06852547079324722,
-0.01572713628411293,
0.007823392748832703,
-0.022973494604229927,
0.01988813281059265,
0.048573609441518784,
-0.03316108137369156,
-0.004682008642703295,
-0.09753776341676712,
0.022654986009001732,
-0.04601661115884781,
-0.16356778144836426,
-0.03932303935289383,
0.11245742440223694,
-0.15552164614200592,
-0.13776136934757233,
-0.03552516549825668,
0.043354976922273636,
-0.1382078230381012,
-0.030620798468589783,
-0.019021283835172653,
-0.1735755354166031,
0.11613985896110535,
0.21858499944210052,
0.10460959374904633,
0.050443075597286224,
-0.0327950194478035,
0.015413815155625343,
0.015097606927156448,
0.06970001012086868,
0.021631693467497826,
0.0436236709356308,
-0.15433034300804138,
0.04959337040781975,
-0.018016565591096878,
0.01932300627231598,
-0.058671772480010986,
-0.04147465154528618,
-0.11831024289131165,
0.04664551094174385,
-0.07518437504768372,
-0.06244877725839615,
-0.1241380050778389,
-0.024007907137274742,
0.0024501809384673834,
-0.16774196922779083,
-0.03287828713655472,
-0.035412877798080444,
-0.09775854647159576,
0.07445935159921646,
0.03517257049679756,
0.10660552233457565,
-0.05391235277056694,
-0.06653514504432678,
0.1066296398639679,
-0.03260350227355957,
0.11362891644239426,
0.1341092884540558,
-0.05288289114832878,
0.0728512704372406,
-0.1273462176322937,
-0.0946686640381813,
0.12528681755065918,
0.04995177313685417,
0.029760463163256645,
0.17097420990467072,
-0.023855436593294144,
0.07748769968748093,
-0.03567756339907646,
0.1040666326880455,
0.03668508678674698,
-0.07607483863830566,
0.08751126378774643,
-0.004979080520570278,
-0.17809021472930908,
0.0050339726731181145,
-0.02705438621342182,
0.1376253366470337,
-0.012555747292935848,
0.16628390550613403,
-0.06069072708487511,
-0.0016117087798193097,
-0.057143695652484894,
0.03804833069443703,
0.0072130970656871796,
-0.18220023810863495,
-0.12275983393192291,
-0.07990266382694244,
0.013662490993738174,
0.0384882390499115,
0.16037866473197937,
-0.020777205005288124,
-0.0720447525382042,
0.06738237291574478,
0.10471594333648682,
0.016837310045957565,
-0.01650996133685112,
0.2004603147506714,
0.08218609541654587,
-0.06173615902662277,
-0.10369645059108734,
0.004622071981430054,
-0.02556667849421501,
-0.13720063865184784,
0.10669928044080734,
0.11829311400651932,
0.028833746910095215,
0.016410863026976585,
0.055345483124256134,
0.016994431614875793,
-0.15251320600509644,
-0.1274818331003189,
0.01892501674592495,
0.03920381888747215,
-0.0012240660144016147,
0.14930778741836548,
0.1400579810142517,
-0.01495968084782362,
-0.021551018580794334,
-0.05468396842479706,
-0.0022757151164114475,
-0.14913752675056458,
-0.1681307554244995,
-0.04505442827939987,
-0.05791958048939705,
0.01870054006576538,
-0.009043541736900806,
0.03972825035452843,
0.06722494959831238,
0.04923127964138985,
-0.04692695289850235,
0.01578737422823906,
-0.04961396008729935,
-0.03409310802817345,
0.006710524670779705,
-0.05489983409643173,
0.06892271339893341,
-0.08811528980731964,
-0.11341703683137894,
-0.09381775557994843,
-0.019395316019654274,
-0.054201312363147736,
-0.011105109006166458,
-0.0026864390820264816,
-0.028899583965539932,
-0.1644727885723114,
-0.05920371785759926,
-0.025869173929095268,
0.06941544264554977,
0.008073225617408752,
0.025867359712719917,
0.028265774250030518,
-0.010381766594946384,
0.03654398396611214,
0.1674565076828003,
-0.01507575437426567,
-0.1653399020433426,
0.07502194494009018,
0.15986232459545135,
0.11879545450210571,
0.10413447767496109,
0.02562720514833927,
0.03182845935225487,
0.05129969120025635,
0.1418733298778534,
0.24933451414108276,
-0.0014035621425136924,
0.06591559946537018,
0.02359781041741371,
0.02721569687128067,
0.11122895032167435,
0.005149662494659424,
0.06680771708488464,
0.19719190895557404,
-0.041373271495103836,
-0.03980334848165512,
-0.088617242872715,
-0.01976822316646576,
-0.17299893498420715,
-0.0047083753161132336,
-0.004999286495149136,
-0.08702559769153595,
0.01624167338013649,
0.0885075032711029,
-0.10214193165302277,
0.13351821899414062,
0.04310128092765808,
-0.1852610856294632,
-0.05681882053613663,
0.0015699954237788916,
0.16506370902061462,
0.07793749123811722,
0.05502821132540703,
-0.07591848820447922,
-0.08486863970756531,
-0.010085036978125572,
0.019723959267139435,
-0.13319715857505798,
-0.04763300344347954,
0.054960835725069046,
0.040814414620399475,
0.08204079419374466,
-0.02961929887533188,
0.04534395411610603,
0.11884446442127228,
0.08709125220775604,
-0.02349299192428589,
0.012844526208937168,
0.057610947638750076,
-0.04907649755477905,
-0.0110700698569417,
-0.14964793622493744,
0.016654135659337044,
-0.0005920145777054131,
0.09541333466768265,
-0.16511020064353943,
0.06839893758296967,
0.010588554665446281,
-0.08856011927127838,
0.008192461915314198,
0.09461256116628647,
-0.028950486332178116,
0.04076170176267624,
0.045734137296676636,
0.007989814504981041,
-0.053198739886283875,
-0.0550294890999794,
-0.060740694403648376,
0.028090203180909157,
-0.08392855525016785,
-0.04042330011725426,
-0.0761353075504303,
-0.0246942937374115,
0.05149832367897034,
-0.02698250487446785,
0.013595850206911564,
-0.09075700491666794,
-0.11206284910440445,
0.08268749713897705,
-0.21447716653347015,
0.03901194781064987,
0.06363561004400253,
0.0036026439629495144,
0.023458492010831833,
-0.06469336897134781,
0.012340853922069073,
0.09721982479095459,
-0.08261135220527649,
-0.020463382825255394
] |
null | null |
transformers
|
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-finetuned-addresso
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 5
- eval_batch_size: 5
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Framework versions
- Transformers 4.12.5
- Pytorch 1.8.1
- Datasets 1.15.1
- Tokenizers 0.10.3
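For reference, a minimal sketch of how the hyperparameters listed above map onto `transformers.TrainingArguments`; the dataset, preprocessing, and `Trainer` wiring are not documented in the card and are omitted:

```python
# Sketch only: mirrors the hyperparameters listed above. The fine-tuning data
# and preprocessing are unknown, so no Trainer/dataset code is shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-addresso",
    learning_rate=2e-5,
    per_device_train_batch_size=5,
    per_device_eval_batch_size=5,
    seed=42,
    num_train_epochs=1,
    lr_scheduler_type="linear",  # linear LR schedule
    adam_beta1=0.9,              # Adam with betas=(0.9, 0.999) and epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```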
|
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "model-index": [{"name": "bert-base-uncased-finetuned-addresso", "results": []}]}
|
text-classification
|
annafavaro/bert-base-uncased-finetuned-addresso
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
# bert-base-uncased-finetuned-addresso
This model is a fine-tuned version of bert-base-uncased on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 5
- eval_batch_size: 5
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Framework versions
- Transformers 4.12.5
- Pytorch 1.8.1
- Datasets 1.15.1
- Tokenizers 0.10.3
|
[
"# bert-base-uncased-finetuned-addresso\n\nThis model is a fine-tuned version of bert-base-uncased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 5\n- eval_batch_size: 5\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1",
"### Framework versions\n\n- Transformers 4.12.5\n- Pytorch 1.8.1\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
"TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"# bert-base-uncased-finetuned-addresso\n\nThis model is a fine-tuned version of bert-base-uncased on an unknown dataset.",
"## Model description\n\nMore information needed",
"## Intended uses & limitations\n\nMore information needed",
"## Training and evaluation data\n\nMore information needed",
"## Training procedure",
"### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 5\n- eval_batch_size: 5\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1",
"### Framework versions\n\n- Transformers 4.12.5\n- Pytorch 1.8.1\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
55,
43,
6,
12,
8,
3,
90,
30
] |
[
"passage: TAGS\n#transformers #pytorch #tensorboard #bert #text-classification #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# bert-base-uncased-finetuned-addresso\n\nThis model is a fine-tuned version of bert-base-uncased on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 5\n- eval_batch_size: 5\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1### Framework versions\n\n- Transformers 4.12.5\n- Pytorch 1.8.1\n- Datasets 1.15.1\n- Tokenizers 0.10.3"
] |
[
-0.06482783704996109,
0.07127998024225235,
-0.0020309744868427515,
0.0748772844672203,
0.1943807750940323,
0.02254786156117916,
0.12746836245059967,
0.0818110778927803,
-0.1132730022072792,
0.03403251990675926,
0.06650880724191666,
0.08765990287065506,
0.015962351113557816,
0.10446649044752121,
-0.05554581061005592,
-0.25444239377975464,
0.009845377877354622,
0.04743649810552597,
-0.10624119639396667,
0.08194869756698608,
0.09809279441833496,
-0.12528400123119354,
0.07506228983402252,
0.02615467458963394,
-0.20760567486286163,
0.045496195554733276,
-0.021749945357441902,
-0.07475604861974716,
0.10122986137866974,
0.024434959515929222,
0.14821498095989227,
-0.008880967274308205,
0.12879233062267303,
-0.1670479029417038,
-0.0004716544644907117,
0.08660799264907837,
0.032091058790683746,
0.0796973779797554,
0.03509058058261871,
0.02169569954276085,
0.06668152660131454,
-0.08655978739261627,
0.09844454377889633,
0.020835816860198975,
-0.05912534520030022,
-0.20986609160900116,
-0.05398185923695564,
0.04781926050782204,
0.07941637933254242,
0.08787696063518524,
0.016827154904603958,
0.1291610300540924,
-0.06833364069461823,
0.07538694888353348,
0.22728238999843597,
-0.2820639908313751,
-0.07429713755846024,
0.0583631657063961,
0.032888878136873245,
0.06056205928325653,
-0.08548000454902649,
-0.025947466492652893,
0.05725432559847832,
0.05455540120601654,
0.12222535163164139,
-0.02919747866690159,
-0.10787362605333328,
-0.018304575234651566,
-0.1479939967393875,
0.01616409607231617,
0.16848725080490112,
0.024382727220654488,
-0.06122123450040817,
-0.030890630558133125,
-0.07897752523422241,
-0.04588529095053673,
-0.03909917548298836,
-0.06050833687186241,
0.061419133096933365,
-0.041522737592458725,
-0.06878223270177841,
-0.07379425317049026,
-0.07038010656833649,
-0.05624677985906601,
-0.024884942919015884,
0.1803014725446701,
0.05481278523802757,
0.02458324283361435,
-0.05746012181043625,
0.08222071081399918,
-0.04690936952829361,
-0.11194325983524323,
0.01663736253976822,
-0.004999787081032991,
-0.005269996821880341,
-0.05779242888092995,
-0.07784036546945572,
-0.04577965289354324,
0.008760667406022549,
0.16394290328025818,
-0.05274555832147598,
0.07085705548524857,
-0.0008801071089692414,
0.010783814825117588,
-0.04155841842293739,
0.1498125195503235,
-0.03562990948557854,
-0.02623867616057396,
0.02280987612903118,
0.05492512881755829,
0.008636078797280788,
-0.006391433533281088,
-0.11896242201328278,
0.011525725945830345,
0.07058390974998474,
0.006299552973359823,
-0.08077353239059448,
0.05225706472992897,
-0.009211190044879913,
-0.04908572882413864,
-0.020238490775227547,
-0.1155654564499855,
0.03933054581284523,
-0.020206393674016,
-0.06847377121448517,
0.011098627932369709,
0.041396789252758026,
0.008137228898704052,
-0.03235330805182457,
0.1268593966960907,
-0.09894480556249619,
0.011223584413528442,
-0.1149035170674324,
-0.11435352265834808,
-0.003983956761658192,
-0.09493635594844818,
0.016313739120960236,
-0.09666376560926437,
-0.18486396968364716,
-0.0012387498281896114,
0.06905325502157211,
-0.02793380618095398,
-0.02748860977590084,
-0.028408462181687355,
-0.07399045675992966,
0.012074343860149384,
-0.015006317757070065,
0.12208043783903122,
-0.04356575757265091,
0.0540771521627903,
0.032133642584085464,
0.03483156859874725,
-0.05642194300889969,
0.054095420986413956,
-0.08234420418739319,
0.00705278804525733,
-0.18981213867664337,
0.04174603521823883,
-0.09537146985530853,
0.023840315639972687,
-0.08613316714763641,
-0.10509175062179565,
0.011220532469451427,
0.0069501339457929134,
0.07132108509540558,
0.07543934881687164,
-0.12569601833820343,
-0.058089207857847214,
0.1340712159872055,
-0.07954874634742737,
-0.07478845119476318,
0.08866428583860397,
-0.056648992002010345,
0.056348446756601334,
0.06259752810001373,
0.13083192706108093,
0.05982320010662079,
-0.13509933650493622,
-0.01003497838973999,
0.0182887502014637,
0.08504799008369446,
-0.02003195323050022,
0.036013033241033554,
-0.005909984931349754,
-0.014977451413869858,
0.01913730800151825,
-0.05848417058587074,
-0.017067622393369675,
-0.09326506406068802,
-0.06583848595619202,
-0.05344030633568764,
-0.0938539206981659,
0.03801330178976059,
0.024282267317175865,
0.07439607381820679,
-0.06934529542922974,
-0.10127466171979904,
0.19989338517189026,
0.09820044785737991,
-0.07840946316719055,
0.026531167328357697,
-0.07089631259441376,
0.03710761293768883,
-0.03969868645071983,
-0.013262985274195671,
-0.2132486253976822,
-0.09429936856031418,
0.028118789196014404,
-0.03413846716284752,
0.054796118289232254,
0.03197386488318443,
0.05245034396648407,
0.07875534892082214,
-0.04700471833348274,
0.019009172916412354,
-0.07833432406187057,
0.002981848083436489,
-0.12927205860614777,
-0.17793045938014984,
-0.061129070818424225,
-0.02534143626689911,
0.11483053117990494,
-0.2141788750886917,
0.03653190657496452,
-0.04209783673286438,
0.12581036984920502,
0.02863951027393341,
-0.023782210424542427,
-0.06574627012014389,
0.0819670781493187,
-0.026252061128616333,
-0.07513885200023651,
0.04807289317250252,
0.015100068412721157,
-0.05231960117816925,
-0.12748883664608002,
-0.12923656404018402,
0.1179465800523758,
0.1126578077673912,
-0.07279430329799652,
-0.0677889958024025,
0.015973616391420364,
-0.04446060210466385,
-0.032350990921258926,
-0.07610908895730972,
-0.006528378929942846,
0.19473443925380707,
-0.021413588896393776,
0.16281673312187195,
-0.06844127178192139,
-0.03660007566213608,
0.01956287957727909,
-0.02525261417031288,
0.00795168150216341,
0.04781927540898323,
0.10929710417985916,
-0.07041371613740921,
0.10857638716697693,
0.13064192235469818,
-0.12934978306293488,
0.129508838057518,
-0.02557157725095749,
-0.06461670994758606,
0.0017241956666111946,
-0.03892230987548828,
-0.01462436094880104,
0.09749121963977814,
-0.16354595124721527,
-0.012755957432091236,
0.0268046036362648,
0.02348237857222557,
0.04770719259977341,
-0.1845017969608307,
0.02357657626271248,
0.02734464965760708,
-0.014813501387834549,
-0.006884340662509203,
-0.04923297464847565,
0.02195499651134014,
0.08876961469650269,
0.025379111990332603,
-0.04512178525328636,
0.033738091588020325,
0.011743451468646526,
-0.0659957081079483,
0.20087891817092896,
-0.1297997683286667,
-0.1280069500207901,
-0.12086914479732513,
-0.05426810309290886,
-0.0789153054356575,
0.0015907140914350748,
0.04987316578626633,
-0.08877213299274445,
-0.06888382881879807,
-0.053730178624391556,
0.019874775782227516,
-0.01854623481631279,
0.009621970355510712,
0.07986113429069519,
-0.0062939184717834,
0.09203676134347916,
-0.13781417906284332,
-0.005938954651355743,
-0.04227641597390175,
-0.12083905190229416,
-0.004131841938942671,
0.05733755603432655,
0.09680427610874176,
0.12741997838020325,
-0.04607066884636879,
0.013409686274826527,
-0.02476895973086357,
0.2438976913690567,
-0.04083620756864548,
-0.034070055931806564,
0.130591481924057,
-0.0006612929282709956,
0.051469042897224426,
0.08942621201276779,
0.0697643980383873,
-0.10268253833055496,
0.027446875348687172,
0.08706175535917282,
-0.03571101278066635,
-0.21866685152053833,
-0.042874522507190704,
-0.026343934237957,
-0.07475528866052628,
0.1012221947312355,
0.03485661745071411,
0.016621798276901245,
0.07582280039787292,
0.014436637982726097,
0.12458053976297379,
-0.03603839874267578,
0.09939361363649368,
0.14016090333461761,
0.046723753213882446,
0.12822003662586212,
-0.030461890622973442,
-0.05581929534673691,
0.05884920433163643,
-0.016485368832945824,
0.27276360988616943,
0.01056771818548441,
0.05529798939824104,
0.05308819189667702,
0.1270957589149475,
-0.018954208120703697,
0.06702067703008652,
-0.008122342638671398,
-0.018726002424955368,
-0.00877716951072216,
-0.0599750280380249,
-0.021232038736343384,
0.013612331822514534,
-0.08878146857023239,
0.06001707911491394,
-0.1015816479921341,
0.01851636916399002,
0.022571049630641937,
0.2724939286708832,
-0.0051433369517326355,
-0.30137982964515686,
-0.0824236050248146,
-0.0009932322427630424,
-0.029318014159798622,
-0.06483785808086395,
0.02623801678419113,
0.0875699445605278,
-0.10312759131193161,
0.061154838651418686,
-0.06357145309448242,
0.10677103698253632,
-0.000544574111700058,
0.03903781995177269,
0.08097449690103531,
0.1609259843826294,
0.0004277942643966526,
0.06688416004180908,
-0.24540048837661743,
0.21155893802642822,
0.03190566599369049,
0.12232138961553574,
-0.06331686675548553,
0.027951359748840332,
0.02402138151228428,
0.09151279181241989,
0.044485773891210556,
-0.01762104406952858,
0.014433196745812893,
-0.18140743672847748,
-0.03191894292831421,
0.053866975009441376,
0.12924876809120178,
-0.01296161487698555,
0.09779172390699387,
-0.042958855628967285,
0.013293405994772911,
0.06786082684993744,
-0.034877270460128784,
-0.16583849489688873,
-0.10926494747400284,
-0.0058926246128976345,
0.04052126407623291,
-0.06797628849744797,
-0.06330692023038864,
-0.10937657207250595,
-0.05965428799390793,
0.17022934556007385,
0.0008190429653041065,
-0.04218725860118866,
-0.12707547843456268,
0.0820937305688858,
0.07691386342048645,
-0.05734375864267349,
0.05111538618803024,
0.0037491335533559322,
0.11601132154464722,
0.03615519404411316,
-0.1210191622376442,
0.08452130109071732,
-0.09079315513372421,
-0.14539624750614166,
-0.04150857403874397,
0.060971882194280624,
0.05956388637423515,
0.030709395185112953,
-0.002243492752313614,
0.017728926613926888,
-0.005500136408954859,
-0.08587019145488739,
-0.02496328018605709,
0.04379953071475029,
0.09765604883432388,
0.07023851573467255,
-0.11221930384635925,
-0.004877164959907532,
-0.02145123854279518,
0.018938368186354637,
0.11181559413671494,
0.18864159286022186,
-0.0807129442691803,
0.014265134930610657,
0.10372155159711838,
-0.09551047533750534,
-0.1938483715057373,
0.08355789631605148,
0.1150779202580452,
-0.013502406887710094,
0.0250273235142231,
-0.2301628291606903,
0.2003554403781891,
0.13904792070388794,
-0.023296112194657326,
0.07003854215145111,
-0.254669189453125,
-0.13193564116954803,
0.12157884240150452,
0.13654108345508575,
0.1072758361697197,
-0.1519617736339569,
-0.020558878779411316,
-0.0579586885869503,
-0.16811944544315338,
0.1548812985420227,
-0.14593477547168732,
0.11150141060352325,
0.0010514033492654562,
0.09156664460897446,
0.010070144198834896,
-0.029858732596039772,
0.11764966696500778,
0.035878174006938934,
0.10127396136522293,
-0.0477503165602684,
0.023724885657429695,
0.08749599009752274,
-0.045134562999010086,
0.03710810840129852,
-0.010998078621923923,
0.04206577315926552,
-0.08947563916444778,
-0.02285541221499443,
-0.06004130095243454,
0.06792858242988586,
-0.03262588009238243,
-0.07366415858268738,
-0.04543084651231766,
0.006601712200790644,
0.03646131232380867,
-0.02596798539161682,
0.14474833011627197,
0.03938424214720726,
0.1439361423254013,
0.1238657757639885,
0.09168225526809692,
-0.10233992338180542,
-0.08092840760946274,
0.006892133504152298,
-0.029559431597590446,
0.09152509272098541,
-0.11339717358350754,
0.03876519948244095,
0.11755739152431488,
0.03865289315581322,
0.12366588413715363,
0.07926517724990845,
-0.02655557170510292,
0.005541619379073381,
0.04145324230194092,
-0.1335441917181015,
-0.11690329760313034,
-0.006528540048748255,
-0.028150996193289757,
-0.11843185871839523,
0.09016934037208557,
0.12867923080921173,
-0.07010656595230103,
-0.008421570993959904,
-0.020283197984099388,
-0.015657974407076836,
-0.03965875506401062,
0.18670882284641266,
0.04791470617055893,
0.044516097754240036,
-0.09730249643325806,
0.1257469803094864,
0.06875155121088028,
-0.049491047859191895,
0.043182648718357086,
0.04300791025161743,
-0.08831536769866943,
-0.012590600177645683,
0.06941433995962143,
0.19431298971176147,
-0.09560523927211761,
-0.04660581424832344,
-0.11031771451234818,
-0.10097156465053558,
0.049522534012794495,
0.16346707940101624,
0.08274250477552414,
-0.04646121710538864,
-0.06204168125987053,
0.07495921105146408,
-0.14729119837284088,
0.07200373709201813,
0.02011757902801037,
0.09302335977554321,
-0.15527069568634033,
0.12429793179035187,
0.026232216507196426,
0.03527490422129631,
-0.032713185995817184,
0.01698041521012783,
-0.1148754209280014,
-0.02406720072031021,
-0.17539483308792114,
-0.018336301669478416,
-0.0029778413008898497,
0.003850010922178626,
0.0005384054384194314,
-0.034224431961774826,
-0.06369123607873917,
0.05262862890958786,
-0.0855221152305603,
-0.03500119224190712,
0.04442349448800087,
0.038364339619874954,
-0.13584786653518677,
0.005877737421542406,
0.008716482669115067,
-0.08260458707809448,
0.053644172847270966,
0.06791216135025024,
0.011017367243766785,
0.05276920646429062,
-0.1260472536087036,
-0.030220529064536095,
0.05506206303834915,
0.04055769741535187,
0.09521279484033585,
-0.039130937308073044,
-0.010245230980217457,
-0.013597169890999794,
0.09555211663246155,
0.0010559933725744486,
0.11895984411239624,
-0.1280859261751175,
-0.010789777152240276,
-0.05583572015166283,
-0.058643173426389694,
-0.05599839240312576,
0.03203649818897247,
0.11417573690414429,
0.04526040330529213,
0.17988811433315277,
-0.07765397429466248,
0.00009340154065284878,
-0.17264962196350098,
-0.025497563183307648,
-0.005759535823017359,
-0.06700249016284943,
-0.06243986263871193,
-0.04660474881529808,
0.05489515885710716,
-0.056803859770298004,
0.14052744209766388,
0.006926666479557753,
0.10342966020107269,
0.03380339592695236,
-0.041473325341939926,
-0.029574686661362648,
-0.003120236797258258,
0.2033691108226776,
0.06760731339454651,
-0.01561333891004324,
0.06013897433876991,
0.037064097821712494,
0.10256253927946091,
0.07694888114929199,
0.19496870040893555,
0.12809236347675323,
-0.08007717877626419,
0.07468537241220474,
0.05985938757658005,
-0.08189867436885834,
-0.18416105210781097,
0.06448280811309814,
-0.003918569069355726,
0.13957487046718597,
-0.050268664956092834,
0.1659354418516159,
0.08513946831226349,
-0.14603590965270996,
0.03673909977078438,
-0.05874859169125557,
-0.09227850288152695,
-0.1428954154253006,
-0.014899106696248055,
-0.08174628764390945,
-0.14529503881931305,
0.005542591214179993,
-0.14597125351428986,
0.021918928250670433,
0.12892936170101166,
-0.004473341163247824,
0.013092444278299809,
0.16308923065662384,
-0.07594950497150421,
0.011767691932618618,
0.04774288833141327,
-0.0007578060613013804,
-0.024609539657831192,
-0.08286842703819275,
-0.07727253437042236,
0.015173918567597866,
0.024820934981107712,
0.059062857180833817,
-0.05560570955276489,
-0.02293265238404274,
0.03284610062837601,
-0.009783509187400341,
-0.055837150663137436,
0.02767094410955906,
0.03119293600320816,
0.03437473252415657,
0.03697701916098595,
0.00037367083132267,
-0.022803854197263718,
-0.03279315307736397,
0.27131277322769165,
-0.09997837245464325,
-0.07954549044370651,
-0.11923228204250336,
0.23711881041526794,
0.05489432439208031,
0.005741704721003771,
0.05218159407377243,
-0.09054107218980789,
-0.03901854157447815,
0.20498888194561005,
0.1526886522769928,
-0.07900801301002502,
-0.02225271612405777,
0.00841880775988102,
-0.020533263683319092,
-0.059610478579998016,
0.1463993340730667,
0.13479220867156982,
0.05783594772219658,
-0.049329109489917755,
-0.0390130952000618,
-0.00757700065150857,
-0.0011081632692366838,
-0.12755174934864044,
0.03752902150154114,
0.03285277634859085,
-0.01044008880853653,
-0.023686781525611877,
0.03545396029949188,
-0.0005624783225357533,
-0.18568232655525208,
0.02435450814664364,
-0.13456642627716064,
-0.16792801022529602,
-0.025609705597162247,
0.07342006266117096,
-0.03224695473909378,
0.060551196336746216,
-0.024826906621456146,
-0.014231882058084011,
0.1310618817806244,
-0.02728888764977455,
-0.03743394836783409,
-0.11774065345525742,
0.10766338557004929,
-0.10528386384248734,
0.22948811948299408,
-0.006597437895834446,
0.06548415869474411,
0.11115207523107529,
0.044069696217775345,
-0.08548576384782791,
0.03589988499879837,
0.04028264805674553,
-0.0874578207731247,
0.014100431464612484,
0.10439809411764145,
-0.07000457495450974,
0.08478646725416183,
0.01957501471042633,
-0.17240706086158752,
-0.01884405128657818,
-0.011146796867251396,
-0.05424334108829498,
-0.04781334847211838,
-0.03646528348326683,
-0.09805415570735931,
0.11880992352962494,
0.20953521132469177,
-0.01626778021454811,
-0.003906672354787588,
-0.08207675814628601,
0.045402105897665024,
0.07434853911399841,
0.07040543854236603,
-0.055882737040519714,
-0.23611235618591309,
0.018572039902210236,
0.03231353312730789,
-0.007916754111647606,
-0.24496731162071228,
-0.07079659402370453,
0.03734647110104561,
-0.03427828848361969,
-0.07317657768726349,
0.07006927579641342,
0.08166645467281342,
0.042308028787374496,
-0.0600370429456234,
-0.1314670592546463,
-0.08981449902057648,
0.157475546002388,
-0.16047711670398712,
-0.07317055761814117
] |
null | null | null |
ktrain predictor for NER of ADR (adverse drug reactions) in patient forum discussions. Created in ktrain 0.29 with transformers 4.10. See requirements.txt to run the model.
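A rough sketch of how a saved ktrain predictor is typically reloaded (the local path below is hypothetical and assumes the repository files have been downloaded; the entity labels are not documented here):

```python
# Sketch under assumptions: predictors saved with ktrain are reloaded with
# ktrain.load_predictor(); "ADR_extraction_patient_forum" is a hypothetical
# local path to the downloaded repository files.
import ktrain

predictor = ktrain.load_predictor("ADR_extraction_patient_forum")
text = "I had terrible headaches after starting this medication."
print(predictor.predict(text))  # for NER predictors this yields (token, tag) pairs
```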
|
{}
| null |
annedirkson/ADR_extraction_patient_forum
|
[
"tf",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#tf #region-us
|
ktrain predictor for NER of ADR in patient forum discussions. Created in ktrain 0.29 with transformers 4.10. See URL to run model.
|
[] |
[
"TAGS\n#tf #region-us \n"
] |
[
9
] |
[
"passage: TAGS\n#tf #region-us \n"
] |
[
0.02717403694987297,
-0.0850527435541153,
-0.008868269622325897,
-0.021143348887562752,
0.07884491235017776,
0.06604291498661041,
-0.01935601979494095,
0.048123981803655624,
0.17828558385372162,
-0.053398214280605316,
0.08256072551012039,
0.0009474587859585881,
-0.022046472877264023,
0.09297667443752289,
-0.02209344506263733,
-0.19969743490219116,
0.00041887242696247995,
-0.0015154649736359715,
-0.042260799556970596,
0.04567328467965126,
-0.03771711140871048,
-0.026329118758440018,
0.030627479776740074,
-0.1159137561917305,
-0.13830040395259857,
0.12074124813079834,
0.06795552372932434,
-0.04049430415034294,
0.12169930338859558,
0.05518333241343498,
0.12711569666862488,
0.02800789475440979,
-0.1264970302581787,
-0.14019377529621124,
0.04070953652262688,
0.015509658493101597,
-0.1364237517118454,
0.009036021307110786,
0.07121166586875916,
-0.061136167496442795,
0.059957318007946014,
0.12422697246074677,
-0.02070854976773262,
0.07782208174467087,
-0.2704756557941437,
-0.17011389136314392,
-0.04335585609078407,
-0.011380821466445923,
0.009439419023692608,
0.02034422568976879,
0.03494933620095253,
0.22119572758674622,
-0.19268950819969177,
0.08085671812295914,
0.0667259618639946,
-0.27503007650375366,
0.02575431950390339,
0.14233475923538208,
-0.03822958096861839,
0.1485864818096161,
-0.022014182060956955,
0.08272524923086166,
0.06752051413059235,
0.00688638212159276,
-0.089729443192482,
-0.09771507233381271,
-0.19509509205818176,
0.1414337307214737,
-0.10734352469444275,
-0.08794605731964111,
0.35105767846107483,
0.0310226883739233,
-0.001470237853936851,
0.21522894501686096,
-0.09059171378612518,
-0.046921294182538986,
0.0703350156545639,
-0.03347092494368553,
-0.021047772839665413,
0.1274232715368271,
0.1346426159143448,
-0.07975554466247559,
-0.13711398839950562,
0.00797922071069479,
-0.30296027660369873,
0.17487899959087372,
-0.03417843580245972,
0.1232241541147232,
-0.2726263999938965,
0.015277048572897911,
-0.2604098916053772,
-0.0025342307053506374,
0.0632655918598175,
-0.07521133124828339,
-0.0705866888165474,
-0.057595036923885345,
-0.03067977912724018,
-0.032462842762470245,
0.0665893629193306,
0.11917254328727722,
-0.08973272144794464,
0.039811961352825165,
-0.16164931654930115,
0.11579488962888718,
0.017941581085324287,
0.033652376383543015,
0.030433742329478264,
0.05717121809720993,
-0.005282871890813112,
-0.25493112206459045,
-0.08231059461832047,
-0.053496260195970535,
-0.07874602824449539,
-0.018844015896320343,
-0.0834488794207573,
0.1340535581111908,
-0.027304908260703087,
-0.02118043787777424,
-0.048680201172828674,
0.04640261083841324,
0.03816661238670349,
-0.038869790732860565,
-0.08223643898963928,
-0.020980896428227425,
0.022619226947426796,
0.06617368012666702,
-0.072126604616642,
-0.028710564598441124,
0.041142065078020096,
0.049063000828027725,
-0.1366422474384308,
-0.042985327541828156,
-0.003312063403427601,
-0.0043203337118029594,
0.07464902848005295,
-0.0869700014591217,
0.030992023646831512,
-0.1929476112127304,
-0.0472056120634079,
0.0482732430100441,
-0.029292818158864975,
-0.01024946104735136,
0.12697157263755798,
0.06623061001300812,
0.004117061849683523,
-0.04233861342072487,
-0.019819604232907295,
-0.03495892509818077,
-0.06701481342315674,
0.09131357073783875,
-0.024492869153618813,
0.05391552671790123,
-0.23763039708137512,
0.018832607194781303,
-0.08696866035461426,
0.053401507437229156,
-0.13578394055366516,
-0.02757594734430313,
-0.022242018952965736,
0.17814864218235016,
0.015685174614191055,
0.04663572460412979,
-0.27110999822616577,
0.021877288818359375,
-0.08512535691261292,
0.17755696177482605,
-0.16860368847846985,
-0.06670980155467987,
0.2544674575328827,
-0.09709347784519196,
-0.14496742188930511,
0.046707138419151306,
0.05935780331492424,
0.04641662538051605,
0.06388852745294571,
0.410780131816864,
0.004673480987548828,
-0.13018378615379333,
0.15353848040103912,
0.22013212740421295,
-0.19294801354408264,
-0.09771259129047394,
0.0450412854552269,
-0.1074908971786499,
-0.22106847167015076,
-0.01014086976647377,
0.07999573647975922,
0.12680090963840485,
-0.08026163280010223,
0.002472531283274293,
0.047156259417533875,
-0.007242947816848755,
0.09082997590303421,
0.04170104116201401,
0.09191294014453888,
-0.08928097784519196,
0.10163529962301254,
-0.06283904612064362,
-0.009731750003993511,
0.14271113276481628,
0.009920487180352211,
-0.04340474307537079,
0.07706569135189056,
0.03029152750968933,
0.046988703310489655,
-0.15473642945289612,
-0.22551152110099792,
0.026446353644132614,
0.1752074658870697,
0.04971577599644661,
0.1923559606075287,
0.09056459367275238,
-0.08277427405118942,
-0.021399736404418945,
0.014442592859268188,
0.09127546101808548,
0.04057294502854347,
0.04075543209910393,
-0.03846470266580582,
0.1246790736913681,
-0.08546542376279831,
-0.10931690037250519,
-0.07317113876342773,
-0.028430841863155365,
0.2242366522550583,
0.030650822445750237,
0.10337765514850616,
0.01933816261589527,
0.01822059229016304,
0.030919868499040604,
0.08019567281007767,
-0.03319520875811577,
0.052064426243305206,
-0.013382078148424625,
-0.05563787370920181,
0.18501773476600647,
-0.07589933276176453,
0.2834354043006897,
0.15232636034488678,
-0.16404320299625397,
-0.09509498625993729,
-0.01736132986843586,
-0.025077451020479202,
0.001909477636218071,
0.12994548678398132,
-0.11213427037000656,
-0.026438463479280472,
-0.0257734265178442,
0.03532560169696808,
-0.04055468365550041,
-0.05263499543070793,
-0.01917918212711811,
-0.020069655030965805,
-0.08579973131418228,
0.09432121366262436,
0.07586310803890228,
-0.2295633852481842,
0.1382981836795807,
0.3559110760688782,
0.1796407550573349,
0.26193368434906006,
-0.11397716403007507,
-0.05855877697467804,
0.011532061733305454,
0.06026087701320648,
-0.021124225109815598,
0.07123273611068726,
-0.14530611038208008,
-0.008005826734006405,
0.0368618443608284,
0.03432610630989075,
0.08440769463777542,
-0.11618398874998093,
-0.09510861337184906,
0.0029869279824197292,
-0.026261117309331894,
-0.10964810848236084,
0.11012405902147293,
-0.031061027199029922,
0.10527210682630539,
0.039033908396959305,
-0.06190889701247215,
0.1389293074607849,
0.003243963932618499,
-0.13659702241420746,
0.09190700948238373,
-0.20584796369075775,
-0.2565053105354309,
-0.052567996084690094,
-0.054230593144893646,
0.05180581659078598,
0.010775500908493996,
0.016439594328403473,
-0.14750497043132782,
-0.00936737097799778,
0.05936877429485321,
0.03910360857844353,
-0.17949117720127106,
0.03446900099515915,
-0.022245118394494057,
0.06179428845643997,
-0.0514342226088047,
-0.016059985384345055,
-0.04964269697666168,
-0.06048547849059105,
-0.03312843292951584,
0.10799992829561234,
-0.1608588993549347,
0.12976780533790588,
0.21070943772792816,
0.028973262757062912,
0.09744609892368317,
-0.08195380866527557,
0.1790461391210556,
-0.12664757668972015,
-0.021499238908290863,
0.09458950161933899,
-0.0719936415553093,
0.03432232886552811,
0.11047855019569397,
0.027225371450185776,
-0.11436644941568375,
0.013327416032552719,
-0.008767236024141312,
-0.1515822410583496,
-0.20803174376487732,
-0.05325626954436302,
-0.1431025117635727,
0.16778264939785004,
-0.058733150362968445,
0.08097824454307556,
0.16500656306743622,
-0.05470458045601845,
0.12694261968135834,
-0.05262105166912079,
-0.04224622622132301,
-0.026009192690253258,
0.09739430993795395,
-0.0371873714029789,
-0.05030309781432152,
-0.10384762287139893,
-0.003709179814904928,
0.17872263491153717,
0.06415131688117981,
0.06738148629665375,
0.1878499835729599,
0.017725571990013123,
0.044345464557409286,
0.08392728865146637,
0.10997383296489716,
0.11611323058605194,
0.01523895189166069,
-0.06838341802358627,
-0.023196758702397346,
-0.022009072825312614,
0.049760691821575165,
-0.00935616996139288,
0.08982796967029572,
-0.23563462495803833,
-0.010928129777312279,
-0.22800509631633759,
0.09860672056674957,
-0.08199206739664078,
0.08540554344654083,
0.009525075554847717,
0.08308512717485428,
0.03622916713356972,
0.01877555437386036,
-0.012388920411467552,
0.13321413099765778,
0.14962659776210785,
-0.11656196415424347,
0.042060691863298416,
0.0688527375459671,
0.07596452534198761,
0.0869511291384697,
0.08671046793460846,
-0.02421536296606064,
-0.13187971711158752,
-0.018008289858698845,
0.04761394113302231,
-0.24226294457912445,
0.3007655143737793,
0.029850153252482414,
-0.146848663687706,
-0.03066643700003624,
-0.12310768663883209,
0.005882469471544027,
0.16395089030265808,
0.13341280817985535,
0.08789431303739548,
-0.057201988995075226,
-0.11609908938407898,
0.07388512045145035,
-0.020925631746649742,
0.19865243136882782,
-0.008668205700814724,
-0.1292501837015152,
-0.023054441437125206,
0.010980689898133278,
-0.0016276967944577336,
0.1562221348285675,
-0.012750579044222832,
-0.017227984964847565,
0.002756120404228568,
-0.006488984450697899,
-0.06410319358110428,
0.0026286165229976177,
0.07253312319517136,
-0.013497952371835709,
-0.01913713663816452,
0.04064587131142616,
0.019095588475465775,
-0.15303049981594086,
-0.1324007362127304,
0.0672294870018959,
-0.07941722869873047,
0.003969867713749409,
-0.05856693536043167,
-0.1448700726032257,
-0.06501573324203491,
-0.23802143335342407,
0.14097581803798676,
-0.05953530594706535,
0.10631141066551208,
-0.02782464399933815,
0.19487741589546204,
-0.0976346880197525,
0.05912959948182106,
-0.044061627238988876,
-0.03172034025192261,
0.07814668864011765,
-0.07014089822769165,
0.1342686414718628,
-0.24209095537662506,
-0.013489527627825737,
0.11034577339887619,
-0.03585628420114517,
0.06518436223268509,
0.021693933755159378,
-0.05027263984084129,
0.22330157458782196,
0.2487058937549591,
-0.026084737852215767,
0.15694501996040344,
0.16753646731376648,
-0.020935555920004845,
-0.23530246317386627,
-0.006903721950948238,
-0.21484902501106262,
-0.08337562531232834,
0.12223940342664719,
-0.13747304677963257,
0.03902800381183624,
0.0939217358827591,
0.0010952826123684645,
0.3430778682231903,
-0.21177361905574799,
-0.009985336102545261,
0.1372734159231186,
-0.05210535228252411,
0.47172674536705017,
-0.1271582543849945,
-0.11997315287590027,
0.0815739557147026,
-0.11366132646799088,
0.1517474353313446,
-0.13138315081596375,
0.029561515897512436,
0.0255831740796566,
-0.02374509908258915,
0.05028553307056427,
-0.00786512903869152,
0.10975268483161926,
0.021521033719182014,
0.09373980015516281,
-0.09065324813127518,
-0.17102524638175964,
0.1288868486881256,
0.019541898742318153,
-0.03531601279973984,
0.15949246287345886,
-0.015569837763905525,
-0.09701990336179733,
0.0463201180100441,
-0.1232416108250618,
0.0032187465112656355,
0.059858646243810654,
-0.0867670401930809,
-0.022623684257268906,
0.04102484509348869,
-0.11957581341266632,
-0.06114617735147476,
0.13741794228553772,
-0.08614290505647659,
0.21285583078861237,
0.06624244898557663,
0.032228127121925354,
-0.1354018747806549,
0.01047166995704174,
-0.06558533012866974,
-0.04777657985687256,
0.06017064303159714,
-0.10746829211711884,
0.04421492666006088,
0.11987651884555817,
-0.00028638483490794897,
0.06740226596593857,
0.08938949555158615,
-0.07416799664497375,
-0.04142049327492714,
0.18825039267539978,
-0.22916147112846375,
-0.12095700204372406,
-0.1039200946688652,
-0.2088722139596939,
0.18829181790351868,
-0.00033193003037013113,
0.08079792559146881,
0.10683409869670868,
0.05332430079579353,
0.004853931255638599,
-0.08454360067844391,
-0.10531865060329437,
-0.021416716277599335,
0.11170656234025955,
-0.0036168815568089485,
-0.08204439282417297,
0.1367543786764145,
0.06214701756834984,
-0.0844874456524849,
-0.044769253581762314,
0.24389880895614624,
-0.09824013710021973,
-0.06865081191062927,
-0.14207328855991364,
0.0799635574221611,
-0.1154094859957695,
-0.042342398315668106,
0.06996291875839233,
-0.025765985250473022,
0.03759332746267319,
0.35033339262008667,
-0.0014574960805475712,
0.14303019642829895,
0.04988197982311249,
-0.02733386866748333,
0.15122100710868835,
-0.09843126684427261,
-0.13155727088451385,
-0.026046710088849068,
-0.12582315504550934,
0.05420156568288803,
-0.053652845323085785,
0.2096225619316101,
-0.11101435124874115,
-0.10032586753368378,
-0.24927568435668945,
0.06682116538286209,
-0.08582878857851028,
-0.11772854626178741,
0.026482315734028816,
-0.05452973395586014,
0.06110908463597298,
-0.01597735472023487,
-0.022840779274702072,
-0.034086089581251144,
-0.16206711530685425,
0.08766068518161774,
0.10225462913513184,
0.0334702804684639,
-0.0017433000029996037,
-0.03288787975907326,
0.11959493905305862,
0.030483977869153023,
0.13645082712173462,
0.1581796109676361,
-0.016835417598485947,
0.21972838044166565,
-0.14576731622219086,
-0.09877008944749832,
0.12693777680397034,
-0.016846541315317154,
0.08400796353816986,
0.19594350457191467,
-0.03968813642859459,
-0.028964776545763016,
-0.05892662703990936,
0.08430557698011398,
-0.1088123619556427,
-0.08649290353059769,
-0.022326575592160225,
0.003364799777045846,
-0.21979905664920807,
-0.02176833525300026,
-0.1472289115190506,
0.14305536448955536,
0.022358523681759834,
-0.03921317681670189,
0.05692265182733536,
0.06586925685405731,
-0.010081776417791843,
-0.03367740660905838,
0.03290082514286041,
-0.12625771760940552,
0.024649139493703842,
-0.01577853038907051,
0.0049084872007369995,
0.0037789340130984783,
0.21141743659973145,
-0.03039473667740822,
-0.004658805206418037,
0.025475207716226578,
0.04434748366475105,
0.02890673466026783,
-0.026134492829442024,
0.12308955937623978,
0.06094574183225632,
-0.09079490602016449,
-0.18515504896640778,
0.05709775164723396,
-0.08041227608919144,
-0.09655862301588058,
0.2132132202386856,
-0.011032160371541977,
-0.0498165488243103,
0.017368722707033157,
0.015895338729023933,
-0.038362376391887665,
0.054460570216178894,
-0.23657652735710144,
0.024670686572790146,
0.002929141279309988,
-0.01258489117026329,
-0.048942871391773224,
0.20124459266662598,
-0.0353495329618454,
0.08595104515552521,
-0.050877586007118225,
0.012544764205813408,
-0.12850943207740784,
-0.08488316088914871,
0.034536056220531464,
-0.1126176044344902,
0.042023174464702606,
-0.048515982925891876,
0.031012289226055145,
0.15582618117332458,
0.0739961713552475,
-0.003876814153045416,
0.14785565435886383,
-0.09909118711948395,
-0.144780695438385,
0.06316819787025452,
-0.007257051300257444,
0.06261762976646423,
-0.04910898953676224,
-0.06487711519002914,
-0.09342164546251297,
-0.11874929070472717,
-0.1416812539100647,
0.010283995419740677,
-0.011591076850891113,
-0.07210418581962585,
-0.18373948335647583,
-0.03681862726807594,
-0.04052470624446869,
0.09620131552219391,
-0.1130053848028183,
0.11246046423912048,
0.012194735929369926,
0.0019887308590114117,
0.03048613853752613,
0.16527454555034637,
-0.016040582209825516,
0.019492071121931076,
-0.05144422873854637,
0.1734255999326706,
-0.04004361107945442,
0.13704326748847961,
-0.0776640996336937,
-0.035643141716718674,
-0.057162247598171234,
0.26574304699897766,
0.23077107965946198,
-0.09328072518110275,
0.02422376722097397,
0.01214075181633234,
0.05587726831436157,
0.13885337114334106,
0.21728123724460602,
0.032460786402225494,
0.27867618203163147,
-0.025291087105870247,
-0.00876180361956358,
0.012466421350836754,
0.07244925200939178,
-0.029872586950659752,
0.1044088676571846,
0.12353460490703583,
-0.033191386610269547,
-0.07762714475393295,
0.15343908965587616,
-0.18617749214172363,
0.12963999807834625,
0.03220813348889351,
-0.18334601819515228,
-0.016535360366106033,
-0.10126204043626785,
-0.033897873014211655,
-0.040622614324092865,
0.1163916140794754,
-0.08412478864192963,
-0.14704664051532745,
-0.1700751930475235,
0.07020367681980133,
-0.34893110394477844,
-0.2056313157081604,
0.09749093651771545,
0.061539795249700546,
0.021139536052942276,
-0.03948601335287094,
-0.00437897490337491,
-0.018801681697368622,
0.03479531407356262,
0.005350892897695303,
0.06393729895353317,
0.039459969848394394,
0.004331011325120926,
-0.22488799691200256,
-0.01230921782553196,
0.050469908863306046,
-0.12342507392168045,
0.05510052666068077,
-0.11021779477596283,
-0.04029572755098343,
0.14694444835186005,
-0.019185082986950874,
0.03388690575957298,
0.02084450237452984,
-0.15405040979385376,
0.053510215133428574,
0.09726190567016602,
0.03100808523595333,
0.0325760617852211,
-0.06833793967962265,
-0.056453824043273926,
0.10628129541873932,
-0.17016896605491638,
-0.14204639196395874,
0.15940703451633453,
-0.062126755714416504,
0.12933465838432312,
-0.0804775059223175,
-0.039271190762519836,
0.015225794166326523,
-0.07323937118053436,
0.14804202318191528,
-0.11058732122182846,
0.08404544740915298,
0.11253806203603745,
0.02362382598221302,
0.0498819425702095,
-0.1676456332206726,
0.08748652040958405,
0.010140333324670792,
-0.04800152778625488,
-0.014966586604714394
] |
null | null |
transformers
|
# German GPT-2 model
**Note**: This model was de-anonymized and now lives at:
https://huggingface.co/dbmdz/german-gpt2
Please use the new model name instead!
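A minimal usage sketch with the new model name; the prompt is the widget example from this card, and the generation length is an illustrative choice:

```python
from transformers import pipeline

# Load the de-anonymized checkpoint under its new name.
generator = pipeline("text-generation", model="dbmdz/german-gpt2")

# Continue the example prompt from this card's widget.
print(generator("Heute ist sehr schönes Wetter in", max_length=40)[0]["generated_text"])
```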
|
{"language": "de", "license": "mit", "widget": [{"text": "Heute ist sehr sch\u00f6nes Wetter in"}]}
|
text-generation
|
anonymous-german-nlp/german-gpt2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"gpt2",
"text-generation",
"de",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"de"
] |
TAGS
#transformers #pytorch #tf #jax #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
# German GPT-2 model
Note: This model was de-anonymized and now lives at:
URL
Please use the new model name instead!
|
[
"# German GPT-2 model\n\nNote: This model was de-anonymized and now lives at:\n\nURL\n\nPlease use the new model name instead!"
] |
[
"TAGS\n#transformers #pytorch #tf #jax #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"# German GPT-2 model\n\nNote: This model was de-anonymized and now lives at:\n\nURL\n\nPlease use the new model name instead!"
] |
[
64,
30
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #gpt2 #text-generation #de #license-mit #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n# German GPT-2 model\n\nNote: This model was de-anonymized and now lives at:\n\nURL\n\nPlease use the new model name instead!"
] |
[
-0.004146211314946413,
0.052292048931121826,
-0.0017446852289140224,
0.10401535034179688,
0.06770245730876923,
0.04612460359930992,
0.16857320070266724,
0.08686314523220062,
0.04005904495716095,
-0.043726298958063126,
0.15250873565673828,
0.11394444108009338,
-0.010773802176117897,
0.11226895451545715,
0.006834208499640226,
-0.32639795541763306,
0.10027890652418137,
0.020932987332344055,
-0.03049357607960701,
0.07669755071401596,
0.1295587718486786,
-0.007808450143784285,
0.13466811180114746,
0.03754570707678795,
-0.10230156034231186,
0.02399960346519947,
0.059490662068128586,
-0.10181722790002823,
0.114165298640728,
0.0567164309322834,
0.041275735944509506,
0.06526504456996918,
0.031615182757377625,
0.014064384624361992,
0.020574623718857765,
-0.006308617535978556,
-0.10198620706796646,
0.0671619325876236,
-0.011570177972316742,
-0.0104652289301157,
0.20365245640277863,
0.08989328891038895,
-0.034699101001024246,
-0.014629073441028595,
-0.12147289514541626,
-0.14198677241802216,
-0.045931581407785416,
0.16700297594070435,
-0.05094432458281517,
0.04326477646827698,
0.00829167477786541,
0.1266251653432846,
-0.0892551839351654,
0.04867285117506981,
0.11899357289075851,
-0.3673287332057953,
-0.018484193831682205,
0.19879266619682312,
0.05962751805782318,
-0.006792906206101179,
0.008333247154951096,
0.12211193144321442,
0.038816533982753754,
0.03653288260102272,
0.043969180434942245,
-0.04301433265209198,
-0.007596329320222139,
0.04072488099336624,
-0.104734867811203,
-0.07307813316583633,
0.19957351684570312,
-0.030754026025533676,
-0.0020896594505757093,
-0.0251996461302042,
-0.0888960137963295,
0.024465275928378105,
-0.01959490031003952,
-0.10848776996135712,
-0.0182937141507864,
0.05109265446662903,
0.033405378460884094,
-0.13632360100746155,
-0.07878472656011581,
-0.05848713964223862,
-0.15206755697727203,
0.24087658524513245,
0.017366958782076836,
0.10358400642871857,
-0.1565927267074585,
0.12190970778465271,
-0.2065938264131546,
-0.09862641245126724,
-0.009519199840724468,
-0.13170304894447327,
0.124495729804039,
0.03197849541902542,
-0.04874715954065323,
-0.013748752884566784,
0.04342101886868477,
0.13098779320716858,
0.045260604470968246,
-0.0380730926990509,
0.05904991924762726,
0.08947157114744186,
0.01547185331583023,
0.13336631655693054,
-0.08082497864961624,
-0.053855471312999725,
0.07448676973581314,
-0.08793243765830994,
0.03342029079794884,
-0.0367877371609211,
-0.2315431535243988,
-0.04647444188594818,
0.018850114196538925,
0.009433425962924957,
0.017293009907007217,
0.164878249168396,
-0.0035458134952932596,
-0.03470992296934128,
0.1682012528181076,
-0.013662251643836498,
0.007650303188711405,
-0.07273545861244202,
-0.000016373944163206033,
0.06254211068153381,
0.036811910569667816,
0.000600361090619117,
-0.058570656925439835,
0.021815789863467216,
-0.08518850803375244,
-0.0863618478178978,
-0.06224353611469269,
-0.11847801506519318,
0.030337223783135414,
0.01481977291405201,
0.006603074260056019,
-0.17927832901477814,
-0.17531909048557281,
0.050456345081329346,
0.0714808851480484,
-0.0031973617151379585,
-0.07176379859447479,
0.022802848368883133,
-0.06754770874977112,
0.06693919748067856,
-0.041381869465112686,
-0.035664815455675125,
-0.03356527164578438,
0.031176814809441566,
-0.06688419729471207,
0.10878784954547882,
-0.2484581470489502,
0.0669030025601387,
-0.10127029567956924,
-0.021750090643763542,
-0.16855034232139587,
0.028734920546412468,
-0.03324766457080841,
-0.012212519533932209,
-0.006469004787504673,
-0.06422402709722519,
-0.04883856698870659,
0.08062411099672318,
-0.009569935500621796,
0.1309739500284195,
-0.11389981210231781,
-0.10027898102998734,
0.23314079642295837,
-0.13780605792999268,
-0.13964314758777618,
0.08911114186048508,
-0.00308074033819139,
0.06182469055056572,
0.08672021329402924,
0.16388075053691864,
0.0840025246143341,
-0.08050226420164108,
0.05228419601917267,
0.07239318639039993,
-0.1384262591600418,
-0.07590438425540924,
0.04178276285529137,
0.0017047091387212276,
-0.21768628060817719,
0.0643579363822937,
-0.13124153017997742,
0.015099816024303436,
-0.04145544022321701,
-0.020258046686649323,
-0.04338189214468002,
-0.008444469422101974,
0.10386542230844498,
-0.04660406708717346,
0.09383706748485565,
-0.04728597030043602,
-0.10455272346735,
0.040687669068574905,
0.040428608655929565,
0.006141968071460724,
0.0068289004266262054,
-0.040854379534721375,
0.1273738145828247,
-0.003221274120733142,
0.057165928184986115,
-0.0886780172586441,
-0.09800665825605392,
-0.00576305715367198,
0.041165389120578766,
0.06708233803510666,
0.17620302736759186,
0.07589804381132126,
-0.02557668648660183,
-0.04542999342083931,
0.02317991480231285,
0.03312442824244499,
0.016594916582107544,
-0.022054649889469147,
-0.09336157143115997,
-0.027534648776054382,
-0.029177898541092873,
-0.03966621309518814,
-0.014648922719061375,
0.0027619400061666965,
-0.04076787829399109,
0.08259785920381546,
-0.010768325999379158,
0.0663367360830307,
-0.07403264939785004,
-0.01934998482465744,
-0.06556235253810883,
-0.01017635315656662,
0.03806828334927559,
0.010669127106666565,
-0.042082853615283966,
0.22839128971099854,
-0.06808433681726456,
0.2582804560661316,
0.18852339684963226,
-0.0910840779542923,
-0.040052659809589386,
0.08346520364284515,
-0.039766695350408554,
0.05460658669471741,
0.06484398990869522,
-0.07161002606153488,
0.10043509304523468,
-0.07526103407144547,
0.10698391497135162,
-0.09465004503726959,
0.0008046855218708515,
-0.003603581804782152,
-0.04358905553817749,
-0.054138053208589554,
0.03323812410235405,
0.1500823199748993,
-0.13172754645347595,
0.12988010048866272,
0.189460888504982,
-0.014792882837355137,
0.18607386946678162,
0.025444652885198593,
-0.007485576905310154,
-0.015372129157185555,
-0.08691908419132233,
-0.03207157924771309,
0.06545715779066086,
-0.09594450891017914,
-0.01914137415587902,
0.09401978552341461,
0.03612760454416275,
0.07553821802139282,
-0.10056672990322113,
-0.03259501978754997,
0.008043880574405193,
-0.020340902730822563,
-0.04872255399823189,
0.11703679710626602,
-0.03344898298382759,
0.13494423031806946,
0.006197758950293064,
-0.10785522311925888,
0.08983567357063293,
0.043818216770887375,
-0.06765075773000717,
0.15294253826141357,
-0.0628613755106926,
-0.2617343068122864,
-0.12628905475139618,
-0.022068152204155922,
-0.08544472604990005,
0.058062389492988586,
0.08022889494895935,
-0.008032609708607197,
-0.0525667667388916,
-0.02058514580130577,
0.09330832213163376,
-0.053362373262643814,
0.05038239061832428,
-0.08662831038236618,
-0.03654133155941963,
-0.03486268222332001,
-0.12567827105522156,
-0.07901978492736816,
-0.05306737869977951,
-0.07468146085739136,
0.10105065256357193,
-0.08808975666761398,
0.06723485141992569,
0.1348811686038971,
-0.06448782980442047,
0.04968656972050667,
-0.012915467843413353,
0.20760385692119598,
-0.04939508065581322,
0.09026502072811127,
0.16141918301582336,
0.05453570932149887,
0.0760999321937561,
0.11758141964673996,
0.005728523246943951,
-0.04485388845205307,
0.0034046012442559004,
-0.026836169883608818,
-0.11552584171295166,
-0.14264073967933655,
-0.12241895496845245,
-0.059447333216667175,
0.016051532700657845,
0.029955551028251648,
0.05581382289528847,
0.1816939264535904,
0.08849996328353882,
-0.027589917182922363,
0.07288356870412827,
-0.0378415621817112,
0.04811090975999832,
0.1418585628271103,
-0.026941968128085136,
0.1361856609582901,
-0.062070682644844055,
-0.11881577223539352,
0.15372180938720703,
0.01723037101328373,
0.055359430611133575,
0.07025092840194702,
-0.04721295088529587,
0.08107426762580872,
0.08182407170534134,
0.11115504056215286,
0.09841810166835785,
0.011749346740543842,
-0.039537377655506134,
-0.0723659098148346,
-0.07690104097127914,
0.020411068573594093,
0.09233038127422333,
0.010081209242343903,
-0.11414904147386551,
-0.04146653413772583,
-0.07087401300668716,
0.06486963480710983,
0.0283830463886261,
0.09300842881202698,
-0.2526229918003082,
-0.07294619828462601,
0.03683290630578995,
-0.015579069964587688,
-0.05622752010822296,
0.036272190511226654,
-0.02884565480053425,
-0.12075312435626984,
0.07031596451997757,
0.033914171159267426,
0.09001927822828293,
0.02660948410630226,
0.06318042427301407,
0.029577629640698433,
-0.020998947322368622,
0.0027327181305736303,
0.0819702073931694,
-0.2445390671491623,
0.26273590326309204,
0.0005449476884678006,
-0.043836940079927444,
-0.11326078325510025,
-0.011039037257432938,
0.06362961232662201,
0.1702304184436798,
0.10475308448076248,
0.027365168556571007,
-0.08717264235019684,
0.060815900564193726,
-0.04831087961792946,
0.05020992085337639,
0.053089454770088196,
-0.07535312324762344,
0.004107046406716108,
-0.06268562376499176,
-0.0008311254787258804,
0.026082346215844154,
0.150069460272789,
-0.08232630789279938,
-0.054715681821107864,
0.07883454114198685,
0.10001412034034729,
0.026013970375061035,
-0.004451392684131861,
-0.07228364050388336,
-0.10367823392152786,
0.22116434574127197,
0.07185501605272293,
-0.10683424770832062,
-0.1450803279876709,
-0.060997553169727325,
0.05152325704693794,
-0.08368213474750519,
0.07045471668243408,
-0.05378590151667595,
0.03960033506155014,
-0.02095774933695793,
-0.227796733379364,
0.1353544145822525,
-0.07743348181247711,
-0.03620650991797447,
0.011026063933968544,
0.11392796039581299,
-0.04364040866494179,
0.02025531977415085,
0.05099571496248245,
0.010716570541262627,
-0.0730975866317749,
-0.16094790399074554,
0.025925153866410255,
-0.014310263097286224,
-0.01069643348455429,
-0.05337737500667572,
-0.07899437844753265,
-0.035164374858140945,
0.09886635839939117,
0.07803381979465485,
0.15516237914562225,
0.12319771200418472,
-0.10741859674453735,
0.150858074426651,
0.10993979871273041,
-0.036857083439826965,
-0.34546664357185364,
-0.09360641986131668,
-0.0653187483549118,
0.011389481835067272,
-0.03075222298502922,
-0.12165265530347824,
0.047457754611968994,
0.015719018876552582,
-0.07451207935810089,
0.07734636217355728,
-0.1881413757801056,
-0.10747761279344559,
0.13944490253925323,
-0.0363839827477932,
0.3040919899940491,
-0.09013035148382187,
-0.06355276703834534,
-0.06775522232055664,
-0.2029726207256317,
0.1580049842596054,
-0.0533011294901371,
0.08520250022411346,
-0.021097328513860703,
0.14778520166873932,
0.00895211473107338,
-0.019725702702999115,
0.09987913072109222,
0.012549103237688541,
0.03262735530734062,
-0.1201055571436882,
-0.10072673112154007,
0.17904217541217804,
-0.019100598990917206,
0.16017982363700867,
-0.02852756902575493,
0.05682140588760376,
-0.07773055136203766,
-0.054389338940382004,
-0.12467177212238312,
0.091567762196064,
-0.0061605460941791534,
-0.1203320249915123,
-0.04696228727698326,
0.020977575331926346,
-0.02276434190571308,
-0.01949315331876278,
0.07686644047498703,
-0.027211712673306465,
0.05112529173493385,
0.04930726811289787,
0.035383619368076324,
-0.10305674374103546,
0.03424397110939026,
-0.008909642696380615,
-0.09459683299064636,
0.12688910961151123,
-0.15259452164173126,
0.043452490121126175,
0.060882408171892166,
-0.03166875243186951,
0.032417748123407364,
0.08128514885902405,
-0.025448696687817574,
-0.03300786018371582,
0.1397842913866043,
-0.22703948616981506,
-0.061011865735054016,
-0.08987864851951599,
-0.0758276879787445,
0.08576308190822601,
0.1528882533311844,
0.17084889113903046,
-0.06742832064628601,
-0.041112229228019714,
-0.0066753411665558815,
-0.027595868334174156,
-0.08784182369709015,
-0.05243817716836929,
0.07400190830230713,
-0.01295922975987196,
-0.10123367607593536,
0.03507097810506821,
-0.00699603371322155,
0.009256049990653992,
0.029662495478987694,
0.007326300721615553,
-0.09408359974622726,
-0.11530566960573196,
-0.04092693701386452,
0.04953916370868683,
-0.2272692173719406,
-0.044673074036836624,
-0.005355822388082743,
-0.08222611993551254,
0.06019175797700882,
0.0437559075653553,
0.08122199028730392,
0.07578682899475098,
-0.037195947021245956,
0.006359584629535675,
-0.02289155311882496,
-0.03237099573016167,
-0.044555775821208954,
0.02761290781199932,
-0.10857846587896347,
0.08926766365766525,
-0.06673772633075714,
0.0904652327299118,
-0.11422254890203476,
0.00582286948338151,
-0.1470559537410736,
-0.03970568627119064,
-0.16074229776859283,
-0.09847371280193329,
-0.020207537338137627,
-0.027630247175693512,
0.005706453695893288,
-0.08810269087553024,
-0.08406677842140198,
0.02173001877963543,
-0.12930892407894135,
-0.008600877597928047,
-0.008024429902434349,
0.03237040713429451,
-0.07873818278312683,
-0.0005235639400780201,
0.04241170361638069,
0.019997380673885345,
0.12611731886863708,
0.07735622674226761,
-0.027592763304710388,
0.10296045243740082,
-0.10204824060201645,
-0.02041454240679741,
0.048626549541950226,
0.01428100187331438,
0.06596387922763824,
0.06213011592626572,
0.03546925634145737,
0.04851411283016205,
-0.0002380812948103994,
0.058370668441057205,
-0.051778655499219894,
-0.07957424223423004,
0.03290005773305893,
0.0011167047778144479,
-0.07590370625257492,
0.015198004432022572,
0.01109830942004919,
0.0512632392346859,
0.009126262739300728,
0.09587841480970383,
-0.01666591688990593,
-0.023448418825864792,
-0.11022201180458069,
0.03268985450267792,
-0.010087065398693085,
-0.13726896047592163,
-0.07264184206724167,
-0.07858406752347946,
-0.008547690697014332,
0.010291851125657558,
0.29010123014450073,
0.14545989036560059,
-0.07660370320081711,
0.008895376697182655,
0.12276654690504074,
0.09258697181940079,
-0.017876246944069862,
0.22462813556194305,
0.03147813677787781,
-0.004638021811842918,
-0.10170527547597885,
0.05976738780736923,
-0.004754581023007631,
-0.03612511605024338,
0.17761148512363434,
-0.0007883433718234301,
0.0304254200309515,
0.06967014819383621,
0.08171822130680084,
0.054723627865314484,
-0.18237590789794922,
-0.18993869423866272,
0.0045096236281096935,
0.10420643538236618,
-0.04300948604941368,
-0.026340991258621216,
0.12380540370941162,
-0.04935302212834358,
0.05837079510092735,
0.00539188040420413,
0.005912238731980324,
-0.1519766002893448,
-0.2650834023952484,
-0.10284795612096786,
-0.1808215230703354,
-0.014358285814523697,
-0.09107287228107452,
-0.004629611503332853,
0.04001321271061897,
0.047277290374040604,
-0.09193235635757446,
0.03214282914996147,
-0.12496498972177505,
-0.0794474333524704,
0.10195139795541763,
-0.02743694745004177,
0.007758161053061485,
-0.008507114835083485,
-0.01960606686770916,
-0.10839898139238358,
0.05877294763922691,
-0.0382358655333519,
0.011724865064024925,
-0.02406146004796028,
-0.01280888170003891,
-0.08509070426225662,
-0.04105390980839729,
-0.07627221941947937,
0.032078105956315994,
0.02606062963604927,
0.03585775941610336,
-0.012411698698997498,
-0.05500544235110283,
0.036543551832437515,
0.19658775627613068,
-0.03846440836787224,
-0.1002817377448082,
-0.08775481581687927,
0.25865569710731506,
-0.004026188049465418,
0.11768713593482971,
-0.0049153827130794525,
-0.03351878747344017,
-0.07743906229734421,
0.23387552797794342,
0.4007292091846466,
-0.08669859915971756,
0.025581663474440575,
0.021088682115077972,
0.016865301877260208,
0.08554811775684357,
0.1535852551460266,
-0.004339142702519894,
0.2386467605829239,
-0.04052448272705078,
-0.04357437044382095,
-0.03719262406229973,
0.0277701448649168,
-0.034384697675704956,
0.1370689868927002,
0.027637101709842682,
-0.13880489766597748,
-0.04361279308795929,
0.052506834268569946,
-0.14017793536186218,
0.060968734323978424,
-0.0875777006149292,
-0.09512408822774887,
-0.10840729624032974,
0.028949009254574776,
0.0015815882943570614,
0.07452604174613953,
0.11276080459356308,
-0.04388584941625595,
-0.03434251993894577,
0.09161324054002762,
0.0024388115853071213,
-0.1735701709985733,
-0.05624684691429138,
0.09988363087177277,
0.07705222070217133,
0.12856854498386383,
0.0056543173268437386,
0.03507617861032486,
0.07629848271608353,
-0.016398701816797256,
-0.03860802575945854,
0.05134080722928047,
0.0007653604261577129,
-0.057464856654405594,
-0.006102892104536295,
-0.07162769138813019,
-0.016395023092627525,
-0.10480178147554398,
0.04934599623084068,
-0.10835527628660202,
0.023639898747205734,
0.05186815187335014,
-0.0741339772939682,
-0.07667441666126251,
0.07365840673446655,
-0.09704338014125824,
0.09224803000688553,
0.1296052634716034,
-0.02151622250676155,
-0.03497730940580368,
-0.056290362030267715,
0.07859671860933304,
0.0709967240691185,
-0.09060946106910706,
-0.06195784732699394,
-0.05312913656234741,
-0.05121317505836487,
0.008798163384199142,
-0.00360797974281013,
-0.17921601235866547,
-0.006269725505262613,
-0.074793241918087,
0.039493314921855927,
-0.10416435450315475,
0.06526138633489609,
0.12889286875724792,
-0.003141808556392789,
0.02340768091380596,
0.04571923613548279,
-0.004785116761922836,
0.03160111978650093,
-0.08587165921926498,
-0.0921373963356018
] |
null | null |
transformers
|
# Disclaimer: This page is under maintenance. Please DO NOT refer to the information on this page to make any decision yet.
# Vaccinating COVID tweets
A fine-tuned model for the fact-classification task on English tweets about COVID-19 and vaccines.
## Intended uses & limitations
You can classify whether an input tweet (or any other statement) about COVID-19/vaccines is `true`, `false` or `misleading`.
Note that since this model was trained on data collected up to May 2021, the most recent information may not be reflected.
#### How to use
You can use this model directly on this page or with `transformers` in Python.
- Load the pipeline and run it on an input sequence
```python
from transformers import pipeline
pipe = pipeline("sentiment-analysis", model = "ans/vaccinating-covid-tweets")
seq = "Vaccines to prevent SARS-CoV-2 infection are considered the most promising approach for curbing the pandemic."
pipe(seq)
```
- Expected output
```python
[
{
"label": "false",
"score": 0.07972867041826248
},
{
"label": "misleading",
"score": 0.019911376759409904
},
{
"label": "true",
"score": 0.9003599882125854
}
]
```
- `true` examples
```python
"By the end of 2020, several vaccines had become available for use in different parts of the world."
"Vaccines to prevent SARS-CoV-2 infection are considered the most promising approach for curbing the pandemic."
"RNA vaccines were the first vaccines for SARS-CoV-2 to be produced and represent an entirely new vaccine approach."
```
- `false` examples
```python
"COVID-19 vaccine caused new strain in UK."
```
#### Limitations and bias
Because the model classifies conservatively whether an input sequence is true, its predictions may be biased toward `false` or `misleading`.
## Training data & Procedure
#### Pre-trained baseline model
- Pre-trained model: [BERTweet](https://github.com/VinAIResearch/BERTweet)
- trained based on the RoBERTa pre-training procedure
- 850M General English Tweets (Jan 2012 to Aug 2019)
- 23M COVID-19 English Tweets
- Size of the model: >134M parameters
- Further training
- Pre-training with recent COVID-19/vaccine tweets and fine-tuning for fact classification
#### 1) Pre-training language model
- The model was further pre-trained on COVID-19/vaccine-related tweets with a masked language modeling (MLM) objective, starting from BERTweet (a minimal sketch of this step follows the dataset list below).
- The following datasets of English tweets were used:
- Tweets with trending #CovidVaccine hashtag, 207,000 tweets uploaded across Aug 2020 to Apr 2021 ([kaggle](https://www.kaggle.com/kaushiksuresh147/covidvaccine-tweets))
- Tweets about all COVID-19 vaccines, 78,000 tweets uploaded across Dec 2020 to May 2021 ([kaggle](https://www.kaggle.com/gpreda/all-covid19-vaccines-tweets))
- COVID-19 Twitter chatter dataset, 590,000 tweets uploaded across Mar 2021 to May 2021 ([github](https://github.com/thepanacealab/covid19_twitter))
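A minimal sketch of this continued MLM pre-training with `transformers` and `datasets`; the checkpoint name, the local tweets file and all hyperparameters below are illustrative placeholders, not the exact setup used for this model:

```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

checkpoint = "vinai/bertweet-base"  # assumed BERTweet starting point
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# One tweet per line in a plain-text file (placeholder file name).
tweets = load_dataset("text", data_files={"train": "covid_vaccine_tweets.txt"})
tokenized = tweets.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# The data collator masks 15% of tokens for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
args = TrainingArguments(output_dir="bertweet-covid-mlm",
                         per_device_train_batch_size=32, num_train_epochs=1)
Trainer(model=model, args=args, train_dataset=tokenized["train"],
        data_collator=collator).train()
```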
#### 2) Fine-tuning for fact classification
- The language model from step (1) was fine-tuned for the fact-classification task on COVID-19/vaccine statements.
- COVID-19/vaccine-related statements were collected from [Poynter](https://www.poynter.org/ifcn-covid-19-misinformation/) and [Snopes](https://www.snopes.com/) using Selenium, resulting in over 14,000 fact-checked statements from Jan 2020 to May 2021.
- The original labels were mapped into the following three categories (a mapping sketch follows this list):
- `False`: includes false, no evidence, manipulated, fake, not true, unproven and unverified
- `Misleading`: includes misleading, exaggerated, out of context and needs context
- `True`: includes true and correct
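A minimal sketch of this label consolidation; the mapping function itself is illustrative:

```python
# Consolidate the original fact-check verdicts into the three training labels.
FALSE_LABELS = {"false", "no evidence", "manipulated", "fake", "not true", "unproven", "unverified"}
MISLEADING_LABELS = {"misleading", "exaggerated", "out of context", "needs context"}
TRUE_LABELS = {"true", "correct"}

def consolidate(original_label: str) -> str:
    label = original_label.strip().lower()
    if label in FALSE_LABELS:
        return "false"
    if label in MISLEADING_LABELS:
        return "misleading"
    if label in TRUE_LABELS:
        return "true"
    raise ValueError(f"Unmapped label: {original_label}")

print(consolidate("Out of context"))  # -> "misleading"
```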
## Evaluation results
| Training loss | Validation loss | Training accuracy | Validation accuracy |
| --- | --- | --- | --- |
| 0.1062 | 0.1006 | 96.3% | 94.5% |
# Contributors
- This model is part of a final team project from the MLDL for DS class at SNU.
- Team BIBI - Vaccinating COVID-NineTweets
- Team members: Ahn, Hyunju; An, Jiyong; An, Seungchan; Jeong, Seokho; Kim, Jungmin; Kim, Sangbeom
- Advisor: Prof. Wen-Syan Li
<a href="https://gsds.snu.ac.kr/"><img src="https://gsds.snu.ac.kr/wp-content/uploads/sites/50/2021/04/GSDS_logo2-e1619068952717.png" width="200" height="80"></a>
|
{"language": "en", "license": "apache-2.0", "datasets": ["tweets"], "widget": [{"text": "Vaccines to prevent SARS-CoV-2 infection are considered the most promising approach for curbing the pandemic."}]}
|
text-classification
|
ans/vaccinating-covid-tweets
|
[
"transformers",
"pytorch",
"roberta",
"text-classification",
"en",
"dataset:tweets",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #roberta #text-classification #en #dataset-tweets #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
|
Disclaimer: This page is under maintenance. Please DO NOT refer to the information on this page to make any decision yet.
=========================================================================================================================
Vaccinating COVID tweets
========================
A fine-tuned model for fact-classification task on English tweets about COVID-19/vaccine.
Intended uses & limitations
---------------------------
You can classify if the input tweet (or any others statement) about COVID-19/vaccine is 'true', 'false' or 'misleading'.
Note that since this model was trained with data up to May 2020, the most recent information may not be reflected.
#### How to use
You can use this model directly on this page or using 'transformers' in python.
* Load pipeline and implement with input sequence
* Expected output
* 'true' examples
* 'false' examples
#### Limitations and bias
To conservatively classify whether an input sequence is true or not, the model may have predictions biased toward 'false' or 'misleading'.
Training data & Procedure
-------------------------
#### Pre-trained baseline model
* Pre-trained model: BERTweet
+ trained based on the RoBERTa pre-training procedure
+ 850M General English Tweets (Jan 2012 to Aug 2019)
+ 23M COVID-19 English Tweets
+ Size of the model: >134M parameters
* Further training
+ Pre-training with recent COVID-19/vaccine tweets and fine-tuning for fact classification
#### 1) Pre-training language model
* The model was pre-trained on COVID-19/vaccined related tweets using a masked language modeling (MLM) objective starting from BERTweet.
* Following datasets on English tweets were used:
+ Tweets with trending #CovidVaccine hashtag, 207,000 tweets uploaded across Aug 2020 to Apr 2021 (kaggle)
+ Tweets about all COVID-19 vaccines, 78,000 tweets uploaded across Dec 2020 to May 2021 (kaggle)
+ COVID-19 Twitter chatter dataset, 590,000 tweets uploaded across Mar 2021 to May 2021 (github)
#### 2) Fine-tuning for fact classification
* A fine-tuned model from pre-trained language model (1) for fact-classification task on COVID-19/vaccine.
* COVID-19/vaccine-related statements were collected from Poynter and Snopes using Selenium resulting in over 14,000 fact-checked statements from Jan 2020 to May 2021.
* Original labels were divided within following three categories:
+ 'False': includes false, no evidence, manipulated, fake, not true, unproven and unverified
+ 'Misleading': includes misleading, exaggerated, out of context and needs context
+ 'True': includes true and correct
Evaluation results
------------------
Contributors
============
* This model is a part of final team project from MLDL for DS class at SNU.
+ Team BIBI - Vaccinating COVID-NineTweets
+ Team members: Ahn, Hyunju; An, Jiyong; An, Seungchan; Jeong, Seokho; Kim, Jungmin; Kim, Sangbeom
+ Advisor: Prof. Wen-Syan Li
<a href="URL src="URL width="200" height="80">
|
[
"#### How to use\n\n\nYou can use this model directly on this page or using 'transformers' in python.\n\n\n* Load pipeline and implement with input sequence\n* Expected output\n* 'true' examples\n* 'false' examples",
"#### Limitations and bias\n\n\nTo conservatively classify whether an input sequence is true or not, the model may have predictions biased toward 'false' or 'misleading'.\n\n\nTraining data & Procedure\n-------------------------",
"#### Pre-trained baseline model\n\n\n* Pre-trained model: BERTweet\n\t+ trained based on the RoBERTa pre-training procedure\n\t+ 850M General English Tweets (Jan 2012 to Aug 2019)\n\t+ 23M COVID-19 English Tweets\n\t+ Size of the model: >134M parameters\n* Further training\n\t+ Pre-training with recent COVID-19/vaccine tweets and fine-tuning for fact classification",
"#### 1) Pre-training language model\n\n\n* The model was pre-trained on COVID-19/vaccined related tweets using a masked language modeling (MLM) objective starting from BERTweet.\n* Following datasets on English tweets were used:\n\t+ Tweets with trending #CovidVaccine hashtag, 207,000 tweets uploaded across Aug 2020 to Apr 2021 (kaggle)\n\t+ Tweets about all COVID-19 vaccines, 78,000 tweets uploaded across Dec 2020 to May 2021 (kaggle)\n\t+ COVID-19 Twitter chatter dataset, 590,000 tweets uploaded across Mar 2021 to May 2021 (github)",
"#### 2) Fine-tuning for fact classification\n\n\n* A fine-tuned model from pre-trained language model (1) for fact-classification task on COVID-19/vaccine.\n* COVID-19/vaccine-related statements were collected from Poynter and Snopes using Selenium resulting in over 14,000 fact-checked statements from Jan 2020 to May 2021.\n* Original labels were divided within following three categories:\n\t+ 'False': includes false, no evidence, manipulated, fake, not true, unproven and unverified\n\t+ 'Misleading': includes misleading, exaggerated, out of context and needs context\n\t+ 'True': includes true and correct\n\n\nEvaluation results\n------------------\n\n\n\nContributors\n============\n\n\n* This model is a part of final team project from MLDL for DS class at SNU.\n\t+ Team BIBI - Vaccinating COVID-NineTweets\n\t+ Team members: Ahn, Hyunju; An, Jiyong; An, Seungchan; Jeong, Seokho; Kim, Jungmin; Kim, Sangbeom\n\t+ Advisor: Prof. Wen-Syan Li\n\n\n<a href=\"URL src=\"URL width=\"200\" height=\"80\">"
] |
[
"TAGS\n#transformers #pytorch #roberta #text-classification #en #dataset-tweets #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n",
"#### How to use\n\n\nYou can use this model directly on this page or using 'transformers' in python.\n\n\n* Load pipeline and implement with input sequence\n* Expected output\n* 'true' examples\n* 'false' examples",
"#### Limitations and bias\n\n\nTo conservatively classify whether an input sequence is true or not, the model may have predictions biased toward 'false' or 'misleading'.\n\n\nTraining data & Procedure\n-------------------------",
"#### Pre-trained baseline model\n\n\n* Pre-trained model: BERTweet\n\t+ trained based on the RoBERTa pre-training procedure\n\t+ 850M General English Tweets (Jan 2012 to Aug 2019)\n\t+ 23M COVID-19 English Tweets\n\t+ Size of the model: >134M parameters\n* Further training\n\t+ Pre-training with recent COVID-19/vaccine tweets and fine-tuning for fact classification",
"#### 1) Pre-training language model\n\n\n* The model was pre-trained on COVID-19/vaccined related tweets using a masked language modeling (MLM) objective starting from BERTweet.\n* Following datasets on English tweets were used:\n\t+ Tweets with trending #CovidVaccine hashtag, 207,000 tweets uploaded across Aug 2020 to Apr 2021 (kaggle)\n\t+ Tweets about all COVID-19 vaccines, 78,000 tweets uploaded across Dec 2020 to May 2021 (kaggle)\n\t+ COVID-19 Twitter chatter dataset, 590,000 tweets uploaded across Mar 2021 to May 2021 (github)",
"#### 2) Fine-tuning for fact classification\n\n\n* A fine-tuned model from pre-trained language model (1) for fact-classification task on COVID-19/vaccine.\n* COVID-19/vaccine-related statements were collected from Poynter and Snopes using Selenium resulting in over 14,000 fact-checked statements from Jan 2020 to May 2021.\n* Original labels were divided within following three categories:\n\t+ 'False': includes false, no evidence, manipulated, fake, not true, unproven and unverified\n\t+ 'Misleading': includes misleading, exaggerated, out of context and needs context\n\t+ 'True': includes true and correct\n\n\nEvaluation results\n------------------\n\n\n\nContributors\n============\n\n\n* This model is a part of final team project from MLDL for DS class at SNU.\n\t+ Team BIBI - Vaccinating COVID-NineTweets\n\t+ Team members: Ahn, Hyunju; An, Jiyong; An, Seungchan; Jeong, Seokho; Kim, Jungmin; Kim, Sangbeom\n\t+ Advisor: Prof. Wen-Syan Li\n\n\n<a href=\"URL src=\"URL width=\"200\" height=\"80\">"
] |
[
53,
55,
52,
94,
144,
276
] |
[
"passage: TAGS\n#transformers #pytorch #roberta #text-classification #en #dataset-tweets #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n#### How to use\n\n\nYou can use this model directly on this page or using 'transformers' in python.\n\n\n* Load pipeline and implement with input sequence\n* Expected output\n* 'true' examples\n* 'false' examples#### Limitations and bias\n\n\nTo conservatively classify whether an input sequence is true or not, the model may have predictions biased toward 'false' or 'misleading'.\n\n\nTraining data & Procedure\n-------------------------#### Pre-trained baseline model\n\n\n* Pre-trained model: BERTweet\n\t+ trained based on the RoBERTa pre-training procedure\n\t+ 850M General English Tweets (Jan 2012 to Aug 2019)\n\t+ 23M COVID-19 English Tweets\n\t+ Size of the model: >134M parameters\n* Further training\n\t+ Pre-training with recent COVID-19/vaccine tweets and fine-tuning for fact classification#### 1) Pre-training language model\n\n\n* The model was pre-trained on COVID-19/vaccined related tweets using a masked language modeling (MLM) objective starting from BERTweet.\n* Following datasets on English tweets were used:\n\t+ Tweets with trending #CovidVaccine hashtag, 207,000 tweets uploaded across Aug 2020 to Apr 2021 (kaggle)\n\t+ Tweets about all COVID-19 vaccines, 78,000 tweets uploaded across Dec 2020 to May 2021 (kaggle)\n\t+ COVID-19 Twitter chatter dataset, 590,000 tweets uploaded across Mar 2021 to May 2021 (github)"
] |
[
-0.013448902405798435,
0.03936873748898506,
-0.004436479415744543,
-0.009491684846580029,
0.07302360236644745,
0.03746340423822403,
0.11799094080924988,
0.08363620936870575,
-0.049996268004179,
0.13610602915287018,
0.07488454133272171,
-0.031882982701063156,
0.09166305512189865,
0.13841623067855835,
0.061258960515260696,
-0.23377352952957153,
0.04660249501466751,
-0.10545200109481812,
-0.021129703149199486,
0.12765614688396454,
0.09380724281072617,
-0.0949796587228775,
0.08952781558036804,
0.030308671295642853,
-0.03533737733960152,
-0.039148397743701935,
0.00441170297563076,
-0.041563745588064194,
0.04488696902990341,
0.042271167039871216,
0.1036655381321907,
0.011265303008258343,
0.09421373903751373,
-0.21959622204303741,
0.02593880705535412,
0.08540822565555573,
-0.008223352022469044,
0.07211948931217194,
0.0899994894862175,
-0.06639910489320755,
0.131969153881073,
-0.12924420833587646,
0.12691614031791687,
0.08465275168418884,
-0.12200716882944107,
-0.08498994261026382,
-0.11340917646884918,
0.03757724165916443,
0.10530561208724976,
0.11514917761087418,
-0.052248142659664154,
0.13092122972011566,
-0.032333195209503174,
0.03913193196058273,
0.14871464669704437,
-0.19899381697177887,
-0.033997830003499985,
-0.0026676435954868793,
0.016924478113651276,
0.067461296916008,
-0.06267008930444717,
0.04529538378119469,
0.02189544029533863,
-0.0014081430854275823,
0.04999930411577225,
-0.003692243481054902,
0.06763387471437454,
-0.0540752038359642,
-0.12664908170700073,
-0.07909099012613297,
0.07105504721403122,
0.033992476761341095,
-0.04814741015434265,
-0.16337649524211884,
0.010594365186989307,
-0.0930943563580513,
-0.06133028119802475,
-0.0529850572347641,
0.02422441728413105,
-0.016002284362912178,
-0.02029547467827797,
-0.053665779531002045,
-0.10720721632242203,
0.027822909876704216,
-0.06264109909534454,
0.024490617215633392,
0.006271620746701956,
0.009202114306390285,
-0.016194455325603485,
0.05581517890095711,
0.03435241058468819,
-0.12826620042324066,
0.011042569763958454,
-0.013072270900011063,
-0.14839820563793182,
-0.05839892476797104,
-0.038485124707221985,
-0.01249348558485508,
0.04197731614112854,
0.09930972009897232,
-0.11122259497642517,
-0.008722657337784767,
-0.02126074768602848,
-0.034312814474105835,
0.058770835399627686,
0.0688132718205452,
-0.17336326837539673,
-0.02177235670387745,
-0.0036159136798232794,
0.002689637942239642,
-0.021962910890579224,
-0.0069915191270411015,
-0.015958230942487717,
0.03685222566127777,
0.00968978088349104,
0.08522342145442963,
0.030525509268045425,
0.04749440774321556,
-0.057309988886117935,
-0.05768413841724396,
0.11610102653503418,
-0.10140196233987808,
-0.0068834517151117325,
0.01779477670788765,
-0.029611749574542046,
0.04612741246819496,
0.0209355466067791,
0.06544557958841324,
-0.03165540471673012,
0.0416535809636116,
-0.12973259389400482,
-0.012871576473116875,
-0.030869200825691223,
-0.1331673413515091,
0.0596671923995018,
-0.01522588450461626,
-0.08317738026380539,
-0.11397188156843185,
-0.04033442586660385,
-0.04793194681406021,
-0.036298926919698715,
-0.03365134820342064,
-0.0186470877379179,
-0.055702872574329376,
-0.03532715514302254,
0.06651362776756287,
0.04311693459749222,
0.13308581709861755,
-0.05247367173433304,
0.015982914716005325,
-0.17738106846809387,
0.03336644917726517,
0.1097789853811264,
0.012785680592060089,
-0.11197349429130554,
0.048503682017326355,
-0.15333670377731323,
0.1272440254688263,
-0.08822315186262131,
0.10395189374685287,
-0.17992843687534332,
-0.05366526171565056,
0.020150575786828995,
-0.038446586579084396,
0.017712228000164032,
0.12890304625034332,
-0.08443253487348557,
-0.0644848644733429,
0.13885577023029327,
-0.056551795452833176,
-0.06459444016218185,
0.08847228437662125,
-0.06117379665374756,
0.026036053895950317,
0.12344221770763397,
0.01227949745953083,
0.10395784676074982,
-0.23182539641857147,
-0.006558496505022049,
-0.06827468425035477,
-0.03148294612765312,
0.2048846185207367,
0.07157071679830551,
-0.07628335058689117,
-0.06060903146862984,
-0.020489918068051338,
-0.027569109573960304,
0.02893870323896408,
-0.07724665105342865,
-0.03591879829764366,
0.06222033500671387,
-0.05998486280441284,
0.008962200954556465,
-0.008221488445997238,
-0.0247162114828825,
-0.05076756700873375,
-0.14243392646312714,
-0.03554174304008484,
0.09779636561870575,
-0.02427876740694046,
0.014073756523430347,
-0.12648199498653412,
-0.045430272817611694,
0.010997503995895386,
-0.01552498061209917,
-0.11902212351560593,
-0.2123035043478012,
0.029755376279354095,
-0.0011084777070209384,
0.10899335891008377,
0.012098335660994053,
0.015614221803843975,
0.06071534380316734,
-0.03912686929106712,
0.04357315972447395,
0.011939833872020245,
-0.004681963939219713,
-0.10485426336526871,
-0.18121540546417236,
0.01944657973945141,
-0.04848868027329445,
0.23103369772434235,
-0.06056256219744682,
-0.006147617008537054,
0.04395207017660141,
0.11555338650941849,
0.0567823201417923,
-0.027753489091992378,
0.0632747933268547,
0.003122609108686447,
0.03921777009963989,
-0.05168933793902397,
-0.0006015383987687528,
-0.042590685188770294,
-0.07347428798675537,
0.12919534742832184,
-0.17596712708473206,
-0.16972783207893372,
0.10617277026176453,
0.060603465884923935,
-0.1418415904045105,
-0.04652678966522217,
-0.05930449441075325,
0.006352711468935013,
-0.054402612149715424,
-0.0677676722407341,
0.16220659017562866,
0.024433987215161324,
0.08531919866800308,
-0.09277961403131485,
-0.0843268632888794,
0.021138329058885574,
-0.036027781665325165,
-0.09463170170783997,
0.10425332933664322,
0.012878679670393467,
-0.29877832531929016,
0.057022616267204285,
0.025656871497631073,
0.09425075352191925,
0.12515176832675934,
0.05858711525797844,
-0.09751690179109573,
-0.03187604621052742,
-0.04549437388777733,
0.03969128802418709,
-0.004740034230053425,
0.00007367608486674726,
0.05253325775265694,
0.07818099856376648,
-0.02751145139336586,
-0.004188488703221083,
-0.044914569705724716,
0.01617114432156086,
0.027088703587651253,
-0.030950604006648064,
-0.0014942786656320095,
0.010016624815762043,
0.04637350142002106,
0.15848393738269806,
0.04028449207544327,
0.03087894432246685,
-0.04813970625400543,
-0.02761221118271351,
-0.1605689525604248,
0.142806738615036,
-0.1775972694158554,
-0.27369093894958496,
-0.10752144455909729,
-0.08717995136976242,
0.030132602900266647,
-0.0018690311117097735,
-0.021390898153185844,
-0.10396222770214081,
-0.08399339020252228,
-0.11274690926074982,
0.02730276621878147,
0.016400080174207687,
-0.024619709700345993,
-0.02756365016102791,
0.024175774306058884,
0.0315726101398468,
-0.07139059156179428,
0.02590859867632389,
-0.019023945555090904,
-0.11014958471059799,
0.01621107943356037,
-0.012665731832385063,
0.07178981602191925,
0.1964973509311676,
0.03135334327816963,
-0.020207416266202927,
-0.05185316875576973,
0.2343757152557373,
-0.13968993723392487,
0.06855233013629913,
0.06806197762489319,
0.05263243243098259,
0.02851150743663311,
0.160502627491951,
0.025485342368483543,
-0.05657515674829483,
0.07349901646375656,
0.10853396356105804,
-0.03321949765086174,
-0.2496657520532608,
-0.10853330790996552,
-0.026471449062228203,
-0.09227298200130463,
0.1262916475534439,
0.014760329388082027,
0.1592530608177185,
0.04458293318748474,
-0.1328105926513672,
-0.032364971935749054,
0.0572197288274765,
0.06556034833192825,
-0.005757328122854233,
0.04855070635676384,
0.08965568244457245,
-0.012295658700168133,
-0.003802474355325103,
0.09990403801202774,
-0.11430351436138153,
0.1654636561870575,
-0.01498192548751831,
0.16088439524173737,
0.09223321080207825,
0.04477136954665184,
0.03910515829920769,
-0.028543539345264435,
0.020695112645626068,
0.020311228930950165,
-0.00020159699488431215,
-0.06600215286016464,
0.007805204950273037,
0.048548199236392975,
0.07393772155046463,
-0.03424039110541344,
-0.05357401818037033,
-0.012117632664740086,
0.04399263486266136,
0.20397810637950897,
0.08218325674533844,
-0.21075312793254852,
-0.05736628174781799,
0.04100416228175163,
-0.10321331769227982,
-0.05089782923460007,
-0.028998196125030518,
0.0956650897860527,
-0.16649232804775238,
0.13258890807628632,
0.0025439385790377855,
0.09678749740123749,
0.030117275193333626,
-0.02093559503555298,
0.04324540123343468,
0.05772409215569496,
-0.08577346801757812,
0.09979429095983505,
-0.21865275502204895,
0.19400738179683685,
0.02515997737646103,
0.021910421550273895,
-0.07409307360649109,
-0.024020036682486534,
0.011898175813257694,
0.0381416454911232,
0.14825724065303802,
0.03364044055342674,
0.04014749079942703,
-0.11421061307191849,
-0.14249363541603088,
-0.058264102786779404,
0.10793571174144745,
-0.1214773952960968,
0.12007047235965729,
0.02318059653043747,
-0.03501111641526222,
-0.05238334834575653,
-0.0532677099108696,
-0.15973307192325592,
-0.12897181510925293,
0.06674233824014664,
-0.0663154348731041,
0.0037028291262686253,
-0.025760147720575333,
-0.0526604950428009,
-0.12379032373428345,
0.11463568359613419,
-0.07926903665065765,
-0.10400499403476715,
-0.1754060834646225,
0.11910001188516617,
0.13711386919021606,
-0.08446696400642395,
0.03317122161388397,
-0.016842130571603775,
0.1546529084444046,
-0.055578116327524185,
-0.08774494379758835,
0.008734524250030518,
-0.06091680750250816,
-0.2567897439002991,
-0.027114512398838997,
0.21035470068454742,
0.1685274839401245,
0.09933950006961823,
0.03573409095406532,
0.06811003386974335,
0.013778973370790482,
-0.08308552205562592,
0.06724754720926285,
0.0781640037894249,
0.11712688952684402,
0.050711680203676224,
0.026585783809423447,
-0.10861580073833466,
-0.18371020257472992,
0.029298953711986542,
0.0798698365688324,
0.18023750185966492,
-0.08134585618972778,
0.1230444610118866,
0.10063550621271133,
-0.07664772868156433,
-0.15994621813297272,
-0.012275166809558868,
0.12766245007514954,
0.019186509773135185,
-0.03295823931694031,
-0.1772831231355667,
0.06646575033664703,
0.06759683042764664,
0.008635573089122772,
-0.020180076360702515,
-0.2309916615486145,
-0.17200803756713867,
0.04835929721593857,
-0.0030462536960840225,
-0.06596919149160385,
-0.06649891287088394,
-0.04302997142076492,
-0.017358800396323204,
-0.020989323034882545,
0.21369828283786774,
-0.04812205955386162,
0.050421103835105896,
0.038439974188804626,
0.0689103826880455,
0.03843791037797928,
-0.017697881907224655,
0.158563494682312,
0.044443968683481216,
0.030652564018964767,
-0.08947078138589859,
-0.06887169182300568,
0.11309856176376343,
-0.017979983240365982,
0.028468577191233635,
0.0945759117603302,
-0.001840716227889061,
-0.12240179628133774,
-0.048684172332286835,
-0.11222512274980545,
0.0403604693710804,
-0.04733658954501152,
-0.06151009723544121,
-0.11009226739406586,
0.09325729310512543,
0.09303731471300125,
-0.04893504083156586,
0.0167271476238966,
-0.13523532450199127,
0.08003516495227814,
0.08179961889982224,
0.18607768416404724,
0.04855773597955704,
0.04398803040385246,
-0.009457861073315144,
-0.06276567280292511,
0.0417964793741703,
-0.07706035673618317,
0.014801722951233387,
0.07215926051139832,
0.011980697512626648,
0.1035325899720192,
-0.04074949771165848,
-0.15889830887317657,
0.027213918045163155,
0.05735711380839348,
-0.11764559894800186,
-0.1261719912290573,
-0.03616420179605484,
0.09062705188989639,
-0.1129426509141922,
-0.10440288484096527,
0.16973461210727692,
-0.008212453685700893,
-0.05553159490227699,
-0.004423708189278841,
0.08200053125619888,
-0.008141941390931606,
0.15622203052043915,
0.024606818333268166,
0.034891948103904724,
-0.1268448680639267,
-0.0004224055155646056,
0.14729391038417816,
-0.03411964327096939,
0.05029318109154701,
0.14511272311210632,
-0.12622706592082977,
-0.048467740416526794,
-0.037843670696020126,
0.05073235556483269,
0.00026964364224113524,
-0.007060791831463575,
0.09818179905414581,
-0.10304144769906998,
0.07939890772104263,
0.2026837319135666,
-0.023828184232115746,
0.05091889575123787,
0.005413604434579611,
-0.03242276608943939,
-0.046896208077669144,
0.09233172982931137,
0.020748935639858246,
-0.010505962185561657,
-0.002139662392437458,
0.23538465797901154,
0.013128789141774178,
0.0483076274394989,
-0.025619499385356903,
-0.01506789866834879,
-0.03938886150717735,
-0.02438543736934662,
-0.04127100110054016,
0.02964506857097149,
-0.05139758810400963,
-0.02642364799976349,
0.0006902407039888203,
-0.07743100076913834,
-0.02951117232441902,
-0.027416575700044632,
-0.04784715920686722,
-0.0480385385453701,
-0.016986139118671417,
0.0668768659234047,
-0.1199243813753128,
-0.02092600055038929,
0.0934223160147667,
-0.03970371186733246,
0.12072804570198059,
0.023699887096881866,
0.024133754894137383,
0.009617453441023827,
-0.14315173029899597,
0.007817852310836315,
0.012657971121370792,
0.053198184818029404,
0.04300323501229286,
-0.1595688760280609,
-0.002468391554430127,
-0.05933351814746857,
-0.09273799508810043,
0.0374593548476696,
0.017024731263518333,
-0.07634694129228592,
0.03675999492406845,
0.0678798034787178,
-0.025865204632282257,
-0.10710569471120834,
0.032067425549030304,
0.031291257590055466,
-0.054635465145111084,
0.08343695849180222,
-0.02151918224990368,
0.07680755853652954,
-0.18998394906520844,
-0.028006231412291527,
-0.00950164720416069,
0.016843486577272415,
-0.03405380621552467,
-0.01262721698731184,
0.1045757606625557,
-0.0017647793283686042,
0.07151918113231659,
-0.014809328131377697,
0.008664475753903389,
0.055765971541404724,
0.09131667017936707,
0.08473312109708786,
0.05974756181240082,
0.007185664493590593,
0.04625372216105461,
-0.028740916401147842,
0.04221871495246887,
-0.05906609445810318,
-0.01101387944072485,
-0.11361180245876312,
0.18683047592639923,
0.06280326098203659,
0.18037371337413788,
-0.04828112944960594,
0.031191062182188034,
-0.1117546483874321,
0.012086382135748863,
0.042941514402627945,
-0.08460374176502228,
-0.005674763582646847,
-0.011305653490126133,
0.006267055403441191,
0.1830635666847229,
-0.22420433163642883,
0.07564204186201096,
0.013685542158782482,
-0.05213172733783722,
-0.06815557181835175,
-0.17719489336013794,
-0.05298640951514244,
-0.009601064957678318,
-0.0025801206938922405,
-0.12055569142103195,
0.012193741276860237,
0.05633369833230972,
0.059354688972234726,
0.042336586862802505,
0.07060336321592331,
-0.11300080269575119,
-0.08332499861717224,
0.06261924654245377,
0.046388525515794754,
0.07669074088335037,
-0.006887095980346203,
-0.035690680146217346,
0.06482977420091629,
0.05482111871242523,
0.055438872426748276,
0.0182416420429945,
0.04431169852614403,
0.06443492323160172,
-0.040364451706409454,
-0.11186239123344421,
0.055583447217941284,
-0.014384317211806774,
-0.004817750304937363,
0.0848829373717308,
0.07752072066068649,
0.06657341867685318,
-0.020832598209381104,
0.17779040336608887,
0.009336069226264954,
-0.051326822489500046,
-0.19734399020671844,
0.08739836513996124,
-0.04561146721243858,
0.022394096478819847,
0.03270407021045685,
-0.06942902505397797,
0.0071815140545368195,
0.19811232388019562,
0.1713210493326187,
-0.09484750032424927,
-0.0019338260171934962,
-0.04917290061712265,
0.010252753272652626,
0.0698394700884819,
0.09762924164533615,
0.002819836139678955,
0.18653002381324768,
-0.0956248790025711,
0.02189789153635502,
-0.016146739944815636,
-0.017231769859790802,
-0.06270604580640793,
0.02702566236257553,
0.006561540998518467,
0.004167002625763416,
-0.02674483321607113,
0.1454322636127472,
-0.06294499337673187,
-0.16544729471206665,
0.022965971380472183,
-0.037665579468011856,
-0.10533557087182999,
-0.006564251612871885,
-0.08202187716960907,
0.06460511684417725,
0.046312473714351654,
0.03502337634563446,
-0.004083321895450354,
0.11442834138870239,
0.04215778410434723,
-0.06360951066017151,
-0.09093111753463745,
0.13683708012104034,
-0.042131077498197556,
0.16507145762443542,
-0.0005629879888147116,
0.0809159129858017,
0.09370303899049759,
-0.03782068192958832,
-0.09084264189004898,
0.03360884636640549,
0.05624796822667122,
-0.011697614565491676,
0.062399089336395264,
0.15711937844753265,
0.039703529328107834,
0.029430950060486794,
0.10319078713655472,
-0.09943615645170212,
0.07222990691661835,
-0.1141638308763504,
0.003495379351079464,
-0.032514993101358414,
0.1313842535018921,
-0.08373183012008667,
0.11012730002403259,
0.19690857827663422,
-0.06149094179272652,
0.0254918672144413,
-0.02306312508881092,
-0.030159268528223038,
0.0005656505818478763,
0.021717147901654243,
-0.023567626252770424,
-0.1787758469581604,
0.015105617232620716,
-0.0608416311442852,
0.04157683625817299,
-0.18976303935050964,
-0.012562954798340797,
-0.017345959320664406,
-0.030848009511828423,
-0.020916663110256195,
0.14483711123466492,
-0.008312941528856754,
-0.01839405484497547,
-0.022343827411532402,
-0.020394252613186836,
0.05997467041015625,
0.14716726541519165,
-0.11052588373422623,
0.0017984699225053191
] |
null | null |
transformers
|
{"tags": ["conversational"]}
|
text-generation
|
anshengli2/DialogGPT-small-Bot
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"conversational",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
[] |
[
"TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
51
] |
[
"passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #conversational #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n"
] |
[
-0.009697278961539268,
0.03208012506365776,
-0.007204889785498381,
0.004809224978089333,
0.16726240515708923,
0.014898733235895634,
0.09765533357858658,
0.13672804832458496,
-0.007841327227652073,
-0.031050153076648712,
0.14490588009357452,
0.20411323010921478,
-0.006439372431486845,
0.0661218985915184,
-0.07572533935308456,
-0.2683109939098358,
0.05759621039032936,
0.046649303287267685,
0.016515716910362244,
0.1200079694390297,
0.08573378622531891,
-0.05473608896136284,
0.08714032918214798,
-0.014583407901227474,
-0.150366872549057,
0.017733458429574966,
0.043394338339567184,
-0.12260226160287857,
0.11910516023635864,
0.05462685227394104,
0.07063519209623337,
0.014929565601050854,
-0.07541623711585999,
-0.1631229966878891,
0.03031250834465027,
0.01425902172923088,
-0.0594632662832737,
0.04757995903491974,
0.059961482882499695,
-0.10165371745824814,
0.10819483548402786,
0.09530027210712433,
-0.013078106567263603,
0.06798283755779266,
-0.16849711537361145,
-0.020869607105851173,
-0.01446688175201416,
0.009899779222905636,
0.05550243332982063,
0.09964893013238907,
-0.03413357585668564,
0.10497362166643143,
-0.09214533120393753,
0.11017382889986038,
0.10932035744190216,
-0.32057443261146545,
-0.005767723545432091,
0.09167823940515518,
0.039358653128147125,
0.07352814823389053,
-0.04467793554067612,
0.06258884817361832,
0.018015462905168533,
0.017986174672842026,
-0.014015024527907372,
-0.07283061742782593,
-0.11612214148044586,
0.04717336222529411,
-0.08668071031570435,
-0.059868961572647095,
0.2244078367948532,
-0.05464440956711769,
0.06881742179393768,
-0.05281897634267807,
-0.10522868484258652,
-0.04308144748210907,
-0.029833965003490448,
0.00475557055324316,
-0.07660607248544693,
0.08692064881324768,
0.00869679357856512,
-0.09547875821590424,
-0.1376667022705078,
-0.02496783249080181,
-0.1776352822780609,
0.16140350699424744,
0.02465328387916088,
0.05232657864689827,
-0.2027255892753601,
0.09623090922832489,
0.017906051129102707,
-0.08045592904090881,
0.022091427817940712,
-0.10046248883008957,
0.029131146147847176,
0.013760408386588097,
-0.04754498973488808,
-0.061387211084365845,
0.0843690037727356,
0.11199145019054413,
-0.01731434464454651,
0.025486016646027565,
-0.039331406354904175,
0.08100687712430954,
0.03553595021367073,
0.09077847748994827,
0.007288969587534666,
-0.028338588774204254,
0.025842782109975815,
-0.13719046115875244,
-0.003647835226729512,
-0.07116208970546722,
-0.16572439670562744,
-0.021088803187012672,
0.02994808368384838,
0.08289173990488052,
0.015449047088623047,
0.11682453751564026,
-0.03272046521306038,
-0.025152435526251793,
0.03602350503206253,
-0.047656361013650894,
-0.012649794109165668,
0.016648368909955025,
0.013163427822291851,
0.12399329990148544,
-0.0022096503525972366,
0.03235051408410072,
-0.13653022050857544,
0.031423524022102356,
-0.06793295592069626,
-0.003740974934771657,
-0.03486552834510803,
-0.040637075901031494,
0.009043924510478973,
-0.06862333416938782,
0.003486064961180091,
-0.15030112862586975,
-0.15063877403736115,
0.007587034720927477,
-0.007836631499230862,
-0.04107699543237686,
-0.06370922178030014,
-0.06952770054340363,
-0.013550350442528725,
0.04251532256603241,
-0.07093454152345657,
-0.011352915316820145,
-0.06403283774852753,
0.11004766076803207,
-0.03197755664587021,
0.07921615242958069,
-0.11953279376029968,
0.08390819281339645,
-0.11260783672332764,
-0.02386913076043129,
-0.060801517218351364,
0.09317506104707718,
-0.0006014376995153725,
0.09549830108880997,
-0.006563255097717047,
-0.017931854352355003,
-0.07981178909540176,
0.06445012241601944,
-0.042872510850429535,
0.21701598167419434,
-0.0615808479487896,
-0.11181682348251343,
0.28781595826148987,
-0.052628401666879654,
-0.1370542049407959,
0.11647392809391022,
0.008682746440172195,
0.05777018144726753,
0.10703510791063309,
0.19733482599258423,
-0.015276194550096989,
0.004040541127324104,
0.09471915662288666,
0.11263324320316315,
-0.11276852339506149,
-0.033160366117954254,
0.013019153848290443,
-0.04081077128648758,
-0.10867965966463089,
0.04689536616206169,
0.09810488671064377,
0.07090286910533905,
-0.04786505550146103,
-0.03377414867281914,
-0.01366397924721241,
0.0052589005790650845,
0.08885077387094498,
-0.007157256826758385,
0.10962837189435959,
-0.05819983780384064,
-0.03796621412038803,
-0.029282379895448685,
-0.012126247398555279,
-0.03951939567923546,
0.03137664496898651,
-0.043376367539167404,
0.10821941494941711,
-0.011204327456653118,
0.06364280730485916,
-0.16185984015464783,
-0.07691477984189987,
-0.017002692446112633,
0.1581239402294159,
0.024538565427064896,
0.09859629720449448,
0.0552486926317215,
-0.040398042649030685,
-0.0012767292791977525,
0.012792680412530899,
0.15581141412258148,
-0.022091681137681007,
-0.065607450902462,
-0.052166227251291275,
0.08642971515655518,
-0.05641226842999458,
0.04504093527793884,
-0.05937713757157326,
0.012367865070700645,
0.05064384639263153,
0.10342344641685486,
-0.00018274025933351368,
0.03323284164071083,
-0.008164864964783192,
0.002145637758076191,
-0.058205123990774155,
0.007405933458358049,
0.10799351334571838,
0.00036868182360194623,
-0.07365862280130386,
0.22074243426322937,
-0.17796069383621216,
0.1765957772731781,
0.1893044263124466,
-0.299345999956131,
0.017949223518371582,
-0.10759581625461578,
-0.04561871662735939,
0.014407722279429436,
0.05567655712366104,
-0.0454222597181797,
0.1703362911939621,
-0.009871348738670349,
0.18874616920948029,
-0.04946064203977585,
-0.04464937001466751,
-0.0200483538210392,
-0.05118836089968681,
-0.0024189651012420654,
0.07781197130680084,
0.10685696452856064,
-0.13992026448249817,
0.1964332014322281,
0.1621224284172058,
0.048237916082143784,
0.19945049285888672,
0.015346456319093704,
-0.011589210480451584,
0.0909530371427536,
0.005220826715230942,
-0.058739423751831055,
-0.07409929484128952,
-0.2594851851463318,
-0.030033592134714127,
0.07992640137672424,
0.0422382652759552,
0.1212305948138237,
-0.11349532753229141,
-0.038956157863140106,
-0.01763172075152397,
-0.023146281018853188,
0.021672505885362625,
0.0914369598031044,
0.06075398623943329,
0.13201528787612915,
-0.001710098935291171,
-0.007300339173525572,
0.10524573177099228,
0.01783694699406624,
-0.09354141354560852,
0.18308524787425995,
-0.13652534782886505,
-0.37097251415252686,
-0.13911493122577667,
-0.18057456612586975,
-0.05449081212282181,
0.05712554603815079,
0.11679314076900482,
-0.12011238187551498,
-0.018752124160528183,
0.01578843593597412,
0.10931742936372757,
-0.08449502289295197,
0.0021454424131661654,
-0.06880278885364532,
0.0321490578353405,
-0.10310184955596924,
-0.09194442629814148,
-0.055416494607925415,
-0.031392451375722885,
-0.08001253753900528,
0.1423761546611786,
-0.10777941346168518,
0.04476889222860336,
0.20262959599494934,
0.04653622955083847,
0.05625178664922714,
-0.044105201959609985,
0.19377262890338898,
-0.11264272034168243,
-0.01661740615963936,
0.19215328991413116,
-0.048360925167798996,
0.07476246356964111,
0.1232115849852562,
-0.006348740309476852,
-0.08765771239995956,
0.03011748194694519,
-0.02085109055042267,
-0.07988511025905609,
-0.23219464719295502,
-0.13938382267951965,
-0.12429051846265793,
0.09477275609970093,
0.028005298227071762,
0.056365787982940674,
0.17219258844852448,
0.06577219814062119,
-0.038416244089603424,
0.006410336587578058,
0.02959546446800232,
0.08237514644861221,
0.23417828977108002,
-0.06035616248846054,
0.1364797055721283,
-0.03420931473374367,
-0.14982740581035614,
0.08169995993375778,
0.0713929831981659,
0.10213395953178406,
0.06678459793329239,
0.0804823637008667,
0.0149586396291852,
0.06188136339187622,
0.1311223804950714,
0.08191446959972382,
0.019586285576224327,
-0.02480296604335308,
-0.03388110175728798,
-0.025523077696561813,
-0.05937909707427025,
0.040128443390131,
0.06589099019765854,
-0.16763372719287872,
-0.039227183908224106,
-0.09338314831256866,
0.09657008945941925,
0.0873042419552803,
0.06609832495450974,
-0.1842060089111328,
-0.008006223477423191,
0.08488986641168594,
-0.03854905813932419,
-0.13727426528930664,
0.09535189718008041,
0.01523482333868742,
-0.15144726634025574,
0.03139317408204079,
-0.04061909019947052,
0.12188644707202911,
-0.07804752141237259,
0.09809603542089462,
-0.08108244836330414,
-0.07448557764291763,
0.02123199962079525,
0.1261177361011505,
-0.30527687072753906,
0.20240111649036407,
-0.0024993624538183212,
-0.06486981362104416,
-0.1243603527545929,
-0.0032166161108762026,
0.002410882618278265,
0.07357452809810638,
0.10519039630889893,
-0.007196315098553896,
0.001897757756523788,
-0.06300821900367737,
-0.01829923689365387,
0.032471053302288055,
0.13080233335494995,
-0.0401318334043026,
-0.021158374845981598,
-0.050194524228572845,
-0.001653497340157628,
-0.03173094615340233,
-0.06934895366430283,
0.02002747356891632,
-0.19509181380271912,
0.08751901984214783,
0.04166261479258537,
0.09648149460554123,
0.029994789510965347,
0.004265148192644119,
-0.09651939570903778,
0.24698667228221893,
-0.07148019969463348,
-0.10072879493236542,
-0.10919588059186935,
-0.046813901513814926,
0.03569883480668068,
-0.05628936365246773,
0.04309194162487984,
-0.0788632407784462,
0.028997479006648064,
-0.06352769583463669,
-0.19235502183437347,
0.12410202622413635,
-0.09027006477117538,
-0.04412810131907463,
-0.02371402643620968,
0.2110891044139862,
-0.05598580464720726,
0.010335659608244896,
0.02930437959730625,
0.01208863127976656,
-0.11645778268575668,
-0.09678568691015244,
0.031018631532788277,
-0.007351789623498917,
0.050603240728378296,
0.041841957718133926,
-0.05915454775094986,
-0.017138581722974777,
-0.052199993282556534,
-0.022926922887563705,
0.3496883809566498,
0.14231905341148376,
-0.043836336582899094,
0.19347235560417175,
0.12347975373268127,
-0.07452994585037231,
-0.3159443140029907,
-0.1066238060593605,
-0.10937739163637161,
-0.04680149629712105,
-0.07012093812227249,
-0.2002030611038208,
0.06474938243627548,
0.00662544509395957,
-0.013415241613984108,
0.12749312818050385,
-0.2561831772327423,
-0.07571036368608475,
0.15906259417533875,
-0.017980827018618584,
0.3745945692062378,
-0.1168576180934906,
-0.10926306992769241,
-0.03950892388820648,
-0.14175476133823395,
0.16968177258968353,
-0.01989765651524067,
0.11221715062856674,
-0.009765521623194218,
0.14388824999332428,
0.05548359826207161,
-0.023479344323277473,
0.08544106781482697,
0.004999885335564613,
-0.03290518373250961,
-0.10304180532693863,
-0.05676887184381485,
0.007092386484146118,
0.02477436140179634,
0.018026655539870262,
-0.041834570467472076,
0.02227151393890381,
-0.11731979995965958,
-0.04657655209302902,
-0.08982590585947037,
0.04431166127324104,
0.03899754583835602,
-0.07325074821710587,
-0.002380647463724017,
-0.07165111601352692,
-0.012272949330508709,
0.022334342822432518,
0.20356793701648712,
-0.08029330521821976,
0.16448934376239777,
0.09239562600851059,
0.12419285625219345,
-0.14376309514045715,
-0.00019283240544609725,
-0.0762530043721199,
-0.05611240118741989,
0.07737895101308823,
-0.09433035552501678,
0.058893077075481415,
0.10901971161365509,
-0.04567738622426987,
0.08828683942556381,
0.10377411544322968,
0.008936077356338501,
0.003213887568563223,
0.10916902124881744,
-0.2667325437068939,
-0.0296600554138422,
-0.07532413303852081,
0.000883326749317348,
0.09092561900615692,
0.08562852442264557,
0.18840822577476501,
0.025361526757478714,
-0.04293036088347435,
-0.002770674182102084,
0.028597986325621605,
-0.039021048694849014,
0.051667019724845886,
0.001123449532315135,
0.01947369985282421,
-0.1530752182006836,
0.072522833943367,
0.01490565575659275,
-0.15215420722961426,
0.021316176280379295,
0.16572684049606323,
-0.11656328290700912,
-0.1283872276544571,
-0.06520111113786697,
0.08313824236392975,
-0.11755692958831787,
-0.01578943058848381,
-0.03279297426342964,
-0.13145680725574493,
0.07992171496152878,
0.12629036605358124,
0.05557859688997269,
0.0972496047616005,
-0.06061713397502899,
-0.020469192415475845,
-0.018721895292401314,
-0.014099318534135818,
-0.012384648434817791,
-0.007667020428925753,
-0.055978111922740936,
0.0590752474963665,
-0.026677248999476433,
0.1425808072090149,
-0.09221141785383224,
-0.1037059873342514,
-0.16142144799232483,
0.0374140702188015,
-0.11013076454401016,
-0.08825794607400894,
-0.08821134269237518,
-0.050188567489385605,
0.002360827289521694,
-0.019856395199894905,
-0.04037635400891304,
-0.05829505994915962,
-0.12300454825162888,
0.0338277705013752,
-0.040771447122097015,
0.024727050215005875,
-0.07512269169092178,
0.015856385231018066,
0.08507686108350754,
-0.03285100311040878,
0.15655414760112762,
0.1450488418340683,
-0.1006515845656395,
0.10741901397705078,
-0.14806775748729706,
-0.09138492494821548,
0.11116421222686768,
0.015329592861235142,
0.0449691042304039,
0.09723787009716034,
0.013362943194806576,
0.0635865181684494,
0.032776717096567154,
0.05308786407113075,
0.027619892731308937,
-0.11959987878799438,
0.06483134627342224,
-0.03626115620136261,
-0.14700546860694885,
-0.049338050186634064,
-0.05282869189977646,
0.01647452637553215,
0.013054544106125832,
0.09622690081596375,
-0.05301849544048309,
0.10698331147432327,
-0.04055701196193695,
0.0346808135509491,
0.017554637044668198,
-0.1730053424835205,
-0.03816922754049301,
-0.08538098633289337,
0.03681723028421402,
0.014741539023816586,
0.25266793370246887,
0.030072299763560295,
0.012416383251547813,
0.032671261578798294,
0.08285367488861084,
0.03899408504366875,
0.010228337720036507,
0.17482228577136993,
0.1162426546216011,
-0.06621865928173065,
-0.10445023328065872,
0.0729617029428482,
0.016332454979419708,
0.01286179106682539,
0.13617953658103943,
0.008365051820874214,
0.005795429926365614,
0.08649782836437225,
-0.016865963116288185,
0.009968153201043606,
-0.10052056610584259,
-0.13426925241947174,
-0.022176474332809448,
0.05151832848787308,
-0.04655967652797699,
0.11727844923734665,
0.1406494379043579,
-0.01806013658642769,
0.03222079202532768,
-0.021771740168333054,
-0.05699979141354561,
-0.1683429479598999,
-0.1429590880870819,
-0.06883849948644638,
-0.13416796922683716,
0.00897989235818386,
-0.11180389672517776,
0.05395037308335304,
0.06001098081469536,
0.06750501692295074,
-0.06899319589138031,
0.10220931470394135,
0.04626858979463577,
-0.11440542340278625,
0.06264589726924896,
-0.0296088308095932,
0.09430401772260666,
-0.02759445086121559,
-0.019505485892295837,
-0.09039592742919922,
0.014574515633285046,
0.011419114656746387,
0.06245238706469536,
-0.04707273095846176,
0.007463190704584122,
-0.14696238934993744,
-0.08972041308879852,
-0.0523175448179245,
0.0718572810292244,
-0.050409089773893356,
0.14282815158367157,
0.00775480642914772,
-0.0170906875282526,
0.039554283022880554,
0.22787313163280487,
-0.07476283609867096,
-0.04778539761900902,
-0.05269690603017807,
0.20717895030975342,
0.02975541539490223,
0.1171872541308403,
-0.022938819602131844,
-0.006106364540755749,
-0.0919521227478981,
0.3764844834804535,
0.30030161142349243,
-0.09031439572572708,
0.011794124729931355,
0.02137952297925949,
0.04502861574292183,
0.1316293478012085,
0.1216534823179245,
0.10318691283464432,
0.3006802201271057,
-0.07452366501092911,
-0.04653361067175865,
-0.012629742734134197,
-0.023858042433857918,
-0.09059546142816544,
0.1021224707365036,
0.04839762672781944,
-0.06382183730602264,
-0.03313443064689636,
0.0954432487487793,
-0.25862133502960205,
0.1277991235256195,
-0.12311873584985733,
-0.17578600347042084,
-0.06654827296733856,
0.009760108776390553,
0.10465722531080246,
0.015642458572983742,
0.0946015790104866,
0.007128213066607714,
-0.11252258718013763,
0.06305865943431854,
0.03397420793771744,
-0.22762253880500793,
0.0006893770187161863,
0.06642123311758041,
-0.07006710022687912,
-0.0024247700348496437,
-0.026499588042497635,
0.05657242611050606,
0.0656052976846695,
0.054629553109407425,
-0.00971333310008049,
0.03816632181406021,
0.0034184439573436975,
-0.0585215799510479,
0.016623929142951965,
0.05121519789099693,
0.02472509816288948,
-0.09763528406620026,
0.06927435845136642,
-0.1574270874261856,
0.04766253009438515,
-0.0030655991286039352,
-0.04124255105853081,
0.006064958870410919,
0.008823691867291927,
-0.06491616368293762,
0.05165379121899605,
0.07916834205389023,
-0.0016257909592241049,
-0.0062433634884655476,
-0.057178743183612823,
-0.02632102556526661,
-0.027755750343203545,
-0.09291748702526093,
-0.10495562851428986,
-0.14682936668395996,
-0.11640441417694092,
0.09368976950645447,
-0.01011267676949501,
-0.1848134547472,
0.022154374048113823,
-0.08606051653623581,
0.08319322764873505,
-0.1670055389404297,
0.08040720224380493,
0.07041648775339127,
0.013038921169936657,
-0.0031511052511632442,
-0.02002427540719509,
0.054132770746946335,
0.086809903383255,
-0.10407156497240067,
-0.07400695979595184
] |
||
null | null | null |
This repository doesn't contain a model, only a tokenizer that can be used with the `tokenizers` library.
This tokenizer is just a copy of `bert-base-uncased`.
```python
from tokenizers import Tokenizer
tokenizer = Tokenizer.from_pretrained("anthony/tokenizers-test")
```
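A hedged usage sketch (not part of the original card): once loaded, the tokenizer can encode text through the standard `tokenizers` API. The sample sentence below is purely illustrative, and the snippet assumes the repo ships the same `tokenizer.json` interface as `bert-base-uncased`, as the card implies.
```python
from tokenizers import Tokenizer

# Load the tokenizer published in this repo (a copy of bert-base-uncased's tokenizer)
tokenizer = Tokenizer.from_pretrained("anthony/tokenizers-test")

# Encode an illustrative sentence; encode() returns an Encoding object
encoding = tokenizer.encode("Hello, world!")

# The Encoding exposes the produced tokens and their vocabulary ids
print(encoding.tokens)
print(encoding.ids)
```
Since the underlying tokenizer is uncased WordPiece, the printed tokens are lowercased, and special tokens such as [CLS]/[SEP] may appear if the copied post-processor adds them.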
|
{}
| null |
anthony/tokenizers-test
|
[
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[] |
TAGS
#region-us
|
This repository doesn't contain a model, but only a tokenizer that can be used with the
'tokenizers' library.
This tokenizer is just a copy of 'bert-base-uncased'.
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
[
0.024608636274933815,
-0.026205500587821007,
-0.009666500613093376,
-0.10395516455173492,
0.08638657629489899,
0.059816278517246246,
0.01882290467619896,
0.020661840215325356,
0.23975107073783875,
-0.005599027033895254,
0.1219947561621666,
0.0015615287702530622,
-0.037353623658418655,
0.03733762726187706,
-0.0035912662278860807,
-0.17583473026752472,
0.03876631706953049,
-0.018274923786520958,
0.01843859627842903,
0.026470553129911423,
-0.07776834815740585,
-0.07564429938793182,
0.015296397730708122,
-0.10247814655303955,
-0.083692267537117,
0.11002834886312485,
0.031466204673051834,
-0.019670886918902397,
0.10779199749231339,
-0.04243955761194229,
0.18699054419994354,
-0.011512263678014278,
-0.11213519424200058,
-0.2536850869655609,
0.021806683391332626,
-0.01765260472893715,
-0.08747660368680954,
0.01506110467016697,
0.0665089413523674,
-0.09014441072940826,
-0.0588928684592247,
0.0795099288225174,
-0.01132340170443058,
0.04246443510055542,
-0.27593839168548584,
-0.12684126198291779,
-0.05297930911183357,
-0.1421966552734375,
0.08651168644428253,
0.04035491496324539,
0.008764253929257393,
0.15506891906261444,
-0.20897391438484192,
0.004104613792151213,
0.08255259692668915,
-0.2538507878780365,
0.05591634660959244,
0.17671173810958862,
0.03623908758163452,
0.18037272989749908,
0.0060391901060938835,
0.11029672622680664,
0.0716743916273117,
-0.024263937026262283,
-0.17590197920799255,
-0.08127854019403458,
-0.04696211963891983,
0.16642488539218903,
-0.06727185100317001,
-0.14248386025428772,
0.34701237082481384,
0.00015008423360995948,
0.009657775051891804,
0.16921205818653107,
-0.059524230659008026,
-0.09972117841243744,
0.07259953022003174,
0.016484731808304787,
0.018492350354790688,
0.1471305936574936,
0.16307872533798218,
-0.0458691343665123,
-0.13837823271751404,
-0.018630273640155792,
-0.22798998653888702,
0.17510560154914856,
-0.03248048573732376,
0.13137903809547424,
-0.27447956800460815,
0.01684025302529335,
-0.2570667266845703,
0.0032130838371813297,
0.04178816080093384,
-0.06004921346902847,
-0.0226522795855999,
-0.013265985064208508,
-0.08018817007541656,
0.004899587947875261,
0.06192673370242119,
0.1266920566558838,
-0.06128726154565811,
0.06128238886594772,
-0.09319206327199936,
0.141696035861969,
0.07166698575019836,
0.07868369668722153,
0.13037432730197906,
0.041205424815416336,
-0.07187089323997498,
-0.21872246265411377,
-0.0026476888451725245,
-0.06275863200426102,
-0.09502086788415909,
-0.0020165652967989445,
-0.11606067419052124,
0.17244569957256317,
-0.030802514404058456,
-0.09825427830219269,
-0.11208184063434601,
0.09148659557104111,
-0.032992321997880936,
-0.03437839448451996,
-0.03552987426519394,
-0.020977836102247238,
0.019381176680326462,
0.04704452306032181,
-0.1548958420753479,
-0.005131472367793322,
0.07039852440357208,
0.11502562463283539,
-0.1346137970685959,
-0.003783059772104025,
-0.07908964157104492,
0.03039063885807991,
0.07654735445976257,
-0.16510222852230072,
0.03158547356724739,
-0.1124754324555397,
-0.07531405985355377,
0.002912673633545637,
-0.015710093080997467,
-0.016202643513679504,
0.166526660323143,
-0.0020451415330171585,
0.0714716836810112,
-0.026345307007431984,
-0.05890209600329399,
-0.11243434250354767,
-0.08489254862070084,
0.05390460044145584,
0.03670717030763626,
0.03266148269176483,
-0.2193479984998703,
0.014805203303694725,
-0.12762966752052307,
0.1360815018415451,
-0.10566820204257965,
-0.04705966264009476,
-0.022842247039079666,
0.20562705397605896,
0.037286072969436646,
0.08762791007757187,
-0.22171171009540558,
0.039756543934345245,
-0.05404696613550186,
0.18480908870697021,
-0.1502426266670227,
-0.0799463614821434,
0.20813211798667908,
-0.07964949309825897,
-0.10115210711956024,
0.021235812455415726,
0.020391687750816345,
0.026287272572517395,
0.0766737088561058,
0.4564172327518463,
-0.09766800701618195,
-0.09146861732006073,
0.10178250074386597,
0.17055274546146393,
-0.12427149713039398,
-0.1827561855316162,
0.06446871906518936,
-0.16666454076766968,
-0.1973118633031845,
0.0018917324487119913,
0.09222044050693512,
0.038269978016614914,
-0.07875611633062363,
-0.020746968686580658,
0.06325206160545349,
-0.0007678253459744155,
0.09095914661884308,
0.03755716234445572,
0.09034032374620438,
-0.08716782182455063,
0.11115926504135132,
-0.05017651244997978,
0.004037132486701012,
0.1343354731798172,
0.027325427159667015,
-0.03223329409956932,
0.08694463223218918,
-0.0485352948307991,
0.05295134335756302,
-0.1662379503250122,
-0.15068690478801727,
0.03398871049284935,
0.06283251196146011,
0.03186952322721481,
0.1280253529548645,
0.08141885697841644,
-0.10732853412628174,
0.022690722718834877,
-0.004228927195072174,
0.058398615568876266,
0.03891623765230179,
0.006107209715992212,
0.008764320984482765,
0.0961301177740097,
-0.10607069730758667,
-0.13589619100093842,
-0.07336436957120895,
-0.014715781435370445,
0.14371353387832642,
-0.0302802175283432,
0.07690227776765823,
-0.004240254405885935,
0.00013200697139836848,
0.06930823624134064,
0.08137880265712738,
0.016412746161222458,
0.08971183747053146,
-0.05237193778157234,
-0.05160155147314072,
0.10863113403320312,
-0.13533565402030945,
0.17837053537368774,
0.14053137600421906,
-0.20532016456127167,
0.029453208670020103,
-0.06838275492191315,
0.03670361638069153,
-0.008162540383636951,
0.0975119024515152,
-0.08272241055965424,
-0.02106042578816414,
0.013134466484189034,
0.0052274600602686405,
-0.013007243163883686,
0.017682146281003952,
-0.07295988500118256,
-0.07787393033504486,
-0.10233919322490692,
0.08436838537454605,
0.11562882363796234,
-0.10282530635595322,
0.14214380085468292,
0.4384984076023102,
0.11495281755924225,
0.21582984924316406,
-0.09581480920314789,
-0.0412987545132637,
0.007486371789127588,
0.0001535322517156601,
-0.04476691037416458,
0.08031861484050751,
-0.15973517298698425,
-0.038901735097169876,
0.027348900213837624,
0.07128690183162689,
0.11475157737731934,
-0.14959022402763367,
-0.09639324247837067,
-0.00793045200407505,
0.0022841424215584993,
-0.1249532699584961,
0.023905446752905846,
-0.03974650055170059,
0.04015624523162842,
0.07232289016246796,
-0.021535737439990044,
0.13939237594604492,
-0.04166141897439957,
-0.0639561116695404,
0.07585346698760986,
-0.2017085999250412,
-0.23179671168327332,
-0.12309670448303223,
-0.14680525660514832,
0.04366797208786011,
0.05154111236333847,
0.01726446859538555,
-0.17635835707187653,
-0.015074856579303741,
0.07706750929355621,
0.07820965349674225,
-0.20886357128620148,
-0.022814949974417686,
-0.004290030337870121,
0.0895976573228836,
-0.10227091610431671,
-0.0017130117630586028,
-0.04419664293527603,
-0.10150232166051865,
0.0017003051470965147,
0.07279510796070099,
-0.137485533952713,
0.13807645440101624,
0.21589438617229462,
0.07225540280342102,
0.07359948754310608,
-0.019093448296189308,
0.09936179965734482,
-0.10856141895055771,
-0.16549113392829895,
0.08348225057125092,
-0.06234746053814888,
0.047262318432331085,
0.17534415423870087,
0.03307317942380905,
-0.13904969394207,
-0.015682822093367577,
-0.0402069091796875,
-0.15603256225585938,
-0.238995760679245,
-0.09178274869918823,
-0.1182505264878273,
0.16442428529262543,
0.0009358620154671371,
0.06651917099952698,
0.08258313685655594,
-0.022042419761419296,
0.16447891294956207,
-0.07379321753978729,
-0.07578866183757782,
-0.006978808436542749,
0.12375060468912125,
-0.056660156697034836,
-0.03080669604241848,
-0.10566964000463486,
-0.008295975625514984,
0.1151021271944046,
0.15304014086723328,
0.12214863300323486,
0.2957419455051422,
0.08268889784812927,
0.026645636186003685,
0.08958091586828232,
0.17622539401054382,
0.09495089203119278,
0.07838419824838638,
-0.045413073152303696,
-0.014814783819019794,
0.014317171648144722,
-0.04022889584302902,
0.010141594335436821,
0.14683100581169128,
-0.2679629921913147,
-0.006678564939647913,
-0.2710230350494385,
0.0965198427438736,
-0.10913380235433578,
0.11837165057659149,
-0.01015760749578476,
0.10194015502929688,
0.11082887649536133,
0.03233652561903,
-0.03858073800802231,
0.16613617539405823,
0.08450309932231903,
-0.11277695000171661,
0.001758623169735074,
0.03737903758883476,
0.09715615212917328,
-0.02818971499800682,
0.12721189856529236,
-0.11048974841833115,
-0.1464834064245224,
0.013753619976341724,
0.07152791321277618,
-0.15373679995536804,
0.3138748109340668,
0.012069208547472954,
-0.13481520116329193,
-0.01481647603213787,
-0.09957809001207352,
-0.006440147757530212,
0.1254177987575531,
0.09333524852991104,
0.07935678958892822,
-0.2185502052307129,
-0.13339371979236603,
0.05872276425361633,
-0.00575496768578887,
0.22408108413219452,
-0.034034017473459244,
-0.11356475204229355,
-0.027013886719942093,
0.04241163283586502,
-0.06043251231312752,
0.08524788916110992,
0.023536119610071182,
-0.08113526552915573,
-0.032957352697849274,
0.05323701351881027,
0.012368366122245789,
0.00524376705288887,
0.09360801428556442,
0.020107939839363098,
-0.0009265501867048442,
0.01785753294825554,
0.047885000705718994,
-0.0675911232829094,
-0.1984109878540039,
0.09357594698667526,
-0.05215044692158699,
0.0015536568826064467,
-0.08013670891523361,
-0.15122665464878082,
-0.08837161958217621,
-0.16009655594825745,
0.12540200352668762,
-0.034406669437885284,
0.12700119614601135,
-0.06619787961244583,
0.17341409623622894,
-0.07871770113706589,
0.04481020197272301,
-0.047349292784929276,
0.050332702696323395,
-0.007268077693879604,
-0.07756082713603973,
0.16585899889469147,
-0.15564003586769104,
0.01809087023139,
0.19572502374649048,
-0.018915493041276932,
0.07177707552909851,
0.021322092041373253,
-0.0636206790804863,
0.23147478699684143,
0.3014698624610901,
0.008138049393892288,
0.1665448248386383,
0.3018903136253357,
-0.07466315478086472,
-0.2642788887023926,
-0.05505012720823288,
-0.2841376066207886,
-0.05371501296758652,
0.10716094076633453,
-0.22523896396160126,
0.06986407935619354,
0.14383509755134583,
-0.06471995264291763,
0.30228954553604126,
-0.21825523674488068,
0.012589273042976856,
0.15434536337852478,
-0.08868814259767532,
0.5515313148498535,
-0.1133413165807724,
-0.17677772045135498,
-0.008122089318931103,
-0.08741296827793121,
0.10602109134197235,
-0.0340677872300148,
0.06877441704273224,
0.013465235009789467,
0.04797380417585373,
0.048932258039712906,
-0.03111894056200981,
0.22701001167297363,
0.008710170164704323,
0.09015397727489471,
-0.07378865778446198,
-0.18624304234981537,
0.11639340221881866,
-0.04359482601284981,
-0.08891059458255768,
0.0849778801202774,
-0.05942516401410103,
-0.11078983545303345,
0.04663389176130295,
-0.07950539886951447,
-0.024862350896000862,
0.08423490077257156,
-0.04678233340382576,
-0.042606171220541,
-0.008054176345467567,
-0.1618063747882843,
-0.0002289071271661669,
0.31360217928886414,
-0.07096036523580551,
0.16695955395698547,
0.03677211329340935,
0.00038613268407061696,
-0.11027684062719345,
0.030288029462099075,
-0.05203165486454964,
-0.021576624363660812,
0.09578979015350342,
-0.11096979677677155,
0.03204701095819473,
0.14160704612731934,
-0.04864364117383957,
0.05846960097551346,
0.09256096184253693,
-0.0849417969584465,
0.007583672646433115,
0.17753590643405914,
-0.17537221312522888,
-0.1273445188999176,
-0.006135711446404457,
-0.09862716495990753,
0.14055661857128143,
0.04394126310944557,
0.05191568285226822,
0.16669964790344238,
0.03967129811644554,
-0.029474308714270592,
-0.02817419543862343,
-0.1153380498290062,
-0.0201893113553524,
0.040153320878744125,
0.00045633706031367183,
-0.08791285753250122,
0.2262638509273529,
0.06409153342247009,
-0.1328488290309906,
-0.051157206296920776,
0.2161225974559784,
-0.06805316358804703,
-0.04911920800805092,
-0.223562553524971,
0.10752306133508682,
-0.07112517952919006,
-0.0965060144662857,
0.05453834682703018,
-0.02270081453025341,
0.005106312222778797,
0.181985542178154,
0.03941008821129799,
0.11070270836353302,
0.03738937899470329,
-0.02448922023177147,
0.15798696875572205,
-0.142850860953331,
-0.14191335439682007,
-0.025354057550430298,
-0.08757315576076508,
-0.13844476640224457,
-0.026804137974977493,
0.1617041826248169,
-0.09177309274673462,
-0.14772607386112213,
-0.2621181011199951,
0.10968475043773651,
-0.16432365775108337,
-0.10192688554525375,
-0.03469514101743698,
-0.08968492597341537,
0.0696166530251503,
0.030301768332719803,
-0.03093348816037178,
-0.06706760823726654,
-0.18593791127204895,
0.0816768929362297,
0.06349513679742813,
0.045533183962106705,
-0.017847947776317596,
0.0067379772663116455,
0.1720137596130371,
0.025955144315958023,
0.10040043294429779,
0.16762186586856842,
0.011397695168852806,
0.2246655523777008,
-0.1671202927827835,
-0.11496317386627197,
0.1336962729692459,
-0.026543032377958298,
0.06762003898620605,
0.16792191565036774,
-0.0772583931684494,
0.015526676550507545,
-0.028136352077126503,
0.07066910713911057,
-0.11003983020782471,
-0.105624258518219,
0.007937257178127766,
0.02567129209637642,
-0.2755882740020752,
-0.005599735304713249,
-0.19717298448085785,
0.14788752794265747,
0.02579621411859989,
0.03297143429517746,
0.10257530212402344,
0.10404334217309952,
0.08312062919139862,
-0.0017710148822516203,
0.03226327523589134,
-0.1176818460226059,
0.02753005363047123,
-0.059239376336336136,
-0.020663779228925705,
0.017624232918024063,
0.36952024698257446,
-0.03603357449173927,
-0.046802736818790436,
0.003710439894348383,
0.1307835876941681,
-0.02139742486178875,
0.017395347356796265,
0.13209912180900574,
0.12607666850090027,
-0.08595693111419678,
-0.1504845917224884,
0.04888554662466049,
-0.04565655067563057,
-0.02836887165904045,
0.1464131623506546,
0.05905961990356445,
0.1050296202301979,
0.0908031314611435,
-0.014463032595813274,
-0.00318976235575974,
0.012856799177825451,
-0.15486004948616028,
0.06223496049642563,
-0.010558074340224266,
0.012565906159579754,
0.017934376373887062,
0.15238402783870697,
-0.005540105979889631,
0.07739730179309845,
-0.09889880567789078,
0.004208535887300968,
-0.13498884439468384,
-0.07913459837436676,
0.03617347031831741,
-0.13393273949623108,
0.04141177982091904,
-0.01871878281235695,
0.029611799865961075,
0.30386561155319214,
0.02558239921927452,
-0.020639164373278618,
0.12512871623039246,
-0.1214587539434433,
-0.12050267308950424,
-0.001594188273884356,
-0.029960084706544876,
0.0791488066315651,
-0.02633434161543846,
-0.0997740775346756,
-0.1001306027173996,
-0.15166029334068298,
-0.09759195148944855,
0.05182836204767227,
-0.04993441700935364,
-0.059362251311540604,
-0.17634081840515137,
-0.05707859992980957,
-0.05147340148687363,
0.14025864005088806,
-0.12263951450586319,
0.15159130096435547,
-0.014490418136119843,
0.004084470681846142,
0.04405883327126503,
0.1950942426919937,
-0.03644494712352753,
0.08714226633310318,
0.0154351145029068,
0.1522706001996994,
-0.05119588226079941,
0.14720745384693146,
-0.10931728035211563,
-0.04014137014746666,
-0.06710435450077057,
0.21513493359088898,
0.25630924105644226,
-0.06136954948306084,
-0.008937356993556023,
-0.012760217301547527,
0.058654606342315674,
0.1073930487036705,
0.16049085557460785,
0.002326392102986574,
0.2802925705909729,
-0.03133585304021835,
0.04815128445625305,
0.02901598811149597,
0.013607407920062542,
-0.06336209923028946,
0.03397751972079277,
0.07539387792348862,
-0.035039983689785004,
-0.1412304788827896,
0.15837742388248444,
-0.21980468928813934,
0.18157227337360382,
0.11640069633722305,
-0.19996967911720276,
-0.013728445395827293,
-0.04882071167230606,
0.1689416468143463,
-0.0856364443898201,
0.1637246012687683,
-0.0903693437576294,
-0.2108195722103119,
-0.2056000679731369,
0.03867346793413162,
-0.34623071551322937,
-0.254462867975235,
0.10422009229660034,
0.1488201916217804,
0.04015883058309555,
-0.018507536500692368,
-0.019967829808592796,
-0.018367022275924683,
0.04877542704343796,
-0.0067357709631323814,
0.06014643982052803,
0.031397558748722076,
-0.02988368645310402,
-0.24127542972564697,
-0.029804671183228493,
0.023964406922459602,
-0.07093082368373871,
0.07464958727359772,
-0.06874357163906097,
-0.022495782002806664,
0.08059766888618469,
-0.03066304884850979,
0.03298592567443848,
-0.035373736172914505,
-0.16326889395713806,
0.027529051527380943,
0.03900543600320816,
0.036012712866067886,
0.00634160777553916,
0.0008072225609794259,
-0.03455270454287529,
0.0644603744149208,
-0.16716794669628143,
-0.16015739738941193,
0.14140215516090393,
-0.06745140254497528,
0.2779497504234314,
-0.05812826007604599,
-0.0809100940823555,
0.04766704887151718,
-0.03426874056458473,
0.1807648241519928,
-0.07756473124027252,
0.047254521399736404,
0.12766779959201813,
0.011127962730824947,
0.03121316432952881,
-0.3092964291572571,
0.11082969605922699,
-0.000795336440205574,
-0.006093299947679043,
-0.07581598311662674
] |